Updates from: 03/30/2022 01:19:04
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Partner Eid Me https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/partner-eid-me.md
The following architecture diagram shows the implementation.
[Contact eID-Me](https://bluink.ca/contact) and configure a test or production environment to set up Azure AD B2C tenants as a Relying Party. Tenants must determine what identity claims they'll need from their consumers as they sign up using eID-Me.
-## Integrate eID-Me with Azure AD B2C
-
-### Step 1 - Configure an application in eID-Me
+## Step 1: Configure an application in eID-Me
To configure your tenant application as a Relying Party in eID-Me, the following information should be supplied to eID-Me:
eID-Me will provide a Client ID and a Client Secret once the Relying Party has been configured.
::: zone pivot="b2c-user-flow"
-### Step 2 - Add a new Identity provider in Azure AD B2C
+## Step 2: Add a new Identity provider in Azure AD B2C
1. Sign in to the [Azure portal](https://portal.azure.com/#home) as the global administrator of your Azure AD B2C tenant.
6. Select **Add**.
-### Step 3 - Configure an Identity provider
+## Step 3: Configure an Identity provider
To configure an identity provider, follow these steps:
6. Select **Save** to complete the setup for your new OIDC Identity provider.
-### Step 4 - Configure multi-factor authentication
+## Step 4: Configure multi-factor authentication
eID-Me is a decentralized digital identity with strong two-factor user authentication built in. Since eID-Me is already a multi-factor authenticator, you don't need to configure any multi-factor authentication settings in your user flows when using eID-Me. eID-Me offers a fast and simple user experience, which also eliminates the need for any additional passwords.
-### Step 5 - Create a user flow policy
+## Step 5: Create a user flow policy
You should now see eID-Me as a new OIDC Identity provider listed within your B2C identity providers.
For additional information, review the following articles:
>[!NOTE]
>In Azure AD B2C, [**custom policies**](./user-flow-overview.md) are designed primarily to address complex scenarios. For most scenarios, we recommend that you use built-in [**user flows**](./user-flow-overview.md).
-### Step 2 - Create a policy key
+## Step 2: Create a policy key
Store the client secret that you previously recorded in your Azure AD B2C tenant.
11. Select **Create**.
-### Step 3- Configure eID-Me as an Identity provider
+## Step 3: Configure eID-Me as an Identity provider
To enable users to sign in using eID-Me decentralized identity, you need to define eID-Me as a claims provider that Azure AD B2C can communicate with through an endpoint. The endpoint provides a set of claims that are used by Azure AD B2C to verify that a specific user has authenticated using a digital ID available on their device, proving the user's identity.
There are additional identity claims that eID-Me supports and can be added.
-### Step 4 - Add a user journey
+## Step 4: Add a user journey
At this point, the identity provider has been set up, but it's not yet available in any of the sign-in pages. If you don't have your own custom user journey, create a duplicate of an existing template user journey, otherwise continue to the next step.
5. Rename the ID of the user journey. For example, ID=`CustomSignUpSignIn`
-### Step 5 - Add the identity provider to a user journey
+## Step 5: Add the identity provider to a user journey
Now that you have a user journey, add the new identity provider to the user journey.
-### Step 6 - Configure the relying party policy
+## Step 6: Configure the relying party policy
The relying party policy specifies the user journey that Azure AD B2C will execute. You can also control what claims are passed to your application by adjusting the **OutputClaims** element of the **eID-Me-OIDC-Signup** TechnicalProfile element. In this sample, the application receives the user's postal code, locality, region, IAL, portrait, middle name, and birth date. It also receives the Boolean **signupConditionsSatisfied** claim, which indicates whether an account has been created:
-### Step 7 - Upload the custom policy
+## Step 7: Upload the custom policy
1. Sign in to the [Azure portal](https://portal.azure.com/#home).
5. Under Policies, select **Identity Experience Framework**. Select **Upload Custom Policy**, and then upload the two policy files that you changed, in the following order: the extension policy, for example `TrustFrameworkExtensions.xml`, then the relying party policy, such as `SignUp.xml`.
-### Step 8 - Test your custom policy
+## Step 8: Test your custom policy
1. Select your relying party policy, for example `B2C_1A_signup`.
active-directory-b2c Partner Gallery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/partner-gallery.md
Microsoft partners with the following ISVs for MFA and Passwordless authenticati
| ![Screenshot of a twilio logo.](./medi) provides multiple solutions to enable MFA through SMS one-time password (OTP), time-based one-time password (TOTP), and push notifications, and to comply with SCA requirements for PSD2. |
| ![Screenshot of a typingDNA logo](./medi) enables strong customer authentication by analyzing a user's typing pattern. It helps companies enable a silent MFA and comply with SCA requirements for PSD2. |
| ![Screenshot of a whoiam logo](./medi) is a Branded Identity Management System (BRIMS) application that enables organizations to verify their user base by voice, SMS, and email. |
+| ![Screenshot of a xid logo](./medi) is a digital ID solution that provides users with passwordless, secure, multifactor authentication. xID-authenticated users have their identities verified by a My Number Card, the digital ID card issued by the Japanese government. Organizations can get users' verified Personal Identification Information (PII) through the xID API. |
## Role-based access control
active-directory-b2c Partner Xid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/partner-xid.md
+
+ Title: Configure Azure Active Directory B2C with xID
+
+description: Configure Azure Active Directory B2C with xID for passwordless authentication
+ Last updated: 03/18/2022
+# Configure xID with Azure Active Directory B2C for passwordless authentication
+
+In this sample tutorial, learn how to integrate Azure Active Directory B2C (Azure AD B2C) authentication with the xID digital ID solution. The xID app provides users with passwordless, secure, multifactor authentication. xID-authenticated users have their identities verified by a My Number Card, the digital ID card issued by the Japanese government. Organizations can get users' verified Personal Identification Information (customer content) through the xID API. Furthermore, the xID app generates a private key in a secure area within the user's mobile device, which can be used as a digital signing device.
++
+## Prerequisites
+
+To get started, you'll need:
+
+- An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+
+- An [Azure AD B2C tenant](./tutorial-create-tenant.md) that's linked to your Azure subscription.
+
+- Your xID client information, provided by xID Inc. [Contact xID](https://xid.inc/contact-us) for the client information, which should include the following parameters:
+ - Client ID
+ - Client Secret
+ - Redirect URL
+ - Scopes
+- Download and install the [xID app](https://x-id.me/) on your mobile device.
+ - To complete registration, you'll need your own My Number Card.
+  - If you use the UAT version of the API, you'll also need the UAT version of the xID app. To install the UAT app, [contact xID Inc](https://xid.inc/contact-us).
+
+## Scenario description
+
+The following architecture diagram shows the implementation.
+
+![Diagram of the xID architecture.](./media/partner-xid/partner-xid-architecture-diagram.png)
+
+| Step | Description |
+|:--|:--|
+| 1. |User opens Azure AD B2C's sign-in page, and then signs in or signs up by entering their username. |
+| 2. |Azure AD B2C redirects the user to the xID authorize API endpoint using an OpenID Connect (OIDC) request. An OIDC discovery endpoint is available that contains information about the endpoints. The xID Identity provider (IdP) redirects the user to the xID authorization sign-in page, where the user fills in or selects their email address. |
+| 3. |The xID IdP sends a push notification to the user's mobile device. |
+| 4. |The user opens the xID app, checks the request, and then enters the PIN or authenticates with their biometrics. If the PIN or biometric is verified successfully, the xID app activates the private key and creates an electronic signature. |
+| 5. |The xID app sends the signature to the xID IdP for verification. |
+| 6. |The xID IdP shows a consent screen to the user, requesting authorization to give their personal information to the service they're signing in to. |
+| 7. |xID IdP returns the OAuth authorization code to Azure AD B2C. |
+| 8. |Using the authorization code, Azure AD B2C sends a token request. |
+| 9. |The xID IdP checks the token request and, if it's still valid, returns the OAuth access token and the ID token containing the requested user's identifier and email address. |
+| 10. |In addition, if the user's customer content is needed, Azure AD B2C calls the xID userdata API. |
+| 11. |The xID userdata API returns the user's encrypted customer content. The user can decrypt it with the private key they created when requesting the xID client information. |
+| 12. | User is either granted or denied access to the customer application based on the verification results. |
++
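As an illustration of steps 8 and 9, the token request is a standard OAuth 2.0 authorization-code exchange using the `client_secret_basic` method. The following PowerShell sketch is illustrative only: the `/token` path and the redirect URI are assumptions, so take the real values from the xID OIDC metadata document (`https://oidc-uat.x-id.io/.well-known/openid-configuration`) and from your client information.

```powershell
# Illustrative sketch of the code-for-token exchange (steps 8 and 9).
# The /token path and redirect URI are assumptions; read the real values
# from the xID OIDC metadata document and your client information.
$clientId     = "00000000-0000-0000-0000-000000000000"   # your xID Client ID
$clientSecret = "<your-xid-client-secret>"
$authCode     = "<authorization-code-from-step-7>"

# client_secret_basic: the client credentials go in the Authorization header.
$basic = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("$clientId`:$clientSecret"))

$tokens = Invoke-RestMethod -Method Post -Uri "https://oidc-uat.x-id.io/token" `
    -Headers @{ Authorization = "Basic $basic" } `
    -ContentType "application/x-www-form-urlencoded" `
    -Body @{
        grant_type   = "authorization_code"
        code         = $authCode
        redirect_uri = "https://yourtenant.b2clogin.com/yourtenant.onmicrosoft.com/oauth2/authresp"
    }

$tokens.id_token       # contains the user's identifier and email address (step 9)
$tokens.access_token   # used by Azure AD B2C to call the xID userdata API (step 10)
```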
+## Onboard with xID
+
+Request the API documents by filling out [the form](https://xid.inc/contact-us). In the message field, indicate that you would like to onboard with Azure AD B2C. An xID sales representative will contact you. Follow the instructions provided in the xID API document and request an xID API client. The xID tech team will send the client information to you within 3-4 working days.
+
+## Step 1: Create a xID policy key
+
+Store the client secret that you received from xID in your Azure AD B2C tenant.
+
+1. Sign in to the [Azure portal](https://portal.azure.com/).
+
+2. Make sure you're using the directory that contains your Azure AD B2C tenant:
+
+ a. Select the **Directories + subscriptions** icon in the portal toolbar.
+
+ b. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the Directory name list, and then select **Switch**.
+
+3. Choose **All services** in the top-left corner of the Azure portal, and then search for and select **Azure AD B2C**.
+
+4. On the Overview page, select **Identity Experience Framework**.
+
+5. Select **Policy Keys** and then select **Add**.
+
+6. For **Options**, choose `Manual`.
+
+7. Enter a **Name** for the policy key. For example, `X-IDClientSecret`. The prefix `B2C_1A_` is added automatically to the name of your key.
+
+8. In **Secret**, enter your client secret that you previously received from xID.
+
+9. For **Key usage**, select `Signature`.
+
+10. Select **Create**.
+
+>[!NOTE]
+>In Azure AD B2C, [**custom policies**](./user-flow-overview.md) are designed primarily to address complex scenarios.
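If you prefer scripting the portal steps above, policy keys can also be managed through the Microsoft Graph beta `trustFramework/keySets` API. The following is a minimal sketch, assuming `$graphToken` holds an access token with the `TrustFrameworkKeySet.ReadWrite.All` permission in the B2C tenant; verify the request shapes against the current Graph reference before relying on them.

```powershell
# Minimal sketch: create the key set and upload the xID client secret through
# Microsoft Graph (beta). Assumes $graphToken has TrustFrameworkKeySet.ReadWrite.All.
$headers = @{ Authorization = "Bearer $graphToken" }

# Create an empty key set; Graph adds the B2C_1A_ prefix to the id.
$keySet = Invoke-RestMethod -Method Post `
    -Uri "https://graph.microsoft.com/beta/trustFramework/keySets" `
    -Headers $headers -ContentType "application/json" `
    -Body (@{ id = "X-IDClientSecret" } | ConvertTo-Json)

# Upload the client secret you received from xID as a signature key.
Invoke-RestMethod -Method Post `
    -Uri "https://graph.microsoft.com/beta/trustFramework/keySets/$($keySet.id)/uploadSecret" `
    -Headers $headers -ContentType "application/json" `
    -Body (@{ use = "sig"; k = "<your-xid-client-secret>" } | ConvertTo-Json) | Out-Null
```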
+
+## Step 2: Configure xID as an Identity provider
+
+To enable users to sign in using xID, you need to define xID as a claims provider that Azure AD B2C can communicate with through an endpoint. The endpoint provides a set of claims that are used by Azure AD B2C to verify that a specific user has authenticated using a digital identity available on their device, proving the user's identity.
+
+Use the following steps to add xID as a claims provider:
+
+1. Get the custom policy starter packs from GitHub, then update the XML files in the **LocalAccounts** starter pack with your Azure AD B2C tenant name:
+
+ i. Download the [.zip file](https://github.com/Azure-Samples/active-directory-b2c-custom-policy-starterpack/archive/master.zip) or [clone the repository](https://github.com/Azure-Samples/active-directory-b2c-custom-policy-starterpack).
+
+    ii. In all of the files in the **LocalAccounts** directory, replace the string `yourtenant` with the name of your Azure AD B2C tenant. For example, if the name of your B2C tenant is `contoso`, all instances of `yourtenant.onmicrosoft.com` become `contoso.onmicrosoft.com`. (For a scripted way to do this, see the sketch after these steps.)
+
+2. Open `LocalAccounts/TrustFrameworkExtensions.xml`.
+
+3. Find the **ClaimsProviders** element. If it doesn't exist, add it under the root element.
+
+4. Add a new **ClaimsProvider** similar to the one shown below:
+
+ ```xml
+
+ <ClaimsProvider>
+ <Domain>X-ID</Domain>
+ <DisplayName>X-ID</DisplayName>
+ <TechnicalProfiles>
+ <TechnicalProfile Id="X-ID-Oauth2">
+ <DisplayName>X-ID</DisplayName>
+ <Description>Login with your X-ID account</Description>
+ <Protocol Name="OAuth2" />
+ <Metadata>
+ <Item Key="METADATA">https://oidc-uat.x-id.io/.well-known/openid-configuration</Item>
+ <!-- Update the Client ID below to the X-ID Application ID -->
+ <Item Key="client_id">00000000-0000-0000-0000-000000000000</Item>
+ <Item Key="response_types">code</Item>
+ <Item Key="scope">openid verification</Item>
+ <Item Key="response_mode">query</Item>
+ <Item Key="HttpBinding">POST</Item>
+ <Item Key="UsePolicyInRedirectUri">false</Item>
+ <Item Key="DiscoverMetadataByTokenIssuer">true</Item>
+ <Item Key="token_endpoint_auth_method">client_secret_basic</Item>
+ <Item Key="ClaimsEndpoint">https://oidc-uat.x-id.io/userinfo</Item>
+ </Metadata>
+ <CryptographicKeys>
+ <Key Id="client_secret" StorageReferenceId="B2C_1A_X-IDClientSecret" />
+ </CryptographicKeys>
+ <OutputClaims>
+ <OutputClaim ClaimTypeReferenceId="issuerUserId" PartnerClaimType="sub" />
+ <OutputClaim ClaimTypeReferenceId="tenantId" PartnerClaimType="tid" />
+ <OutputClaim ClaimTypeReferenceId="email" />
+ <OutputClaim ClaimTypeReferenceId="sid" />
+ <OutputClaim ClaimTypeReferenceId="userdataid" />
+ <OutputClaim ClaimTypeReferenceId="X-ID_verified" />
+ <OutputClaim ClaimTypeReferenceId="email_verified" />
+ <OutputClaim ClaimTypeReferenceId="authenticationSource" DefaultValue="socialIdpAuthentication" AlwaysUseDefaultValue="true" />
+ <OutputClaim ClaimTypeReferenceId="identityProvider" PartnerClaimType="iss" DefaultValue="https://oidc-uat.x-id.io/" />
+ <OutputClaim ClaimTypeReferenceId="identityProviderAccessToken" PartnerClaimType="{oauth2:access_token}" />
+ </OutputClaims>
+ <OutputClaimsTransformations>
+ <OutputClaimsTransformation ReferenceId="CreateRandomUPNUserName" />
+ <OutputClaimsTransformation ReferenceId="CreateUserPrincipalName" />
+ <OutputClaimsTransformation ReferenceId="CreateAlternativeSecurityId" />
+ <OutputClaimsTransformation ReferenceId="CreateSubjectClaimFromAlternativeSecurityId" />
+ </OutputClaimsTransformations>
+ <UseTechnicalProfileForSessionManagement ReferenceId="SM-SocialLogin" />
+ </TechnicalProfile>
+
+ <TechnicalProfile Id="X-ID-Userdata">
+ <DisplayName>Userdata (Personal Information)</DisplayName>
+ <Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.RestfulProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
+ <Metadata>
+ <Item Key="ServiceUrl">https://api-uat.x-id.io/v4/verification/userdata</Item>
+ <Item Key="SendClaimsIn">Header</Item>
+ <Item Key="AuthenticationType">Bearer</Item>
+ <Item Key="UseClaimAsBearerToken">identityProviderAccessToken</Item>
+ <!-- <Item Key="AllowInsecureAuthInProduction">true</Item> -->
+ <Item Key="DebugMode">true</Item>
+ <Item Key="DefaultUserMessageIfRequestFailed">Cannot process your request right now, please try again later.</Item>
+ </Metadata>
+ <InputClaims>
+ <!-- Claims sent to your REST API -->
+ <InputClaim ClaimTypeReferenceId="identityProviderAccessToken" />
+ </InputClaims>
+ <OutputClaims>
+ <!-- Claims parsed from your REST API -->
+          <OutputClaim ClaimTypeReferenceId="last_name" PartnerClaimType="surname" />
+          <OutputClaim ClaimTypeReferenceId="first_name" PartnerClaimType="givenName" />
+ <OutputClaim ClaimTypeReferenceId="previous_name" />
+ <OutputClaim ClaimTypeReferenceId="year" />
+ <OutputClaim ClaimTypeReferenceId="month" />
+ <OutputClaim ClaimTypeReferenceId="date" />
+ <OutputClaim ClaimTypeReferenceId="prefecture" />
+ <OutputClaim ClaimTypeReferenceId="city" />
+ <OutputClaim ClaimTypeReferenceId="address" />
+ <OutputClaim ClaimTypeReferenceId="sub_char_common_name" />
+ <OutputClaim ClaimTypeReferenceId="sub_char_previous_name" />
+ <OutputClaim ClaimTypeReferenceId="sub_char_address" />
+ <OutputClaim ClaimTypeReferenceId="gender" />
+ <OutputClaim ClaimTypeReferenceId="verified_at" />
+ </OutputClaims>
+ <UseTechnicalProfileForSessionManagement ReferenceId="SM-Noop" />
+ </TechnicalProfile>
+ </TechnicalProfiles>
+ </ClaimsProvider>
+
+ ```
+
+5. Set **client_id** to your xID Application ID.
+
+6. Save the changes.
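As referenced in step 1 above, here is a convenience sketch that scripts the `yourtenant` replacement across the starter pack. The path and tenant name are placeholders; adjust both to your environment.

```powershell
# Convenience sketch: replace 'yourtenant' with your B2C tenant name in every
# policy file of the LocalAccounts starter pack. Adjust both values as needed.
$starterPack = "C:\source\active-directory-b2c-custom-policy-starterpack\LocalAccounts"
$tenantName  = "contoso"   # your tenant name, without .onmicrosoft.com

Get-ChildItem -Path $starterPack -Filter *.xml | ForEach-Object {
    (Get-Content -Path $_.FullName -Raw) -replace "yourtenant", $tenantName |
        Set-Content -Path $_.FullName
}
```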
+
+## Step 3: Add a user journey
+
+At this point, you've set up the identity provider, but it's not yet available in any of the sign-in pages. If you have your own custom user journey, continue to [step 4](#step-4-add-the-identity-provider-to-a-user-journey); otherwise, create a duplicate of an existing template user journey as follows:
+
+1. Open the `TrustFrameworkBase.xml` file from the starter pack.
+
+2. Find and copy the entire contents of the **UserJourney** element that includes `Id="SignUpOrSignIn"`.
+
+3. Open the `TrustFrameworkExtensions.xml` and find the UserJourneys element. If the element doesn't exist, add one.
+
+4. Paste the entire content of the UserJourney element that you copied as a child of the UserJourneys element.
+
+5. Rename the ID of the user journey. For example, `ID=CustomSignUpSignIn`
+
+## Step 4: Add the identity provider to a user journey
+
+Now that you have a user journey, add the new identity provider to the user journey.
+
+1. Find the orchestration step element that includes Type=`CombinedSignInAndSignUp`, or Type=`ClaimsProviderSelection` in the user journey. It's usually the first orchestration step. The **ClaimsProviderSelections** element contains a list of identity providers that a user can sign in with. The order of the elements controls the order of the sign-in buttons presented to the user. Add a **ClaimsProviderSelection** XML element. Set the value of **TargetClaimsExchangeId** to a friendly name, such as `X-IDExchange`.
+
+2. In the next orchestration step, add a **ClaimsExchange** element. Set the **Id** to the value of the target claims exchange ID, which links the xID button to an action. Then update the value of **TechnicalProfileReferenceId** to the ID of the technical profile you created earlier, `X-ID-Oauth2`.
+
+ The following XML demonstrates orchestration steps of a user journey with the identity provider:
+
+ ```xml
+
+ <UserJourney Id="X-IDSignUpOrSignIn">
+ <OrchestrationSteps>
+
+ <OrchestrationStep Order="1" Type="CombinedSignInAndSignUp" ContentDefinitionReferenceId="api.signuporsignin">
+ <ClaimsProviderSelections>
+ <ClaimsProviderSelection TargetClaimsExchangeId="X-IDExchange" />
+ </ClaimsProviderSelections>
+ </OrchestrationStep>
+
+ <OrchestrationStep Order="2" Type="ClaimsExchange">
+ <ClaimsExchanges>
+ <ClaimsExchange Id="X-IDExchange" TechnicalProfileReferenceId="X-ID-Oauth2" />
+ </ClaimsExchanges>
+ </OrchestrationStep>
+
+ <OrchestrationStep Order="3" Type="ClaimsExchange">
+ <ClaimsExchanges>
+ <ClaimsExchange Id="X-ID-Userdata" TechnicalProfileReferenceId="X-ID-Userdata" />
+ </ClaimsExchanges>
+ </OrchestrationStep>
+
+ <!-- For social IDP authentication, attempt to find the user account in the directory. -->
+ <OrchestrationStep Order="4" Type="ClaimsExchange">
+ <ClaimsExchanges>
+ <ClaimsExchange Id="AADUserReadUsingAlternativeSecurityId" TechnicalProfileReferenceId="AAD-UserReadUsingAlternativeSecurityId-NoError" />
+ </ClaimsExchanges>
+ </OrchestrationStep>
+
+ <!-- Show self-asserted page only if the directory does not have the user account already (i.e. we do not have an objectId). -->
+ <OrchestrationStep Order="5" Type="ClaimsExchange">
+ <Preconditions>
+ <Precondition Type="ClaimsExist" ExecuteActionsIf="true">
+ <Value>objectId</Value>
+ <Action>SkipThisOrchestrationStep</Action>
+ </Precondition>
+ </Preconditions>
+ <ClaimsExchanges>
+ <ClaimsExchange Id="SelfAsserted-Social" TechnicalProfileReferenceId="SelfAsserted-Social" />
+ </ClaimsExchanges>
+ </OrchestrationStep>
+
+ <!-- The previous step (SelfAsserted-Social) could have been skipped if there were no attributes to collect
+ from the user. So, in that case, create the user in the directory if one does not already exist
+        (verified using objectId, which would be set from the last step if the account was created in the directory). -->
+ <OrchestrationStep Order="6" Type="ClaimsExchange">
+ <Preconditions>
+ <Precondition Type="ClaimsExist" ExecuteActionsIf="true">
+ <Value>objectId</Value>
+ <Action>SkipThisOrchestrationStep</Action>
+ </Precondition>
+ </Preconditions>
+ <ClaimsExchanges>
+ <ClaimsExchange Id="AADUserWrite" TechnicalProfileReferenceId="AAD-UserWriteUsingAlternativeSecurityId" />
+ </ClaimsExchanges>
+ </OrchestrationStep>
+
+ <OrchestrationStep Order="7" Type="SendClaims" CpimIssuerTechnicalProfileReferenceId="JwtIssuer" />
+
+ </OrchestrationSteps>
+ <ClientDefinition ReferenceId="DefaultWeb" />
+ </UserJourney>
+
+ ```
+
+## Step 5: Upload the custom policy
+
+1. Sign in to the [Azure portal](https://portal.azure.com/#home).
+
+2. Make sure you're using the directory that contains your Azure AD B2C tenant:
+
+ a. Select the **Directories + subscriptions** icon in the portal toolbar.
+
+ b. On the **Portal settings | Directories + subscriptions** page, find your Azure AD B2C directory in the **Directory name** list, and then select **Switch**.
+
+3. In the [Azure portal](https://portal.azure.com/#home), search for and select **Azure AD B2C**.
+
+4. Under Policies, select **Identity Experience Framework**.
+
+5. Select **Upload Custom Policy**, and then upload the files in the **LocalAccounts** starter pack in the following order: the extension policy, for example `TrustFrameworkExtensions.xml`, then the relying party policy, such as `SignUpSignIn.xml`.
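The upload can also be scripted. As a hedged sketch, the Microsoft Graph beta `trustFramework/policies` endpoint accepts the raw policy XML; it assumes `$graphToken` carries the `Policy.ReadWrite.TrustFramework` permission and that the policy IDs below match the `PolicyId` attributes inside your files.

```powershell
# Sketch: upload the two policies, in order, through Microsoft Graph (beta).
# The policy IDs must match the PolicyId attribute inside each XML file.
$headers = @{ Authorization = "Bearer $graphToken" }

foreach ($policy in @(
        @{ Id = "B2C_1A_TrustFrameworkExtensions"; File = ".\TrustFrameworkExtensions.xml" },
        @{ Id = "B2C_1A_signup_signin";            File = ".\SignUpSignIn.xml" })) {
    Invoke-RestMethod -Method Put `
        -Uri "https://graph.microsoft.com/beta/trustFramework/policies/$($policy.Id)/`$value" `
        -Headers $headers -ContentType "application/xml" `
        -Body (Get-Content -Path $policy.File -Raw)
}
```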
+
+## Step 6: Test your custom policy
+
+1. In your Azure AD B2C tenant, under **Policies**, select **Identity Experience Framework**.
+
+2. Under **Custom policies**, select **CustomSignUpSignIn**.
+
+3. For **Application**, select the web application that you previously registered as part of this article's prerequisites. The **Reply URL** should show `https://jwt.ms`.
+
+4. Select **Run now**. Your browser should be redirected to the xID sign in page.
+
+5. If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
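If you'd like to inspect the same token outside the browser, the payload of a JWT is base64url-encoded JSON, so a few lines of PowerShell can decode it. A small sketch:

```powershell
# Sketch: decode the payload (second segment) of a JWT returned by Azure AD B2C.
$jwt     = "<paste-the-token-here>"
$payload = $jwt.Split('.')[1].Replace('-', '+').Replace('_', '/')
switch ($payload.Length % 4) { 2 { $payload += '==' } 3 { $payload += '=' } }
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload)) | ConvertFrom-Json
```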
+
+## Next steps
+
+For additional information, review the following articles:
+
+- [Custom policies in Azure AD B2C](custom-policy-overview.md)
+
+- [Get started with custom policies in Azure AD B2C](tutorial-create-user-flows.md?pivots=b2c-custom-policy)
active-directory Concept Continuous Access Evaluation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/concept-continuous-access-evaluation.md
CAE only has insight into [IP-based named locations](../conditional-access/locat
### Named location limitations
-When the sum of all IP ranges specified in location policies exceeds 5,000 for policies that will be enforced on the Resource provider, user change location flow isn't enforced. In this case, Azure AD will issue a one-hour CAE token and won't enforce client location change; security is improved compared to traditional one-hour tokens since we're still evaluating the [other events](#critical-event-evaluation) besides client location change events.
+When the sum of all IP ranges specified in location policies exceeds 5,000, user change location flow won't be enforced by CAE in real time. In this case, Azure AD will issue a one-hour CAE token. CAE will continue enforcing [all other events and policies](#critical-event-evaluation) besides client location change events. With this change, you still maintain a stronger security posture compared to traditional one-hour tokens, since [other events](#critical-event-evaluation) will be evaluated in near real time.
### Office and Web Account Manager settings
active-directory Howto Conditional Access Insights Reporting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-insights-reporting.md
To configure a Conditional Access policy in report-only mode:
In order to access the workbook, you need the proper Azure AD permissions as well as Log Analytics workspace permissions. To test whether you have the proper workspace permissions, run a sample Log Analytics query:

1. Sign in to the **Azure portal**.
-1. Browse to **Azure Active Directory** > **Logs**.
+1. Browse to **Azure Active Directory** > **Log Analytics**.
1. Type `SigninLogs` into the query box and select **Run**.
1. If the query does not return any results, your workspace may not have been configured correctly.
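The same check can be run from PowerShell with the Az.OperationalInsights module; a hedged sketch, assuming you have the workspace ID and at least Reader access:

```powershell
# Sketch: verify workspace access by running the SigninLogs query from PowerShell.
Connect-AzAccount
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId "<workspace-guid>" -Query "SigninLogs | take 10"
$result.Results   # no results may mean the workspace isn't configured correctly
```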
active-directory Howto Conditional Access Policy All Users Mfa https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-all-users-mfa.md
Previously updated : 11/05/2021 Last updated : 03/28/2022
Conditional Access policies are powerful tools; we recommend excluding the following accounts:
* **Emergency access** or **break-glass** accounts to prevent tenant-wide account lockout. In the unlikely scenario all administrators are locked out of your tenant, your emergency-access administrative account can be used to log in to the tenant and take steps to recover access.
  * More information can be found in the article, [Manage emergency access accounts in Azure AD](../roles/security-emergency-access.md).
-* **Service accounts** and **service principals**, such as the Azure AD Connect Sync Account. Service accounts are non-interactive accounts that aren't tied to any particular user. They're normally used by back-end services allowing programmatic access to applications, but are also used to sign in to systems for administrative purposes. Service accounts like these should be excluded since MFA can't be completed programmatically. Calls made by service principals are not blocked by Conditional Access.
+* **Service accounts** and **service principals**, such as the Azure AD Connect Sync Account. Service accounts are non-interactive accounts that aren't tied to any particular user. They're normally used by back-end services allowing programmatic access to applications, but are also used to sign in to systems for administrative purposes. Service accounts like these should be excluded since MFA can't be completed programmatically. Calls made by service principals aren't blocked by Conditional Access.
* If your organization has these accounts in use in scripts or code, consider replacing them with [managed identities](../managed-identities-azure-resources/overview.md). As a temporary workaround, you can exclude these specific accounts from the baseline policy.

## Application exclusions

Organizations may have many cloud applications in use. Not all of those applications may require equal security. For example, the payroll and attendance applications may require MFA but the cafeteria probably doesn't. Administrators can choose to exclude specific applications from their policy.
+### Subscription activation
+
+Organizations that use the [Subscription Activation](/windows/deployment/windows-10-subscription-activation) feature to enable users to "step up" from one version of Windows to another may want to exclude the Universal Store Service APIs and Web Application, AppID 45a330b1-b1ec-4cc1-9161-9f03992aa49f, from their all users all cloud apps MFA policy.
## Template deployment

Organizations can choose to deploy this policy using the steps outlined below or using the [Conditional Access templates (Preview)](concept-conditional-access-policy-common.md#conditional-access-templates-preview).
active-directory Howto Conditional Access Policy Compliant Device https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/conditional-access/howto-conditional-access-policy-compliant-device.md
Previously updated : 11/05/2021 Last updated : 03/28/2022
After confirming your settings using [report-only mode](howto-conditional-access
On Windows 7, iOS, Android, macOS, and some third-party web browsers, Azure AD identifies the device using a client certificate that is provisioned when the device is registered with Azure AD. When a user first signs in through the browser, the user is prompted to select the certificate. The end user must select this certificate before they can continue to use the browser.
+#### Subscription activation
+
+Organizations that use the [Subscription Activation](/windows/deployment/windows-10-subscription-activation) feature to enable users to "step up" from one version of Windows to another may want to exclude the Universal Store Service APIs and Web Application, AppID 45a330b1-b1ec-4cc1-9161-9f03992aa49f, from their device compliance policy.
+ ## Next steps [Conditional Access common policies](concept-conditional-access-policy-common.md)
active-directory Howto Create Self Signed Certificate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/howto-create-self-signed-certificate.md
Use the certificate you create using this method to authenticate from an application.
In an elevated PowerShell prompt, run the following command and leave the PowerShell console session open. Replace `{certificateName}` with the name that you wish to give to your certificate.

```powershell
-$cert = New-SelfSignedCertificate -Subject "CN={certificateName}" -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -KeySpec Signature -KeyLength 2048 -KeyAlgorithm RSA -HashAlgorithm SHA256 ## Replace {certificateName}
+$certname = "{certificateName}" ## Replace {certificateName}
+$cert = New-SelfSignedCertificate -Subject "CN=$certname" -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -KeySpec Signature -KeyLength 2048 -KeyAlgorithm RSA -HashAlgorithm SHA256
```
The **$cert** variable in the previous command stores your certificate in the current session and allows you to export it.
```powershell
-Export-Certificate -Cert $cert -FilePath "C:\Users\admin\Desktop\{certificateName}.cer" ## Specify your preferred location and replace {certificateName}
+Export-Certificate -Cert $cert -FilePath "C:\Users\admin\Desktop\$certname.cer" ## Specify your preferred location
```
Use this option to create a certificate and its private key if your application
In an elevated PowerShell prompt, run the following command and leave the PowerShell console session open. Replace `{certificateName}` with the name that you wish to give your certificate.

```powershell
-$cert = New-SelfSignedCertificate -Subject "CN={certificateName}" -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -KeySpec Signature -KeyLength 2048 -KeyAlgorithm RSA -HashAlgorithm SHA256 ## Replace {certificateName}
+$certname = "{certificateName}" ## Replace {certificateName}
+$cert = New-SelfSignedCertificate -Subject "CN=$certname" -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -KeySpec Signature -KeyLength 2048 -KeyAlgorithm RSA -HashAlgorithm SHA256
```
The **$cert** variable in the previous command stores your certificate in the current session and allows you to export it.
```powershell
-Export-Certificate -Cert $cert -FilePath "C:\Users\admin\Desktop\{certificateName}.cer" ## Specify your preferred location and replace {certificateName}
+Export-Certificate -Cert $cert -FilePath "C:\Users\admin\Desktop\$certname.cer" ## Specify your preferred location
```
Now, using the password you stored in the `$mypwd` variable, secure and export your private key:
```powershell
-Export-PfxCertificate -Cert $cert -FilePath "C:\Users\admin\Desktop\{privateKeyName}.pfx" -Password $mypwd ## Specify your preferred location and replace {privateKeyName}
+Export-PfxCertificate -Cert $cert -FilePath "C:\Users\admin\Desktop\$certname.pfx" -Password $mypwd ## Specify your preferred location
```
If you created the certificate using Option 2, you can delete the key pair from your personal store:
```powershell
-Get-ChildItem -Path "Cert:\CurrentUser\My" | Where-Object {$_.Subject -Match "{certificateName}"} | Select-Object Thumbprint, FriendlyName ## Replace {privateKeyName} with the name you gave your certificate
+Get-ChildItem -Path "Cert:\CurrentUser\My" | Where-Object {$_.Subject -Match "$certname"} | Select-Object Thumbprint, FriendlyName
```
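Once you've confirmed the thumbprint, the certificate and its private key can be removed from the store; the `-DeleteKey` switch is what removes the private key material:

```powershell
# Remove the certificate and its private key from the personal store.
# Replace <thumbprint> with the value returned by the previous command.
Remove-Item -Path "Cert:\CurrentUser\My\<thumbprint>" -DeleteKey
```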
active-directory Tutorial V2 Windows Desktop https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/tutorial-v2-windows-desktop.md
-# Tutorial: Call the Microsoft Graph API from a Windows Desktop app
+# Tutorial: Sign in users and call Microsoft Graph in Windows Presentation Foundation (WPF) desktop app
In this tutorial, you build a native Windows Desktop .NET (XAML) app that signs in users and gets an access token to call the Microsoft Graph API.
active-directory Directory Delete Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/directory-delete-howto.md
If you find that there are still enterprise applications that you can't delete in the portal, you can remove them by using PowerShell:
1. Open PowerShell as an administrator.
1. Run `Connect-AzAccount -tenant <TENANT_ID>`.
1. Sign in to Azure AD in the Global Administrator role.
-1. Run `Get-AzADServicePrincipal | ForEach-Object { Remove-AzADServicePrincipal -ObjectId $_.Id -Force }`.
+1. Run `Get-AzADServicePrincipal | ForEach-Object { Remove-AzADServicePrincipal -ObjectId $_.Id }`.
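Because this command removes every service principal in the tenant, it's worth reviewing what will be deleted first:

```powershell
# Preview the service principals before removing them.
Get-AzADServicePrincipal | Select-Object DisplayName, Id | Format-Table
```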
## Trial subscription that blocks deletion
active-directory Groups Assign Sensitivity Labels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/groups-assign-sensitivity-labels.md
To apply published labels to groups, you must first enable the feature. These st
```

In the **Sign in to your account** page, enter your admin account and password to connect to your service, and select **Sign in**.
-1. Fetch the current group settings for the Azure AD organization.
+1. Fetch the current group settings for the Azure AD organization and display them.
```powershell
$grpUnifiedSetting = (Get-AzureADDirectorySetting | where -Property DisplayName -Value "Group.Unified" -EQ)
- $template = Get-AzureADDirectorySettingTemplate -Id 62375ab9-6b52-47ed-826b-58e47e0e304b
- $setting = $template.CreateDirectorySetting()
+ $Setting = $grpUnifiedSetting
+ $grpUnifiedSetting.Values
```

> [!NOTE]
- > If no group settings have been created for this Azure AD organization you will get an error that reads "Cannot bind argument to parameter 'Id' because it is null". In this case, you must first create the settings. Follow the steps in [Azure Active Directory cmdlets for configuring group settings](../enterprise-users/groups-settings-cmdlets.md) to create group settings for this Azure AD organization.
-
-1. Next, display the current group settings.
-
- ```powershell
- $Setting.Values
- ```
+ > If no group settings have been created for this Azure AD organization, you will get an empty screen. In this case, you must first create the settings. Follow the steps in [Azure Active Directory cmdlets for configuring group settings](../enterprise-users/groups-settings-cmdlets.md) to create group settings for this Azure AD organization.
+
+ > [!NOTE]
+ > If the sensitivity label has been enabled previously, you will see **EnableMIPLabels** = **True**. In this case, you do not need to do anything.
1. Enable the feature:
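A minimal sketch of that step with the AzureADPreview cmdlets, assuming `$Setting` was fetched as shown above:

```powershell
# Set EnableMIPLabels to True and write the setting back to the directory.
$Setting["EnableMIPLabels"] = "True"
Set-AzureADDirectorySetting -Id $Setting.Id -DirectorySetting $Setting
```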
active-directory Groups Dynamic Membership https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/groups-dynamic-membership.md
You can also create a rule that selects device objects for membership in a group
> [!NOTE] > systemlabels is a read-only attribute that cannot be set with Intune. >
-> For Windows 10, the correct format of the deviceOSVersion attribute is as follows: (device.deviceOSVersion -eq "10.0.17763"). The formatting can be validated with the Get-MsolDevice PowerShell cmdlet.
+> For Windows 10, the correct format of the deviceOSVersion attribute is as follows: (device.deviceOSVersion -startsWith "10.0.1"). The formatting can be validated with the Get-MsolDevice PowerShell cmdlet.
The following device attributes can be used.
accountEnabled | true false | (device.accountEnabled -eq true)
displayName | any string value | (device.displayName -eq "Rob iPhone")
deviceOSType | any string value | (device.deviceOSType -eq "iPad") -or (device.deviceOSType -eq "iPhone")<br>(device.deviceOSType -contains "AndroidEnterprise")<br>(device.deviceOSType -eq "AndroidForWork")<br>(device.deviceOSType -eq "Windows")
- deviceOSVersion | any string value | (device.deviceOSVersion -eq "9.1")<br>(device.deviceOSVersion -eq "10.0.17763.0")
+ deviceOSVersion | any string value | (device.deviceOSVersion -eq "9.1")<br>(device.deviceOSVersion -startsWith "10.0.1")
deviceCategory | a valid device category name | (device.deviceCategory -eq "BYOD")
deviceManufacturer | any string value | (device.deviceManufacturer -eq "Samsung")
deviceModel | any string value | (device.deviceModel -eq "iPad Air")
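To put one of these rules into practice, a dynamic device group can be created with the AzureADPreview module; a hedged sketch using the deviceOSVersion rule from the table:

```powershell
# Sketch: create a dynamic group whose rule selects Windows 10 devices by
# deviceOSVersion prefix. Requires the AzureADPreview module.
New-AzureADMSGroup -DisplayName "Windows 10 devices" `
    -Description "Devices running Windows 10" `
    -MailEnabled $false -MailNickname "win10devices" -SecurityEnabled $true `
    -GroupTypes "DynamicMembership" `
    -MembershipRule '(device.deviceOSVersion -startsWith "10.0.1")' `
    -MembershipRuleProcessingState "On"
```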
active-directory Licensing Service Plan Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/licensing-service-plan-reference.md
Previously updated : 03/24/2022 Last updated : 03/29/2022
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
- **Service plans included (friendly names)**: A list of service plans (friendly names) in the product that correspond to the string ID and GUID

>[!NOTE]
->This information last updated on March 23rd, 2022.<br/>You can also download a CSV version of this table [here](https://download.microsoft.com/download/e/3/e/e3e9faf2-f28b-490a-9ada-c6089a1fc5b0/Product%20names%20and%20service%20plan%20identifiers%20for%20licensing.csv).
+>This information last updated on March 29th, 2022.<br/>You can also download a CSV version of this table [here](https://download.microsoft.com/download/e/3/e/e3e9faf2-f28b-490a-9ada-c6089a1fc5b0/Product%20names%20and%20service%20plan%20identifiers%20for%20licensing.csv).
><br/>

| Product name | String ID | GUID | Service plans included | Service plans included (friendly names) |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| MICROSOFT 365 AUDIO CONFERENCING FOR GCC | MCOMEETADV_GOC | 2d3091c7-0712-488b-b3d8-6b97bde6a1f5 | EXCHANGE_FOUNDATION_GOV (922ba911-5694-4e99-a794-73aed9bfeec8)<br/>MCOMEETADV_GOV (f544b08d-1645-4287-82de-8d91f37c02a1) | EXCHANGE FOUNDATION FOR GOVERNMENT (922ba911-5694-4e99-a794-73aed9bfeec8)<br/>MICROSOFT 365 AUDIO CONFERENCING FOR GOVERNMENT (f544b08d-1645-4287-82de-8d91f37c02a1) | | MICROSOFT 365 BUSINESS BASIC | O365_BUSINESS_ESSENTIALS | 3b555118-da6a-4418-894f-7df1e2096870 | BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>OFFICEMOBILE_SUBSCRIPTION (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | To-Do (Plan 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW FOR OFFICE 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>OFFICEMOBILE_SUBSCRIPTION (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>POWERAPPS FOR OFFICE 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | | MICROSOFT 365 BUSINESS BASIC | SMB_BUSINESS_ESSENTIALS | dab7782a-93b1-4074-8bb1-0e61318bea0b | BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>OFFICEMOBILE_SUBSCRIPTION (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_MIDSIZE (41bf139a-4e60-409f-9346-a1361efc6dfb) | TO-DO (PLAN 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW FOR OFFICE 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>OFFICEMOBILE_SUBSCRIPTION (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>POWERAPPS FOR OFFICE 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>OFFICE ONLINE 
(e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER MIDSIZE (41bf139a-4e60-409f-9346-a1361efc6dfb) |
-| MICROSOFT 365 BUSINESS STANDARD | O365_BUSINESS_PREMIUM | f245ecc8-75af-4f8e-b61f-27d8114de5f3 | BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>O365_SB_Relationship_Management (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)| To-Do (Plan 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW FOR OFFICE 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OUTLOOK CUSTOMER MANAGER (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE 365 BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS FOR OFFICE 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) |
+| Microsoft 365 Business Standard | O365_BUSINESS_PREMIUM | f245ecc8-75af-4f8e-b61f-27d8114de5f3 | CDS_O365_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>KAIZALA_O365_P2 (54fc630f-5a40-48ee-8965-af0503c1386e)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>STREAM_O365_SMB (3c53ea51-d578-46fa-a4c0-fd0a92809a60)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN1 (b8afc642-032e-4de5-8c0a-507a7bba7e5d)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>RMS_S_BASIC (31cf2cfc-6b0d-4adc-a336-88b724ed8122)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>POWER_VIRTUAL_AGENTS_O365_P2 (041fe683-03e4-45b6-b1af-c0cdc516daee) | Common Data Service for Teams (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>Exchange Online (Plan 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Apps for Business (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>Microsoft Kaizala Pro (54fc630f-5a40-48ee-8965-af0503c1386e)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SharePoint (Plan 1) (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Stream for Office 365 (3c53ea51-d578-46fa-a4c0-fd0a92809a60)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 1) (b8afc642-032e-4de5-8c0a-507a7bba7e5d)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>Common Data Service (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Microsoft Azure Rights Management Service (31cf2cfc-6b0d-4adc-a336-88b724ed8122)<br/>Power Apps for Office 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>Power Automate for Office 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>Power Virtual Agents for Office 365 
(041fe683-03e4-45b6-b1af-c0cdc516daee) |
| MICROSOFT 365 BUSINESS STANDARD - PREPAID LEGACY | SMB_BUSINESS_PREMIUM | ac5cef5d-921b-4f97-9ef3-c99076e5470f | BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>O365_SB_Relationship_Management (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_MIDSIZE (41bf139a-4e60-409f-9346-a1361efc6dfb) | To-Do (Plan 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW FOR OFFICE 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OUTLOOK CUSTOMER MANAGER (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE 365 BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS FOR OFFICE 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_MIDSIZE (41bf139a-4e60-409f-9346-a1361efc6dfb) |
-| MICROSOFT 365 BUSINESS PREMIUM | SPB | cbdc14ab-d96c-4c30-b9f4-6ada7cdc1d46 | AAD_SMB (de377cbc-0019-4ec2-b77c-3f223947e102)<br/>BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_SMBIZ (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>O365_SB_Relationship_Management (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>STREAM_O365_E1 (743dd19e-1ce3-4c62-a3ad-49ba8f63a2f6)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>WINBIZ (8e229017-d77b-43d5-9305-903395523b99)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | AZURE ACTIVE DIRECTORY (de377cbc-0019-4ec2-b77c-3f223947e102)<br/>TO-DO (PLAN 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE ONLINE ARCHIVING FOR EXCHANGE ONLINE (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>FLOW FOR OFFICE 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>MICROSOFT INTUNE (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFT BOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OUTLOOK CUSTOMER MANAGER (5bfe124c-bbdc-4494-8835-f1297d457d79)<br/>OFFICE 365 BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>POWERAPPS FOR OFFICE 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT AZURE ACTIVE DIRECTORY RIGHTS (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>AZURE INFORMATION PROTECTION PREMIUM P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>MICROSOFT STREAM FOR O365 E1 SKU (743dd19e-1ce3-4c62-a3ad-49ba8f63a2f6)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>WINDOWS 10 BUSINESS (8e229017-d77b-43d5-9305-903395523b99)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) |
+| Microsoft 365 Business Premium | SPB | cbdc14ab-d96c-4c30-b9f4-6ada7cdc1d46 | RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>BPOS_S_DlpAddOn (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>M365_LIGHTHOUSE_CUSTOMER_PLAN1 (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>M365_LIGHTHOUSE_PARTNER_PLAN1 (d55411c9-cfff-40a9-87c7-240f14df7da5)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>MDE_SMB (bfc1bbd9-981b-4f71-9b82-17c35fd0e2a4)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>KAIZALA_O365_P2 (54fc630f-5a40-48ee-8965-af0503c1386e)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>OFFICE_SHARED_COMPUTER_ACTIVATION (276d6e8a-f056-4f70-b7e8-4fc27f79f809)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN1 (b8afc642-032e-4de5-8c0a-507a7bba7e5d)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WINBIZ (8e229017-d77b-43d5-9305-903395523b99)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>AAD_SMB (de377cbc-0019-4ec2-b77c-3f223947e102)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_SMBIZ (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/>STREAM_O365_E1 (743dd19e-1ce3-4c62-a3ad-49ba8f63a2f6)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014) | Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Data Loss Prevention (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Exchange Online (Plan 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>Exchange Online Archiving (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Apps for Business (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>Microsoft 365 Lighthouse (Plan 1) (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>Microsoft 365 Lighthouse (Plan 2) (d55411c9-cfff-40a9-87c7-240f14df7da5)<br/>Microsoft Bookings 
(199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Defender for Business (bfc1bbd9-981b-4f71-9b82-17c35fd0e2a4)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Forms (Plan E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>Microsoft Kaizala Pro (54fc630f-5a40-48ee-8965-af0503c1386e)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Office Shared Computer Activation (276d6e8a-f056-4f70-b7e8-4fc27f79f809)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>SharePoint (Plan 1) (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 1) (b8afc642-032e-4de5-8c0a-507a7bba7e5d)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Windows 10/11 Business (8e229017-d77b-43d5-9305-903395523b99)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Azure Active Directory (de377cbc-0019-4ec2-b77c-3f223947e102)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Intune (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/>Microsoft Stream for Office 365 E1 (743dd19e-1ce3-4c62-a3ad-49ba8f63a2f6)<br/>Power Apps for Office 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>Power Automate for Office 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>Power Virtual Agents for Office 365 (ded3d325-1bdc-453e-8432-5bac26d7a014) |
| Microsoft 365 Business Voice | BUSINESS_VOICE_MED2 | a6051f20-9cbc-47d2-930d-419183bf6cf1 | MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOPSTN1 (4ed3ff63-69d7-4fb7-b984-5aec7f605ca8)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Domestic Calling Plan (4ed3ff63-69d7-4fb7-b984-5aec7f605ca8)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | | Microsoft 365 Business Voice (US) | BUSINESS_VOICE_MED2_TELCO | 08d7bce8-6e16-490e-89db-1d508e5e9609 | MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOPSTN1 (4ed3ff63-69d7-4fb7-b984-5aec7f605ca8)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Domestic Calling Plan (4ed3ff63-69d7-4fb7-b984-5aec7f605ca8)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | | Microsoft 365 Business Voice (without calling plan) | BUSINESS_VOICE_DIRECTROUTING | d52db95a-5ecb-46b6-beb0-190ab5cda4a8 | MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | | Microsoft 365 Business Voice (without Calling Plan) for US | BUSINESS_VOICE_DIRECTROUTING_MED | 8330dae3-d349-44f7-9cad-1b23c64baabe | MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | | MICROSOFT 365 DOMESTIC CALLING PLAN (120 Minutes) | MCOPSTN_5 | 11dee6af-eca8-419f-8061-6864517c1875 | MCOPSTN5 (54a152dc-90de-4996-93d2-bc47e670fc06) | MICROSOFT 365 DOMESTIC CALLING PLAN (120 min) (54a152dc-90de-4996-93d2-bc47e670fc06) | | Microsoft 365 Domestic Calling Plan for GCC | MCOPSTN_1_GOV | 923f58ab-fca1-46a1-92f9-89fda21238a8 | MCOPSTN1_GOV (3c8a8792-7866-409b-bb61-1b20ace0368b)<br/>EXCHANGE_S_FOUNDATION_GOV (922ba911-5694-4e99-a794-73aed9bfeec8) | Domestic Calling for Government (3c8a8792-7866-409b-bb61-1b20ace0368b)<br/>Exchange Foundation for Government (922ba911-5694-4e99-a794-73aed9bfeec8) |
-| MICROSOFT 365 E3 | SPE_E3 | 05e9a617-0261-4cee-bb44-138d3ef5d965 | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>FORMS_PLAN_E3 (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>WIN10_PRO_ENT_SUB (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | AZURE ACTIVE DIRECTORY PREMIUM P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>CLOUD APP SECURITY DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>TO-DO (PLAN 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE ONLINE (PLAN 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW FOR OFFICE 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>MICROSOFT FORMS (PLAN E3) (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MICROSOFT AZURE MULTI-FACTOR AUTHENTICATION (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>POWERAPPS FOR OFFICE 365(c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>MICROSOFT PLANNER(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT AZURE ACTIVE DIRECTORY RIGHTS (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>AZURE INFORMATION PROTECTION PREMIUM P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>MICROSOFT STREAM FOR O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>WINDOWS 10 ENTERPRISE (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>YAMMER ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) |
+| Microsoft 365 E3 | SPE_E3 | 05e9a617-0261-4cee-bb44-138d3ef5d965 | RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>M365_LIGHTHOUSE_CUSTOMER_PLAN1 (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>M365_LIGHTHOUSE_PARTNER_PLAN1 (d55411c9-cfff-40a9-87c7-240f14df7da5)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>MDE_LITE (292cc034-7b7c-4950-aaf5-943befd3f1d4)<br/>FORMS_PLAN_E3 (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WIN10_PRO_ENT_SUB (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>POWER_VIRTUAL_AGENTS_O365_P2 (041fe683-03e4-45b6-b1af-c0cdc516daee) | Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection and Governance Analytics - Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Lighthouse (Plan 1) (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>Microsoft 365 Lighthouse (Plan 2) (d55411c9-cfff-40a9-87c7-240f14df7da5)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Defender for Endpoint Plan 1 (292cc034-7b7c-4950-aaf5-943befd3f1d4)<br/>Microsoft Forms (Plan E3) (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>Microsoft Kaizala Pro (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner 
(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Windows 10/11 Enterprise (Original) (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Common Data Service (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Power Virtual Agents for Office 365 (041fe683-03e4-45b6-b1af-c0cdc516daee) |
|Microsoft 365 E3 - Unattended License | SPE_E3_RPA1 | c2ac2ee4-9bb1-47e4-8541-d689c7e83371 | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>CDS_O365_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>OFFICESUBSCRIPTION_unattended (8d77e2d9-9e28-4450-8431-0def64078fc5)<br/>M365_LIGHTHOUSE_CUSTOMER_PLAN1 (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>M365_LIGHTHOUSE_PARTNER_PLAN1 (d55411c9-cfff-40a9-87c7-240f14df7da5)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>FORMS_PLAN_E3 (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>POWER_VIRTUAL_AGENTS_O365_P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/> WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>WIN10_PRO_ENT_SUB (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Common Data Service - O365 P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Common Data Service for Teams_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection for Office 365 ΓÇô Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Apps for Enterprise (Unattended) (8d77e2d9-9e28-4450-8431-0def64078fc5)<br/>Microsoft 365 Lighthouse (Plan 1) (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>Microsoft 365 Lighthouse (Plan 2) (d55411c9-cfff-40a9-87c7-240f14df7da5)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan E3) (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft 
Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Power Virtual Agents for Office 365 P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/> To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Windows 10 Enterprise (Original) (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653) | | Microsoft 365 E3_USGOV_DOD | SPE_E3_USGOV_DOD | d61d61cc-f992-433f-a577-5bd016037eeb | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS_AR_DOD (fd500458-c24c-478e-856c-a6067a8376cd)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) | Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams for DOD (AR) (fd500458-c24c-478e-856c-a6067a8376cd)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office Online (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SharePoint Online (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) | | Microsoft 365 E3_USGOV_GCCHIGH | SPE_E3_USGOV_GCCHIGH | ca9d1dd9-dfe9-4fef-b97c-9bc1ea3c3658 | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>INTUNE_A 
(c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS_AR_GCCHIGH (9953b155-8aef-4c56-92f3-72b0487fce41)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) | Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1(6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/> Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/> Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/> Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/> Microsoft Teams for GCCHigh (AR) (9953b155-8aef-4c56-92f3-72b0487fce41)<br/> Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/> Office Online (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/> SharePoint Online (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) |
active-directory Custom Security Attributes Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/fundamentals/custom-security-attributes-overview.md
Previously updated : 02/04/2022 Last updated : 03/28/2022
If you use the Microsoft Graph API, you can use [Graph Explorer](/graph/graph-ex
Here are some of the known issues with custom security attributes: -- Users with attribute set-level role assignments can see other attribute sets and custom security attribute definitions. - Global Administrators can read audit logs for custom security attribute definitions and assignments. - If you have an Azure AD Premium P2 license, you can't add eligible role assignments at attribute set scope. - If you have an Azure AD Premium P2 license, the **Assigned roles** page for a user does not list permanent role assignments at attribute set scope. The role assignments exist, but aren't listed.
active-directory How To Connect Fed Saml Idp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/how-to-connect-fed-saml-idp.md
na Previously updated : 01/21/2022 Last updated : 03/29/2022
This procedure shows how to add a single user to Azure AD.
-FirstName Elwood ` -LastName Folk ` -AlternateEmailAddresses "Elwood.Folk@contoso.com" `
- -LicenseAssignment "samlp2test:ENTERPRISEPACK" `
-UsageLocation "US" ```
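The fragment above comes from a larger MSOnline PowerShell example. For illustration only, a roughly equivalent minimal user creation with the Azure CLI (all values are placeholders; license assignment has no direct `az ad user create` counterpart and is omitted) might look like:

```azurecli
az ad user create \
    --display-name "Elwood Folk" \
    --user-principal-name "Elwood.Folk@contoso.com" \
    --password "<strong-password>"
```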
active-directory Concept Identity Protection User Experience https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/identity-protection/concept-identity-protection-user-experience.md
Previously updated : 10/18/2019 Last updated : 01/21/2022
All of the Identity Protection policies have an impact on the sign in experience
## Multi-factor authentication registration
-Enabling the Identity Protection policy requiring multi-factor authentication registration and targeting all of your users, will make sure that they have the ability to use Azure AD MFA to self-remediate in the future. Configuring this policy gives your users a 14-day period where they can choose to register and at the end are forced to register. The experience for users is outlined below. More information can be found in the end-user documentation in the article, [Overview for two-factor verification and your work or school account](https://support.microsoft.com/account-billing/how-to-use-the-microsoft-authenticator-app-9783c865-0308-42fb-a519-8cf666fe0acc).
+Enabling the Identity Protection policy that requires multi-factor authentication registration, and targeting all of your users, makes sure that they can use Azure AD MFA to self-remediate in the future. Configuring this policy gives your users a 14-day period in which they can choose to register; at the end of that period, they're forced to register.
### Registration interrupt
Enabling the Identity Protection policy requiring multi-factor authentication re
## Risky sign-in remediation
-When an administrator has configured a policy for sign-in risks, the affected users are notified when they try to sign in and trigger the policies risk level.
+When an administrator has configured a policy for sign-in risks, affected users are interrupted when a sign-in reaches the configured risk level.
### Risky sign-in self-remediation
-1. The user is informed that something unusual was detected about their sign-in, such as signing in from a new location, device, or app.
+1. The user is informed that something unusual was detected about their sign-in. This could be something such as signing in from a new location, device, or app.
![Something unusual prompt](./media/concept-identity-protection-user-experience/120.png)
When an administrator has configured a policy for sign-in risks, the affected us
### Risky sign-in administrator unblock
-Administrators can choose to block users upon sign-in depending on their risk level. To get unblocked, end users must contact their IT staff, or they can try signing in from a familiar location or device. Self-remediation by performing multi-factor authentication is not an option in this case.
+Administrators can choose to block users upon sign-in depending on their risk level. To get unblocked, end users must contact their IT staff, or they can try signing in from a familiar location or device. Self-remediation by performing multi-factor authentication isn't an option in this case.
![Blocked by sign-in risk policy](./media/concept-identity-protection-user-experience/200.png)
When a user risk policy has been configured, users who meet the user risk level
## Risky sign-in administrator unblock
-Administrators can choose to block users upon sign-in depending on their risk level. To get unblocked, end users must contact their IT staff. Self-remediation by performing multi-factor authentication and self-service password reset is not an option in this case.
+Administrators can choose to block users upon sign-in depending on their risk level. To get unblocked, end users must contact their IT staff. Self-remediation by performing multi-factor authentication and self-service password reset isn't an option in this case.
![Blocked by user risk policy](./media/concept-identity-protection-user-experience/104.png) IT staff can follow the instructions in the section [Unblocking users](howto-identity-protection-remediate-unblock.md#unblocking-based-on-user-risk) to allow users to sign back in.
+## High risk technician
+
+If your organization has users who are delegated access to another tenant, and they trigger high risk, they may be blocked from signing in to those other tenants. For example:
+
+1. An organization has a managed service provider (MSP) or cloud solution provider (CSP) who takes care of configuring their cloud environment.
+1. The credentials of one of the MSP's technicians are leaked, which triggers high risk. That technician is blocked from signing in to other tenants.
+1. The technician can self-remediate and sign in if the home tenant has enabled the appropriate policies [requiring password change for high risk users](../conditional-access/howto-conditional-access-policy-risk-user.md) or [MFA for risky users](../conditional-access/howto-conditional-access-policy-risk.md).
+ 1. If the home tenant hasn't enabled self-remediation policies, an administrator in the technician's home tenant will have to [remediate the risk](howto-identity-protection-remediate-unblock.md#remediation).
+ ## See also - [Remediate risks and unblock users](howto-identity-protection-remediate-unblock.md)
active-directory F5 Aad Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/f5-aad-integration.md
Refer to the following guided configuration tutorials using Easy Button template
- [BIG-IP Easy Button for SSO to Oracle JD Edwards](f5-big-ip-oracle-jde-easy-button.md)
+- [BIG-IP Easy Button for SSO to SAP ERP](f5-big-ip-sap-erp-easy-button.md)
+ ## Azure AD B2B guest access Azure AD B2B guest access to SHA-protected applications is also possible, but some scenarios may require additional steps not covered in the tutorials. One example is Kerberos SSO, where a BIG-IP performs Kerberos constrained delegation (KCD) to obtain a service ticket from domain controllers. Without a local representation of the guest user, a domain controller will fail to honor the request because the user doesn't exist. To support this scenario, you need to ensure external identities are flowed down from your Azure AD tenant to the directory used by the application. See [Grant B2B users in Azure AD access to your on-premises applications](../external-identities/hybrid-cloud-to-on-premises.md) for guidance.
active-directory How To Assign App Role Managed Identity Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/how-to-assign-app-role-managed-identity-cli.md
In this article, you learn how to assign a managed identity to an application ro
```azurecli roleguid="0566419e-bb95-4d9d-a4f8-ed9a0f147fa6"
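 # Sketch (not part of the original article): one way to populate the variables
 # used below. Resource names are illustrative; depending on your Azure CLI
 # version, the service principal property is "objectId" or "id".
 oidForMI=$(az identity show --resource-group myResourceGroup --name myManagedIdentity --query principalId --output tsv)
 serverSPOID=$(az ad sp list --display-name "My API" --query "[0].objectId" --output tsv)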
- az rest -m POST -u https://graph.microsoft.com/beta/servicePrincipals/$oidForMI/appRoleAssignments -b "{\"principalId\": \"$oidForMI\", \"resourceId\": \"$serverSPOID\",\"appRoleId\": \"$roleguid\"}"
+ az rest -m POST -u https://graph.microsoft.com/v1.0/servicePrincipals/$oidForMI/appRoleAssignments -b "{\"principalId\": \"$oidForMI\", \"resourceId\": \"$serverSPOID\",\"appRoleId\": \"$roleguid\"}"
``` ## Next steps
active-directory Managed Identities Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/managed-identities-status.md
The following Azure services support managed identities for Azure resources:
| API Management | [Use managed identities in Azure API Management](../../api-management/api-management-howto-use-managed-service-identity.md) | | Application Gateway | [TLS termination with Key Vault certificates](../../application-gateway/key-vault-certs.md) | | Azure App Configuration | [How to use managed identities for Azure App Configuration](../../azure-app-configuration/overview-managed-identity.md) |
-| Azure App Services | [How to use managed identities for App Service and Azure Functions](../../app-service/overview-managed-identity.md) |
+| Azure App Services | [How to use managed identities for App Service and Azure Functions](../../app-service/overview-managed-identity.md) |
| Azure Arc enabled Kubernetes | [Quickstart: Connect an existing Kubernetes cluster to Azure Arc](../../azure-arc/kubernetes/quickstart-connect-cluster.md) | | Azure Arc enabled servers | [Authenticate against Azure resources with Azure Arc-enabled servers](../../azure-arc/servers/managed-identity-authentication.md) | | Azure Automanage | [Repair an Automanage Account](../../automanage/repair-automanage-account.md) |
The following Azure services support managed identities for Azure resources:
| Azure Digital Twins | [Enable a managed identity for routing Azure Digital Twins events](../../digital-twins/how-to-enable-managed-identities-portal.md) | | Azure Event Grid | [Event delivery with a managed identity](../../event-grid/managed-service-identity.md) | Azure Image Builder | [Azure Image Builder overview](../../virtual-machines/image-builder-overview.md#permissions) |
-| Azure Import/Export | [Use customer-managed keys in Azure Key Vault for Import/Export service](../../import-export/storage-import-export-encryption-key-portal.md)
+| Azure Import/Export | [Use customer-managed keys in Azure Key Vault for Import/Export service](../../import-export/storage-import-export-encryption-key-portal.md)
| Azure IoT Hub | [IoT Hub support for virtual networks with Private Link and Managed Identity](../../iot-hub/virtual-network-support.md) | | Azure Kubernetes Service (AKS) | [Use managed identities in Azure Kubernetes Service](../../aks/use-managed-identity.md) | | Azure Logic Apps | [Authenticate access to Azure resources using managed identities in Azure Logic Apps](../../logic-apps/create-managed-service-identity.md) | | Azure Log Analytics cluster | [Azure Monitor customer-managed key](../../azure-monitor/logs/customer-managed-keys.md) | Azure Machine Learning Services | [Use Managed identities with Azure Machine Learning](../../machine-learning/how-to-use-managed-identities.md?tabs=python) | | Azure Managed Disk | [Use the Azure portal to enable server-side encryption with customer-managed keys for managed disks](../../virtual-machines/disks-enable-customer-managed-keys-portal.md) |
-| Azure Media services | [Managed identities](../../media-services/latest/concept-managed-identities.md) |
+| Azure Media services | [Managed identities](/media-services/latest/concept-managed-identities) |
| Azure Monitor | [Azure Monitor customer-managed key](../../azure-monitor/logs/customer-managed-keys.md?tabs=portal) | | Azure Policy | [Remediate non-compliant resources with Azure Policy](../../governance/policy/how-to/remediate-resources.md) | | Azure Purview | [Credentials for source authentication in Azure Purview](../../purview/manage-credentials.md) |
active-directory Services Azure Active Directory Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/managed-identities-azure-resources/services-azure-active-directory-support.md
The following services support Azure AD authentication. New services are added t
| Azure Kubernetes Service (AKS) | [Control access to cluster resources using Kubernetes role-based access control and Azure Active Directory identities in Azure Kubernetes Service](../../aks/azure-ad-rbac.md) | | Azure Machine Learning Services | [Set up authentication for Azure Machine Learning resources and workflows](../../machine-learning/how-to-setup-authentication.md) | | Azure Maps | [Manage authentication in Azure Maps](../../azure-maps/how-to-manage-authentication.md) |
-| Azure Media services | [Access the Azure Media Services API with Azure AD authentication](../../media-services/previous/media-services-use-aad-auth-to-access-ams-api.md) |
+| Azure Media services | [Access the Azure Media Services API with Azure AD authentication](/media-services/previous/media-services-use-aad-auth-to-access-ams-api) |
| Azure Monitor | [Azure AD authentication for Application Insights (Preview)](../../azure-monitor/app/azure-ad-authentication.md?tabs=net) | | Azure Resource Manager | [Azure security baseline for Azure Resource Manager](/security/benchmark/azure/baselines/resource-manager-security-baseline?toc=/azure/azure-resource-manager/management/toc.json) | Azure Service Fabric | [Set up Azure Active Directory for client authentication](../../service-fabric/service-fabric-cluster-creation-setup-aad.md) |
active-directory Confluencemicrosoft Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/confluencemicrosoft-tutorial.md
As of now, following versions of Confluence are supported:
- Confluence: 5.0 to 5.10 - Confluence: 6.0.1 to 6.15.9-- Confluence: 7.0.1 to 7.16.2
+- Confluence: 7.0.1 to 7.17.0
> [!NOTE] > Please note that our Confluence Plugin also works on Ubuntu Version 16.04
active-directory Jiramicrosoft Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/jiramicrosoft-tutorial.md
Use your Microsoft Azure Active Directory account with Atlassian JIRA server to
To configure Azure AD integration with JIRA SAML SSO by Microsoft, you need the following items: - An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).-- JIRA Core and Software 6.4 to 8.22.0 or JIRA Service Desk 3.0 to 4.22.0 should installed and configured on Windows 64-bit version
+- JIRA Core and Software 6.4 to 8.22.1 or JIRA Service Desk 3.0 to 4.22.1 should be installed and configured on a Windows 64-bit version
- JIRA server is HTTPS enabled - Note that the supported versions for the JIRA Plugin are mentioned in the section below. - JIRA server is reachable on the Internet, particularly to the Azure AD login page for authentication, and should be able to receive the token from Azure AD
To get started, you need the following items:
## Supported versions of JIRA
-* JIRA Core and Software: 6.4 to 8.22.0
-* JIRA Service Desk 3.0 to 4.22.0
+* JIRA Core and Software: 6.4 to 8.22.1
+* JIRA Service Desk 3.0 to 4.22.1
* JIRA also supports 5.2. For more details, click [Microsoft Azure Active Directory single sign-on for JIRA 5.2](jira52microsoft-tutorial.md) > [!NOTE]
active-directory Policystat Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/policystat-tutorial.md
Follow these steps to enable Azure AD SSO in the Azure portal.
1. In the **Sign on URL** text box, type a URL using the following pattern: `https://<companyname>.policystat.com` >[!NOTE]
- >These values aren't real. Update these values with the actual Identifier and Sign on URL. Contact [PolicyStat Client support team](https://rldatix.com/services-support/support) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ >These values aren't real. Update these values with the actual Identifier and Sign on URL. Contact [PolicyStat Client support team](https://rldatix.com/en-apac/customer-success/community/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
1. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Federation Metadata XML** from the given options as per your requirement and save it on your computer.
active-directory Sap Netweaver Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/sap-netweaver-tutorial.md
If you are expecting a role to be assigned to the users, you can select it from
![Configure OAuth](./media/sapnetweaver-tutorial/oauth03.png) > [!NOTE]
- > Message `soft state status is not supported` ΓÇô can be ignored, as no problem. For more details, refer [here](https://help.sap.com/doc/saphelp_nw74/7.4.16/1e/c60c33be784846aad62716b4a1df39/content.htm?no_cache=true).
+ > The message `soft state status is not supported` can be ignored; it doesn't indicate a problem.
### Create a service user for the OAuth 2.0 Client
advisor Advisor Reference Performance Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/advisor/advisor-reference-performance-recommendations.md
Learn more about [AVS Private cloud - vSANCapacity (vSAN capacity utilization ha
Cache instances perform best when not running under high network bandwidth, which may cause them to become unresponsive, experience data loss, or become unavailable. Apply best practices to reduce network bandwidth, or scale to a different size or SKU with more capacity.
-Learn more about [Redis Cache Server - RedisCacheNetworkBandwidth (Improve your Cache and application performance when running with high network bandwidth)](https://aka.ms/redis/recommendations/bandwidth).
+Learn more about [Redis Cache Server - RedisCacheNetworkBandwidth (Improve your Cache and application performance when running with high network bandwidth)](/azure/azure-cache-for-redis/cache-troubleshoot-server#server-side-bandwidth-limitation).
### Improve your Cache and application performance when running with many connected clients
Learn more about [Redis Cache Server - RedisCacheConnectedClients (Improve your
Cache instances perform best when not running under high server load, which may cause them to become unresponsive, experience data loss, or become unavailable. Apply best practices to reduce the server load, or scale to a different size or SKU with more capacity.
-Learn more about [Redis Cache Server - RedisCacheServerLoad (Improve your Cache and application performance when running with high server load)](https://aka.ms/redis/recommendations/cpu).
+Learn more about [Redis Cache Server - RedisCacheServerLoad (Improve your Cache and application performance when running with high server load)](/azure/azure-cache-for-redis/cache-troubleshoot-client#high-client-cpu-usage).
### Improve your Cache and application performance when running with high memory pressure Cache instances perform best when not running under high memory pressure, which may cause them to become unresponsive, experience data loss, or become unavailable. Apply best practices to reduce used memory, or scale to a different size or SKU with more capacity.
-Learn more about [Redis Cache Server - RedisCacheUsedMemory (Improve your Cache and application performance when running with high memory pressure)](https://aka.ms/redis/recommendations/memory).
+Learn more about [Redis Cache Server - RedisCacheUsedMemory (Improve your Cache and application performance when running with high memory pressure)](/azure/azure-cache-for-redis/cache-troubleshoot-client#memory-pressure-on-redis-client).
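The bandwidth, server load, and memory recommendations above share the same mitigation: scale to a larger size or SKU. As a hedged illustration (cache name, resource group, and target size are placeholders), scaling with the Azure CLI might look like:

```azurecli
# Scale an Azure Cache for Redis instance to a larger size within the same tier.
az redis update --name myCache --resource-group myResourceGroup --sku Standard --vm-size C2
```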
## Cognitive Service
advisor Advisor Reference Reliability Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/advisor/advisor-reference-reliability-recommendations.md
Last updated 02/04/2022
# Reliability recommendations
-Azure Advisor helps you ensure and improve the continuity of your business-critical applications. You can get reliability recommendations on the **Reliability** tab on the Advisor dashboard.
+Azure Advisor helps you ensure and improve the continuity of your business-critical applications. You can get reliability recommendations on the **Reliability** tab on the Advisor dashboard.
1. Sign in to the [**Azure portal**](https://portal.azure.com).
Learn more about [Cosmos DB account - CosmosDBMongoSelfServeUpgrade (Upgrade you
### Add a second region to your production workloads on Azure Cosmos DB
-Based on their names and configuration, we have detected the Azure Cosmos DB accounts below as being potentially used for production workloads. These accounts currently run in a single Azure region. You can increase their availability by configuring them to span at least two Azure regions.
+Based on their names and configuration, we have detected the Azure Cosmos DB accounts below as being potentially used for production workloads. These accounts currently run in a single Azure region. You can increase their availability by configuring them to span at least two Azure regions.
> [!NOTE] > Additional regions will incur extra costs.
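As a hedged sketch of that mitigation (account name, resource group, and regions are placeholders), adding a second region with the Azure CLI could look like:

```azurecli
# Span the account across two regions; failoverPriority 0 is the write region.
az cosmosdb update --name myCosmosAccount --resource-group myResourceGroup \
    --locations regionName=eastus failoverPriority=0 isZoneRedundant=False \
    --locations regionName=westus failoverPriority=1 isZoneRedundant=False
```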
Learn more about [Cosmos DB account - CosmosDBSingleRegionProdAccounts (Add a se
We observed that your account is throwing a TooManyRequests error with the 16500 error code. Enabling Server Side Retry (SSR) can help mitigate this issue.
-Learn more about [Cosmos DB account - CosmosDBMongoServerSideRetries (Enable Server Side Retry (SSR) on your Azure Cosmos DB's API for MongoDB account)](/azure/cosmos-db/prevent-rate-limiting-errors).
+Learn more about [Cosmos DB account - CosmosDBMongoServerSideRetries (Enable Server Side Retry (SSR) on your Azure Cosmos DB's API for MongoDB account)](/azure/cosmos-db/cassandra/prevent-rate-limiting-errors).
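Server Side Retry is turned on by adding a capability to the account. A minimal Azure CLI sketch (account and resource group names are placeholders; note that `--capabilities` replaces the full list, so any existing capabilities such as `EnableMongo` must be restated):

```azurecli
# Enable Server Side Retry (SSR) on an API for MongoDB account.
az cosmosdb update --name myMongoAccount --resource-group myResourceGroup \
    --capabilities EnableMongo DisableRateLimitingResponses
```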
### Migrate your Azure Cosmos DB API for MongoDB account to v4.0 to save on query/storage costs and utilize new features
Learn more about [Kubernetes - Azure Arc - Arc-enabled K8s agent version upgrade
Your media account is about to hit its quota limits. Review the current usage of Assets, Content Key Policies, and Stream Policies for the media account. To avoid any disruption of service, request an increase to the quota limits for the entities that are close to hitting them by opening a ticket and adding the relevant details. Don't create additional Azure Media accounts in an attempt to obtain higher limits.
-Learn more about [Media Service - AccountQuotaLimit (Increase Media Services quotas or limits to ensure continuity of service.)](../media-services/latest/limits-quotas-constraints-reference.md).
+Learn more about [Media Service - AccountQuotaLimit (Increase Media Services quotas or limits to ensure continuity of service.)](/media-services/latest/limits-quotas-constraints-reference).
## Networking
Learn more about [Application gateway - AppGwLog4JCVEPatchNotification (Azure WA
### Additional protection to mitigate Log4j2 vulnerability (CVE-2021-44228)
-To mitigate the impact of Log4j2 vulnerability, we recommend these steps:
+To mitigate the impact of Log4j2 vulnerability, we recommend these steps:
-1) Upgrade Log4j2 to version 2.15.0 on your backend servers. If upgrade isn't possible, follow the system property guidance link below.
+1) Upgrade Log4j2 to version 2.15.0 on your backend servers. If upgrade isn't possible, follow the system property guidance link below.
2) Take advantage of WAF Core rule sets (CRS) by upgrading to the WAF SKU. Learn more about [Application gateway - AppGwLog4JCVEGenericNotification (Additional protection to mitigate Log4j2 vulnerability (CVE-2021-44228))](https://aka.ms/log4jcve).
aks Azure Disk Volume https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/azure-disk-volume.md
Title: Create a static volume for pods in Azure Kubernetes Service (AKS)
description: Learn how to manually create a volume with Azure disks for use with a pod in Azure Kubernetes Service (AKS) Previously updated : 03/09/2019 Last updated : 03/29/2022 #Customer intent: As a developer, I want to learn how to manually create and attach storage to a specific pod in AKS.
Create a *pvc-azuredisk.yaml* file with a *PersistentVolumeClaim* that uses the
```yaml apiVersion: v1
+kind: PersistentVolumeClaim
metadata: name: pvc-azuredisk spec:
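  # Hedged completion (not from the diff above): the typical remaining fields for
  # a static claim that binds to a pre-created PersistentVolume. The size and
  # volume name are illustrative.
  storageClassName: ""
  volumeName: pv-azuredisk
  accessModes:
  - ReadWriteOnce
  resources:
    requests:
      storage: 20Gi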
aks Scale Down Mode https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/scale-down-mode.md
Title: Use Scale-down Mode for your Azure Kubernetes Service (AKS) cluster (preview)
+ Title: Use Scale-down Mode for your Azure Kubernetes Service (AKS) cluster
description: Learn how to use Scale-down Mode in Azure Kubernetes Service (AKS).
-# Use Scale-down Mode to delete/deallocate nodes in Azure Kubernetes Service (AKS) (preview)
+# Use Scale-down Mode to delete/deallocate nodes in Azure Kubernetes Service (AKS)
By default, scale-up operations performed manually or by the cluster autoscaler require the allocation and provisioning of new nodes, and scale-down operations delete nodes. Scale-down Mode allows you to decide whether you would like to delete or deallocate the nodes in your Azure Kubernetes Service (AKS) cluster upon scaling down.
-When an Azure VM is in the `Stopped` (deallocated) state, you will not be charged for the VM compute resources. However, you will still need to pay for any OS and data storage disks attached to the VM. This also means that the container images will be preserved on those nodes. For more information, see [States and billing of Azure Virtual Machines][state-billing-azure-vm]. This behavior allows for faster operation speeds, as your deployment leverages cached images. Scale-down Mode allows you to no longer have to pre-provision nodes and pre-pull container images, saving you compute cost.
-
+When an Azure VM is in the `Stopped` (deallocated) state, you will not be charged for the VM compute resources. However, you'll still need to pay for any OS and data storage disks attached to the VM. This also means that the container images will be preserved on those nodes. For more information, see [States and billing of Azure Virtual Machines][state-billing-azure-vm]. This behavior allows for faster operation speeds, as your deployment uses cached images. Scale-down Mode removes the need to pre-provision nodes and pre-pull container images, saving you compute cost.
## Before you begin > [!WARNING] > In order to preserve any deallocated VMs, you must set Scale-down Mode to Deallocate. That includes VMs that have been deallocated using IaaS APIs (Virtual Machine Scale Set APIs). Setting Scale-down Mode to Delete will remove any deallocate VMs.
-This article assumes that you have an existing AKS cluster. If you need an AKS cluster, see the AKS quickstart [using the Azure CLI][aks-quickstart-cli] or [using the Azure portal][aks-quickstart-portal].
+This article assumes that you have an existing AKS cluster and the latest version of the Azure CLI installed. If you need an AKS cluster, see the AKS quickstart [using the Azure CLI][aks-quickstart-cli] or [using the Azure portal][aks-quickstart-portal].
### Limitations -- [Ephemeral OS][ephemeral-os] disks are not supported. Be sure to specify managed OS disks via `--node-osdisk-type Managed` when creating a cluster or node pool.-- [Spot node pools][spot-node-pool] are not supported.-
-### Install aks-preview CLI extension
-
-You also need the *aks-preview* Azure CLI extension version 0.5.30 or later. Install the *aks-preview* Azure CLI extension by using the [az extension add][az-extension-add] command. Or install any available updates by using the [az extension update][az-extension-update] command.
+- [Ephemeral OS][ephemeral-os] disks aren't supported. Be sure to specify managed OS disks via `--node-osdisk-type Managed` when creating a cluster or node pool.
-```azurecli-interactive
-# Install the aks-preview extension
-az extension add --name aks-preview
-
-# Update the extension to make sure you have the latest version installed
-az extension update --name aks-preview
-```
-
-### Register the `AKS-ScaleDownModePreview` preview feature
-
-To use the feature, you must also enable the `AKS-ScaleDownModePreview` feature flag on your subscription.
-
-Register the `AKS-ScaleDownModePreview` feature flag by using the [az feature register][az-feature-register] command, as shown in the following example:
-
-```azurecli-interactive
-az feature register --namespace "Microsoft.ContainerService" --name "AKS-ScaleDownModePreview"
-```
-
-It takes a few minutes for the status to show *Registered*. Verify the registration status by using the [az feature list][az-feature-list] command:
-
-```azurecli-interactive
-az feature list -o table --query "[?contains(name, 'Microsoft.ContainerService/AKS-ScaleDownModePreview')].{Name:name,State:properties.state}"
-```
-
-When ready, refresh the registration of the *Microsoft.ContainerService* resource provider by using the [az provider register][az-provider-register] command:
-
-```azurecli-interactive
-az provider register --namespace Microsoft.ContainerService
-```
+> [!NOTE]
+> Previously, while Scale-down Mode was in preview, [spot node pools][spot-node-pool] were unsupported. Now that Scale-down Mode is Generally Available, this limitation no longer applies.
## Using Scale-down Mode to deallocate nodes on scale-down
In this example, we create a new node pool with 20 nodes and specify that upon s
az aks nodepool add --node-count 20 --scale-down-mode Deallocate --node-osdisk-type Managed --max-pods 10 --name nodepool2 --cluster-name myAKSCluster --resource-group myResourceGroup ```
-By scaling the node pool and changing the node count to 5, we will deallocate 15 nodes.
+By scaling the node pool and changing the node count to 5, we'll deallocate 15 nodes.
```azurecli-interactive az aks nodepool scale --node-count 5 --name nodepool2 --cluster-name myAKSCluster --resource-group myResourceGroup
az aks nodepool update --scale-down-mode Delete --name nodepool2 --cluster-name
## Using Scale-down Mode to delete nodes on scale-down
-The default behavior of AKS without using Scale-down Mode is to delete your nodes when you scale-down your cluster. Using Scale-down Mode, this can be explicitly achieved by setting `--scale-down-mode Delete`.
+The default behavior of AKS without using Scale-down Mode is to delete your nodes when you scale down your cluster. With Scale-down Mode, this behavior can be explicitly achieved by setting `--scale-down-mode Delete`.
In this example, we create a new node pool and specify that our nodes will be deleted upon scale-down via `--scale-down-mode Delete`. Scaling operations will be handled via the cluster autoscaler.
api-management Api Management Advanced Policies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/api-management-advanced-policies.md
The `choose` policy applies enclosed policy statements based on the outcome of e
</when> <otherwise> <!-- one or more policy statements to be applied if none of the above conditions are true -->
-</otherwise>
+ </otherwise>
</choose> ```
This policy can be used in the following policy [sections](./api-management-howt
- **Policy sections:** inbound, outbound, backend - **Policy scopes:** all scopes
api-management Api Management Howto Configure Notifications https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/api-management-howto-configure-notifications.md
Title: Configure notifications and email templates
-description: Learn how to configure notifications and email templates in Azure API Management.
+description: Learn how to configure notifications and email templates for events in Azure API Management.
- -- Previously updated : 01/10/2020+ Last updated : 03/28/2022
-# How to configure notifications and email templates in Azure API Management
+# How to configure notifications and notification templates in Azure API Management
-API Management provides the ability to configure notifications for specific events, and to configure the email templates that are used to communicate with the administrators and developers of an API Management instance. This article shows how to configure notifications for the available events, and provides an overview of configuring the email templates used for these events.
+API Management provides the ability to configure email notifications for specific events, and to configure the email templates that are used to communicate with the administrators and developers of an API Management instance. This article shows how to configure notifications for the available events, and provides an overview of configuring the email templates used for these events.
## Prerequisites
-If you do not have an API Management service instance, complete the following quickstart: [Create an Azure API Management instance](get-started-create-service-instance.md).
+If you don't have an API Management service instance, complete the following quickstart: [Create an Azure API Management instance](get-started-create-service-instance.md).
[!INCLUDE [premium-dev-standard-basic.md](../../includes/api-management-availability-premium-dev-standard-basic.md)]
-## <a name="publisher-notifications"> </a>Configure notifications
-1. Select your **API MANAGEMENT** instance.
-2. Click **Notifications** to view the available notifications.
+## <a name="publisher-notifications"> </a>Configure notifications in the portal
- ![Publisher notifications][api-management-publisher-notifications]
+1. In the left navigation of your API Management instance, select **Notifications** to view the available notifications.
The following list of events can be configured for notifications.
- - **Subscription requests (requiring approval)** - The specified email recipients and users will receive email notifications about subscription requests for API products requiring approval.
- - **New subscriptions** - The specified email recipients and users will receive email notifications about new API product subscriptions.
- - **Application gallery requests** - The specified email recipients and users will receive email notifications when new applications are submitted to the application gallery.
+ - **Subscription requests (requiring approval)** - The specified email recipients and users will receive email notifications about subscription requests for products requiring approval.
+ - **New subscriptions** - The specified email recipients and users will receive email notifications about new product subscriptions.
+ - **Application gallery requests** (deprecated) - The specified email recipients and users will receive email notifications when new applications are submitted to the application gallery on the legacy developer portal.
- **BCC** - The specified email recipients and users will receive email blind carbon copies of all emails sent to developers.
- - **New issue or comment** - The specified email recipients and users will receive email notifications when a new issue or comment is submitted on the developer portal.
+ - **New issue or comment** (deprecated) - The specified email recipients and users will receive email notifications when a new issue or comment is submitted on the legacy developer portal.
- **Close account message** - The specified email recipients and users will receive email notifications when an account is closed.
- - **Approaching subscription quota limit** - The following email recipients and users will receive email notifications when subscription usage gets close to usage quota.
+ - **Approaching subscription quota limit** - The specified email recipients and users will receive email notifications when subscription usage gets close to usage quota.
> [!NOTE]
- > Notifications are triggered by the [quota by subscription](api-management-access-restriction-policies.md#SetUsageQuota) policy only. [Quota by key](api-management-access-restriction-policies.md#SetUsageQuotaByKey) policy doesn't generate notifications.
+ > Notifications are triggered by the [quota by subscription](api-management-access-restriction-policies.md#SetUsageQuota) policy only. The [quota by key](api-management-access-restriction-policies.md#SetUsageQuotaByKey) policy doesn't generate notifications.
- For each event, you can specify email recipients using the email address text box or you can select users from a list.
+1. Select a notification, and specify one or more email addresses to be notified:
+ * To add the administrator email address, select **+ Add admin**.
+ * To add another email address, select **+ Add email**, enter an email address, and select **Add**.
+ * Continue adding email addresses as needed.
-3. To specify the email addresses to be notified, enter them in the email address text box. If you have multiple email addresses, separate them using commas.
-
- ![Notification recipients][api-management-email-addresses]
-
-4. Press **Add**.
+ :::image type="content" source="media/api-management-howto-configure-notifications/api-management-email-addresses.png" alt-text="Screenshot showing how to add notification recipients in the portal":::
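If you prefer scripting to the portal, notification recipients can also be managed through the API Management REST API. The sketch below uses `az rest`; the subscription, resource group, instance name, and recipient address are hypothetical placeholders, and the `RequestPublisherNotificationMessage` notification name and `2021-08-01` API version are assumptions to verify against the REST API reference.

```azurecli
# Hypothetical values; replace with your own subscription, resource group, and instance name.
SUBSCRIPTION_ID="00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP="myResourceGroup"
SERVICE_NAME="myApimInstance"

# Add a recipient to the "Subscription requests (requiring approval)" notification.
az rest --method put \
  --url "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.ApiManagement/service/$SERVICE_NAME/notifications/RequestPublisherNotificationMessage/recipientEmails/admin@contoso.com?api-version=2021-08-01"
```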
## <a name="email-templates"> </a>Configure notification templates
-API Management provides notification templates for the email messages that are sent in the course of administering and using the service. The following email templates are provided.
+API Management provides notification templates for the administrative email messages that are sent automatically to developers when they access and use the service. The following notification templates are provided:
-- Application gallery submission approved
+- Application gallery submission approved (deprecated)
- Developer farewell letter
- Developer quota limit approaching notification
+- Developer welcome letter
+- Email change notification
- Invite user
-- New comment added to an issue
-- New issue received
+- New comment added to an issue (deprecated)
+- New developer account confirmation
+- New issue received (deprecated)
- New subscription activated
-- Subscription renewed confirmation
-- Subscription request declines
+- Password change confirmation
+- Subscription request declined
- Subscription request received
-These templates can be modified as desired.
+Each email template has a subject in plain text, and a body definition in HTML format. Each item can be customized as desired.
-To view and configure the email templates for your API Management instance, click **Notifications templates**.
+To view and configure a notification template in the portal:
-![Email templates][api-management-email-templates]
+1. In the left menu, select **Notification templates**.
+ :::image type="content" source="media/api-management-howto-configure-notifications/api-management-email-templates.png" alt-text="Screenshot of notification templates in the portal":::
-Each email template has a subject in plain text, and a body definition in HTML format. Each item can be customized as desired.
+1. Select a notification template, and configure the template using the editor.
+
+ :::image type="content" source="media/api-management-howto-configure-notifications/api-management-email-template.png" alt-text="Screenshot of notification template editor in the portal":::
+
+ * The **Parameters** list contains the parameters that, when inserted into the subject or body, are replaced by the designated value when the email is sent.
+ * To insert a parameter, place the cursor where you wish the parameter to go, and select the parameter name.
+
+1. To save the changes to the email template, select **Save**, or to cancel the changes select **Discard**.
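Templates can also be inspected outside the portal. As a rough sketch, assuming the `templates` collection of the API Management REST API and the placeholder variables from the earlier sketch, the following lists each template's subject, body, and parameters:

```azurecli
# List the notification templates, including their parameters, for the instance.
az rest --method get \
  --url "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.ApiManagement/service/$SERVICE_NAME/templates?api-version=2021-08-01"
```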
+
+## Configure email settings
+
+You can modify the general email settings for notifications that are sent from your API Management instance. You can change the administrator email address, the name of the organization sending the notifications, and the originating email address.
+
+To modify email settings:
-![Email template editor][api-management-email-template]
+1. In the left menu, select **Notification templates**.
+1. Select **E-mail settings**.
+1. On the **General email settings** page, enter values for:
+ * **Administrator email** - the email address to receive all system notifications and other configured notifications
+ * **Organization name** - the name of your organization for use in the developer portal and notifications
+ * **Originating email address** - the value of the `From` header for notifications from the API Management instance. API Management sends notifications on behalf of this originating address.
-The **Parameters** list contains a list of parameters, which when inserted into the subject or body, will be replaced the designated value when the email is sent. To insert a parameter, place the cursor where you wish the parameter to go, and click the arrow to the left of the parameter name.
+ :::image type="content" source="media/api-management-howto-configure-notifications/configure-email-settings.png" alt-text="Screenshot of API Management email settings in the portal":::
+1. Select **Save**.
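These values live on the API Management service resource itself. As an illustrative sketch only, assuming the resource properties are named `publisherEmail`, `publisherName`, and `notificationSenderEmail`, they could be set with a generic update using the placeholder variables from the earlier sketch:

```azurecli
# Update the administrator email, organization name, and originating address in one call.
az apim update --name $SERVICE_NAME --resource-group $RESOURCE_GROUP \
  --set publisherEmail="admin@contoso.com" \
        publisherName="Contoso" \
        notificationSenderEmail="apimgmt-noreply@contoso.com"
```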
-To save the changes to the email template, click **Save**, or to cancel the changes click **Discard**.
+## Next steps
-[api-management-management-console]: ./media/api-management-howto-configure-notifications/api-management-management-console.png
-[api-management-publisher-notifications]: ./media/api-management-howto-configure-notifications/api-management-publisher-notifications.png
-[api-management-email-addresses]: ./media/api-management-howto-configure-notifications/api-management-email-addresses.png
-[api-management-email-templates]: ./media/api-management-howto-configure-notifications/api-management-email-templates.png
-[api-management-email-templates-list]: ./media/api-management-howto-configure-notifications/api-management-email-templates-list.png
-[api-management-email-template]: ./media/api-management-howto-configure-notifications/api-management-email-template.png
-[configure publisher notifications]: #publisher-notifications
-[configure email templates]: #email-templates
-[how to create and use groups]: api-management-howto-create-groups.md
-[how to associate groups with developers]: api-management-howto-create-groups.md#associate-group-developer
-[get started with azure api management]: get-started-create-service-instance.md
-[create an api management service instance]: get-started-create-service-instance.md
+* [Overview of the developer portal](api-management-howto-developer-portal.md).
+* [How to create and use groups to manage developer accounts](api-management-howto-create-groups.md)
api-management Devops Api Development Templates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/api-management/devops-api-development-templates.md
API developers face challenges when working with Resource Manager templates:
A tool called [Creator](https://github.com/Azure/azure-api-management-devops-resource-kit/blob/main/src/README.md#creator) in the resource kit can help automate the creation of API templates based on an Open API Specification file. Additionally, developers can supply API Management policies for an API in XML format.
-* For customers who are already using API Management, another challenge is to extract existing configurations into Resource Manager templates. For those customers, a tool called [Extractor](https://github.com/Azure/azure-api-management-devops-resource-kit/blob/main/src/APIM_ARMTemplate/README.md#creator) in the resource kit can help generate templates by extracting configurations from their API Management instances.
+* For customers who are already using API Management, another challenge is to extract existing configurations into Resource Manager templates. For those customers, a tool called [Extractor](https://github.com/Azure/azure-api-management-devops-resource-kit/blob/main/src/README.md#Extractor) in the resource kit can help generate templates by extracting configurations from their API Management instances.
## Workflow
app-service Deploy Zip https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/deploy-zip.md
Publish-AzWebapp -ResourceGroupName <group-name> -Name <app-name> -ArchivePath <
The following example uses the cURL tool to deploy a .war, .jar, or .ear file. Replace the placeholders `<username>`, `<file-path>`, `<app-name>`, and `<package-type>` (`war`, `jar`, or `ear`, accordingly). When prompted by cURL, type in the [deployment password](deploy-configure-credentials.md).

```bash
-curl -X POST -u <username> --data-binary @"<file-path>" https://<app-name>.scm.azurewebsites.net/api/publish&type=<package-type>
+curl -X POST -u <username> --data-binary @"<file-path>" https://<app-name>.scm.azurewebsites.net/api/publish?type=<package-type>
```

[!INCLUDE [deploying to network secured sites](../../includes/app-service-deploy-network-secured-sites.md)]
app-service App Service App Service Environment Control Inbound Traffic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-app-service-environment-control-inbound-traffic.md
ms.assetid: 4cc82439-8791-48a4-9485-de6d8e1d1a08 Previously updated : 03/15/2022 Last updated : 03/29/2022
# How To Control Inbound Traffic to an App Service Environment > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> ## Overview
app-service App Service App Service Environment Create Ilb Ase Resourcemanager https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-app-service-environment-create-ilb-ase-resourcemanager.md
ms.assetid: 091decb6-b0de-42a1-9f2f-c18d9b2e67df Previously updated : 03/15/2022 Last updated : 03/29/2022
# How To Create an ILB ASEv1 Using Azure Resource Manager Templates > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> ## Overview
app-service App Service App Service Environment Intro https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-app-service-environment-intro.md
ms.assetid: 78e6d4f5-da46-4eb5-a632-b5fdc17d2394 Previously updated : 03/15/2022 Last updated : 03/29/2022
# Introduction to App Service Environment v1 > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> ## Overview
app-service App Service App Service Environment Layered Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-app-service-environment-layered-security.md
ms.assetid: 73ce0213-bd3e-4876-b1ed-5ecad4ad5601 Previously updated : 03/15/2022 Last updated : 03/29/2022
# Implementing a Layered Security Architecture with App Service Environments > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> Since App Service Environments provide an isolated runtime environment deployed into a virtual network, developers can create a layered security architecture providing differing levels of network access for each physical application tier.
app-service App Service App Service Environment Network Architecture Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-app-service-environment-network-architecture-overview.md
ms.assetid: 13d03a37-1fe2-4e3e-9d57-46dfb330ba52 Previously updated : 03/15/2022 Last updated : 03/29/2022
# Network Architecture Overview of App Service Environments > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> App Service Environments are always created within a subnet of a [virtual network][virtualnetwork] - apps running in an App Service Environment can communicate with private endpoints located within the same virtual network topology. Since customers may lock down parts of their virtual network infrastructure, it is important to understand the types of network communication flows that occur with an App Service Environment.
app-service App Service App Service Environment Network Configuration Expressroute https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-app-service-environment-network-configuration-expressroute.md
ms.assetid: 34b49178-2595-4d32-9b41-110c96dde6bf Previously updated : 03/15/2022 Last updated : 03/29/2022
# Network configuration details for App Service Environment for Power Apps with Azure ExpressRoute > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> Customers can connect an [Azure ExpressRoute][ExpressRoute] circuit to their virtual network infrastructure to extend their on-premises network to Azure. App Service Environment is created in a subnet of the [virtual network][virtualnetwork] infrastructure. Apps that run on App Service Environment establish secure connections to back-end resources that are accessible only over the ExpressRoute connection.
app-service App Service App Service Environment Securely Connecting To Backend Resources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-app-service-environment-securely-connecting-to-backend-resources.md
ms.assetid: f82eb283-a6e7-4923-a00b-4b4ccf7c4b5b Previously updated : 03/15/2022 Last updated : 03/29/2022
# Connect securely to back end resources from an App Service environment > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> Since an App Service Environment is always created in **either** an Azure Resource Manager virtual network, **or** a classic deployment model [virtual network][virtualnetwork], outbound connections from an App Service Environment to other backend resources can flow exclusively over the virtual network. As of June 2016, ASEs can also be deployed into virtual networks that use either public address ranges or RFC1918 address spaces (private addresses).
app-service App Service Environment Auto Scale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-environment-auto-scale.md
ms.assetid: c23af2d8-d370-4b1f-9b3e-8782321ddccb Previously updated : 03/15/2022 Last updated : 03/29/2022
# Autoscaling and App Service Environment v1 > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> Azure App Service environments support *autoscaling*. You can autoscale individual worker pools based on metrics or schedule.
app-service App Service Web Configure An App Service Environment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-web-configure-an-app-service-environment.md
ms.assetid: b5a1da49-4cab-460d-b5d2-edd086ec32f4 Previously updated : 03/15/2022 Last updated : 03/29/2022
# Configuring an App Service Environment v1 > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> ## Overview
app-service App Service Web Scale A Web App In An App Service Environment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/app-service-web-scale-a-web-app-in-an-app-service-environment.md
ms.assetid: 78eb1e49-4fcd-49e7-b3c7-f1906f0f22e3 Previously updated : 03/15/2022 Last updated : 03/29/2022
# Scaling apps in an App Service Environment v1 > [!IMPORTANT]
-> This article is about App Service Environment v1. App Service Environment v1 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v1. [App Service Environment v1 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v1, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> In the Azure App Service there are normally three things you can scale:
app-service Certificates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/certificates.md
Title: Certificates bindings
description: Explain numerous topics related to certificates on an App Service Environment v2. Learn how certificate bindings work on the single-tenanted apps in an ASE. Previously updated : 03/15/2022 Last updated : 03/29/2022 # Certificates and the App Service Environment v2 > [!IMPORTANT]
-> This article is about App Service Environment v2 which is used with Isolated App Service plans. App Service Environment v2 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v2 which is used with Isolated App Service plans. [App Service Environment v2 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> The App Service Environment (ASE) is a deployment of the Azure App Service that runs within your Azure Virtual Network (VNet). It can be deployed with an internet accessible application endpoint or an application endpoint that is in your VNet. If you deploy the ASE with an internet accessible endpoint, that deployment is called an External ASE. If you deploy the ASE with an endpoint in your VNet, that deployment is called an ILB ASE. You can learn more about the ILB ASE from the [Create and use an ILB ASE](./create-ilb-ase.md) document.
app-service Configure Network Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/configure-network-settings.md
+
+ Title: Configure App Service Environment v3 network settings
+description: Configure network settings that apply to the entire Azure App Service environment. Learn how to do it with Azure Resource Manager templates.
+
+keywords: ASE, ASEv3, ftp, remote debug
+ Last updated : 03/29/2022
+# Network configuration settings
+
+Because App Service Environments are isolated to the individual customer, there are certain configuration settings that can be applied exclusively to App Service Environments. This article documents the various specific network customizations that are available for App Service Environment v3.
+
+> [!NOTE]
+> This article is about App Service Environment v3, which is used with isolated v2 App Service plans.
+
+If you don't have an App Service Environment, see [How to Create an App Service Environment v3](./creation.md).
+
+App Service Environment network customizations are stored in a subresource of the *hostingEnvironments* Azure Resource Manager entity called networking.
+
+The following abbreviated Resource Manager template snippet shows the **networking** resource:
+
+```json
+"resources": [
+{
+ "apiVersion": "2021-03-01",
+ "type": "Microsoft.Web/hostingEnvironments",
+ "name": "[parameter('aseName')]",
+ "location": ...,
+ "properties": {
+ "internalLoadBalancingMode": ...,
+ etc...
+ },
+ "resources": [
+ {
+ "type": "configurations",
+ "apiVersion": "2021-03-01",
+ "name": "networking",
+ "dependsOn": [
+ "[resourceId('Microsoft.Web/hostingEnvironments', parameters('aseName'))]"
+ ],
+ "properties": {
+ "remoteDebugEnabled": true,
+ "ftpEnabled": true,
+ "allowNewPrivateEndpointConnections": true
+ }
+ }
+ ]
+}
+]
+```
+
+The **networking** resource can be included in a Resource Manager template to update the App Service Environment.
+
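To apply the settings in the snippet, the template can be deployed like any other Resource Manager template. For example, a minimal sketch with a hypothetical file name and parameter value:

```azurecli
# Deploy the abbreviated template above (saved as ase-networking.json) to the target resource group.
az deployment group create --resource-group myResourceGroup \
  --template-file ase-networking.json \
  --parameters aseName=myAseName
```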
+## Configure using Azure Resource Explorer
+Alternatively, you can update the App Service Environment by using [Azure Resource Explorer](https://resources.azure.com).
+
+1. In Resource Explorer, go to the node for the App Service Environment (**subscriptions** > **{your Subscription}** > **resourceGroups** > **{your Resource Group}** > **providers** > **Microsoft.Web** > **hostingEnvironments** > **App Service Environment name** > **configurations** > **networking**).
+2. Select **Read/Write** in the upper toolbar to allow interactive editing in Resource Explorer.
+3. Select the blue **Edit** button to make the Resource Manager template editable.
+4. Modify one or more of the settings that you want to change: *ftpEnabled*, *remoteDebugEnabled*, or *allowNewPrivateEndpointConnections*.
+5. Select the green **PUT** button that's located at the top of the right pane to commit the change to the App Service Environment.
+6. You may need to select the green **GET** button again to see the changed values.
+
+The change takes effect within a minute.
+
+## Allow new private endpoint connections
+
+For apps hosted on both ILB and External App Service Environment, you can allow creation of private endpoints. The setting is disabled by default. If private endpoints were created while the setting was enabled, they won't be deleted and will continue to work. The setting only prevents new private endpoints from being created.
+
+The following Azure CLI commands enable and then verify the *allowNewPrivateEndpointConnections* setting:
+
+```azurecli
+ASE_NAME="[myAseName]"
+RESOURCE_GROUP_NAME="[myResourceGroup]"
+az appservice ase update --name $ASE_NAME -g $RESOURCE_GROUP_NAME --allow-new-private-endpoint-connection true
+
+az appservice ase list-addresses --name $ASE_NAME -g $RESOURCE_GROUP_NAME --query properties.allowNewPrivateEndpointConnections
+```
+
+The setting is also available for configuration through the Azure portal in the App Service Environment configuration:
++
+## FTP access
+
+The *ftpEnabled* setting allows you to allow or deny FTP connections at the App Service Environment level. Individual apps still need to configure FTP access. If you enable FTP at the App Service Environment level, you may want to [enforce FTPS](../deploy-ftp.md?tabs=cli#enforce-ftps) at the individual app level. The setting is disabled by default.
+
+If you want to enable and verify FTP access, you can run the following Azure CLI commands:
+
+```azurecli
+ASE_NAME="[myAseName]"
+RESOURCE_GROUP_NAME="[myResourceGroup]"
+az resource update --name $ASE_NAME/configurations/networking --set properties.ftpEnabled=true -g $RESOURCE_GROUP_NAME --resource-type "Microsoft.Web/hostingEnvironments/networkingConfiguration"
+
+az resource show --name $ASE_NAME/configurations/networking -g $RESOURCE_GROUP_NAME --resource-type "Microsoft.Web/hostingEnvironments/networkingConfiguration" --query properties.ftpEnabled
+```
+
+In addition to enabling access, you need to ensure that you have [configured DNS if you're using an ILB App Service Environment](./networking.md#dns-configuration-for-ftp-access).
+
+## Remote debugging access
+
+Remote debugging is disabled by default at the App Service Environment level. You can enable network-level access for all apps using this configuration. You'll still have to [configure remote debugging](../configure-common.md?tabs=cli#configure-general-settings) at the individual app level.
+
+Run the following Azure CLI commands to enable and verify remote debugging access:
+
+```azurecli
+ASE_NAME="[myAseName]"
+RESOURCE_GROUP_NAME="[myResourceGroup]"
+az resource update --name $ASE_NAME/configurations/networking --set properties.remoteDebugEnabled=true -g $RESOURCE_GROUP_NAME --resource-type "Microsoft.Web/hostingEnvironments/networkingConfiguration"
+
+az resource show --name $ASE_NAME/configurations/networking -g $RESOURCE_GROUP_NAME --resource-type "Microsoft.Web/hostingEnvironments/networkingConfiguration" --query properties.remoteDebugEnabled
+```
+
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Create an App Service Environment from a template](create-from-template.md)
+
+> [!div class="nextstepaction"]
+> [Deploy your app to Azure App Service using FTP](../deploy-ftp.md)
app-service Create External Ase https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/create-external-ase.md
Title: Create an external ASE
description: Learn how to create an App Service environment with an app in it, or create a standalone (empty) ASE. Previously updated : 03/15/2022 Last updated : 03/29/2022 # Create an External App Service Environment > [!IMPORTANT]
-> This article is about App Service Environment v2 which is used with Isolated App Service plans. App Service Environment v2 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v2 which is used with Isolated App Service plans. [App Service Environment v2 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> Azure App Service Environment is a deployment of Azure App Service into a subnet in an Azure virtual network (VNet). There are two ways to deploy an App Service Environment (ASE):
app-service Create Ilb Ase https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/create-ilb-ase.md
description: Learn how to create an App Service environment with an internal loa
ms.assetid: 0f4c1fa4-e344-46e7-8d24-a25e247ae138 Previously updated : 03/15/2022 Last updated : 03/29/2022
# Create and use an Internal Load Balancer App Service Environment > [!IMPORTANT]
-> This article is about App Service Environment v2 which is used with Isolated App Service plans. App Service Environment v2 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v2 which is used with Isolated App Service plans. [App Service Environment v2 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> The Azure App Service Environment is a deployment of Azure App Service into a subnet in an Azure virtual network (VNet). There are two ways to deploy an App Service Environment (ASE):
app-service Firewall Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/firewall-integration.md
description: Learn how to integrate with Azure Firewall to secure outbound traff
ms.assetid: 955a4d84-94ca-418d-aa79-b57a5eb8cb85 Previously updated : 03/15/2022 Last updated : 03/29/2022
# Locking down an App Service Environment > [!IMPORTANT]
-> This article is about App Service Environment v2 which is used with Isolated App Service plans. App Service Environment v2 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v2 which is used with Isolated App Service plans. [App Service Environment v2 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> The App Service Environment (ASE) has many external dependencies that it requires access to in order to function properly. The ASE lives in the customer Azure Virtual Network. Customers must allow the ASE dependency traffic, which is a problem for customers that want to lock down all egress from their virtual network.
app-service Forced Tunnel Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/forced-tunnel-support.md
description: Learn how to enable your App Service Environment to work when outbo
ms.assetid: 384cf393-5c63-4ffb-9eb2-bfd990bc7af1 Previously updated : 03/15/2022 Last updated : 03/29/2022
# Configure your App Service Environment with forced tunneling > [!IMPORTANT]
-> This article is about App Service Environment v2 which is used with Isolated App Service plans. App Service Environment v2 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v2 which is used with Isolated App Service plans. [App Service Environment v2 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> The App Service Environment (ASE) is a deployment of Azure App Service in a customer's Azure Virtual Network. Many customers configure their Azure virtual networks to be extensions of their on-premises networks with VPNs or Azure ExpressRoute connections. Forced tunneling is when you redirect internet bound traffic to your VPN or a virtual appliance instead. Virtual appliances are often used to inspect and audit outbound network traffic.
app-service Intro https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/intro.md
Title: Introduction to ASEv2
description: Learn how Azure App Service Environments v2 help you scale, secure, and optimize your apps in a fully isolated and dedicated environment. Previously updated : 03/15/2022 Last updated : 03/29/2022 # Introduction to App Service Environment v2 > [!IMPORTANT]
-> This article is about App Service Environment v2 which is used with Isolated App Service plans. App Service Environment v2 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v2 which is used with Isolated App Service plans. [App Service Environment v2 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> ## Overview
app-service Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/migrate.md
Title: Migrate to App Service Environment v3 by using the migration feature
description: Overview of the migration feature for migration to App Service Environment v3 Previously updated : 3/14/2022 Last updated : 3/29/2022
With the current version of the migration feature, your new App Service Environm
Note that App Service Environment v3 doesn't currently support the following features that you may be using with your current App Service Environment. If you require any of these features, don't migrate until they're supported.

- Sending SMTP traffic. You can still have email triggered alerts but your app can't send outbound traffic on port 25.
-- Deploying your apps with FTP.
-- Using remote debug with your apps.
- Monitoring your traffic with Network Watcher or NSG Flow.
- Configuring an IP-based TLS/SSL binding with your apps.
There's no cost to migrate your App Service Environment. You'll stop being charg
- **What happens to my old App Service Environment?** If you decide to migrate an App Service Environment, the old environment gets shut down and deleted and all of your apps are migrated to a new environment. Your old environment will no longer be accessible. - **What will happen to my App Service Environment v1/v2 resources after 31 August 2024?**
- After 31 August 2024, if you haven't migrated to App Service Environment v3, your App Service Environment v1/v2s and the apps deployed in them will no longer be available. App Service Environment v1/v2 is hosted on App Service scale units running on [Cloud Services (classic)](../../cloud-services/cloud-services-choose-me.md) architecture that will be [retired on 31 August 2024](https://azure.microsoft.com/updates/cloud-services-retirement-announcement/). Because of this, App Service Environment v1/v2 will no longer be available after that date. Migrate to App Service Environment v3 to keep your apps running or save or back up any resources or data that you need to maintain.
+ After 31 August 2024, if you haven't migrated to App Service Environment v3, your App Service Environment v1/v2s and the apps deployed in them will no longer be available. App Service Environment v1/v2 is hosted on App Service scale units running on [Cloud Services (classic)](../../cloud-services/cloud-services-choose-me.md) architecture that will be [retired on 31 August 2024](https://azure.microsoft.com/updates/cloud-services-retirement-announcement/). Because of this, [App Service Environment v1/v2 will no longer be available after that date](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). Migrate to App Service Environment v3 to keep your apps running or save or back up any resources or data that you need to maintain.
## Next steps
app-service Migration Alternatives https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/migration-alternatives.md
Once your migration and any testing with your new environment is complete, delet
- **What properties of my App Service Environment will change?** You'll now be on App Service Environment v3 so be sure to review the [features and feature differences](overview.md#feature-differences) compared to previous versions. For ILB App Service Environment, you'll keep the same ILB IP address. For internet facing App Service Environment, the public IP address and the outbound IP address will change. Note for internet facing App Service Environment, previously there was a single IP for both inbound and outbound. For App Service Environment v3, they're separate. For more information, see [App Service Environment v3 networking](networking.md#addresses). - **What will happen to my App Service Environment v1/v2 resources after 31 August 2024?**
- After 31 August 2024, if you haven't migrated to App Service Environment v3, your App Service Environment v1/v2s and the apps deployed in them will no longer be available. App Service Environment v1/v2 is hosted on App Service scale units running on [Cloud Services (classic)](../../cloud-services/cloud-services-choose-me.md) architecture that will be [retired on 31 August 2024](https://azure.microsoft.com/updates/cloud-services-retirement-announcement/). Because of this, App Service Environment v1/v2 will no longer be available after that date. Migrate to App Service Environment v3 to keep your apps running or save or back up any resources or data that you need to maintain.
+ After 31 August 2024, if you haven't migrated to App Service Environment v3, your App Service Environment v1/v2s and the apps deployed in them will no longer be available. App Service Environment v1/v2 is hosted on App Service scale units running on [Cloud Services (classic)](../../cloud-services/cloud-services-choose-me.md) architecture that will be [retired on 31 August 2024](https://azure.microsoft.com/updates/cloud-services-retirement-announcement/). Because of this, [App Service Environment v1/v2 will no longer be available after that date](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). Migrate to App Service Environment v3 to keep your apps running or save or back up any resources or data that you need to maintain.
## Next steps
app-service Network Info https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/network-info.md
Title: Networking considerations
description: Learn about App Service Environment network traffic, and how to set network security groups and user-defined routes. Previously updated : 03/15/2022 Last updated : 03/29/2022 # Networking considerations for App Service Environment > [!IMPORTANT]
-> This article is about App Service Environment v2 which is used with Isolated App Service plans. App Service Environment v2 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v2 which is used with Isolated App Service plans. [App Service Environment v2 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> [App Service Environment][Intro] is a deployment of Azure App Service into a subnet in your Azure virtual network. There are two deployment types for an App Service Environment:
app-service Networking https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/networking.md
For your app to receive traffic, ensure that inbound network security group (NSG
It's a good idea to configure the following inbound NSG rule:
-|Port|Source|Destination|
-|-|-|-|
-|80,443|Virtual network|App Service Environment subnet range|
+|Source / Destination Port(s)|Direction|Source|Destination|Purpose|
+|-|-|-|-|-|
+|* / 80,443|Inbound|VirtualNetwork|App Service Environment subnet range|Allow app traffic and internal health ping traffic|
The minimal requirement for App Service Environment to be operational is:
-|Port|Source|Destination|
-|-|-|-|
-|80|Azure Load Balancer|App Service Environment subnet range|
+|Source / Destination Port(s)|Direction|Source|Destination|Purpose|
+|-|-|-|-|-|
+|* / 80|Inbound|AzureLoadBalancer|App Service Environment subnet range|Allow internal health ping traffic|
If you use the minimum required rule, you might need one or more rules for your application traffic. If you're using any of the deployment or debugging options, you must also allow this traffic to the App Service Environment subnet. The source of these rules can be the virtual network, or one or more specific client IPs or IP ranges. The destination is always the App Service Environment subnet range.
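For reference, a sketch of the recommended inbound rule with the Azure CLI; the NSG name, rule priority, and subnet range below are hypothetical values:

```azurecli
# Allow app traffic and internal health ping traffic from the virtual network
# to a hypothetical App Service Environment subnet range of 10.0.0.0/24.
az network nsg rule create --resource-group myResourceGroup --nsg-name myAseNsg \
  --name Allow-ASE-Inbound --priority 100 --direction Inbound --access Allow --protocol Tcp \
  --source-address-prefixes VirtualNetwork --source-port-ranges '*' \
  --destination-address-prefixes 10.0.0.0/24 --destination-port-ranges 80 443
```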
+The internal health ping traffic on port 80 is isolated between the load balancer and the internal servers. No outside traffic can reach the health ping endpoint.
-The normal app access ports are as follows:
+The normal app access ports inbound are as follows:
|Use|Ports| |-|-|
To configure DNS in Azure DNS private zones:
In addition to the default domain provided when an app is created, you can also add a custom domain to your app. You can set a custom domain name without any validation on your apps. If you're using custom domains, you need to ensure they have DNS records configured. You can follow the preceding guidance to configure DNS zones and records for a custom domain name (simply replace the default domain name with the custom domain name). The custom domain name works for app requests, but doesn't work for the `scm` site. The `scm` site is only available at *&lt;appname&gt;.scm.&lt;asename&gt;.appserviceenvironment.net*.
+### DNS configuration for FTP access
+
+For FTP access to an internal load balancer (ILB) App Service Environment v3 specifically, you need to ensure DNS is configured. Configure an Azure DNS private zone or equivalent custom DNS with the following settings:
+
+1. Create an Azure DNS private zone named `ftp.appserviceenvironment.net`.
+1. Create an A record in that zone that points `<App Service Environment-name>` to the inbound IP address.
+
+In addition to setting up DNS, you also need to enable it in the [App Service Environment configuration](./configure-network-settings.md#ftp-access) as well as at the [app level](../deploy-ftp.md?tabs=cli#enforce-ftps).
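A minimal Azure CLI sketch of the two DNS steps above, assuming hypothetical resource group, virtual network, and inbound IP values:

```azurecli
RESOURCE_GROUP_NAME="myResourceGroup"
VNET_NAME="myAseVnet"
ASE_NAME="myAseName"
INBOUND_IP="10.0.0.11"

# Step 1: create the private zone and link it to the virtual network hosting the App Service Environment.
az network private-dns zone create -g $RESOURCE_GROUP_NAME -n ftp.appserviceenvironment.net
az network private-dns link vnet create -g $RESOURCE_GROUP_NAME -z ftp.appserviceenvironment.net \
  -n ase-ftp-link -v $VNET_NAME -e false

# Step 2: create an A record that points the App Service Environment name to the inbound IP address.
az network private-dns record-set a add-record -g $RESOURCE_GROUP_NAME -z ftp.appserviceenvironment.net \
  -n $ASE_NAME -a $INBOUND_IP
```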
+ ### DNS configuration from your App Service Environment The apps in your App Service Environment will use the DNS that your virtual network is configured with. If you want some apps to use a different DNS server, you can manually set it on a per app basis, with the app settings `WEBSITE_DNS_SERVER` and `WEBSITE_DNS_ALT_SERVER`. `WEBSITE_DNS_ALT_SERVER` configures the secondary DNS server. The secondary DNS server is only used when there is no response from the primary DNS server.
app-service Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/overview.md
Title: App Service Environment overview
description: This article discusses the Azure App Service Environment feature of Azure App Service. Previously updated : 01/26/2022 Last updated : 03/29/2022
App Service Environment v3 differs from earlier versions in the following ways:
A few features that were available in earlier versions of App Service Environment aren't available in App Service Environment v3. For example, you can no longer do the following:

- Send SMTP traffic. You can still have email triggered alerts but your app can't send outbound traffic on port 25.
-- Deploy your apps by using FTP.
-- Use remote debugging with your apps.
- Monitor your traffic with Network Watcher or network security group (NSG) flow logs.
- Configure an IP-based Transport Layer Security (TLS) or Secure Sockets Layer (SSL) binding with your apps.
- Configure a custom domain suffix.
App Service Environment v3 is available in the following regions:
## App Service Environment v2 App Service Environment has three versions: App Service Environment v1, App Service Environment v2, and App Service Environment v3. The information in this article is based on App Service Environment v3. To learn more about App Service Environment v2, see [App Service Environment v2 introduction](./intro.md).
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Whitepaper on Using App Service Environment v3 in Compliance-Oriented Industries](https://azure.microsoft.com/resources/using-app-service-environment-v3-in-compliance-oriented-industries/)
app-service Using An Ase https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/using-an-ase.md
description: Learn how to create, publish, and scale apps in an App Service Envi
ms.assetid: a22450c4-9b8b-41d4-9568-c4646f4cf66b Previously updated : 03/15/2022 Last updated : 03/29/2022 # Manage an App Service Environment > [!IMPORTANT]
-> This article is about App Service Environment v2 which is used with Isolated App Service plans. App Service Environment v2 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v2 which is used with Isolated App Service plans. [App Service Environment v2 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> An App Service Environment (ASE) is a deployment of Azure App Service into a subnet in a customer's Azure Virtual Network instance. An ASE consists of:
app-service Zone Redundancy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/zone-redundancy.md
Title: Availability Zone support for App Service Environment v2
description: Learn how to deploy your App Service Environments so that your apps are zone redundant. Previously updated : 03/15/2022 Last updated : 03/29/2022 # Availability Zone support for App Service Environment v2 > [!IMPORTANT]
-> This article is about App Service Environment v2 which is used with Isolated App Service plans. App Service Environment v2 will be retired on 31 August 2024. There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
+> This article is about App Service Environment v2 which is used with Isolated App Service plans. [App Service Environment v2 will be retired on 31 August 2024](https://azure.microsoft.com/updates/app-service-environment-v1-and-v2-retirement-announcement/). There's a new version of App Service Environment that is easier to use and runs on more powerful infrastructure. To learn more about the new version, start with the [Introduction to the App Service Environment](overview.md). If you're currently using App Service Environment v2, please follow the steps in [this article](migration-alternatives.md) to migrate to the new version.
> App Service Environment v2 (ASE) can be deployed into Availability Zones (AZ). Customers can deploy an internal load balancer (ILB) ASE into a specific AZ within an Azure region. If you pin your ILB ASE to a specific AZ, the resources used by the ILB ASE will either be pinned to the specified AZ or deployed in a zone-redundant manner.
automation Automation Security Guidelines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-security-guidelines.md
Review the Azure Policy recommendations for Azure Automation and act as appropri
## Next steps * To learn how to use Azure role-based access control (Azure RBAC), see [Manage role permissions and security in Azure Automation](/azure/automation/automation-role-based-access-control).
-* For information on how Azure protects your privacy and secures your data, see [Azure Automation data security](./automation-managing-data.md).
+* For information on how Azure protects your privacy and secures your data, see [Azure Automation data security](/azure/automation/automation-managing-data).
* To learn about configuring the Automation account to use encryption, see [Encryption of secure assets in Azure Automation](/azure/automation/automation-secure-asset-encryption).
availability-zones Az Region https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/availability-zones/az-region.md
In the Product Catalog, always-available services are listed as "non-regional" s
| **Products** | **Resiliency** |
| | |
+| [Azure HPC Cache](../hpc-cache/hpc-cache-overview.md) | ![An icon that signifies this service is zonal.](media/icon-zonal.svg) |
| [Azure IoT Hub Device Provisioning Service](../iot-dps/about-iot-dps.md) | ![An icon that signifies this service is zone redundant.](media/icon-zone-redundant.svg) |
| Azure Red Hat OpenShift | ![An icon that signifies this service is zone redundant.](media/icon-zone-redundant.svg) ![An icon that signifies this service is zonal](media/icon-zonal.svg) |
| [Azure Managed Instance for Apache Cassandra](../managed-instance-apache-cassandr) | ![An icon that signifies this service is zone redundant.](media/icon-zone-redundant.svg) |
In the Product Catalog, always-available services are listed as "non-regional" s
### ![An icon that signifies this service is non-regional.](media/icon-always-available.svg) Non-regional services (always-available services)
-| **Products** | **Resiliency** |
+| **Products** | **Resiliency** |
| | |
| Azure Active Directory | ![An icon that signifies this service is always available.](media/icon-always-available.svg) |
| Azure Advanced Threat Protection | ![An icon that signifies this service is always available.](media/icon-always-available.svg) |
azure-arc Concept Log Analytics Extension Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/concept-log-analytics-extension-deployment.md
Title: Deploy Log Analytics agent on Arc-enabled servers description: This article reviews the different methods to deploy the Log Analytics agent on Windows and Linux-based machines registered with Azure Arc-enabled servers in your local datacenter or other cloud environment. Previously updated : 10/22/2021 Last updated : 3/18/2022
Azure Monitor supports multiple methods to install the Log Analytics agent and c
The Log Analytics agent is required if you want to:
-* Monitor the operating system, any workloads running on the machine or server using [VM insights](../../azure-monitor/vm/vminsights-overview.md). Further analyze and alert using other features of [Azure Monitor](../../azure-monitor/overview.md).
+* Monitor the operating system and any workloads running on the machine or server using [VM insights](../../azure-monitor/vm/vminsights-overview.md).
+* Analyze and alert using [Azure Monitor](../../azure-monitor/overview.md).
* Perform security monitoring in Azure by using [Microsoft Defender for Cloud](../../defender-for-cloud/defender-for-cloud-introduction.md) or [Microsoft Sentinel](../../sentinel/overview.md).
* Manage operating system updates by using [Azure Automation Update Management](../../automation/update-management/overview.md).
* Collect inventory and track changes by using [Azure Automation Change Tracking and Inventory](../../automation/change-tracking/overview.md).
* Run Automation runbooks directly on the machine and against resources in the environment by using an [Azure Automation Hybrid Runbook Worker](../../automation/automation-hybrid-runbook-worker.md).
-This article reviews the deployment methods for the Log Analytics agent VM extension, across multiple production physical servers or virtual machines in your environment, to help you determine which works best for your organization. If you are interested in the new Azure Monitor agent and want to see a detailed comparison, then review the [Azure Monitor agents overview](../../azure-monitor//agents/agents-overview.md) article.
+This article reviews the deployment methods for the Log Analytics agent VM extension, across multiple production physical servers or virtual machines in your environment, to help you determine which works best for your organization. If you are interested in the new Azure Monitor agent and want to see a detailed comparison, see [Azure Monitor agents overview](../../azure-monitor/agents/agents-overview.md).
## Installation options
-You can use different methods to install the VM extension using one method or a combination. This section describes each one for you to consider.
+Review the different methods to install the VM extension, using a single method or a combination, and determine which works best for your scenario.
-### Using Arc-enabled servers
+### Use Azure Arc-enabled servers
This method supports managing the installation, management, and removal of VM extensions from the [Azure portal](manage-vm-extensions-portal.md), using [PowerShell](manage-vm-extensions-powershell.md), the [Azure CLI](manage-vm-extensions-cli.md), or with an [Azure Resource Manager (ARM) template](manage-vm-extensions-template.md).
This method supports managing the installation, management, and removal of VM ex
#### Disadvantages
-* Limited automation when using an Azure Resource Manager template, otherwise it is time consuming.
+* Limited automation when using an Azure Resource Manager template.
* Can only focus on a single Arc-enabled server, and not multiple instances.
* Only supports specifying a single workspace to report to. Requires using PowerShell or the Azure CLI to configure the Log Analytics Windows agent VM extension to report to up to four workspaces (see the sketch after this list).
* Doesn't support deploying the Dependency agent from the portal. You can only use PowerShell, the Azure CLI, or an ARM template.
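As referenced in the list above, pointing the extension at a workspace can be scripted. A sketch using the Azure CLI (this assumes the `connectedmachine` CLI extension is installed); the machine, resource group, location, and workspace values are placeholders:

```azurecli
# Install the Log Analytics agent VM extension on an Arc-enabled Linux server
# and configure it to report to a specific workspace.
az connectedmachine extension create \
  --machine-name my-arc-server \
  --resource-group my-rg \
  --name OmsAgentForLinux \
  --location eastus \
  --publisher Microsoft.EnterpriseCloud.Monitoring \
  --type OmsAgentForLinux \
  --settings '{"workspaceId": "<workspace-id>"}' \
  --protected-settings '{"workspaceKey": "<workspace-key>"}'
```

For a Windows machine, the extension name and type would be `MicrosoftMonitoringAgent` instead.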
-### Using Azure Policy
+### Use Azure Policy
You can use Azure Policy to deploy the Log Analytics agent VM extension at-scale to machines in your environment, and maintain configuration compliance. This is accomplished by using either the **Configure Log Analytics extension on Azure Arc enabled Linux servers** / **Configure Log Analytics extension on Azure Arc enabled Windows servers** policy definition, or the **Enable Azure Monitor for VMs** policy initiative.
Azure Policy includes several prebuilt definitions related to Azure Monitor. For
#### Disadvantages
-* The **Configure Log Analytics extension on Azure Arc enabled** *operating system* **servers** policy only installs the Log Analytics VM extension and configures the agent to report to a specified Log Analytics workspace. If you are interested in VM insights to monitor the operating system performance, and map running processes and dependencies on other resources, then you should apply the policy initiative **Enable Azure Monitor for VMs**. It installs and configures both the Log Analytics VM extension and the Dependency agent VM extension, which are required.
+* The **Configure Log Analytics extension on Azure Arc enabled** *operating system* **servers** policy only installs the Log Analytics VM extension and configures the agent to report to a specified Log Analytics workspace. If you want VM insights to monitor the operating system performance, and map running processes and dependencies on other resources, apply the policy initiative **Enable Azure Monitor for VMs**. It installs and configures both the Log Analytics VM extension and the Dependency agent VM extension, which are required.
* Standard compliance evaluation cycle is once every 24 hours. An evaluation scan for a subscription or a resource group can be started with Azure CLI, Azure PowerShell, a call to the REST API, or by using the Azure Policy Compliance Scan GitHub Action. For more information, see [Evaluation triggers](../../governance/policy/how-to/get-compliance-data.md#evaluation-triggers).
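For example, an on-demand evaluation scan can be started from the Azure CLI; the resource group name is a placeholder:

```azurecli
# Trigger an Azure Policy compliance evaluation for one resource group
# instead of waiting for the standard 24-hour cycle.
az policy state trigger-scan --resource-group my-rg
```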
-### Using Azure Automation
+### Use Azure Automation
-The process automation operating environment in Azure Automation and its support for PowerShell and Python runbooks can enable you to automate the deployment of the Log Analytics agent VM extension at-scale to machines in your environment.
+The process automation operating environment in Azure Automation and its support for PowerShell and Python runbooks can help you automate the deployment of the Log Analytics agent VM extension at scale to machines in your environment.
#### Advantages
The process automation operating environment in Azure Automation and its support
* Requires an Azure Automation account. * Experience authoring and managing runbooks in Azure Automation.
-* Creating a runbook based on PowerShell or Python depending on the target operating system.
+* Must create a runbook based on PowerShell or Python, depending on the target operating system.
## Next steps
-* To manage operating system updates using Azure Automation Update Management, review [Enable from an Automation account](../../automation/update-management/enable-from-automation-account.md) and then follow the steps to enable machines reporting to the workspace.
+* To manage operating system updates using Azure Automation Update Management, see [Enable from an Automation account](../../automation/update-management/enable-from-automation-account.md) and then follow the steps to enable machines reporting to the workspace.
-* To track changes using Azure Automation Change Tracking and Inventory, review [Enable from an Automation account](../../automation/change-tracking/enable-from-automation-account.md) and then follow the steps to enable machines reporting to the workspace.
+* To track changes using Azure Automation Change Tracking and Inventory, see [Enable from an Automation account](../../automation/change-tracking/enable-from-automation-account.md) and then follow the steps to enable machines reporting to the workspace.
-* You can use the user Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on servers or machines registered with Arc-enabled servers. See the [Deploy Hybrid Runbook Worker VM extension](../../automation/extension-based-hybrid-runbook-worker-install.md) article.
+* Use the Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on servers or machines registered with Arc-enabled servers. See the [Deploy Hybrid Runbook Worker VM extension](../../automation/extension-based-hybrid-runbook-worker-install.md) article.
* To start collecting security-related events with Microsoft Sentinel, see [onboard to Microsoft Sentinel](scenario-onboard-azure-sentinel.md), or to collect with Microsoft Defender for Cloud, see [onboard to Microsoft Defender for Cloud](../../security-center/quickstart-onboard-machines.md).
-* See the VM insights [Monitor performance](../../azure-monitor/vm/vminsights-performance.md) and [Map dependencies](../../azure-monitor/vm/vminsights-maps.md) articles to see how well your machine is performing and view discovered application components.
+* Read the VM insights [Monitor performance](../../azure-monitor/vm/vminsights-performance.md) and [Map dependencies](../../azure-monitor/vm/vminsights-maps.md) articles to see how well your machine is performing and view discovered application components.
azure-arc Manage Howto Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/manage-howto-migrate.md
Title: How to migrate Azure Arc-enabled servers across regions description: Learn how to migrate an Azure Arc-enabled server from one region to another. Previously updated : 07/16/2021 Last updated : 3/29/2022 # How to migrate Azure Arc-enabled servers across regions
-There are scenarios in which you'd want to move your existing Azure Arc-enabled server from one region to another. For example, you realized the machine was registered in the wrong region, to improve manageability, or to move for governance reasons.
+There are scenarios in which you'll want to move your existing Azure Arc-enabled server from one region to another. For example, you might want to move regions to improve manageability, for governance reasons, or because you realized the machine was originally registered in the wrong region.
To migrate an Azure Arc-enabled server from one Azure region to another, you have to uninstall the VM extensions, delete the resource in Azure, and re-create it in the other region. Before you perform these steps, you should audit the machine to verify which VM extensions are installed.
To migrate an Azure Arc-enabled server from one Azure region to another, you hav
## Move machine to other region > [!NOTE]
-> During this operation, it results in downtime during the migration.
+> Performing this operation will result in downtime during the migration.
-1. Remove VM extensions installed from the [Azure portal](manage-vm-extensions-portal.md#remove-extensions), using the [Azure CLI](manage-vm-extensions-cli.md#remove-extensions), or using [Azure PowerShell](manage-vm-extensions-powershell.md#remove-extensions).
+1. Remove any VM extensions that are installed on the machine. You can do this by using the [Azure portal](manage-vm-extensions-portal.md#remove-extensions), [Azure CLI](manage-vm-extensions-cli.md#remove-extensions), or [Azure PowerShell](manage-vm-extensions-powershell.md#remove-extensions).
-2. Use the **azcmagent** tool with the [Disconnect](manage-agent.md#disconnect) parameter to disconnect the machine from Azure Arc and delete the machine resource from Azure. Disconnecting the machine from Azure Arc-enabled servers does not remove the Connected Machine agent, and you do not need to remove the agent as part of this process. You can run this manually while logged on interactively, or automate using the same service principal you used to onboard multiple agents, or with a Microsoft identity platform [access token](../../active-directory/develop/access-tokens.md). If you did not use a service principal to register the machine with Azure Arc-enabled servers, see the following [article](onboard-service-principal.md#create-a-service-principal-for-onboarding-at-scale) to create a service principal.
+2. Use the **azcmagent** tool with the [Disconnect](manage-agent.md#disconnect) parameter to disconnect the machine from Azure Arc and delete the machine resource from Azure. You can run this manually while logged on interactively, with a Microsoft identity platform [access token](../../active-directory/develop/access-tokens.md), or with the service principal you used for onboarding (or with a [new service principal that you create](onboard-service-principal.md#create-a-service-principal-for-onboarding-at-scale)).
-3. Re-register the Connected Machine agent with Azure Arc-enabled servers in the other region. Run the `azcmagent` tool with the [Connect](manage-agent.md#connect) parameter complete this step.
+ Disconnecting the machine from Azure Arc-enabled servers does not remove the Connected Machine agent, and you don't need to remove the agent as part of this process.
-4. Redeploy the VM extensions that were originally deployed to the machine from Azure Arc-enabled servers. If you deployed the Azure Monitor for VMs (insights) agent or the Log Analytics agent using an Azure Policy definition, the agents are redeployed after the next [evaluation cycle](../../governance/policy/how-to/get-compliance-data.md#evaluation-triggers).
+3. Run the `azcmagent` tool with the [Connect](manage-agent.md#connect) parameter to re-register the Connected Machine agent with Azure Arc-enabled servers in the other region.
+
+4. Redeploy the VM extensions that were originally deployed to the machine from Azure Arc-enabled servers.
+
+ If you deployed the Azure Monitor for VMs (insights) agent or the Log Analytics agent using an Azure Policy definition, the agents are redeployed after the next [evaluation cycle](../../governance/policy/how-to/get-compliance-data.md#evaluation-triggers).
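A minimal sketch of the disconnect and reconnect in steps 2 and 3, run on the machine itself with the `azcmagent` tool; every value shown is a placeholder:

```bash
# Step 2: Disconnect the machine and delete its Azure resource.
azcmagent disconnect \
  --service-principal-id "<sp-app-id>" \
  --service-principal-secret "<sp-secret>"

# Step 3: Re-register the agent in the target region.
azcmagent connect \
  --service-principal-id "<sp-app-id>" \
  --service-principal-secret "<sp-secret>" \
  --tenant-id "<tenant-id>" \
  --subscription-id "<subscription-id>" \
  --resource-group "my-rg" \
  --location "<target-region>"
```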
## Next steps
azure-functions Functions Bindings Warmup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-bindings-warmup.md
def main(warmupContext: func.Context) -> None:
::: zone pivot="programming-language-csharp"

## Attributes
-Both [in-process](functions-dotnet-class-library.md) and [isolated process](dotnet-isolated-process-guide.md) C# libraries use the `WarmupTriggerAttribute` to define the function. C# script instead uses a *function.json* configuration file.
+Both [in-process](functions-dotnet-class-library.md) and [isolated process](dotnet-isolated-process-guide.md) C# libraries use the `WarmupTrigger` attribute to define the function. C# script instead uses a *function.json* configuration file.
# [In-process](#tab/in-process)
-Use the `WarmupTriggerAttribute` to define the function. This attribute has no parameters.
+Use the `WarmupTrigger` attribute to define the function. This attribute has no parameters.
# [Isolated process](#tab/isolated-process)
-Use the `WarmupTriggerAttribute` to define the function. This attribute has no parameters.
+Use the `WarmupTrigger` attribute to define the function. This attribute has no parameters.
# [C# script](#tab/csharp-script)
The following considerations apply to using a warmup function in C#:
# [In-process](#tab/in-process)

-- Your function must be named `warmup` (case-insensitive) using the `FunctionNameAttribute`.
+- Your function must be named `warmup` (case-insensitive) using the `FunctionName` attribute.
- A return value attribute isn't required.
- You must be using version `3.0.5` of the `Microsoft.Azure.WebJobs.Extensions` package, or a later version.
- You can pass a `WarmupContext` instance to the function.

# [Isolated process](#tab/isolated-process)

-- Your function must be named `warmup` (case-insensitive) using the `FunctionNameAttribute`.
+- Your function must be named `warmup` (case-insensitive) using the `FunctionName` attribute.
- A return value attribute isn't required.
- You can pass an object instance to the function.
azure-functions Functions Dotnet Dependency Injection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-dotnet-dependency-injection.md
Before you can use dependency injection, you must install the following NuGet pa
- [Microsoft.NET.Sdk.Functions](https://www.nuget.org/packages/Microsoft.NET.Sdk.Functions/) package version 1.0.28 or later

-- [Microsoft.Extensions.DependencyInjection](https://www.nuget.org/packages/Microsoft.Extensions.DependencyInjection/) (currently, only version 3.x and earlier supported)
+- [Microsoft.Extensions.DependencyInjection](https://www.nuget.org/packages/Microsoft.Extensions.DependencyInjection/) (currently, only version 2.x or later supported)
## Register services
azure-functions Functions Reference Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-reference-python.md
First, the function.json file must be updated to include a `route` in the HTTP t
"get", "post" ],
- "route": "/{*route}"
+ "route": "{*route}"
}, { "type": "http",
Update the Python code file `init.py`, depending on the interface used by your f
```python app=fastapi.FastAPI()
-@app.get("/hello/{name}")
+@app.get("hello/{name}")
async def get_name( name: str,): return {
def main(req: func.HttpRequest, context: func.Context) -> func.HttpResponse:
```python app=Flask("Test")
-@app.route("/hello/<name>", methods=['GET'])
+@app.route("hello/<name>", methods=['GET'])
def hello(name: str):
    return f"hello {name}"
azure-government Compare Azure Government Global Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-government/compare-azure-government-global-azure.md
In general, service availability in Azure Government implies that all correspond
## AI + machine learning
-This section outlines variations and considerations when using **Azure Bot Service**, **Azure Machine Learning**, and **Cognitive Services** in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=machine-learning-service,bot-service,cognitive-services&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia&rar=true).
+This section outlines variations and considerations when using **Azure Bot Service**, **Azure Machine Learning**, and **Cognitive Services** in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=machine-learning-service,bot-service,cognitive-services&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia&rar=true).
### [Azure Bot Service](/azure/bot-service/)
The following Azure Cost Management + Billing **features are not currently avail
This section outlines variations and considerations when using Media services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=cdn,media-services&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia&rar=true).
-### [Media Services](../media-services/index.yml)
+### [Media Services](/media-services/)
-For Azure Media Services v3 feature variations in Azure Government, see [Azure Media Services v3 clouds and regions availability](../media-services/latest/azure-clouds-regions.md#us-government-cloud).
+For Azure Media Services v3 feature variations in Azure Government, see [Azure Media Services v3 clouds and regions availability](/media-services/latest/azure-clouds-regions#us-government-cloud).
## Migration
azure-government Azure Services In Fedramp Auditscope https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-government/compliance/azure-services-in-fedramp-auditscope.md
This article provides a detailed list of Azure, Dynamics 365, Microsoft 365, and
| [Machine Learning](../../machine-learning/index.yml) | &#x2705; | &#x2705; |
| [Managed Applications](../../azure-resource-manager/managed-applications/index.yml) | &#x2705; | &#x2705; |
| **Service** | **FedRAMP High** | **DoD IL2** |
-| [Media Services](../../media-services/index.yml) | &#x2705; | &#x2705; |
+| [Media Services](/media-services/) | &#x2705; | &#x2705; |
| [Microsoft 365 Defender](/microsoft-365/security/defender/) (formerly Microsoft Threat Protection) | &#x2705; | &#x2705; |
| [Microsoft Azure Attestation](../../attestation/index.yml) | &#x2705; | &#x2705; |
| [Microsoft Azure Marketplace portal](https://azuremarketplace.microsoft.com/marketplace/) | &#x2705; | &#x2705; |
This article provides a detailed list of Azure, Dynamics 365, Microsoft 365, and
| [Azure Active Directory (Free and Basic)](../../active-directory/fundamentals/active-directory-whatis.md#what-are-the-azure-ad-licenses) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | &#x2705; |
| [Azure Active Directory (Premium P1 + P2)](../../active-directory/fundamentals/active-directory-whatis.md#what-are-the-azure-ad-licenses) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
| [Azure Active Directory Domain Services](../../active-directory-domain-services/index.yml) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
-| [Azure AD access reviews](../../active-directory/governance/access-reviews-overview.md) | | | | | &#x2705; |
| [Azure AD Multi-Factor Authentication](../../active-directory/authentication/concept-mfa-howitworks.md) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | &#x2705; |
-| [Azure AD Privileged Identity Management](../../active-directory/privileged-identity-management/index.yml) | | | | | &#x2705; |
| [Azure API for FHIR](../../healthcare-apis/azure-api-for-fhir/index.yml) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
| [Azure Arc-enabled Kubernetes](../../azure-arc/kubernetes/index.yml) | &#x2705; | &#x2705; | | | |
| **Service** | **FedRAMP High** | **DoD IL2** | **DoD IL4** | **DoD IL5** | **DoD IL6** |
This article provides a detailed list of Azure, Dynamics 365, Microsoft 365, and
| [Logic Apps](../../logic-apps/index.yml) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | &#x2705; |
| [Machine Learning](../../machine-learning/index.yml) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
| [Managed Applications](../../azure-resource-manager/managed-applications/index.yml) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
-| [Media Services](../../media-services/index.yml) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | &#x2705; |
+| [Media Services](/media-services/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | &#x2705; |
| [Microsoft 365 Defender](/microsoft-365/security/defender/) (formerly Microsoft Threat Protection) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
| [Microsoft Azure portal](../../azure-portal/index.yml) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | &#x2705; |
| **Service** | **FedRAMP High** | **DoD IL2** | **DoD IL4** | **DoD IL5** | **DoD IL6** |
azure-government Documentation Government Csp List https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-government/documentation-government-csp-list.md
Below you can find a list of all the authorized Cloud Solution Providers (CSPs),
|[Doublehorn, LLC](https://doublehorn.com/)|
|[DXC Technology Services LLC](https://www.dxc.technology/services)|
|[DXL Enterprises, Inc.](https://mahwahnjcoc.wliinc31.com/Supply-Chain-Management/DXL-Enterprises,-Inc-1349)|
-|[Dynamics Intelligence Inc.](https://www.dynamicsintelligence.us)|
|[DynTek](https://www.dyntek.com)|
|[ECS Federal, LLC](https://ecstech.com/)|
|[Edafio Technology Partners](https://edafio.com)|
azure-maps Weather Coverage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-maps/weather-coverage.md
Title: Microsoft Azure Maps Weather services coverage
description: Learn about Microsoft Azure Maps Weather services coverage Previously updated : 01/26/2022 Last updated : 03/28/2022 -
-# Azure Maps Weather services coverage
+# Azure Maps weather services coverage
+
+This article provides coverage information for Azure Maps [Weather services][weather-services].
+
+## Weather information supported
+
+### Infrared satellite tiles
+<!-- Replace with Minimal Description
+Infrared (IR) radiation is electromagnetic radiation that measures an object's infrared emission, returning information about its temperature. Infrared images can indicate cloud heights (Colder cloud-tops mean higher clouds) and types, calculate land and surface water temperatures, and locate ocean surface features.
+ -->
+
+Infrared satellite imagery, showing clouds by their temperature, is returned when `tilesetID` is set to `microsoft.weather.infrared.main` when making calls to [Get Map Tile][get-map-tile] and can then be overlaid on the map image.
+
+### Minute forecast
+
+The [Get Minute forecast][get-minute-forecast] service returns minute-by-minute forecasts for the specified location for the next 120 minutes.
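As a rough request sketch, with placeholder coordinates and subscription key:

```bash
# Minute-by-minute forecast at 15-minute intervals for a lat,lon pair.
curl "https://atlas.microsoft.com/weather/forecast/minute/json?api-version=1.0&query=47.641,-122.125&interval=15&subscription-key=<your-subscription-key>"
```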
+
+### Radar tiles
+<!-- Replace with Minimal Description
+Radar imagery is a depiction of the response returned when microwave radiation is sent into the atmosphere. The pulses of radiation reflect back showing its interactions with any precipitation it encounters. The radar technology visually represents those pulses showing where it's clear, raining, snowing or stormy.
+-->
+
+Radar tiles, showing areas of rain, snow, ice and mixed conditions, are returned when `tilesetID` is set to `microsoft.weather.radar.main` when making calls to [Get Map Tile][get-map-tile] and can then be overlaid on the map image.
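A sketch of a tile request using the radar tileset; swap the `tilesetId` for `microsoft.weather.infrared.main` to get infrared imagery instead. The zoom, tile coordinates, and key are placeholders, and the `api-version` shown is an assumption for Render V2:

```bash
# Fetch one radar weather tile as a PNG.
curl "https://atlas.microsoft.com/map/tile?api-version=2.1&tilesetId=microsoft.weather.radar.main&zoom=5&x=8&y=11&subscription-key=<your-subscription-key>" --output radar-tile.png
```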
+
+### Severe weather alerts
-This article provides coverage information for Azure Maps [Weather services](/rest/api/maps/weather). Azure Maps Weather data services returns details such as radar tiles, current weather conditions, weather forecasts, the weather along a route, air quality, historical weather and tropical storms info.
+Azure Maps [Severe weather alerts][severe-weather-alerts] service returns severe weather alerts from both official Government Meteorological Agencies and other leading severe weather alert providers. The service can return details such as alert type, category, level and detailed description. Severe weather includes conditions like hurricanes, tornados, tsunamis, severe thunderstorms, and fires.
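A minimal request sketch, with placeholder coordinates and subscription key:

```bash
# Check for active severe weather alerts at a coordinate.
curl "https://atlas.microsoft.com/weather/severe/alerts/json?api-version=1.0&query=48.057,-81.091&subscription-key=<your-subscription-key>"
```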
-Azure Maps doesn't have the same level of information and accuracy for all countries and regions.
+### Other
-The following table refers to the *Other* column and provides a list containing the weather information you can request from that country/region.
+- **Air quality**. The Air Quality service returns [current][aq-current], [hourly][aq-hourly] or [daily][aq-daily] forecasts that include pollution levels, air quality index values, the dominant pollutant, and a brief statement summarizing risk level and suggested precautions.
+- **Current conditions**. The [Get Current Conditions](/rest/api/maps/weather/get-current-conditions) service returns detailed current weather conditions such as precipitation, temperature and wind for a given coordinate location.
+- **Daily forecast**. The [Get Daily Forecast](/rest/api/maps/weather/get-daily-forecast) service returns detailed weather forecasts such as temperature and wind by day for the next 1, 5, 10, 15, 25, or 45 days for a given coordinate location.
+- **Daily indices**. The [Get Daily Indices](/rest/api/maps/weather/get-daily-indices) service returns index values that provide information that can help in planning activities. For example, a health mobile application can notify users that today is good weather for running or playing golf.
+- **Historical weather**. The Historical Weather service includes Daily Historical [Records][dh-records], [Actuals][dh-actuals] and [Normals][dh-normals] that return climatology data such as past daily record temperatures, precipitation and snowfall at a given coordinate location.
+- **Hourly forecast**. The [Get Hourly Forecast](/rest/api/maps/weather/get-hourly-forecast) service returns detailed weather forecast information by the hour for up to 10 days.
+- **Quarter-day forecast**. The [Get Quarter Day Forecast](/rest/api/maps/weather/get-quarter-day-forecast) Service returns detailed weather forecast by quarter-day for up to 15 days.
+- **Tropical storms**. The Tropical Storm Service provides information about [active storms][tropical-storm-active], tropical storm [forecasts][tropical-storm-forecasts] and [locations][tropical-storm-locations] and the ability to [search][tropical-storm-search] for tropical storms by year, basin ID, or government ID.
+- **Weather along route**. The [Get Weather Along Route](/rest/api/maps/weather/get-weather-along-route) Service returns hyper local (1 kilometer or less), up-to-the-minute weather nowcasts, weather hazard assessments, and notifications along a route described as a sequence of waypoints.
-| Symbol | Meaning |
-|:-:|--|
-| * |Refers to coverage of the following features: Air Quality, Current Conditions, Daily Forecast, Daily Indices, Historical Weather, Hourly Forecast, Quarter-day Forecast, Tropical Storms and Weather Along Route. |
+## Azure Maps Weather coverage tables
+
+> [!NOTE]
+> Azure Maps doesn't have the same level of detail and accuracy for all countries and regions.
## Americas
-| Country/Region | Infrared Satellite Tiles | Minute Forecast, Radar Tiles | Severe Weather Alerts | Other* |
+| Country/Region | Infrared satellite tiles | Minute forecast, Radar tiles | Severe weather alerts | Other* |
||::|:-:|::|::|
| Anguilla | ✓ | | | ✓ |
| Antarctica | ✓ | | | ✓ |
The following table refers to the *Other* column and provides a list containing
## Asia Pacific
-| Country/Region | Infrared Satellite Tiles | Minute Forecast, Radar Tiles | Severe Weather Alerts | Other* |
-||--|::|:-:|::|::|
+| Country/Region | Infrared satellite tiles | Minute forecast, Radar tiles | Severe weather alerts | Other* |
+|--|::|:-:|::|::|
| Afghanistan | ✓ | | | ✓ |
| American Samoa | ✓ | | ✓ | ✓ |
| Australia | ✓ | ✓ | ✓ | ✓ |
The following table refers to the *Other* column and provides a list containing
## Europe
-| Country/Region | Infrared Satellite Tiles | Minute Forecast, Radar Tiles | Severe Weather Alerts | Other* |
+| Country/Region | Infrared satellite tiles | Minute forecast, Radar tiles | Severe weather alerts | Other* |
|-|::|:-:|::|::|
| Albania | ✓ | | | ✓ |
| Andorra | ✓ | | ✓ | ✓ |
The following table refers to the *Other* column and provides a list containing
## Middle East & Africa
-| Country/Region | Infrared Satellite Tiles | Minute Forecast, Radar Tiles | Severe Weather Alerts | Other* |
+| Country/Region | Infrared satellite tiles | Minute forecast, Radar tiles | Severe weather alerts | Other* |
|-|::|:-:|::|::|
| Algeria | ✓ | | | ✓ |
| Angola | ✓ | | | ✓ |
The following table refers to the *Other* column and provides a list containing
| Yemen | ✓ | | | ✓ |
| Zambia | ✓ | | | ✓ |
| Zimbabwe | ✓ | | | ✓ |
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Weather services in Azure Maps](weather-services-concepts.md)
+
+> [!div class="nextstepaction"]
+> [Azure Maps weather services frequently asked questions (FAQ)](weather-services-faq.yml)
+
+[weather-services]: /rest/api/maps/weather
+[get-map-tile]: /rest/api/maps/render-v2/get-map-tile
+[get-minute-forecast]: /rest/api/maps/weather/get-minute-forecast
+[severe-weather-alerts]: /rest/api/maps/weather/get-severe-weather-alerts
+
+[aq-current]: /rest/api/maps/weather/get-current-air-quality
+[aq-hourly]: /rest/api/maps/weather/get-air-quality-hourly-forecasts
+[aq-daily]: /rest/api/maps/weather/get-air-quality-daily-forecasts
+
+[current-conditions]: /rest/api/maps/weather/get-current-conditions
+
+[dh-records]: /rest/api/maps/weather/get-dh-records
+[dh-actuals]: /rest/api/maps/weather/get-dh-actuals
+[dh-normals]: /rest/api/maps/weather/get-dh-normals
+
+[tropical-storm-active]: /rest/api/maps/weather/get-tropical-storm-active
+[tropical-storm-forecasts]: /rest/api/maps/weather/get-tropical-storm-forecast
+[tropical-storm-locations]: /rest/api/maps/weather/get-tropical-storm-locations
+[tropical-storm-search]: /rest/api/maps/weather/get-tropical-storm-search
azure-monitor Azure Monitor Agent Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-overview.md
The Azure Monitor agent (AMA) collects monitoring data from the guest operating
Here's an **introductory video** explaining all about this new agent, including a quick demo of how to set things up using the Azure portal: [ITOps Talk: Azure Monitor Agent](https://www.youtube.com/watch?v=f8bIrFU8tCs)

## Relationship to other agents
-The Azure Monitor agent is meant to replace the following legacy monitoring agents that are currently used by Azure Monitor to collect guest data from virtual machines ([view known gaps](../faq.yml)):
+Eventually, the Azure Monitor agent will replace the following legacy monitoring agents that are currently used by Azure Monitor to collect guest data from virtual machines ([view known gaps](../faq.yml)):
- [Log Analytics agent](./log-analytics-agent.md): Sends data to a Log Analytics workspace and supports VM insights and monitoring solutions.
+- [Telegraf agent](../essentials/collect-custom-metrics-linux-telegraf.md): Sends data to Azure Monitor Metrics (Linux only).
- [Diagnostics extension](./diagnostics-extension-overview.md): Sends data to Azure Monitor Metrics (Windows only), Azure Event Hubs, and Azure Storage.

-- [Telegraf agent](../essentials/collect-custom-metrics-linux-telegraf.md): Sends data to Azure Monitor Metrics (Linux only).
+**Currently**, the Azure Monitor agent consolidates features from the Telegraf agent and the Log Analytics agent, with [a few limitations](#current-limitations).
+In the future, it will also consolidate features from the Diagnostics extension.
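For an Azure virtual machine, installing the agent can be sketched with the Azure CLI; the resource names are placeholders, and Arc-enabled servers would use `az connectedmachine extension create` instead:

```azurecli
# Install the Azure Monitor agent on a Linux VM.
az vm extension set \
  --resource-group my-rg \
  --vm-name my-vm \
  --name AzureMonitorLinuxAgent \
  --publisher Microsoft.Azure.Monitor
```

On Windows, the extension name would be `AzureMonitorWindowsAgent`.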
In addition to consolidating this functionality into a single agent, the Azure Monitor agent provides the following benefits over the existing agents:
In addition to consolidating this functionality into a single agent, the Azure M
- **Improved extension management:** The Azure Monitor agent uses a new method of handling extensibility that's more transparent and controllable than management packs and Linux plug-ins in the current Log Analytics agents. ### Current limitations
-When compared with the existing agents, this new agent doesn't yet have full parity.
+When compared with the legacy agents, this new agent doesn't yet have full parity.
- **Comparison with Log Analytics agents (MMA/OMS):**
  - Not all Log Analytics solutions are supported yet. [View supported features and services](#supported-services-and-features).
  - The support for collecting file-based logs or IIS logs is in [private preview](https://aka.ms/amadcr-privatepreviews).

-- **Comparison with Azure Diagnostics extensions (WAD/LAD):**
- - No support yet for Event Hubs and Storage accounts as destinations.
- - No support yet for collecting file based logs, IIS logs, ETW events, .NET events and crash dumps.
### Changes in data collection

The methods for defining data collection for the existing agents are distinctly different from each other. Each method has challenges that are addressed with the Azure Monitor agent.
azure-monitor Alerts Action Rules https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-action-rules.md
You can also define filters to narrow down which specific subset of alerts are a
* **Alert Context (payload)** - the rule will apply only to alerts that contain any of the filter's strings within the [alert context](./alerts-common-schema-definitions.md#alert-context) section of the alert. This section includes fields specific to each alert type. * **Alert rule id** - the rule will apply only to alerts from a specific alert rule. The value should be the full resource ID, for example `/subscriptions/SUB1/resourceGroups/RG1/providers/microsoft.insights/metricalerts/MY-API-LATENCY`.
-You can locate the alert rule ID by opening a specific alert rule in the portal, clicking "Properties", and copying the "Resource ID" value.
-You can also locate it by listing your alert rules from PowerShell or CLI.
+You can locate the alert rule ID by opening a specific alert rule in the portal, clicking "Properties", and copying the "Resource ID" value. You can also locate it by listing your alert rules from PowerShell or the CLI, as shown in the sketch after the filters list below.
* **Alert rule name** - the rule will apply only to alerts with this alert rule name. Can also be useful with a "Contains" operator. * **Description** - the rule will apply only to alerts that contain the specified string within the alert rule description field. * **Monitor condition** - the rule will apply only to alerts with the specified monitor condition, either "Fired" or "Resolved".
For example, you can use this filter with "Does not equal" to exclude one or mor
* **Resource group** - the rule will apply only to alerts from the specified resource groups. For example, you can use this filter with "Does not equal" to exclude one or more resource groups when the rule's scope is a subscription. * **Resource type** - the rule will apply only to alerts on resource from the specified resource types, such as virtual machines. You can use "Equals" to match one or more specific resources, or you can use contains to match a resource type and all its child resources.
-For example, use "contains MICROSOFT.SQL/SERVERS" to match both SQL servers and all their child resources, like databases.
+For example, use `resource type contains "MICROSOFT.SQL/SERVERS"` to match both SQL servers and all their child resources, like databases.
* **Severity** - the rule will apply only to alerts with the selected severities.

**FILTERS BEHAVIOR**

* If you define multiple filters in a rule, all of them apply - there is a logical AND between all filters.
- For example, if you set both `resource type = "Virtual Machines` and `severity = "Sev0`, then the rule will apply only for Sev0 alerts on virtual machines in the scope.
+ For example, if you set both `resource type = "Virtual Machines"` and `severity = "Sev0"`, then the rule will apply only for Sev0 alerts on virtual machines in the scope.
* Each filter may include up to five values, and there is a logical OR between the values. For example, if you set `description contains ["this", "that"]`, then the rule will apply only to alerts whose description contains either "this" or "that".
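As mentioned above, you can list alert rules and their full resource IDs from the CLI. A sketch for metric alert rules specifically, with a placeholder resource group:

```azurecli
# List metric alert rules with their full resource IDs.
az monitor metrics alert list \
  --resource-group my-rg \
  --query "[].{name:name, id:id}" \
  --output table
```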
azure-monitor Java 2X Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/java-2x-troubleshoot.md
Questions or problems with [Azure Application Insights in Java][java]? Here are
### Java Agent cannot capture dependency data

* Have you configured the Java agent by following [Configure Java Agent](java-2x-agent.md)?
-* Make sure both the java agent jar and the AI-Agent.xml file are placed in the same folder.
+* Make sure both the Java agent jar and the AI-Agent.xml file are placed in the same folder.
* Make sure that the dependency you are trying to auto-collect is supported for auto collection. Currently we only support MySQL, MsSQL, Oracle DB and Azure Cache for Redis dependency collection.

## No usage data
azure-monitor Pricing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/pricing.md
For SDKs that don't support adaptive sampling, you can employ [ingestion samplin
## Viewing Application Insights usage on your Azure bill
-The easiest way to see the billed usage for a single Application Insights resource, which isn't a workspace-baed resource is to go to the resource's Overview page and click **View Cost** in the upper right corner. You might need elevated access to Cost Management data ([learn more](../../cost-management-billing/costs/assign-access-acm-data.md)).
+The easiest way to see the billed usage for a single Application Insights resource, which isn't a workspace-based resource, is to go to the resource's Overview page and click **View Cost** in the upper right corner. You might need elevated access to Cost Management data ([learn more](../../cost-management-billing/costs/assign-access-acm-data.md)).
To learn more, Azure provides a great deal of useful functionality in the [Azure Cost Management + Billing](../../cost-management-billing/costs/quick-acm-cost-analysis.md?toc=/azure/billing/TOC.json) hub. For instance, the "Cost analysis" functionality enables you to view your spending for Azure resources. Adding a filter by resource type (microsoft.insights/components for Application Insights) will allow you to track your spending. Then for "Group by" select "Meter category" or "Meter". Application Insights billed usage for data ingestion and data retention will show up as **Log Analytics** for the Meter category, since Log Analytics serves as the backend for all Azure Monitor logs.
Lower your bill with updated versions of the ASP.NET Core SDK and Worker Service
### Microsoft Q&A question page
-If you have questions about how pricing works for Application Insights, you can post a question in our [Microsoft Q&A question page](/answers/topics/azure-monitor.html).
+If you have questions about how pricing works for Application Insights, you can post a question in our [Microsoft Q&A question page](/answers/topics/azure-monitor.html).
azure-monitor Activity Log https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/activity-log.md
The following columns have been added to *AzureActivity* in the updated schema:
- Claims_d
- Properties_d
+## Activity Logs Insights
+Activity log insights let you view information about changes to resources and resource groups in a subscription. The dashboards also present data about which users or services performed activities in the subscription and the activities' status. This article explains how to view Activity log insights in the Azure portal.
## Activity Log Analytics monitoring solution

> [!Note]
> The Azure Log Analytics monitoring solution will be deprecated soon and replaced by a workbook using the updated schema in the Log Analytics workspace. You can still use the solution if you already have it enabled, but it can only be used if you're collecting the Activity log using legacy settings.
azure-monitor Activity Logs Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/activity-logs-insights.md
+
+ Title: Activity logs insights
+description: View the overview of Azure Activity logs of your resources
+++ Last updated : 03/14/2021++
+#Customer intent: As an IT administrator, I want to track changes to resource groups or specific resources in a subscription and to see which administrators or services make these changes.
+
+
+# Activity logs insights (Preview)
+
+Activity logs insights let you view information about changes to resources and resource groups in a subscription. The dashboards also present data about which users or services performed activities in the subscription and the activities' status. This article explains how to view Activity log insights in the Azure portal.
+
+Before using Activity log insights, you'll have to [enable sending logs to your Log Analytics workspace](./diagnostic-settings.md).
+
+## How does Activity logs insights work?
+
+Activity logs you send to a [Log Analytics workspace](../logs/log-analytics-workspace-overview.md) are stored in a table called AzureActivity.
+
+Activity logs insights are a curated [Log Analytics workbook](../visualize/workbooks-overview.md) with dashboards that visualize the data in the AzureActivity table. For example, the dashboards show which administrators deleted, updated, or created resources, and whether the activities failed or succeeded.
++
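You can inspect the same AzureActivity data the workbook visualizes. A sketch using the Azure CLI, with a placeholder workspace GUID:

```azurecli
# Count activity log records per category over the last seven days.
az monitor log-analytics query \
  --workspace "<workspace-guid>" \
  --analytics-query "AzureActivity | summarize count() by CategoryValue" \
  --timespan "P7D"
```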
+## View Activity logs insights - Resource group / Subscription level
+
+To view Activity logs insights on a resource group or a subscription level:
+
+1. In the Azure portal, select **Monitor** > **Workbooks**.
+1. Select **Activity Logs Insights** in the **Insights** section.
+
+ :::image type="content" source="media/activity-log/open-activity-log-insights-workbook.png" lightbox="media/activity-log/open-activity-log-insights-workbook.png" alt-text="A screenshot showing how to locate and open the Activity logs insights workbook on a scale level":::
+
+1. At the top of the **Activity Logs Insights** page, select:
+ 1. One or more subscriptions from the **Subscriptions** dropdown.
+ 1. Resources and resource groups from the **CurrentResource** dropdown.
+ 1. A time range for which to view data from the **TimeRange** dropdown.
+## View Activity logs insights on any Azure resource
+
+>[!Note]
+> * Currently Application Insights resources are not supported for this workbook.
+
+To view Activity logs insights on a resource level:
+
+1. In the Azure portal, go to your resource and select **Workbooks**.
+1. Select **Activity Logs Insights** in the **Activity Logs Insights** section.
+
+ :::image type="content" source="media/activity-log/activity-log-resource-level.png" lightbox= "media/activity-log/activity-log-resource-level.png" alt-text="A screenshot showing how to locate and open the Activity logs insights workbook on a resource level":::
+
+1. At the top of the **Activity Logs Insights** page, select:
+
+ 1. A time range for which to view data from the **TimeRange** dropdown.
   * **Azure Activity Logs Entries** shows the count of Activity log records in each [activity log category](activity-log-schema.md#categories).
+
+ :::image type="content" source="media/activity-log/activity-logs-insights-category-value.png" lightbox= "media/activity-log/activity-logs-insights-category-value.png" alt-text="Azure Activity Logs by Category Value":::
+
+ * **Activity Logs by Status** shows the count of Activity log records in each status.
+
+ :::image type="content" source="media/activity-log/activity-logs-insights-status.png" lightbox= "media/activity-log/activity-logs-insights-status.png" alt-text="Azure Activity Logs by Status":::
+
+ * At the subscription and resource group level, **Activity Logs by Resource** and **Activity Logs by Resource Provider** show the count of Activity log records for each resource and resource provider.
+
+ :::image type="content" source="media/activity-log/activity-logs-insights-resource.png" lightbox= "media/activity-log/activity-logs-insights-resource.png" alt-text="Azure Activity Logs by Resource":::
+
+## Next steps
+Learn more about:
+* [Platform logs](./platform-logs-overview.md)
+* [Activity log event schema](activity-log-schema.md)
+* [Creating a diagnostic setting to send Activity logs to other destinations](./diagnostic-settings.md)
azure-monitor Resource Logs Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/resource-logs-schema.md
Last updated 05/10/2021
# Common and service-specific schemas for Azure resource logs > [!NOTE]
-> Resource logs were previously known as diagnostic logs. The name was changed in October 2019 as the types of logs gathered by Azure Monitor shifted to include more than just the Azure resource.
+> Resource logs were previously known as diagnostic logs. The name was changed in October 2019 as the types of logs gathered by Azure Monitor shifted to include more than just the Azure resource.
>
-> This article used to list resource log categories that you can collect. That list is now at [Resource log categories](resource-logs-categories.md).
+> This article used to list resource log categories that you can collect. That list is now at [Resource log categories](resource-logs-categories.md).
[Azure Monitor resource logs](../essentials/platform-logs-overview.md) are logs emitted by Azure services that describe the operation of those services or resources. All resource logs available through Azure Monitor share a common top-level schema. Each service has the flexibility to emit unique properties for its own events.
The schema for resource logs varies depending on the resource and log category.
| Azure Load Balancer |[Log Analytics for Azure Load Balancer](../../load-balancer/monitor-load-balancer.md) |
| Azure Logic Apps |[Logic Apps B2B custom tracking schema](../../logic-apps/logic-apps-track-integration-account-custom-tracking-schema.md) |
| Azure Machine Learning | [Diagnostic logging in Azure Machine Learning](../../machine-learning/monitor-resource-reference.md) |
-| Azure Media Services | [Media Services monitoring schemas](../../media-services/latest/monitoring/monitor-media-services-data-reference.md#schemas) |
+| Azure Media Services | [Media Services monitoring schemas](/media-services/latest/monitoring/monitor-media-services-data-reference#schemas) |
| Network security groups |[Log Analytics for network security groups (NSGs)](../../virtual-network/virtual-network-nsg-manage-log.md) |
| Azure Power BI Embedded | [Logging for Power BI Embedded in Azure](/power-bi/developer/azure-pbie-diag-logs) |
| Recovery Services | [Data model for Azure Backup](../../backup/backup-azure-reports-data-model.md) |
azure-monitor Monitor Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/monitor-reference.md
This article is a reference of the different applications and services that are
## Insights and curated visualizations
-Some services have a curated monitoring experience. That is, Microsoft provides customized functionality meant to act as a starting point for monitoring those services. These experiences are collectively known as **curated visualizations** with the larger more complex of them being called **Insights**.
+Some services have a curated monitoring experience. That is, Microsoft provides customized functionality meant to act as a starting point for monitoring those services. These experiences are collectively known as **curated visualizations** with the larger more complex of them being called **Insights**.
-The experiences collect and analyze a subset of logs and metrics and depending on the service and might also provide out-of-the-box alerting. They present this telemetry in a visual layout. The visualizations vary in size and scale. Some are considered part of Azure Monitor and follow the support and service level agreements for Azure. They are supported in all Azure regions where Azure Monitor is available. Other curated visualizations provide less functionality, might not scale, and might have different agreements. Some might be based solely on Azure Monitor Workbooks, while others might have an extensive custom experience.
+The experiences collect and analyze a subset of logs and metrics and depending on the service and might also provide out-of-the-box alerting. They present this telemetry in a visual layout. The visualizations vary in size and scale. Some are considered part of Azure Monitor and follow the support and service level agreements for Azure. They are supported in all Azure regions where Azure Monitor is available. Other curated visualizations provide less functionality, might not scale, and might have different agreements. Some might be based solely on Azure Monitor Workbooks, while others might have an extensive custom experience.
-The table below lists the available curated visualizations and more detailed information about them.
+The table below lists the available curated visualizations and more detailed information about them.
>[!NOTE]
> Another type of older visualization called **monitoring solutions** is no longer in active development. The replacement technology is the Azure Monitor Insights mentioned above. We suggest you use the insights and not deploy new instances of solutions. For more information on the solutions, see [Monitoring solutions in Azure Monitor](./insights/solutions.md).
-|Name with docs link| State | [Azure portal Link](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/more)| Description |
+|Name with docs link| State | [Azure portal Link](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/more)| Description |
|:--|:--|:--|:--|
-| [Azure Monitor Workbooks for Azure Active Directory](../active-directory/reports-monitoring/howto-use-azure-monitor-workbooks.md) | GA (General availability) | [Yes](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Workbooks) | Azure Active Directory provides workbooks to understand the effect of your Conditional Access policies, to troubleshoot sign-in failures, and to identify legacy authentications. |
-| [Azure Backup](../backup/backup-azure-monitoring-use-azuremonitor.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_DataProtection/BackupCenterMenuBlade/backupReportsConfigure/menuId/backupReportsConfigure) | Provides built-in monitoring and alerting capabilities in a Recovery Services vault. |
-| [Azure Monitor for Azure Cache for Redis (preview)](./insights/redis-cache-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/redisCacheInsights) | Provides a unified, interactive view of overall performance, failures, capacity, and operational health |
-| [Azure Cosmos DB Insights](./insights/cosmosdb-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/cosmosDBInsights) | Provides a view of the overall performance, failures, capacity, and operational health of all your Azure Cosmos DB resources in a unified interactive experience. |
-| [Azure Container Insights](/azure/azure-monitor/insights/container-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/containerInsights) | Monitors the performance of container workloads that are deployed to managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS). It gives you performance visibility by collecting metrics from controllers, nodes, and containers that are available in Kubernetes through the Metrics API. Container logs are also collected. After you enable monitoring from Kubernetes clusters, these metrics and logs are automatically collected for you through a containerized version of the Log Analytics agent for Linux. |
-| [Azure Data Explorer insights](./insights/data-explorer.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/adxClusterInsights) | Azure Data Explorer Insights provides comprehensive monitoring of your clusters by delivering a unified view of your cluster performance, operations, usage, and failures. |
+| [Azure Monitor Workbooks for Azure Active Directory](../active-directory/reports-monitoring/howto-use-azure-monitor-workbooks.md) | GA (General availability) | [Yes](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Workbooks) | Azure Active Directory provides workbooks to understand the effect of your Conditional Access policies, to troubleshoot sign-in failures, and to identify legacy authentications. |
+| [Azure Backup](../backup/backup-azure-monitoring-use-azuremonitor.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_DataProtection/BackupCenterMenuBlade/backupReportsConfigure/menuId/backupReportsConfigure) | Provides built-in monitoring and alerting capabilities in a Recovery Services vault. |
+| [Azure Monitor for Azure Cache for Redis (preview)](./insights/redis-cache-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/redisCacheInsights) | Provides a unified, interactive view of overall performance, failures, capacity, and operational health. |
+| [Azure Cosmos DB Insights](./insights/cosmosdb-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/cosmosDBInsights) | Provides a view of the overall performance, failures, capacity, and operational health of all your Azure Cosmos DB resources in a unified interactive experience. |
+| [Azure Container Insights](/azure/azure-monitor/insights/container-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/containerInsights) | Monitors the performance of container workloads that are deployed to managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS). It gives you performance visibility by collecting metrics from controllers, nodes, and containers that are available in Kubernetes through the Metrics API. Container logs are also collected. After you enable monitoring from Kubernetes clusters, these metrics and logs are automatically collected for you through a containerized version of the Log Analytics agent for Linux. |
+| [Azure Data Explorer insights](./insights/data-explorer.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/adxClusterInsights) | Azure Data Explorer Insights provides comprehensive monitoring of your clusters by delivering a unified view of your cluster performance, operations, usage, and failures. |
| [Azure HDInsight (preview)](../hdinsight/log-analytics-migration.md#insights) | Preview | No | An Azure Monitor workbook that collects important performance metrics from your HDInsight cluster and provides the visualizations and dashboards for most common scenarios. Gives a complete view of a single HDInsight cluster including resource utilization and application status. |
- | [Azure IoT Edge](../iot-edge/how-to-explore-curated-visualizations.md) | GA | No | Visualize and explore metrics collected from the IoT Edge device right in the Azure portal using Azure Monitor Workbooks based public templates. The curated workbooks use built-in metrics from the IoT Edge runtime. These views don't need any metrics instrumentation from the workload modules. |
- | [Azure Key Vault Insights (preview)](./insights/key-vault-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/keyvaultsInsights) | Provides comprehensive monitoring of your key vaults by delivering a unified view of your Key Vault requests, performance, failures, and latency. |
- | [Azure Monitor Application Insights](./app/app-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/applicationsInsights) | Extensible Application Performance Management (APM) service which monitors the availability, performance, and usage of your web applications whether they're hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application's operations. It enables you to diagnose errors without waiting for a user to report them. Application Insights includes connection points to a variety of development tools and integrates with Visual Studio to support your DevOps processes. |
- | [Azure Monitor Log Analytics Workspace](./logs/log-analytics-workspace-insights-overview.md) | Preview | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/lawsInsights) | Log Analytics Workspace Insights (preview) provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights (preview). |
- | [Azure Service Bus Insights](../service-bus-messaging/service-bus-insights.md) | Preview | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/serviceBusInsights) | Azure Service Bus insights provide a view of the overall performance, failures, capacity, and operational health of all your Service Bus resources in a unified interactive experience. |
+ | [Azure IoT Edge](../iot-edge/how-to-explore-curated-visualizations.md) | GA | No | Visualize and explore metrics collected from the IoT Edge device right in the Azure portal using Azure Monitor Workbooks based public templates. The curated workbooks use built-in metrics from the IoT Edge runtime. These views don't need any metrics instrumentation from the workload modules. |
+ | [Azure Key Vault Insights (preview)](./insights/key-vault-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/keyvaultsInsights) | Provides comprehensive monitoring of your key vaults by delivering a unified view of your Key Vault requests, performance, failures, and latency. |
+ | [Azure Monitor Application Insights](./app/app-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/applicationsInsights) | Extensible Application Performance Management (APM) service which monitors the availability, performance, and usage of your web applications whether they're hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application's operations. It enables you to diagnose errors without waiting for a user to report them. Application Insights includes connection points to a variety of development tools and integrates with Visual Studio to support your DevOps processes. |
+ | [Azure Monitor Log Analytics Workspace](./logs/log-analytics-workspace-insights-overview.md) | Preview | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/lawsInsights) | Log Analytics Workspace Insights (preview) provides comprehensive monitoring of your workspaces through a unified view of your workspace usage, performance, health, agent, queries, and change log. This article will help you understand how to onboard and use Log Analytics Workspace Insights (preview). |
+ | [Azure Service Bus Insights](../service-bus-messaging/service-bus-insights.md) | Preview | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/serviceBusInsights) | Azure Service Bus insights provide a view of the overall performance, failures, capacity, and operational health of all your Service Bus resources in a unified interactive experience. |
| [Azure SQL insights](./insights/sql-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/sqlWorkloadInsights) | A comprehensive interface for monitoring any product in the Azure SQL family. SQL insights uses dynamic management views to expose the data you need to monitor health, diagnose problems, and tune performance. Note: If you are just setting up SQL monitoring, use this instead of the SQL Analytics solution. |
- | [Azure Storage Insights](/azure/azure-monitor/insights/storage-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/storageInsights) | Provides comprehensive monitoring of your Azure Storage accounts by delivering a unified view of your Azure Storage services performance, capacity, and availability. |
- | [Azure Network Insights](./insights/network-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/networkInsights) | Provides a comprehensive view of health and metrics for all your network resource. The advanced search capability helps you identify resource dependencies, enabling scenarios like identifying resource that are hosting your website, by simply searching for your website name. |
- | [Azure Monitor for Resource Groups](./insights/resource-group-insights.md) | GA | No | Triage and diagnose any problems your individual resources encounter, while offering context as to the health and performance of the resource group as a whole. |
+ | [Azure Storage Insights](/azure/azure-monitor/insights/storage-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/storageInsights) | Provides comprehensive monitoring of your Azure Storage accounts by delivering a unified view of your Azure Storage services performance, capacity, and availability. |
 + | [Azure Network Insights](./insights/network-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/networkInsights) | Provides a comprehensive view of health and metrics for all your network resources. The advanced search capability helps you identify resource dependencies, enabling scenarios like identifying the resources that are hosting your website, simply by searching for your website name. |
+ | [Azure Monitor for Resource Groups](./insights/resource-group-insights.md) | GA | No | Triage and diagnose any problems your individual resources encounter, while offering context as to the health and performance of the resource group as a whole. |
| [Azure Monitor SAP](../virtual-machines/workloads/sap/monitor-sap-on-azure.md) | GA | No | An Azure-native monitoring product for anyone running their SAP landscapes on Azure. It works with both SAP on Azure Virtual Machines and SAP on Azure Large Instances. Collects telemetry data from Azure infrastructure and databases in one central location and visually correlates the data for faster troubleshooting. You can monitor different components of an SAP landscape, such as Azure virtual machines (VMs), high-availability cluster, SAP HANA database, SAP NetWeaver, and so on, by adding the corresponding provider for that component. |
- | [Azure Stack HCI insights](/azure-stack/hci/manage/azure-stack-hci-insights) | Preview | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/azureStackHCIInsights) | Azure Monitor Workbook based. Provides health, performance, and usage insights about registered Azure Stack HCI, version 21H2 clusters that are connected to Azure and are enrolled in monitoring. It stores its data in a Log Analytics workspace, which allows it to deliver powerful aggregation and filtering and analyze data trends over time. |
- | [Azure VM Insights](/azure/azure-monitor/insights/vminsights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/virtualMachines) | Monitors your Azure virtual machines (VM) and virtual machine scale sets at scale. It analyzes the performance and health of your Windows and Linux VMs, and monitors their processes and dependencies on other resources and external processes. |
- | [Azure Virtual Desktop Insights](../virtual-desktop/azure-monitor.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_WVD/WvdManagerMenuBlade/insights/menuId/insights) | Azure Virtual Desktop Insights is a dashboard built on Azure Monitor Workbooks that helps IT professionals understand their Azure Virtual Desktop environments. |
+ | [Azure Stack HCI insights](/azure-stack/hci/manage/azure-stack-hci-insights) | Preview | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/azureStackHCIInsights) | Azure Monitor Workbook based. Provides health, performance, and usage insights about registered Azure Stack HCI, version 21H2 clusters that are connected to Azure and are enrolled in monitoring. It stores its data in a Log Analytics workspace, which allows it to deliver powerful aggregation and filtering and analyze data trends over time. |
+ | [Azure VM Insights](/azure/azure-monitor/insights/vminsights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/virtualMachines) | Monitors your Azure virtual machines (VM) and virtual machine scale sets at scale. It analyzes the performance and health of your Windows and Linux VMs, and monitors their processes and dependencies on other resources and external processes. |
+ | [Azure Virtual Desktop Insights](../virtual-desktop/azure-monitor.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_WVD/WvdManagerMenuBlade/insights/menuId/insights) | Azure Virtual Desktop Insights is a dashboard built on Azure Monitor Workbooks that helps IT professionals understand their Azure Virtual Desktop environments. |
## Product integrations
The other services and older monitoring solutions in the following table store t
| [Microsoft Teams Rooms](/microsoftteams/room-systems/azure-monitor-deploy) | Integrated, end-to-end management of Microsoft Teams Rooms devices. |
| [Visual Studio App Center](/appcenter/) | Build, test, and distribute applications and then monitor their status and usage. See [Start analyzing your mobile app with App Center and Application Insights](app/mobile-center-quickstart.md). |
| Windows | [Windows Update Compliance](/windows/deployment/update/update-compliance-get-started) - Assess your Windows desktop upgrades.<br>[Desktop Analytics](/configmgr/desktop-analytics/overview) - Integrates with Configuration Manager to provide insight and intelligence to make more informed decisions about the update readiness of your Windows clients. |
-| **The following solutions also integrate with parts of Azure Monitor. Note that solutions, are no longer under active development. Use [insights](#insights-and-curated-visualizations) instead.** | |
-| Network - [Network Performance Monitor solution](insights/network-performance-monitor.md) |
+| **The following solutions also integrate with parts of Azure Monitor. Note that solutions are no longer under active development. Use [insights](#insights-and-curated-visualizations) instead.** | |
+| Network - [Network Performance Monitor solution](insights/network-performance-monitor.md) |
| Network - [Azure Application Gateway Solution](insights/azure-networking-analytics.md#azure-application-gateway-analytics) | . |
| [Office 365 solution](insights/solution-office-365.md) | Monitor your Office 365 environment. Updated version with improved onboarding available through Microsoft Sentinel. |
| [SQL Analytics solution](insights/azure-sql.md) | Use SQL Insights instead |
The other services and older monitoring solutions in the following table store t
| Integration | Description |
|:--|:--|
| [ITSM](alerts/itsmc-overview.md) | The IT Service Management Connector (ITSMC) allows you to connect Azure and a supported IT Service Management (ITSM) product/service. |
-| [Azure Monitor Partners](./partners.md) | A list of partners that integrate with Azure Monitor in some form |
+| [Azure Monitor Partners](./partners.md) | A list of partners that integrate with Azure Monitor in some form |
| [Azure Monitor Partner integrations](../partner-solutions/overview.md)| Specialized integrations between Azure Monitor and other non-Microsoft monitoring platforms if you've already built on them. Examples include Datadog and Elastic. |
Azure Monitor can collect data from resources outside of Azure using the methods
## Azure supported services
-
-The following table lists Azure services and the data they collect into Azure Monitor.
-- Metrics - The service automatically collects metrics into Azure Monitor Metrics.
+The following table lists Azure services and the data they collect into Azure Monitor.
+
+- Metrics - The service automatically collects metrics into Azure Monitor Metrics.
- Logs - The service supports diagnostic settings which can send metrics and platform logs into Azure Monitor Logs for analysis in Log Analytics.
- Insight - There is an insight available which provides a customized monitoring experience for the service.
The following table lists Azure services and the data they collect into Azure Mo
| [Azure Logic Apps](../logic-apps/index.yml) | Microsoft.Logic/workflows | [**Yes**](./essentials/metrics-supported.md#microsoftlogicworkflows) | [**Yes**](./essentials/resource-logs-categories.md#microsoftlogicworkflows) | | |
| [Azure Machine Learning](../machine-learning/index.yml) | Microsoft.MachineLearningServices/workspaces | [**Yes**](./essentials/metrics-supported.md#microsoftmachinelearningservicesworkspaces) | [**Yes**](./essentials/resource-logs-categories.md#microsoftmachinelearningservicesworkspaces) | | |
| [Azure Maps](../azure-maps/index.yml) | Microsoft.Maps/accounts | [**Yes**](./essentials/metrics-supported.md#microsoftmapsaccounts) | No | | |
 - | [Azure Media Services](../media-services/index.yml) | Microsoft.Media/mediaservices | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservices) | [**Yes**](./essentials/resource-logs-categories.md#microsoftmediamediaservices) | | |
 - | [Azure Media Services](../media-services/index.yml) | Microsoft.Media/mediaservices/liveEvents | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservicesliveevents) | No | | |
 - | [Azure Media Services](../media-services/index.yml) | Microsoft.Media/mediaservices/streamingEndpoints | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservicesstreamingendpoints) | No | | |
 - | [Azure Media Services](../media-services/index.yml) | Microsoft.Media/videoAnalyzers | [**Yes**](./essentials/metrics-supported.md#microsoftmediavideoanalyzers) | [**Yes**](./essentials/resource-logs-categories.md#microsoftmediavideoanalyzers) | | |
 + | [Azure Media Services](/media-services/) | Microsoft.Media/mediaservices | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservices) | [**Yes**](./essentials/resource-logs-categories.md#microsoftmediamediaservices) | | |
 + | [Azure Media Services](/media-services/) | Microsoft.Media/mediaservices/liveEvents | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservicesliveevents) | No | | |
 + | [Azure Media Services](/media-services/) | Microsoft.Media/mediaservices/streamingEndpoints | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservicesstreamingendpoints) | No | | |
 + | [Azure Media Services](/media-services/) | Microsoft.Media/videoAnalyzers | [**Yes**](./essentials/metrics-supported.md#microsoftmediavideoanalyzers) | [**Yes**](./essentials/resource-logs-categories.md#microsoftmediavideoanalyzers) | | |
| [Azure Spatial Anchors](../spatial-anchors/index.yml) | Microsoft.MixedReality/remoteRenderingAccounts | [**Yes**](./essentials/metrics-supported.md#microsoftmixedrealityremoterenderingaccounts) | No | | |
| [Azure Spatial Anchors](../spatial-anchors/index.yml) | Microsoft.MixedReality/spatialAnchorsAccounts | [**Yes**](./essentials/metrics-supported.md#microsoftmixedrealityspatialanchorsaccounts) | No | | |
| [Azure NetApp Files](../azure-netapp-files/index.yml) | Microsoft.NetApp/netAppAccounts/capacityPools | [**Yes**](./essentials/metrics-supported.md#microsoftnetappnetappaccountscapacitypools) | No | | |
azure-monitor Vminsights Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/vm/vminsights-overview.md
The steps to configure VM insights are as follows. Follow each link for detailed
- [Add VMInsights solution to workspace.](./vminsights-configure-workspace.md#add-vminsights-solution-to-workspace)
- [Install agents on virtual machine and virtual machine scale set to be monitored.](./vminsights-enable-overview.md)
-
+Currently, VM insights does not support multi-homing.
## Next steps
azure-portal View Quotas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-portal/supportability/view-quotas.md
In the list of quotas, you can toggle the arrow shown next to **Quota** to expan
You can request quota increases directly from **My quotas**. The process for requesting an increase will depend on the type of quota.
+> [!NOTE]
+> There is no cost associated with requesting a quota increase. Costs are incurred based on resource usage, not the quotas themselves.
+
### Request a quota increase

Some quotas display a pencil icon. Select this icon to quickly request an increase for that quota.
azure-resource-manager Azure Services Resource Providers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/azure-services-resource-providers.md
The resources providers that are marked with **- registered** are registered by
| Microsoft.Marketplace | core |
| Microsoft.MarketplaceApps | core |
| Microsoft.MarketplaceOrdering - [registered](#registration) | core |
-| Microsoft.Media | [Media Services](../../media-services/index.yml) |
+| Microsoft.Media | [Media Services](/media-services/) |
| Microsoft.Microservices4Spring | [Azure Spring Cloud](../../spring-cloud/overview.md) |
| Microsoft.Migrate | [Azure Migrate](../../migrate/migrate-services-overview.md) |
| Microsoft.MixedReality | [Azure Spatial Anchors](../../spatial-anchors/index.yml) |
azure-resource-manager Azure Subscription Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/azure-subscription-service-limits.md
The following table details the features and limits of the Basic, Standard, and
### Media Services v2 (legacy)
-For limits specific to Media Services v2 (legacy), see [Media Services v2 (legacy)](../../media-services/previous/media-services-quotas-and-limitations.md)
+For limits specific to Media Services v2 (legacy), see [Media Services v2 (legacy)](/media-services/previous/media-services-quotas-and-limitations).
## Mobile Services limits
azure-resource-manager Resource Name Rules https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/resource-name-rules.md
Title: Resource naming restrictions
description: Shows the rules and restrictions for naming Azure resources.
Previously updated : 03/08/2022
Last updated : 03/29/2022

# Naming rules and restrictions for Azure resources
In the following tables, the term alphanumeric refers to:
> [!div class="mx-tableFixed"]
> | Entity | Scope | Length | Valid Characters |
> | --- | --- | --- | --- |
-> | managedInstances | global | 1-63 | Lowercase letters, numbers, and hyphens.<br><br>Can't start or end with hyphen. <br><br> Can't have any special characters, such as `@`. |
-> | servers | global | 1-63 | Lowercase letters, numbers, and hyphens.<br><br>Can't start or end with hyphen. |
+> | managedInstances | global | 1-63 | Lowercase letters, numbers, and hyphens.<br><br> Can't start or end with hyphen. Can't have a hyphen in both the third and fourth places.<br><br> Can't have any special characters, such as `@`. |
+> | servers | global | 1-63 | Lowercase letters, numbers, and hyphens.<br><br>Can't start or end with hyphen. Can't have a hyphen in both the third and fourth places. |
> | servers / administrators | server |  | Must be `ActiveDirectory`. |
> | servers / databases | server | 1-128 | Can't use:<br>`<>*%&:\/?` or control characters<br><br>Can't end with period or space. |
> | servers / databases / syncGroups | database | 1-150 | Alphanumerics, hyphens, and underscores. |
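As an illustration of the server naming rules above, here's a minimal validation sketch (a hypothetical helper, not an official API; it checks only the rules listed for `servers`):

```cs
using System;
using System.Text.RegularExpressions;

// Hypothetical validator for the "servers" naming rules in the table above.
static bool IsValidSqlServerName(string name) =>
    name.Length >= 1 && name.Length <= 63
    && Regex.IsMatch(name, "^[a-z0-9-]+$")          // lowercase letters, numbers, hyphens
    && !name.StartsWith('-') && !name.EndsWith('-') // can't start or end with hyphen
    && !(name.Length >= 4 && name[2] == '-' && name[3] == '-'); // no hyphen in both third and fourth place

Console.WriteLine(IsValidSqlServerName("my-server")); // True
Console.WriteLine(IsValidSqlServerName("ab--cdef"));  // False: hyphens in third and fourth place
```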
azure-signalr Concept Connection String https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/concept-connection-string.md
+
+ Title: Connection string in Azure SignalR Service
+description: An overview of connection strings in Azure SignalR Service, how to generate one, and how to configure it in your app server
+Last updated : 03/25/2022
+# Connection string in Azure SignalR Service
+
+A connection string contains the information your application needs to connect to Azure SignalR Service. In this article, you'll learn the basics of connection strings and how to configure one in your application.
+
+## What is a connection string
+
+When an application needs to connect to Azure SignalR Service, it needs the following information:
+
+* The HTTP endpoint of the SignalR service instance
+* How to authenticate with the service endpoint
+
+The connection string contains this information. To see what a connection string looks like, open a SignalR Service resource in the Azure portal and go to the **Keys** tab. You'll see two connection strings (primary and secondary) in the following format:
+
+```
+Endpoint=https://<resource_name>.service.signalr.net;AccessKey=<access_key>;Version=1.0;
+```
+
+> [!NOTE]
+> Besides the portal, you can also use the Azure CLI to get the connection string:
+>
+> ```bash
+> az signalr key list -g <resource_group> -n <resource_name>
+> ```
+
+As you can see, the connection string contains two main pieces of information:
+
+* `Endpoint=https://<resource_name>.service.signalr.net` is the endpoint URL of the resource
+* `AccessKey=<access_key>` is the key used to authenticate with the service. When an access key is specified in the connection string, the SignalR Service SDK uses it to generate a token that can be validated by the service.
+
+>[!NOTE]
+> For more information about how access tokens are generated and validated, see this [article](https://github.com/Azure/azure-signalr/blob/dev/docs/rest-api.md#authenticate-via-azure-signalr-service-accesskey).
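+To make this concrete, here's a minimal sketch of how such a token could be generated from the access key: an HMAC-SHA256-signed JWT whose audience is the URL being accessed, per the linked doc. The `GenerateAccessToken` helper and the one-hour lifetime are illustrative assumptions; the SDK does this for you.
+
+```cs
+using System;
+using System.IdentityModel.Tokens.Jwt;
+using System.Text;
+using Microsoft.IdentityModel.Tokens;
+
+static string GenerateAccessToken(string audience, string accessKey)
+{
+    // Sign the token with the access key; the service validates it with the same key.
+    var credentials = new SigningCredentials(
+        new SymmetricSecurityKey(Encoding.UTF8.GetBytes(accessKey)),
+        SecurityAlgorithms.HmacSha256);
+    var token = new JwtSecurityToken(
+        audience: audience,                   // the service URL being accessed
+        expires: DateTime.UtcNow.AddHours(1), // illustrative lifetime
+        signingCredentials: credentials);
+    return new JwtSecurityTokenHandler().WriteToken(token);
+}
+```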
+
+## Other authentication types
+
+Besides access keys, SignalR Service also supports other authentication methods in the connection string.
+
+### Azure Active Directory Application
+
+You can use an [Azure AD application](/azure/active-directory/develop/app-objects-and-service-principals) to connect to SignalR Service. As long as the application has the right permissions to access SignalR Service, no access key is needed.
+
+To use Azure AD authentication, remove `AccessKey` from the connection string and add `AuthType=aad`. You also need to specify the credentials of your Azure AD application, including the client ID, client secret, and tenant ID. The connection string looks as follows:
+
+```
+Endpoint=https://<resource_name>.service.signalr.net;AuthType=aad;ClientId=<client_id>;ClientSecret=<client_secret>;TenantId=<tenant_id>;Version=1.0;
+```
+
+For more information about how to authenticate using an Azure AD application, see this [article](signalr-howto-authorize-application.md).
+
+### Managed identity
+
+You can also use [managed identity](/azure/active-directory/managed-identities-azure-resources/overview) to authenticate with SignalR service.
+
+There are two types of managed identities. To use a system-assigned identity, you just need to add `AuthType=aad` to the connection string:
+
+```
+Endpoint=https://<resource_name>.service.signalr.net;AuthType=aad;Version=1.0;
+```
+
+The SignalR Service SDK automatically uses the identity of your app server.
+
+To use a user-assigned identity, you also need to specify the client ID of the managed identity:
+
+```
+Endpoint=https://<resource_name>.service.signalr.net;AuthType=aad;ClientId=<client_id>;Version=1.0;
+```
+
+For more information about how to configure managed identity, see this [article](signalr-howto-authorize-managed-identity.md).
+
+> [!NOTE]
+> It's highly recommended to use Azure AD to authenticate with SignalR Service, as it's more secure than using access keys. If you don't use access key authentication at all, consider disabling it completely (in the Azure portal, go to **Keys** > **Access Key** > **Disable**). If you still use access keys, rotate them regularly; for more information, see this [article](signalr-howto-key-rotation.md).
+
+## Client and server endpoints
+
+The connection string contains the HTTP endpoint that the app server uses to connect to SignalR Service. This is also the endpoint the server returns to clients in the negotiate response, so clients can connect to the service as well.
+
+In some applications, however, there may be an additional component in front of SignalR Service, and all client connections need to go through that component first, to gain benefits like network security. [Azure Application Gateway](/azure/application-gateway/overview) is a common service that provides such functionality.
+
+In such cases, the client needs to connect to an endpoint that differs from the SignalR Service endpoint. Instead of manually replacing the endpoint on the client side, you can add `ClientEndpoint` to the connection string:
+
+```
+Endpoint=https://<resource_name>.service.signalr.net;AccessKey=<access_key>;ClientEndpoint=https://<url_to_app_gateway>;Version=1.0;
+```
+
+The app server then returns the right endpoint URL in the negotiate response for clients to connect to.
+
+> [!NOTE]
+> For more information about how clients get service url through negotiate, see this [article](signalr-concept-internals.md#client-connections).
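+For illustration, with `ClientEndpoint` set, the negotiate response returned by the app server might look like the following (a simplified, hypothetical example; the hub name and token are placeholders):
+
+```
+{
+    "url": "https://<url_to_app_gateway>/client/?hub=<hub_name>",
+    "accessToken": "<access_token>"
+}
+```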
+
+Similarly, when the server wants to make [server connections](signalr-concept-internals.md#server-connections) or call [REST APIs](https://github.com/Azure/azure-signalr/blob/dev/docs/rest-api.md) on the service, SignalR Service may also be behind another service like Application Gateway. In that case, you can use `ServerEndpoint` to specify the actual endpoint for server connections and REST APIs:
+
+```
+Endpoint=https://<resource_name>.service.signalr.net;AccessKey=<access_key>;ServerEndpoint=https://<url_to_app_gateway>;Version=1.0;
+```
+
+## Use connection string generator
+
+Composing a connection string manually can be cumbersome and error-prone. To help, the Azure portal provides a tool for generating connection strings with additional information like the client endpoint and auth type.
+
+To use the connection string generator, open the SignalR resource in the Azure portal and go to the **Connection strings** tab.
+On this page, you can choose different authentication types (access key, managed identity, or Azure AD application) and enter information like the client endpoint, client ID, and client secret. The connection string is then generated automatically. You can copy it and use it in your application.
+
+> [!NOTE]
+> Nothing you enter on this page is saved after you leave it (because it's only client-side information), so copy and save the generated connection string in a secure place for your application to use.
+
+## Configure connection string in your application
+
+There are two ways to configure a connection string in your application.
+
+You can set the connection string when calling `AddAzureSignalR()` API:
+
+```cs
+services.AddSignalR().AddAzureSignalR("<connection_string>");
+```
+
+Or you can call `AddAzureSignalR()` without any arguments. The service SDK then reads the connection string from a config entry named `Azure:SignalR:ConnectionString` in your [config providers](/dotnet/core/extensions/configuration-providers).
+
+In a local development environment, the config is usually stored in a file (appsettings.json or secrets.json) or in environment variables, so you can use one of the following ways to configure the connection string:
+
+* Use .NET secret manager (`dotnet user-secrets set Azure:SignalR:ConnectionString "<connection_string>"`)
+* Set an environment variable named `Azure__SignalR__ConnectionString` (colons need to be replaced with double underscores in the [environment variable config provider](/dotnet/core/extensions/configuration-providers#environment-variable-configuration-provider)); see the example after this list.
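+For example, setting the environment variable in a shell might look like this (a hypothetical session; substitute your own connection string):
+
+```bash
+export Azure__SignalR__ConnectionString="Endpoint=https://<resource_name>.service.signalr.net;AccessKey=<access_key>;Version=1.0;"
+```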
+
+In production environments, you can use other Azure services to manage config and secrets, like Azure [Key Vault](/azure/key-vault/general/overview) and [App Configuration](/azure/azure-app-configuration/overview). See their documentation to learn how to set up a config provider for those services.
+
+> [!NOTE]
+> Even when you set the connection string directly in code, it's not recommended to hardcode the connection string in source code. Instead, read the connection string from a secret store like Key Vault and pass it to `AddAzureSignalR()`.
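+Here's a minimal sketch of that pattern, assuming the `Azure.Identity` and `Azure.Security.KeyVault.Secrets` packages and a hypothetical secret named `SignalRConnectionString`:
+
+```cs
+using System;
+using Azure.Identity;
+using Azure.Security.KeyVault.Secrets;
+
+// Read the connection string from Key Vault at startup instead of hardcoding it.
+var secretClient = new SecretClient(
+    new Uri("https://<your-key-vault>.vault.azure.net"),
+    new DefaultAzureCredential());
+string connectionString = secretClient.GetSecret("SignalRConnectionString").Value.Value;
+
+services.AddSignalR().AddAzureSignalR(connectionString);
+```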
+
+### Configure multiple connection strings
+
+Azure SignalR Service also allows the server to connect to multiple service endpoints at the same time, so it can handle more connections than one service instance's limit allows. If one service instance is down, the other service instances can be used as a backup. For more information about how to use multiple instances, see this [article](signalr-howto-scale-multi-instances.md).
+
+There are also two ways to configure multiple instances:
+
+* Through code
+
+ ```cs
+ services.AddSignalR().AddAzureSignalR(options =>
+ {
+ options.Endpoints = new ServiceEndpoint[]
+ {
+ new ServiceEndpoint("<connection_string_1>", name: "name_a"),
+ new ServiceEndpoint("<connection_string_2>", name: "name_b", type: EndpointType.Primary),
+ new ServiceEndpoint("<connection_string_3>", name: "name_c", type: EndpointType.Secondary),
+ };
+ });
+ ```
+
+ You can assign a name and type to each service endpoint so you can distinguish them later.
+
+* Through config
+
+ You can use any supported config provider (secret manager, environment variables, key vault, etc.) to store connection strings. Take secret manager as an example:
+
+ ```bash
+ dotnet user-secrets set Azure:SignalR:ConnectionString:name_a <connection_string_1>
+ dotnet user-secrets set Azure:SignalR:ConnectionString:name_b:primary <connection_string_2>
+ dotnet user-secrets set Azure:SignalR:ConnectionString:name_c:secondary <connection_string_3>
+ ```
+
+ You can also assign a name and type to each endpoint by using a different config name in the following format:
+
+ ```
+ Azure:SignalR:ConnectionString:<name>:<type>
+ ```
azure-sql Dtu Benchmark https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/dtu-benchmark.md
+
+ Title: DTU benchmark
+description: Learn about the benchmark for the DTU-based purchasing model for Azure SQL Database.
+ms.devlang:
+Last updated : 03/29/2022
+# DTU benchmark
+
+A database transaction unit (DTU) is a unit of measure representing a blended measure of CPU, memory, reads, and writes. Physical characteristics (CPU, memory, IO) associated with each DTU measure are calibrated using a benchmark that simulates real-world database workload. This article summarizes the DTU benchmark and shares information about the schema, transaction types used, workload mix, users and pacing, scaling rules, and metrics associated with the benchmark.
+
+For general information about the DTU-based purchasing model, see the [DTU-based purchasing model overview](service-tiers-dtu.md).
+
+## Benchmark summary
+
+The DTU benchmark measures the performance of a mix of basic database operations that occur most frequently in online transaction processing (OLTP) workloads. Although the benchmark is designed with cloud computing in mind, the database schema, data population, and transactions have been designed to be broadly representative of the basic elements most commonly used in OLTP workloads.
+
+## Correlating benchmark results to real world database performance
+
+It's important to understand that all benchmarks are representative and indicative only. The transaction rates achieved with the benchmark application will not be the same as those that might be achieved with other applications. The benchmark comprises a collection of different transaction types run against a schema containing a range of tables and data types. While the benchmark exercises the same basic operations that are common to all OLTP workloads, it doesn't represent any specific class of database or application. The goal of the benchmark is to provide a reasonable guide to the relative performance of a database that might be expected when scaling up or down between compute sizes.
+
+In reality, databases are of different sizes and complexity, encounter different mixes of workloads, and will respond in different ways. For example, an IO-intensive application may hit IO thresholds sooner, or a CPU-intensive application may hit CPU limits sooner. There is no guarantee that any particular database will scale in the same way as the benchmark under increasing load.
+
+The benchmark and its methodology are described in more detail in this article.
+
+## Schema
+
+The schema is designed to have enough variety and complexity to support a broad range of operations. The benchmark runs against a database comprised of six tables. The tables fall into three categories: fixed-size, scaling, and growing. There are two fixed-size tables; three scaling tables; and one growing table. Fixed-size tables have a constant number of rows. Scaling tables have a cardinality that is proportional to database performance, but doesn't change during the benchmark. The growing table is sized like a scaling table on initial load, but then the cardinality changes in the course of running the benchmark as rows are inserted and deleted.
+
+The schema includes a mix of data types, including integer, numeric, character, and date/time. The schema includes primary and secondary keys, but not any foreign keys - that is, there are no referential integrity constraints between tables.
+
+A data generation program generates the data for the initial database. Integer and numeric data is generated with various strategies. In some cases, values are distributed randomly over a range. In other cases, a set of values is randomly permuted to ensure that a specific distribution is maintained. Text fields are generated from a weighted list of words to produce realistic looking data.
+
+The database is sized based on a "scale factor." The scale factor (abbreviated as SF) determines the cardinality of the scaling and growing tables. As described below in the section Users and Pacing, the database size, number of users, and maximum performance all scale in proportion to each other.
+
+## Transactions
+
+The workload consists of nine transaction types, as shown in the table below. Each transaction is designed to highlight a particular set of system characteristics in the database engine and system hardware, with high contrast from the other transactions. This approach makes it easier to assess the impact of different components on overall performance. For example, the "Read Heavy" transaction produces a significant number of read operations from disk.
+
+| Transaction Type | Description |
+| | |
+| Read Lite |SELECT; in-memory; read-only |
+| Read Medium |SELECT; mostly in-memory; read-only |
+| Read Heavy |SELECT; mostly not in-memory; read-only |
+| Update Lite |UPDATE; in-memory; read-write |
+| Update Heavy |UPDATE; mostly not in-memory; read-write |
+| Insert Lite |INSERT; in-memory; read-write |
+| Insert Heavy |INSERT; mostly not in-memory; read-write |
+| Delete |DELETE; mix of in-memory and not in-memory; read-write |
+| CPU Heavy |SELECT; in-memory; relatively heavy CPU load; read-only |
+
+## Workload mix
+
+Transactions are selected at random from a weighted distribution with the following overall mix. The overall mix has a read/write ratio of approximately 2:1 (the read-only transaction types account for 70 percent of the mix).
+
+| Transaction Type | % of Mix |
+| | |
+| Read Lite |35 |
+| Read Medium |20 |
+| Read Heavy |5 |
+| Update Lite |20 |
+| Update Heavy |3 |
+| Insert Lite |3 |
+| Insert Heavy |2 |
+| Delete |2 |
+| CPU Heavy |10 |
+
+## Users and pacing
+
+The benchmark workload is driven from a tool that submits transactions across a set of connections to simulate the behavior of a number of concurrent users. Although all of the connections and transactions are machine generated, for simplicity we refer to these connections as "users." Although each user operates independently of all other users, all users perform the same cycle of steps shown below:
+
+1. Establish a database connection.
+2. Repeat until signaled to exit:
+ - Select a transaction at random (from a weighted distribution).
+ - Perform the selected transaction and measure the response time.
+ - Wait for a pacing delay.
+3. Close the database connection.
+4. Exit.
+
+The pacing delay (in step 2c) is selected at random, but with a distribution that has an average of 1.0 second. Thus each user can, on average, generate at most one transaction per second.
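+For illustration only, the user cycle and pacing could be sketched as follows (hypothetical driver code, not the actual benchmark tool; the transaction body is a placeholder):
+
+```cs
+using System;
+using System.Linq;
+using System.Threading;
+using System.Threading.Tasks;
+
+var rng = new Random();
+var cts = new CancellationTokenSource(TimeSpan.FromHours(1)); // minimum measurement duration
+
+// Weighted mix from the workload mix table above (weights sum to 100).
+(string Name, int Weight)[] mix =
+{
+    ("Read Lite", 35), ("Read Medium", 20), ("Read Heavy", 5),
+    ("Update Lite", 20), ("Update Heavy", 3), ("Insert Lite", 3),
+    ("Insert Heavy", 2), ("Delete", 2), ("CPU Heavy", 10),
+};
+
+while (!cts.IsCancellationRequested)
+{
+    // 2a: select a transaction at random from the weighted distribution.
+    int roll = rng.Next(100);
+    string tx = mix.First(m => (roll -= m.Weight) < 0).Name;
+
+    // 2b: perform the selected transaction and measure the response time.
+    var sw = System.Diagnostics.Stopwatch.StartNew();
+    /* run transaction `tx` against the database here */
+    sw.Stop();
+
+    // 2c: pacing delay averaging 1.0 second, which caps each user at ~1 TPS.
+    await Task.Delay(TimeSpan.FromMilliseconds(rng.Next(0, 2000)));
+}
+```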
+
+## Scaling rules
+
+The number of users is determined by the database size (in scale-factor units). There is one user for every five scale-factor units. Because of the pacing delay, one user can generate at most one transaction per second, on average.
+
+For example, a scale-factor of 500 (SF=500) database will have 100 users and can achieve a maximum rate of 100 TPS. To drive a higher TPS rate requires more users and a larger database.
+
+## Measurement duration
+
+A valid benchmark run requires a steady-state measurement duration of at least one hour.
+
+## Metrics
+
+The key metrics in the benchmark are throughput and response time.
+
+- Throughput is the essential performance measure in the benchmark. Throughput is reported in transactions per unit-of-time, counting all transaction types.
+- Response time is a measure of performance predictability. The response time constraint varies with class of service, with higher classes of service having a more stringent response time requirement, as shown below.
+
+| Class of Service | Throughput Measure | Response Time Requirement |
+| | | |
+| [Premium](service-tiers-dtu.md#compare-service-tiers) |Transactions per second |95th percentile at 0.5 seconds |
+| [Standard](service-tiers-dtu.md#compare-service-tiers) |Transactions per minute |90th percentile at 1.0 seconds |
+| [Basic](service-tiers-dtu.md#compare-service-tiers) |Transactions per hour |80th percentile at 2.0 seconds |
+
+> [!NOTE]
+> Response time metrics are specific to the [DTU Benchmark](#dtu-benchmark). Response times for other workloads are workload-dependent and will differ.
+
+## Next steps
+
+Learn more about purchasing models and related concepts in the following articles:
+
+- [DTU-based purchasing model overview](service-tiers-dtu.md)
+- [vCore purchasing model - Azure SQL Database](service-tiers-sql-database-vcore.md)
+- [Compare vCore and DTU-based purchasing models of Azure SQL Database](purchasing-models.md)
+- [Migrate Azure SQL Database from the DTU-based model to the vCore-based model](migrate-dtu-to-vcore.md)
+- [Resource limits for single databases using the DTU purchasing model - Azure SQL Database](resource-limits-dtu-single-databases.md)
+- [Resources limits for elastic pools using the DTU purchasing model](resource-limits-dtu-elastic-pools.md)
azure-sql Service Tiers Dtu https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/service-tiers-dtu.md
Last updated 02/02/2022
# DTU-based purchasing model overview

[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)]
-In this article, learn about the DTU-based purchasing model for Azure SQL Database.
+In this article, learn about the DTU-based purchasing model for Azure SQL Database.
To learn more, review [vCore-based purchasing model](service-tiers-vcore.md) and [compare purchasing models](purchasing-models.md).

## Database transaction units (DTUs)

A database transaction unit (DTU) represents a blended measure of CPU, memory, reads, and writes. Service tiers in the DTU-based purchasing model are differentiated by a range of compute sizes with a fixed amount of included storage, fixed retention period for backups, and fixed price. All service tiers in the DTU-based purchasing model provide flexibility of changing compute sizes with minimal [downtime](https://azure.microsoft.com/support/legal/sla/azure-sql-database); however, there is a switch over period where connectivity is lost to the database for a short amount of time, which can be mitigated using retry logic. Single databases and elastic pools are billed hourly based on service tier and compute size.

For a single database at a specific compute size within a [service tier](single-database-scale.md), Azure SQL Database guarantees a certain level of resources for that database (independent of any other database). This guarantee provides a predictable level of performance. The amount of resources allocated for a database is calculated as a number of DTUs and is a bundled measure of compute, storage, and I/O resources.
-The ratio among these resources is originally determined by an [online transaction processing (OLTP) benchmark workload](service-tiers-dtu.md) designed to be typical of real-world OLTP workloads. When your workload exceeds the amount of any of these resources, your throughput is throttled, resulting in slower performance and time-outs.
+The ratio among these resources is originally determined by an [online transaction processing (OLTP) benchmark workload](dtu-benchmark.md) designed to be typical of real-world OLTP workloads. When your workload exceeds the amount of any of these resources, your throughput is throttled, resulting in slower performance and time-outs.
For single databases, the resources used by your workload don't impact the resources available to other databases in the Azure cloud. Likewise, the resources used by other workloads don't impact the resources available to your database.
In the DTU-based purchasing model, customers cannot choose the hardware generati
For example, a database can be moved to a different hardware generation if it's scaled up or down to a different service objective, or if the current infrastructure in a datacenter is approaching its capacity limits, or if the currently used hardware is being decommissioned due to its end of life.
-If a database is moved to different hardware, workload performance can change. The DTU model guarantees that the throughput and response time of the [DTU benchmark](./service-tiers-dtu.md#dtu-benchmark) workload will remain substantially identical as the database moves to a different hardware generation, as long as its service objective (the number of DTUs) stays the same.
+If a database is moved to different hardware, workload performance can change. The DTU model guarantees that the throughput and response time of the [DTU benchmark](dtu-benchmark.md) workload will remain substantially identical as the database moves to a different hardware generation, as long as its service objective (the number of DTUs) stays the same.
-However, across the wide spectrum of customer workloads running in Azure SQL Database, the impact of using different hardware for the same service objective can be more pronounced. Different workloads will benefit from different hardware configuration and features. Therefore, for workloads other than the DTU benchmark, it's possible to see performance differences if the database moves from one hardware generation to another.
+However, across the wide spectrum of customer workloads running in Azure SQL Database, the impact of using different hardware for the same service objective can be more pronounced. Different workloads will benefit from different hardware configuration and features. Therefore, for workloads other than the [DTU benchmark](dtu-benchmark.md), it's possible to see performance differences if the database moves from one hardware generation to another.
For example, an application that is sensitive to network latency can see better performance on Gen5 hardware vs. Gen4 due to the use of Accelerated Networking in Gen5, but an application using intensive read IO can see better performance on Gen4 hardware versus Gen5 due to a higher memory per core ratio on Gen4.
Choosing a service tier depends primarily on business continuity, storage, and p
> [!NOTE]
> You can get a free database in Azure SQL Database at the Basic service tier in conjunction with an Azure free account to explore Azure. For information, see [Create a managed cloud database with your Azure free account](https://azure.microsoft.com/free/services/sql-database/).

## Resource limits

Resource limits differ for single and pooled databases.
To learn more, review [Resource limits for pooled databases](resource-limits-dtu
## DTU Benchmark
-Physical characteristics (CPU, memory, IO) associated to each DTU measure are calibrated using a benchmark that simulates real-world database workload.
-
-### Correlating benchmark results to real world database performance
-
-It is important to understand that all benchmarks are representative and indicative only. The transaction rates achieved with the benchmark application will not be the same as those that might be achieved with other applications. The benchmark comprises a collection of different transaction types run against a schema containing a range of tables and data types. While the benchmark exercises the same basic operations that are common to all OLTP workloads, it does not represent any specific class of database or application. The goal of the benchmark is to provide a reasonable guide to the relative performance of a database that might be expected when scaling up or down between compute sizes. In reality, databases are of different sizes and complexity, encounter different mixes of workloads, and will respond in different ways. For example, an IO-intensive application may hit IO thresholds sooner, or a CPU-intensive application may hit CPU limits sooner. There is no guarantee that any particular database will scale in the same way as the benchmark under increasing load.
-
-The benchmark and its methodology are described in more detail below.
-
-### Benchmark summary
-
-The benchmark measures the performance of a mix of basic database operations that occur most frequently in online transaction processing (OLTP) workloads. Although the benchmark is designed with cloud computing in mind, the database schema, data population, and transactions have been designed to be broadly representative of the basic elements most commonly used in OLTP workloads.
-
-### Schema
-
-The schema is designed to have enough variety and complexity to support a broad range of operations. The benchmark runs against a database comprised of six tables. The tables fall into three categories: fixed-size, scaling, and growing. There are two fixed-size tables; three scaling tables; and one growing table. Fixed-size tables have a constant number of rows. Scaling tables have a cardinality that is proportional to database performance, but doesnΓÇÖt change during the benchmark. The growing table is sized like a scaling table on initial load, but then the cardinality changes in the course of running the benchmark as rows are inserted and deleted.
-
-The schema includes a mix of data types, including integer, numeric, character, and date/time. The schema includes primary and secondary keys, but not any foreign keys - that is, there are no referential integrity constraints between tables.
-
-A data generation program generates the data for the initial database. Integer and numeric data is generated with various strategies. In some cases, values are distributed randomly over a range. In other cases, a set of values is randomly permuted to ensure that a specific distribution is maintained. Text fields are generated from a weighted list of words to produce realistic looking data.
-
-The database is sized based on a ΓÇ£scale factor.ΓÇ¥ The scale factor (abbreviated as SF) determines the cardinality of the scaling and growing tables. As described below in the section Users and Pacing, the database size, number of users, and maximum performance all scale in proportion to each other.
-
-### Transactions
-
-The workload consists of nine transaction types, as shown in the table below. Each transaction is designed to highlight a particular set of system characteristics in the database engine and system hardware, with high contrast from the other transactions. This approach makes it easier to assess the impact of different components to overall performance. For example, the transaction ΓÇ£Read HeavyΓÇ¥ produces a significant number of read operations from disk.
-
-| Transaction Type | Description |
-| | |
-| Read Lite |SELECT; in-memory; read-only |
-| Read Medium |SELECT; mostly in-memory; read-only |
-| Read Heavy |SELECT; mostly not in-memory; read-only |
-| Update Lite |UPDATE; in-memory; read-write |
-| Update Heavy |UPDATE; mostly not in-memory; read-write |
-| Insert Lite |INSERT; in-memory; read-write |
-| Insert Heavy |INSERT; mostly not in-memory; read-write |
-| Delete |DELETE; mix of in-memory and not in-memory; read-write |
-| CPU Heavy |SELECT; in-memory; relatively heavy CPU load; read-only |
+Physical characteristics (CPU, memory, IO) associated with each DTU measure are calibrated using a benchmark that simulates real-world database workload.
-### Workload mix
+Learn about the schema, transaction types used, workload mix, users and pacing, scaling rules, and metrics associated with the [DTU benchmark](dtu-benchmark.md).
-Transactions are selected at random from a weighted distribution with the following overall mix. The overall mix has a read/write ratio of approximately 2:1.
+## Compare DTU-based and vCore purchasing models
-| Transaction Type | % of Mix |
-| | |
-| Read Lite |35 |
-| Read Medium |20 |
-| Read Heavy |5 |
-| Update Lite |20 |
-| Update Heavy |3 |
-| Insert Lite |3 |
-| Insert Heavy |2 |
-| Delete |2 |
-| CPU Heavy |10 |
+The DTU-based purchasing model is based on a bundled measure of compute, storage, and I/O resources. By comparison, the [vCore purchasing model for Azure SQL Database](service-tiers-sql-database-vcore.md) allows you to independently choose and scale compute and storage resources.
-### Users and pacing
+The vCore-based purchasing model also allows you to use [Azure Hybrid Benefit](https://azure.microsoft.com/pricing/hybrid-benefit/) for SQL Server to save costs, and offers [Serverless](serverless-tier-overview.md) and [Hyperscale](service-tier-hyperscale.md) options for Azure SQL Database that are not available in the DTU-based purchasing model.
-The benchmark workload is driven from a tool that submits transactions across a set of connections to simulate the behavior of a number of concurrent users. Although all of the connections and transactions are machine generated, for simplicity we refer to these connections as "users." Although each user operates independently of all other users, all users perform the same cycle of steps shown below:
-
-1. Establish a database connection.
-2. Repeat until signaled to exit:
- - Select a transaction at random (from a weighted distribution).
- - Perform the selected transaction and measure the response time.
- - Wait for a pacing delay.
-3. Close the database connection.
-4. Exit.
-
-The pacing delay (in step 2c) is selected at random, but with a distribution that has an average of 1.0 second. Thus each user can, on average, generate at most one transaction per second.
-
-### Scaling rules
-
-The number of users is determined by the database size (in scale-factor units). There is one user for every five scale-factor units. Because of the pacing delay, one user can generate at most one transaction per second, on average.
-
-For example, a scale-factor of 500 (SF=500) database will have 100 users and can achieve a maximum rate of 100 TPS. To drive a higher TPS rate requires more users and a larger database.
-
-### Measurement duration
-
-A valid benchmark run requires a steady-state measurement duration of at least one hour.
-
-### Metrics
-
-The key metrics in the benchmark are throughput and response time.
-
-- Throughput is the essential performance measure in the benchmark. Throughput is reported in transactions per unit-of-time, counting all transaction types.
-- Response time is a measure of performance predictability. The response time constraint varies with class of service, with higher classes of service having a more stringent response time requirement, as shown below.
-
-| Class of Service | Throughput Measure | Response Time Requirement |
-| | | |
-| Premium |Transactions per second |95th percentile at 0.5 seconds |
-| Standard |Transactions per minute |90th percentile at 1.0 seconds |
-| Basic |Transactions per hour |80th percentile at 2.0 seconds |
-
-> [!NOTE]
-> Response time metrics are specific to the [DTU Benchmark](#dtu-benchmark). Response times for other workloads are workload-dependent and will differ.
+Learn more in [Compare vCore and DTU-based purchasing models of Azure SQL Database](purchasing-models.md).
## Next steps
+Learn more about purchasing models and related concepts in the following articles:
+- For details on specific compute sizes and storage size choices available for single databases, see [SQL Database DTU-based resource limits for single databases](resource-limits-dtu-single-databases.md#single-database-storage-sizes-and-compute-sizes).
+- For details on specific compute sizes and storage size choices available for elastic pools, see [SQL Database DTU-based resource limits](resource-limits-dtu-elastic-pools.md#elastic-pool-storage-sizes-and-compute-sizes).
+- For information on the benchmark associated with the DTU-based purchasing model, see [DTU benchmark](dtu-benchmark.md).
+- [Compare vCore and DTU-based purchasing models of Azure SQL Database](purchasing-models.md).
azure-sql Vnet Service Endpoint Rule Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/vnet-service-endpoint-rule-overview.md
description: "Mark a subnet as a virtual network service endpoint. Then add the
-+ ms.devlang:
PolyBase and the COPY statement are commonly used to load data into Azure Synaps
- If you have a general-purpose v1 or Blob Storage account, you must *first upgrade to v2* by following the steps in [Upgrade to a general-purpose v2 storage account](../../storage/common/storage-account-upgrade.md). - For known issues with Azure Data Lake Storage Gen2, see [Known issues with Azure Data Lake Storage Gen2](../../storage/blobs/data-lake-storage-known-issues.md).
-1. Under your storage account, go to **Access Control (IAM)**, and select **Add role assignment**. Assign the **Storage Blob Data Contributor** Azure role to the server or workspace hosting your dedicated SQL pool, which you've registered with Azure AD.
+1. On your storage account page, select **Access control (IAM)**.
+
+1. Select **Add** > **Add role assignment** to open the **Add role assignment** page.
+
+1. Assign the following role. For detailed steps, see [Assign Azure roles using the Azure portal](../../role-based-access-control/role-assignments-portal.md).
+
+ | Setting | Value |
+ | | |
+ | Role | Storage Blob Data Contributor |
+ | Assign access to | User, group, or service principal |
+ | Members | Server or workspace hosting your dedicated SQL pool that you've registered with Azure AD |
+
+ ![Screenshot that shows Add role assignment page in Azure portal.](../../../includes/role-based-access-control/media/add-role-assignment-page.png)
> [!NOTE] > Only members with Owner privilege on the storage account can perform this step. For various Azure built-in roles, see [Azure built-in roles](../../role-based-access-control/built-in-roles.md).
azure-sql Doc Changes Updates Release Notes Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/doc-changes-updates-release-notes-whats-new.md
ms.devlang: Previously updated : 03/10/2022 Last updated : 03/28/2022 # What's new in Azure SQL Managed Instance? [!INCLUDE[appliesto-sqldb-sqlmi](../includes/appliesto-sqlmi.md)]
The following table lists the features of Azure SQL Managed Instance that are cu
| [Data virtualization](data-virtualization-overview.md) | Join locally stored relational data with data queried from external data sources, such as Azure Data Lake Storage Gen2 or Azure Blob Storage. | |[Endpoint policies](../../azure-sql/managed-instance/service-endpoint-policies-configure.md) | Configure which Azure Storage accounts can be accessed from a SQL Managed Instance subnet. Grants an extra layer of protection against inadvertent or malicious data exfiltration.| | [Instance pools](instance-pools-overview.md) | A convenient and cost-efficient way to migrate smaller SQL Server instances to the cloud. |
-| [Link feature](link-feature.md)| Online replication of SQL Server databases hosted anywhere to Azure SQL Managed Instance. |
+| [Managed Instance link](managed-instance-link-feature-overview.md)| Online replication of SQL Server databases hosted anywhere to Azure SQL Managed Instance. |
| [Maintenance window advance notifications](../database/advance-notifications.md)| Advance notifications (preview) for databases configured to use a non-default [maintenance window](../database/maintenance-window.md). Advance notifications are in preview for Azure SQL Managed Instance. | | [Memory optimized premium-series hardware generation](resource-limits.md#service-tier-characteristics) | Deploy your SQL Managed Instance to the new memory optimized premium-series hardware generation to take advantage of the latest Intel Ice Lake CPUs. The memory optimized hardware generation offers higher memory to vCore ratios. |
-| [Migration with Log Replay Service](log-replay-service-migrate.md) | Migrate databases from SQL Server to SQL Managed Instance by using Log Replay Service. |
+| [Migrate with Log Replay Service](log-replay-service-migrate.md) | Migrate databases from SQL Server to SQL Managed Instance by using Log Replay Service. |
| [Premium-series hardware generation](resource-limits.md#service-tier-characteristics) | Deploy your SQL Managed Instance to the new premium-series hardware generation to take advantage of the latest Intel Ice Lake CPUs. | | [Query Store hints](/sql/relational-databases/performance/query-store-hints?view=azuresqldb-mi-current&preserve-view=true) | Use query hints to optimize your query execution via the OPTION clause. | | [Service Broker cross-instance message exchange](/sql/database-engine/configure-windows/sql-server-service-broker) | Support for cross-instance message exchange using Service Broker on Azure SQL Managed Instance. |
The following table lists the features of Azure SQL Managed Instance that have t
|[Maintenance window](../database/maintenance-window.md)| March 2022 | The maintenance window feature allows you to configure maintenance schedule for your Azure SQL Managed Instance. [Maintenance window advance notifications](../database/advance-notifications.md), however, are in preview for Azure SQL Managed Instance.| |[16 TB support in General Purpose](resource-limits.md)| November 2021 | Support for allocation up to 16 TB of space on SQL Managed Instance in the General Purpose service tier. | [Azure Active Directory-only authentication](../database/authentication-azure-ad-only-authentication.md) | November 2021 | It's now possible to restrict authentication to your Azure SQL Managed Instance only to Azure Active Directory users. |
-| [Distributed transactions](../database/elastic-transactions-overview.md) | November 2021 | Distributed database transactions for Azure SQL Managed Instance allow you to run distributed transactions that span several databases across instances. |
+|[Distributed transactions](../database/elastic-transactions-overview.md) | November 2021 | Distributed database transactions for Azure SQL Managed Instance allow you to run distributed transactions that span several databases across instances. |
|[Linked server - managed identity Azure AD authentication](/sql/relational-databases/system-stored-procedures/sp-addlinkedserver-transact-sql#h-create-sql-managed-instance-linked-server-with-managed-identity-azure-ad-authentication) |November 2021 | Create a linked server with managed identity authentication for your Azure SQL Managed Instance.| |[Linked server - pass-through Azure AD authentication](/sql/relational-databases/system-stored-procedures/sp-addlinkedserver-transact-sql#i-create-sql-managed-instance-linked-server-with-pass-through-azure-ad-authentication) |November 2021 | Create a linked server with pass-through Azure AD authentication for your Azure SQL Managed Instance. | |[Long-term backup retention](long-term-backup-retention-configure.md) |November 2021 | Store full backups for a specific database with configured redundancy for up to 10 years in Azure Blob storage, restoring the database as a new database. |
Learn about significant changes to the Azure SQL Managed Instance documentation.
| Changes | Details | | | | | **Data virtualization preview** | It's now possible to query data in external sources such as Azure Data Lake Storage Gen2 or Azure Blob Storage, joining it with locally stored relational data. This feature is currently in preview. To learn more, see [Data virtualization](data-virtualization-overview.md). |
-| **Link feature guidance** | We've published a number of guides for using the [link feature](link-feature.md) with SQL Managed Instance, including how to [prepare your environment](managed-instance-link-preparation.md), [configure replication](managed-instance-link-use-ssms-to-replicate-database.md), [failover your database](managed-instance-link-use-ssms-to-failover-database.md), and some [best practices](link-feature-best-practices.md) when using the link feature. |
+| **Log Replay Service migration** | Use the Log Replay Service to migrate from SQL Server to Azure SQL Managed Instance. This feature is currently in preview. To learn more, see [Migrate with Log Replay Service](log-replay-service-migrate.md). |
+| **Managed Instance link guidance** | We've published a number of guides for using the [Managed Instance link feature](managed-instance-link-feature-overview.md), including how to [prepare your environment](managed-instance-link-preparation.md), [configure replication by using SSMS](managed-instance-link-use-ssms-to-replicate-database.md), [configure replication via scripts](managed-instance-link-use-scripts-to-replicate-database.md), [fail over your database by using SSMS](managed-instance-link-use-ssms-to-failover-database.md), [fail over your database via scripts](managed-instance-link-use-scripts-to-failover-database.md) and some [best practices](managed-instance-link-best-practices.md) when using the link feature (currently in preview). |
| **Maintenance window GA, advance notifications preview** | The [maintenance window](../database/maintenance-window.md) feature is now generally available, allowing you to configure a maintenance schedule for your Azure SQL Managed Instance. It's also possible to receive advance notifications for planned maintenance events, which is currently in preview. Review [Maintenance window advance notifications (preview)](../database/advance-notifications.md) to learn more. | | **Windows Auth for Azure Active Directory principals preview** | Windows Authentication for managed instances empowers customers to move existing services to the cloud while maintaining a seamless user experience, and provides the basis for infrastructure modernization. Learn more in [Windows Authentication for Azure Active Directory principals on Azure SQL Managed Instance](winauth-azuread-overview.md). |
Learn about significant changes to the Azure SQL Managed Instance documentation.
| **Azure AD-only authentication GA** | Restricting authentication to your Azure SQL Managed Instance only to Azure Active Directory users is now generally available. To learn more, see [Azure AD-only authentication](../database/authentication-azure-ad-only-authentication.md). | | **Distributed transactions GA** | The ability to execute distributed transactions across managed instances is now generally available. See [Distributed transactions](../database/elastic-transactions-overview.md) to learn more. | |**Endpoint policies preview** | It's now possible to configure an endpoint policy to restrict access from a SQL Managed Instance subnet to an Azure Storage account. This grants an extra layer of protection against inadvertent or malicious data exfiltration. See [Endpoint policies](../../azure-sql/managed-instance/service-endpoint-policies-configure.md) to learn more. |
-|**Link feature preview** | Use the link feature for SQL Managed Instance to replicate data from your SQL Server hosted anywhere to Azure SQL Managed Instance, leveraging the benefits of Azure without moving your data to Azure, to offload your workloads, for disaster recovery, or to migrate to the cloud. See the [Link feature for SQL Managed Instance](link-feature.md) to learn more. The link feature is currently in limited public preview. |
+|**Link feature preview** | Use the link feature for SQL Managed Instance to replicate data from your SQL Server hosted anywhere to Azure SQL Managed Instance, leveraging the benefits of Azure without moving your data to Azure, to offload your workloads, for disaster recovery, or to migrate to the cloud. See the [Link feature for SQL Managed Instance](managed-instance-link-feature-overview.md) to learn more. The link feature is currently in limited public preview. |
|**Long-term backup retention GA** | Storing full backups for a specific database with configured redundancy for up to 10 years in Azure Blob storage is now generally available. To learn more, see [Long-term backup retention](long-term-backup-retention-configure.md). | | **Move instance to different subnet GA** | It's now possible to move your SQL Managed Instance to a different subnet. See [Move instance to different subnet](vnet-subnet-move-instance.md) to learn more. | |**New hardware generation preview** | There are now two new hardware generations for SQL Managed Instance: premium-series, and a memory optimized premium-series. Both offerings take advantage of a new generation of hardware powered by the latest Intel Ice Lake CPUs, and offer a higher memory to vCore ratio to support your most resource demanding database applications. As part of this announcement, the Gen5 hardware generation has been renamed to standard-series. The two new premium hardware generations are currently in preview. See [resource limits](resource-limits.md#service-tier-characteristics) to learn more. |
azure-sql How To Content Reference Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/how-to-content-reference-guide.md
In this article you can find a content reference to various guides, scripts, and
- [Replicate database with Azure SQL Managed Instance link feature with T-SQL and PowerShell scripts](managed-instance-link-use-scripts-to-replicate-database.md) - [Failover database with link feature in SSMS - Azure SQL Managed Instance](managed-instance-link-use-ssms-to-failover-database.md) - [Failover (migrate) database with Azure SQL Managed Instance link feature with T-SQL and PowerShell scripts](managed-instance-link-use-scripts-to-failover-database.md)-- [Best practices with link feature for Azure SQL Managed Instance](link-feature-best-practices.md)
+- [Best practices with link feature for Azure SQL Managed Instance](managed-instance-link-best-practices.md)
## Monitoring and tuning
azure-sql Log Replay Service Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/log-replay-service-migrate.md
description: Learn how to migrate databases from SQL Server to SQL Managed Insta
-+
The SAS authentication is generated with the time validity that you specified. Y
:::image type="content" source="./media/log-replay-service-migrate/lrs-generated-uri-token.png" alt-text="Screenshot that shows an example of the U R I version of an S A S token."::: > [!NOTE]
- > Using SAS tokens created with permissions set through defining a [stored access policy](https://docs.microsoft.com/rest/api/storageservices/define-stored-access-policy.md) is not supported at this time. You will need to follow the instructions in this guide on manually specifying Read and List permissions for the SAS token.
+ > Using SAS tokens created with permissions set through defining a [stored access policy](/rest/api/storageservices/define-stored-access-policy.md) is not supported at this time. You will need to follow the instructions in this guide on manually specifying Read and List permissions for the SAS token.
### Copy parameters from the SAS token
Functional limitations of LRS are:
- System-managed software patches are blocked for 36 hours once the LRS has been started. After this time window expires, the next software maintenance update will stop LRS. You will need to restart LRS from scratch. - LRS requires databases on SQL Server to be backed up with the `CHECKSUM` option enabled. - The SAS token that LRS will use must be generated for the entire Azure Blob Storage container, and it must have Read and List permissions only. For example, if you grant Read, List and Write permissions, LRS will not be able to start because of the extra Write permission.-- Using SAS tokens created with permissions set through defining a [stored access policy](https://docs.microsoft.com/rest/api/storageservices/define-stored-access-policy.md) is not supported at this time. You will need to follow the instructions in this guide on manually specifying Read and List permissions for the SAS token.
+- Using SAS tokens created with permissions set through defining a [stored access policy](/rest/api/storageservices/define-stored-access-policy.md) is not supported at this time. You will need to follow the instructions in this guide on manually specifying Read and List permissions for the SAS token.
- Backup files containing % and $ characters in the file name cannot be consumed by LRS. Consider renaming such file names. - Backup files for different databases must be placed in separate folders on Blob Storage in a flat-file structure. Nested folders inside individual database folders are not supported. - LRS must be started separately for each database pointing to the full URI path containing an individual database folder. - LRS can support up to 100 simultaneous restore processes per single managed instance. > [!NOTE]
-> If you require database to be R/O accessible during the migration, and if you require migration window larger than 36 hours, please consider an alternative online migrations solution [link feature for Managed Instance](link-feature.md) providing such capability.
+> If you require the database to be read-only accessible during the migration, and if you require a migration window longer than 36 hours, consider the [link feature for Managed Instance](managed-instance-link-feature-overview.md) as an alternative online migration solution that provides this capability.
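As the limitations above note, LRS requires databases on SQL Server to be backed up with the `CHECKSUM` option enabled. For illustration, a minimal sketch of a full backup that satisfies this requirement; the database name and path are placeholders:

```sql
-- Execute on SQL Server
-- LRS requires backups taken WITH CHECKSUM enabled
BACKUP DATABASE [<DatabaseName>]
TO DISK = N'<DiskPathandFileName>'
WITH CHECKSUM, COMPRESSION, STATS = 5;
```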
## Troubleshooting
After you start LRS, use the monitoring cmdlet (`get-azsqlinstancedatabaselogrep
- If you started LRS in autocomplete mode, was a valid filename for the last backup file specified? ## Next steps-- Learn more about [migrating to Managed Instance using the link feature](link-feature.md).
+- Learn more about [migrating to Managed Instance using the link feature](managed-instance-link-feature-overview.md).
- Learn more about [migrating from SQL Server to SQL Managed instance](../migration-guides/managed-instance/sql-server-to-managed-instance-guide.md). - Learn more about [differences between SQL Server and SQL Managed Instance](transact-sql-tsql-differences-sql-server.md). - Learn more about [best practices to cost and size workloads migrated to Azure](/azure/cloud-adoption-framework/migrate/azure-best-practices/migrate-best-practices-costs).
azure-sql Managed Instance Link Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/managed-instance-link-best-practices.md
+
+ Title: The link feature best practices
+
+description: Learn about best practices when using the link feature for Azure SQL Managed Instance.
++++
+ms.devlang:
++++ Last updated : 03/28/2022+
+# Best practices with link feature for Azure SQL Managed Instance (preview)
+
+This article outlines best practices when using the link feature for Azure SQL Managed Instance. The link feature for Azure SQL Managed Instance connects your SQL Servers hosted anywhere to SQL Managed Instance, providing near real-time data replication to the cloud.
+
+> [!NOTE]
+> The link feature for Azure SQL Managed Instance is currently in preview.
+
+## Take log backups regularly
+
+The link feature replicates data using the [Distributed availability groups](/sql/database-engine/availability-groups/windows/distributed-availability-groups) concept based on the Always On availability groups technology stack. Data replication with distributed availability groups is based on replicating transaction log records. No transaction log records can be truncated from the database on the primary instance until they're replicated to the database on the secondary instance. If transaction log record replication is slow or blocked due to network connection issues, the log file keeps growing on the primary instance. The growth rate depends on the intensity of the workload and the network speed. If there's a prolonged network connection outage and a heavy workload on the primary instance, the log file may consume all available storage space.
+
+To minimize the risk of running out of space on your primary instance due to log file growth, make sure to **take database log backups regularly**. By taking log backups regularly, you make your database more resilient to unplanned log growth events. Consider scheduling daily log backup tasks using a SQL Server Agent job.
+
+You can use a Transact-SQL (T-SQL) script to back up the log file, such as the sample provided in this section. Replace the placeholders in the sample script with the name of your database, the name and path of the backup file, and a description.
+
+To back up your transaction log, use the following sample Transact-SQL (T-SQL) script on SQL Server:
+
+```sql
+-- Execute on SQL Server
+-- Set the current database inside the job step or script
+USE [<DatabaseName>];
+
+-- Check that you're executing the script on the primary replica (role = 1)
+IF (SELECT a.[role]
+    FROM sys.dm_hadr_availability_replica_states AS a
+    JOIN sys.availability_replicas AS b
+        ON b.replica_id = a.replica_id
+    WHERE b.replica_server_name = @@SERVERNAME) = 1
+BEGIN
+    -- Take a log backup
+    BACKUP LOG [<DatabaseName>]
+    TO DISK = N'<DiskPathandFileName>'
+    WITH NOFORMAT, NOINIT,
+    NAME = N'<Description>', SKIP, NOREWIND, NOUNLOAD, COMPRESSION, STATS = 1;
+END
+```
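As suggested earlier, you can schedule this log backup as a daily task with a SQL Server Agent job. The following is a minimal sketch using the standard `msdb` job procedures; the job, step, and schedule names are hypothetical, and the step command should be replaced with the full script above:

```sql
-- Execute on SQL Server; illustrative job definition with hypothetical names
USE msdb;

EXEC dbo.sp_add_job
    @job_name = N'DailyLogBackup';

EXEC dbo.sp_add_jobstep
    @job_name = N'DailyLogBackup',
    @step_name = N'Back up transaction log',
    @subsystem = N'TSQL',
    @database_name = N'<DatabaseName>',
    @command = N'BACKUP LOG [<DatabaseName>]
                 TO DISK = N''<DiskPathandFileName>''
                 WITH COMPRESSION, STATS = 1;';

-- Run once per day at 01:00
EXEC dbo.sp_add_schedule
    @schedule_name = N'DailyAt0100',
    @freq_type = 4,          -- daily
    @freq_interval = 1,      -- every 1 day
    @active_start_time = 010000;

EXEC dbo.sp_attach_schedule
    @job_name = N'DailyLogBackup',
    @schedule_name = N'DailyAt0100';

-- Target the local server
EXEC dbo.sp_add_jobserver
    @job_name = N'DailyLogBackup';
```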
+
+Use the following Transact-SQL (T-SQL) command to check the log space used by your database on SQL Server:
+
+```sql
+-- Execute on SQL Server
+DBCC SQLPERF(LOGSPACE);
+```
+
+For the sample database **tpcc**, the query output shows that the database has used 76% of the available log, with an absolute log file size of approximately 27 GB (27,971 MB). The thresholds for action may vary based on your workload, but a value like this is typically an indication that you should take a log backup to truncate the log file and free up some space.
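If you prefer a query that returns the same information as a result set (for example, for alerting), a minimal sketch using the standard `sys.dm_db_log_space_usage` DMV follows; the query shape here is illustrative:

```sql
-- Execute on SQL Server, in the context of the replicated database
-- Returns the current log size (MB) and the percentage of log space used
SELECT
    total_log_size_in_bytes / 1048576.0 AS log_size_mb,
    used_log_space_in_percent
FROM sys.dm_db_log_space_usage;
```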
+
+## Add startup trace flags
+
+There are two trace flags (`-T1800` and `-T9567`) that, when added as start up parameters, can optimize the performance of data replication through the link. See [Enable startup trace flags](managed-instance-link-preparation.md#enable-startup-trace-flags) to learn more.
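To verify that the trace flags are active after the instance restarts, you can check their status. `DBCC TRACESTATUS` is standard SQL Server; this check is a small illustrative addition:

```sql
-- Execute on SQL Server
-- Reports whether trace flags 1800 and 9567 are enabled
DBCC TRACESTATUS (1800, 9567);
```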
+
+## Next steps
+
+To get started with the link feature, [prepare your environment for replication](managed-instance-link-preparation.md).
+
+For more information on the link feature, see the following articles:
+
+- [Managed Instance link - overview](managed-instance-link-feature-overview.md)
+- [Managed Instance link - connecting SQL Server to Azure reimagined](https://aka.ms/mi-link-techblog)
azure-sql Managed Instance Link Feature Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/managed-instance-link-feature-overview.md
+
+ Title: The link feature
+
+description: Learn about the link feature for Azure SQL Managed Instance to continuously replicate data from SQL Server to the cloud, or migrate your SQL Server databases with the best possible minimum downtime.
++++
+ms.devlang:
++++ Last updated : 03/28/2022+
+# Link feature for Azure SQL Managed Instance (preview)
+
+The new link feature in Azure SQL Managed Instance connects your SQL Servers hosted anywhere to SQL Managed Instance, providing hybrid flexibility and database mobility. With an approach that uses near real-time data replication to the cloud, you can offload workloads to a read-only secondary in Azure to take advantage of Azure-only features, performance, and scale.
+
+After a disastrous event, you can continue running your read-only workloads on SQL Managed Instance in Azure. You can also choose to migrate one or more applications from SQL Server to SQL Managed Instance at the same time, at your own pace, and with the best possible minimum downtime compared to other solutions in Azure today.
+
+To use the link feature, you'll need:
+
+- SQL Server 2019 Enterprise Edition or Developer Edition with [CU15 (or above)](https://support.microsoft.com/en-us/topic/kb5008996-cumulative-update-15-for-sql-server-2019-4b6a8ee9-1c61-482d-914f-36e429901fb6) installed on-premises, or on an Azure VM.
+- Network connectivity between your SQL Server and managed instance is required. If your SQL Server is running on-premises, use a VPN link or ExpressRoute. If your SQL Server is running on an Azure VM, either deploy your VM to the same subnet as your managed instance, or use global VNet peering to connect two separate subnets.
+- Azure SQL Managed Instance provisioned on any service tier.
+
+> [!NOTE]
+> SQL Managed Instance link feature is available in all public Azure regions.
+> National clouds are currently not supported.
+
+## Overview
+
+The underlying technology of near real-time data replication between SQL Server and SQL Managed Instance is based on distributed availability groups, part of the well-known and proven Always On availability group technology stack. Extend your SQL Server on-premises availability group to SQL Managed Instance in Azure in a safe and secure manner.
+
+There's no need to have an existing availability group or multiple nodes. The link supports single node SQL Server instances without existing availability groups, and also multiple-node SQL Server instances with existing availability groups. Through the link, you can use the modern benefits of Azure without migrating your entire SQL Server data estate to the cloud.
+
+You can keep running the link for as long as you need it, for months and even years at a time. And for your modernization journey, if or when you're ready to migrate to Azure, the link enables a considerably improved migration experience with the minimum possible downtime compared to all other options available today, providing a true online migration to SQL Managed Instance.
+
+## Supported scenarios
+
+Data replicated through the link feature from SQL Server to Azure SQL Managed Instance can be used with several scenarios, such as:
+
+- **Use Azure services without migrating to the cloud**
+- **Offload read-only workloads to Azure**
+- **Migrate to Azure**
+
+![Managed Instance link main scenario](./media/managed-instance-link-feature-overview/mi-link-main-scenario.png)
+
+### Use Azure services
+
+Use the link feature to take advantage of Azure services with your SQL Server data without migrating to the cloud. Examples include reporting, analytics, backups, machine learning, and other jobs that send data to Azure.
+
+### Offload workloads to Azure
+
+You can also use the link feature to offload workloads to Azure. For example, an application could use SQL Server for read-write workloads, while offloading read-only workloads to SQL Managed Instance in any of Azure's 60+ regions worldwide. Once the link is established, the primary database on SQL Server is read/write accessible, while the data replicated to SQL Managed Instance in Azure is read-only accessible. This allows for various scenarios where replicated databases on SQL Managed Instance can be used for read scale-out and for offloading read-only workloads to Azure. SQL Managed Instance, in parallel, can also host independent read/write databases. This allows for copying the replicated database to another read/write database on the same managed instance for further data processing.
+
+The link is database scoped (one link per one database), allowing for consolidation and deconsolidation of workloads in Azure. For example, you can replicate databases from multiple SQL Servers to a single SQL Managed Instance in Azure (consolidation), or replicate databases from a single SQL Server to multiple managed instances via a 1 to 1 relationship between a database and a managed instance - to any of Azure's regions worldwide (deconsolidation). The latter provides you with an efficient way to quickly bring your workloads closer to your customers in any region worldwide, which you can use as read-only replicas.
+
+### Migrate to Azure
+
+The link feature also facilitates migrating from SQL Server to SQL Managed Instance, enabling:
+
+- The most performant minimum downtime migration compared to all other solutions available today
+- True online migration to SQL Managed Instance in any service tier
+
+Since the link feature enables minimum downtime migration, you can migrate to your managed instance while keeping your primary workload online. While online migration to the general purpose service tier was previously possible with other solutions, the link feature now also allows for true online migrations to the business critical service tier.
+
+## How it works
+
+The underlying technology behind the link feature for SQL Managed Instance is distributed availability groups. The solution supports single-node systems without existing availability groups, or multiple node systems with existing availability groups.
+
+![How does the link feature for SQL Managed Instance work](./media/managed-instance-link-feature-overview/mi-link-ag-dag.png)
+
+Secure connectivity, such as VPN or ExpressRoute, is used between an on-premises network and Azure. If SQL Server is hosted on an Azure VM, the internal Azure backbone can be used between the VM and the managed instance, such as global VNet peering. The trust between the two systems is established using certificate-based authentication, in which SQL Server and SQL Managed Instance exchange their public keys.
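The link setup (covered in the preparation and replication guides) handles this certificate exchange for you. For orientation only, here is a sketch of what certificate-authenticated endpoint configuration looks like on the SQL Server side, with hypothetical names; the actual link scripts may differ:

```sql
-- Illustrative only: the link setup wizard/scripts perform the equivalent
-- configuration and exchange public keys with the managed instance.
-- Assumes a database master key already exists in master.
USE master;

-- Certificate that identifies this SQL Server instance
CREATE CERTIFICATE LinkEndpointCert
    WITH SUBJECT = 'Certificate for the link endpoint';

-- Database mirroring endpoint that authenticates with the certificate
CREATE ENDPOINT LinkEndpoint
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5022, LISTENER_IP = ALL)
    FOR DATABASE_MIRRORING (
        ROLE = ALL,
        AUTHENTICATION = CERTIFICATE LinkEndpointCert,
        ENCRYPTION = REQUIRED ALGORITHM AES
    );
```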
+
+Up to 100 links can exist from the same or various SQL Server sources to a single SQL Managed Instance. This limit is governed by the number of databases that can be hosted on a managed instance at this time. Likewise, a single SQL Server can establish multiple parallel database replication links with several managed instances in different Azure regions, in a 1 to 1 relationship between a database and a managed instance. The feature requires CU15 (or higher) to be installed on SQL Server 2019.
+
+## Use the link feature
+
+To help with the initial environment setup, we have prepared the following online guide on how to set up your SQL Server environment to use the link feature for Managed Instance:
+
+* [Prepare environment for the link](managed-instance-link-preparation.md)
+
+Once you have ensured the prerequisites have been met, you can create the link by using the automated wizard in SSMS, or you can choose to set up the link manually using scripts. Create the link by using one of the following instructions:
+
+* [Replicate database with link feature in SSMS](managed-instance-link-use-ssms-to-replicate-database.md), or alternatively
+* [Replicate database with Azure SQL Managed Instance link feature with T-SQL and PowerShell scripts](managed-instance-link-use-scripts-to-replicate-database.md)
+
+Once the link has been created, ensure that you follow the best practices for maintaining the link, as described in the following article:
+
+* [Best practices with link feature for Azure SQL Managed Instance](managed-instance-link-best-practices.md)
+
+When you're ready to migrate a database to Azure with minimum downtime, you can do so by using the automated wizard in SSMS, or manually with scripts. Migrate your database to Azure by using one of the following instructions:
+
+* [Failover database with link feature in SSMS](managed-instance-link-use-ssms-to-failover-database.md), or alternatively
+* [Failover (migrate) database with Azure SQL Managed Instance link feature with T-SQL and PowerShell scripts](managed-instance-link-use-scripts-to-failover-database.md)
+
+## Limitations
+
+This section describes the product's functional limitations.
+
+### General functional limitations
+
+Managed Instance link has a set of general limitations, which are listed in this section. These limitations are technical in nature and are unlikely to be addressed in the foreseeable future.
+
+- Only user databases can be replicated. Replication of system databases isn't supported.
+- The solution doesn't replicate server-level objects, agent jobs, or user logins from SQL Server to Managed Instance.
+- Only one database can be placed in a single availability group per distributed availability group link.
+- A link can't be established between SQL Server and Managed Instance if functionality used on SQL Server isn't supported on Managed Instance.
+  - File tables and file streams aren't supported for replication, because Managed Instance doesn't support them.
+  - Replicating databases that use Hekaton (In-Memory OLTP) isn't supported on the Managed Instance General Purpose service tier. Hekaton is supported only on the Managed Instance Business Critical service tier. A quick way to check whether a database uses In-Memory OLTP is sketched after this list.
+  - For the full list of differences between SQL Server and Managed Instance, see [this article](./transact-sql-tsql-differences-sql-server.md).
+- If change data capture (CDC), log shipping, or Service Broker is used with a database replicated from SQL Server, when the database is migrated to Managed Instance, on failover to Azure, clients will need to connect using the instance name of the current global primary replica. You'll need to manually reconfigure these settings.
+- If transactional replication is used with a database replicated from SQL Server in a migration scenario, transactional replication on Azure SQL Managed Instance won't continue after failover to Azure. You'll need to manually reconfigure transactional replication.
+- If distributed transactions are used with a database replicated from SQL Server in a migration scenario, the DTC capabilities won't be transferred on cutover to the cloud. The migrated database won't be able to participate in distributed transactions with SQL Server, as Managed Instance doesn't support distributed transactions with SQL Server at this time. For reference, Managed Instance today supports distributed transactions only between other managed instances; see [this article](../database/elastic-transactions-overview.md#transactions-for-sql-managed-instance).
+- Managed Instance link can replicate a database of any size, provided it fits within the chosen storage size of the target managed instance.
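As an illustrative aid for the In-Memory OLTP limitation above, the following sketch looks for a memory-optimized data filegroup in the database you plan to replicate. The metadata queried is standard SQL Server; the check itself is not part of the official link setup:

```sql
-- Execute on SQL Server, in the context of the database you plan to replicate
-- A filegroup of type 'FX' is a memory-optimized data filegroup (In-Memory OLTP)
SELECT name AS memory_optimized_filegroup
FROM sys.filegroups
WHERE type = 'FX';
-- No rows returned: the database doesn't use In-Memory OLTP, so the
-- General Purpose service tier limitation above doesn't apply.
```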
+
+### Preview limitations
+
+Some Managed Instance link features and capabilities are limited **at this time**. Details can be found in the following list.
+- SQL Server 2019, Enterprise Edition or Developer Edition, CU15 (or higher) on Windows or Linux host OS is supported.
+- Private endpoint (VPN/VNET) is supported to connect Distributed Availability Groups to Managed Instance. Public endpoint can't be used to connect to Managed Instance.
+- Managed Instance Link authentication between SQL Server instance and Managed Instance is certificate-based, available only through exchange of certificates. Windows authentication between instances isn't supported.
+- Replication of user databases from SQL Server to Managed Instance is one-way. User databases from Managed Instance can't be replicated back to SQL Server.
+- [Auto failover groups](auto-failover-group-sql-mi.md) replication to secondary Managed Instance can't be used in parallel while operating the Managed Instance link with SQL Server.
+- Replicated read-only databases aren't part of the auto-backup process on SQL Managed Instance.
+
+## Next steps
+
+If you're interested in using the link feature for Azure SQL Managed Instance with versions and editions that are currently not supported, sign up [here](https://aka.ms/mi-link-signup).
+
+For more information on the link feature, see the following:
+
+- [Managed Instance link - connecting SQL Server to Azure reimagined](https://aka.ms/mi-link-techblog).
+- [Prepare for SQL Managed Instance link](./managed-instance-link-preparation.md).
+- [Use SQL Managed Instance link via SSMS to replicate database](./managed-instance-link-use-ssms-to-replicate-database.md).
+- [Use SQL Managed Instance link via SSMS to migrate database](./managed-instance-link-use-ssms-to-failover-database.md).
+
+For other replication scenarios, consider:
+
+- [Transactional replication with Azure SQL Managed Instance (Preview)](replication-transactional-overview.md)
azure-sql Managed Instance Link Preparation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/managed-instance-link-preparation.md
Last updated 03/22/2022
# Prepare your environment for a link - Azure SQL Managed Instance [!INCLUDE[appliesto-sqlmi](../includes/appliesto-sqlmi.md)]
-This article teaches you how to prepare your environment for a [Managed Instance link](link-feature.md) so that you can replicate databases from SQL Server to Azure SQL Managed Instance.
+This article teaches you how to prepare your environment for a [Managed Instance link](managed-instance-link-feature-overview.md) so that you can replicate databases from SQL Server to Azure SQL Managed Instance.
> [!NOTE] > The link is a feature of Azure SQL Managed Instance and is currently in preview.
azure-sql Managed Instance Link Use Scripts To Failover Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/managed-instance-link-use-scripts-to-failover-database.md
Last updated 03/15/2022
[!INCLUDE[appliesto-sqlmi](../includes/appliesto-sqlmi.md)]
-This article teaches you how to use Transact-SQL (T-SQL) and PowerShell scripts and a [Managed Instance link](link-feature.md) to fail over (migrate) your database from SQL Server to SQL Managed Instance.
+This article teaches you how to use Transact-SQL (T-SQL) and PowerShell scripts and a [Managed Instance link](managed-instance-link-feature-overview.md) to fail over (migrate) your database from SQL Server to SQL Managed Instance.
> [!NOTE] > - The link is a feature of Azure SQL Managed Instance and is currently in preview. You can also use a [SQL Server Management Studio (SSMS) wizard](managed-instance-link-use-ssms-to-failover-database.md) to fail over a database with the link.
azure-sql Managed Instance Link Use Scripts To Replicate Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/managed-instance-link-use-scripts-to-replicate-database.md
Last updated 03/22/2022
[!INCLUDE[appliesto-sqlmi](../includes/appliesto-sqlmi.md)]
-This article teaches you how to use Transact-SQL (T-SQL) and PowerShell scripts to replicate your database from SQL Server to Azure SQL Managed Instance by using a [Managed Instance link](link-feature.md).
+This article teaches you how to use Transact-SQL (T-SQL) and PowerShell scripts to replicate your database from SQL Server to Azure SQL Managed Instance by using a [Managed Instance link](managed-instance-link-feature-overview.md).
> [!NOTE] > - The link is a feature of Azure SQL Managed Instance and is currently in preview. You can also use a [SQL Server Management Studio (SSMS) wizard](managed-instance-link-use-ssms-to-replicate-database.md) to set up the link to replicate your database.
After the connection is established, the **Managed Instance Databases** view in
> [!IMPORTANT] > - The link won't work unless network connectivity exists between SQL Server and SQL Managed Instance. To troubleshoot network connectivity, follow the steps in [Test bidirectional network connectivity](managed-instance-link-preparation.md#test-bidirectional-network-connectivity).
-> - Take regular backups of the log file on SQL Server. If the used log space reaches 100 percent, replication to SQL Managed Instance stops until space use is reduced. We highly recommend that you automate log backups by setting up a daily job. For details, see [Back up log files on SQL Server](link-feature-best-practices.md#take-log-backups-regularly).
+> - Take regular backups of the log file on SQL Server. If the used log space reaches 100 percent, replication to SQL Managed Instance stops until space use is reduced. We highly recommend that you automate log backups by setting up a daily job. For details, see [Back up log files on SQL Server](managed-instance-link-best-practices.md#take-log-backups-regularly).
## Next steps
azure-sql Managed Instance Link Use Ssms To Failover Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/managed-instance-link-use-ssms-to-failover-database.md
Last updated 03/10/2022
[!INCLUDE[appliesto-sqlmi](../includes/appliesto-sqlmi.md)]
-This article teaches you how to fail over a database from SQL Server to Azure SQL Managed Instance by using [the link feature](link-feature.md) in SQL Server Management Studio (SSMS).
+This article teaches you how to fail over a database from SQL Server to Azure SQL Managed Instance by using [the link feature](managed-instance-link-feature-overview.md) in SQL Server Management Studio (SSMS).
Failing over your database from SQL Server to SQL Managed Instance breaks the link between the two databases. It stops replication and leaves both databases in an independent state, ready for individual read/write workloads.
In the following steps, you use the **Failover database to Managed Instance** wi
During the failover process, the link is dropped and no longer exists. The source SQL Server database and the target SQL Managed Instance database can both execute a read/write workload. They're completely independent.
-You can validate that the link bas been dropped by reviewing the database on SQL Server.
+You can validate that the link has been dropped by reviewing the database on SQL Server.
:::image type="content" source="./media/managed-instance-link-use-ssms-to-failover-database/link-failover-ssms-sql-server-database.png" alt-text="Screenshot that shows a database on SQL Server in S S M S.":::
Then, review the database on SQL Managed Instance.
## Next steps
-To learn more, see [Link feature for Azure SQL Managed Instance](link-feature.md).
+To learn more, see [Link feature for Azure SQL Managed Instance](managed-instance-link-feature-overview.md).
azure-sql Managed Instance Link Use Ssms To Replicate Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/managed-instance-link-use-ssms-to-replicate-database.md
Last updated 03/22/2022
[!INCLUDE[appliesto-sqlmi](../includes/appliesto-sqlmi.md)]
-This article teaches you how to replicate your database from SQL Server to Azure SQL Managed Instance by using [the link feature](link-feature.md) in SQL Server Management Studio (SSMS).
+This article teaches you how to replicate your database from SQL Server to Azure SQL Managed Instance by using [the link feature](managed-instance-link-feature-overview.md) in SQL Server Management Studio (SSMS).
> [!NOTE] > The link is a feature of Azure SQL Managed Instance and is currently in preview.
Connect to your managed instance and use Object Explorer to view your replicated
## Next steps
-To break the link and fail over your database to SQL Managed Instance, see [Fail over a database](managed-instance-link-use-ssms-to-failover-database.md). To learn more, see [Link feature for Azure SQL Managed Instance](link-feature.md).
+To break the link and fail over your database to SQL Managed Instance, see [Fail over a database](managed-instance-link-use-ssms-to-failover-database.md). To learn more, see [Link feature for Azure SQL Managed Instance](managed-instance-link-feature-overview.md).
azure-sql Sql Server To Managed Instance Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/migration-guides/managed-instance/sql-server-to-managed-instance-guide.md
After you've verified that your source environment is supported, start with the
In the Discover phase, scan the network to identify all SQL Server instances and features used by your organization.
-Use [Azure Migrate](../../../migrate/migrate-services-overview.md) to assesses migration suitability of on-premises servers, perform performance-based sizing, and provide cost estimations for running them in Azure.
+Use [Azure Migrate](../../../migrate/migrate-services-overview.md) to assess migration suitability of on-premises servers, perform performance-based sizing, and provide cost estimations for running them in Azure.
Alternatively, use the [Microsoft Assessment and Planning Toolkit (the "MAP Toolkit")](https://www.microsoft.com/download/details.aspx?id=7826) to assess your current IT infrastructure. The toolkit provides a powerful inventory, assessment, and reporting tool to simplify the migration planning process.
activities to the platform as they are built in. Therefore, some instance-level
need to be migrated, such as maintenance jobs for regular backups or Always On configuration, as [high availability](../../database/high-availability-sla.md) is built in.
-SQL Managed Instance supports the following database migration options (currently these are the
-only supported migration methods):
+This article covers two of the recommended migration options:
- Azure Database Migration Service - migration with near-zero downtime. - Native `RESTORE DATABASE FROM URL` - uses native backups from SQL Server and requires some downtime.
-This guide describe the two most popular options - Azure Database Migration Service (DMS) and native backup and restore.
+This guide describes the two most popular options - Azure Database Migration Service (DMS) and native backup and restore.
+
+For other migration tools, see [Compare migration options](sql-server-to-managed-instance-overview.md#compare-migration-options).
### Database Migration Service
To learn more about this migration option, see [Restore a database to Azure SQL
> [!NOTE] > A database restore operation is asynchronous and retryable. You might get an error in SQL Server Management Studio if the connection breaks or a time-out expires. Azure SQL Database will keep trying to restore database in the background, and you can track the progress of the restore using the [sys.dm_exec_requests](/sql/relational-databases/system-dynamic-management-views/sys-dm-exec-requests-transact-sql) and [sys.dm_operation_status](/sql/relational-databases/system-dynamic-management-views/sys-dm-operation-status-azure-sql-database) views.
-## Migation tools
-
-While using [Azure Database Migration Service](../../../dms/tutorial-sql-server-to-managed-instance.md), or [native backup and restore](../../managed-instance/restore-sample-database-quickstart.md) to migrate a database to Managed Instance, consider as well the following migration tools:
-
-|Migration option |When to use |Considerations |
-||||
-|[Azure SQL Migration extension for Azure Data Studio](../../../dms/migration-using-azure-data-studio.md) | - Migrate single databases or multiple databases at scale. </br> - Can run in both online (minimal downtime) and offline (acceptable downtime) modes. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Easy to setup and get started. </br> - Requires setup of self-hosted integration runtime to access on-premises SQL Server and backups. </br> - Includes both assessment and migration capabilities. |
-|[Log Replay Service](../../managed-instance/log-replay-service-migrate.md) | - Migrate individual line-of-business application databases. </br> - More control is needed for database migrations. </br> </br> Supported sources: </br> - SQL Server (2008 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - The migration entails making full database backups on SQL Server and copying backup files to Azure Blob Storage. Log Replay Service is used to restore backup files from Azure Blob Storage to SQL Managed Instance. </br> - Databases being restored during the migration process will be in a restoring mode and can't be used to read or write until the process has finished.|
-|[Link feature for Managed Instance](../../managed-instance/link-feature.md) | - Migrate individual line-of-business application databases. </br> - More control is needed for database migrations. </br> - Minimum downtime migration is needed. </br> </br> Supported sources: </br> - SQL Server (2016 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - GCP Compute SQL Server VM | - The migration entails establishing a VPN connectivity between SQL Server and Managed Instance, and opening inbound communication ports. </br> - Always On technology is used to replicate database near real-time, making an exact replica of SQL Server database on Managed Instance. </br> - Database can be used for R/O access on Managed Instance while migration is in progress. </br> - Provides the best performance minimum downtime migration. |
## Data sync and cutover
azure-sql Sql Server To Managed Instance Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/migration-guides/managed-instance/sql-server-to-managed-instance-overview.md
We recommend the following migration tools:
|[Azure Database Migration Service](../../../dms/tutorial-sql-server-to-managed-instance.md) | This Azure service supports migration in the offline mode for applications that can afford downtime during the migration process. Unlike the continuous migration in online mode, offline mode migration runs a one-time restore of a full database backup from the source to the target. | |[Native backup and restore](../../managed-instance/restore-sample-database-quickstart.md) | SQL Managed Instance supports restore of native SQL Server database backups (.bak files). It's the easiest migration option for customers who can provide full database backups to Azure Storage.| |[Log Replay Service](../../managed-instance/log-replay-service-migrate.md) | This cloud service is enabled for SQL Managed Instance based on SQL Server log-shipping technology. It's a migration option for customers who can provide full, differential, and log database backups to Azure Storage. Log Replay Service is used to restore backup files from Azure Blob Storage to SQL Managed Instance.|
-|[Link feature for Managed Instance](../../managed-instance/link-feature.md) | This feature enables online migration to Managed Instance using Always On technology. It's a migration option for customers who require database on Managed Instance to be accessible in R/O mode while migration is in progress, who need to keep the migration running for prolonged periods of time (weeks or months at the time), who require true online replication to Business Critical service tier, and for customers who require the most performant minimum downtime migration. |
+|[Managed Instance link](../../managed-instance/managed-instance-link-feature-overview.md) | This feature enables online migration to Managed Instance using Always On technology. It's a migration option for customers who require the database on Managed Instance to be accessible in R/O mode while migration is in progress, who need to keep the migration running for prolonged periods of time (weeks or months at a time), who require true online replication to the Business Critical service tier, and for customers who require the most performant minimum downtime migration. |
The following table lists alternative migration tools:
Compare migration options to choose the path that's appropriate to your business needs.
-The following table compares the migration options that we recommend:
+The following table compares the recommended migration options:
|Migration option |When to use |Considerations | |||| |[Azure SQL Migration extension for Azure Data Studio](../../../dms/migration-using-azure-data-studio.md) | - Migrate single databases or multiple databases at scale. </br> - Can run in both online (minimal downtime) and offline (acceptable downtime) modes. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Easy to setup and get started. </br> - Requires setup of self-hosted integration runtime to access on-premises SQL Server and backups. </br> - Includes both assessment and migration capabilities. | |[Azure Database Migration Service](../../../dms/tutorial-sql-server-to-managed-instance.md) | - Migrate single databases or multiple databases at scale. </br> - Can accommodate downtime during the migration process. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Migrations at scale can be automated via [PowerShell](../../../dms/howto-sql-server-to-azure-sql-managed-instance-powershell-offline.md). </br> - Time to complete migration depends on database size and is affected by backup and restore time. </br> - Sufficient downtime might be required. | |[Native backup and restore](../../managed-instance/restore-sample-database-quickstart.md) | - Migrate individual line-of-business application databases. </br> - Quick and easy migration without a separate migration service or tool. </br> </br> Supported sources: </br> - SQL Server (2005 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - Database backup uses multiple threads to optimize data transfer to Azure Blob Storage, but partner bandwidth and database size can affect transfer rate. </br> - Downtime should accommodate the time required to perform a full backup and restore (which is a size of data operation).|
-|[Log Replay Service](../../managed-instance/log-replay-service-migrate.md) | - Migrate individual line-of-business application databases. </br> - More control is needed for database migrations. </br> </br> Supported sources: </br> - SQL Server (2008 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - The migration entails making full database backups on SQL Server and copying backup files to Azure Blob Storage. Log Replay Service is used to restore backup files from Azure Blob Storage to SQL Managed Instance. </br> - Databases being restored during the migration process will be in a restoring mode and can't be used to read or write until the process has finished.|
-|[Link feature for Managed Instance](../../managed-instance/link-feature.md) | - Migrate individual line-of-business application databases. </br> - More control is needed for database migrations. </br> - Minimum downtime migration is needed. </br> </br> Supported sources: </br> - SQL Server (2016 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - GCP Compute SQL Server VM | - The migration entails establishing a VPN connectivity between SQL Server and Managed Instance, and opening inbound communication ports. </br> - Always On technology is used to replicate database near real-time, making an exact replica of SQL Server database on Managed Instance. </br> - Database can be used for R/O access on Managed Instance while migration is in progress. </br> - Provides the best performance minimum downtime migration. |
+|[Log Replay Service](../../managed-instance/log-replay-service-migrate.md) | - Migrate individual line-of-business application databases. </br> - More control is needed for database migrations. </br> </br> Supported sources: </br> - SQL Server (2008 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - AWS RDS </br> - GCP Compute SQL Server VM | - The migration entails making full database backups on SQL Server and copying backup files to Azure Blob Storage. Log Replay Service is used to restore backup files from Azure Blob Storage to SQL Managed Instance. </br> - Databases being restored during the migration process will be in a restoring mode and can't be used for read or write workloads until the process is complete.|
+|[Managed Instance link](../../managed-instance/managed-instance-link-feature-overview.md) | - Migrate individual line-of-business application databases. </br> - More control is needed for database migrations. </br> - Minimum downtime migration is needed. </br> </br> Supported sources: </br> - SQL Server (2016 to 2019) on-premises or Azure VM </br> - AWS EC2 </br> - GCP Compute SQL Server VM | - The migration entails establishing a network connection between SQL Server and SQL Managed Instance, and opening communication ports. </br> - Uses [Always On availability group](/sql/database-engine/availability-groups/windows/overview-of-always-on-availability-groups-sql-server) technology to replicate the database in near real time, making an exact replica of the SQL Server database on SQL Managed Instance. </br> - The database can be used for read-only access on SQL Managed Instance while migration is in progress. </br> - Provides the best performance during migration with minimum downtime. |
The following table compares the alternative migration options:
azure-video-analyzer Compare Video Indexer With Media Services Presets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/compare-video-indexer-with-media-services-presets.md
-# Compare Azure Media Services v3 presets and Video Analyzer for Media
+# Compare Azure Media Services v3 presets and Video Analyzer for Media
-This article compares the capabilities of **Video Analyzer for Media (formerly Video Indexer) APIs** and **Media Services v3 APIs**.
+This article compares the capabilities of **Video Analyzer for Media (formerly Video Indexer) APIs** and **Media Services v3 APIs**.
-Currently, there is an overlap between features offered by the [Video Analyzer for Media APIs](https://api-portal.videoindexer.ai/) and the [Media Services v3 APIs](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2018-07-01/Encoding.json). The following table offers the current guideline for understanding the differences and similarities.
+Currently, there is an overlap between features offered by the [Video Analyzer for Media APIs](https://api-portal.videoindexer.ai/) and the [Media Services v3 APIs](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2018-07-01/Encoding.json). The following table offers the current guideline for understanding the differences and similarities.
## Compare

|Feature|Video Analyzer for Media APIs |Video Analyzer and Audio Analyzer Presets<br/>in Media Services v3 APIs|
||||
-|Media Insights|[Enhanced](video-indexer-output-json-v2.md) |[Fundamentals](../../media-services/latest/analyze-video-audio-files-concept.md)|
+|Media Insights|[Enhanced](video-indexer-output-json-v2.md) |[Fundamentals](/media-services/latest/analyze-video-audio-files-concept)|
|Experiences|See the full list of supported features: <br/> [Overview](video-indexer-overview.md)|Returns video insights only|
|Billing|[Media Services pricing](https://azure.microsoft.com/pricing/details/media-services/#analytics)|[Media Services pricing](https://azure.microsoft.com/pricing/details/media-services/#analytics)|
|Compliance|For the most current compliance updates, visit [Azure Compliance Offerings.pdf](https://gallery.technet.microsoft.com/Overview-of-Azure-c1be3942/file/178110/23/Microsoft%20Azure%20Compliance%20Offerings.pdf) and search for "Video Analyzer for Media" to see if it complies with a certificate of interest.|For the most current compliance updates, visit [Azure Compliance Offerings.pdf](https://gallery.technet.microsoft.com/Overview-of-Azure-c1be3942/file/178110/23/Microsoft%20Azure%20Compliance%20Offerings.pdf) and search for "Media Services" to see if it complies with a certificate of interest.|
Currently, there is an overlap between features offered by the [Video Analyzer f
[Video Analyzer for Media overview](video-indexer-overview.md)
-[Media Services v3 overview](../../media-services/latest/media-services-overview.md)
+[Media Services v3 overview](/media-services/latest/media-services-overview)
azure-video-analyzer Connect To Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/connect-to-azure.md
Last updated 10/19/2021
-
+ # Create a Video Analyzer for Media account When creating an Azure Video Analyzer for Media (formerly Video Indexer) account, you can choose a free trial account (where you get a certain number of free indexing minutes) or a paid option (where you're not limited by the quota). With a free trial, Video Analyzer for Media provides up to 600 minutes of free indexing to users and up to 2400 minutes of free indexing to users that subscribe to the Video Analyzer API on the [developer portal](https://aka.ms/avam-dev-portal). With the paid options, Azure Video Analyzer for Media offers two types of accounts: classic accounts (General Availability) and ARM-based accounts (Public Preview). The main difference between the two is the account management platform. While classic accounts are built on the API Management platform, ARM-based account management is built on Azure, which enables you to apply access control to all services with role-based access control (Azure RBAC) natively.
If the connection to Azure failed, you can attempt to troubleshoot the problem b
### Create and configure a Media Services account
-1. Use the [Azure](https://portal.azure.com/) portal to create an Azure Media Services account, as described in [Create an account](../../media-services/previous/media-services-portal-create-account.md).
+1. Use the [Azure](https://portal.azure.com/) portal to create an Azure Media Services account, as described in [Create an account](/media-services/previous/media-services-portal-create-account).
+
+ Make sure the Media Services account was created with the classic APIs.
- Make sure the Media Services account was created with the classic APIs.
-
![Media Services classic API](./media/create-account/enable-classic-api.png)
If the connection to Azure failed, you can attempt to troubleshoot the problem b
In the new Media Services account, select **Streaming endpoints**. Then select the streaming endpoint and press start. ![Streaming endpoints](./media/create-account/create-ams-account-se.png)
-4. For Video Analyzer for Media to authenticate with Media Services API, an AD app needs to be created. The following steps guide you through the Azure AD authentication process described in [Get started with Azure AD authentication by using the Azure portal](../../media-services/previous/media-services-portal-get-started-with-aad.md):
+4. For Video Analyzer for Media to authenticate with Media Services API, an AD app needs to be created. The following steps guide you through the Azure AD authentication process described in [Get started with Azure AD authentication by using the Azure portal](/media-services/previous/media-services-portal-get-started-with-aad):
1. In the new Media Services account, select **API access**.
- 2. Select [Service principal authentication method](../../media-services/previous/media-services-portal-get-started-with-aad.md).
+ 2. Select [Service principal authentication method](/media-services/previous/media-services-portal-get-started-with-aad).
 3. Get the client ID and client secret. After you select **Settings** -> **Keys**, add **Description**, press **Save**, and the key value gets populated.
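To sanity-check the app registration, you can acquire an Azure AD token with the client ID and secret you just recorded. The following minimal sketch uses the Azure.Identity library; the tenant ID and credential values are placeholders, and the `https://rest.media.azure.net/.default` scope is an assumption for the classic Media Services API rather than a value taken from this article.

```csharp
using System;
using Azure.Core;
using Azure.Identity;

// Sketch: acquire an Azure AD token with the AD app's client ID and secret.
// All values below are placeholders, not real credentials.
var credential = new ClientSecretCredential(
    tenantId: "<your-tenant-id>",
    clientId: "<client-id-from-api-access>",
    clientSecret: "<client-secret-from-settings-keys>");

// Assumed resource scope for the classic Media Services API.
AccessToken token = await credential.GetTokenAsync(
    new TokenRequestContext(new[] { "https://rest.media.azure.net/.default" }));

Console.WriteLine($"Token acquired; expires on {token.ExpiresOn}");
```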
In the dialog, provide the following information:
When creating a new **ARM-Based** account, you have an option to import your content from the *trial* account into the new **ARM-Based** account free of charge. > [!NOTE] > * Import from trial can be performed only once per trial account.
-> * The target ARM-Based account needs to be created and available before import is assigned.
+> * The target ARM-Based account needs to be created and available before import is assigned.
> * Target ARM-Based account has to be an empty account (one that has never indexed any media files). To import your data, follow these steps:
To import your data, follow the steps:
3. Click the *Import content to an ARM-based account* 4. From the dropdown menu choose the ARM-based account you wish to import the data to. * If the account ID isn't showing, you can copy and paste the account ID from Azure portal or the account list, on the side blade in the Azure Video Analyzer for Media Portal.
- 5. Click **Import content**
+ 5. Click **Import content**
![import](./media/create-account/import-steps.png)
All media and content model customizations will be copied from the *trial* accou
The following Azure Media Services related considerations apply:
-* If you plan to connect to an existing Media Services account, make sure the Media Services account was created with the classic APIs.
-
+* If you plan to connect to an existing Media Services account, make sure the Media Services account was created with the classic APIs.
+ ![Media Services classic API](./media/create-account/enable-classic-api.png) * If you connect to an existing Media Services account, Video Analyzer for Media doesn't change the existing media **Reserved Units** configuration.
The following Azure Media Services related considerations apply:
* If you connect automatically, Video Analyzer for Media sets the media **Reserved Units** to 10 S3 units: ![Media Services reserved units](./media/create-account/ams-reserved-units.png)
-
+ ## Automate creation of the Video Analyzer for Media account Automating the creation of the account is a two-step process:
-
+ 1. Use Azure Resource Manager to create an Azure Media Services account + Azure AD application. See an example of the [Media Services account creation template](https://github.com/Azure-Samples/media-services-v3-arm-templates).
Automating the creation of the account is a two-step process:
To create a paid account via the Video Analyzer for Media portal:
-1. Go to https://videoindexer.ai.azure.us
+1. Go to https://videoindexer.ai.azure.us
1. Log in with your Azure Government Azure AD account.
-1. If you do not have any Video Analyzer for Media accounts in Azure Government that you are an owner or a contributor to, you will get an empty experience from which you can start creating your account.
+1. If you do not have any Video Analyzer for Media accounts in Azure Government that you are an owner or a contributor to, you will get an empty experience from which you can start creating your account.
- The rest of the flow is as described in above , only the regions to select from will be Government regions in which Video Analyzer for Media is available
+ The rest of the flow is as described above, except that the regions to select from are the Government regions in which Video Analyzer for Media is available.
If you are already a contributor or an admin of one or more existing Video Analyzer for Media accounts in Azure Government, you will be taken to that account, and from there you can follow the steps described above to create an additional account if needed.
-
+ ### Create new account via the API on Azure Government To create a paid account in Azure Government, follow the instructions in [Create-Paid-Account](). This API endpoint only includes Government cloud regions. ### Limitations of Video Analyzer for Media on Azure Government
-* No manual content moderation available in Government cloud.
+* No manual content moderation available in Government cloud.
- In the public cloud when content is deemed offensive based on a content moderation, the customer can ask for a human to look at that content and potentially revert that decision.
-* No trial accounts.
-* Bing description - in Gov cloud we will not present a description of celebrities and named entities identified. This is a UI capability only.
+ In the public cloud, when content is deemed offensive based on content moderation, the customer can ask for a human to review that content and potentially revert that decision.
+* No trial accounts.
+* Bing description - in the Government cloud, we will not present a description of identified celebrities and named entities. This is a UI capability only.
## Clean up resources
After you are done with this tutorial, delete resources that you are not plannin
If you want to delete a Video Analyzer for Media account, you can delete the account from the Video Analyzer for Media website. To delete the account, you must be the owner.
-Select the account -> **Settings** -> **Delete this account**.
+Select the account -> **Settings** -> **Delete this account**.
The account will be permanently deleted in 90 days.
azure-video-analyzer Considerations When Use At Scale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/considerations-when-use-at-scale.md
Title: Things to consider when using Azure Video Analyzer for Media (formerly Vi
description: This topic explains what things to consider when using Azure Video Analyzer for Media (formerly Video Indexer) at scale. Last updated 11/13/2020-+ # Things to consider when using Video Analyzer for Media at scale
-When using Azure Video Analyzer for Media (formerly Video Indexer) to index videos and your archive of videos is growing, consider scaling.
+If you use Azure Video Analyzer for Media (formerly Video Indexer) to index videos and your archive of videos is growing, consider scaling.
This article answers questions like:
First, it has file size limitations. The size of the byte array file is limited
Second, consider just some of the issues that can affect your performance and hence your ability to scale:
-* Sending files using multi-part means high dependency on your network,
-* service reliability,
-* connectivity,
-* upload speed,
+* Sending files using multi-part means high dependency on your network,
+* service reliability,
+* connectivity,
+* upload speed,
* lost packets somewhere in the world wide web. :::image type="content" source="./media/considerations-when-use-at-scale/first-consideration.png" alt-text="First consideration for using Video Analyzer for Media at scale":::
When you upload videos using URL, you just need to provide a path to the locatio
To see an example of how to upload videos using a URL, check out [this example](upload-index-videos.md#code-sample). Or, you can use [AzCopy](../../storage/common/storage-use-azcopy-v10.md) for a fast and reliable way to get your content to a storage account from which you can submit it to Video Analyzer for Media using a [SAS URL](../../storage/common/storage-sas-overview.md). Video Analyzer for Media recommends using *read-only* SAS URLs.
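As a sketch of that recommendation, the example below builds a read-only SAS URL for a blob with the Azure.Storage.Blobs SDK; the storage account, key, container, and blob names are placeholders.

```csharp
using System;
using Azure.Storage;
using Azure.Storage.Sas;

// Sketch: build a read-only SAS URL for a video blob.
// The account, container, and blob names here are placeholders.
var credential = new StorageSharedKeyCredential("mystorageaccount", "<account-key>");

var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = "videos",
    BlobName = "my-video.mp4",
    Resource = "b", // "b" = a single blob
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(4)
};
sasBuilder.SetPermissions(BlobSasPermissions.Read); // read-only, as recommended

var sasUri = new UriBuilder("https://mystorageaccount.blob.core.windows.net/videos/my-video.mp4")
{
    Query = sasBuilder.ToSasQueryParameters(credential).ToString()
}.Uri;

Console.WriteLine(sasUri); // pass this URL when submitting the video for indexing
```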
-## Automatic Scaling of Media Reserved Units
+## Automatic Scaling of Media Reserved Units
-Starting August 1st 2021, Azure Video Analyzer for Media (formerly Video Indexer) enabled [Reserved Units](../../media-services/latest/concept-media-reserved-units.md)(MRUs) auto scaling by [Azure Media Services](../../media-services/latest/media-services-overview.md) (AMS), as a result you do not need to manage them through Azure Video Analyzer for Media. That will allow price optimization, e.g. price reduction in many cases, based on your business needs as it is being auto scaled.
+Starting August 1st 2021, Azure Video Analyzer for Media (formerly Video Indexer) enabled [Reserved Units](/media-services/latest/concept-media-reserved-units)(MRUs) auto scaling by [Azure Media Services](/media-services/latest/media-services-overview) (AMS), as a result you do not need to manage them through Azure Video Analyzer for Media. That will allow price optimization, e.g. price reduction in many cases, based on your business needs as it is being auto scaled.
## Respect throttling
Video Analyzer for Media is built to deal with indexing at scale, and when you w
We recommend that instead of constantly polling the status of your request from the second you send the upload request, you add a [callback URL](upload-index-videos.md#callbackurl) and wait for Video Analyzer for Media to update you. As soon as there is any status change in your upload request, you get a POST notification to the URL you specified.
-You can add a callback URL as one of the parameters of the [upload video API](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video). Check out the code samples in [GitHub repo](https://github.com/Azure-Samples/media-services-video-indexer/tree/master/).
+You can add a callback URL as one of the parameters of the [upload video API](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video). Check out the code samples in [GitHub repo](https://github.com/Azure-Samples/media-services-video-indexer/tree/master/).
For the callback URL, you can also use Azure Functions, a serverless event-driven platform that can be triggered by HTTP, and implement the following flow.
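As one way to implement the receiving side, here is a minimal sketch of an HTTP endpoint that could be registered as the callback URL. It uses an ASP.NET Core minimal API; the route is hypothetical, and the body is logged raw rather than parsed, since the notification schema is described in the upload video API reference.

```csharp
using System;
using System.IO;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Hypothetical route registered as the callbackUrl upload parameter.
// Video Analyzer for Media sends a POST here whenever the request state changes.
app.MapPost("/videoindexer/callback", async (HttpRequest request) =>
{
    using var reader = new StreamReader(request.Body);
    var payload = await reader.ReadToEndAsync();
    Console.WriteLine($"State change notification: {payload}");
    return Results.Ok(); // acknowledge the notification
});

app.Run();
```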
For example, don't set the preset to streaming if you don't plan to watch the
## Index in optimal resolution, not highest resolution
-You might be asking, what video quality do you need for indexing your videos?
+You might be asking, what video quality do you need for indexing your videos?
In many cases, indexing performance has almost no difference between HD (720P) videos and 4K videos. Eventually, you'll get almost the same insights with the same confidence. The higher the quality of the movie you upload, the higher the file size, and this leads to higher computing power and time needed to upload the video.
azure-video-analyzer Create Video Analyzer For Media Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/create-video-analyzer-for-media-account.md
To start using Azure Video Analyzer for Media, you will need to create a Video A
![Image of create account](media/create-video-analyzer-for-media-account/create-account-blade.png)
-
+ | Name | Description | | || |**Subscription**|Choose the subscription that you are using to create the Video Analyzer for Media account.|
To start using Azure Video Analyzer for Media, you will need to create a Video A
|**Video Analyzer for Media account**|Select *Create a new account* option.| |**Resource name**|Enter the name of the new Video Analyzer for Media account, the name can contain letters, numbers and dashes with no spaces.| |**Location**|Select the geographic region that will be used to deploy the Video Analyzer for Media account. The location matches the **resource group location** you chose, if you'd like to change the selected location change the selected resource group or create a new one in the preferred location. [Azure region in which Video Analyzer for Media is available](https://azure.microsoft.com/global-infrastructure/services/?products=cognitive-services&regions=all)|
-|**Media Services account name**|Select a Media Services that the new Video Analyzer for Media account will use to process the videos. You can select an existing Media Services or you can create a new one. The Media Services must be in the same location you selected.|
+|**Media Services account name**|Select a Media Services account that the new Video Analyzer for Media account will use to process the videos. You can select an existing Media Services account or create a new one. The Media Services account must be in the same location you selected.|
|**User-assigned managed identity**|Select a user-assigned managed identity that the new Video Analyzer for Media account will use to access the Media Services account. You can select an existing user-assigned managed identity or you can create a new one. The user-assigned managed identity will be assigned the Contributor role on the Media Services account.| 1. Click **Review + create** at the bottom of the form.
Learn how to [Upload a video using C#](https://github.com/Azure-Samples/media-se
<!-- links --> [docs-uami]: ../../active-directory/managed-identities-azure-resources/overview.md
-[docs-ms]: ../../media-services/latest/media-services-overview.md
+[docs-ms]: /media-services/latest/media-services-overview
[docs-role-contributor]: ../../role-based-access-control/built-in-roles.md#contributor [docs-contributor-on-ms]: ./add-contributor-role-on-the-media-service.md
azure-video-analyzer Deploy With Arm Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/deploy-with-arm-template.md
Title: Deploy Azure Video Analyzer for Media with ARM template
+ Title: Deploy Azure Video Analyzer for Media with ARM template
description: In this tutorial you will create an Azure Video Analyzer for Media account by using Azure Resource Manager (ARM) template.
Last updated 12/01/2021
-# Tutorial: deploy Azure Video Analyzer for Media with ARM template
+# Tutorial: deploy Azure Video Analyzer for Media with ARM template
## Overview
-In this tutorial you will create an Azure Video Analyzer for Media (formerly Video Indexer) account by using Azure Resource Manager (ARM) template (preview).
+In this tutorial you will create an Azure Video Analyzer for Media (formerly Video Indexer) account by using Azure Resource Manager (ARM) template (preview).
The resource will be deployed to your subscription and will create the Azure Video Analyzer for Media resource based on parameters defined in the avam.template file. > [!NOTE]
The resource will be deployed to your subscription and will create the Azure Vid
## Prerequisites
-* An Azure Media Services (AMS) account. You can create one for free through the [Create AMS Account](../../media-services/latest/account-create-how-to.md).
+* An Azure Media Services (AMS) account. You can create one for free through the [Create AMS Account](/media-services/latest/account-create-how-to).
## Deploy the sample
The resource will be deployed to your subscription and will create the Azure Vid
### Option 1: Click the "Deploy to Azure" button and fill in the missing parameters
-[![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure-Samples%2Fmedia-services-video-indexer%2Fmaster%2FARM-Samples%2FCreate-Account%2Favam.template.json)
+[![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure-Samples%2Fmedia-services-video-indexer%2Fmaster%2FARM-Samples%2FCreate-Account%2Favam.template.json)
-
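If you prefer to run the same deployment from code instead of the portal button, a sketch along the following lines with the Azure.ResourceManager SDK should work. The resource group name, deployment name, and the template parameter shown are assumptions; check avam.template.json for the parameters it actually defines.

```csharp
using System;
using Azure;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Resources;
using Azure.ResourceManager.Resources.Models;

// Sketch: deploy the avam.template.json ARM template with the Azure SDK.
var armClient = new ArmClient(new DefaultAzureCredential());
SubscriptionResource subscription = await armClient.GetDefaultSubscriptionAsync();
ResourceGroupResource resourceGroup = await subscription.GetResourceGroupAsync("my-resource-group");

var deploymentContent = new ArmDeploymentContent(
    new ArmDeploymentProperties(ArmDeploymentMode.Incremental)
    {
        TemplateLink = new ArmDeploymentTemplateLink
        {
            Uri = new Uri("https://raw.githubusercontent.com/Azure-Samples/media-services-video-indexer/master/ARM-Samples/Create-Account/avam.template.json")
        },
        // The parameter name below is illustrative; see the template for the real ones.
        Parameters = BinaryData.FromObjectAsJson(new
        {
            name = new { value = "my-avam-account" }
        })
    });

await resourceGroup.GetArmDeployments().CreateOrUpdateAsync(
    WaitUntil.Completed, "avam-account-deployment", deploymentContent);
```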
azure-video-analyzer Odrv Download https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/odrv-download.md
Last updated 12/17/2021
-# Index your videos stored on OneDrive
+# Index your videos stored on OneDrive
This article shows how to index videos stored on OneDrive by using the Azure Video Analyzer for Media (formerly Video Indexer) website. ## Supported file formats
-For a list of file formats that you can use with Video Analyzer for Media, see [Standard Encoder formats and codecs](../../media-services/latest/encode-media-encoder-standard-formats-reference.md).
-
+For a list of file formats that you can use with Video Analyzer for Media, see [Standard Encoder formats and codecs](/media-services/latest/encode-media-encoder-standard-formats-reference).
+ ## Index a video by using the website 1. Sign into the [Video Analyzer for Media](https://www.videoindexer.ai/) website, and then select **Upload**.
For a list of file formats that you can use with Video Analyzer for Media, see [
> [!div class="mx-imgBorder"] > :::image type="content" source="./media/video-indexer-get-started/video-indexer-upload.png" alt-text="Screenshot that shows the Upload button.":::
-1. Click on **enter a file URL** button
+1. Click the **enter a file URL** button
> [!div class="mx-imgBorder"] > :::image type="content" source="./media/video-indexer-get-started/avam-enter-file-url.png" alt-text="Screenshot that shows the enter file URL button.":::
For a list of file formats that you can use with Video Analyzer for Media, see [
1. Copy the embed code and extract only the URL part including the key. For example: `https://onedrive.live.com/embed?cid=5BC591B7C713B04F&resid=5DC518B6B713C40F%2110126&authkey=HnsodidN_50oA3lLfk`
-
+ Replace **embed** with **download**. You will now have a URL that looks like this:
-
+ `https://onedrive.live.com/download?cid=5BC591B7C713B04F&resid=5DC518B6B713C40F%2110126&authkey=HnsodidN_50oA3lLfk` 1. Now enter this URL in the Azure Video Analyzer for Media portal in the URL field.
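For reference, the embed-to-download rewrite above is a plain string substitution on the URL; here is a small sketch using the sample URL from this article:

```csharp
using System;

// Sketch: turn a OneDrive embed URL into a direct-download URL by swapping
// the "embed" path segment for "download". The query string stays the same.
static string ToDownloadUrl(string embedUrl) =>
    embedUrl.Replace("onedrive.live.com/embed", "onedrive.live.com/download");

var embedUrl = "https://onedrive.live.com/embed?cid=5BC591B7C713B04F&resid=5DC518B6B713C40F%2110126&authkey=HnsodidN_50oA3lLfk";
Console.WriteLine(ToDownloadUrl(embedUrl));
// Prints: https://onedrive.live.com/download?cid=5BC591B7C713B04F&resid=5DC518B6B713C40F%2110126&authkey=HnsodidN_50oA3lLfk
```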
Once Video Analyzer for Media is done analyzing, you will receive an email with
## Upload and index a video by using the API
-You can use the [Upload Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video) API to upload and index your videos based on a URL. The code sample that follows includes the commented-out code that shows how to upload the byte array.
+You can use the [Upload Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video) API to upload and index your videos based on a URL. The code sample that follows includes the commented-out code that shows how to upload the byte array.
### Configurations and parameters This section describes some of the optional parameters and when to set them. For the most up-to-date info about parameters, see the [Video Analyzer for Media portal](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video).
-#### externalID
+#### externalID
Use this parameter to specify an ID that will be associated with the video. The ID is useful for integration with an external video content management (VCM) system. The videos that are in the Video Analyzer for Media portal can be searched via the specified external ID. #### callbackUrl
-Use this parameter to specify a callback URL.
+Use this parameter to specify a callback URL.
[!INCLUDE [callback url](./includes/callback-url.md)]
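To make these two parameters concrete, here is a condensed sketch of an upload-from-URL request that sets both `externalId` and `callbackUrl` as query parameters. The account details and IDs are placeholders; the full, runnable sample appears later in this article.

```csharp
using System;
using System.Net.Http;

// Sketch: upload-from-URL request with the optional parameters described above.
var location = "trial";                  // placeholder account location
var accountId = "<account-id>";          // placeholder
var accessToken = "<access-token>";      // placeholder
var videoUrl = Uri.EscapeDataString("https://example.com/videos/my-video.mp4");
var callbackUrl = Uri.EscapeDataString("https://example.com/videoindexer/callback");

var requestUri =
    $"https://api.videoindexer.ai/{location}/Accounts/{accountId}/Videos" +
    $"?accessToken={accessToken}" +
    "&name=my-video" +
    "&externalId=vcm-12345" +        // ID used by an external VCM system
    $"&callbackUrl={callbackUrl}" +  // POST notifications on state changes
    $"&videoUrl={videoUrl}";

using var client = new HttpClient();
var response = await client.PostAsync(requestUri, content: null);
Console.WriteLine(await response.Content.ReadAsStringAsync());
```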
Use this parameter to define an AI bundle that you want to apply on your audio o
- `BasicAudio`: Index and extract insights by using audio only (ignoring video). Include only basic audio features (transcription, translation, formatting of output captions and subtitles). - `AdvancedAudio`: Index and extract insights by using audio only (ignoring video). Include advanced audio features (such as audio event detection) in addition to the standard audio analysis. - `AdvancedVideo`: Index and extract insights by using video only (ignoring audio). Include advanced video features (such as observed people tracing) in addition to the standard video analysis.-- `AdvancedVideoAndAudio`: Index and extract insights by using both advanced audio and advanced video analysis.
+- `AdvancedVideoAndAudio`: Index and extract insights by using both advanced audio and advanced video analysis.
> [!NOTE]
-> The preceding advanced presets include models that are in public preview. When these models reach general availability, there might be implications for the price.
+> The preceding advanced presets include models that are in public preview. When these models reach general availability, there might be implications for the price.
Video Analyzer for Media covers up to two tracks of audio. If the file has more audio tracks, they're treated as one track. If you want to index the tracks separately, you need to extract the relevant audio file and index it as `AudioOnly`.
This parameter is supported only for paid accounts.
#### streamingPreset
-After your video is uploaded, Video Analyzer for Media optionally encodes the video. It then proceeds to indexing and analyzing the video. When Video Analyzer for Media is done analyzing, you get a notification with the video ID.
+After your video is uploaded, Video Analyzer for Media optionally encodes the video. It then proceeds to indexing and analyzing the video. When Video Analyzer for Media is done analyzing, you get a notification with the video ID.
-When you're using the [Upload Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video) or [Re-Index Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Re-Index-Video) API, one of the optional parameters is `streamingPreset`. If you set `streamingPreset` to `Default`, `SingleBitrate`, or `AdaptiveBitrate`, the encoding process is triggered.
+When you're using the [Upload Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video) or [Re-Index Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Re-Index-Video) API, one of the optional parameters is `streamingPreset`. If you set `streamingPreset` to `Default`, `SingleBitrate`, or `AdaptiveBitrate`, the encoding process is triggered.
After the indexing and encoding jobs are done, the video is published so you can also stream your video. The streaming endpoint from which you want to stream the video must be in the **Running** state. For `SingleBitrate`, the standard encoder cost will apply for the output. If the video height is greater than or equal to 720, Video Analyzer for Media encodes it as 1280 x 720. Otherwise, it's encoded as 640 x 468.
-The default setting is [content-aware encoding](../../media-services/latest/encode-content-aware-concept.md).
+The default setting is [content-aware encoding](/media-services/latest/encode-content-aware-concept).
If you only want to index your video and not encode it, set `streamingPreset` to `NoStreaming`.
The following C# code snippets demonstrate the usage of all the Video Analyzer f
After you copy the following code into your development platform, you'll need to provide two parameters:
-* API key (`apiKey`): Your personal API management subscription key. It allows you to get an access token in order to perform operations on your Video Analyzer for Media account.
+* API key (`apiKey`): Your personal API management subscription key. It allows you to get an access token in order to perform operations on your Video Analyzer for Media account.
To get your API key:
After you copy the following code into your development platform, you'll need to
* Video URL (`videoUrl`): A URL of the video or audio file to be indexed. Here are the requirements:
- - The URL must point at a media file. (HTML pages are not supported.)
+ - The URL must point at a media file. (HTML pages are not supported.)
- The file can be protected by an access token that's provided as part of the URI. The endpoint that serves the file must be secured with TLS 1.2 or later.
- - The URL must be encoded.
+ - The URL must be encoded.
-The result of successfully running the code sample includes an insight widget URL and a player widget URL. They allow you to examine the insights and the uploaded video, respectively.
+The result of successfully running the code sample includes an insight widget URL and a player widget URL. They allow you to examine the insights and the uploaded video, respectively.
```csharp
public async Task Sample()
HttpResponseMessage result = await client.GetAsync($"{apiUrl}/auth/trial/Accounts?{queryParams}"); var json = await result.Content.ReadAsStringAsync(); var accounts = JsonConvert.DeserializeObject<AccountContractSlim[]>(json);
-
- // Take the relevant account. Here we simply take the first.
+
+ // Take the relevant account. Here we simply take the first.
// You can also get the account via accounts.First(account => account.Id == <GUID>); var accountInfo = accounts.First();
public class AccountContractSlim
### [Azure Resource Manager account](#tab/with-arm-account-account/)
-After you copy this C# project into your development platform, you need to take the following steps:
+After you copy this C# project into your development platform, you need to take the following steps:
1. Go to Program.cs and populate:
namespace VideoIndexerArm
Console.WriteLine($"account id: {accountId}"); Console.WriteLine($"account location: {accountLocation}");
- // Get account-level access token for Azure Video Analyzer for Media
+ // Get account-level access token for Azure Video Analyzer for Media
var accessTokenRequest = new AccessTokenRequest { PermissionType = AccessTokenPermission.Contributor,
namespace VideoIndexerArm
[JsonPropertyName("projectId")] public string ProjectId { get; set; }
-
+ [JsonPropertyName("videoId")] public string VideoId { get; set; } }
The upload operation might return the following status codes:
|429||Trial accounts are allowed 5 uploads per minute. Paid accounts are allowed 50 uploads per minute.| ## Uploading considerations and limitations
-
+ - The name of a video must be no more than 80 characters. - When you're uploading a video based on the URL (preferred), the endpoint must be secured with TLS 1.2 or later. - The upload size with the URL option is limited to 30 GB.
The upload operation might return the following status codes:
- The URL provided in the `videoURL` parameter must be encoded. - Indexing Media Services assets has the same limitation as indexing from a URL. - Video Analyzer for Media has a duration limit of 4 hours for a single file.-- The URL must be accessible (for example, a public URL).
+- The URL must be accessible (for example, a public URL).
If it's a private URL, the access token must be provided in the request. - The URL must point to a valid media file and not to a webpage, such as a link to the `www.youtube.com` page.
The upload operation might return the following status codes:
> [!Tip] > We recommend that you use .NET Framework version 4.6.2. or later, because older .NET Framework versions don't default to TLS 1.2. >
-> If you must use an older .NET Framework version, add one line to your code before making the REST API call:
+> If you must use an older .NET Framework version, add one line to your code before making the REST API call:
> > `System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;`
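Given the per-minute upload limits behind the 429 status code above, a small retry helper can smooth over transient rate limiting. This is a sketch, not part of the documented sample; the attempt count and fallback delay are arbitrary choices.

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch: retry an upload when the service answers 429 (rate limited).
// Trial accounts allow 5 uploads per minute and paid accounts 50, so a
// short back-off is usually enough.
async Task<HttpResponseMessage> PostWithRetryAsync(
    HttpClient client, string requestUri, int maxAttempts = 5)
{
    for (var attempt = 1; ; attempt++)
    {
        var response = await client.PostAsync(requestUri, content: null);
        if (response.StatusCode != HttpStatusCode.TooManyRequests || attempt == maxAttempts)
            return response;

        // Honor Retry-After when the service supplies it; otherwise wait a minute.
        var delay = response.Headers.RetryAfter?.Delta ?? TimeSpan.FromSeconds(60);
        await Task.Delay(delay);
    }
}
```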
azure-video-analyzer Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/release-notes.md
Added new code samples including HTTP calls to use Video Analyzer for Media crea
### Improved audio effects detection
-The audio effects detection capability was improved to have a better detection rate over the following classes:
+The audio effects detection capability was improved to have a better detection rate over the following classes:
-* Crowd reactions (cheering, clapping, and booing),
-* Gunshot or explosion,
+* Crowd reactions (cheering, clapping, and booing),
+* Gunshot or explosion,
* Laughter For more information, see [Audio effects detection](audio-effects-detection.md). ### New source languages support for STT, translation, and search on the website
-
-Video Analyzer for Media introduces source languages support for STT (speech-to-text), translation, and search in Hebrew (he-IL), Portuguese (pt-PT), and Persian (fa-IR) on the [Video Analyzer for Media](https://www.videoindexer.ai/) website.
+
+Video Analyzer for Media introduces source languages support for STT (speech-to-text), translation, and search in Hebrew (he-IL), Portuguese (pt-PT), and Persian (fa-IR) on the [Video Analyzer for Media](https://www.videoindexer.ai/) website.
This means transcription, translation, and search features are also supported for these languages in Video Analyzer for Media web applications and widgets.
-## December 2021
-
+## December 2021
### The projects feature is now GA The projects feature is now GA and ready for production use. There is no pricing impact related to the "Preview to GA" transition. See [Add video clips to your projects](use-editor-create-project.md).
-
-### New source languages support for STT, translation, and search on API level
+
+### New source languages support for STT, translation, and search on API level
Video Analyzer for Media introduces source languages support for STT (speech-to-text), translation, and search in Hebrew (he-IL), Portuguese (pt-PT), and Persian (fa-IR) on the API level. ### Matched person detection capability
-When indexing a video through our advanced video settings, you can view the new matched person detection capability. If there are people observed in your media file, you can now view the specific person who matched each of them through the media player.
+When indexing a video through our advanced video settings, you can view the new matched person detection capability. If there are people observed in your media file, you can now view the specific person who matched each of them through the media player.
## November 2021
-
+ ### Public preview of Video Analyzer for Media account management based on ARM Azure Video Analyzer for Media introduces a public preview of Azure Resource Manager (ARM) based account management. You can leverage ARM-based Video Analyzer for Media APIs to create, edit, and delete an account from the [Azure portal](https://portal.azure.com/#home). > [!NOTE]
-> The Government cloud includes support for CRUD ARM based accounts from Video Analyzer for Media API and from the Azure portal.
->
+> The Government cloud includes support for CRUD ARM based accounts from Video Analyzer for Media API and from the Azure portal.
+>
> There is currently no support from the Video Analyzer for Media [website](https://www.videoindexer.ai). For more information go to [create a Video Analyzer for Media account](https://techcommunity.microsoft.com/t5/azure-ai/azure-video-analyzer-for-media-is-now-available-as-an-azure/ba-p/2912422). ### People's clothing detection
-When indexing a video through the advanced video settings, you can view the new **People's clothing detection** capability. If there are people detected in your media file, you can now view the clothing type they are wearing through the media player.
+When indexing a video through the advanced video settings, you can view the new **People's clothing detection** capability. If there are people detected in your media file, you can now view the clothing type they are wearing through the media player.
### Face bounding box (preview)
There is now an option to re-index video or audio files that have failed during
Fixed bugs related to CSS, theming and accessibility: * high contrast
-* account settings and insights views in the [portal](https://www.videoindexer.ai).
+* account settings and insights views in the [portal](https://www.videoindexer.ai).
## July 2021 ### Automatic Scaling of Media Reserved Units
-
-Starting August 1st 2021, Azure Video Analyzer for Media (formerly Video Indexer) enabled [Media Reserved Units (MRUs)](../../media-services/latest/concept-media-reserved-units.md) auto scaling by [Azure Media Services](../../media-services/latest/media-services-overview.md), as a result you do not need to manage them through Azure Video Analyzer for Media. That will allow price optimization, for example price reduction in many cases, based on your business needs as it is being auto scaled.
+
+Starting August 1st, 2021, Azure Video Analyzer for Media (formerly Video Indexer) enabled [Media Reserved Units (MRUs)](/media-services/latest/concept-media-reserved-units) auto scaling by [Azure Media Services](/media-services/latest/media-services-overview). As a result, you do not need to manage them through Azure Video Analyzer for Media. This allows price optimization (for example, price reduction in many cases) based on your business needs, as the units are auto scaled.
## June 2021
-
+ ### Video Analyzer for Media deployed in six new regions
-
+ You can now create a Video Analyzer for Media paid account in France Central, Central US, Brazil South, West Central US, Korea Central, and Japan West regions.
-
+ ## May 2021 ### New source languages support for speech-to-text (STT), translation, and search
-Video Analyzer for Media now supports STT, translation, and search in Chinese (Cantonese) ('zh-HK'), Dutch (Netherlands) ('Nl-NL'), Czech ('Cs-CZ'), Polish ('Pl-PL'), Swedish (Sweden) ('Sv-SE'), Norwegian('nb-NO'), Finnish('fi-FI'), Canadian French ('fr-CA'), Thai('th-TH'),
-Arabic: (United Arab Emirates) ('ar-AE', 'ar-EG'), (Iraq) ('ar-IQ'), (Jordan) ('ar-JO'), (Kuwait) ('ar-KW'), (Lebanon) ('ar-LB'), (Oman) ('ar-OM'), (Qatar) ('ar-QA'), (Palestinian Authority) ('ar-PS'), (Syria) ('ar-SY'), and Turkish('tr-TR').
+Video Analyzer for Media now supports STT, translation, and search in Chinese (Cantonese) ('zh-HK'), Dutch (Netherlands) ('Nl-NL'), Czech ('Cs-CZ'), Polish ('Pl-PL'), Swedish (Sweden) ('Sv-SE'), Norwegian('nb-NO'), Finnish('fi-FI'), Canadian French ('fr-CA'), Thai('th-TH'),
+Arabic: (United Arab Emirates) ('ar-AE', 'ar-EG'), (Iraq) ('ar-IQ'), (Jordan) ('ar-JO'), (Kuwait) ('ar-KW'), (Lebanon) ('ar-LB'), (Oman) ('ar-OM'), (Qatar) ('ar-QA'), (Palestinian Authority) ('ar-PS'), (Syria) ('ar-SY'), and Turkish('tr-TR').
These languages are available in both the API and the Video Analyzer for Media website. Select the language from the combobox under **Video source language**. ### New theme for Azure Video Analyzer for Media A new theme is available: 'Azure', along with the 'light' and 'dark' themes. To select a theme, click the gear icon in the top-right corner of the website, and find themes under **User settings**.
-
-### New open-source code you can leverage
+
+### New open-source code you can leverage
Three new GitHub projects are available at our [GitHub repository](https://github.com/Azure-Samples/media-services-video-indexer): * Code to help you leverage the newly added [widget customization](https://github.com/Azure-Samples/media-services-video-indexer/tree/master/Embedding%20widgets). * Solution to help you add [custom search](https://github.com/Azure-Samples/media-services-video-indexer/tree/master/VideoSearchWithAutoMLVision) to your video libraries. * Solution to help you add [de-duplication](https://github.com/Azure-Samples/media-services-video-indexer/commit/6b828f598f5bf61ce1b6dbcbea9e8b87ba11c7b1) to your video libraries.
-
-### New option to toggle bounding boxes (for observed people) on the player
+
+### New option to toggle bounding boxes (for observed people) on the player
When indexing a video through our advanced video settings, you can view our new observed people capabilities. If there are people detected in your media file, you can enable a bounding box on the detected person through the media player.
When indexing a video through our advanced video settings, you can view our new
The Video Indexer service was renamed to Azure Video Analyzer for Media. ### Improved upload experience in the portal
-
+ Video Analyzer for Media has a new upload experience in the [portal](https://www.videoindexer.ai). To upload your media file, press the **Upload** button from the **Media files** tab. ### New developer portal is available in gov-cloud
-
+ [Video Analyzer for Media Developer Portal](https://api-portal.videoindexer.ai) is now also available in Azure for US Government.
-### Observed people tracing (preview)
+### Observed people tracing (preview)
-Azure Video Analyzer for Media now detects observed people in videos and provides information such as the location of the person in the video frame and the exact timestamp (start, end) when a person appears. The API returns the bounding box coordinates (in pixels) for each person instance detected, including its confidence.
+Azure Video Analyzer for Media now detects observed people in videos and provides information such as the location of the person in the video frame and the exact timestamp (start, end) when a person appears. The API returns the bounding box coordinates (in pixels) for each person instance detected, including its confidence.
-For example, if a video contains a person, the detect operation will list the person appearances together with their coordinates in the video frames. You can use this functionality to determine the person path in a video. It also lets you determine whether there are multiple instances of the same person in a video.
+For example, if a video contains a person, the detect operation will list the person appearances together with their coordinates in the video frames. You can use this functionality to determine the person path in a video. It also lets you determine whether there are multiple instances of the same person in a video.
-The newly added observed people tracing feature is available when indexing your file by choosing the **Advanced option** -> **Advanced video** or **Advanced video + audio** preset (under Video + audio indexing). Standard and basic indexing presets will not include this new advanced model.
+The newly added observed people tracing feature is available when indexing your file by choosing the **Advanced option** -> **Advanced video** or **Advanced video + audio** preset (under Video + audio indexing). Standard and basic indexing presets will not include this new advanced model.
-When you choose to see Insights of your video on the Video Analyzer for Media website, the Observed People Tracing will show up on the page with all detected people thumbnails. You can choose a thumbnail of a person and see where the person appears in the video player.
+When you choose to see Insights of your video on the Video Analyzer for Media website, the Observed People Tracing will show up on the page with all detected people thumbnails. You can choose a thumbnail of a person and see where the person appears in the video player.
The feature is also available in the JSON file generated by Video Analyzer for Media. For more information, see [Trace observed people in a video](observed-people-tracing.md).
You can now see the detected acoustic events in the closed captions file. The fi
## March 2021
-### Audio analysis
+### Audio analysis
Audio analysis is now available in an additional new bundle of audio features at a different price point. The new **Basic Audio** analysis preset provides a low-cost option to only extract speech transcription, translation, and formatting of output captions and subtitles. The **Basic Audio** preset will produce two separate meters on your bill, including a line for transcription and a separate line for caption and subtitle formatting. For more information on pricing, see the [Media Services pricing](https://azure.microsoft.com/pricing/details/media-services/) page. The newly added bundle is available when indexing or re-indexing your file by choosing the **Advanced option** -> **Basic Audio** preset (under the **Video + audio indexing** drop-down box).
-### New developer portal
+### New developer portal
+
+Video Analyzer for Media has a new [Developer Portal](https://api-portal.videoindexer.ai/). Try out the new Video Analyzer for Media APIs and find all the relevant resources in one place: [GitHub repository](https://github.com/Azure-Samples/media-services-video-indexer), [Stack overflow](https://stackoverflow.com/questions/tagged/video-indexer), [Video Analyzer for Media tech community](https://techcommunity.microsoft.com/t5/azure-media-services/bg-p/AzureMediaServices/label-name/Video%20Indexer) with relevant blog posts, [Video Analyzer for Media FAQs](faq.yml), [User Voice](https://feedback.azure.com/d365community/forum/09041fae-0b25-ec11-b6e6-000d3a4f0858) to provide your feedback and suggest features, and a ['CodePen' link](https://codepen.io/videoindexer) with widget code samples.
-Video Analyzer for Media has a new [Developer Portal](https://api-portal.videoindexer.ai/), try out the new Video Analyzer for Media APIs and find all the relevant resources in one place: [GitHub repository](https://github.com/Azure-Samples/media-services-video-indexer), [Stack overflow](https://stackoverflow.com/questions/tagged/video-indexer), [Video Analyzer for Media tech community](https://techcommunity.microsoft.com/t5/azure-media-services/bg-p/AzureMediaServices/label-name/Video%20Indexer) with relevant blog posts, [Video Analyzer for Media FAQs](faq.yml), [User Voice](https://feedback.azure.com/d365community/forum/09041fae-0b25-ec11-b6e6-000d3a4f0858) to provide your feedback and suggest features, and ['CodePen' link](https://codepen.io/videoindexer) with widgets code samples.
-
-### Advanced customization capabilities for insight widget
+### Advanced customization capabilities for insight widget
-SDK is now available to embed Video Analyzer for Media's insights widget in your own service and customize its style and data. The SDK supports the standard Video Analyzer for Media insights widget and a fully customizable insights widget. Code sample is available in [Video Analyzer for Media GitHub repository](https://github.com/Azure-Samples/media-services-video-indexer/tree/master/Embedding%20widgets/widget-customization). With this advanced customization capabilities, solution developer can apply custom styling and bring customerΓÇÖs own AI data and present that in the insight widget (with or without Video Analyzer for Media insights).
+SDK is now available to embed Video Analyzer for Media's insights widget in your own service and customize its style and data. The SDK supports the standard Video Analyzer for Media insights widget and a fully customizable insights widget. A code sample is available in the [Video Analyzer for Media GitHub repository](https://github.com/Azure-Samples/media-services-video-indexer/tree/master/Embedding%20widgets/widget-customization). With these advanced customization capabilities, solution developers can apply custom styling and bring the customer's own AI data and present it in the insights widget (with or without Video Analyzer for Media insights).
-### Video Analyzer for Media deployed in the US North Central , US West and Canada Central
+### Video Analyzer for Media deployed in the US North Central, US West and Canada Central
You can now create a Video Analyzer for Media paid account in the US North Central, US West, and Canada Central regions.
-
-### New source languages support for speech-to-text (STT), translation and search
-Video Analyzer for Media now support STT, translation and search in Danish ('da-DK'), Norwegian('nb-NO'), Swedish('sv-SE'), Finnish('fi-FI'), Canadian French ('fr-CA'), Thai('th-TH'), Arabic ('ar-BH', 'ar-EG', 'ar-IQ', 'ar-JO', 'ar-KW', 'ar-LB', 'ar-OM', 'ar-QA', 'ar-S', and 'ar-SY'), and Turkish('tr-TR'). Those languages are available in both API and Video Analyzer for Media website.
-
-### Search by Topic in Video Analyzer for Media Website
+### New source languages support for speech-to-text (STT), translation and search
-You can now use the search feature, at the top of the [Video Analyzer for Media website](https://www.videoindexer.ai/account/login) page, to search for videos with specific topics.
+Video Analyzer for Media now supports STT, translation and search in Danish ('da-DK'), Norwegian('nb-NO'), Swedish('sv-SE'), Finnish('fi-FI'), Canadian French ('fr-CA'), Thai('th-TH'), Arabic ('ar-BH', 'ar-EG', 'ar-IQ', 'ar-JO', 'ar-KW', 'ar-LB', 'ar-OM', 'ar-QA', 'ar-S', and 'ar-SY'), and Turkish('tr-TR'). Those languages are available in both the API and the Video Analyzer for Media website.
+
+### Search by Topic in Video Analyzer for Media Website
+
+You can now use the search feature, at the top of the [Video Analyzer for Media website](https://www.videoindexer.ai/account/login) page, to search for videos with specific topics.
## February 2021
-### Multiple account owners
+### Multiple account owners
The account owner role was added to Video Analyzer for Media. You can add, change, and remove users, and change their roles. For details on how to share an account, see [Invite users](invite-users.md). ### Audio event detection (public preview) > [!NOTE]
-> This feature is only available in trial accounts.
+> This feature is only available in trial accounts.
-Video Analyzer for Media now detects the following audio effects in the non-speech segments of the content: gunshot, glass shatter, alarm, siren, explosion, dog bark, screaming, laughter, crowd reactions (cheering, clapping, and booing) and Silence.
+Video Analyzer for Media now detects the following audio effects in the non-speech segments of the content: gunshot, glass shatter, alarm, siren, explosion, dog bark, screaming, laughter, crowd reactions (cheering, clapping, and booing), and silence.
-The newly added audio affects feature is available when indexing your file by choosing the **Advanced option** -> **Advanced audio** preset (under Video + audio indexing). Standard indexing will only include **silence** and **crowd reaction**.
+The newly added audio effects feature is available when indexing your file by choosing the **Advanced option** -> **Advanced audio** preset (under Video + audio indexing). Standard indexing will only include **silence** and **crowd reaction**.
The **clapping** event type, which was included in the previous audio effects model, is now extracted as part of the **crowd reaction** event type.
When you choose to see **Insights** of your video on the [Video Analyzer for Med
:::image type="content" source="./media/release-notes/audio-detection.png" alt-text="Audio event detection":::
-### Named entities enhancement
+### Named entities enhancement
-The extracted list of people and location was extended and updated in general.
+The extracted list of people and locations was extended and updated in general.
-In addition, the model now includes people and locations in-context which are not famous, like a 'Sam' or 'Home' in the video.
+In addition, the model now includes people and locations in context that are not famous, like a 'Sam' or 'Home' in the video.
## January 2021
-### Video Analyzer for Media is deployed on US Government cloud
+### Video Analyzer for Media is deployed on US Government cloud
-You can now create a Video Analyzer for Media paid account on US government cloud in Virginia and Arizona regions.
-Video Analyzer for Media free trial offering isn't available in the mentioned region. For more information go to Video Analyzer for Media Documentation.
+You can now create a Video Analyzer for Media paid account on US government cloud in Virginia and Arizona regions.
+Video Analyzer for Media free trial offering isn't available in the mentioned regions. For more information, go to the Video Analyzer for Media documentation.
-### Video Analyzer for Media deployed in the India Central region
+### Video Analyzer for Media deployed in the India Central region
-You can now create a Video Analyzer for Media paid account in the India Central region.
+You can now create a Video Analyzer for Media paid account in the India Central region.
### New Dark Mode for the Video Analyzer for Media website experience
-The Video Analyzer for Media website experiences is now available in dark mode.
-To enable the dark mode open the settings panel and toggle on the **Dark Mode** option.
+The Video Analyzer for Media website experience is now available in dark mode.
+To enable dark mode, open the settings panel and toggle on the **Dark Mode** option.
:::image type="content" source="./media/release-notes/dark-mode.png" alt-text="Dark mode setting":::
You can now create a Video Analyzer for Media paid account in the Switzerland We
## October 2020
-### Animated character identification improvements
+### Animated character identification improvements
Video Analyzer for Media supports detection, grouping, and recognition of characters in animated content via integration with Cognitive Services Custom Vision. We made a major improvement to this AI algorithm's detection and character recognition; as a result, insight accuracy and character identification are significantly improved.
Starting March 1st 2021, you no longer will be able to sign up and sign in to th
You will be able to sign up and sign in using one of these providers: Azure AD, Microsoft, and Google.
> [!NOTE]
-> The Video Analyzer for Media accounts connected to LinkedIn and Facebook will not be accessible after March 1st 2021.
->
+> The Video Analyzer for Media accounts connected to LinkedIn and Facebook will not be accessible after March 1st 2021.
+>
> You should [invite](invite-users.md) an Azure AD, Microsoft, or Google email you own to the Video Analyzer for Media account so you will still have access. You can add an additional owner of supported providers, as described in [invite](invite-users.md). <br/>
> Alternatively, you can create a paid account and migrate the data.
You will be able to sign up and sign in using one of these providers: Azure AD,
### Mobile design for the Video Analyzer for Media website
-The Video Analyzer for Media website experience now supports mobile devices. The user experience is responsive to adapt to your mobile screen size (excluding customization UIs).
+The Video Analyzer for Media website experience now supports mobile devices. The user experience is responsive to adapt to your mobile screen size (excluding customization UIs).
-### Accessibility improvements and bug fixes
+### Accessibility improvements and bug fixes
-As part of WCAG (Web Content Accessibility Guidelines), the Video Analyzer for Media website experience is aligned with grade C, as part of Microsoft accessibility standards. Several bugs and improvements related to keyboard navigation, programmatic access, and screen reader support were resolved.
+As part of WCAG (Web Content Accessibility Guidelines), the Video Analyzer for Media website experience is aligned with grade C, as part of Microsoft accessibility standards. Several bugs and improvements related to keyboard navigation, programmatic access, and screen reader support were resolved.
## July 2020
Side panel is also used for user preferences and help.
You can now use the search API to search for videos with specific topics (API only).
-Topics was added as part of the `textScope` (optional parameter). See [API](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Search-Videos) for details.
+Topics was added as part of the `textScope` (optional parameter). See [API](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Search-Videos) for details.
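As a rough sketch of what such a call can look like (the account ID, access token, and query value below are placeholders; the endpoint shape follows the API portal reference linked above):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class SearchVideosByTopic
{
    static async Task Main()
    {
        // Placeholder values -- substitute your own account details.
        const string location = "trial";
        const string accountId = "<ACCOUNT_ID>";
        const string accessToken = "<ACCESS_TOKEN>";

        using var client = new HttpClient();

        // Scope the free-text query to topics only via the optional textScope parameter.
        var url = $"https://api.videoindexer.ai/{location}/Accounts/{accountId}/Videos/Search" +
                  $"?query=economics&textScope=Topics&accessToken={accessToken}";

        var response = await client.GetAsync(url);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```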
### Labels enhancement
The label tagger was upgraded and now includes more visual labels that can be id
### Video Analyzer for Media deployed in the East US
You can now create a Video Analyzer for Media paid account in the East US region.
-
+
### Video Analyzer for Media URL
Video Analyzer for Media regional endpoints were all unified to start only with www. No action item is required.
The **Insights** widget includes new parameters: `language` and `control`.
The **Player** widget has a new `locale` parameter. Both `locale` and `language` parameters control the player's language.
-For more information, see the [widget types](video-indexer-embed-widgets.md#widget-types) section.
+For more information, see the [widget types](video-indexer-embed-widgets.md#widget-types) section.
### New player skin
A new player skin launched with an updated design.
* [Get-Account](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Get-Account)
* [Get-Accounts-Authorization](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Get-Accounts-Authorization)
* [Get-Accounts-With-Token](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Get-Accounts-With-Token)
-
+
The Account object has a `Url` field pointing to the location of the [Video Analyzer for Media website](https://www.videoindexer.ai/). For paid accounts the `Url` field is currently pointing to an internal URL instead of the public website. In the coming weeks we will change it and return the [Video Analyzer for Media website](https://www.videoindexer.ai/) URL for all accounts (trial and paid).
In the coming weeks we will change it and return the [Video Analyzer for Media w
* Replacing the URL with a URL pointing to the Video Analyzer for Media widget APIs (for example, the [insights widget](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Get-Video-Insights-Widget))
* Using the Video Analyzer for Media website to generate a new embedded URL:
-
+
Press **Play** to get to your video's page -> click the **&lt;/&gt; Embed** button -> copy the URL into your application:
-
+
The regional URLs are not supported and will be blocked in the coming weeks.
## January 2020
-
+
### Custom language support for additional languages
Video Analyzer for Media now supports custom language models for `ar-SY`, `en-UK`, and `en-AU` (API only).
-
+
### Delete account timeframe action update
The delete account action now deletes the account within 90 days instead of 48 hours.
-
+
### New Video Analyzer for Media GitHub repository
A new Video Analyzer for Media GitHub repository with different projects, getting started guides, and code samples is now available: https://github.com/Azure-Samples/media-services-video-indexer
-
+
### Swagger update
Video Analyzer for Media unified **authentications** and **operations** into a single [Video Analyzer for Media OpenAPI Specification (swagger)](https://api-portal.videoindexer.ai/api-details#api=Operations&operation). Developers can find the APIs in the [Video Analyzer for Media Developer Portal](https://api-portal.videoindexer.ai/).
Update a specific section in the transcript using the [Update-Video-Index](https
### Fix account configuration from the Video Analyzer for Media portal
-You can now update the Media Services connection configuration in order to self-help with issues like:
+You can now update the Media Services connection configuration in order to self-help with issues like:
* incorrect Azure Media Services resource
* password changes
-* Media Services resources were moved between subscriptions
+* Media Services resources were moved between subscriptions
To fix the account configuration, in the Video Analyzer for Media portal, navigate to Settings > Account tab (as owner).
### Configure the custom vision account
-Configure the custom vision account on paid accounts using the Video Analyzer for Media portal (previously, this was only supported by API). To do that, sign in to the Video Analyzer for Media portal, choose Model Customization > Animated characters > Configure.
+Configure the custom vision account on paid accounts using the Video Analyzer for Media portal (previously, this was only supported by API). To do that, sign in to the Video Analyzer for Media portal, choose Model Customization > Animated characters > Configure.
### Scenes, shots and keyframes – now in one insight pane
-Scenes, shots, and keyframes are now merged into one insight for easier consumption and navigation. When you select the desired scene, you can see what shots and keyframes it consists of.
+Scenes, shots, and keyframes are now merged into one insight for easier consumption and navigation. When you select the desired scene, you can see what shots and keyframes it consists of.
### Notification about a long video name
When streaming endpoint is disabled, Video Analyzer for Media will show a descri
Status code 409 will now be returned from the [Re-Index Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Re-Index-Video) and [Update Video Index](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Update-Video-Index) APIs if a video is actively being indexed, to prevent accidentally overriding the current re-index changes.
## November 2019
-
+
* Korean custom language models support
- Video Analyzer for Media now supports custom language models in Korean (`ko-KR`) in both the API and portal.
+ Video Analyzer for Media now supports custom language models in Korean (`ko-KR`) in both the API and portal.
* New languages supported for speech-to-text (STT)
   Video Analyzer for Media APIs now support STT in Arabic Levantine (ar-SY), English UK dialect (en-GB), and English Australian dialect (en-AU).
-
  For video upload, we replaced zh-HANS with zh-CN; both are supported, but zh-CN is recommended and more accurate.
-
+ ## October 2019
-
* Search for animated characters in the gallery
   When indexing animated characters, you can now search for them in the account's video gallery. For more information, see [Animated characters recognition](animated-characters-recognition.md).
## September 2019
-
+ Multiple advancements announced at IBC 2019:
-
* Animated character recognition (public preview)
   Ability to detect, group, and recognize characters in animated content, via integration with custom vision. For more information, see [Animated character detection](animated-characters-recognition.md).
Multiple advancements announced at IBC 2019:
Tagging of shots with editorial types such as close up, medium shot, two shot, indoor, outdoor, etc. For more information, see [Editorial shot type detection](scenes-shots-keyframes.md#editorial-shot-type-detection).
* Topic inferencing enhancement - now covering level 2
-
+
The topic inferencing model now supports deeper granularity of the IPTC taxonomy. Read full details at [Azure Media Services new AI-powered innovation](https://azure.microsoft.com/blog/azure-media-services-new-ai-powered-innovation/).
## August 2019
-
+
### Video Analyzer for Media deployed in UK South
You can now create a Video Analyzer for Media paid account in the UK South region.
Video Analyzer for Media identifies named locations and people via natural langu
### Keyframes extraction in native resolution
Keyframes extracted by Video Analyzer for Media are available in the original resolution of the video.
-
+
### GA for training custom face models from images
Training faces from images moved from Preview mode to GA (available via API and in the portal).
Training faces from images moved from Preview mode to GA (available via API and
### Hide gallery toggle option
Users can choose to hide the gallery tab from the portal (similar to hiding the samples tab).
-
+
### Maximum URL size increased
Support for URL query strings of 4096 characters (instead of 2048) when indexing a video.
-
+
### Support for multi-lingual projects
Projects can now be created based on videos indexed in different languages (API only).
You can now create a Video Analyzer for Media paid account in the Japan East reg
Added a new API that enables you to [update the Azure Media Service connection endpoint or key](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Update-Paid-Account-Azure-Media-Services).
-### Improved error handling on upload
+### Improved error handling on upload
A descriptive message is returned in case of misconfiguration of the underlying Azure Media Services account.
-### Player timeline Keyframes preview
+### Player timeline Keyframes preview
You can now see an image preview for each point in time on the player's timeline.
azure-video-analyzer Upload Index Videos https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/upload-index-videos.md
Last updated 11/15/2021
-# Upload and index your videos
+# Upload and index your videos
-This article shows how to upload and index videos by using the Azure Video Analyzer for Media (formerly Video Indexer) website and the Upload Video API.
+This article shows how to upload and index videos by using the Azure Video Analyzer for Media (formerly Video Indexer) website and the Upload Video API.
When you're creating a Video Analyzer for Media account, you choose between:
-- A free trial account. Video Analyzer for Media provides up to 600 minutes of free indexing to website users and up to 2,400 minutes of free indexing to API users.
-- A paid option where you're not limited by a quota. You create a Video Analyzer for Media account that's [connected to your Azure subscription and an Azure Media Services account](connect-to-azure.md). You pay for indexed minutes.
+- A free trial account. Video Analyzer for Media provides up to 600 minutes of free indexing to website users and up to 2,400 minutes of free indexing to API users.
+- A paid option where you're not limited by a quota. You create a Video Analyzer for Media account that's [connected to your Azure subscription and an Azure Media Services account](connect-to-azure.md). You pay for indexed minutes.
For more information about account types, see [Media Services pricing](https://azure.microsoft.com/pricing/details/media-services/).
-When you're uploading videos by using the API, you have the following options:
+When you're uploading videos by using the API, you have the following options:
* Upload your video from a URL (preferred).
* Send the video file as a byte array in the request body.
-* Use an existing Azure Media Services asset by providing the [asset ID](../../media-services/latest/assets-concept.md). This option is supported in paid accounts only.
+* Use an existing Azure Media Services asset by providing the [asset ID](/media-services/latest/assets-concept). This option is supported in paid accounts only.
## Supported file formats
-For a list of file formats that you can use with Video Analyzer for Media, see [Standard Encoder formats and codecs](../../media-services/latest/encode-media-encoder-standard-formats-reference.md).
+For a list of file formats that you can use with Video Analyzer for Media, see [Standard Encoder formats and codecs](/media-services/latest/encode-media-encoder-standard-formats-reference).
## Storage of video files
When you use Video Analyzer for Media, video files are stored in Azure Storage t
You can always delete your video and audio files, along with any metadata and insights that Video Analyzer for Media has extracted from them. After you delete a file from Video Analyzer for Media, the file and its metadata and insights are permanently removed from Video Analyzer for Media. However, if you've implemented your own backup solution in Azure Storage, the file remains in Azure Storage. The persistence of a video is identical whether you upload by using the Video Analyzer for Media website or by using the Upload Video API.
-
+
## Upload and index a video by using the website
Sign in on the [Video Analyzer for Media](https://www.videoindexer.ai/) website, and then select **Upload**.
After Video Analyzer for Media is done analyzing, you get an email with a link t
## Upload and index a video by using the API
-You can use the [Upload Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video) API to upload and index your videos based on a URL. The code sample that follows includes the commented-out code that shows how to upload the byte array.
+You can use the [Upload Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video) API to upload and index your videos based on a URL. The code sample that follows includes the commented-out code that shows how to upload the byte array.
### Configurations and parameters
This section describes some of the optional parameters and when to set them. For the most up-to-date info about parameters, see the [Video Analyzer for Media portal](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video).
-#### externalID
+#### externalID
Use this parameter to specify an ID that will be associated with the video. The ID can be applied to integration into an external video content management (VCM) system. The videos that are in the Video Analyzer for Media portal can be searched via the specified external ID.
#### callbackUrl
-Use this parameter to specify a callback URL.
+Use this parameter to specify a callback URL.
[!INCLUDE [callback url](./includes/callback-url.md)]
Use this parameter to define an AI bundle that you want to apply on your audio o
- `BasicAudio`: Index and extract insights by using audio only (ignoring video). Include only basic audio features (transcription, translation, formatting of output captions and subtitles).
- `AdvancedAudio`: Index and extract insights by using audio only (ignoring video). Include advanced audio features (such as audio event detection) in addition to the standard audio analysis.
- `AdvancedVideo`: Index and extract insights by using video only (ignoring audio). Include advanced video features (such as observed people tracing) in addition to the standard video analysis.
-- `AdvancedVideoAndAudio`: Index and extract insights by using both advanced audio and advanced video analysis.
+- `AdvancedVideoAndAudio`: Index and extract insights by using both advanced audio and advanced video analysis.
> [!NOTE]
-> The preceding advanced presets include models that are in public preview. When these models reach general availability, there might be implications for the price.
+> The preceding advanced presets include models that are in public preview. When these models reach general availability, there might be implications for the price.
Video Analyzer for Media covers up to two tracks of audio. If the file has more audio tracks, they're treated as one track. If you want to index the tracks separately, you need to extract the relevant audio file and index it as `AudioOnly`.
This parameter is supported only for paid accounts.
#### streamingPreset
-After your video is uploaded, Video Analyzer for Media optionally encodes the video. It then proceeds to indexing and analyzing the video. When Video Analyzer for Media is done analyzing, you get a notification with the video ID.
+After your video is uploaded, Video Analyzer for Media optionally encodes the video. It then proceeds to indexing and analyzing the video. When Video Analyzer for Media is done analyzing, you get a notification with the video ID.
-When you're using the [Upload Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video) or [Re-Index Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Re-Index-Video) API, one of the optional parameters is `streamingPreset`. If you set `streamingPreset` to `Default`, `SingleBitrate`, or `AdaptiveBitrate`, the encoding process is triggered.
+When you're using the [Upload Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Upload-Video) or [Re-Index Video](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Re-Index-Video) API, one of the optional parameters is `streamingPreset`. If you set `streamingPreset` to `Default`, `SingleBitrate`, or `AdaptiveBitrate`, the encoding process is triggered.
After the indexing and encoding jobs are done, the video is published so you can also stream your video. The streaming endpoint from which you want to stream the video must be in the **Running** state. For `SingleBitrate`, the standard encoder cost will apply for the output. If the video height is greater than or equal to 720, Video Analyzer for Media encodes it as 1280 x 720. Otherwise, it's encoded as 640 x 468.
-The default setting is [content-aware encoding](../../media-services/latest/encode-content-aware-concept.md).
+The default setting is [content-aware encoding](/media-services/latest/encode-content-aware-concept).
If you only want to index your video and not encode it, set `streamingPreset` to `NoStreaming`.
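To illustrate (a hedged sketch, not the article's full sample that follows; the account details are placeholders and the query parameters follow the descriptions above), an upload that indexes without encoding might look like:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class UploadWithoutEncoding
{
    static async Task Main()
    {
        // Placeholder values -- substitute your own account details.
        const string location = "trial";
        const string accountId = "<ACCOUNT_ID>";
        const string accessToken = "<ACCESS_TOKEN>";

        // The video URL must be encoded before it's placed in the query string.
        var videoUrl = Uri.EscapeDataString("https://example.com/videos/sample.mp4");

        using var client = new HttpClient();

        // streamingPreset=NoStreaming indexes the video without triggering encoding.
        var url = $"https://api.videoindexer.ai/{location}/Accounts/{accountId}/Videos" +
                  $"?name=sample-video&streamingPreset=NoStreaming" +
                  $"&videoUrl={videoUrl}&accessToken={accessToken}";

        var response = await client.PostAsync(url, null);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```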
The following C# code snippets demonstrate the usage of all the Video Analyzer f
After you copy the following code into your development platform, you'll need to provide two parameters:
-* API key (`apiKey`): Your personal API management subscription key. It allows you to get an access token in order to perform operations on your Video Analyzer for Media account.
+* API key (`apiKey`): Your personal API management subscription key. It allows you to get an access token in order to perform operations on your Video Analyzer for Media account.
To get your API key:
After you copy the following code into your development platform, you'll need to
* Video URL (`videoUrl`): A URL of the video or audio file to be indexed. Here are the requirements:
- - The URL must point at a media file. (HTML pages are not supported.)
+ - The URL must point at a media file. (HTML pages are not supported.)
- The file can be protected by an access token that's provided as part of the URI. The endpoint that serves the file must be secured with TLS 1.2 or later.
- - The URL must be encoded.
+ - The URL must be encoded.
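For example (a minimal illustration with a made-up URL), `Uri.EscapeDataString` produces a query-string-safe value:

```csharp
using System;

// A made-up URL containing characters that must be escaped.
var rawUrl = "https://example.com/videos/my video.mp4?token=abc 123";
var encodedUrl = Uri.EscapeDataString(rawUrl);

Console.WriteLine(encodedUrl);
// https%3A%2F%2Fexample.com%2Fvideos%2Fmy%20video.mp4%3Ftoken%3Dabc%20123
```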
-The result of successfully running the code sample includes an insight widget URL and a player widget URL. They allow you to examine the insights and the uploaded video, respectively.
+The result of successfully running the code sample includes an insight widget URL and a player widget URL. They allow you to examine the insights and the uploaded video, respectively.
```csharp
public async Task Sample()
HttpResponseMessage result = await client.GetAsync($"{apiUrl}/auth/trial/Accounts?{queryParams}");
var json = await result.Content.ReadAsStringAsync();
var accounts = JsonConvert.DeserializeObject<AccountContractSlim[]>(json);
-
- // Take the relevant account. Here we simply take the first.
+
+ // Take the relevant account. Here we simply take the first.
// You can also get the account via accounts.First(account => account.Id == <GUID>); var accountInfo = accounts.First();
public class AccountContractSlim
### [Azure Resource Manager account](#tab/with-arm-account-account/)
-After you copy this C# project into your development platform, you need to take the following steps:
+After you copy this C# project into your development platform, you need to take the following steps:
1. Go to Program.cs and populate:
namespace VideoIndexerArm
Console.WriteLine($"account id: {accountId}"); Console.WriteLine($"account location: {accountLocation}");
- // Get account-level access token for Azure Video Analyzer for Media
+ // Get account-level access token for Azure Video Analyzer for Media
var accessTokenRequest = new AccessTokenRequest { PermissionType = AccessTokenPermission.Contributor,
namespace VideoIndexerArm
[JsonPropertyName("projectId")] public string ProjectId { get; set; }
-
+ [JsonPropertyName("videoId")] public string VideoId { get; set; } }
The upload operation might return the following status codes:
|429||Trial accounts are allowed 5 uploads per minute. Paid accounts are allowed 50 uploads per minute.|
## Uploading considerations and limitations
-
+
- The name of a video must be no more than 80 characters.
- When you're uploading a video based on the URL (preferred), the endpoint must be secured with TLS 1.2 or later.
- The upload size with the URL option is limited to 30 GB.
The upload operation might return the following status codes:
- The URL provided in the `videoURL` parameter must be encoded.
- Indexing Media Services assets has the same limitation as indexing from a URL.
- Video Analyzer for Media has a duration limit of 4 hours for a single file.
-- The URL must be accessible (for example, a public URL).
+- The URL must be accessible (for example, a public URL).
  If it's a private URL, the access token must be provided in the request.
- The URL must point to a valid media file and not to a webpage, such as a link to the `www.youtube.com` page.
The upload operation might return the following status codes:
> [!Tip]
> We recommend that you use .NET Framework version 4.6.2 or later, because older .NET Framework versions don't default to TLS 1.2.
>
-> If you must use an older .NET Framework version, add one line to your code before making the REST API call:
+> If you must use an older .NET Framework version, add one line to your code before making the REST API call:
> > `System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;`
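As a sketch of where that line belongs (assuming a classic console entry point), set it once at startup, before any HTTP calls are made:

```csharp
using System.Net;

class Program
{
    static void Main()
    {
        // On older .NET Framework versions, opt in to TLS 1.2 before any REST calls.
        ServicePointManager.SecurityProtocol =
            SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;

        // ... the upload code from this article runs after this point ...
    }
}
```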
azure-video-analyzer Video Indexer Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/video-indexer-get-started.md
This getting started quickstart shows how to sign in to the Azure Video Analyzer for Media (formerly Video Indexer) website and how to upload your first video.
-When creating a Video Analyzer for Media account, you can choose a free trial account (where you get a certain number of free indexing minutes) or a paid option (where you are not limited by the quota). With the free trial, Video Analyzer for Media provides up to 600 minutes of free indexing to website users and up to 2400 minutes of free indexing to API users. With the paid option, you create a Video Analyzer for Media account that is [connected to your Azure subscription and an Azure Media Services account](connect-to-azure.md). You pay for minutes indexed. For more information, see [Media Services pricing](https://azure.microsoft.com/pricing/details/media-services/).
+When creating a Video Analyzer for Media account, you can choose a free trial account (where you get a certain number of free indexing minutes) or a paid option (where you are not limited by the quota). With the free trial, Video Analyzer for Media provides up to 600 minutes of free indexing to website users and up to 2400 minutes of free indexing to API users. With the paid option, you create a Video Analyzer for Media account that is [connected to your Azure subscription and an Azure Media Services account](connect-to-azure.md). You pay for minutes indexed. For more information, see [Media Services pricing](https://azure.microsoft.com/pricing/details/media-services/).
## Sign up for Video Analyzer for Media
Once you start using Video Analyzer for Media, all your stored data and uploaded
### Supported file formats for Video Analyzer for Media
-See the [input container/file formats](../../media-services/latest/encode-media-encoder-standard-formats-reference.md) article for a list of file formats that you can use with Video Analyzer for Media.
+See the [input container/file formats](/media-services/latest/encode-media-encoder-standard-formats-reference) article for a list of file formats that you can use with Video Analyzer for Media.
### Upload a video
See the [input container/file formats](../../media-services/latest/encode-media-
> [!div class="mx-imgBorder"]
> :::image type="content" source="./media/video-indexer-get-started/video-indexer-upload.png" alt-text="Upload":::
-1. Once your video has been uploaded, Video Analyzer for Media starts indexing and analyzing the video. You see the progress.
+1. Once your video has been uploaded, Video Analyzer for Media starts indexing and analyzing the video. You see the progress.
> [!div class="mx-imgBorder"]
> :::image type="content" source="./media/video-indexer-get-started/progress.png" alt-text="Progress of the upload":::
For more information, see [supported browsers](video-indexer-overview.md#support
See [Upload and index videos](upload-index-videos.md) for more details.
-After you upload and index a video, you can start using the [Video Analyzer for Media website](video-indexer-view-edit.md) or the [Video Analyzer for Media Developer Portal](video-indexer-use-apis.md) to see the insights of the video.
+After you upload and index a video, you can start using the [Video Analyzer for Media website](video-indexer-view-edit.md) or the [Video Analyzer for Media Developer Portal](video-indexer-use-apis.md) to see the insights of the video.
[Start using APIs](video-indexer-use-apis.md)
## Next steps
-For a detailed introduction, please visit our [introduction lab](https://github.com/Azure-Samples/media-services-video-indexer/blob/master/IntroToVideoIndexer.md).
+For a detailed introduction, please visit our [introduction lab](https://github.com/Azure-Samples/media-services-video-indexer/blob/master/IntroToVideoIndexer.md).
At the end of the workshop, you'll have a good understanding of the kind of information that can be extracted from video and audio content, and you'll be more prepared to identify opportunities related to content intelligence, pitch video AI on Azure, and demo several scenarios on Video Analyzer for Media.
azure-vmware Install Vmware Hcx https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/install-vmware-hcx.md
Title: Install VMware HCX in Azure VMware Solution
description: Install VMware HCX in your Azure VMware Solution private cloud.
Previously updated : 09/16/2021
Last updated : 03/29/2022
# Install and activate VMware HCX in Azure VMware Solution
VMware HCX Advanced and its associated Cloud Manager are no longer pre-deployed in Azure VMware Solution. Instead, you'll install it through the Azure portal as an add-on. You'll still download the HCX Connector OVA and deploy the virtual appliance on your on-premises vCenter.
-Any edition of VMware HCX supports 25 site pairings (on-premises to cloud or cloud to cloud). The default is HCX Advanced, but you can open a [support request](https://rc.portal.azure.com/#create/Microsoft.Support) to have HCX Enterprise Edition enabled, which is currently in public preview. Once the service is generally available, you'll have 30 days to decide on your next steps. You can also turn off or opt out of the HCX Enterprise Edition service but keep HCX Advanced as it's part of the node cost.
+Any edition of VMware HCX supports 25 site pairings (on-premises to cloud or cloud to cloud). The default is HCX Advanced, but you can open a [support request](https://rc.portal.azure.com/#create/Microsoft.Support) to have HCX Enterprise Edition enabled. Once the service is generally available, you'll have 30 days to decide on your next steps. You can turn off or opt out of the HCX Enterprise Edition service but keep HCX Advanced as it's part of the node cost.
Downgrading from HCX Enterprise Edition to HCX Advanced is possible without redeploying. First, ensure you've reverted to an HCX Advanced configuration state and aren't using the Enterprise features. If you plan to downgrade, ensure that there are no scheduled migrations, that features like RAV and [HCX Mobility Optimized Networking (MON)](https://docs.vmware.com/en/VMware-HCX/4.1/hcx-user-guide/GUID-0E254D74-60A9-479C-825D-F373C41F40BC.html) aren't in use, and that site pairings are three or fewer.
cdn Cdn Caching Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cdn/cdn-caching-policy.md
Azure Media Services provides [integrated CDN](https://azure.microsoft.com/updat
> [!IMPORTANT]
> Azure Media Services has complete integration with Azure CDN. With a single click, you can integrate all the available Azure CDN providers to your streaming endpoint including standard and premium products. For more information, see this [announcement](https://azure.microsoft.com/blog/standardstreamingendpoint/).
->
+>
> Data charges from the streaming endpoint to the CDN get disabled only if the CDN is enabled over streaming endpoint APIs or by using the Azure portal's streaming endpoint section. Manual integration or directly creating a CDN endpoint by using CDN APIs or the portal section doesn't disable the data charges.
## Configuring cache headers with Azure Media Services
You can use the Azure portal or Azure Media Services APIs to configure cache header values.
-1. To configure cache headers using Azure portal, refer to [How to Manage Streaming Endpoints](../media-services/previous/media-services-portal-manage-streaming-endpoints.md) section Configuring the Streaming Endpoint.
+1. To configure cache headers using Azure portal, refer to [How to Manage Streaming Endpoints](/media-services/previous/media-services-portal-manage-streaming-endpoints) section Configuring the Streaming Endpoint.
2. Azure Media Services REST API, [StreamingEndpoint](/rest/api/media/operations/streamingendpoint#StreamingEndpointCacheControl).
3. Azure Media Services .NET SDK, [StreamingEndpointCacheControl Properties](/dotnet/api/microsoft.windowsazure.mediaservices.client.streamingendpointcachecontrol).
cdn Cdn Custom Ssl https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cdn/cdn-custom-ssl.md
To enable HTTPS on a custom domain, follow these steps:
> This option is available only with **Azure CDN from Microsoft** and **Azure CDN from Verizon** profiles.
>
-You can use your own certificate to enable the HTTPS feature. This process is done through an integration with Azure Key Vault, which allows you to store your certificates securely. Azure Front Door uses this secure mechanism to get your certificate and it requires a few extra steps. When you create your TLS/SSL certificate, you must create a complete certificate chain with an allowed certificate authority (CA) that is part of the [Microsoft Trusted CA List](https://ccadb-public.secure.force.com/microsoft/IncludedCACertificateReportForMSFT). If you use a non-allowed CA, your request will be rejected. If a certificate without complete chain is presented, the requests which involve that certificate are not guaranteed to work as expected. For Azure CDN from Verizon, any valid CA will be accepted.
+You can use your own certificate to enable the HTTPS feature. This process is done through an integration with Azure Key Vault, which allows you to store your certificates securely. Azure CDN uses this secure mechanism to get your certificate and it requires a few extra steps. When you create your TLS/SSL certificate, you must create a complete certificate chain with an allowed certificate authority (CA) that is part of the [Microsoft Trusted CA List](https://ccadb-public.secure.force.com/microsoft/IncludedCACertificateReportForMSFT). If you use a non-allowed CA, your request will be rejected. If a certificate without a complete chain is presented, the requests that involve that certificate aren't guaranteed to work as expected. For Azure CDN from Verizon, any valid CA will be accepted.
### Prepare your Azure Key vault account and certificate
cdn Cdn Features https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cdn/cdn-features.md
# What are the comparisons between Azure CDN product features?
-Azure Content Delivery Network (CDN) includes four products:
+Azure Content Delivery Network (CDN) includes four products:
* **Azure CDN Standard from Microsoft**
* **Azure CDN Standard from Akamai**
* **Azure CDN Standard from Verizon**
-* **Azure CDN Premium from Verizon**.
+* **Azure CDN Premium from Verizon**.
The following table compares the features available with each product.
The following table compares the features available with each product.
| IPv4/IPv6 dual-stack | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
| [HTTP/2 support](cdn-http2.md) | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
||||
- **Security** | **Standard Microsoft** | **Standard Akamai** | **Standard Verizon** | **Premium Verizon** |
+ **Security** | **Standard Microsoft** | **Standard Akamai** | **Standard Verizon** | **Premium Verizon** |
| HTTPS support with CDN endpoint | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
| [Custom domain HTTPS](cdn-custom-ssl.md) | **&#x2713;** | **&#x2713;**, Requires direct CNAME to enable |**&#x2713;** |**&#x2713;** |
| [Custom domain name support](cdn-map-content-to-custom-domain.md) | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
| [Geo-filtering](cdn-restrict-access-by-country-region.md) | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
-| [Token authentication](cdn-token-auth.md) | | | |**&#x2713;**|
+| [Token authentication](cdn-token-auth.md) | | | |**&#x2713;**|
| [DDOS protection](https://www.us-cert.gov/ncas/tips/ST04-015) | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
| [Bring your own certificate](cdn-custom-ssl.md?tabs=option-2-enable-https-with-your-own-certificate#tlsssl-certificates) |**&#x2713;** | | **&#x2713;** | **&#x2713;** |
| Supported TLS Versions | TLS 1.2, TLS 1.0/1.1 - [Configurable](/rest/api/cdn/custom-domains/enable-custom-https#usermanagedhttpsparameters) | TLS 1.2 | TLS 1.2 | TLS 1.2 |
||||
-| **Analytics and reporting** | **Standard Microsoft** | **Standard Akamai** | **Standard Verizon** | **Premium Verizon** |
+| **Analytics and reporting** | **Standard Microsoft** | **Standard Akamai** | **Standard Verizon** | **Premium Verizon** |
| [Azure diagnostic logs](cdn-azure-diagnostic-logs.md) | **&#x2713;** | **&#x2713;** |**&#x2713;** |**&#x2713;** |
| [Core reports from Verizon](cdn-analyze-usage-patterns.md) | | |**&#x2713;** |**&#x2713;** |
| [Custom reports from Verizon](cdn-verizon-custom-reports.md) | | |**&#x2713;** |**&#x2713;** |
The following table compares the features available with each product.
| [Edge node performance](cdn-edge-performance.md) | | | |**&#x2713;** |
| [Real-time alerts](cdn-real-time-alerts.md) | | | |**&#x2713;** |
||||
-| **Ease of use** | **Standard Microsoft** | **Standard Akamai** | **Standard Verizon** | **Premium Verizon** |
-| Easy integration with Azure services, such as [Storage](cdn-create-a-storage-account-with-cdn.md), [Web Apps](cdn-add-to-web-app.md), and [Media Services](../media-services/previous/media-services-portal-manage-streaming-endpoints.md) | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
+| **Ease of use** | **Standard Microsoft** | **Standard Akamai** | **Standard Verizon** | **Premium Verizon** |
+| Easy integration with Azure services, such as [Storage](cdn-create-a-storage-account-with-cdn.md), [Web Apps](cdn-add-to-web-app.md), and [Media Services](/media-services/previous/media-services-portal-manage-streaming-endpoints) | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
| Management via [REST API](/rest/api/cdn/), [.NET](cdn-app-dev-net.md), [Node.js](cdn-app-dev-node.md), or [PowerShell](cdn-manage-powershell.md) | **&#x2713;** |**&#x2713;** |**&#x2713;** |**&#x2713;** |
| [Compression MIME types](./cdn-improve-performance.md) |Configurable |Configurable |Configurable |Configurable |
| Compression encodings |gzip, brotli |gzip |gzip, deflate, bzip2 |gzip, deflate, bzip2 |
## Migration
-For information about migrating an **Azure CDN Standard from Verizon** profile to **Azure CDN Premium from Verizon**, see [Migrate an Azure CDN profile from Standard Verizon to Premium Verizon](cdn-migrate.md).
+For information about migrating an **Azure CDN Standard from Verizon** profile to **Azure CDN Premium from Verizon**, see [Migrate an Azure CDN profile from Standard Verizon to Premium Verizon](cdn-migrate.md).
> [!NOTE]
> There is an upgrade path from Standard Verizon to Premium Verizon; there is no conversion mechanism between other products at this time.
cognitive-services Video Moderation Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Content-Moderator/video-moderation-api.md
This article provides information and code samples to help you get started using the [Content Moderator SDK for .NET](https://www.nuget.org/packages/Microsoft.Azure.CognitiveServices.ContentModerator/) to scan video content for adult or racy content.
-If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/cognitive-services/) before you begin.
+If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/cognitive-services/) before you begin.
## Prerequisites
- Any edition of [Visual Studio 2015 or 2017](https://www.visualstudio.com/downloads/)
## Set up Azure resources
-The Content Moderator's video moderation capability is available as a free public preview **media processor** in Azure Media Services (AMS). Azure Media Services is a specialized Azure service for storing and streaming video content.
+The Content Moderator's video moderation capability is available as a free public preview **media processor** in Azure Media Services (AMS). Azure Media Services is a specialized Azure service for storing and streaming video content.
### Create an Azure Media Services account
-Follow the instructions in [Create an Azure Media Services account](../../media-services/previous/media-services-portal-create-account.md) to subscribe to AMS and create an associated Azure storage account. In that storage account, create a new Blob storage container.
+Follow the instructions in [Create an Azure Media Services account](/media-services/previous/media-services-portal-create-account) to subscribe to AMS and create an associated Azure storage account. In that storage account, create a new Blob storage container.
### Create an Azure Active Directory application
In the **Azure AD app** section, select **Create New** and name your new Azure A
Select your app registration and click the **Manage application** button below it. Note the value in the **Application ID** field; you will need this later. Select **Settings** > **Keys**, and enter a description for a new key (such as "VideoModKey"). Click **Save**, and then notice the new key value. Copy this string and save it somewhere secure.
-For a more thorough walkthrough of the above process, See [Get started with Azure AD authentication](../../media-services/previous/media-services-portal-get-started-with-aad.md).
+For a more thorough walkthrough of the above process, See [Get started with Azure AD authentication](/media-services/previous/media-services-portal-get-started-with-aad).
Once you've done this, you can use the video moderation media processor in two different ways.
The Azure Media Services Explorer is a user-friendly frontend for AMS. Use it to
## Create the Visual Studio project
-1. In Visual Studio, create a new **Console app (.NET Framework)** project and name it **VideoModeration**.
+1. In Visual Studio, create a new **Console app (.NET Framework)** project and name it **VideoModeration**.
1. If there are other projects in your solution, select this one as the single startup project.
1. Get the required NuGet packages. Right-click on your project in the Solution Explorer and select **Manage NuGet Packages**; then find and install the following packages:
    - windowsazure.mediaservices
Add the following static fields to the **Program** class in _Program.cs_. These
private static CloudMediaContext _context = null;
private static CloudStorageAccount _StorageAccount = null;
-// Azure Media Services (AMS) associated Storage Account, Key, and the Container that has
+// Azure Media Services (AMS) associated Storage Account, Key, and the Container that has
// a list of Blobs to be processed.
static string STORAGE_NAME = "YOUR AMS ASSOCIATED BLOB STORAGE NAME";
static string STORAGE_KEY = "YOUR AMS ASSOCIATED BLOB STORAGE KEY";
static string STORAGE_CONTAINER_NAME = "YOUR BLOB CONTAINER FOR VIDEO FILES";
private static StorageCredentials _StorageCredentials = null;
-// Azure Media Services authentication.
+// Azure Media Services authentication.
private const string AZURE_AD_TENANT_NAME = "microsoft.onmicrosoft.com";
private const string CLIENT_ID = "YOUR CLIENT ID";
private const string CLIENT_SECRET = "YOUR CLIENT SECRET";
-// REST API endpoint, for example "https://accountname.restv2.westcentralus.media.azure.net/API".
+// REST API endpoint, for example "https://accountname.restv2.westcentralus.media.azure.net/API".
private const string REST_API_ENDPOINT = "YOUR API ENDPOINT";
// Content Moderator Media Processor Nam
Add the following method to the **Program** class. You use the Storage Context,
// Creates a storage context from the AMS associated storage name and key
static void CreateStorageContext()
{
- // Get a reference to the storage account associated with a Media Services account.
+ // Get a reference to the storage account associated with a Media Services account.
if (_StorageCredentials == null)
{
    _StorageCredentials = new StorageCredentials(STORAGE_NAME, STORAGE_KEY);
static IEnumerable<IListBlobItem> GetBlobsList()
CloudBlobClient CloudBlobClient = _StorageAccount.CreateCloudBlobClient();
CloudBlobContainer MediaBlobContainer = CloudBlobClient.GetContainerReference(STORAGE_CONTAINER_NAME);
- // Get the reference to the list of Blobs
+ // Get the reference to the list of Blobs
var blobList = MediaBlobContainer.ListBlobs();
return blobList;
}
static void RunContentModeratorJob(IAsset asset)
CancellationToken.None);
progressJobTask.Wait();
- // If job state is Error, the event handling
- // method for job progress should log errors. Here we check
+ // If job state is Error, the event handling
+ // method for job progress should log errors. Here we check
// for error state and exit if needed.
if (job.State == JobState.Error)
{
After the Content Moderation job is completed, analyze the JSON response. It con
- **Shots** as "**fragments**"
- **Key frames** as "**events**" with a **reviewRecommended** (= true or false) flag based on **Adult** and **Racy** scores
- **start**, **duration**, **totalDuration**, and **timestamp** are in "ticks". Divide by **timescale** to get the number in seconds.
-
+
> [!NOTE]
> - `adultScore` represents the potential presence and prediction score of content that may be considered sexually explicit or adult in certain situations.
> - `racyScore` represents the potential presence and prediction score of content that may be considered sexually suggestive or mature in certain situations.
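For instance (a minimal sketch with made-up tick values), converting a key frame's position to seconds is a single division:

```csharp
using System;

// Made-up values of the kind returned in the moderation JSON response.
long timestamp = 671670000;  // position in ticks
long timescale = 30000000;   // ticks per second for this video

double seconds = (double)timestamp / timescale;
Console.WriteLine(seconds);  // 22.389
```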
cognitive-services Get Started Text To Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/get-started-text-to-speech.md
keywords: text to speech
[!INCLUDE [CLI include](includes/quickstarts/text-to-speech-basics/cli.md)]
::: zone-end
-## Get position information
-
-Your project might need to know when a word is spoken by text-to-speech so that it can take specific action based on that timing. For example, if you want to highlight words as they're spoken, you need to know what to highlight, when to highlight it, and for how long to highlight it.
-
-You can accomplish this by using the `WordBoundary` event within `SpeechSynthesizer`. This event is raised at the beginning of each new spoken word. It provides a time offset within the spoken stream and a text offset within the input prompt:
-
-* `AudioOffset` reports the output audio's elapsed time between the beginning of synthesis and the start of the next word. This is measured in hundred-nanosecond units (HNS), with 10,000 HNS equivalent to 1 millisecond.
-* `WordOffset` reports the character position in the input string (original text or [SSML](speech-synthesis-markup.md)) immediately before the word that's about to be spoken.
-
-> [!NOTE]
-> `WordBoundary` events are raised as the output audio data becomes available, which will be faster than playback to an output device. The caller must appropriately synchronize stream timing to "real time."
-
-You can find examples of using `WordBoundary` in the [text-to-speech samples](https://aka.ms/csspeech/samples) on GitHub.
- ## Next steps
-* [Get started with Custom Neural Voice](how-to-custom-voice.md)
-* [Improve synthesis with SSML](speech-synthesis-markup.md)
-* Learn how to use the [Long Audio API](long-audio-api.md) for large text samples like books and news articles
+> [!div class="nextstepaction"]
+> [Learn more about speech synthesis](how-to-speech-synthesis.md)
+
cognitive-services How To Recognize Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/how-to-recognize-speech.md
Title: "How to recognize speech - Speech service"
-description: Learn how to use the Speech SDK to convert speech to text, including object construction, supported audio input formats, and configuration options for speech recognition.
+description: Learn how to convert speech to text, including object construction, supported audio input formats, and configuration options for speech recognition.
keywords: speech to text, speech to text software
## Next steps
-> [!div class="nextstepaction"]
-> [See the quickstart samples on GitHub](https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart)
+* [Try the speech to text quickstart](get-started-speech-to-text.md)
+* [Improve recognition accuracy with custom speech](custom-speech-overview.md)
+* [Transcribe audio in batches](batch-transcription.md)
cognitive-services How To Speech Synthesis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/how-to-speech-synthesis.md
+
+ Title: "How to synthesize speech from text - Speech service"
+
+description: Learn how to convert text to speech. Learn about object construction and design patterns, supported audio output formats, and custom configuration options for speech synthesis.
++++++ Last updated : 03/14/2022+
+ms.devlang: cpp, csharp, golang, java, javascript, objective-c, python
+
+zone_pivot_groups: programming-languages-speech-services
+keywords: text to speech
++
+# How to synthesize speech from text
+++++++++++
+## Get facial pose events
+
+Speech can be a good way to drive the animation of facial expressions.
+[Visemes](how-to-speech-synthesis-viseme.md) are often used to represent the key poses in observed speech. Key poses include the position of the lips, jaw, and tongue in producing a particular phoneme.
+
+You can subscribe to viseme events in the Speech SDK. Then, you can apply viseme events to animate the face of a character as speech audio plays.
+Learn [how to get viseme events](how-to-speech-synthesis-viseme.md#get-viseme-events-with-the-speech-sdk).
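As a brief sketch in C# (the key, region, and text below are placeholders), a viseme subscription looks like this:

```csharp
using System;
using Microsoft.CognitiveServices.Speech;

var config = SpeechConfig.FromSubscription("<KEY>", "<REGION>");
using var synthesizer = new SpeechSynthesizer(config);

// VisemeReceived fires for each key pose; AudioOffset is in 100-nanosecond ticks.
synthesizer.VisemeReceived += (s, e) =>
{
    Console.WriteLine($"Viseme {e.VisemeId} at {e.AudioOffset / 10000} ms");
};

await synthesizer.SpeakTextAsync("Hello world!");
```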
+
+## Get position information
+
+Your project might need to know when a word is spoken by text-to-speech so that it can take specific action based on that timing. For example, if you want to highlight words as they're spoken, you need to know what to highlight, when to highlight it, and for how long to highlight it.
+
+You can accomplish this by using the `WordBoundary` event within `SpeechSynthesizer`. This event is raised at the beginning of each new spoken word. It provides a time offset within the spoken stream and a text offset within the input prompt:
+
+* `AudioOffset` reports the output audio's elapsed time between the beginning of synthesis and the start of the next word. This is measured in hundred-nanosecond units (HNS), with 10,000 HNS equivalent to 1 millisecond.
+* `WordOffset` reports the character position in the input string (original text or [SSML](speech-synthesis-markup.md)) immediately before the word that's about to be spoken.
+
+> [!NOTE]
+> `WordBoundary` events are raised as the output audio data becomes available, which will be faster than playback to an output device. The caller must appropriately synchronize streaming and real time.
+
+You can find examples of using `WordBoundary` in the [text-to-speech samples](https://aka.ms/csspeech/samples) on GitHub.
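As an abbreviated sketch along the lines of those samples (key and region are placeholders; in the C# SDK the text offset is exposed as `TextOffset`):

```csharp
using System;
using Microsoft.CognitiveServices.Speech;

var config = SpeechConfig.FromSubscription("<KEY>", "<REGION>");
using var synthesizer = new SpeechSynthesizer(config);

synthesizer.WordBoundary += (s, e) =>
{
    // AudioOffset is in 100-nanosecond units; divide by 10,000 for milliseconds.
    Console.WriteLine($"Text offset {e.TextOffset}, audio offset {e.AudioOffset / 10000} ms");
};

await synthesizer.SpeakTextAsync("The quick brown fox jumps over the lazy dog.");
```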
+
+## Next steps
+
+* [Get started with Custom Neural Voice](how-to-custom-voice.md)
+* [Improve synthesis with SSML](speech-synthesis-markup.md)
+* [Synthesize from long-form text](long-audio-api.md) like books and news articles
cognitive-services Disconnected Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/containers/disconnected-containers.md
Previously updated : 03/11/2022
Last updated : 03/28/2022
# Use Docker containers in disconnected environments
-Containers enable you to run Cognitive Services APIs in your own environment, and are great for your specific security and data governance requirements. Disconnected containers enable you to use several of these APIs completely disconnected from the internet. Currently, the following containers can be run in this manner:
+Containers enable you to run Cognitive Services APIs in your own environment, and are great for your specific security and data governance requirements. Disconnected containers enable you to use several of these APIs disconnected from the internet. Currently, the following containers can be run in this manner:
* [Speech to Text (Standard)](../speech-service/speech-container-howto.md?tabs=stt)
* [Neural Text to Speech](../speech-service/speech-container-howto.md?tabs=ntts)
* [Text Translation (Standard)](../translator/containers/translator-how-to-install-container.md#host-computer)
* [Language Understanding (LUIS)](../LUIS/luis-container-howto.md)
* Azure Cognitive Service for Language
- * [Sentiment Analysis](../language-service/sentiment-opinion-mining/how-to/use-containers.md)
- * [Key Phrase Extraction](../language-service/key-phrase-extraction/how-to/use-containers.md)
- * [Language Detection](../language-service/language-detection/how-to/use-containers.md)
+ * [Sentiment Analysis](../language-service/sentiment-opinion-mining/how-to/use-containers.md)
+ * [Key Phrase Extraction](../language-service/key-phrase-extraction/how-to/use-containers.md)
+ * [Language Detection](../language-service/language-detection/how-to/use-containers.md)
* [Computer Vision - Read](../computer-vision/computer-vision-how-to-install-containers.md)
Disconnected container usage is also available for the following Applied AI service:
-* [Form Recognizer – Custom/Invoice](../../applied-ai-services/form-recognizer/containers/form-recognizer-container-install-run.md)
+
+* [Form Recognizer](../../applied-ai-services/form-recognizer/containers/form-recognizer-container-install-run.md#required-containers)
Before attempting to run a Docker container in an offline environment, make sure you know the steps to successfully download and use the container. For example:
+
* Host computer requirements and recommendations.
-* The Docker `pull` command you will use to download the container.
+* The Docker `pull` command you'll use to download the container.
* How to validate that a container is running. * How to send queries to the container's endpoint, once it's running.
Fill out and submit the [request form](https://aka.ms/csdisconnectedcontainers)
[!INCLUDE [Request access to public preview](../../../includes/cognitive-services-containers-request-access.md)]
-Access is limited to customers that meet the following requirements:
-* Your organization must have a Microsoft Enterprise Agreement or an equivalent agreement and should identified as strategic customer or partner with Microsoft.
+Access is limited to customers that meet the following requirements:
+
+* Your organization must have a Microsoft Enterprise Agreement or an equivalent agreement and should be identified as a strategic customer or partner with Microsoft.
* Disconnected containers are expected to run fully offline; hence, your use cases must meet one of the below or similar requirements:
- * Environment or device(s) with zero connectivity to internet.
- * Remote location that occasionally has internet access.
- * Organization under strict regulation of not sending any kind of data back to cloud.
+ * Environment or device(s) with zero connectivity to internet.
+ * Remote location that occasionally has internet access.
+ * Organization under strict regulation of not sending any kind of data back to cloud.
* Application completed as instructed - Please pay close attention to the guidance provided throughout the application to ensure you provide all the necessary information required for approval.
## Purchase a commitment plan to use containers in disconnected environments
### Create a new resource
-1. Sign into the [Azure portal](https://portal.azure.com/) and select **Create a new resource** for one of the applicable Cognitive Services or Applied AI services listed above.
+1. Sign into the [Azure portal](https://portal.azure.com/) and select **Create a new resource** for one of the applicable Cognitive Services or Applied AI services listed above.
2. Enter the applicable information to create your resource. Be sure to select **Commitment tier disconnected containers** as your pricing tier.
> [!NOTE]
+ >
> * You will only see the option to purchase a commitment tier if you have been approved by Microsoft.
> * Pricing details are for example only.
:::image type="content" source="media/offline-container-signup.png" alt-text="A screenshot showing resource creation on the Azure portal." lightbox="media/offline-container-signup.png":::
-3. Select **Review + Create** at the bottom of the page. Review the information, and select **Create**.
+3. Select **Review + Create** at the bottom of the page. Review the information, and select **Create**.
## Gather required parameters
docker pull mcr.microsoft.com/azure-cognitive-services/form-recognizer/invoice:l
## Configure the container to be run in a disconnected environment
-Now that you've downloaded your container, you will need to run the container with the `DownloadLicense=True` parameter in your `docker run` command. This parameter will download a license file that will enable your Docker container to run when it isn't connected to the internet. It also contains an expiration date, after which the license file will be invalid to run the container. You can only use a license file with the appropriate container that you've been approved for. For example, you cannot use a license file for a speech-to-text container with a form recognizer container.
+Now that you've downloaded your container, you'll need to run the container with the `DownloadLicense=True` parameter in your `docker run` command. This parameter will download a license file that will enable your Docker container to run when it isn't connected to the internet. It also contains an expiration date, after which the license file will be invalid to run the container. You can only use a license file with the appropriate container that you've been approved for. For example, you can't use a license file for a speech-to-text container with a form recognizer container.
> [!IMPORTANT]
-> * [**Translator container only**](../translator/containers/translator-how-to-install-container.md):
-> * You must include a parameter to download model files for the [languages](../translator/language-support.md) you want to translate. For example: `-e Languages=en,es`
-> * The container will generate a `docker run` template that you can use to run the container, containing parameters you will need for the downloaded models and configuration file. Make sure you save this template.
+>
+> * [**Translator container only**](../translator/containers/translator-how-to-install-container.md):
+> * You must include a parameter to download model files for the [languages](../translator/language-support.md) you want to translate. For example: `-e Languages=en,es`
+> * The container will generate a `docker run` template that you can use to run the container, containing parameters you will need for the downloaded models and configuration file. Make sure you save this template.
The following example shows the formatting of the `docker run` command you'll use, with placeholder values. Replace these placeholder values with your own values.
DownloadLicense=True \
Mounts:License={LICENSE_MOUNT} \
```
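Filled in end to end, a download-license run might look like the following sketch — the image, port, mount path, `{ENDPOINT_URI}`, and `{API_KEY}` values are placeholders or assumptions to replace with your own:

```bash
# Download a license file for offline use (this step still requires internet access).
docker run --rm -it -p 5000:5000 \
  -v /host/license:/path/to/license/directory \
  mcr.microsoft.com/azure-cognitive-services/form-recognizer/invoice \
  eula=accept \
  billing={ENDPOINT_URI} \
  apikey={API_KEY} \
  DownloadLicense=True \
  Mounts:License=/path/to/license/directory
```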
-After you have configured the container, use the next section to run the container in your environment with the license, and appropriate memory and CPU allocations.
+After you've configured the container, use the next section to run the container in your environment with the license, and appropriate memory and CPU allocations.
## Run the container in a disconnected environment

> [!IMPORTANT]
> If you're using the Translator, Neural text-to-speech, or Speech-to-text containers, read the **Additional parameters** section below for information on commands or additional parameters you will need to use.
-Once the license file has been downloaded, you can run the container in a disconnected environment. The following example shows the formatting of the `docker run` command you'll use, with placeholder values. Replace these placeholder values with your own values.
+Once the license file has been downloaded, you can run the container in a disconnected environment. The following example shows the formatting of the `docker run` command you'll use, with placeholder values. Replace these placeholder values with your own values.
Wherever the container is run, the license file must be mounted to the container, and the location of the license folder on the container's local filesystem must be specified with `Mounts:License=`. An output mount must also be specified so that billing usage records can be written.

| Placeholder | Value | Format or example |
|---|---|---|
| `{IMAGE}` | The container image you want to use. | `mcr.microsoft.com/azure-cognitive-services/form-recognizer/invoice` |
| `{MEMORY_SIZE}` | The appropriate size of memory to allocate for your container. | `4g` |
| `{NUMBER_CPUS}` | The appropriate number of CPUs to allocate for your container. | `4` |
| `{LICENSE_MOUNT}` | The path where the license will be located and mounted. | `/volume/license:/path/to/license/directory` |
| `{OUTPUT_PATH}` | The output path for logging [usage records](#usage-records). | `/host/output:/path/to/output/directory` |
Mounts:Output={OUTPUT_PATH}
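Put together, a disconnected run might look like this sketch — all values are illustrative assumptions; substitute the placeholders from the table above with your own:

```bash
# Run fully offline, using the previously downloaded license and an output mount for billing records.
docker run --rm -it -p 5000:5000 --memory 4g --cpus 4 \
  -v /volume/license:/path/to/license/directory \
  -v /host/output:/path/to/output/directory \
  mcr.microsoft.com/azure-cognitive-services/form-recognizer/invoice \
  eula=accept \
  Mounts:License=/path/to/license/directory \
  Mounts:Output=/path/to/output/directory
```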
See the following sections for additional parameters and commands you may need to run the container.
-#### Translator container
+#### Translator container
+
+If you're using the [Translator container](../translator/containers/translator-how-to-install-container.md), you'll need to add parameters for the downloaded translation models and container configuration. These values are generated and displayed in the container output when you [configure the container](#configure-the-container-to-be-run-in-a-disconnected-environment) as described above. For example:
-If you're using the [Translator container](../translator/containers/translator-how-to-install-container.md), you will need to add parameters for the downloaded translation models and container configuration. These values are generated and displayed in the container output when you [configure the container](#configure-the-container-to-be-run-in-a-disconnected-environment) as described above. For example:
```bash
-e MODELS= /path/to/model1/, /path/to/model2/
-e TRANSLATORSYSTEMCONFIG=/path/to/model/config/translatorsystemconfig.json
```
If you're using the [Translator container](../translator/containers/translator-h
#### Speech-to-text and Neural text-to-speech containers
-The [speech-to-text](../speech-service/speech-container-howto.md?tabs=stt) and [neural text-to-speech](../speech-service/speech-container-howto.md?tabs=ntts) containers provide a default directory for writing the license file and billing log at runtime. When you're mounting these directories to the container with the `docker run -v` command, make sure the local machine directory is set ownership to `user:group nonroot:nonroot` before running the container.
-
+The [speech-to-text](../speech-service/speech-container-howto.md?tabs=stt) and [neural text-to-speech](../speech-service/speech-container-howto.md?tabs=ntts) containers provide a default directory for writing the license file and billing log at runtime. When you're mounting these directories to the container with the `docker run -v` command, make sure the ownership of the local machine directory is set to `user:group nonroot:nonroot` before running the container.
+
Below is a sample command to set file/directory ownership.

```bash
sudo chown -R nonroot:nonroot <YOUR_LOCAL_MACHINE_PATH_1> <YOUR_LOCAL_MACHINE_PA
## Usage records
-When operating Docker containers in a disconnected environment, the container will write usage records to a volume where they are collected over time. You can also call a REST endpoint to generate a report about service usage.
+When operating Docker containers in a disconnected environment, the container will write usage records to a volume where they're collected over time. You can also call a REST endpoint to generate a report about service usage.
### Arguments for storing logs
docker run -v /host/output:{OUTPUT_PATH} ... <image> ... Mounts:Output={OUTPUT_P
The container provides two endpoints for returning records about its usage.
-#### Get all records
+#### Get all records
The following endpoint will provide a report summarizing all of the usage collected in the mounted billing record directory.
The following endpoint will provide a report summarizing all of the usage collec
https://<service>/records/usage-logs/
```
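For example, a minimal `curl` call against the endpoint — the `<service>` host placeholder is kept from above; substitute the address and port your container listens on:

```bash
# Request a summary of all usage records collected so far.
curl "https://<service>/records/usage-logs/"
```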
-It will return JSON similar to the example below.
+It will return JSON similar to the example below.
```json
{
It will return JSON similar to the example below.
  ]
}
```

#### Get records for a specific month

The following endpoint will provide a report summarizing usage over a specific month and year.
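As an illustrative sketch — the month-and-year route shape here is an assumption, so confirm the exact path in the full article:

```bash
# Hypothetical: request the usage summary for a given month and year.
curl "https://<service>/records/usage-logs/{MONTH}/{YEAR}/"
```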
It will return a JSON response similar to the example below:
## Purchase a different commitment plan for disconnected containers
-Commitment plans for disconnected containers have a calendar year commitment period. When you purchase a plan, you will be charged the full price immediately. During the commitment period, you cannot change your commitment plan, however you can purchase additional unit(s) at a pro-rated price for the remaining days in the year. You have until midnight (UTC) on the last day of your commitment, to end a commitment plan.
+Commitment plans for disconnected containers have a calendar-year commitment period. When you purchase a plan, you'll be charged the full price immediately. During the commitment period, you can't change your commitment plan; however, you can purchase additional unit(s) at a pro-rated price for the remaining days in the year. You have until midnight (UTC) on the last day of your commitment to end a commitment plan.
-You can choose a different commitment plan in the **Commitment Tier pricing** settings of your resource.
+You can choose a different commitment plan in the **Commitment Tier pricing** settings of your resource.
## End a commitment plan
-If you decide that you don't want to continue purchasing a commitment plan, you can set your resource's auto-renewal to **Do not auto-renew**. Your commitment plan will expire on the displayed commitment end date. After this date, you won't be charged for the commitment plan. You will be able to continue using the Azure resource to make API calls, charged at pay-as-you-go pricing. You have until midnight (UTC) on the last day of the year to end a commitment plan for disconnected containers, and not be charged for the following year.
+If you decide that you don't want to continue purchasing a commitment plan, you can set your resource's auto-renewal to **Do not auto-renew**. Your commitment plan will expire on the displayed commitment end date. After this date, you won't be charged for the commitment plan. You'll be able to continue using the Azure resource to make API calls, charged at pay-as-you-go pricing. You have until midnight (UTC) on the last day of the year to end a commitment plan for disconnected containers and avoid being charged for the following year.
## Troubleshooting
If you run the container with an output mount and logging enabled, the container
> [!TIP]
> For more troubleshooting information and guidance, see [Disconnected containers Frequently asked questions (FAQ)](disconnected-container-faq.yml).

## Next steps
-[Azure Cognitive Services containers overview](../cognitive-services-container-support.md)
+[Azure Cognitive Services containers overview](../cognitive-services-container-support.md)
communication-services Media Comp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/voice-video-calling/media-comp.md
These media streams are typically arrayed in a grid and broadcast to call partic
- Connect devices and services using streaming protocols such as [RTMP](https://datatracker.ietf.org/doc/html/rfc7016) or [SRT](https://datatracker.ietf.org/doc/html/draft-sharabayko-srt) - Compose media streams into complex scenes
-RTMP & SRT connectivity can be used for both input and output. Using RTMP/SRT input, a videography studio that emits RTMP/SRT can join an Azure Communication Services call. RTMP/SRT output allows you to stream media from Azure Communication Services into [Azure Media Services](../../../media-services/latest/concepts-overview.md), YouTube Live, and many other broadcasting channels. The ability to attach industry standard RTMP/SRT emitters and to output content to RTMP/SRT subscribers for broadcasting transforms a small group call into a virtual event that reaches millions of people in real time.
+RTMP & SRT connectivity can be used for both input and output. Using RTMP/SRT input, a videography studio that emits RTMP/SRT can join an Azure Communication Services call. RTMP/SRT output allows you to stream media from Azure Communication Services into [Azure Media Services](/media-services/latest/concepts-overview), YouTube Live, and many other broadcasting channels. The ability to attach industry standard RTMP/SRT emitters and to output content to RTMP/SRT subscribers for broadcasting transforms a small group call into a virtual event that reaches millions of people in real time.
Media Composition REST APIs (and open-source SDKs) allow you to command the Azure service to cloud compose these media streams. For example, a **presenter layout** can be used to compose a speaker and a translator together in a classic picture-in-picture style. Media Composition allows for all clients and services connected to the media data plane to enjoy a particular dynamic layout without local processing or application complexity.
- In the diagram below, three endpoints are participating actively in a group call and uploading media. Two users, one of which is using Microsoft Teams, are composed using a *presenter layout.* The third endpoint is a television studio that emits RTMP into the call. The Azure Calling client and Teams client will receive the composed media stream instead of a typical grid. Additionally, Azure Media Services is shown here subscribing to the call's RTMP channel and broadcasting content externally.
+ In the diagram below, three endpoints are participating actively in a group call and uploading media. Two users, one of which is using Microsoft Teams, are composed using a *presenter layout.* The third endpoint is a television studio that emits RTMP into the call. The Azure Calling client and Teams client will receive the composed media stream instead of a typical grid. Additionally, Azure Media Services is shown here subscribing to the call's RTMP channel and broadcasting content externally.
:::image type="content" source="../media/media-comp.svg" alt-text="Diagram showing how media input is processed by the Azure Communication Services Media Composition services":::
connectors Built In https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/connectors/built-in.md
Azure Logic Apps provides the following built-in triggers and actions:
:::row-end:::
:::row:::
   :::column:::
- [![STFP-SSH icon][sftp-ssh-icon]][sftp-ssh-doc]
+ [![SFTP-SSH icon][sftp-ssh-icon]][sftp-ssh-doc]
\
\
- [**STFP-SSH**][sftp-ssh-doc]<br>(*Standard logic app only*)
+ [**SFTP-SSH**][sftp-ssh-doc]<br>(*Standard logic app only*)
\
\
Connect to SFTP servers that you can access from the internet by using SSH so that you can work with your files and folders.
container-apps Azure Resource Manager Api Spec https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/azure-resource-manager-api-spec.md
Previously updated : 11/02/2021
Last updated : 03/28/2022

# Container Apps Preview ARM template API specification
-Azure Container Apps deployments are powered by an Azure Resource Manager (ARM) template. The following tables describe the properties available in the container app ARM template.
+Azure Container Apps deployments are powered by an Azure Resource Manager (ARM) template. Some Container Apps CLI commands also support using a YAML template to specify a resource.
-The [sample ARM template for usage examples](#examples).
+> [!NOTE]
+> Azure Container Apps resources are in the process of migrating from the `Microsoft.Web` namespace to the `Microsoft.App` namespace. Refer to [Namespace migration from Microsoft.Web to Microsoft.App in March 2022](https://github.com/microsoft/azure-container-apps/issues/109) for more details.
+
+## Container Apps environment
+
+The following tables describe the properties available in the Container Apps environment resource.
+
+### Resource
+
+A Container Apps environment resource of the ARM template has the following properties:
+
+| Property | Description | Data type |
+|||--|
+| `name` | The Container Apps environment name. | string |
+| `location` | The Azure region where the Container Apps environment is deployed. | string |
+| `type` | `Microsoft.App/managedEnvironments` – the ARM resource type | string |
+
+#### `properties`
+
+A resource's `properties` object has the following properties:
+
+| Property | Description | Data type | Read only |
+|||||
+| `daprAIInstrumentationKey` | The Application Insights instrumentation key used by Dapr. | string | No |
+| `appLogsConfiguration` | The environment's logging configuration. | Object | No |
+
+### <a name="container-apps-environment-examples"></a>Examples
+
+# [ARM template](#tab/arm-template)
+
+The following example ARM template deploys a Container Apps environment.
+
+```json
+{
+ "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "location": {
+ "defaultValue": "canadacentral",
+ "type": "String"
+ },
+ "dapr_ai_instrumentation_key": {
+ "defaultValue": "",
+ "type": "String"
+ },
+ "environment_name": {
+ "defaultValue": "myenvironment",
+ "type": "String"
+ },
+ "log_analytics_customer_id": {
+ "type": "String"
+ },
+ "log_analytics_shared_key": {
+ "type": "SecureString"
+ }
+ },
+ "variables": {},
+ "resources": [
+ {
+ "type": "Microsoft.App/managedEnvironments",
+ "apiVersion": "2022-01-01-preview",
+ "name": "[parameters('environment_name')]",
+ "location": "[parameters('location')]",
+ "properties": {
+ "daprAIInstrumentationKey": "[parameters('dapr_ai_instrumentation_key')]",
+ "appLogsConfiguration": {
+ "destination": "log-analytics",
+ "logAnalyticsConfiguration": {
+ "customerId": "[parameters('log_analytics_customer_id')]",
+ "sharedKey": "[parameters('log_analytics_shared_key')]"
+ }
+ }
+ }
+ }
+ ]
+}
+```
+
+# [YAML](#tab/yaml)
+
+YAML input isn't currently used by Azure CLI commands to specify a Container Apps environment.
+
+---
+## Container app
-## Resources
+The following tables describe the properties available in the container app resource.
-Entries in the `resources` array of the ARM template have the following properties:
+### Resource
+
+A container app resource of the ARM template has the following properties:
| Property | Description | Data type |
|---|---|---|
| `name` | The Container Apps application name. | string |
| `location` | The Azure region where the Container Apps instance is deployed. | string |
| `tags` | Collection of Azure tags associated with the container app. | array |
-| `type` | Always `Microsoft.Web/containerApps` ARM endpoint determines which API to forward to | string |
-
-> [!NOTE]
-> Azure Container Apps resources are in the process of migrating from the `Microsoft.Web` namespace to the `Microsoft.App` namespace. Refer to [Namespace migration from Microsoft.Web to Microsoft.App in March 2022](https://github.com/microsoft/azure-container-apps/issues/109) for more details.
+| `type` | `Microsoft.App/containerApps` – the ARM resource type | string |
In this example, you put your values in place of the placeholder tokens surrounded by `<>` brackets.
-## properties
+#### `properties`
A resource's `properties` object has the following properties:
A resource's `properties` object has the following properties:
The `environmentId` value takes the following form: ```console
-/subscriptions/<SUBSCRIPTION_ID>/resourcegroups/<RESOURCE_GROUP_NAME>/providers/Microsoft.Web/environmentId/<ENVIRONMENT_NAME>
+/subscriptions/<SUBSCRIPTION_ID>/resourcegroups/<RESOURCE_GROUP_NAME>/providers/Microsoft.App/managedEnvironments/<ENVIRONMENT_NAME>
``` In this example, you put your values in place of the placeholder tokens surrounded by `<>` brackets.
-## properties.configuration
+#### `properties.configuration`
A resource's `properties.configuration` object has the following properties:

| Property | Description | Data type |
|---|---|---|
-| `activeRevisionsMode` | Setting to `multiple` allows you to maintain multiple revisions. Setting to `single` automatically deactivates old revisions, and only keeps the latest revision active. | string |
+| `activeRevisionsMode` | Setting to `single` automatically deactivates old revisions, and only keeps the latest revision active. Setting to `multiple` allows you to maintain multiple revisions. | string |
| `secrets` | Defines secret values in your container app. | object |
| `ingress` | Object that defines public accessibility configuration of a container app. | object |
| `registries` | Configuration object that references credentials for private container registries. Entries defined with `secretref` reference the secrets configuration object. | object |
+| `dapr` | Configuration object that defines the Dapr settings for the container app. | object |
Changes made to the `configuration` section are [application-scope changes](revisions.md#application-scope-changes), which don't trigger a new revision.
-## properties.template
+#### `properties.template`
A resource's `properties.template` object has the following properties:
A resource's `properties.template` object has the following properties:
| `revisionSuffix` | A friendly name for a revision. This value must be unique as the runtime rejects any conflicts with existing revision name suffix values. | string |
| `containers` | Configuration object that defines what container images are included in the container app. | object |
| `scale` | Configuration object that defines scale rules for the container app. | object |
-| `dapr` | Configuration object that defines the Dapr settings for the container app. | object |
Changes made to the `template` section are [revision-scope changes](revisions.md#revision-scope-changes), which trigger a new revision.
-## Examples
+### <a name="container-app-examples"></a>Examples
-The following is an example ARM template used to deploy a container app.
+# [ARM template](#tab/arm-template)
+
+The following example ARM template deploys a container app.
```json
{
- "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "containerappName": {
- "defaultValue": "mycontainerapp",
- "type": "String"
- },
- "location": {
- "defaultValue": "canadacentral",
- "type": "String"
- },
- "environment_name": {
- "defaultValue": "myenvironment",
- "type": "String"
- }
+ "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "containerappName": {
+ "defaultValue": "mycontainerapp",
+ "type": "String"
+ },
+ "location": {
+ "defaultValue": "canadacentral",
+ "type": "String"
+ },
+ "environment_name": {
+ "defaultValue": "myenvironment",
+ "type": "String"
+ },
+ "container_image": {
+ "type": "String"
},
- "variables": {},
- "resources": [
- {
- "apiVersion": "2021-03-01",
- "type": "Microsoft.Web/containerApps",
- "name": "[parameters('containerappName')]",
- "location": "[parameters('location')]",
- "properties": {
- "kubeEnvironmentId": "[resourceId('Microsoft.Web/kubeEnvironments', parameters('environment_name'))]",
- "configuration": {
- "secrets": [
- {
- "name": "mysecret",
- "value": "thisismysecret"
- }
- ],
- "ingress": {
- "external": true,
- "targetPort": 80,
- "allowInsecure": false,
- "traffic": [
- {
- "latestRevision": true,
- "weight": 100
- }
- ]
- }
+ "registry_password": {
+ "type": "SecureString"
+ }
+ },
+ "variables": {},
+ "resources": [
+ {
+ "apiVersion": "2022-01-01-preview",
+ "type": "Microsoft.App/containerApps",
+ "name": "[parameters('containerappName')]",
+ "location": "[parameters('location')]",
+ "properties": {
+ "managedEnvironmentId": "[resourceId('Microsoft.App/managedEnvironments', parameters('environment_name'))]",
+ "configuration": {
+ "secrets": [
+ {
+ "name": "mysecret",
+ "value": "thisismysecret"
+ },
+ {
+ "name": "myregistrypassword",
+ "value": "[parameters('registry_password')]"
+ }
+ ],
+ "ingress": {
+ "external": true,
+ "targetPort": 80,
+ "allowInsecure": false,
+ "traffic": [
+ {
+ "latestRevision": true,
+ "weight": 100
+ }
+ ]
+ },
+ "registries": [
+ {
+ "server": "myregistry.azurecr.io",
+ "username": "[parameters('containerappName')]",
+ "passwordSecretRef": "myregistrypassword"
+ }
+ ],
+ "dapr": {
+ "appId": "[parameters('containerappName')]",
+ "appPort": 80,
+ "appProtocol": "http",
+ "enabled": true
+ }
+ },
+ "template": {
+ "revisionSuffix": "myrevision",
+ "containers": [
+ {
+ "name": "main",
+ "image": "[parameters('container_image')]",
+ "env": [
+ {
+ "name": "HTTP_PORT",
+ "value": "80"
},
- "template": {
- "revisionSuffix": "myrevision",
- "containers": [
- {
- "name": "nginx",
- "image": "nginx",
- "env": [
- {
- "name": "HTTP_PORT",
- "value": "80"
- },
- {
- "name": "SECRET_VAL",
- "secretRef": "mysecret"
- }
- ],
- "resources": {
- "cpu": 0.5,
- "memory": "1Gi"
- }
- }
- ],
- "scale": {
- "minReplicas": 1,
- "maxReplicas": 3
- }
+ {
+ "name": "SECRET_VAL",
+ "secretRef": "mysecret"
}
+ ],
+ "resources": {
+ "cpu": 0.5,
+ "memory": "1Gi"
+ }
}
+ ],
+ "scale": {
+ "minReplicas": 1,
+ "maxReplicas": 3
+ }
}
- ]
+ }
+ }
+ ]
}
```
-The following is an example YAML configuration used to deploy a container app.
+# [YAML](#tab/yaml)
+
+The following example YAML configuration deploys a container app when used with the `--yaml` parameter in the following Azure CLI commands:
+
+- [`az containerapp create`](/cli/azure/containerapp?view=azure-cli-latest&preserve-view=true#az-containerapp-create)
+- [`az containerapp update`](/cli/azure/containerapp?view=azure-cli-latest&preserve-view=true#az-containerapp-update)
+- [`az containerapp revision copy`](/cli/azure/containerapp?view=azure-cli-latest&preserve-view=true#az-containerapp-revision-copy)
```yaml
kind: containerapp
location: northeurope
name: mycontainerapp
resourceGroup: myresourcegroup
-type: Microsoft.Web/containerApps
+type: Microsoft.App/containerApps
tags:
- tagname: value
+ tagname: value
properties:
- kubeEnvironmentId: /subscriptions/mysubscription/resourceGroups/myresourcegroup/providers/Microsoft.Web/kubeEnvironments/myenvironment
- configuration:
- activeRevisionsMode: Multiple
- secrets:
- - name: mysecret
- value: thisismysecret
- ingress:
- external: True
- allowInsecure: false
- targetPort: 80
- traffic:
- - latestRevision: true
- weight: 100
- transport: Auto
- template:
- revisionSuffix: myrevision
- containers:
- - image: nginx
- name: nginx
- env:
+ managedEnvironmentId: /subscriptions/mysubscription/resourceGroups/myresourcegroup/providers/Microsoft.App/managedEnvironments/myenvironment
+ configuration:
+ activeRevisionsMode: Multiple
+ secrets:
+ - name: mysecret
+ value: thisismysecret
+ - name: myregistrypassword
+ value: I<3containerapps
+ ingress:
+ external: true
+ allowInsecure: false
+ targetPort: 80
+ traffic:
+ - latestRevision: true
+ weight: 100
+ transport: Auto
+ registries:
+ - passwordSecretRef: myregistrypassword
+ server: myregistry.azurecr.io
+    username: myregistry
+ dapr:
+ appId: mycontainerapp
+ appPort: 80
+ appProtocol: http
+ enabled: true
+ template:
+ revisionSuffix: myrevision
+ containers:
+ - image: nginx
+ name: nginx
+ env:
    - name: HTTP_PORT
      value: 80
    - name: secret_name
      secretRef: mysecret
- resources:
- cpu: 0.5
- memory: 1Gi
- scale:
- minReplicas: 1
- maxReplicas: 1
+ resources:
+ cpu: 0.5
+ memory: 1Gi
+ scale:
+ minReplicas: 1
+ maxReplicas: 3
```
+
+---
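For instance, a hedged usage sketch — the file name `app.yaml` is an assumption, while the app and resource group names match the configuration above:

```bash
# Create the container app from the YAML definition above.
az containerapp create \
  --name mycontainerapp \
  --resource-group myresourcegroup \
  --yaml app.yaml
```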
container-apps Background Processing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/background-processing.md
Individual container apps are deployed to an Azure Container Apps environment. T
az containerapp env create \ --name $CONTAINERAPPS_ENVIRONMENT \ --resource-group $RESOURCE_GROUP \
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
--location "$LOCATION" ```
az containerapp env create \
az containerapp env create ` --name $CONTAINERAPPS_ENVIRONMENT ` --resource-group $RESOURCE_GROUP `
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
--location $LOCATION ```
Create a file named *queue.json* and paste the following configuration code into
"resources": [ { "name": "queuereader",
- "type": "Microsoft.Web/containerApps",
- "apiVersion": "2021-03-01",
+ "type": "Microsoft.App/containerApps",
+ "apiVersion": "2022-01-01-preview",
"kind": "containerapp", "location": "[parameters('location')]", "properties": {
- "kubeEnvironmentId": "[resourceId('Microsoft.Web/kubeEnvironments', parameters('environment_name'))]",
+ "managedEnvironmentId": "[resourceId('Microsoft.App/managedEnvironments', parameters('environment_name'))]",
"configuration": { "activeRevisionsMode": "single", "secrets": [
Run the following command to see logged messages. This command requires the Log
# [Bash](#tab/bash)

```azurecli
+LOG_ANALYTICS_WORKSPACE_CLIENT_ID=`az containerapp env show --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP --query properties.appLogsConfiguration.logAnalyticsConfiguration.customerId --out tsv`
+
az monitor log-analytics query \
  --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
  --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'queuereader' and Log_s contains 'Message ID'" \
az monitor log-analytics query \
# [PowerShell](#tab/powershell)

```powershell
+$LOG_ANALYTICS_WORKSPACE_CLIENT_ID=(az containerapp env show --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP --query properties.appLogsConfiguration.logAnalyticsConfiguration.customerId --out tsv)
+
$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'queuereader' and Log_s contains 'Message ID'"

$queryResults.Results
```
container-apps Get Started Existing Container Image https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/get-started-existing-container-image.md
Previously updated : 12/16/2021
Last updated : 03/21/2022
zone_pivot_groups: container-apps-registry-types
To create the environment, run the following command:
az containerapp env create \ --name $CONTAINERAPPS_ENVIRONMENT \ --resource-group $RESOURCE_GROUP \
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
--location $LOCATION ```
az containerapp env create \
az containerapp env create ` --name $CONTAINERAPPS_ENVIRONMENT ` --resource-group $RESOURCE_GROUP `
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
--location $LOCATION ```
For details on how to provide values for any of these parameters to the `create`
```bash CONTAINER_IMAGE_NAME=<CONTAINER_IMAGE_NAME>
-REGISTRY_LOGIN_SERVER=<REGISTRY_LOGIN_URL>
+REGISTRY_SERVER=<REGISTRY_SERVER>
REGISTRY_USERNAME=<REGISTRY_USERNAME> REGISTRY_PASSWORD=<REGISTRY_PASSWORD> ```
az containerapp create \
--resource-group $RESOURCE_GROUP \ --image $CONTAINER_IMAGE_NAME \ --environment $CONTAINERAPPS_ENVIRONMENT \
- --registry-login-server $REGISTRY_LOGIN_SERVER \
+ --registry-server $REGISTRY_SERVER \
--registry-username $REGISTRY_USERNAME \ --registry-password $REGISTRY_PASSWORD ```
az containerapp create \
```powershell $CONTAINER_IMAGE_NAME=<CONTAINER_IMAGE_NAME>
-$REGISTRY_LOGIN_SERVER=<REGISTRY_LOGIN_URL>
+$REGISTRY_SERVER=<REGISTRY_SERVER>
$REGISTRY_USERNAME=<REGISTRY_USERNAME> $REGISTRY_PASSWORD=<REGISTRY_PASSWORD> ```
az containerapp create `
--resource-group $RESOURCE_GROUP ` --image $CONTAINER_IMAGE_NAME ` --environment $CONTAINERAPPS_ENVIRONMENT `
- --registry-login-server $REGISTRY_LOGIN_SERVER `
+ --registry-server $REGISTRY_SERVER `
--registry-username $REGISTRY_USERNAME ` --registry-password $REGISTRY_PASSWORD ```
az containerapp create `
```azurecli az containerapp create \
- --image <REGISTRY_CONTAINER_URL> \
+ --image <REGISTRY_CONTAINER_NAME> \
--name my-container-app \ --resource-group $RESOURCE_GROUP \ --environment $CONTAINERAPPS_ENVIRONMENT
az containerapp create \
```azurecli az containerapp create `
- --image <REGISTRY_CONTAINER_URL> `
+ --image <REGISTRY_CONTAINER_NAME> `
--name my-container-app ` --resource-group $RESOURCE_GROUP ` --environment $CONTAINERAPPS_ENVIRONMENT
az containerapp create `
-Before you run this command, replace `<REGISTRY_CONTAINER_URL>` with the URL to the public container registry location including tag.
+Before you run this command, replace `<REGISTRY_CONTAINER_NAME>` with the full name of the public container image, including the registry path and tag. For example, a valid container name is `mcr.microsoft.com/azuredocs/containerapps-helloworld:latest`.
::: zone-end
-If you have enabled ingress on your container app, you can add `--query configuration.ingress.fqdn` to the `create` command to return the public URL for the application.
+If you have enabled ingress on your container app, you can add `--query properties.configuration.ingress.fqdn` to the `create` command to return the public URL for the application.
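For example, a sketch combining the quickstart values used elsewhere in this article with the query flag (names are illustrative):

```bash
# Create the app and print only its public FQDN.
az containerapp create \
  --name my-container-app \
  --resource-group $RESOURCE_GROUP \
  --environment $CONTAINERAPPS_ENVIRONMENT \
  --image mcr.microsoft.com/azuredocs/containerapps-helloworld:latest \
  --target-port 80 \
  --ingress 'external' \
  --query properties.configuration.ingress.fqdn
```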
## Verify deployment
After about 5-10 minutes has passed, use the following steps to view logged mess
# [Bash](#tab/bash)

```azurecli
+LOG_ANALYTICS_WORKSPACE_CLIENT_ID=`az containerapp env show --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP --query properties.appLogsConfiguration.logAnalyticsConfiguration.customerId --out tsv`
+
az monitor log-analytics query \
  --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
  --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'my-container-app' | project ContainerAppName_s, Log_s, TimeGenerated" \
az monitor log-analytics query \
# [PowerShell](#tab/powershell)

```powershell
+$LOG_ANALYTICS_WORKSPACE_CLIENT_ID=(az containerapp env show --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP --query properties.appLogsConfiguration.logAnalyticsConfiguration.customerId --out tsv)
+
$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' and (Log_s contains 'persisted' or Log_s contains 'order') | project ContainerAppName_s, Log_s, TimeGenerated | take 5"

$queryResults.Results
container-apps Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/get-started.md
Previously updated : 11/02/2021
Last updated : 03/21/2022
ms.devlang: azurecli
To create the environment, run the following command:
az containerapp env create \ --name $CONTAINERAPPS_ENVIRONMENT \ --resource-group $RESOURCE_GROUP \
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
--location $LOCATION ```
az containerapp env create \
az containerapp env create ` --name $CONTAINERAPPS_ENVIRONMENT ` --resource-group $RESOURCE_GROUP `
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
--location $LOCATION ```
az containerapp create \
--image mcr.microsoft.com/azuredocs/containerapps-helloworld:latest \ --target-port 80 \ --ingress 'external' \
- --query configuration.ingress.fqdn
+ --query properties.configuration.ingress.fqdn
``` # [PowerShell](#tab/powershell)
az containerapp create `
--image mcr.microsoft.com/azuredocs/containerapps-helloworld:latest ` --target-port 80 ` --ingress 'external' `
- --query configuration.ingress.fqdn
+ --query properties.configuration.ingress.fqdn
```
container-apps Manage Secrets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/manage-secrets.md
az containerapp create \
--environment "my-environment-name" \ --image demos/myQueueApp:v1 \ --secrets "queue-connection-string=$CONNECTIONSTRING" \
- --environment-variables "QueueName=myqueue,ConnectionString=secretref:queue-connection-string"
+ --env-vars "QueueName=myqueue" "ConnectionString=secretref:queue-connection-string"
Here, the environment variable named `ConnectionString` gets its value from the application-level `queue-connection-string` secret by using `secretref`.
az containerapp create `
--environment "my-environment-name" ` --image demos/myQueueApp:v1 ` --secrets "queue-connection-string=$CONNECTIONSTRING" `
- --environment-variables "QueueName=myqueue,ConnectionString=secretref:queue-connection-string"
+ --env-vars "QueueName=myqueue" "ConnectionString=secretref:queue-connection-string"
Here, the environment variable named `ConnectionString` gets its value from the application-level `queue-connection-string` secret by using `secretref`.
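Inside the running container, the referenced secret then surfaces as an ordinary environment variable; for example, a quick check (illustrative only):

```bash
# From a shell inside the container: print the resolved secret value.
echo "$ConnectionString"
```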
container-apps Microservices Dapr Azure Resource Manager https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/microservices-dapr-azure-resource-manager.md
Title: 'Tutorial: Deploy a Dapr application to Azure Container Apps with an ARM or Bicep template'
+ Title: "Tutorial: Deploy a Dapr application to Azure Container Apps with an ARM or Bicep template"
description: Deploy a Dapr application to Azure Container Apps with an ARM or Bicep template.
Last updated 01/31/2022
zone_pivot_groups: container-apps
zone_pivot_groups: container-apps
You learn how to:

> [!div class="checklist"]
-> * Create a Container Apps environment for your container apps
-> * Create an Azure Blob Storage state store for the container app
-> * Deploy two apps that a produce and consume messages and persist them with the state store
+> * Create an Azure Blob Storage for use as a Dapr state store
+> * Deploy a container apps environment to host container apps
+> * Deploy two Dapr-enabled container apps: one that produces orders and one that consumes orders and stores them
> * Verify the interaction between the two microservices.

With Azure Container Apps, you get a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars to run next to your microservices that provide a rich set of capabilities. Available Dapr APIs include [Service to Service calls](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/), [Pub/Sub](https://docs.dapr.io/developing-applications/building-blocks/pubsub/), [Event Bindings](https://docs.dapr.io/developing-applications/building-blocks/bindings/), [State Stores](https://docs.dapr.io/developing-applications/building-blocks/state-management/), and [Actors](https://docs.dapr.io/developing-applications/building-blocks/actors/).
-In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/tutorials/hello-kubernetes) quickstart.
+In this tutorial, you deploy the same applications from the Dapr [Hello World](https://github.com/dapr/quickstarts/tree/master/tutorials/hello-world) quickstart.
The application consists of:
-* A client (Python) container app to generate messages.
-* A service (Node) container app to consume and persist those messages in a state store
+- A client (Python) container app that generates messages.
+- A service (Node) container app that consumes and persists those messages in a state store.
The following architecture diagram illustrates the components that make up this tutorial:
The following architecture diagram illustrates the components that make up this
## Prerequisites
-* Install [Azure CLI](/cli/azure/install-azure-cli)
+- Install [Azure CLI](/cli/azure/install-azure-cli)
::: zone pivot="container-apps-bicep"
-* [Bicep](../azure-resource-manager/bicep/install.md)
+- [Bicep](../azure-resource-manager/bicep/install.md)
::: zone-end
-* An Azure account with an active subscription is required. If you don't already have one, you can [create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- An Azure account with an active subscription is required. If you don't already have one, you can [create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
## Before you begin
This guide uses the following environment variables:
RESOURCE_GROUP="my-containerapps" LOCATION="canadacentral" CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-LOG_ANALYTICS_WORKSPACE="containerapps-logs"
STORAGE_ACCOUNT_CONTAINER="mycontainer" ```
STORAGE_ACCOUNT_CONTAINER="mycontainer"
$RESOURCE_GROUP="my-containerapps" $LOCATION="canadacentral" $CONTAINERAPPS_ENVIRONMENT="containerapps-env"
-$LOG_ANALYTICS_WORKSPACE="containerapps-logs"
$STORAGE_ACCOUNT_CONTAINER="mycontainer"
```

---

# [Bash](#tab/bash)

```bash
$STORAGE_ACCOUNT="<storage account name>"
-Choose a name for `STORAGE_ACCOUNT`. Storage account names must be *unique within Azure*. Be from 3 to 24 characters in length and contain numbers and lowercase letters only.
+Choose a name for `STORAGE_ACCOUNT`. Storage account names must be _unique within Azure_, from 3 to 24 characters in length, and contain only numbers and lowercase letters.
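As a sketch of creating a conforming account — the SKU and flags here are assumptions, and the tutorial's own command may differ:

```bash
# Create a general-purpose storage account for the Dapr state store.
az storage account create \
  --name "$STORAGE_ACCOUNT" \
  --resource-group "$RESOURCE_GROUP" \
  --location "$LOCATION" \
  --sku Standard_LRS
```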
## Setup
az upgrade
Next, install the Azure Container Apps extension for the Azure CLI.
+> [!NOTE]
+> If you have worked with earlier versions of Container Apps, make sure to first remove the old extension version by running `az extension remove -n containerapp`.
+ # [Bash](#tab/bash) ```azurecli
-az extension add \
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.4-py2.py3-none-any.whl
+az extension add --name containerapp
``` # [PowerShell](#tab/powershell) ```azurecli
-az extension add `
- --source https://workerappscliextension.blob.core.windows.net/azure-cli-extension/containerapp-0.2.4-py2.py3-none-any.whl
+az extension add --name containerapp
```
-Now that the extension is installed, register the `Microsoft.Web` namespace.
+Now that the extension is installed, register the `Microsoft.App` namespace.
> [!NOTE] > Azure Container Apps resources are in the process of migrating from the `Microsoft.Web` namespace to the `Microsoft.App` namespace. Refer to [Namespace migration from Microsoft.Web to Microsoft.App in March 2022](https://github.com/microsoft/azure-container-apps/issues/109) for more details.
Now that the extension is installed, register the `Microsoft.Web` namespace.
# [Bash](#tab/bash) ```azurecli
-az provider register --namespace Microsoft.Web
+az provider register --namespace Microsoft.App
``` # [PowerShell](#tab/powershell) ```powershell
-Register-AzResourceProvider -ProviderNamespace Microsoft.Web
+Register-AzResourceProvider -ProviderNamespace Microsoft.App
```
-Create a resource group to organize the services related to your new container app.
+Create a resource group to organize the services related to your container apps.
# [Bash](#tab/bash)
New-AzResourceGroup -Name $RESOURCE_GROUP -Location $LOCATION
-With the CLI upgraded and a new resource group available, you can create a Container Apps environment and deploy your container app.
-
-## Create an environment
-
-The Azure Container Apps environment acts as a secure boundary around a group of container apps. Container Apps deployed to the same environment share a virtual network and write logs to the same Log Analytics workspace.
-
-Your container apps are monitored with Azure Log Analytics, which is required when you create a Container Apps environment.
-
-Create a Log Analytics workspace with the following command:
-
-# [Bash](#tab/bash)
-
-```azurecli
-az monitor log-analytics workspace create \
- --resource-group $RESOURCE_GROUP \
- --workspace-name $LOG_ANALYTICS_WORKSPACE
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-New-AzOperationalInsightsWorkspace `
- -Location $LOCATION `
- -Name $LOG_ANALYTICS_WORKSPACE `
- -ResourceGroupName $RESOURCE_GROUP
-```
---
-Next, retrieve the Log Analytics Client ID and client secret.
-
-# [Bash](#tab/bash)
-
-Make sure to run each query separately to give enough time for the request to complete.
-
-```azurecli
-LOG_ANALYTICS_WORKSPACE_CLIENT_ID=`az monitor log-analytics workspace show --query customerId -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE -o tsv | tr -d '[:space:]'`
-```
-
-```azurecli
-LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=`az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE -o tsv | tr -d '[:space:]'`
-```
-
-# [PowerShell](#tab/powershell)
-
-Make sure to run each query separately to give enough time for the request to complete.
-
-```powershell
-$LOG_ANALYTICS_WORKSPACE_CLIENT_ID=(Get-AzOperationalInsightsWorkspace -ResourceGroupName $RESOURCE_GROUP -Name $LOG_ANALYTICS_WORKSPACE).CustomerId
-```
-
-<! This was taken out because of a breaking changes warning. We should put it back after it's fixed. Until then we'll go with the az command
-$LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=(Get-AzOperationalInsightsWorkspaceSharedKey -ResourceGroupName $RESOURCE_GROUP -Name $LOG_ANALYTICS_WORKSPACE).PrimarySharedKey
->
-
-```azurecli
-$LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET=(az monitor log-analytics workspace get-shared-keys --query primarySharedKey -g $RESOURCE_GROUP -n $LOG_ANALYTICS_WORKSPACE --out tsv)
-```
---
-Individual container apps are deployed to an Azure Container Apps environment. To create the environment, run the following command:
-
-# [Bash](#tab/bash)
-
-```azurecli
-az containerapp env create \
- --name $CONTAINERAPPS_ENVIRONMENT \
- --resource-group $RESOURCE_GROUP \
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
- --location "$LOCATION"
-```
-
-# [PowerShell](#tab/powershell)
-
-```azurecli
-az containerapp env create `
- --name $CONTAINERAPPS_ENVIRONMENT `
- --resource-group $RESOURCE_GROUP `
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
- --location "$LOCATION"
-```
--- ## Set up a state store ### Create an Azure Blob Storage account
New-AzStorageAccount -ResourceGroupName $RESOURCE_GROUP `
Once your Azure Blob Storage account is created, the following values are needed for subsequent steps in this tutorial.
-* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable.
+- `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable.
-* `storage_container_name` is the value of the `STORAGE_ACCOUNT_CONTAINER`variable.
+- `storage_container_name` is the value of the `STORAGE_ACCOUNT_CONTAINER` variable.
Dapr creates a container with this name when it doesn't already exist in your Azure Storage account.
-Get the storage account key with the following command:
-
-# [Bash](#tab/bash)
-
-```azurecli
-STORAGE_ACCOUNT_KEY=`az storage account keys list --resource-group $RESOURCE_GROUP --account-name $STORAGE_ACCOUNT --query '[0].value' --out tsv`
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$STORAGE_ACCOUNT_KEY=(Get-AzStorageAccountKey -ResourceGroupName $RESOURCE_GROUP -AccountName $STORAGE_ACCOUNT)| Where-Object -Property KeyName -Contains 'key1' | Select-Object -ExpandProperty Value
-```
--- ::: zone pivot="container-apps-arm"
-### Create Azure Resource Manager (ARM) templates
-
-Create two ARM templates.
+### Create Azure Resource Manager (ARM) template
-Each ARM template has a container app definition and a Dapr component definition.
+Create an ARM template that deploys a Container Apps environment, including the associated Log Analytics workspace and Application Insights resource for distributed tracing, a Dapr component for the state store, and the two Dapr-enabled container apps.
-The following example shows how your ARM template should look when configured for your Azure Blob Storage account.
-
-Save the following file as *serviceapp.json*:
+Save the following file as _hello-world.json_:
```json
{
- "$schema": "https://schema.management.azure.com/schemas/2019-08-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "location": {
- "defaultValue": "canadacentral",
- "type": "String"
- },
- "environment_name": {
- "type": "String"
- },
- "storage_account_name": {
- "type": "String"
- },
- "storage_account_key": {
- "type": "String"
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "environment_name": {
+ "type": "string"
+ },
+ "location": {
+ "defaultValue": "canadacentral",
+ "type": "string"
+ },
+ "storage_account_name": {
+ "type": "string"
+ },
+ "storage_container_name": {
+ "type": "string"
+ }
+ },
+ "variables": {
+ "logAnalyticsWorkspaceName": "[concat('logs-', parameters('environment_name'))]",
+ "appInsightsName": "[concat('appins-', parameters('environment_name'))]"
+ },
+ "resources": [
+ {
+ "type": "Microsoft.OperationalInsights/workspaces",
+ "apiVersion": "2020-03-01-preview",
+ "name": "[variables('logAnalyticsWorkspaceName')]",
+ "location": "[parameters('location')]",
+ "properties": {
+ "retentionInDays": 30,
+ "features": {
+ "searchVersion": 1
},
- "storage_container_name": {
- "type": "String"
+ "sku": {
+ "name": "PerGB2018"
}
+ }
},
- "variables": {},
- "resources": [
+ {
+ "type": "Microsoft.Insights/components",
+ "apiVersion": "2020-02-02",
+ "name": "[variables('appInsightsName')]",
+ "location": "[parameters('location')]",
+ "kind": "web",
+ "dependsOn": [
+ "[resourceId('Microsoft.OperationalInsights/workspaces/', variables('logAnalyticsWorkspaceName'))]"
+ ],
+ "properties": {
+ "Application_Type": "web",
+ "WorkspaceResourceId": "[resourceId('Microsoft.OperationalInsights/workspaces/', variables('logAnalyticsWorkspaceName'))]"
+ }
+ },
+ {
+ "type": "Microsoft.App/managedEnvironments",
+ "apiVersion": "2022-01-01-preview",
+ "name": "[parameters('environment_name')]",
+ "location": "[parameters('location')]",
+ "dependsOn": [
+ "[resourceId('Microsoft.Insights/components/', variables('appInsightsName'))]"
+ ],
+ "properties": {
+ "daprAIInstrumentationKey": "[reference(resourceId('Microsoft.Insights/components/', variables('appInsightsName')), '2020-02-02').InstrumentationKey]",
+ "appLogsConfiguration": {
+ "destination": "log-analytics",
+ "logAnalyticsConfiguration": {
+ "customerId": "[reference(resourceId('Microsoft.OperationalInsights/workspaces/', variables('logAnalyticsWorkspaceName')), '2020-03-01-preview').customerId]",
+ "sharedKey": "[listKeys(resourceId('Microsoft.OperationalInsights/workspaces/', variables('logAnalyticsWorkspaceName')), '2020-03-01-preview').primarySharedKey]"
+ }
+ }
+ },
+ "resources": [
{
- "name": "nodeapp",
- "type": "Microsoft.Web/containerApps",
- "apiVersion": "2021-03-01",
- "kind": "containerapp",
- "location": "[parameters('location')]",
- "properties": {
- "kubeEnvironmentId": "[resourceId('Microsoft.Web/kubeEnvironments', parameters('environment_name'))]",
- "configuration": {
- "ingress": {
- "external": true,
- "targetPort": 3000
- },
- "secrets": [
- {
- "name": "storage-key",
- "value": "[parameters('storage_account_key')]"
- }
- ]
- },
- "template": {
- "containers": [
- {
- "image": "dapriosamples/hello-k8s-node:latest",
- "name": "hello-k8s-node",
- "resources": {
- "cpu": 0.5,
- "memory": "1Gi"
- }
- }
- ],
- "scale": {
- "minReplicas": 1,
- "maxReplicas": 1
- },
- "dapr": {
- "enabled": true,
- "appPort": 3000,
- "appId": "nodeapp",
- "components": [
- {
- "name": "statestore",
- "type": "state.azure.blobstorage",
- "version": "v1",
- "metadata": [
- {
- "name": "accountName",
- "value": "[parameters('storage_account_name')]"
- },
- {
- "name": "accountKey",
- "secretRef": "storage-key"
- },
- {
- "name": "containerName",
- "value": "[parameters('storage_container_name')]"
- }
- ]
- }
- ]
- }
- }
+ "type": "daprComponents",
+ "name": "statestore",
+ "apiVersion": "2022-01-01-preview",
+ "dependsOn": [
+ "[resourceId('Microsoft.App/managedEnvironments/', parameters('environment_name'))]"
+ ],
+ "properties": {
+ "componentType": "state.azure.blobstorage",
+ "version": "v1",
+ "ignoreErrors": false,
+ "initTimeout": "5s",
+ "secrets": [
+ {
+ "name": "storageaccountkey",
+ "value": "[listKeys(resourceId('Microsoft.Storage/storageAccounts/', parameters('storage_account_name')), '2021-09-01').keys[0].value]"
+ }
+ ],
+ "metadata": [
+ {
+ "name": "accountName",
+ "value": "[parameters('storage_account_name')]"
+ },
+ {
+ "name": "containerName",
+ "value": "[parameters('storage_container_name')]"
+ },
+ {
+ "name": "accountKey",
+ "secretRef": "storageaccountkey"
+ }
+ ],
+ "scopes": ["nodeapp"]
+ }
+ }
+ ]
+ },
+ {
+ "type": "Microsoft.App/containerApps",
+ "apiVersion": "2022-01-01-preview",
+ "name": "nodeapp",
+ "location": "[parameters('location')]",
+ "dependsOn": [
+ "[resourceId('Microsoft.App/managedEnvironments/', parameters('environment_name'))]"
+ ],
+ "properties": {
+ "managedEnvironmentId": "[resourceId('Microsoft.App/managedEnvironments/', parameters('environment_name'))]",
+ "configuration": {
+ "ingress": {
+ "external": true,
+ "targetPort": 3000
+ },
+ "dapr": {
+ "enabled": true,
+ "appId": "nodeapp",
+          "appProtocol": "http",
+ "appPort": 3000
+ }
+ },
+ "template": {
+ "containers": [
+ {
+ "image": "dapriosamples/hello-k8s-node:latest",
+ "name": "hello-k8s-node",
+ "resources": {
+ "cpu": 0.5,
+ "memory": "1.0Gi"
+ }
}
+ ],
+ "scale": {
+ "minReplicas": 1,
+ "maxReplicas": 1
+ }
}
- ]
+ }
+ },
+ {
+ "type": "Microsoft.App/containerApps",
+ "apiVersion": "2022-01-01-preview",
+ "name": "pythonapp",
+ "location": "[parameters('location')]",
+ "dependsOn": [
+ "[resourceId('Microsoft.App/managedEnvironments/', parameters('environment_name'))]",
+ "[resourceId('Microsoft.App/containerApps/', 'nodeapp')]"
+ ],
+ "properties": {
+ "managedEnvironmentId": "[resourceId('Microsoft.App/managedEnvironments/', parameters('environment_name'))]",
+ "configuration": {
+ "dapr": {
+ "enabled": true,
+ "appId": "pythonapp"
+ }
+ },
+ "template": {
+ "containers": [
+ {
+ "image": "dapriosamples/hello-k8s-python:latest",
+ "name": "hello-k8s-python",
+ "resources": {
+ "cpu": 0.5,
+ "memory": "1.0Gi"
+ }
+ }
+ ],
+ "scale": {
+ "minReplicas": 1,
+ "maxReplicas": 1
+ }
+ }
+ }
+ }
+ ]
}
```
Save the following file as *serviceapp.json*:
### Create Azure Bicep templates
-Create two Bicep templates.
-
-Each Bicep template contains a container app definition and a Dapr component definition.
-
-The following example shows how your Bicep template should look when configured for your Azure Blob Storage account.
+Create a Bicep template that deploys a Container Apps environment, including the associated Log Analytics workspace and Application Insights resource for distributed tracing, a Dapr component for the state store, and the two Dapr-enabled container apps.
-Save the following file as *serviceapp.bicep*:
+Save the following file as _hello-world.bicep_:
```bicep
-param location string = 'canadacentral'
param environment_name string
+param location string = 'canadacentral'
param storage_account_name string
-param storage_account_key string
param storage_container_name string
-resource nodeapp 'Microsoft.Web/containerapps@2021-03-01' = {
+var logAnalyticsWorkspaceName = 'logs-${environment_name}'
+var appInsightsName = 'appins-${environment_name}'
+
+resource logAnalyticsWorkspace 'Microsoft.OperationalInsights/workspaces@2020-03-01-preview' = {
+ name: logAnalyticsWorkspaceName
+ location: location
+ properties: any({
+ retentionInDays: 30
+ features: {
+ searchVersion: 1
+ }
+ sku: {
+ name: 'PerGB2018'
+ }
+ })
+}
+
+resource appInsights 'Microsoft.Insights/components@2020-02-02' = {
+ name: appInsightsName
+ location: location
+ kind: 'web'
+ properties: {
+ Application_Type: 'web'
+ WorkspaceResourceId: logAnalyticsWorkspace.id
+ }
+}
+
+resource environment 'Microsoft.App/managedEnvironments@2022-01-01-preview' = {
+ name: environment_name
+ location: location
+ properties: {
+ daprAIInstrumentationKey: reference(appInsights.id, '2020-02-02').InstrumentationKey
+ appLogsConfiguration: {
+ destination: 'log-analytics'
+ logAnalyticsConfiguration: {
+ customerId: reference(logAnalyticsWorkspace.id, '2020-03-01-preview').customerId
+ sharedKey: listKeys(logAnalyticsWorkspace.id, '2020-03-01-preview').primarySharedKey
+ }
+ }
+ }
+ resource daprComponent 'daprComponents@2022-01-01-preview' = {
+ name: 'statestore'
+ properties: {
+ componentType: 'state.azure.blobstorage'
+ version: 'v1'
+ ignoreErrors: false
+ initTimeout: '5s'
+ secrets: [
+ {
+ name: 'storageaccountkey'
+ value: listKeys(resourceId('Microsoft.Storage/storageAccounts/', storage_account_name), '2021-09-01').keys[0].value
+ }
+ ]
+ metadata: [
+ {
+ name: 'accountName'
+ value: storage_account_name
+ }
+ {
+ name: 'containerName'
+ value: storage_container_name
+ }
+ {
+ name: 'accountKey'
+ secretRef: 'storageaccountkey'
+ }
+ ]
+ scopes: [
+ 'nodeapp'
+ ]
+ }
+ }
+}
+
+resource nodeapp 'Microsoft.App/containerApps@2022-01-01-preview' = {
name: 'nodeapp'
- kind: 'containerapp'
  location: location
  properties: {
- kubeEnvironmentId: resourceId('Microsoft.Web/kubeEnvironments', environment_name)
+ managedEnvironmentId: environment.id
    configuration: {
      ingress: {
        external: true
        targetPort: 3000
      }
- secrets: [
- {
- name: 'storage-key'
- value: storage_account_key
- }
- ]
+ dapr: {
+ enabled: true
+ appId: 'nodeapp'
+ appProtocol: 'http'
+ appPort: 3000
+ }
    }
    template: {
      containers: [
resource nodeapp 'Microsoft.Web/containerapps@2021-03-01' = {
          name: 'hello-k8s-node'
          resources: {
            cpu: '0.5'
- memory: '1Gi'
+ memory: '1.0Gi'
} } ]
resource nodeapp 'Microsoft.Web/containerapps@2021-03-01' = {
minReplicas: 1 maxReplicas: 1 }
- dapr: {
- enabled: true
- appPort: 3000
- appId: 'nodeapp'
- components: [
- {
- name: 'statestore'
- type: 'state.azure.blobstorage'
- version: 'v1'
- metadata: [
- {
- name: 'accountName'
- value: storage_account_name
- }
- {
- name: 'accountKey'
- secretRef: 'storage-key'
- }
- {
- name: 'containerName'
- value: storage_container_name
- }
- ]
- }
- ]
- }
} } }
-```
--
-> [!NOTE]
-> Container Apps does not currently support the native [Dapr components schema](https://docs.dapr.io/operations/components/component-schema/). The above example uses the supported schema.
->
-> In a production-grade application, follow [secret management](https://docs.dapr.io/operations/components/component-secrets) instructions to securely manage your secrets.
--
-Save the following file as *clientapp.json*:
-
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-08-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "location": {
- "defaultValue": "canadacentral",
- "type": "String"
- },
- "environment_name": {
- "type": "String"
- }
- },
- "variables": {},
- "resources": [
- {
- "name": "pythonapp",
- "type": "Microsoft.Web/containerApps",
- "apiVersion": "2021-03-01",
- "kind": "containerapp",
- "location": "[parameters('location')]",
- "properties": {
- "kubeEnvironmentId": "[resourceId('Microsoft.Web/kubeEnvironments', parameters('environment_name'))]",
- "configuration": {},
- "template": {
- "containers": [
- {
- "image": "dapriosamples/hello-k8s-python:latest",
- "name": "hello-k8s-python",
- "resources": {
- "cpu": 0.5,
- "memory": "1Gi"
- }
- }
- ],
- "scale": {
- "minReplicas": 1,
- "maxReplicas": 1
- },
- "dapr": {
- "enabled": true,
- "appId": "pythonapp"
- }
- }
- }
- }
- ]
-}
-```
---
-Save the following file as *clientapp.bicep*:
-
-```bicep
-param location string = 'canadacentral'
-param environment_name string
-
-resource pythonapp 'Microsoft.Web/containerApps@2021-03-01' = {
+resource pythonapp 'Microsoft.App/containerApps@2022-01-01-preview' = {
   name: 'pythonapp'
-  kind: 'containerapp'
   location: location
   properties: {
-    kubeEnvironmentId: resourceId('Microsoft.Web/kubeEnvironments', environment_name)
-    configuration: {}
+    managedEnvironmentId: environment.id
+    configuration: {
+      dapr: {
+        enabled: true
+        appId: 'pythonapp'
+      }
+    }
     template: {
       containers: [
         {
           image: 'dapriosamples/hello-k8s-python:latest'
           name: 'hello-k8s-python'
           resources: {
             cpu: '0.5'
-            memory: '1Gi'
+            memory: '1.0Gi'
           }
         }
       ]
       scale: {
         minReplicas: 1
         maxReplicas: 1
       }
-      dapr: {
-        enabled: true
-        appId: 'pythonapp'
-      }
     }
   }
+  dependsOn: [
+    nodeapp
+  ]
 }
-
```

::: zone-end
-## Deploy the service application (HTTP web server)
+> [!NOTE]
+> Container Apps does not currently support the native [Dapr components schema](https://docs.dapr.io/operations/components/component-schema/). The above example uses the supported schema.
+
+## Deploy
::: zone pivot="container-apps-arm"
-Now deploy the service Container App. Navigate to the directory in which you stored the ARM template file and run the following command:
+Navigate to the directory in which you stored the ARM template file and run the following command:
# [Bash](#tab/bash)

```azurecli
az deployment group create \
  --resource-group "$RESOURCE_GROUP" \
- --template-file ./serviceapp.json \
+ --template-file ./hello-world.json \
  --parameters \
    environment_name="$CONTAINERAPPS_ENVIRONMENT" \
    location="$LOCATION" \
    storage_account_name="$STORAGE_ACCOUNT" \
- storage_account_key="$STORAGE_ACCOUNT_KEY" \
    storage_container_name="$STORAGE_ACCOUNT_CONTAINER"
```
$params = @{
    environment_name = $CONTAINERAPPS_ENVIRONMENT
    location = $LOCATION
    storage_account_name = $STORAGE_ACCOUNT
- storage_account_key = $STORAGE_ACCOUNT_KEY
    storage_container_name = $STORAGE_ACCOUNT_CONTAINER
}

New-AzResourceGroupDeployment `
    -ResourceGroupName $RESOURCE_GROUP `
    -TemplateParameterObject $params `
- -TemplateFile ./serviceapp.json `
- -SkipTemplateParameterPrompt
+ -TemplateFile ./hello-world.json `
+ -SkipTemplateParameterPrompt
```

::: zone-end

::: zone pivot="container-apps-bicep"
-Now deploy the service container. Navigate to the directory in which you stored the Bicep template file and run the following command:
+Navigate to the directory in which you stored the Bicep template file and run the following command:
A warning (BCP081) might be displayed. This warning has no effect on the successful deployment of the application.
A warning (BCP081) might be displayed. This warning has no effect on the success
```azurecli
az deployment group create \
  --resource-group "$RESOURCE_GROUP" \
- --template-file ./serviceapp.bicep \
+ --template-file ./hello-world.bicep \
  --parameters \
    environment_name="$CONTAINERAPPS_ENVIRONMENT" \
    location="$LOCATION" \
    storage_account_name="$STORAGE_ACCOUNT" \
- storage_account_key="$STORAGE_ACCOUNT_KEY" \
    storage_container_name="$STORAGE_ACCOUNT_CONTAINER"
```
$params = @{
    environment_name = $CONTAINERAPPS_ENVIRONMENT
    location = $LOCATION
    storage_account_name = $STORAGE_ACCOUNT
- storage_account_key = $STORAGE_ACCOUNT_KEY
    storage_container_name = $STORAGE_ACCOUNT_CONTAINER
}

New-AzResourceGroupDeployment `
    -ResourceGroupName $RESOURCE_GROUP `
    -TemplateParameterObject $params `
- -TemplateFile ./serviceapp.bicep `
- -SkipTemplateParameterPrompt
+ -TemplateFile ./hello-world.bicep `
+ -SkipTemplateParameterPrompt
```

--

::: zone-end

This command deploys:
-* the service (Node) app server on `targetPort: 3000` (the app port)
-* its accompanying Dapr sidecar configured with `"appId": "nodeapp",` and dapr `"appPort": 3000,` for service discovery and invocation.
-
-Your state store is configured with the `components` object of `"type": "state.azure.blobstorage"`, which enables the sidecar to persist state.
-
-## Deploy the client application (headless client)
-
-Run the following command to deploy the client container.
--
-# [Bash](#tab/bash)
-
-```azurecli
-az deployment group create --resource-group "$RESOURCE_GROUP" \
- --template-file ./clientapp.json \
- --parameters \
- environment_name="$CONTAINERAPPS_ENVIRONMENT" \
- location="$LOCATION"
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$params = @{
- environment_name = $CONTAINERAPPS_ENVIRONMENT
- location = $LOCATION
-}
-
-New-AzResourceGroupDeployment `
- -ResourceGroupName $RESOURCE_GROUP `
- -TemplateParameterObject $params `
- -TemplateFile ./clientapp.json `
- -SkipTemplateParameterPrompt
-```
---
-A warning (BCP081) might be displayed. This warning has no effect on the successful deployment of the application.
-
-# [Bash](#tab/bash)
-
-```azurecli
-az deployment group create --resource-group "$RESOURCE_GROUP" \
- --template-file ./clientapp.bicep \
- --parameters \
- environment_name="$CONTAINERAPPS_ENVIRONMENT" \
- location="$LOCATION"
-```
-
-# [PowerShell](#tab/powershell)
-
-```powershell
-$params = @{
- environment_name = $CONTAINERAPPS_ENVIRONMENT
- location = $LOCATION
-}
-
-New-AzResourceGroupDeployment `
- -ResourceGroupName $RESOURCE_GROUP `
- -TemplateParameterObject $params `
- -TemplateFile ./clientapp.bicep `
- -SkipTemplateParameterPrompt
-```
----
-This command deploys `pythonapp` that also runs with a Dapr sidecar that is used to look up and securely call the Dapr sidecar for `nodeapp`. As this app is headless there's no `targetPort` to start a server, nor is there a need to enable ingress.
+- the Container Apps environment and associated Log Analytics workspace for hosting the hello world Dapr solution
+- an Application Insights instance for Dapr distributed tracing
+- the `nodeapp` app server running on `targetPort: 3000` with Dapr enabled and configured using `"appId": "nodeapp"` and `"appPort": 3000`
+- the `daprComponents` object of `"type": "state.azure.blobstorage"` scoped for use by the `nodeapp` to store state
+- the headless `pythonapp` with no ingress and Dapr enabled, which calls the `nodeapp` service via Dapr service-to-service invocation
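To confirm that the deployment finished before you verify the data, you can query the provisioning state of one of the apps. The following is a quick sketch using the Azure CLI; the `--query` path assumes the `2022-01-01-preview` resource shape shown above:

```azurecli
# Check that the service app finished provisioning
az containerapp show \
  --name nodeapp \
  --resource-group "$RESOURCE_GROUP" \
  --query properties.provisioningState
```

A value of `Succeeded` indicates the app finished provisioning.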
## Verify the result
This command deploys `pythonapp` that also runs with a Dapr sidecar that is used
You can confirm that the services are working correctly by viewing data in your Azure Storage account.
-1. Open the [Azure portal](https://portal.azure.com) in your browser and navigate to your storage account.
+1. Open the [Azure portal](https://portal.azure.com) in your browser.
+
+1. Navigate to your storage account.
1. Select **Containers** from the menu on the left side.
Use the following command to view logs in bash or PowerShell.
# [Bash](#tab/bash)
+```azurecli
+LOG_ANALYTICS_WORKSPACE_CLIENT_ID=`az containerapp env show --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP --query properties.appLogsConfiguration.logAnalyticsConfiguration.customerId --out tsv`
+```
```azurecli
az monitor log-analytics query \
- --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
+ --workspace "$LOG_ANALYTICS_WORKSPACE_CLIENT_ID" \
--analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' and (Log_s contains 'persisted' or Log_s contains 'order') | project ContainerAppName_s, Log_s, TimeGenerated | take 5" \ --out table ``` # [PowerShell](#tab/powershell)
+```powershell
+$LOG_ANALYTICS_WORKSPACE_CLIENT_ID=(az containerapp env show --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP --query properties.appLogsConfiguration.logAnalyticsConfiguration.customerId --out tsv)
+```
```powershell
$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' and (Log_s contains 'persisted' or Log_s contains 'order') | project ContainerAppName_s, Log_s, TimeGenerated | take 5"

$queryResults.Results
```
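Log entries can take a few minutes to appear after deployment, so retry after a short wait if the query returns no rows. You can also bound the query window with the `--timespan` parameter of `az monitor log-analytics query` (a sketch; it assumes an ISO 8601 duration is accepted, here the last hour):

```azurecli
# Query only the most recent hour of container console logs
az monitor log-analytics query \
  --workspace "$LOG_ANALYTICS_WORKSPACE_CLIENT_ID" \
  --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' | take 5" \
  --timespan PT1H \
  --out table
```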
container-apps Microservices Dapr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/microservices-dapr.md
Previously updated : 11/02/2021 Last updated : 03/22/2022 ms.devlang: azurecli
Individual container apps are deployed to an Azure Container Apps environment. T
az containerapp env create \
  --name $CONTAINERAPPS_ENVIRONMENT \
  --resource-group $RESOURCE_GROUP \
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
--location "$LOCATION" ```
az containerapp env create \
az containerapp env create `
  --name $CONTAINERAPPS_ENVIRONMENT `
  --resource-group $RESOURCE_GROUP `
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
--location "$LOCATION" ```
New-AzStorageAccount -ResourceGroupName $RESOURCE_GROUP `
-Once your Azure Blob Storage account is created, the following values are needed for subsequent steps in this tutorial.
-
-* `storage_account_name` is the value of the `STORAGE_ACCOUNT` variable that you set previously.
-
-* `storage_container_name` is the value of the `STORAGE_ACCOUNT_CONTAINER` variable. Dapr creates a container with this name when it doesn't already exist in your Azure Storage account.
-
Get the storage account key with the following command:

# [Bash](#tab/bash)
$STORAGE_ACCOUNT_KEY=(Get-AzStorageAccountKey -ResourceGroupName $RESOURCE_GROUP
### Configure the state store component
-Create a config file named *components.yaml* with the properties that you sourced from the previous steps. This file helps enable your Dapr app to access your state store. The following example shows how your *components.yaml* file should look when configured for your Azure Blob Storage account:
+Create a config file named *statestore.yaml* with the properties that you sourced from the previous steps. This file helps enable your Dapr app to access your state store. The following example shows how your *statestore.yaml* file should look when configured for your Azure Blob Storage account:
```yaml
-# components.yaml for Azure Blob storage component
-- name: statestore
- type: state.azure.blobstorage
- version: v1
- metadata:
- # Note that in a production scenario, account keys and secrets
- # should be securely stored. For more information, see
- # https://docs.dapr.io/operations/components/component-secrets
- - name: accountName
- secretRef: storage-account-name
- - name: accountKey
- secretRef: storage-account-key
- - name: containerName
- value: mycontainer
+# statestore.yaml for Azure Blob storage component
+componentType: state.azure.blobstorage
+version: v1
+metadata:
+- name: accountName
+ value: "<STORAGE_ACCOUNT>"
+- name: accountKey
+ secretRef: account-key
+- name: containerName
+ value: mycontainer
+secrets:
+- name: account-key
+ value: "<STORAGE_ACCOUNT_KEY>"
+scopes:
+- nodeapp
```
-To use this file, make sure to replace the value of `containerName` with your own value if you have changed `STORAGE_ACCOUNT_CONTAINER` variable from its original value, `mycontainer`.
+To use this file, update the placeholders:
+
+- Replace `<STORAGE_ACCOUNT>` with the value of the `STORAGE_ACCOUNT` variable you defined. To obtain its value, run the following command:
+ ```azurecli
+ echo $STORAGE_ACCOUNT
+ ```
+- Replace `<STORAGE_ACCOUNT_KEY>` with the storage account key. To obtain its value, run the following command:
+ ```azurecli
+ echo $STORAGE_ACCOUNT_KEY
+ ```
+
+If you've changed the `STORAGE_ACCOUNT_CONTAINER` variable from its original value, `mycontainer`, replace the value of `containerName` with your own value.
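If you're working in bash, you can substitute both placeholders in one step instead of editing the file by hand. This is a convenience sketch; it assumes GNU sed and that both shell variables are already set:

```azurecli
# Replace the <STORAGE_ACCOUNT> and <STORAGE_ACCOUNT_KEY> placeholders in place
sed -i \
  -e "s|<STORAGE_ACCOUNT>|$STORAGE_ACCOUNT|" \
  -e "s|<STORAGE_ACCOUNT_KEY>|$STORAGE_ACCOUNT_KEY|" \
  statestore.yaml
```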
> [!NOTE] > Container Apps does not currently support the native [Dapr components schema](https://docs.dapr.io/operations/components/component-schema/). The above example uses the supported schema.
+Navigate to the directory in which you stored the *statestore.yaml* file and run the following command to configure the Dapr component in the Container Apps environment.
-## Deploy the service application (HTTP web server)
+# [Bash](#tab/bash)
+
+```azurecli
+az containerapp env dapr-component set \
+ --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP \
+ --dapr-component-name statestore \
+ --yaml statestore.yaml
+```
-Navigate to the directory in which you stored the *components.yaml* file and run the following command to deploy the service container app.
+# [PowerShell](#tab/powershell)
+
+```powershell
+az containerapp env dapr-component set `
+ --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP `
+ --dapr-component-name statestore `
+ --yaml statestore.yaml
+```
+++
+Your state store is configured using the Dapr component described in *statestore.yaml*. The component is scoped to a container app named `nodeapp` and is not available to other container apps.
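You can optionally confirm that the component was registered in the environment before deploying the apps; for example (a sketch; the `dapr-component` subcommands assume a recent version of the `containerapp` CLI extension):

```azurecli
# List the Dapr components registered in the environment
az containerapp env dapr-component list \
  --name $CONTAINERAPPS_ENVIRONMENT \
  --resource-group $RESOURCE_GROUP \
  --output table
```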
+
+## Deploy the service application (HTTP web server)
# [Bash](#tab/bash)
az containerapp create \
  --max-replicas 1 \
  --enable-dapr \
  --dapr-app-port 3000 \
- --dapr-app-id nodeapp \
- --secrets "storage-account-name=${STORAGE_ACCOUNT},storage-account-key=${STORAGE_ACCOUNT_KEY}" \
- --dapr-components ./components.yaml
+ --dapr-app-id nodeapp
```

# [PowerShell](#tab/powershell)
az containerapp create `
  --max-replicas 1 `
  --enable-dapr `
  --dapr-app-port 3000 `
- --dapr-app-id nodeapp `
- --secrets "storage-account-name=${STORAGE_ACCOUNT},storage-account-key=${STORAGE_ACCOUNT_KEY}" `
- --dapr-components ./components.yaml
+ --dapr-app-id nodeapp
```
This command deploys:
* the service (Node) app server on `--target-port 3000` (the app port)
* its accompanying Dapr sidecar configured with `--dapr-app-id nodeapp` and `--dapr-app-port 3000` for service discovery and invocation
-Your state store is configured using `--dapr-components ./components.yaml`, which enables the sidecar to persist state.
--

## Deploy the client application (headless client)

Run the following command to deploy the client container app.
Use the following CLI command to view logs on the command line.
# [Bash](#tab/bash)

```azurecli
+LOG_ANALYTICS_WORKSPACE_CLIENT_ID=`az containerapp env show --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP --query properties.appLogsConfiguration.logAnalyticsConfiguration.customerId --out tsv`
+
az monitor log-analytics query \
  --workspace $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
  --analytics-query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' and (Log_s contains 'persisted' or Log_s contains 'order') | project ContainerAppName_s, Log_s, TimeGenerated | take 5" \
az monitor log-analytics query \
# [PowerShell](#tab/powershell)

```powershell
+$LOG_ANALYTICS_WORKSPACE_CLIENT_ID=(az containerapp env show --name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP --query properties.appLogsConfiguration.logAnalyticsConfiguration.customerId --out tsv)
+
$queryResults = Invoke-AzOperationalInsightsQuery -WorkspaceId $LOG_ANALYTICS_WORKSPACE_CLIENT_ID -Query "ContainerAppConsoleLogs_CL | where ContainerAppName_s == 'nodeapp' and (Log_s contains 'persisted' or Log_s contains 'order') | project ContainerAppName_s, Log_s, TimeGenerated | take 5"

$queryResults.Results
```
container-apps Revisions Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/revisions-manage.md
Activate a revision by using `az containerapp revision activate`.
```azurecli
az containerapp revision activate \
- --name <REVISION_NAME> \
- --app <CONTAINER_APP_NAME> \
+ --revision <REVISION_NAME> \
+ --name <CONTAINER_APP_NAME> \
  --resource-group <RESOURCE_GROUP_NAME>
```
az containerapp revision activate \
```powershell
az containerapp revision activate `
- --name <REVISION_NAME> `
- --app <CONTAINER_APP_NAME> `
+ --revision <REVISION_NAME> `
+ --name <CONTAINER_APP_NAME> `
  --resource-group <RESOURCE_GROUP_NAME>
```
Deactivate revisions that are no longer in use with `az container app revision d
```azurecli
az containerapp revision deactivate \
- --name <REVISION_NAME> \
- --app <CONTAINER_APP_NAME> \
+ --revision <REVISION_NAME> \
+ --name <CONTAINER_APP_NAME> \
  --resource-group <RESOURCE_GROUP_NAME>
```
az containerapp revision deactivate \
```powershell
az containerapp revision deactivate `
- --name <REVISION_NAME> `
- --app <CONTAINER_APP_NAME> `
+ --revision <REVISION_NAME> `
+ --name <CONTAINER_APP_NAME> `
  --resource-group <RESOURCE_GROUP_NAME>
```
All existing container apps revisions will not have access to this secret until
```azurecli
az containerapp revision restart \
- --name <REVISION_NAME> \
- --app <APPLICATION_NAME> \
+ --revision <REVISION_NAME> \
+ --name <APPLICATION_NAME> \
  --resource-group <RESOURCE_GROUP_NAME>
```
az containerapp revision restart \
```powershell
az containerapp revision restart `
- --name <REVISION_NAME> `
- --app <APPLICATION_NAME> `
+ --revision <REVISION_NAME> `
+ --name <APPLICATION_NAME> `
  --resource-group <RESOURCE_GROUP_NAME>
```
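After activating, deactivating, or restarting a revision, you can list all revisions to check their current states; for example (a quick sketch using the same placeholders as above):

```azurecli
# List revisions and their states for a container app
az containerapp revision list \
  --name <CONTAINER_APP_NAME> \
  --resource-group <RESOURCE_GROUP_NAME> \
  --output table
```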
container-apps Vnet Custom https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/vnet-custom.md
Finally, create the Container Apps environment with the VNET and subnets.
az containerapp env create \
  --name $CONTAINERAPPS_ENVIRONMENT \
  --resource-group $RESOURCE_GROUP \
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID \
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET \
--location "$LOCATION" \ --app-subnet-resource-id $APP_SUBNET \ --controlplane-subnet-resource-id $CONTROL_PLANE_SUBNET
az containerapp env create \
az containerapp env create `
  --name $CONTAINERAPPS_ENVIRONMENT `
  --resource-group $RESOURCE_GROUP `
- --logs-workspace-id $LOG_ANALYTICS_WORKSPACE_CLIENT_ID `
- --logs-workspace-key $LOG_ANALYTICS_WORKSPACE_CLIENT_SECRET `
--location "$LOCATION" ` --app-subnet-resource-id $APP_SUBNET ` --controlplane-subnet-resource-id $CONTROL_PLANE_SUBNET
az containerapp env create `
> [!NOTE] > As you call `az containerapp create` to create the container app inside your environment, make sure the value for the `--image` parameter is in lower case.
-The following table describes the parameters used in for `containerapp env create`.
+The following table describes the parameters used in `containerapp env create`.
| Parameter | Description |
|---|---|
| `name` | Name of the container apps environment. |
| `resource-group` | Name of the resource group. |
-| `logs-workspace-id` | The ID of the Log Analytics workspace. |
-| `logs-workspace-key` | The Log Analytics client secret. |
| `location` | The Azure location where the environment is to deploy. |
| `app-subnet-resource-id` | The resource ID of a subnet where containers are injected into the container app. This subnet must be in the same VNET as the subnet defined in `--control-plane-subnet-resource-id`. |
| `controlplane-subnet-resource-id` | The resource ID of a subnet for control plane infrastructure components. This subnet must be in the same VNET as the subnet defined in `--app-subnet-resource-id`. |
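The two subnet resource IDs can be captured into the variables used above. The following is a sketch; it assumes your VNET name is stored in `$VNET_NAME` and that the subnets are named `app-subnet` and `control-plane-subnet`:

```azurecli
# Capture the full resource IDs of the two subnets
APP_SUBNET=$(az network vnet subnet show \
  --resource-group $RESOURCE_GROUP \
  --vnet-name $VNET_NAME \
  --name app-subnet \
  --query id --output tsv)

CONTROL_PLANE_SUBNET=$(az network vnet subnet show \
  --resource-group $RESOURCE_GROUP \
  --vnet-name $VNET_NAME \
  --name control-plane-subnet \
  --query id --output tsv)
```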
cosmos-db Configure Synapse Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/configure-synapse-link.md
Serverless SQL pool allows you to query and analyze data in your Azure Cosmos DB
## <a id="analyze-with-powerbi"></a>Use serverless SQL pool to analyze and visualize data in Power BI
-You can create a serverless SQL pool database and views over Synapse Link for Azure Cosmos DB. Later you can query the Azure Cosmos DB containers and then build a model with Power BI over those views to reflect that query. There is no performance or cost impact to your transactional workloads, and no complexity of managing ETL pipelines. You can use either [DirectQuery](/power-bi/connect-data/service-dataset-modes-understand#directquery-mode) or [import](/power-bi/connect-data/service-dataset-modes-understand#import-mode) modes. For more information, see the [Use Power BI and serverless Synapse SQL pool to analyze Azure Cosmos DB data with Synapse Link](synapse-link-power-bi.md) article.
+You can use the integrated BI experience in the Azure Cosmos DB portal to build BI dashboards using Synapse Link with just a few clicks. To learn more, see [how to build BI dashboards using Synapse Link](integrated-power-bi-synapse-link.md). This integrated experience creates simple T-SQL views in Synapse serverless SQL pools for your Cosmos DB containers. You can build BI dashboards over these views, which query your Azure Cosmos DB containers in real time using [DirectQuery](/power-bi/connect-data/service-dataset-modes-understand#directquery-mode), reflecting the latest changes to your data. There is no performance or cost impact to your transactional workloads, and no complexity of managing ETL pipelines.
+
+If you want to use advanced T-SQL views with joins across your containers, or to build BI dashboards in [import mode](/power-bi/connect-data/service-dataset-modes-understand#import-mode), see the [Use Power BI and serverless Synapse SQL pool to analyze Azure Cosmos DB data with Synapse Link](synapse-link-power-bi.md) article.
## Configure custom partitioning
cosmos-db How To Always Encrypted https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/how-to-always-encrypted.md
Title: Use client-side encryption with Always Encrypted for Azure Cosmos DB
description: Learn how to use client-side encryption with Always Encrypted for Azure Cosmos DB Previously updated : 01/26/2022 Last updated : 03/30/2022
-# Use client-side encryption with Always Encrypted for Azure Cosmos DB (Preview)
+# Use client-side encryption with Always Encrypted for Azure Cosmos DB
[!INCLUDE[appliesto-sql-api](includes/appliesto-sql-api.md)]
+> [!IMPORTANT]
+> A breaking change has been introduced with the 1.0 release of our encryption packages. If you created data encryption keys and encryption-enabled containers with prior versions, you will need to re-create your databases and containers after migrating your client code to 1.0 packages.
+
Always Encrypted is a feature designed to protect sensitive data, such as credit card numbers or national identification numbers (for example, U.S. social security numbers), stored in Azure Cosmos DB. Always Encrypted allows clients to encrypt sensitive data inside client applications and never reveal the encryption keys to the database.

Always Encrypted brings client-side encryption capabilities to Azure Cosmos DB. Encrypting your data client-side can be required in the following scenarios:
Always Encrypted brings client-side encryption capabilities to Azure Cosmos DB.
- **Protecting sensitive data that has specific confidentiality characteristics**: Always Encrypted allows clients to encrypt sensitive data inside their applications and never reveal the plain text data or encryption keys to the Azure Cosmos DB service.
- **Implementing per-property access control**: Because the encryption is controlled with keys that you own and manage from Azure Key Vault, you can apply access policies to control which sensitive properties each client has access to.
-> [!IMPORTANT]
-> Always Encrypted for Azure Cosmos DB is currently in preview. This preview version is provided without a Service Level Agreement and is not recommended for production workloads. For more information, see [Supplemental terms of use for Microsoft Azure previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
-## Prerequisites
-
-To start using the preview of Always Encrypted for Azure Cosmos DB, you can:
--- Use the 2.11.13.0 or higher version of [Azure Cosmos DB local emulator](local-emulator.md).-- Request the preview to be enabled on your Azure Cosmos DB account by filling [this form](https://ncv.microsoft.com/poTcF52I6N).-
-> [!TIP]
-> Do you have any feedback to share regarding the preview of Always Encrypted for Azure Cosmos DB? Reach out to [azurecosmosdbcse@service.microsoft.com](mailto:azurecosmosdbcse@service.microsoft.com).
-
## Concepts

Always Encrypted for Azure Cosmos DB introduces some new concepts that are involved in the configuration of your client-side encryption.
You can:
#### Customer-managed keys
-Before DEKs get stored in Azure Cosmos DB, they are wrapped by a customer-managed key (CMK). By controlling the wrapping and unwrapping of DEKs, CMKs effectively control the access to the data that's encrypted with their corresponding DEKs. CMK storage is designed as an extensible/plug-in model, with a default implementation that expects them to be stored in Azure Key Vault.
+Before DEKs get stored in Azure Cosmos DB, they are wrapped by a customer-managed key (CMK). By controlling the wrapping and unwrapping of DEKs, CMKs effectively control access to the data that's encrypted with their corresponding DEKs. CMK storage is designed to be extensible, with a default implementation that expects them to be stored in Azure Key Vault.
:::image type="content" source="./media/how-to-always-encrypted/encryption-keys.png" alt-text="Encryption keys" border="true":::
If you're using an existing Azure Key Vault instance, you can verify that these
> - In **.NET** with the [Microsoft.Azure.Cosmos.Encryption package](https://www.nuget.org/packages/Microsoft.Azure.Cosmos.Encryption).
> - In **Java** with the [azure.cosmos.encryption package](https://mvnrepository.com/artifact/com.azure/azure-cosmos-encryption).
-To use Always Encrypted, an instance of an `EncryptionKeyWrapProvider` must be attached to your Azure Cosmos DB SDK instance. This object is used to interact with the key store hosting your CMKs. The default key store provider for Azure Key Vault is named `AzureKeyVaultKeyWrapProvider`.
-
-The following snippets use the `DefaultAzureCredential` class to retrieve the Azure AD identity to use when accessing your Azure Key Vault instance. You can find examples of creating different kinds of `TokenCredential` classes:
+# [.NET](#tab/dotnet)
-- [In .NET](/dotnet/api/overview/azure/identity-readme#credential-classes)-- [In Java](/java/api/overview/azure/identity-readme#credential-classes)
+To use Always Encrypted, an instance of a `KeyResolver` must be attached to your Azure Cosmos DB SDK instance. This class, defined in the `Azure.Security.KeyVault.Keys.Cryptography` namespace, is used to interact with the key store hosting your CMKs.
-# [.NET](#tab/dotnet)
+The following snippets use the `DefaultAzureCredential` class to retrieve the Azure AD identity to use when accessing your Azure Key Vault instance. You can find examples of creating different kinds of `TokenCredential` classes [here](/dotnet/api/overview/azure/identity-readme#credential-classes).
> [!NOTE]
> You will need the additional [Azure.Identity package](https://www.nuget.org/packages/Azure.Identity/) to access the `TokenCredential` classes.

```csharp
var tokenCredential = new DefaultAzureCredential();
-var keyWrapProvider = new AzureKeyVaultKeyWrapProvider(tokenCredential);
+var keyResolver = new KeyResolver(tokenCredential);
var client = new CosmosClient("<connection-string>")
- .WithEncryption(keyStoreProvider);
+ .WithEncryption(keyResolver, KeyEncryptionKeyResolverName.AzureKeyVault);
```

# [Java](#tab/java)
+To use Always Encrypted, an instance of a `KeyEncryptionKeyClientBuilder` must be attached to your Azure Cosmos DB SDK instance. This class, defined in the `com.azure.security.keyvault.keys.cryptography` namespace, is used to interact with the key store hosting your CMKs.
+
+The following snippets use the `DefaultAzureCredential` class to retrieve the Azure AD identity to use when accessing your Azure Key Vault instance. You can find examples of creating different kinds of `TokenCredential` classes [here](/java/api/overview/azure/identity-readme#credential-classes).
+
```java
TokenCredential tokenCredential = new DefaultAzureCredentialBuilder()
    .build();
-AzureKeyVaultKeyStoreProvider encryptionKeyStoreProvider =
- new AzureKeyVaultKeyStoreProvider(tokenCredential);
+KeyEncryptionKeyClientBuilder keyEncryptionKeyClientBuilder =
+    new KeyEncryptionKeyClientBuilder().credential(tokenCredential);
CosmosAsyncClient client = new CosmosClientBuilder()
    .endpoint("<endpoint>")
    .key("<primary-key>")
    .buildAsyncClient();
-EncryptionAsyncCosmosClient encryptionClient =
- EncryptionAsyncCosmosClient.buildEncryptionAsyncClient(client, encryptionKeyStoreProvider);
+CosmosEncryptionAsyncClient cosmosEncryptionAsyncClient =
+ new CosmosEncryptionClientBuilder().cosmosAsyncClient(client).keyEncryptionKeyResolver(keyEncryptionKeyClientBuilder)
+ .keyEncryptionKeyResolverName(CosmosEncryptionClientBuilder.KEY_RESOLVER_NAME_AZURE_KEY_VAULT).buildAsyncClient();
```

## Create a data encryption key
-Before data can be encrypted in a container, a [data encryption key](#data-encryption-keys) must be created in the parent database. This operation is done by calling the `CreateClientEncryptionKeyAsync` method and passing:
+Before data can be encrypted in a container, a [data encryption key](#data-encryption-keys) must be created in the parent database.
+
+# [.NET](#tab/dotnet)
+
+Creating a new data encryption key is done by calling the `CreateClientEncryptionKeyAsync` method and passing:
- A string identifier that will uniquely identify the key in the database.
- The encryption algorithm intended to be used with the key. Only one algorithm is currently supported.
-- The key identifier of the [CMK](#customer-managed-keys) stored in Azure Key Vault. This parameter is passed in a generic `EncryptionKeyWrapMetadata` object where the `name` can be any friendly name you want, and the `value` must be the key identifier.
-
-# [.NET](#tab/dotnet)
+- The key identifier of the [CMK](#customer-managed-keys) stored in Azure Key Vault. This parameter is passed in a generic `EncryptionKeyWrapMetadata` object where:
+ - The `type` defines the type of key resolver (for example, Azure Key Vault).
+ - The `name` can be any friendly name you want.
+ - The `value` must be the key identifier.
+ - The `algorithm` defines which algorithm shall be used to wrap the key encryption key with the customer-managed key.
```csharp
var database = client.GetDatabase("my-database");

await database.CreateClientEncryptionKeyAsync(
    "my-key",
- DataEncryptionKeyAlgorithm.AeadAes256CbcHmacSha256,
+ DataEncryptionAlgorithm.AeadAes256CbcHmacSha256,
new EncryptionKeyWrapMetadata(
- keyWrapProvider.ProviderName,
+ KeyEncryptionKeyResolverName.AzureKeyVault,
"akvKey",
- "https://<my-key-vault>.vault.azure.net/keys/<key>/<version>"));
+ "https://<my-key-vault>.vault.azure.net/keys/<key>/<version>",
+ EncryptionAlgorithm.RsaOaep.ToString()));
```

# [Java](#tab/java)
+Creating a new data encryption key is done by calling the `createClientEncryptionKey` method and passing:
+
+- A string identifier that will uniquely identify the key in the database.
+- The encryption algorithm intended to be used with the key. Only one algorithm is currently supported.
+- The key identifier of the [CMK](#customer-managed-keys) stored in Azure Key Vault. This parameter is passed in a generic `EncryptionKeyWrapMetadata` object where:
+ - The `type` defines the type of key resolver (for example, Azure Key Vault).
+ - The `name` can be any friendly name you want.
+ - The `value` must be the key identifier.
+ - The `algorithm` defines which algorithm shall be used to wrap the key encryption key with the customer-managed key.
+
```java
-EncryptionCosmosAsyncDatabase database =
- client.getEncryptedCosmosAsyncDatabase("my-database");
+CosmosEncryptionAsyncDatabase database =
+ cosmosEncryptionAsyncClient.getCosmosEncryptionAsyncDatabase("my-database");
+EncryptionKeyWrapMetadata metadata = new EncryptionKeyWrapMetadata(
+ cosmosEncryptionAsyncClient.getKeyEncryptionKeyResolverName(),
+ "akvKey",
+ "https://<my-key-vault>.vault.azure.net/keys/<key>/<version>",
+ EncryptionAlgorithm.RSA_OAEP.toString());
database.createClientEncryptionKey( "my-key",
- CosmosEncryptionAlgorithm.AEAES_256_CBC_HMAC_SHA_256,
- new EncryptionKeyWrapMetadata(
- "akvKey",
- "https://<my-key-vault>.vault.azure.net/keys/<key>/<version>"));
+ CosmosEncryptionAlgorithm.AEAD_AES_256_CBC_HMAC_SHA256.getName(),
+ metadata);
```
await database.DefineContainer("my-container", "/partition-key")
```java
ClientEncryptionIncludedPath path1 = new ClientEncryptionIncludedPath();
-path1.clientEncryptionKeyId = "my-key":
-path1.path = "/property1";
-path1.encryptionType = CosmosEncryptionType.DETERMINISTIC;
-path1.encryptionAlgorithm = CosmosEncryptionAlgorithm.AEAES_256_CBC_HMAC_SHA_256;
+path1.setClientEncryptionKeyId("my-key");
+path1.setPath("/property1");
+path1.setEncryptionType(CosmosEncryptionType.DETERMINISTIC.getName());
+path1.setEncryptionAlgorithm(CosmosEncryptionAlgorithm.AEAD_AES_256_CBC_HMAC_SHA256.getName());
ClientEncryptionIncludedPath path2 = new ClientEncryptionIncludedPath();
-path2.clientEncryptionKeyId = "my-key":
-path2.path = "/property2";
-path2.encryptionType = CosmosEncryptionType.RANDOMIZED;
-path2.encryptionAlgorithm = CosmosEncryptionAlgorithm.AEAES_256_CBC_HMAC_SHA_256;
+path2.setClientEncryptionKeyId("my-key");
+path2.setPath("/property2");
+path2.setEncryptionType(CosmosEncryptionType.RANDOMIZED.getName());
+path2.setEncryptionAlgorithm(CosmosEncryptionAlgorithm.AEAD_AES_256_CBC_HMAC_SHA256.getName());
List<ClientEncryptionIncludedPath> paths = new ArrayList<>();
paths.add(path1);
Note that the resolution of encrypted properties and their subsequent decryption
### Filter queries on encrypted properties
-When writing queries that filter on encrypted properties, the `AddParameterAsync` method must be used to pass the value of the query parameter. This method takes the following arguments:
+When writing queries that filter on encrypted properties, a specific method must be used to pass the value of the query parameter. This method takes the following arguments:
- The name of the query parameter.
- The value to use in the query.
await queryDefinition.AddParameterAsync(
# [Java](#tab/java)

```java
-EncryptionSqlQuerySpec encryptionSqlQuerySpec = new EncryptionSqlQuerySpec(
- new SqlQuerySpec("SELECT * FROM c where c.property1 = @Property1"), container);
-encryptionSqlQuerySpec.addEncryptionParameterAsync(
- new SqlParameter("@Property1", 1234), "/property1")
+SqlQuerySpecWithEncryption sqlQuerySpecWithEncryption = new SqlQuerySpecWithEncryption(
+ new SqlQuerySpec("SELECT * FROM c where c.property1 = @Property1"));
+sqlQuerySpecWithEncryption.addEncryptionParameter(
+ "/property1", new SqlParameter("@Property1", 1234))
```
You may want to "rotate" your CMK (that is, use a new CMK instead of the current
await database.RewrapClientEncryptionKeyAsync(
    "my-key",
    new EncryptionKeyWrapMetadata(
- keyWrapProvider.ProviderName,
+ KeyEncryptionKeyResolverName.AzureKeyVault,
"akvKey",
- " https://<my-key-vault>.vault.azure.net/keys/<new-key>/<version>"));
+ "https://<my-key-vault>.vault.azure.net/keys/<new-key>/<version>",
+ EncryptionAlgorithm.RsaOaep.ToString()));
```

# [Java](#tab/java)

```java
-database. rewrapClientEncryptionKey(
+EncryptionKeyWrapMetadata metadata = new EncryptionKeyWrapMetadata(
+ cosmosEncryptionAsyncClient.getKeyEncryptionKeyResolverName(),
+ "akvKey",
+ "https://<my-key-vault>.vault.azure.net/keys/<new-key>/<version>",
+ EncryptionAlgorithm.RSA_OAEP.toString());
+database.rewrapClientEncryptionKey(
"my-key",
- new EncryptionKeyWrapMetadata(
- "akvKey", " https://<my-key-vault>.vault.azure.net/keys/<new-key>/<version>"));
+ metadata);
```
+## DEK rotation
+
+Performing a rotation of a data encryption key isn't offered as a turnkey capability. This is because updating a DEK requires a scan of all containers where this key is used and a re-encryption of all properties encrypted with this key. This operation can only happen client-side, because the Azure Cosmos DB service never stores or accesses the plain text value of the DEK.
+
+In practice, a DEK rotation can be done by performing a data migration from the impacted containers to new ones. The new containers can be created the exact same way as the original ones. To help you with such a data migration, you can find [a standalone migration tool on GitHub](https://github.com/Azure/azure-cosmos-dotnet-v3/tree/master/Microsoft.Azure.Cosmos.Samples/Usage/ReEncryption).
+
+## Adding additional encrypted properties
+
+Adding additional encrypted properties to an existing encryption policy isn't supported, for the same reasons explained in the preceding section. This operation requires a full scan of the container to ensure that all instances of the properties are properly encrypted, which can only happen client-side. Just like a DEK rotation, adding encrypted properties can be done by performing a data migration to a new container with an appropriate encryption policy.
+
+If you have flexibility in the way new encrypted properties can be added from a schema standpoint, you can also leverage the schema-agnostic nature of Azure Cosmos DB. If you use a property defined in your encryption policy as a "property bag", you can add more properties below with no constraint. For example, let's imagine that `property1` is defined in your encryption policy and you initially write `property1.property2` in your documents. If, at a later stage, you need to add `property3` as an encrypted property, you can start writing `property1.property3` in your documents and the new property will automatically be encrypted as well. This approach doesn't require any data migration.
+
## Next steps

-- Get an overview of [secure access to data in Cosmos DB](secure-access-to-data.md).
-- Learn more about [customer-managed keys](how-to-setup-cmk.md)
+- Get an overview of [secure access to data in Azure Cosmos DB](secure-access-to-data.md).
+- Learn more about [customer-managed keys for encryption-at-rest](how-to-setup-cmk.md)
cosmos-db Integrated Power Bi Synapse Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/integrated-power-bi-synapse-link.md
Use the following steps to build a Power BI report from Azure Cosmos DB data in
## Next steps
-* Learn more about [Azure Synapse Link for Azure Cosmos DB](synapse-link.md)
* [Connect serverless SQL pool to Power BI Desktop & create report](../synapse-analytics/sql/tutorial-connect-power-bi-desktop.md#prerequisites)
+* [Use Power BI and serverless Synapse SQL pool to analyze Azure Cosmos DB data with Synapse Link](synapse-link-power-bi.md)
cosmos-db Secure Access To Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/secure-access-to-data.md
Last updated 08/30/2021-+ # Secure access to data in Azure Cosmos DB
CosmosClient client = new CosmosClient(accountEndpoint: "MyEndpoint", authKeyOrR
To add Azure Cosmos DB account reader access to your user account, have a subscription owner perform the following steps in the Azure portal.

1. Open the Azure portal, and select your Azure Cosmos DB account.
-2. Click the **Access control (IAM)** tab, and then click **+ Add role assignment**.
-3. In the **Add role assignment** pane, in the **Role** box, select **Cosmos DB Account Reader Role**.
-4. In the **Assign access to box**, select **Azure AD user, group, or application**.
-5. Select the user, group, or application in your directory to which you wish to grant access. You can search the directory by display name, email address, or object identifiers.
- The selected user, group, or application appears in the selected members list.
-6. Click **Save**.
+
+1. Select **Access control (IAM)**.
+
+1. Select **Add** > **Add role assignment** to open the **Add role assignment** page.
+
+1. Assign the following role. For detailed steps, see [Assign Azure roles using the Azure portal](../role-based-access-control/role-assignments-portal.md).
+
+ | Setting | Value |
+ | | |
+ | Role | Cosmos DB Account Reader |
+ | Assign access to | User, group, or service principal |
+ | Members | The user, group, or application in your directory to which you wish to grant access. |
+
+ ![Screenshot that shows Add role assignment page in Azure portal.](../../includes/role-based-access-control/media/add-role-assignment-page.png)
The entity can now read Azure Cosmos DB resources.
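The same assignment can also be scripted. The following Azure CLI sketch uses placeholder values for the principal's object ID and the Azure Cosmos DB account's resource ID; `Cosmos DB Account Reader Role` is the built-in role's full name:

```azurecli
# Grant the built-in reader role on a specific Azure Cosmos DB account
az role assignment create \
  --assignee-object-id <azure-ad-object-id> \
  --role "Cosmos DB Account Reader Role" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.DocumentDB/databaseAccounts/<account-name>"
```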
cosmos-db Create Real Time Weather Dashboard Powerbi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/create-real-time-weather-dashboard-powerbi.md
- Title: Create a real-time dashboard using Azure Cosmos DB, Azure Analysis Services, and Power BI
-description: Learn how to create a live weather dashboard in Power BI using Azure Cosmos DB and Azure Analysis Services.
----- Previously updated : 09/04/2019----
-# Create a real-time dashboard using Azure Cosmos DB and Power BI
-
-This article describes the steps required to create a live weather dashboard in Power BI using Azure Cosmos DB OLTP connector and Azure Analysis Services. The Power BI dashboard will display charts to show near real-time information about temperature and rainfall in a region.
-
-Another option is to create near real-time reports using [Azure Synapse Link for Azure Cosmos DB](../synapse-link.md). With Azure Synapse Link, you can connect Power BI to analyze your Azure Cosmos DB data, with no performance or cost impact to your transactional workloads, and no ETL pipelines. You can use either [DirectQuery](/power-bi/connect-dat).
--
-## Reporting scenarios
-
-There are multiple ways to set up reporting dashboards on data stored in Azure Cosmos DB. Depending on the staleness requirements and the size of the data, the following table describes the reporting setup for each scenario:
--
-|Scenario |Setup |
-|||
-|1. Generating real time reports on large data sets with aggregates | **Option 1:** [Power BI and Azure Synapse Link with DirectQuery](../synapse-link-power-bi.md)<br /> **Option 2:** [Power BI and Spark connector with DirectQuery + Azure Databricks + Azure Cosmos DB Spark connector.](https://github.com/Azure/azure-cosmosdb-spark/wiki/Connecting-Cosmos-DB-with-PowerBI-using-spark-and-databricks-premium)<br /> **Option 3:** Power BI and Azure Analysis Services connector with DirectQuery + Azure Analysis Services + Azure Databricks + Cosmos DB Spark connector. |
-|2. Generating real time reports on large data sets (>= 10 GB) | **Option 1:** [Power BI and Azure Synapse Link with DirectQuery](../synapse-link-power-bi.md)<br /> **Option 2:** [Power BI and Azure Analysis Services connector with DirectQuery + Azure Analysis Services](create-real-time-weather-dashboard-powerbi.md) |
-|3. Generating ad-hoc reports on large data sets (< 10 GB) | [Power BI Azure Cosmos DB connector with import mode and incremental refresh](create-real-time-weather-dashboard-powerbi.md) |
-|4. Generating ad-hoc reports with periodic refresh | [Power BI Azure Cosmos DB connector with import mode (Scheduled periodic refresh)](powerbi-visualize.md) |
-|5. Generating ad-hoc reports (no refresh) | [Power BI Azure Cosmos DB connector with import mode](powerbi-visualize.md) |
--
-Scenarios 4 and 5 can be easily set up [using the Azure Cosmos DB Power BI connector](powerbi-visualize.md). This article describes below the setups for scenarios 2 (Option 2) and 3.
-
-### Power BI with incremental refresh
-
-Power BI has a mode where incremental refresh can be configured. This mode eliminates the need to create and manage Azure Analysis Services partitions. Incremental refresh can be set up to filter only the latest updates in large datasets. However, this mode works only with Power BI Premium service that has a dataset limitation of 10 GB.
-
-### Power BI Azure Analysis connector + Azure Analysis Services
-
-Azure Analysis Services provides a fully managed platform as a service that hosts enterprise-grade data models in the cloud. Massive data sets can be loaded from Azure Cosmos DB into Azure Analysis Services. To avoid querying the entire dataset all the time, the datasets can be subdivided into Azure Analysis Services partitions, which can be refreshed independently at different frequencies.
-
-## Power BI incremental refresh
-
-### Ingest weather data into Azure Cosmos DB
-
-Set up an ingestion pipeline to load [weather data](https://catalog.data.gov/dataset?groups=climate5434&#topic=climate_navigation) to Azure Cosmos DB. You can set up an [Azure Data Factory (ADF)](../../data-factory/connector-azure-cosmos-db.md) job to periodically load the latest weather data into Azure Cosmos DB using the HTTP Source and Cosmos DB sink.
--
-### Connect Power BI to Azure Cosmos DB
-
-1. **Connect Azure Cosmos account to Power BI** - Open the Power BI Desktop and use the Azure Cosmos DB connector to select the right database and container.
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/cosmosdb-powerbi-connector.png" alt-text="Azure Cosmos DB Power BI connector":::
-
-1. **Configure incremental refresh** - Follow the steps in [incremental refresh with Power BI](/power-bi/service-premium-incremental-refresh) article to configure incremental refresh for the dataset. Add the **RangeStart** and **RangeEnd** parameters as shown in the following screenshot:
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/configure-range-parameters.png" alt-text="Configure range parameters":::
-
- Since the dataset has a Date column that is in text form, the **RangeStart** and **RangeEnd** parameters should be transformed to use the following filter. In the **Advanced Editor** pane, modify your query add the following text to filter the rows based on the RangeStart and RangeEnd parameters:
-
- ```
- #"Filtered Rows" = Table.SelectRows(#"Expanded Document", each [Document.date] > DateTime.ToText(RangeStart,"yyyy-MM-dd") and [Document.date] < DateTime.ToText(RangeEnd,"yyyy-MM-dd"))
- ```
-
- Depending on which column and data type is present in the source dataset, you can change the RangeStart and RangeEnd fields accordingly
-
-
- |Property |Data type |Filter |
- ||||
- |_ts | Numeric | [_ts] > Duration.TotalSeconds(RangeStart - #datetime(1970, 1, 1, 0, 0, 0)) and [_ts] < Duration.TotalSeconds(RangeEnd - #datetime(1970, 1, 1, 0, 0, 0))) |
- |Date (for example:- 2019-08-19) | String | [Document.date]> DateTime.ToText(RangeStart,"yyyy-MM-dd") and [Document.date] < DateTime.ToText(RangeEnd,"yyyy-MM-dd") |
- |Date (for example:- 2019-08-11 12:00:00) | String | [Document.date]> DateTime.ToText(RangeStart," yyyy-mm-dd HH:mm:ss") and [Document.date] < DateTime.ToText(RangeEnd,"yyyy-mm-dd HH:mm:ss") |
--
-1. **Define the refresh policy** - Define the refresh policy by navigating to the **Incremental refresh** tab on the **context** menu for the table. Set the refresh policy to refresh **every day** and store the last month data.
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/define-refresh-policy.png" alt-text="Define refresh policy":::
-
- Ignore the warning that says *the M query cannot be confirmed to be folded*. The Azure Cosmos DB connector folds filter queries.
-
-1. **Load the data and generate the reports** - By using the data you have loaded earlier, create the charts to report on temperature and rainfall.
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/load-data-generate-report.png" alt-text="Load data and generate report":::
-
-1. **Publish the report to Power BI premium** - Since incremental refresh is a Premium only feature, the publish dialog only allows selection of a workspace on Premium capacity. The first refresh may take longer to import the historical data. Subsequent data refreshes are much quicker because they use incremental refresh.
--
-## Power BI Azure Analysis connector + Azure Analysis Services
-
-### Ingest weather data into Azure Cosmos DB
-
-Set up an ingestion pipeline to load [weather data](https://catalog.data.gov/dataset?groups=climate5434&#topic=climate_navigation) to Azure Cosmos DB. You can set up an Azure Data Factory(ADF) job to periodically load the latest weather data into Azure Cosmos DB using the HTTP Source and Cosmos DB Sink.
-
-### Connect Azure Analysis Services to Azure Cosmos account
-
-1. **Create a new Azure Analysis Services cluster** - [Create an instance of Azure Analysis services](../../analysis-services/analysis-services-create-server.md) in the same region as the Azure Cosmos account and the Databricks cluster.
-
-1. **Create a new Analysis Services Tabular Project in Visual Studio** - [Install the SQL Server Data Tools (SSDT)](/sql/ssdt/download-sql-server-data-tools-ssdt) and create an Analysis Services Tabular project in Visual Studio.
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/create-analysis-services-project.png" alt-text="Create Azure Analysis Services project":::
-
- Choose the **Integrated Workspace** instance and the set the Compatibility Level to **SQL Server 2017 / Azure Analysis Services (1400)**
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/tabular-model-designer.png" alt-text="Azure Analysis Services tabular model designer":::
-
-1. **Add the Azure Cosmos DB data source** - Navigate to **Models**> **Data Sources** > **New Data Source** and add the Azure Cosmos DB data source as shown in the following screenshot:
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/add-data-source.png" alt-text="Add Cosmos DB data source":::
-
- Connect to Azure Cosmos DB by providing the **account URI**, **database name**, and the **container name**. You can now see the data from Azure Cosmos container is imported into Power BI.
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/preview-cosmosdb-data.png" alt-text="Preview Azure Cosmos DB data":::
-
-1. **Construct the Analysis Services model** - Open the query editor, perform the required operations to optimize the loaded data set:
-
- * Extract only the weather-related columns (temperature and rainfall)
-
- * Extract the month information from the table. This data is useful in creating partitions as described in the next section.
-
- * Convert the temperature columns to number
-
- The resulting M expression is as follows:
-
- ```
- let
- Source=#"DocumentDB/https://[ACCOUNTNAME].documents.azure.com:443/",
- #"Expanded Document" = Table.ExpandRecordColumn(Source, "Document", {"id", "_rid", "_self", "_etag", "fogground", "snowfall", "dust", "snowdepth", "mist", "drizzle", "hail", "fastest2minwindspeed", "thunder", "glaze", "snow", "ice", "fog", "temperaturemin", "fastest5secwindspeed", "freezingfog", "temperaturemax", "blowingsnow", "freezingrain", "rain", "highwind", "date", "precipitation", "fogheavy", "smokehaze", "avgwindspeed", "fastest2minwinddir", "fastest5secwinddir", "_attachments", "_ts"}, {"Document.id", "Document._rid", "Document._self", "Document._etag", "Document.fogground", "Document.snowfall", "Document.dust", "Document.snowdepth", "Document.mist", "Document.drizzle", "Document.hail", "Document.fastest2minwindspeed", "Document.thunder", "Document.glaze", "Document.snow", "Document.ice", "Document.fog", "Document.temperaturemin", "Document.fastest5secwindspeed", "Document.freezingfog", "Document.temperaturemax", "Document.blowingsnow", "Document.freezingrain", "Document.rain", "Document.highwind", "Document.date", "Document.precipitation", "Document.fogheavy", "Document.smokehaze", "Document.avgwindspeed", "Document.fastest2minwinddir", "Document.fastest5secwinddir", "Document._attachments", "Document._ts"}),
- #"Select Columns" = Table.SelectColumns(#"Expanded Document",{"Document.temperaturemin", "Document.temperaturemax", "Document.rain", "Document.date"}),
- #"Duplicated Column" = Table.DuplicateColumn(#"Select Columns", "Document.date", "Document.month"),
- #"Extracted First Characters" = Table.TransformColumns(#"Duplicated Column", {{"Document.month", each Text.Start(_, 7), type text}}),
- #"Sorted Rows" = Table.Sort(#"Extracted First Characters",{{"Document.date", Order.Ascending}}),
- #"Changed Type" = Table.TransformColumnTypes(#"Sorted Rows",{{"Document.temperaturemin", type number}, {"Document.temperaturemax", type number}}),
- #"Filtered Rows" = Table.SelectRows(#"Changed Type", each [Document.month] = "2019-07")
- in
- #"Filtered Rows"
- ```
-
- Additionally, change the data type of the temperature columns to Decimal to make sure that these values can be plotted in Power BI.
-
-1. **Create Azure Analysis partitions** - Create partitions in Azure Analysis Services to divide the dataset into logical partitions that can be refreshed independently and at different frequencies. In this example, you create two partitions that would divide the dataset into the most recent month's data and everything else.
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/create-analysis-services-partitions.png" alt-text="Create analysis services partitions":::
-
- Create the following two partitions in Azure Analysis
-
- * **Latest Month** - `#"Filtered Rows" = Table.SelectRows(#"Sorted Rows", each [Document.month] = "2019-07")`
- * **Historical** - `#"Filtered Rows" = Table.SelectRows(#"Sorted Rows", each [Document.month] <> "2019-07")`
-
-1. **Deploy the Model to the Azure Analysis Server** - Right click on the Azure Analysis Services project and choose **Deploy**. Add the server name in the **Deployment Server properties** pane.
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/analysis-services-deploy-model.png" alt-text="Deploy Azure Analysis Services model":::
-
-1. **Configure partition refreshes and merges** - Azure Analysis Services allows independent processing of partitions. Since we want the **Latest Month** partition to be constantly updated with the most recent data, set the refresh interval to 5 minutes. You can refresh the data by using the [REST API](../../analysis-services/analysis-services-async-refresh.md), [Azure automation](../../analysis-services/analysis-services-refresh-azure-automation.md), or with a [Logic App](../../analysis-services/analysis-services-refresh-logic-app.md). It's not required to refresh the data in historical partition. Additionally, you need to write some code to consolidate the latest month partition to the historical partition and create a new latest month partition.
-
-## Connect Power BI to Analysis Services
-
-1. **Connect to the Azure Analysis Server using the Azure Analysis Services database Connector** - Choose the **Live mode** and connect to the Azure Analysis Services instance as shown in the following screenshot:
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/analysis-services-get-data.png" alt-text="Get data from Azure Analysis Services":::
-
-1. **Load the data and generate reports** - By using the data you have loaded earlier, create charts to report on temperature and rainfall. Since you are creating a live connection, the queries should be executed on the data in the Azure Analysis Services model that you have deployed in the previous step. The temperature charts will be updated within five minutes after the new data is loaded into Azure Cosmos DB.
-
- :::image type="content" source="./media/create-real-time-weather-dashboard-powerbi/load-data-generate-report.png" alt-text="Load the data and generate reports":::
-
-## Next steps
-
-* To learn more about Power BI, see [Get started with Power BI](https://powerbi.microsoft.com/documentation/powerbi-service-get-started/).
-
-* [Connect Qlik Sense to Azure Cosmos DB and visualize your data](../visualize-qlik-sense.md)
cosmos-db Troubleshoot Sdk Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/troubleshoot-sdk-availability.md
Title: Diagnose and troubleshoot the availability of Azure Cosmos SDKs in multiregional environments
description: Learn all about the Azure Cosmos SDK availability behavior when operating in multi-regional environments. Previously updated : 02/18/2021 Last updated : 03/28/2022
If you **don't set a preferred region**, the SDK client defaults to the primary
> [!WARNING]
> The failover and availability logic described in this document can be disabled on the client configuration, which is not advised unless the user application is going to handle availability errors itself. This can be achieved by:
>
-> * Setting the [ConnectionPolicy.EnableEndpointRediscovery](/dotnet/api/microsoft.azure.documents.client.connectionpolicy.enableendpointdiscovery) property in .NET V2 SDK to false.
+> * Setting the [ConnectionPolicy.EnableEndpointDiscovery](/dotnet/api/microsoft.azure.documents.client.connectionpolicy.enableendpointdiscovery) property in .NET V2 SDK to false.
> * Setting the [CosmosClientOptions.LimitToEndpoint](/dotnet/api/microsoft.azure.cosmos.cosmosclientoptions.limittoendpoint) property in .NET V3 SDK to true.
> * Setting the [CosmosClientBuilder.endpointDiscoveryEnabled](/java/api/com.azure.cosmos.cosmosclientbuilder.endpointdiscoveryenabled) method in Java V4 SDK to false.
> * Setting the [CosmosClient.enable_endpoint_discovery](/python/api/azure-cosmos/azure.cosmos.cosmos_client.cosmosclient) parameter in Python SDK to false.
cosmos-db Synapse Link Power Bi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/synapse-link-power-bi.md
In this article, you learn how to build a serverless SQL pool database and views
With Azure Synapse Link, you can build near real-time dashboards in Power BI to analyze your Azure Cosmos DB data. There is no performance or cost impact to your transactional workloads, and no complexity of managing ETL pipelines. You can use either [DirectQuery](/power-bi/connect-data/service-dataset-modes-understand#directquery-mode) or [import](/power-bi/connect-data/service-dataset-modes-understand#import-mode) modes.
+> [!Note]
+> You can build Power BI dashboards with just a few clicks using the Azure Cosmos DB portal. For more information, see [Integrated Power BI experience in Azure Cosmos DB portal for Synapse Link enabled accounts](integrated-power-bi-synapse-link.md). This automatically creates T-SQL views in Synapse serverless SQL pools on your Cosmos DB containers. You can simply download the .pbids file that connects to these T-SQL views to start building your BI dashboards.
+ In this scenario, you will use dummy data about Surface product sales in a partner retail store. You will analyze the revenue per store based on the proximity to large households and the impact of advertising for a specific week. In this article, you create two views named **RetailSales** and **StoreDemographics** and a query between them. You can get the sample product data from this [GitHub](https://github.com/Azure-Samples/Synapse/tree/main/Notebooks/PySpark/Synapse%20Link%20for%20Cosmos%20DB%20samples/Retail/RetailData) repo. ## Prerequisites
After you choose these options, you should see a graph like the following screen
## Next steps
+[Integrated Power BI experience in Azure Cosmos DB portal for Synapse Link enabled accounts](integrated-power-bi-synapse-link.md)
+ [Use T-SQL to query Azure Cosmos DB data using Azure Synapse Link](../synapse-analytics/sql/query-cosmos-db-analytical-store.md)
-Use serverless SQL pool to [analyze Azure Open Datasets and visualize the results in Azure Synapse Studio](../synapse-analytics/sql/tutorial-data-analyst.md)
+Use serverless SQL pool to [analyze Azure Open Datasets and visualize the results in Azure Synapse Studio](../synapse-analytics/sql/tutorial-data-analyst.md)
cost-management-billing Programmatically Create Subscription Enterprise Agreement https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/manage/programmatically-create-subscription-enterprise-agreement.md
Previously updated : 03/22/2022 Last updated : 03/29/2022
Using one of the following methods, you'll create a subscription alias name. We
- Start with a letter and end with an alphanumeric character - Don't use periods
+An alias is used for simple substitution of a user-defined string instead of the subscription GUID. In other words, you can use it as a shortcut. You can learn more about aliases at [Alias - Create](/rest/api/subscription/2020-09-01/alias/create). In the following examples, `sampleAlias` is created, but you can use any string you like.
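For example, here's a hedged PowerShell sketch that calls the Alias - Create operation through `Invoke-AzRestMethod`; the billing scope, display name, and workload values are placeholder assumptions:

```powershell
# Create (PUT) a subscription alias named sampleAlias.
$body = @{
    properties = @{
        billingScope = "/providers/Microsoft.Billing/billingAccounts/<billing-account>/enrollmentAccounts/<enrollment-account>"
        displayName  = "Sample EA subscription"
        workload     = "Production"
    }
} | ConvertTo-Json -Depth 3

Invoke-AzRestMethod -Method PUT `
    -Path "/providers/Microsoft.Subscription/aliases/sampleAlias?api-version=2020-09-01" `
    -Payload $body
```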
### [REST](#tab/rest)
data-factory Data Factory Service Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/data-factory-service-identity.md
Last updated 01/27/2022 -+ # Managed identity for Azure Data Factory and Azure Synapse
You can find the managed identity information from Azure portal -> your data fac
The managed identity information will also show up when you create linked service, which supports managed identity authentication, like Azure Blob, Azure Data Lake Storage, Azure Key Vault, etc.
-When granting permission, in Azure resource's Access Control (IAM) tab -> Add role assignment -> Assign access to -> select Data Factory under System assigned managed identity -> select by factory name; or in general, you can use object ID or data factory name (as managed identity name) to find this identity. If you need to get managed identity's application ID, you can use PowerShell.
+To grant permissions, follow these steps. For detailed steps, see [Assign Azure roles using the Azure portal](../role-based-access-control/role-assignments-portal.md).
+
+1. Select **Access control (IAM)**.
+
+1. Select **Add** > **Add role assignment**.
+
+ :::image type="content" source="../../includes/role-based-access-control/media/add-role-assignment-menu-generic.png" alt-text="Screenshot that shows Access control (IAM) page with Add role assignment menu open.":::
+
+1. On the **Members** tab, select **Managed identity**, and then select **Select members**.
+
+1. Select your Azure subscription.
+
+1. Under **System-assigned managed identity**, select **Data Factory**, and then select a data factory. You can also use the object ID or data factory name (as the managed-identity name) to find this identity. To get the managed identity's application ID, use PowerShell, as shown in the sketch after these steps.
+
+1. On the **Review + assign** tab, select **Review + assign** to assign the role.
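If you'd rather script the assignment, the following Az PowerShell sketch shows the same idea; the role name and scope are illustrative assumptions, so substitute the role and resource your linked service needs:

```powershell
# Look up the data factory; its Identity property holds the system-assigned managed identity.
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName "<resource-group>" -Name "<data-factory-name>"
$principalId = $dataFactory.Identity.PrincipalId

# Optional: resolve the managed identity's application (client) ID.
# (The property is named ApplicationId on older Az.Resources versions.)
$appId = (Get-AzADServicePrincipal -ObjectId $principalId).AppId

# Assign an example role on an example scope.
New-AzRoleAssignment -ObjectId $principalId `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```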
#### Retrieve system-assigned managed identity using PowerShell
data-factory How To Schedule Azure Ssis Integration Runtime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/how-to-schedule-azure-ssis-integration-runtime.md
Last updated 02/15/2022 + # How to start and stop Azure-SSIS Integration Runtime on a schedule
If you create a third trigger that is scheduled to run daily at midnight and ass
:::image type="content" source="./media/how-to-schedule-azure-ssis-integration-runtime/adf-until-activity-on-demand-ssis-ir-open.png" alt-text="ADF Until Activity On-Demand SSIS IR Open":::
-7. Assign the managed identity for your ADF a **Contributor** role to itself, so Web activities in its pipelines can call REST API to start/stop Azure-SSIS IRs provisioned in it. On your ADF page in Azure portal, click **Access control (IAM)**, click **+ Add role assignment**, and then on **Add role assignment** blade, do the following actions:
+7. Assign the managed identity for your ADF a **Contributor** role to itself, so Web activities in its pipelines can call the REST API to start/stop Azure-SSIS IRs provisioned in it (a PowerShell alternative is sketched after these steps):
- 1. For **Role**, select **Contributor**.
- 2. For **Assign access to**, select **Azure AD user, group, or service principal**.
- 3. For **Select**, search for your ADF name and select it.
- 4. Click **Save**.
-
- :::image type="content" source="./media/how-to-schedule-azure-ssis-integration-runtime/adf-managed-identity-role-assignment.png" alt-text="ADF Managed Identity Role Assignment":::
+ 1. On your ADF page in the Azure portal, select **Access control (IAM)**.
+ 1. Select **Add** > **Add role assignment** to open the **Add role assignment** page.
+ 1. Assign the following role. For detailed steps, see [Assign Azure roles using the Azure portal](../role-based-access-control/role-assignments-portal.md).
+
+ | Setting | Value |
+ | | |
+ | Role | Contributor |
+ | Assign access to | User, group, or service principal |
+ | Members | Your ADF username |
+
+ :::image type="content" source="../../includes/role-based-access-control/media/add-role-assignment-page.png" alt-text="Screenshot that shows Add role assignment page in Azure portal.":::
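A rough Az PowerShell equivalent of these portal steps (granting the factory's managed identity **Contributor** on the factory itself) might look like this sketch:

```powershell
# Get the factory and assign Contributor to its own managed identity,
# scoped to the factory's resource ID.
$adf = Get-AzDataFactoryV2 -ResourceGroupName "<resource-group>" -Name "<data-factory-name>"
New-AzRoleAssignment -ObjectId $adf.Identity.PrincipalId `
    -RoleDefinitionName "Contributor" `
    -Scope $adf.DataFactoryId
```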
8. Validate your ADF and all pipeline settings by clicking **Validate all/Validate** on the factory/pipeline toolbar. Close **Factory/Pipeline Validation Output** by clicking the **>>** button.
databox-online Azure Stack Edge Gpu Enable Azure Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/databox-online/azure-stack-edge-gpu-enable-azure-monitor.md
Previously updated : 06/03/2021 Last updated : 03/28/2022
Take the following steps to enable Container Insights on your workspace.
`Set-HcsKubernetesAzureMonitorConfiguration -WorkspaceId <> -WorkspaceKey <>`
+ > [!NOTE]
+ > By default, this cmdlet configures the Azure public cloud. To configure a government cloud or non-public cloud, use the `AzureCloudDomainName` parameter, as shown in the sketch after this note.
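For example, here's a sketch that targets the Azure Government Log Analytics endpoint; the `opinsights.azure.us` domain value is an assumption, so confirm the correct domain for your cloud:

```powershell
# Configure Azure Monitor for a non-public cloud by passing the cloud's
# Log Analytics domain explicitly.
Set-HcsKubernetesAzureMonitorConfiguration -WorkspaceId "<workspace-id>" `
    -WorkspaceKey "<workspace-key>" `
    -AzureCloudDomainName "opinsights.azure.us"
```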
+ 1. After Azure Monitor is enabled, you should see logs in the Log Analytics workspace. To view the status of the Kubernetes cluster deployed on your device, go to **Azure Monitor > Insights > Containers**. For the environment option, select **All**. ![Metrics in Log Analytics workspace](media/azure-stack-edge-gpu-enable-azure-monitor/log-analytics-workspace-metrics-1.png)
ddos-protection Manage Ddos Protection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ddos-protection/manage-ddos-protection.md
If you want to delete a DDoS protection plan, you must first dissociate all virt
## Next steps
-To learn how to view and configure telemetry for your DDoS protection plan, continue to the tutorials.
+To learn how to view and configure telemetry for your DDoS protection plan, continue to the tutorials.
> [!div class="nextstepaction"]
-> [View and configure DDoS protection telemetry](telemetry.md)
+> [View and configure DDoS protection telemetry](telemetry.md)
defender-for-cloud Alerts Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/alerts-reference.md
Microsoft Defender for Containers provides security alerts on the cluster level
|--|--|:-:|--|
| **A file was downloaded and executed (Preview)**<br>(K8S.NODE_LinuxSuspiciousActivity) | Analysis of processes running within a container indicates that a file has been downloaded to the container, given execution privileges and then executed. | Execution | Medium |
| **A history file has been cleared (Preview)**<br>(K8S.NODE_HistoryFileCleared) | Analysis of processes running within a container indicates that the command history log file has been cleared. Attackers may do this to cover their tracks. The operation was performed by the specified user account. | DefenseEvasion | Medium |
+| **Abnormal activity of managed identity associated with Kubernetes (Preview)**<br>(K8S_AbnormalMiAcitivty) | Analysis of Azure Resource Manager operations detected an abnormal behavior of a managed identity used by an AKS addon. The detected activity isn't consistent with the behavior of the associated addon. While this activity can be legitimate, such behavior might indicate that the identity was gained by an attacker, possibly from a compromised container in the Kubernetes cluster. | Lateral Movement | Medium |
+| **Abnormal Kubernetes service account operation detected**<br>(K8S_ServiceAccountRareOperation) | Kubernetes audit log analysis detected abnormal behavior by a service account in your Kubernetes cluster. The service account was used for an operation which isn't common for this service account. While this activity can be legitimate, such behavior might indicate that the service account is being used for malicious purposes. | Lateral Movement, Credential Access | Medium |
| **An uncommon connection attempt detected (Preview)**<br>(K8S.NODE_SuspectConnection) | Analysis of processes running within a container detected an uncommon connection attempt utilizing a socks protocol. This is very rare in normal operations, but a known technique for attackers attempting to bypass network-layer detections. | Execution, Exfiltration, Exploitation | Medium |
| **Anomalous pod deployment (Preview)**<br>(K8S_AnomalousPodDeployment) | Kubernetes audit log analysis detected pod deployment which is anomalous based on previous pod deployment activity. This activity is considered an anomaly when taking into account how the different features seen in the deployment operation are in relation to one another. The features monitored include the container image registry used, the account performing the deployment, day of the week, how often this account performs pod deployments, user agent used in the operation, whether this is a namespace to which pod deployments often occur, and other features. Top contributing reasons for raising this alert as anomalous activity are detailed under the alert's extended properties. | Execution | Medium |
| **Attempt to stop apt-daily-upgrade.timer service detected (Preview)**<br>(K8S.NODE_TimerServiceDisabled) | Analysis of host/device data detected an attempt to stop apt-daily-upgrade.timer service. Attackers have been observed stopping this service to download malicious files and grant execution privileges for their attacks. This activity can also happen if the service is updated through normal administrative actions. | DefenseEvasion | Informational |
defender-for-cloud Defender For Containers Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/defender-for-containers-introduction.md
Defender for Cloud filters, and classifies findings from the scanner. When an im
:::image type="content" source="./media/defender-for-containers/recommendation-acr-images-with-vulnerabilities.png" alt-text="Sample Microsoft Defender for Cloud recommendation about vulnerabilities discovered in Azure Container Registry (ACR) hosted images." lightbox="./media/defender-for-containers/recommendation-acr-images-with-vulnerabilities.png":::
-### View vulnerabilities for running images
+### View vulnerabilities for running images
-Defender for Containers expands on the registry scanning features by introducing the **preview feature** of run-time visibility of vulnerabilities powered by the Defender profile, or extension.
-
-> [!NOTE]
-> There's no Defender profile for Windows, it's only available on Linux OS.
-
-The new recommendation, **Running container images should have vulnerability findings resolved**, only shows vulnerabilities for running images, and relies on the Defender security profile, or extension to discover which images are currently running. This recommendation groups running images that have vulnerabilities, and provides details about the issues discovered, and how to remediate them. The Defender profile, or extension is used to gain visibility into vulnerable containers that are active.
-
-This recommendation shows running images, and their vulnerabilities based on ACR image. Images that are deployed from a non ACR registry, won't be scanned, and will appear under the Not applicable tab.
+The recommendation **Running container images should have vulnerability findings resolved** shows vulnerabilities for running images by using the scan results from ACR registries and information on running images from the Defender security profile/extension. Images that are deployed from a non-ACR registry will appear under the **Not applicable** tab.
:::image type="content" source="media/defender-for-containers/running-image-vulnerabilities-recommendation.png" alt-text="Screenshot showing where the recommendation is viewable" lightbox="media/defender-for-containers/running-image-vulnerabilities-recommendation-expanded.png":::
+> [!NOTE]
+> This recommendation is currently supported for Linux containers only, as there's no Defender profile/extension for Windows.
+>
## Run-time protection for Kubernetes nodes and clusters Defender for Cloud provides real-time threat protection for your containerized environments and generates alerts for suspicious activities. You can use this information to quickly remediate security issues and improve the security of your containers.
defender-for-cloud Defender For Sql Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/defender-for-sql-introduction.md
The focus of **Microsoft Defender for SQL on machines** is obviously security. B
The service has a split architecture to balance data uploading and speed with performance:

-- some of our detectors run on the machine for real-time speed advantages
-- others run in the cloud to spare the machine from heavy computational loads
+- Some of our detectors, including an [extended events trace](../azure-sql/database/xevent-db-diff-from-svr.md) named `SQLAdvancedThreatProtectionTraffic`, run on the machine for real-time speed advantages.
+- Other detectors run in the cloud to spare the machine from heavy computational loads.
Lab tests of our solution, comparing it against benchmark loads, showed CPU usage averaging 3% for peak slices. An analysis of the telemetry for our current users shows a negligible impact on CPU and memory usage.
defender-for-cloud Export To Siem https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/export-to-siem.md
description: Learn how to stream your security alerts to Microsoft Sentinel, thi
Previously updated : 11/09/2021 Last updated : 03/29/2022 # Stream alerts to a SIEM, SOAR, or IT Service Management solution
Microsoft Sentinel includes built-in connectors for Microsoft Defender for Cloud
- [Stream alerts to Microsoft Sentinel at the subscription level](../sentinel/connect-azure-security-center.md) - [Connect all subscriptions in your tenant to Microsoft Sentinel](https://techcommunity.microsoft.com/t5/azure-sentinel/azure-security-center-auto-connect-to-sentinel/ba-p/1387539)
-When you connect Defender for Cloud to Microsoft Sentinel, the status of Defender for Cloud alerts that get ingested into Microsoft Sentinel is synchronized between the two services. So, for example, when an alert is closed in Defender for Cloud, that alert will display as closed in Microsoft Sentinel as well. Changing the status of an alert in Defender for Cloud "won't"* affect the status of any Microsoft Sentinel **incidents** that contain the synchronized Microsoft Sentinel alert, only that of the synchronized alert itself.
+When you connect Defender for Cloud to Microsoft Sentinel, the status of Defender for Cloud alerts that get ingested into Microsoft Sentinel is synchronized between the two services. So, for example, when an alert is closed in Defender for Cloud, that alert is also shown as closed in Microsoft Sentinel. If you change the status of an alert in Defender for Cloud, the status of the alert in Microsoft Sentinel is also updated, but the statuses of any Microsoft Sentinel **incidents** that contain the synchronized Microsoft Sentinel alert aren't updated.
-Enabling the preview feature, **bi-directional alert synchronization**, will automatically sync the status of the original Defender for Cloud alerts with Microsoft Sentinel incidents that contain the copies of those Defender for Cloud alerts. So, for example, when a Microsoft Sentinel incident containing a Defender for Cloud alert is closed, Defender for Cloud will automatically close the corresponding original alert.
+You can enable the preview feature **bi-directional alert synchronization** to automatically sync the status of the original Defender for Cloud alerts with Microsoft Sentinel incidents that contain the copies of those Defender for Cloud alerts. So, for example, when a Microsoft Sentinel incident that contains a Defender for Cloud alert is closed, Defender for Cloud automatically closes the corresponding original alert.
Learn more in [Connect alerts from Microsoft Defender for Cloud](../sentinel/connect-azure-security-center.md).
Learn more in [Connect alerts from Microsoft Defender for Cloud](../sentinel/con
### Configure ingestion of all audit logs into Microsoft Sentinel

Another alternative for investigating Defender for Cloud alerts in Microsoft Sentinel is to stream your audit logs into Microsoft Sentinel:
- - [Connect Windows security events](../sentinel/connect-windows-security-events.md)
- - [Collect data from Linux-based sources using Syslog](../sentinel/connect-syslog.md)
- - [Connect data from Azure Activity log](../sentinel/data-connectors-reference.md#azure-activity)
+- [Connect Windows security events](../sentinel/connect-windows-security-events.md)
+- [Collect data from Linux-based sources using Syslog](../sentinel/connect-syslog.md)
+- [Connect data from Azure Activity log](../sentinel/data-connectors-reference.md#azure-activity)
> [!TIP]
-> Microsoft Sentinel is billed based on the volume of data ingested for analysis in Microsoft Sentinel and stored in the Azure Monitor Log Analytics workspace. Microsoft Sentinel offers a flexible and predictable pricing model. [Learn more at the Microsoft Sentinel pricing page](https://azure.microsoft.com/pricing/details/azure-sentinel/).
+> Microsoft Sentinel is billed based on the volume of data that it ingests for analysis in Microsoft Sentinel and stores in the Azure Monitor Log Analytics workspace. Microsoft Sentinel offers a flexible and predictable pricing model. [Learn more at the Microsoft Sentinel pricing page](https://azure.microsoft.com/pricing/details/azure-sentinel/).
## Stream alerts with Azure Monitor
-To stream alerts into **ArcSight**, **Splunk**, **QRadar**, **SumoLogic**, **Syslog servers**, **LogRhythm**, **Logz.io Cloud Observability Platform**, and other monitoring solutions. connect Defender for Cloud with Azure monitor via Azure Event Hubs:
+To stream alerts into **ArcSight**, **Splunk**, **QRadar**, **SumoLogic**, **Syslog servers**, **LogRhythm**, **Logz.io Cloud Observability Platform**, and other monitoring solutions, connect Defender for Cloud to Azure Monitor using Azure Event Hubs:
> [!NOTE]
-> To stream alerts at the tenant level, use this Azure policy and set the scope at the root management group (you'll need permissions for the root management group as explained in [Defender for Cloud permissions](permissions.md)): [Deploy export to event hub for Microsoft Defender for Cloud alerts and recommendations](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2fproviders%2fMicrosoft.Authorization%2fpolicyDefinitions%2fcdfcce10-4578-4ecd-9703-530938e4abcb).
+> To stream alerts at the tenant level, use this Azure policy and set the scope at the root management group. You'll need permissions for the root management group as explained in [Defender for Cloud permissions](permissions.md): [Deploy export to an event hub for Microsoft Defender for Cloud alerts and recommendations](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2fproviders%2fMicrosoft.Authorization%2fpolicyDefinitions%2fcdfcce10-4578-4ecd-9703-530938e4abcb).
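As a sketch, the policy assignment could also be scripted with Az PowerShell. The assignment name, location, and management group ID below are placeholders, and the definition's required parameters (such as the target event hub details) still have to be supplied as the policy definition expects:

```powershell
# Assign the built-in "Deploy export to event hub" policy definition at the
# root management group. DeployIfNotExists policies need a managed identity.
$definition = Get-AzPolicyDefinition -Name "cdfcce10-4578-4ecd-9703-530938e4abcb"
New-AzPolicyAssignment -Name "export-alerts-to-event-hub" `
    -Scope "/providers/Microsoft.Management/managementGroups/<root-management-group-id>" `
    -PolicyDefinition $definition `
    -Location "eastus" `
    -AssignIdentity   # newer Az.Resources versions use -IdentityType SystemAssigned
```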
1. Enable [continuous export](continuous-export.md) to stream Defender for Cloud alerts into a dedicated event hub at the subscription level. To do this at the Management Group level using Azure Policy, see [Create continuous export automation configurations at scale](continuous-export.md?tabs=azure-policy#configure-continuous-export-at-scale-using-the-supplied-policies).
To stream alerts into **ArcSight**, **Splunk**, **QRadar**, **SumoLogic**, **Sys
1. Optionally, stream the raw logs to the event hub and connect to your preferred solution. Learn more in [Monitoring data available](../azure-monitor/essentials/stream-monitoring-data-event-hubs.md#monitoring-data-available).
-To view the event schemas of the exported data types, visit the [Event hub event schemas](https://aka.ms/ASCAutomationSchemas).
+To view the event schemas of the exported data types, visit the [Event Hubs event schemas](https://aka.ms/ASCAutomationSchemas).
## Other streaming options
As an alternative to Sentinel and Azure Monitor, you can use Defender for Cloud'
You can use this API to stream alerts from your **entire tenant** (and data from many other Microsoft Security products) into third-party SIEMs and other popular platforms:

- **Splunk Enterprise and Splunk Cloud** - [Use the Microsoft Graph Security API Add-On for Splunk](https://splunkbase.splunk.com/app/4564/)
-- **Power BI** - [Connect to the Microsoft Graph Security API in Power BI Desktop](/power-bi/connect-data/desktop-connect-graph-security)
-- **ServiceNow** - [Follow the instructions to install and configure the Microsoft Graph Security API application from the ServiceNow Store](https://docs.servicenow.com/bundle/orlando-security-management/page/product/secops-integration-sir/secops-integration-ms-graph/task/ms-graph-install.html)
-- **QRadar** - [IBM's Device Support Module for Microsoft Defender for Cloud via Microsoft Graph API](https://www.ibm.com/support/knowledgecenter/SS42VS_DSM/com.ibm.dsm.doc/c_dsm_guide_ms_azure_security_center_overview.html)
-- **Palo Alto Networks**, **Anomali**, **Lookout**, **InSpark**, and more - [Microsoft Graph Security API](https://www.microsoft.com/security/business/graph-security-api#office-MultiFeatureCarousel-09jr2ji)
+- **Power BI** - [Connect to the Microsoft Graph Security API in Power BI Desktop](/power-bi/connect-data/desktop-connect-graph-security).
+- **ServiceNow** - [Install and configure the Microsoft Graph Security API application from the ServiceNow Store](https://docs.servicenow.com/bundle/orlando-security-management/page/product/secops-integration-sir/secops-integration-ms-graph/task/ms-graph-install.html).
+- **QRadar** - [Use IBM's Device Support Module for Microsoft Defender for Cloud via Microsoft Graph API](https://www.ibm.com/support/knowledgecenter/SS42VS_DSM/com.ibm.dsm.doc/c_dsm_guide_ms_azure_security_center_overview.html).
+- **Palo Alto Networks**, **Anomali**, **Lookout**, **InSpark**, and more - [Use the Microsoft Graph Security API](https://www.microsoft.com/security/business/graph-security-api#office-MultiFeatureCarousel-09jr2ji).
defender-for-cloud Supported Machines Endpoint Solutions Clouds Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/supported-machines-endpoint-solutions-clouds-containers.md
The **tabs** below show the features that are available, by environment, for Mic
| Domain | Feature | Supported Resources | Release state <sup>[1](#footnote1)</sup> | Windows support | Agentless/Agent-based | Pricing tier |
|--|--| -- | -- | -- | -- | --|
| Compliance | Docker CIS | EC2 | Preview | X | Log Analytics agent | Defender for Servers |
-| Vulnerability Assessment | Registry scan | - | - | - | - | - |
-| Vulnerability Assessment | View vulnerabilities for running images | - | - | - | - | - |
+| Vulnerability assessment | Registry scan | - | - | - | - | - |
+| Vulnerability assessment | View vulnerabilities for running images | - | - | - | - | - |
| Hardening | Control plane recommendations | - | - | - | - | - |
| Hardening | Kubernetes data plane recommendations | EKS | Preview | X | Azure Policy extension | Defender for Containers |
| Runtime protection | Threat detection (control plane) | EKS | Preview | ✓ | Agentless | Defender for Containers |
The **tabs** below show the features that are available, by environment, for Mic
| Domain | Feature | Supported Resources | Release state <sup>[1](#footnote1)</sup> | Windows support | Agentless/Agent-based | Pricing tier |
|--|--| -- | -- | -- | -- | --|
| Compliance | Docker CIS | GCP VMs | Preview | X | Log Analytics agent | Defender for Servers |
-| Vulnerability Assessment | Registry scan | - | - | - | - | - |
-| Vulnerability Assessment | View vulnerabilities for running images | - | - | - | - | - |
+| Vulnerability assessment | Registry scan | - | - | - | - | - |
+| Vulnerability assessment | View vulnerabilities for running images | - | - | - | - | - |
| Hardening | Control plane recommendations | - | - | - | - | - |
| Hardening | Kubernetes data plane recommendations | GKE | Preview | X | Azure Policy extension | Defender for Containers |
| Runtime protection | Threat detection (control plane) | GKE | Preview | ✓ | Agentless | Defender for Containers |
The **tabs** below show the features that are available, by environment, for Mic
| Domain | Feature | Supported Resources | Release state <sup>[1](#footnote1)</sup> | Windows support | Agentless/Agent-based | Pricing tier |
|--|--| -- | -- | -- | -- | --|
| Compliance | Docker CIS | Arc enabled VMs | Preview | X | Log Analytics agent | Defender for Servers |
-| Vulnerability Assessment | Registry scan | ACR, Private ACR | Preview | ✓ (Preview) | Agentless | Defender for Containers |
-| Vulnerability Assessment | View vulnerabilities for running images | Arc enabled K8s clusters | Preview | X | Defender extension | Defender for Containers |
+| Vulnerability assessment | Registry scan | ACR, Private ACR | Preview | ✓ (Preview) | Agentless | Defender for Containers |
+| Vulnerability assessment | View vulnerabilities for running images | Arc enabled K8s clusters | Preview | X | Defender extension | Defender for Containers |
| Hardening | Control plane recommendations | - | - | - | - | - |
| Hardening | Kubernetes data plane recommendations | Arc enabled K8s clusters | Preview | X | Azure Policy extension | Defender for Containers |
| Runtime protection | Threat detection (control plane) | Arc enabled K8s clusters | Preview | ✓ | Defender extension | Defender for Containers |
devtest-labs Devtest Lab Attach Detach Data Disk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-attach-detach-data-disk.md
Title: Attach an existing data disk to a lab VM
-description: Learn how to attach or detach a lab data disk to a lab virtual machine in Azure DevTest Labs
+ Title: Attach & detach data disks for lab VMs
+description: Learn how to attach or detach a data disk for a lab virtual machine in Azure DevTest Labs.
Previously updated : 10/26/2021 Last updated : 03/29/2022
-# Attach or detach a lab data disk to a lab virtual machine in Azure DevTest Labs
+# Attach or detach a data disk for a lab virtual machine in Azure DevTest Labs
-You can create and attach a new lab [data disk](../virtual-machines/managed-disks-overview.md) for a lab Azure virtual machine (VM). The data disk can then be detached, and either: deleted, reattached, or attached to a different lab VM that you own. This functionality is handy for managing storage or software outside of each individual virtual machine.
-
-In this article, you'll learn how to attach and detach a data disk to a lab virtual machine.
+This article explains how to attach and detach a lab virtual machine (VM) data disk in Azure DevTest Labs. You can create, attach, detach, and reattach [data disks](/azure/virtual-machines/managed-disks-overview) for lab VMs that you own. This functionality is useful for managing storage or software separately from individual VMs.
## Prerequisites
-Your lab virtual machine must be running. The virtual machine size controls how many data disks you can attach. For details, see [Sizes for virtual machines](../virtual-machines/sizes.md).
-
-## Attach a new data disk
-
-Follow these steps to create and attach a new managed data disk to a VM in Azure DevTest Labs.
+To attach or detach a data disk, you need to own the lab VM, and the VM must be running. The VM size determines how many data disks you can attach. For more information, see [Sizes for virtual machines](/azure/virtual-machines/sizes).
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+## Create and attach a new data disk
-1. Navigate to your lab in **DevTest Labs**.
+Follow these steps to create and attach a new managed data disk for a DevTest Labs VM.
-1. Select your running virtual machine.
+1. Select your VM from the **My virtual machines** list on the lab **Overview** page.
-1. From the **virtual machine** page, under **Settings**, select **Disks**.
+1. On the VM **Overview** page, select **Disks** under **Settings** in the left navigation.
-1. Select **Attach new**.
+1. On the **Disks** page, select **Attach new**.
- :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attach-new.png" alt-text="Screenshot of attach new data disk to a virtual machine.":::
+ :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attach-new.png" alt-text="Screenshot of Attach new on the V M's Disk page.":::
-1. From the **Attach new disk** page, provide the following information:
+1. Fill out the **Attach new disk** form as follows:
- |Property | Description |
- |||
- |Name|Enter a unique name.|
- |Disk type| Select a [disk type](../virtual-machines/disks-types.md) from the drop-down list.|
- |Size (GiB)|Enter a size in gigabytes.|
-
- :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attach-new-form.png" alt-text="Screenshot of complete the 'attach new disk' form.":::
+ - For **Name**, enter a unique name.
+ - For **Disk type**, select a [disk type](/azure/virtual-machines/disks-types) from the drop-down list.
+ - For **Size (GiB)**, enter a size in gigabytes.
1. Select **OK**.
-1. You're returned to the **virtual machine** page. View your attached disk under **Data disks**.
-
- :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attached-data-disk.png" alt-text="Screenshot of attached disk appears under data disks.":::
+ :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attach-new-form.png" alt-text="Screenshot of the Attach new disk form.":::
-## Detach a data disk
+1. After the disk is attached, on the **Disks** page, view the new attached disk under **Data disks**.
-Detaching removes the lab disk from the lab VM, but keeps it in storage for later use.
+ :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attached-data-disk.png" alt-text="Screenshot of the new data disk under Data disks on the Disks page.":::
-### Detach from the VM's management page
+## Attach an existing data disk
-1. Navigate to your lab in **DevTest Labs**.
+Follow these steps to attach an existing available data disk to a running VM.
-1. Select your running virtual machine with an attached data disk.
+1. Select your VM from the **My virtual machines** list on the lab **Overview** page.
-1. From the **virtual machine** page, under **Settings**, select **Disks**.
-
-1. Under **Data disks**, select the data disk you want to detach.
-
- :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-detach-button.png" alt-text="Screenshot of select data disks for a virtual machine.":::
+1. On the VM **Overview** page, select **Disks** under **Settings** in the left navigation.
+
+1. On the **Disks** page, select **Attach existing**.
-1. From the **Data disk** page, select **Detach**.
+ :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attach-existing-button.png" alt-text="Screenshot of Attach existing on the V M's Disk page.":::
- :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-detach-data-disk-2.png" alt-text="Screenshot shows a disk's details pane with the 'Detach' action highlighted.":::
+1. On the **Attach existing disk** page, select a disk, and then select **OK**.
-1. Select **OK** to confirm that you want to detach the data disk. The disk is detached and is available to attach to another VM.
+ :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attach-existing.png" alt-text="Screenshot of attach existing data disk to a virtual machine.":::
-### Detach from the lab's management page
+1. After the disk is attached, on the **Disks** page, view the attached disk under **Data disks**.
-1. Navigate to your lab in **DevTest Labs**.
+## Detach a data disk
-1. Under **My Lab**, select **My data disks**.
+Detaching removes the lab disk from the VM, but keeps it in storage for later use.
-1. For the disk you wish to detach, select its ellipsis (**...**), and select **Detach**.
+Follow these steps to detach an attached data disk from a running VM.
- :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-detach-data-disk.png" alt-text="Screenshot of detach a data disk.":::
+1. Select the VM with the disk from the **My virtual machines** list on the lab **Overview** page.
-1. Select **Yes** to confirm that you want to detach it.
+1. On the VM **Overview** page, select **Disks** under **Settings** in the left navigation.
+
+1. On the **Disks** page, under **Data disks**, select the data disk you want to detach.
- > [!NOTE]
- > If a data disk is already detached, you can choose to remove it from your list of available data disks by selecting **Delete**.
+ :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-detach-button.png" alt-text="Screenshot of selecting a data disk to detach.":::
-## Attach an existing disk
+1. On the data disk's page, select **Detach**, and then select **OK**.
-Follow these steps to attach an existing available data disk to a running VM.
+ :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-detach-data-disk-2.png" alt-text="Screenshot showing Detach on the Data disk page.":::
-1. Navigate to your lab in **DevTest Labs**.
+The disk is detached, and is available to reattach to this or another VM.
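If you need to script the detach operation instead, a rough sketch against the DevTest Labs REST API is below. The `detachDataDisk` action, the `2018-09-15` api-version, and the `existingLabDiskId` property name are assumptions based on the DevTest Labs REST API, so verify them against the current reference before relying on this:

```powershell
# POST the detachDataDisk action on the lab VM resource (assumed action name).
$vmId = "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
        "/providers/Microsoft.DevTestLab/labs/<lab-name>/virtualmachines/<vm-name>"
$body = @{
    existingLabDiskId = "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
                        "/providers/Microsoft.DevTestLab/labs/<lab-name>/users/<user-object-id>/disks/<disk-name>"
} | ConvertTo-Json

Invoke-AzRestMethod -Method POST -Path "$vmId/detachDataDisk?api-version=2018-09-15" -Payload $body
```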
-1. Select your running virtual machine.
+### Detach or delete a data disk on the lab management page
-1. From the **virtual machine** page, under **Settings**, select **Disks**.
+You can also detach or delete a data disk without navigating to the VM's page.
-1. Select **Attach existing**.
+1. In the left navigation for your lab's **Overview** page, select **My data disks** under **My Lab**.
- :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attach-existing-button.png" alt-text="Screenshot that shows the 'Disks' setting selected and 'Attach existing' selected.":::
+1. On the **My data disks** page, either:
-1. From the **Attach existing disk** page, select a disk and then **OK**. After a few moments, the data disk is attached to the VM and appears in the list of **Data disks** for that VM.
+ - Select the disk you want to detach, and then on the data disk's page, select **Detach** and then select **OK**.
- :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-attach-existing.png" alt-text="Screenshot of attach existing data disk to a virtual machine.":::
+ or
-## Upgrade an unmanaged data disk
+ - Select the ellipsis (**...**) next to the disk you want to detach, select **Detach** from the context menu, and then select **Yes**.
-If you have a VM with unmanaged data disks, you can convert the VM to use managed disks. This process converts both the OS disk and any attached data disks.
+ :::image type="content" source="./media/devtest-lab-attach-detach-data-disk/devtest-lab-detach-data-disk.png" alt-text="Screenshot of detaching a data disk from the listing's context menu.":::
-First [detach the data disk](#detach-a-data-disk) from the unmanaged VM. Then, [reattach the disk](#attach-an-existing-disk) to a managed VM to automatically upgrade the data disk from unmanaged to managed.
+You can also delete a detached data disk by selecting **Delete** from the context menu or from the data disk page. When you delete a data disk, it's removed from storage and can't be reattached.
## Next steps
-Learn how to manage data disks for [claimable virtual machines](devtest-lab-add-claimable-vm.md#unclaim-a-vm).
+For information about transferring data disks for claimable lab VMs, see [Transfer the data disk](devtest-lab-add-claimable-vm.md#transfer-the-data-disk).
devtest-labs Devtest Lab Auto Startup Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/devtest-lab-auto-startup-vm.md
Title: Configure auto start settings for a VM
-description: Learn how to configure auto start settings for VMs in a lab. This setting allows VMs in the lab to be automatically started on a schedule.
+ Title: Configure auto-start settings for a VM
+description: Learn how to configure auto-start settings for VMs in a lab. This setting allows VMs in the lab to be automatically started on a schedule.
Previously updated : 12/10/2021 Last updated : 03/29/2022
-# Start up lab virtual machines automatically
+# Automatically start lab VMs with auto-start in Azure DevTest Labs
-Auto start allows you to automatically start virtual machines (VMs) in a lab at a scheduled time each day. You first need to create an auto start policy. Then you must select which VMs to follow the policy. The extra step of affirmatively selecting VMs to auto start is meant to prevent the unintentional starting of VMs that result in increased costs.
+This article shows how to configure and apply an auto-start policy for Azure DevTest Labs virtual machines (VMs). Auto-start automatically starts up lab VMs at specified times and days.
-This article shows you how to configure an auto start policy for a lab. For information on configuring auto shutdown settings, see [Manage auto shutdown policies for a lab in Azure DevTest Labs](devtest-lab-auto-shutdown.md).
+To implement auto-start, you configure an auto-start policy for the lab first. Then, you can enable the policy for individual lab VMs. Requiring individual VMs to enable auto-start helps prevent unnecessary startups that could increase costs.
-## Configure auto start settings for a lab
+You can also configure auto-shutdown policies for lab VMs. For more information, see [Manage auto shutdown policies for a lab in Azure DevTest Labs](devtest-lab-auto-shutdown.md).
-The policy doesn't automatically apply auto start to any VMs in the lab. After configuring the policy, follow the steps from [Enable auto start for a VM in the lab](#enable-auto-start-for-a-vm-in-the-lab).
+## Configure auto-start for the lab
-1. Sign in to the [Azure portal](https://portal.azure.com/).
+To configure an auto-start policy for a lab, follow these steps. After configuring the policy, [enable auto-start](#add-vms-to-the-auto-start-schedule) for each VM that you want to auto-start.
-1. Navigate to your lab in **DevTest Labs**.
+1. On your lab **Overview** page, select **Configuration and policies** under **Settings** in the left navigation.
-1. Under **Settings**, select **Configuration and policies**.
+ :::image type="content" source="./media/devtest-lab-auto-startup-vm/configuration-policies-menu.png" alt-text="Screenshot that shows selecting Configuration and policies in the left navigation menu.":::
- :::image type="content" source="./media/devtest-lab-auto-startup-vm/configuration-policies-menu.png" alt-text="Screenshot that shows the 'Configuration and policies' menu in the DevTest Labs.":::
+1. On the **Configuration and policies** page, select **Auto-start** under **Schedules** in the left navigation.
-1. On the **Configuration and policies** page, under **Schedules**, select **Auto-start**.
-
-1. For **Allow auto-start**, select **Yes**. Scheduling information will then appear.
+1. Select **Yes** for **Allow auto-start**.
:::image type="content" source="./media/devtest-lab-auto-startup-vm/portal-lab-auto-start.png" alt-text="Screenshot of Auto-start option under Schedules.":::
-1. Provide the following scheduling information:
-
- |Property | Description |
- |||
- |Scheduled start| Enter a start time.|
- |Time zone| Select a time zone from the drop-down list.|
- |Days of the week| Select each box next to the day you want the schedule to be applied.|
-
- :::image type="content" source="./media/devtest-lab-auto-startup-vm/auto-start-configuration.png" alt-text="Screenshot of Autostart schedule settings.":::
+1. Enter a **Scheduled start** time, select a **Time zone**, and select the checkboxes next to the **Days of the week** that you want to apply the schedule.
-1. Select **Save**.
+1. Select **Save**.
-## Enable auto start for a VM in the lab
+ :::image type="content" source="./media/devtest-lab-auto-startup-vm/auto-start-configuration.png" alt-text="Screenshot of auto-start schedule settings.":::
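If you prefer to script the lab-level policy instead of using the portal, here's a hedged Az PowerShell sketch. It assumes the auto-start policy is stored as a lab schedule resource named `LabVmAutoStart` with task type `LabVmsStartupTask`; both names are assumptions to check against the DevTest Labs REST reference:

```powershell
# Create or update the lab's auto-start schedule resource (assumed names).
New-AzResource -ResourceType "Microsoft.DevTestLab/labs/schedules" `
    -ResourceGroupName "<resource-group>" `
    -ResourceName "<lab-name>/LabVmAutoStart" `
    -Location "<region>" `
    -ApiVersion "2018-09-15" `
    -Properties @{
        status           = "Enabled"
        taskType         = "LabVmsStartupTask"
        timeZoneId       = "Pacific Standard Time"
        weeklyRecurrence = @{ time = "0900"; weekdays = @("Monday", "Wednesday", "Friday") }
    } -Force
```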
-These steps continue from the prior section. Now that an auto start policy has been created, select the VMs to apply the policy against.
+## Add VMs to the auto-start schedule
-1. Close the **Configuration and policies** page to return to the **DevTest Labs** page.
+After you configure the auto-start policy, follow these steps for each VM that you want to auto-start.
-1. Under **My virtual machines**, select a VM.
+1. On your lab **Overview** page, select the VM under **My virtual machines**.
- :::image type="content" source="./media/devtest-lab-auto-startup-vm/select-vm.png" alt-text="Screenshot of Select VM from list under My virtual machines.":::
+ :::image type="content" source="./media/devtest-lab-auto-startup-vm/select-vm.png" alt-text="Screenshot of selecting a VM from the list under My virtual machines.":::
-1. On the **virtual machine** page, under **Operations**, select **Auto-start**.
+1. On the VM's **Overview** page, select **Auto-start** under **Operations** in the left navigation.
-1. On the **Auto-start** page, select **Yes**, and then **Save**.
+1. On the **Auto-start** page, select **Yes** for **Allow this virtual machine to be scheduled for automatic start**, and then select **Save**.
- :::image type="content" source="./media/devtest-lab-auto-startup-vm/select-auto-start.png" alt-text="Screenshot of Select autostart menu.":::
+ :::image type="content" source="./media/devtest-lab-auto-startup-vm/select-auto-start.png" alt-text="Screenshot of selecting Yes on the Auto-start page.":::
## Next steps - [Manage auto shutdown policies for a lab in Azure DevTest Labs](devtest-lab-auto-shutdown.md)
+- [Use command lines to start and stop DevTest Labs virtual machines](use-command-line-start-stop-virtual-machines.md)
devtest-labs Test App Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/test-app-azure.md
Title: How to test your app in Azure
-description: Learn how to deploy desktop/web applications to a file share and test them.
+ Title: Set up an app for testing on a lab VM
+description: Learn how to publish an app to an Azure file share for testing from a DevTest Labs virtual machine.
Previously updated : 11/03/2021 Last updated : 03/29/2022
-# Test your app in Azure
+# Set up an app for testing on an Azure DevTest Labs VM
-In this guide, you'll learn how to test your application in Azure using DevTest Labs. You use Visual Studio to deploy your app to an Azure file share. Then you'll access the share from a lab virtual machine (VM).
+This article shows how to set up an application for testing from an Azure DevTest Labs virtual machine (VM). In this example, you use Visual Studio to publish an app to an Azure file share. Then you access the file share from a lab VM for testing.
## Prerequisites - An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- A Windows-based [DevTest Labs VM](devtest-lab-add-vm.md) to use for testing the app.
+- [Visual Studio](https://visualstudio.microsoft.com/free-developer-offers/) installed on a different workstation.
+- A [file share](/azure/storage/files/storage-how-to-create-file-share) created in your lab's [Azure Storage Account](encrypt-storage.md).
+- The [file share mounted](/azure/storage/files/storage-how-to-use-files-windows#mount-the-azure-file-share) to your Visual Studio workstation, and to the lab VM you want to use for testing. A mount sketch follows this list.
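For reference, here's a minimal PowerShell sketch for mounting the share on Windows; the storage account name, share name, and key are placeholders, and the `localhost\<storage-account>` user format mirrors the Azure Files mount script:

```powershell
# Persist the storage account credential, then map the share as drive Z:.
# Run the same commands on both the workstation and the lab VM.
cmd.exe /C "cmdkey /add:`"<storage-account>.file.core.windows.net`" /user:`"localhost\<storage-account>`" /pass:`"<storage-account-key>`""
New-PSDrive -Name Z -PSProvider FileSystem `
    -Root "\\<storage-account>.file.core.windows.net\<share-name>" -Persist
```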
-- A local workstation with [Visual Studio](https://visualstudio.microsoft.com/free-developer-offers/).
-
-- A lab in [DevTest Labs](devtest-lab-overview.md).
-
-- An [Azure virtual machine](devtest-lab-add-vm.md) running Windows in your lab.
-
-- A [file share](../storage/files/storage-how-to-create-file-share.md) in your lab's existing Azure storage account. A storage account is automatically created with a lab.
+## Publish your app from Visual Studio
-- The [Azure file share mounted](../storage/files/storage-how-to-use-files-windows.md#mount-the-azure-file-share) to your local workstation and lab VM.
+First, publish an app from Visual Studio to your Azure file share.
-## Publish your app from Visual Studio
+1. Open Visual Studio, and choose **Create a new project** in the **Start** window.
-In this section, you publish your app from Visual Studio to your Azure file share.
+ :::image type="content" source="./media/test-app-in-azure/launch-visual-studio.png" alt-text="Screenshot of the Visual Studio Start page with Create a new project selected.":::
-1. Open Visual Studio, and choose **Create a new project** in the Start window.
+1. On the **Create a new project** screen, select **Console Application**, and then select **Next**.
- :::image type="content" source="./media/test-app-in-azure/launch-visual-studio.png" alt-text="Screenshot of visual studio start page.":::
+ :::image type="content" source="./media/test-app-in-azure/select-console-application.png" alt-text="Screenshot of choosing Console Application.":::
-1. Select **Console Application** and then **Next**.
+1. On the **Configure your new project** page, keep the defaults and select **Next**.
- :::image type="content" source="./media/test-app-in-azure/select-console-application.png" alt-text="Screenshot of option to choose console application.":::
+1. On the **Additional information** page, keep the defaults and select **Create**.
-1. On the **Configure your new project** page, leave the defaults, and select **Next**.
+1. In Visual Studio **Solution Explorer**, right-click your project name, and select **Build**.
-1. On the **Additional information** page, leave the defaults and select **Create**.
+1. When the build succeeds, in **Solution Explorer**, right-click your project name, and select **Publish**.
-1. From **Solution Explorer**, right-click your project and select **Build**.
+ :::image type="content" source="./media/test-app-in-azure/publish-application.png" alt-text="Screenshot of selecting Publish from Solution Explorer.":::
-1. From **Solution Explorer**, right-click your project and select **Publish**.
+1. On the **Publish** screen, select **Folder**, and then select **Next**.
- :::image type="content" source="./media/test-app-in-azure/publish-application.png" alt-text="Screenshot of option to publish application.":::
+ :::image type="content" source="./media/test-app-in-azure/publish-to-folder.png" alt-text="Screenshot of selecting Folder on the Publish screen.":::
-1. On the **Publish** page, select **Folder** and then **Next**.
+1. For **Specific target**, select **Folder**, and then select **Next**.
- :::image type="content" source="./media/test-app-in-azure/publish-to-folder.png" alt-text="Screenshot of option to publish to folder.":::
+1. For the **Location** option, select **Browse**, and then select the file share you mounted earlier.
-1. For the **Specific target** option, select **Folder** and then **Next**.
+ :::image type="content" source="./media/test-app-in-azure/selecting-file-share.png" alt-text="Screenshot of browsing and selecting the file share.":::
-1. For the **Location** option, select **Browse**, and select the file share you mounted earlier. Then Select **OK**, and then **Finish**.
+1. Select **OK**, and then select **Finish**.
- :::image type="content" source="./media/test-app-in-azure/selecting-file-share.png" alt-text="Screenshot of option to select file share.":::
+1. Select **Publish**.
-1. Select **Publish**. Visual Studio builds your application and publishes it to your file share.
+ :::image type="content" source="./media/test-app-in-azure/final-publish.png" alt-text="Screenshot of selecting Publish.":::
- :::image type="content" source="./media/test-app-in-azure/final-publish.png" alt-text="Screenshot of publish button.":::
+Visual Studio publishes your application to the file share.
-## Test the app on your test VM in the lab
+## Access the app on your lab VM
-1. Connect to your lab virtual machine.
+1. Connect to your lab test VM.
-1. Within the virtual machine, launch **File Explorer**, and select **This PC** to find the file share you mounted earlier.
+1. On the lab VM, start up **File Explorer**, select **This PC**, and find the file share you mounted earlier.
- :::image type="content" source="./media/test-app-in-azure/find-share-on-vm.png" alt-text="Screenshot of file explorer.":::
+ :::image type="content" source="./media/test-app-in-azure/find-share-on-vm.png" alt-text="Screenshot of the file share in the V M's File Explorer.":::
-1. Open the file share and confirm that you see the app you deployed from Visual Studio.
+1. Open the file share, and confirm that you see the app you deployed from Visual Studio.
- :::image type="content" source="./media/test-app-in-azure/open-file-share.png" alt-text="Screenshot of contents of file share.":::
+ :::image type="content" source="./media/test-app-in-azure/open-file-share.png" alt-text="Screenshot of contents of file share.":::
- You can now access and test your app within the test VM you created in Azure.
+You can now test your app on your lab VM.
## Next steps
devtest-labs Use Command Line Start Stop Virtual Machines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/use-command-line-start-stop-virtual-machines.md
Title: Use command-line tools to start and stop VMs
-description: Learn how to use command-line tools to start and stop virtual machines in Azure DevTest Labs.
+ Title: Start & stop lab VMs with command lines
+description: Use Azure PowerShell or Azure CLI command lines and scripts to start and stop Azure DevTest Labs virtual machines.
Previously updated : 10/22/2021 Last updated : 03/29/2022 ms.devlang: azurecli
-# Use command-line tools to start and stop Azure DevTest Labs virtual machines
+# Use command lines to start and stop DevTest Labs virtual machines
-This article shows you how to start or stop a lab virtual machines in Azure DevTest Labs. You can create Azure PowerShell or Azure CLI scripts to automate these operations.
+This article shows how to start or stop Azure DevTest Labs virtual machines (VMs) by using Azure PowerShell or Azure CLI command lines and scripts.
-## Prerequisites
-- If using PowerShell, you'll need the [Az Module](/powershell/azure/new-azureps-module-az) installed on your workstation. Ensure you have the latest version. If necessary, run `Update-Module -Name Az`.
-
-- If wanting to use Azure CLI and you haven't yet installed it, see [Install the Azure CLI](/cli/azure/install-azure-cli).
+You can start, stop, or [restart DevTest Labs VMs](devtest-lab-restart-vm.md) by using the Azure portal. You can also use the portal to configure [automatic startup](devtest-lab-auto-startup-vm.md) and [automatic shutdown](devtest-lab-auto-shutdown.md) schedules and policies for lab VMs.
-- A virtual machine in a DevTest Labs lab.
+When you want to script or automate start or stop for lab VMs, use PowerShell or Azure CLI commands. For example, you can use start or stop commands to:
-## Overview
+- Test a three-tier application, where the tiers need to start in a sequence.
+- Turn off VMs to save costs when they meet custom criteria.
+- Start when a continuous integration and continuous delivery (CI/CD) workflow begins, and stop when it finishes. For an example of this workflow, see [Run an image factory from Azure DevOps](image-factory-set-up-devops-lab.md).
-Azure DevTest Labs provides a way to create fast, easy, and lean dev/test environments. Labs allow you to manage cost, quickly create VMs, and minimize waste. You can use the features in the Azure portal to automatically start and stop VMs at specific times. However, you may want to automate the starting and stopping of VMs from scripts. Here are some situations in which running these tasks by using scripts would be helpful.
+## Prerequisites
-- When using a three-tier application as part of a test environment and the tiers need to be started in a sequence. -- To turn off a VM when a custom criteria is met to save money. -- As a task within a continuous integration and continuous delivery workflow to start at the beginning, and then stop the VMs when the process is complete. An example of this workflow would be the custom image factory with Azure DevTest Labs.
+- A [lab VM in DevTest Labs](devtest-lab-add-vm.md).
+- For Azure PowerShell, the [Az module](/powershell/azure/new-azureps-module-az) installed on your workstation. Make sure you have the latest version. If necessary, run `Update-Module -Name Az` to update the module.
+- For Azure CLI, the [Azure CLI](/cli/azure/install-azure-cli) installed on your workstation.
-## Azure PowerShell
+## Azure PowerShell script
-The following PowerShell script can start or stop a VM in a lab. [Invoke-AzResourceAction](/powershell/module/az.resources/invoke-azresourceaction) is the primary focus for this script. The **ResourceId** parameter is the fully qualified resource ID for the VM in the lab. The **Action** parameter is where the **Start** or **Stop** options are set depending on what is needed.
+The following PowerShell script starts or stops a VM in a lab by using [Invoke-AzResourceAction](/powershell/module/az.resources/invoke-azresourceaction). The `ResourceId` parameter is the fully qualified ID for the lab VM you want to start or stop. The `Action` parameter determines whether to start or stop the VM, depending on which action you need.
-1. From your workstation, sign in to your Azure subscription with the PowerShell [Connect-AzAccount](/powershell/module/Az.Accounts/Connect-AzAccount) cmdlet and follow the on-screen directions.
+1. From your workstation, use the PowerShell [Connect-AzAccount](/powershell/module/Az.Accounts/Connect-AzAccount) cmdlet to sign in to your Azure account. If you have multiple Azure subscriptions, uncomment the `Set-AzContext` line and fill in the `<Subscription ID>` you want to use.
```powershell # Sign in to your Azure subscription
The following PowerShell script can start or stop a VM in a lab. [Invoke-AzResou
Connect-AzAccount }
- # If you have multiple subscriptions, set the one to use
- # Set-AzContext -SubscriptionId "<SUBSCRIPTIONID>"
+ # Set-AzContext -SubscriptionId "<Subscription ID>"
```
-1. Provide an appropriate value for the variables and then execute the script.
+1. Provide values for *`<lab name>`* and *`<VM name>`*, and specify which action you want for *`<Start or Stop>`*.
```powershell
- $devTestLabName = "yourlabname"
- $vMToStart = "vmname"
+ $devTestLabName = "<lab name>"
+ $vMToStart = "<VM name>"
# The action on the virtual machine (Start or Stop)
- $vmAction = "Start"
+ $vmAction = "<Start or Stop>"
```
-1. Start or stop the VM based on the value you passed to `$vmAction`.
+1. Start or stop the VM, based on the value you passed to `$vmAction`.
```powershell # Get the lab information
The following PowerShell script can start or stop a VM in a lab. [Invoke-AzResou
Write-Output "##[section] Successfully updated DTL machine: $vMToStart, Action: $vmAction" } else {
- Write-Error "##[error]Failed to update DTL machine: $vMToStart, Action: $vmAction"
+ Write-Error "##[error] Failed to update DTL machine: $vMToStart, Action: $vmAction"
} ```
-## Azure CLI
+## Azure CLI script
-The [Azure CLI](/cli/azure/get-started-with-azure-cli) is another way to automate the starting and stopping of DevTest Labs VMs. The following script gives you commands for starting and stopping a VM in a lab. The use of variables in this section is based on a Windows environment. Slight variations will be needed for bash or other environments.
+The following script provides [Azure CLI](/cli/azure/get-started-with-azure-cli) commands for starting or stopping a lab VM. The variables in this script are for a Windows environment. Bash or other environments have slight variations.
-1. Replace `SubscriptionID`, `yourlabname`, `yourVM`, and `action` with the appropriate values. Then execute the script.
+1. Provide appropriate values for *`<Subscription ID>`*, *`<lab name>`*, *`<VM name>`*, and the *`<Start or Stop>`* action to take.
- ```azurecli
- set SUBSCIPTIONID=SubscriptionID
- set DEVTESTLABNAME=yourlabname
- set VMNAME=yourVM
-
- REM The action on the virtual machine (Start or Stop)
- set ACTION=action
- ```
+ ```azurecli
+ set SUBSCRIPTIONID=<Subscription ID>
+ set DEVTESTLABNAME=<lab name>
+ set VMNAME=<VM name>
+ set ACTION=<Start or Stop>
+ ```
-1. Sign in to your Azure subscription and get the name of the resource group that contains the lab.
+1. Sign in to your Azure account. If you have multiple Azure subscriptions, uncomment the `az account set` line to use the subscription ID you provided.
- ```azurecli
- az login
-
- REM If you have multiple subscriptions, set the one to use
- REM az account set --subscription %SUBSCIPTIONID%
+ ```azurecli
+ az login
+
+ REM az account set --subscription %SUBSCRIPTIONID%
+ ```
- az resource list --resource-type "Microsoft.DevTestLab/labs" --name %DEVTESTLABNAME% --query "[0].resourceGroup"
- ```
+1. Get the name of the resource group that contains the lab.
-1. Replace `resourceGroup` with the value obtained from the prior step. Then execute the script.
+ ```azurecli
+ az resource list --resource-type "Microsoft.DevTestLab/labs" --name %DEVTESTLABNAME% --query "[0].resourceGroup"
+ ```
- ```azurecli
- set RESOURCEGROUP=resourceGroup
- ```
+1. Replace *`<resourceGroup>`* with the value you got from the previous step.
-1. Start or stop the VM based on the value you passed to `ACTION`.
+ ```azurecli
+ set RESOURCEGROUP=<resourceGroup>
+ ```
- ```azurecli
- az lab vm %ACTION% --lab-name %DEVTESTLABNAME% --name %VMNAME% --resource-group %RESOURCEGROUP%
- ```
+1. Run the command line to start or stop the VM, based on the value you passed to `ACTION`.
+
+ ```azurecli
+ az lab vm %ACTION% --lab-name %DEVTESTLABNAME% --name %VMNAME% --resource-group %RESOURCEGROUP%
+ ```
## Next steps
-See the following article for using the Azure portal to do these operations: [Restart a VM](devtest-lab-restart-vm.md).
+- [Azure CLI az lab reference](/cli/azure/lab)
+- [PowerShell Az.DevTestLabs reference](/powershell/module/az.devtestlabs)
+- [Define the startup order for DevTest Labs VMs](start-machines-use-automation-runbooks.md)
digital-twins How To Use Data History https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/how-to-use-data-history.md
# Mandatory fields. Title: Use data history (preview) with Azure Data Explorer
+ Title: Use data history with Azure Data Explorer (preview)
description: See how to set up and use data history for Azure Digital Twins, using the CLI or Azure portal.
digital-twins Troubleshoot Error Azure Digital Twins Explorer Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/troubleshoot-error-azure-digital-twins-explorer-authentication.md
Previously updated : 03/28/2022 Last updated : 03/29/2022 # Troubleshoot Azure Digital Twins Explorer: Authentication errors
When running Azure Digital Twins Explorer, you encounter the following error mes
:::image type="content" source="media/troubleshoot-error-azure-digital-twins-explorer-authentication/permission-error.png" alt-text="Screenshot of an error message in the Azure Digital Twins Explorer, entitled Make sure you have the right permissions.":::
-If you are running the code locally, you might see this error message instead:
-- ## Causes ### Cause #1
-You will see these errors if your Azure account doesn't have the required Azure role-based access control (Azure RBAC) permissions set on your Azure Digital Twins instance. In order to access data in your instance, you must have the *Azure Digital Twins Data Reader* or *Azure Digital Twins Data Owner* role on the instance you are trying to read or manage, respectively.
+This error will occur if your Azure account doesn't have the required Azure role-based access control (Azure RBAC) permissions set on your Azure Digital Twins instance. In order to access data in your instance, you must have the *Azure Digital Twins Data Reader* or *Azure Digital Twins Data Owner* role on the instance you are trying to read or manage, respectively.
For more information about security and roles in Azure Digital Twins, see [Security for Azure Digital Twins solutions](concepts-security.md).
Read the setup steps for creating and authenticating a new Azure Digital Twins i
* [Set up an instance and authentication (CLI)](how-to-set-up-instance-cli.md) Read more about security and permissions on Azure Digital Twins:
-* [Security for Azure Digital Twins solutions](concepts-security.md)
+* [Security for Azure Digital Twins solutions](concepts-security.md)
dms Migration Dms Powershell Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/migration-dms-powershell-cli.md
+
+ Title: Migrate databases at scale using Azure PowerShell / CLI
+description: Learn how to use Azure PowerShell or CLI to migrate databases at scale using the capabilities of Azure SQL Migration extension in Azure Data Studio with Azure Database Migration Service.
+ Last updated : 03/28/2022
+# Migrate databases at scale using automation (Preview)
+
+The [Azure SQL Migration extension for Azure Data Studio](/sql/azure-data-studio/extensions/azure-sql-migration-extension) enables you to assess your SQL Server databases, get Azure recommendations, and migrate the databases to Azure. With [Azure PowerShell](/powershell/module/az.datamigration) or [Azure CLI](/cli/azure/datamigration) automation, you can use the extension's capabilities together with Azure Database Migration Service to migrate one or more databases at scale (including databases across multiple SQL Server instances).
+
+Refer to the following Azure PowerShell and Azure CLI sample scripts for the migration scenario that matches yours:
+
+|Scripting language |Migration scenario |Azure Samples link |
+||||
+|PowerShell |SQL Server assessment |[Azure-Samples/data-migration-sql/PowerShell/sql-server-assessment](https://github.com/Azure-Samples/data-migration-sql/tree/main/PowerShell/sql-server-assessment.md) |
+|PowerShell |SQL Server to Azure SQL Managed Instance (using file share) |[Azure-Samples/data-migration-sql/PowerShell/sql-server-to-sql-mi-fileshare](https://github.com/Azure-Samples/data-migration-sql/tree/main/PowerShell/sql-server-to-sql-mi-fileshare.md) |
+|PowerShell |SQL Server to Azure SQL Managed Instance (using Azure storage) |[Azure-Samples/data-migration-sql/PowerShell/sql-server-to-sql-mi-blob](https://github.com/Azure-Samples/data-migration-sql/tree/main/PowerShell/sql-server-to-sql-mi-blob.md) |
+|PowerShell |SQL Server to SQL Server on Azure Virtual Machines (using file share) |[Azure-Samples/data-migration-sql/PowerShell/sql-server-to-sql-vm-fileshare](https://github.com/Azure-Samples/data-migration-sql/tree/main/PowerShell/sql-server-to-sql-vm-fileshare.md) |
+|PowerShell |SQL Server to SQL Server on Azure Virtual Machines (using Azure Storage) |[Azure-Samples/data-migration-sql/PowerShell/sql-server-to-sql-vm-blob](https://github.com/Azure-Samples/data-migration-sql/tree/main/PowerShell/sql-server-to-sql-vm-blob.md) |
+|PowerShell |Sample: End-to-End migration automation |[Azure-Samples/data-migration-sql/PowerShell/scripts/](https://github.com/Azure-Samples/data-migration-sql/tree/main/PowerShell/scripts/) |
+|PowerShell |Sample: End-to-End migration automation for multiple databases |[Azure-Samples/data-migration-sql/PowerShell/scripts/multiple%20databases/](https://github.com/Azure-Samples/data-migration-sql/tree/main/PowerShell/scripts/multiple%20databases/) |
+|CLI |SQL Server assessment |[Azure-Samples/data-migration-sql/CLI/sql-server-assessment](https://github.com/Azure-Samples/data-migration-sql/tree/main/CLI/sql-server-assessment.md) |
+|CLI |SQL Server to Azure SQL Managed Instance (using file share) |[Azure-Samples/data-migration-sql/CLI/sql-server-to-sql-mi-fileshare](https://github.com/Azure-Samples/data-migration-sql/tree/main/CLI/sql-server-to-sql-mi-fileshare.md) |
+|CLI |SQL Server to Azure SQL Managed Instance (using Azure storage) |[Azure-Samples/data-migration-sql/CLI/sql-server-to-sql-mi-blob](https://github.com/Azure-Samples/data-migration-sql/tree/main/CLI/sql-server-to-sql-mi-blob.md) |
+|CLI |SQL Server to SQL Server on Azure Virtual Machines (using file share) |[Azure-Samples/data-migration-sql/CLI/sql-server-to-sql-vm-fileshare](https://github.com/Azure-Samples/data-migration-sql/tree/main/CLI/sql-server-to-sql-vm-fileshare.md) |
+|CLI |SQL Server to SQL Server on Azure Virtual Machines (using Azure Storage) |[Azure-Samples/data-migration-sql/CLI/sql-server-to-sql-vm-blob](https://github.com/Azure-Samples/data-migration-sql/tree/main/CLI/sql-server-to-sql-vm-blob.md) |
+|CLI |Sample: End-to-End migration automation |[Azure-Samples/data-migration-sql/CLI/scripts/](https://github.com/Azure-Samples/data-migration-sql/tree/main/CLI/scripts/) |
+|CLI |Sample: End-to-End migration automation for multiple databases |[Azure-Samples/data-migration-sql/CLI/scripts/multiple%20databases/](https://github.com/Azure-Samples/data-migration-sql/tree/main/CLI/scripts/multiple%20databases/) |
+
+## Prerequisites
+
+The following prerequisites are common across all supported migration scenarios that use Azure PowerShell or Azure CLI:
+
+* Have an Azure account that is assigned one of the built-in roles listed below:
+ - Contributor for the target Azure SQL Managed Instance (and for the storage account where you upload your database backup files from an SMB network share).
+ - Reader role for the Azure Resource Groups containing the target Azure SQL Managed Instance or the Azure storage account.
+ - Owner or Contributor role for the Azure subscription.
+ > [!IMPORTANT]
+ > An Azure account is required only when running the migration steps; it is not required for the assessment or Azure recommendation steps.
+* Create a target [Azure SQL Managed Instance](../azure-sql/managed-instance/create-configure-managed-instance-powershell-quickstart.md) or [SQL Server on Azure Virtual Machine](../azure-sql/virtual-machines/windows/sql-vm-create-powershell-quickstart.md)
+
+ > [!IMPORTANT]
+ > If you have an existing Azure Virtual Machine, it should be registered with [SQL IaaS Agent extension in Full management mode](../azure-sql/virtual-machines/windows/sql-server-iaas-agent-extension-automate-management.md#management-modes).
+* Ensure that the logins used to connect to the source SQL Server are members of the *sysadmin* server role or have `CONTROL SERVER` permission.
+* Use one of the following storage options for the full database and transaction log backup files:
+ - SMB network share
+ - Azure storage account file share or blob container
+
+ > [!IMPORTANT]
+ > - If your database backup files are provided in an SMB network share, [create an Azure storage account](../storage/common/storage-account-create.md) that allows the DMS service to upload the database backup files. Make sure to create the Azure storage account in the same region where the Azure Database Migration Service instance is created.
+ > - Azure Database Migration Service does not initiate any backups, and instead uses existing backups, which you may already have as part of your disaster recovery plan, for the migration.
+ > - You should take [backups using the `WITH CHECKSUM` option](/sql/relational-databases/backup-restore/enable-or-disable-backup-checksums-during-backup-or-restore-sql-server).
+ > - Each backup can be written to either a separate backup file or multiple backup files. However, appending multiple backups (that is, full and transaction log) into a single backup medium is not supported.
+ > - Use compressed backups to reduce the likelihood of experiencing potential issues associated with migrating large backups.
+* Ensure that the service account running the source SQL Server instance has read and write permissions on the SMB network share that contains database backup files.
+* The source SQL Server instance certificate from a database protected by Transparent Data Encryption (TDE) needs to be migrated to the target Azure SQL Managed Instance or SQL Server on Azure Virtual Machine before migrating data. To learn more, see [Migrate a certificate of a TDE-protected database to Azure SQL Managed Instance](../azure-sql/managed-instance/tde-certificate-migrate.md) and [Move a TDE Protected Database to Another SQL Server](/sql/relational-databases/security/encryption/move-a-tde-protected-database-to-another-sql-server).
+ > [!TIP]
+ > If your database contains sensitive data that is protected by [Always Encrypted](/sql/relational-databases/security/encryption/configure-always-encrypted-using-sql-server-management-studio), the migration process using Azure Data Studio with DMS automatically migrates your Always Encrypted keys to your target Azure SQL Managed Instance or SQL Server on Azure Virtual Machine.
+
+* If your database backups are in a network file share, provide a machine on which to install the [self-hosted integration runtime](../data-factory/create-self-hosted-integration-runtime.md) to access and migrate database backups. The Azure PowerShell or Azure CLI modules provide the authentication keys to register your self-hosted integration runtime. In preparation for the migration, ensure that the machine where you plan to install the self-hosted integration runtime has the following outbound firewall rules and domain names enabled:
+
+ | Domain names | Outbound ports | Description |
+ | -- | -- | |
 | Public Cloud: `{datafactory}.{region}.datafactory.azure.net`<br> or `*.frontend.clouddatahub.net` <br> Azure Government: `{datafactory}.{region}.datafactory.azure.us` <br> China: `{datafactory}.{region}.datafactory.azure.cn` | 443 | Required by the self-hosted integration runtime to connect to the Data Migration service. <br>For a newly created Data Factory in the public cloud, locate the FQDN from your self-hosted integration runtime key, which is in the format `{datafactory}.{region}.datafactory.azure.net`. For an older Data Factory, if you don't see the FQDN in your self-hosted integration runtime key, use `*.frontend.clouddatahub.net` instead. |
+ | `download.microsoft.com` | 443 | Required by the self-hosted integration runtime for downloading the updates. If you have disabled auto-update, you can skip configuring this domain. |
+ | `*.core.windows.net` | 443 | Used by the self-hosted integration runtime that connects to the Azure storage account for uploading database backups from your network share |
+
+ > [!TIP]
+ > If your database backup files are already provided in an Azure storage account, a self-hosted integration runtime is not required during the migration process.
+
+* When using a self-hosted integration runtime, make sure that the machine where the runtime is installed can connect to the source SQL Server instance and the network file share where backup files are located. Outbound port 445 should be enabled to allow access to the network file share.
+* If you're using the Azure Database Migration Service for the first time, ensure that the Microsoft.DataMigration resource provider is registered in your subscription. You can follow the steps to [register the resource provider](./quickstart-create-data-migration-service-portal.md#register-the-resource-provider).
+
+## Automate database migrations
+Using Azure PowerShell [Az.DataMigration](/powershell/module/az.datamigration) or Azure CLI [az datamigration](/cli/azure/datamigration), you can migrate databases by automating the creation of the Azure Database Migration Service, configuring database migrations for online migration, and performing a cutover. More commands and functionality are documented in [Azure Samples](https://github.com/Azure-Samples/data-migration-sql).
+
+Example of automating migration of a SQL Server database using Azure CLI:
+
+**Step 1: Create an Azure Database Migration Service instance, which orchestrates all the migration activities for your database.**
+```azurepowershell-interactive
+#STEP 1: Create Database Migration Service
+az datamigration sql-service create --resource-group "myRG" --sql-migration-service-name "myMigrationService" --location "EastUS2"
+```
+
+**Step 2: Configure and start online database migration from SQL Server on-premises (with backups in Azure storage) to Azure SQL Managed Instance.**
+```azurepowershell-interactive
+#STEP 2: Start Migration
+az datamigration sql-managed-instance create `
+--source-location '{\"AzureBlob\":{\"storageAccountResourceId\":\"/subscriptions/mySubscriptionID/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/dbbackupssqlbits\",\"accountKey\":\"myAccountKey\",\"blobContainerName\":\"dbbackups\"}}' `
+--migration-service "/subscriptions/mySubscriptionID/resourceGroups/myRG/providers/Microsoft.DataMigration/SqlMigrationServices/myMigrationService" `
+--scope "/subscriptions/mySubscriptionID/resourceGroups/myRG/providers/Microsoft.Sql/managedInstances/mySQLMI" `
+--source-database-name "AdventureWorks2008" `
+--source-sql-connection authentication="SqlAuthentication" data-source="mySQLServer" password="myPassword" user-name="sqluser" `
+--target-db-name "AdventureWorks2008" `
+--resource-group myRG `
+--managed-instance-name mySQLMI
+```
+
+**Step 3: Perform a migration cutover once all backups are restored to Azure SQL Managed Instance.**
+```azurepowershell-interactive
+#STEP 3: Get migration ID and perform Cutover
+$migOpId = az datamigration sql-managed-instance show --managed-instance-name "mySQLMI" --resource-group "myRG" --target-db-name "AdventureWorks2008" --expand=MigrationStatusDetails --query "properties.migrationOperationId"
+az datamigration sql-managed-instance cutover --managed-instance-name "mySQLMI" --resource-group "myRG" --target-db-name "AdventureWorks2008" --migration-operation-id $migOpId
+```
+
+## Next steps
+
+- For Azure PowerShell reference documentation for SQL Server database migrations, see [Az.DataMigration](/powershell/module/az.datamigration).
+- For Azure CLI reference documentation for SQL Server database migrations, see [az datamigration](/cli/azure/datamigration).
+- For the Azure Samples code repository, see [data-migration-sql](https://github.com/Azure-Samples/data-migration-sql).
event-grid Event Schema Media Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/event-schema-media-services.md
This article provides the schemas and properties for Media Services events.
## Job-related event types
-Media Services emits the **Job-related** event types described below. There are two categories for the **Job-related** events: "Monitoring Job State Changes" and "Monitoring Job Output State Changes".
+Media Services emits the **Job-related** event types described below. There are two categories for the **Job-related** events: "Monitoring Job State Changes" and "Monitoring Job Output State Changes".
-You can register for all of the events by subscribing to the JobStateChange event. Or, you can subscribe for specific events only (for example, final states like JobErrored, JobFinished, and JobCanceled).
+You can register for all of the events by subscribing to the JobStateChange event. Or, you can subscribe for specific events only (for example, final states like JobErrored, JobFinished, and JobCanceled).
### Monitoring Job state changes
See [Schema examples](#event-schema-examples) that follow.
A job may contain multiple job outputs (if you configured the transform to have multiple job outputs). If you want to track the details of an individual job output, listen for a job output change event.
-Each **Job** is going to be at a higher level than **JobOutput**, thus job output events get fired inside of a corresponding job.
+Each **Job** is at a higher level than **JobOutput**, so job output events fire within a corresponding job.
The error messages in `JobFinished`, `JobCanceled`, and `JobError` output the aggregated results for each job output when all of them are finished, whereas the job output events fire as each task finishes. For example, if you have an encoding output, followed by a Video Analytics output, you would get two events firing as job output events before the final JobFinished event fires with the aggregated data.
See [Schema examples](#event-schema-examples) that follow.
## Live event types
-Media Services also emits the **Live** event types described below. There are two categories for the **Live** events: stream-level events and track-level events.
+Media Services also emits the **Live** event types described below. There are two categories for the **Live** events: stream-level events and track-level events.
### Stream-level events
See [Schema examples](#event-schema-examples) that follow.
### Track-level events
-Track-level events are raised per track.
+Track-level events are raised per track.
> [!NOTE] > All track-level events are raised after a live encoder is connected.
See [Schema examples](#event-schema-examples) that follow.
# [Event Grid event schema](#tab/event-grid-event-schema)
-The following example shows the schema of the **JobStateChange** event:
+The following example shows the schema of the **JobStateChange** event:
```json [
The following example shows the schema of the **JobStateChange** event:
# [Cloud event schema](#tab/cloud-event-schema)
-The following example shows the schema of the **JobStateChange** event:
+The following example shows the schema of the **JobStateChange** event:
```json [
The example schema looks similar to the following:
### LiveEventConnectionRejected
-The following example shows the schema of the **LiveEventConnectionRejected** event:
+The following example shows the schema of the **LiveEventConnectionRejected** event:
```json [
The following example shows the schema of the **LiveEventConnectionRejected** ev
"eventType": "Microsoft.Media.LiveEventConnectionRejected", "eventTime": "2018-01-16T01:57:26.005121Z", "id": "b303db59-d5c1-47eb-927a-3650875fded1",
- "data": {
+ "data": {
"streamId":"Mystream1", "ingestUrl": "http://abc.ingest.isml", "encoderIp": "118.238.251.xxx",
The data object has the following properties:
| Property | Type | Description | | -- | - | -- |
-| `streamId` | string | Identifier of the stream or connection. Encoder or customer is responsible to add this ID in the ingest URL. |
-| `ingestUrl` | string | Ingest URL provided by the live event. |
+| `streamId` | string | Identifier of the stream or connection. The encoder or customer is responsible for adding this ID in the ingest URL. |
+| `ingestUrl` | string | Ingest URL provided by the live event. |
| `encoderIp` | string | IP of the encoder. | | `encoderPort` | string | Port of the encoder from which this stream is coming. | | `resultCode` | string | The reason the connection was rejected. The result codes are listed in the following table. |
-You can find the error result codes in [live Event error codes](../media-services/latest/live-event-error-codes-reference.md).
+You can find the error result codes in [Live Event error codes](/media-services/latest/live-event-error-codes-reference).
### LiveEventEncoderConnected # [Event Grid event schema](#tab/event-grid-event-schema)
-The following example shows the schema of the **LiveEventEncoderConnected** event:
+The following example shows the schema of the **LiveEventEncoderConnected** event:
```json [
- {
+ {
"topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>", "subject": "liveEvent/mle1", "eventType": "Microsoft.Media.LiveEventEncoderConnected",
The following example shows the schema of the **LiveEventEncoderConnected** even
# [Cloud event schema](#tab/cloud-event-schema)
-The following example shows the schema of the **LiveEventEncoderConnected** event:
+The following example shows the schema of the **LiveEventEncoderConnected** event:
```json [
- {
+ {
"source": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>", "subject": "liveEvent/mle1", "type": "Microsoft.Media.LiveEventEncoderConnected",
The data object has the following properties:
# [Event Grid event schema](#tab/event-grid-event-schema)
-The following example shows the schema of the **LiveEventEncoderDisconnected** event:
+The following example shows the schema of the **LiveEventEncoderDisconnected** event:
```json [
- {
+ {
"topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>", "subject": "liveEvent/mle1", "eventType": "Microsoft.Media.LiveEventEncoderDisconnected",
The following example shows the schema of the **LiveEventEncoderDisconnected** e
# [Cloud event schema](#tab/cloud-event-schema)
-The following example shows the schema of the **LiveEventEncoderDisconnected** event:
+The following example shows the schema of the **LiveEventEncoderDisconnected** event:
```json [
- {
+ {
"source": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>", "subject": "liveEvent/mle1", "type": "Microsoft.Media.LiveEventEncoderDisconnected",
The data object has the following properties:
| Property | Type | Description | | -- | - | -- |
-| `streamId` | string | Identifier of the stream or connection. Encoder or customer is responsible to add this ID in the ingest URL. |
-| `ingestUrl` | string | Ingest URL provided by the live event. |
+| `streamId` | string | Identifier of the stream or connection. The encoder or customer is responsible for adding this ID in the ingest URL. |
+| `ingestUrl` | string | Ingest URL provided by the live event. |
| `encoderIp` | string | IP of the encoder. | | `encoderPort` | string | Port of the encoder from which this stream is coming. | | `resultCode` | string | The reason for the encoder disconnecting. It could be a graceful disconnect or the result of an error. The result codes are listed in the following table. |
-You can find the error result codes in [live Event error codes](../media-services/latest/live-event-error-codes-reference.md).
+You can find the error result codes in [Live Event error codes](/media-services/latest/live-event-error-codes-reference).
The graceful disconnect result codes are:
The graceful disconnect result codes are:
# [Event Grid event schema](#tab/event-grid-event-schema)
-The following example shows the schema of the **LiveEventIncomingDataChunkDropped** event:
+The following example shows the schema of the **LiveEventIncomingDataChunkDropped** event:
```json [
The following example shows the schema of the **LiveEventIncomingDataChunkDroppe
"eventType": "Microsoft.Media.LiveEventIncomingDataChunkDropped", "eventTime": "2018-01-16T01:57:26.005121Z", "id": "03da9c10-fde7-48e1-80d8-49936f2c3e7d",
- "data": {
+ "data": {
"trackType": "Video", "trackName": "Video", "bitrate": 300000,
The following example shows the schema of the **LiveEventIncomingDataChunkDroppe
# [Cloud event schema](#tab/cloud-event-schema)
-The following example shows the schema of the **LiveEventIncomingDataChunkDropped** event:
+The following example shows the schema of the **LiveEventIncomingDataChunkDropped** event:
```json [
The following example shows the schema of the **LiveEventIncomingDataChunkDroppe
"type": "Microsoft.Media.LiveEventIncomingDataChunkDropped", "time": "2018-01-16T01:57:26.005121Z", "id": "03da9c10-fde7-48e1-80d8-49936f2c3e7d",
- "data": {
+ "data": {
"trackType": "Video", "trackName": "Video", "bitrate": 300000,
The data object has the following properties:
# [Event Grid event schema](#tab/event-grid-event-schema)
-The following example shows the schema of the **LiveEventIncomingStreamReceived** event:
+The following example shows the schema of the **LiveEventIncomingStreamReceived** event:
```json [
The following example shows the schema of the **LiveEventIncomingStreamReceived*
# [Cloud event schema](#tab/cloud-event-schema)
-The following example shows the schema of the **LiveEventIncomingStreamReceived** event:
+The following example shows the schema of the **LiveEventIncomingStreamReceived** event:
```json [
The data object has the following properties:
# [Event Grid event schema](#tab/event-grid-event-schema)
-The following example shows the schema of the **LiveEventIncomingStreamsOutOfSync** event:
+The following example shows the schema of the **LiveEventIncomingStreamsOutOfSync** event:
```json [
The following example shows the schema of the **LiveEventIncomingStreamsOutOfSyn
"typeOfStreamWithMinLastTimestamp": "Audio", "maxLastTimestamp": "366000", "typeOfStreamWithMaxLastTimestamp": "Video",
- "timescaleOfMinLastTimestamp": "10000000",
- "timescaleOfMaxLastTimestamp": "10000000"
+ "timescaleOfMinLastTimestamp": "10000000",
+ "timescaleOfMaxLastTimestamp": "10000000"
}, "dataVersion": "1.0", "metadataVersion": "1"
The following example shows the schema of the **LiveEventIncomingStreamsOutOfSyn
# [Cloud event schema](#tab/cloud-event-schema)
-The following example shows the schema of the **LiveEventIncomingStreamsOutOfSync** event:
+The following example shows the schema of the **LiveEventIncomingStreamsOutOfSync** event:
```json [
The following example shows the schema of the **LiveEventIncomingStreamsOutOfSyn
"typeOfStreamWithMinLastTimestamp": "Audio", "maxLastTimestamp": "366000", "typeOfStreamWithMaxLastTimestamp": "Video",
- "timescaleOfMinLastTimestamp": "10000000",
- "timescaleOfMaxLastTimestamp": "10000000"
+ "timescaleOfMinLastTimestamp": "10000000",
+ "timescaleOfMaxLastTimestamp": "10000000"
}, "specversion": "1.0" }
The data object has the following properties:
# [Event Grid event schema](#tab/event-grid-event-schema)
-The following example shows the schema of the **LiveEventIncomingVideoStreamsOutOfSync** event:
+The following example shows the schema of the **LiveEventIncomingVideoStreamsOutOfSync** event:
```json [
The following example shows the schema of the **LiveEventIncomingVideoStreamsOut
"firstDuration": "2000", "secondTimestamp": "2162057216", "secondDuration": "2000",
- "timescale": "10000000"
+ "timescale": "10000000"
}, "dataVersion": "1.0", "metadataVersion": "1"
The following example shows the schema of the **LiveEventIncomingVideoStreamsOut
# [Cloud event schema](#tab/cloud-event-schema)
-The following example shows the schema of the **LiveEventIncomingVideoStreamsOutOfSync** event:
+The following example shows the schema of the **LiveEventIncomingVideoStreamsOutOfSync** event:
```json [
The following example shows the schema of the **LiveEventIncomingVideoStreamsOut
"firstDuration": "2000", "secondTimestamp": "2162057216", "secondDuration": "2000",
- "timescale": "10000000"
+ "timescale": "10000000"
}, "specversion": "1.0" }
The data object has the following properties:
# [Event Grid event schema](#tab/event-grid-event-schema)
-The following example shows the schema of the **LiveEventIngestHeartbeat** event:
+The following example shows the schema of the **LiveEventIngestHeartbeat** event:
```json [
The following example shows the schema of the **LiveEventIngestHeartbeat** event
# [Cloud event schema](#tab/cloud-event-schema)
-The following example shows the schema of the **LiveEventIngestHeartbeat** event:
+The following example shows the schema of the **LiveEventIngestHeartbeat** event:
```json [
The data object has the following properties:
# [Event Grid event schema](#tab/event-grid-event-schema)
-The following example shows the schema of the **LiveEventTrackDiscontinuityDetected** event:
+The following example shows the schema of the **LiveEventTrackDiscontinuityDetected** event:
```json [
The following example shows the schema of the **LiveEventTrackDiscontinuityDetec
# [Cloud event schema](#tab/cloud-event-schema)
-The following example shows the schema of the **LiveEventTrackDiscontinuityDetected** event:
+The following example shows the schema of the **LiveEventTrackDiscontinuityDetected** event:
```json [
An event has the following top-level data:
## Next steps
-[Register for job state change events](../media-services/latest/monitoring/job-state-events-cli-how-to.md)
+[Register for job state change events](/media-services/latest/monitoring/job-state-events-cli-how-to)
## See also - [EventGrid .NET SDK that includes Media Service events](https://www.nuget.org/packages/Microsoft.Azure.EventGrid/) - [Definitions of Media Services events](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/eventgrid/data-plane/Microsoft.Media/stable/2018-01-01/MediaServices.json)-- [Live Event error codes](../media-services/latest/live-event-error-codes-reference.md)
+- [Live Event error codes](/media-services/latest/live-event-error-codes-reference)
governance Assign Policy Terraform https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/assign-policy-terraform.md
for Azure Policy use the
```hcl provider "azurerm" {
- version = "~>2.0"
features {} }-
- resource "azurerm_policy_assignment" "auditvms" {
- name = "audit-vm-manageddisks"
- scope = var.cust_scope
- policy_definition_id = "/providers/Microsoft.Authorization/policyDefinitions/06a78e20-9358-41c9-923c-fb736d382a4d"
- description = "Shows all virtual machines not using managed disks"
- display_name = "Audit VMs without managed disks Assignment"
+
+ terraform {
+ required_providers {
+ azurerm = {
+ source = "hashicorp/azurerm"
+ version = ">= 2.96.0"
+ }
+ }
+ }
+
+ resource "azurerm_resource_policy_assignment" "auditvms" {
+ name = "audit-vm-manageddisks"
+ resource_id = var.cust_scope
+ policy_definition_id = "/providers/Microsoft.Authorization/policyDefinitions/06a78e20-9358-41c9-923c-fb736d382a4d"
+ description = "Shows all virtual machines not using managed disks"
+ display_name = "Audit VMs without managed disks assignment"
} ```
for Azure Policy use the
```hcl output "assignment_id" {
- value = azurerm_policy_assignment.auditvms.id
+ value = azurerm_resource_policy_assignment.auditvms.id
} ```
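To cross-check the assignment after `terraform apply` completes, you can query it with Azure CLI. This is a minimal sketch, assuming the assignment was created at the scope you supplied in `var.cust_scope`; the scope value below is a placeholder:

```azurecli
# Verify the policy assignment that Terraform created at the given scope
az policy assignment show --name "audit-vm-manageddisks" --scope "<your-scope-resource-id>"
```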
guides Azure Developer Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/guides/developer/azure-developer-guide.md
Azure Spring Cloud is a serverless microservices platform that enables you to bu
* Easily bind connections between your apps and Azure services such as Azure Database for MySQL and Azure Cache for Redis. * Monitor and troubleshoot microservices and applications using enterprise-grade unified monitoring tools that offer deep insights on application dependencies and operational telemetry.
-> **When to use:** As a fully managed service Azure Spring Cloud is a good choice when you're minimizing operational cost running Spring Boot/Spring Cloud based microservices on Azure.
+> **When to use:** As a fully managed service, Azure Spring Cloud is a good choice when you want to minimize the operational cost of running Spring Boot/Spring Cloud based microservices on Azure.
> > **Get started:** [Deploy your first Spring Boot app in Azure Spring Cloud](../../spring-cloud/quickstart.md).
Along with REST APIs, many Azure services also let you programmatically manage r
* [Go](/azure/go) Services such as [Mobile Apps](/previous-versions/azure/app-service-mobile/app-service-mobile-dotnet-how-to-use-client-library)
-and [Azure Media Services](../../media-services/previous/media-services-dotnet-how-to-use.md) provide client-side SDKs to let you access services from web and mobile client apps.
+and [Azure Media Services](/media-services/previous/media-services-dotnet-how-to-use) provide client-side SDKs to let you access services from web and mobile client apps.
### Azure Resource Manager
industrial-iot Industrial Iot Platform Versions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/industrial-iot/industrial-iot-platform-versions.md
Last updated 11/10/2021
-# Azure Industrial IoT Platform Release 2.8.1
+# Azure Industrial IoT Platform Release 2.8.2
+We are pleased to announce the release of version 2.8.2 of our Industrial IoT Platform components as the second patch update of the 2.8 Long-Term Support (LTS) release. This release contains important backward compatibility fixes, including Direct Methods API support with version 2.5.x, performance optimizations, security updates, and bug fixes.
+
+## Azure Industrial IoT Platform Release 2.8.1
We are pleased to announce the release of version 2.8.1 of our Industrial IoT Platform components. This is the first patch update of the 2.8 Long-Term Support (LTS) release. It contains important security updates, bug fixes, and performance optimizations. ## Azure Industrial IoT Platform Release 2.8
We are pleased to announce the declaration of Long-Term Support (LTS) for versio
|[2.7.206](https://github.com/Azure/Industrial-IoT/tree/release/2.7.206) |Stable |January 2021 |Configuration through REST API (orchestrated mode), supports Samples telemetry format as well as PubSub - [Release notes](https://github.com/Azure/Industrial-IoT/releases/tag/2.7.206)| |[2.8](https://github.com/Azure/Industrial-IoT/tree/2.8.0) |Long-term support (LTS)|July 2021 |IoT Edge update to 1.1 LTS, OPC stack logging and tracing for better OPC Publisher diagnostics, Security fixes - [Release notes](https://github.com/Azure/Industrial-IoT/releases/tag/2.8.0)| |[2.8.1](https://github.com/Azure/Industrial-IoT/tree/2.8.1) |Patch release for LTS 2.8|November 2021 |Critical bug fixes, security updates, performance optimizations for LTS v2.8|
+|[2.8.2](https://github.com/Azure/Industrial-IoT/tree/2.8.2) |Patch release for LTS 2.8|March 2022 |Backwards compatibility with 2.5.x, bug fixes, security updates, performance optimizations for LTS v2.8|
## Next steps
iot-dps How To Troubleshoot Dps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/how-to-troubleshoot-dps.md
To learn more, see [alerts in Azure Monitor](../azure-monitor/alerts/alerts-over
1. Sign in to the [Azure portal](https://portal.azure.com).
-2. Browse to your IoT hub.
+2. Browse to your Device Provisioning Service.
3. Select **Diagnostics settings**.
-4. Select **Turn on diagnostics**.
+4. Select **Add diagnostic setting**.
-5. Enable the desired logs to be collected.
+5. Configure the desired logs to be collected.
| Log Name | Description | |-|| | DeviceOperations | Logs related to device connection events | | ServiceOperations | Event logs related to using service SDK (e.g. Creating or updating enrollment groups)|
-6. Turn on **Send to Log Analytics** ([see pricing](https://azure.microsoft.com/pricing/details/log-analytics/)).
+6. Select the **Send to Log Analytics** checkbox ([see pricing](https://azure.microsoft.com/pricing/details/log-analytics/)) and save.
7. Go to **Logs** tab in the Azure portal under Device Provisioning Service resource.
-8. Click **Run** to view recent events.
+8. Enter **AzureDiagnostics** as the query and select **Run** to view recent events.
9. If there are results, look for `OperationName`, `ResultType`, `ResultSignature`, and `ResultDescription` (error message) to get more detail on the error.
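If you prefer the command line to the portal **Logs** tab, here's a minimal sketch under stated assumptions: your diagnostic setting sends logs to a Log Analytics workspace (substitute its workspace ID below), and Device Provisioning Service events land in the `AzureDiagnostics` table under the `MICROSOFT.DEVICES` resource provider:

```azurecli
# Query recent Device Provisioning Service diagnostic events from Log Analytics
az monitor log-analytics query --workspace "<workspace-customer-id>" --timespan "PT1H" \
  --analytics-query "AzureDiagnostics | where ResourceProvider == 'MICROSOFT.DEVICES' | project TimeGenerated, OperationName, ResultType, ResultDescription"
```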
Use this table to understand and resolve common errors.
| 404 | The Device Provisioning Service instance, or a resource (e.g. an enrollment) does not exist. |404 Not Found | | 412 | The ETag in the request does not match the ETag of the existing resource, as per RFC7232. | 412 Precondition failed | | 429 | Operations are being throttled by the service. For specific service limits, see [IoT Hub Device Provisioning Service limits](../azure-resource-manager/management/azure-subscription-service-limits.md#iot-hub-device-provisioning-service-limits). | 429 Too many requests |
-| 500 | An internal error occurred. | 500 Internal Server Error|
+| 500 | An internal error occurred. | 500 Internal Server Error|
key-vault Tutorial Import Certificate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/certificates/tutorial-import-certificate.md
Previously updated : 04/16/2020 Last updated : 03/16/2022 ms.devlang: azurecli #Customer intent:As a security admin who is new to Azure, I want to use Key Vault to securely store certificates in Azure
If you don't have an Azure subscription, create a [free account](https://azure.m
Sign in to the Azure portal at https://portal.azure.com.
-## Create a vault
+## Create a key vault
-1. From the Azure portal menu, or from the **Home** page, select **Create a resource**.
-2. In the Search box, enter **Key Vault**.
-3. From the results list, choose **Key Vault**.
-4. On the Key Vault section, choose **Create**.
-5. On the **Create key vault** section provide the following information:
- - **Name**: A unique name is required. For this quickstart, we use **Example-Vault**.
- - **Subscription**: Choose a subscription.
- - Under **Resource Group**, choose **Create new** and enter a resource group name.
- - In the **Location** pull-down menu, choose a location.
- - Leave the other options to their defaults.
-6. After providing the information above, select **Create**.
+Create a key vault using one of these three methods (a minimal CLI sketch follows the list):
-Take note of the two properties listed below:
+- [Create a key vault using the Azure portal](../general/quick-create-portal.md)
+- [Create a key vault using the Azure CLI](../general/quick-create-cli.md)
+- [Create a key vault using Azure PowerShell](../general/quick-create-powershell.md)
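+
+For example, here's a minimal Azure CLI sketch of the creation step; the resource group name and location are placeholder values, and the key vault name must be globally unique:
+
+```azurecli
+# Create a resource group, then create the key vault in it
+az group create --name "myResourceGroup" --location "EastUS"
+az keyvault create --name "<your-unique-keyvault-name>" --resource-group "myResourceGroup" --location "EastUS"
+```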
-* **Vault Name**: In the example, this is **Example-Vault**. You will use this name for other steps.
-* **Vault URI**: In the example, this is https://example-vault.vault.azure.net/. Applications that use your vault through its REST API must use this URI.
+## Import a certificate to your key vault
-At this point, your Azure account is the only one authorized to perform operations on this new vault.
-
-![Output after Key Vault creation completes](../media/certificates/tutorial-import-cert/vault-properties.png)
-
-## Import a certificate to Key Vault
-
-To import a certificate to the vault, you need to have a PEM or PFX certificate file to be on disk. In this case, we will import a certificate with file name called **ExampleCertificate**.
+To import a certificate to the vault, you need to have a PEM or PFX certificate file on disk. If the certificate is in PEM format, the PEM file must contain the key as well as the x509 certificates. This operation requires the certificates/import permission.
> [!IMPORTANT]
-> In Azure Key Vault, supported certificate formats are PFX and PEM.
+> In Azure Key Vault, supported certificate formats are PFX and PEM.
> - .pem file format contains one or more X509 certificate files. > - .pfx file format is an archive file format for storing several cryptographic objects in a single file i.e. server certificate (issued for your domain), a matching private key, and may optionally include an intermediate CA.
+In this case, we will import a certificate called **ExampleCertificate** from a file at a path like **/path/to/cert.pem**. You can import a certificate with the Azure portal, Azure CLI, or Azure PowerShell.
+
+# [Azure portal](#tab/azure-portal)
+ 1. On the Key Vault properties pages, select **Certificates**. 2. Click on **Generate/Import**. 3. On the **Create a certificate** screen choose the following values:
To import a certificate to the vault, you need to have a PEM or PFX certificate
- **Password** : If you are uploading a password protected certificate file, provide that password here. Otherwise, leave it blank. Once the certificate file is successfully imported, key vault will remove that password. 4. Click **Create**.
-![Certificate properties](../media/certificates/tutorial-import-cert/cert-import.png)
-By adding a certificate using **Import** method, Azure Key vault will automatically populate certificate parameters (i.e. validity period, Issuer name, activation date etc.).
+When importing a certificate, Azure Key Vault will automatically populate certificate parameters (that is, validity period, issuer name, activation date, and so on).
-Once you receive the message that the certificate has been successfully imported, you may click on it on the list to view its properties.
+Once you receive the message that the certificate has been successfully imported, you can select it in the list to view its properties.
-![Screenshot that shows where to view the certificate properties.](../media/certificates/tutorial-import-cert/current-version-hidden.png)
-## Import a certificate using Azure CLI
+# [Azure CLI](#tab/azure-cli)
-Import a certificate into a specified key vault. To
-import an existing valid certificate, containing a private key, into Azure Key Vault, the file to be imported can be in either PFX or PEM format. If the certificate is in PEM format, the PEM file must contain the key as well as x509 certificates. This operation requires the certificates/import permission.
+Import a certificate into your key vault using the Azure CLI [az keyvault certificate import](/cli/azure/keyvault/certificate#az-keyvault-certificate-import) command:
```azurecli
-az keyvault certificate import --file
- --name
- --vault-name
- [--disabled {false, true}]
- [--only-show-errors]
- [--password]
- [--policy]
- [--subscription]
- [--tags]
+az keyvault certificate import --vault-name "<your-key-vault-name>" -n "ExampleCertificate" -f "/path/to/ExampleCertificate.pem"
```
-Learn more about the [parameters](/cli/azure/keyvault/certificate#az-keyvault-certificate-import).
-
-After importing the certificate, you can view the certificate using [Certificate show](/cli/azure/keyvault/certificate#az-keyvault-certificate-show)
-
+After importing the certificate, you can view the certificate using the Azure CLI [az keyvault certificate show](/cli/azure/keyvault/certificate#az-keyvault-certificate-show) command.
```azurecli
-az keyvault certificate show [--id]
- [--name]
- [--only-show-errors]
- [--subscription]
- [--vault-name]
- [--version]
+az keyvault certificate show --vault-name "<your-key-vault-name>" --name "ExampleCertificate"
```
-Now, you have created a Key vault, imported a certificate and viewed Certificate's properties.
-## Import a certificate using Azure PowerShell
+# [Azure PowerShell](#tab/azure-powershell)
+
+You can import a certificate into Key Vault using the Azure PowerShell [Import-AzKeyVaultCertificate](/powershell/module/az.keyvault/import-azkeyvaultcertificate) cmdlet.
+```azurepowershell
+$Password = ConvertTo-SecureString -String "123" -AsPlainText -Force
+Import-AzKeyVaultCertificate -VaultName "<your-key-vault-name>" -Name "ExampleCertificate" -FilePath "C:\path\to\ExampleCertificate.pem" -Password $Password
```
-Import-AzureKeyVaultCertificate
- [-VaultName] <String>
- [-Name] <String>
- -FilePath <String>
- [-Password <SecureString>]
- [-Tag <Hashtable>]
- [-DefaultProfile <IAzureContextContainer>]
- [-WhatIf]
- [-Confirm]
- [<CommonParameters>]
+
+After importing the certificate, you can view the certificate using the Azure PowerShell [Get-AzKeyVaultCertificate](/powershell/module/az.keyvault/get-azkeyvaultcertificate) cmdlet.
+
+```azurepowershell
+Get-AzKeyVaultCertificate -VaultName "<your-key-vault-name>" -Name "ExampleCertificate"
```
-Learn more about the [parameters](/powershell/module/azurerm.keyvault/import-azurekeyvaultcertificate?).
+
+Now you have created a key vault, imported a certificate, and viewed its properties.
## Clean up resources
When no longer needed, delete the resource group, which deletes the Key Vault an
2. Select **Delete resource group**. 3. In the **TYPE THE RESOURCE GROUP NAME:** box type in the name of the resource group and select **Delete**. - ## Next steps In this tutorial, you created a Key Vault and imported a certificate in it. To learn more about Key Vault and how to integrate it with your applications, continue on to the articles below. - Read more about [Managing certificate creation in Azure Key Vault](./create-certificate-scenarios.md) - See examples of [Importing Certificates Using REST APIs](/rest/api/keyvault/certificates/import-certificate/import-certificate)-- Review the [Key Vault security overview](../general/security-features.md)
+- Review the [Key Vault security overview](../general/security-features.md)
key-vault Tutorial Rotate Certificates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/certificates/tutorial-rotate-certificates.md
Sign in to the Azure portal at https://portal.azure.com.
## Create a vault
-Create an Azure Key Vault using [Azure portal](../general/quick-create-portal.md), [Azure CLI](../general/quick-create-cli.md), or [Azure PowerShell](../general/quick-create-powershell.md). In the example, the key vault name is **Example-Vault**.
+Create a key vault using one of these three methods:
-![Output after key vault creation finishes](../media/certificates/tutorial-import-cert/vault-properties.png)
+- [Create a key vault using the Azure portal](../general/quick-create-portal.md)
+- [Create a key vault using the Azure CLI](../general/quick-create-cli.md)
+- [Create a key vault using Azure PowerShell](../general/quick-create-powershell.md)
## Create a certificate in Key Vault
key-vault Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/managed-hsm/overview.md
Previously updated : 06/21/2021 Last updated : 03/28/2022 #Customer intent: As an IT Pro, Decision maker or developer I am trying to learn what Managed HSM is and if it offers anything that could be used in my organization.
For pricing information, please see Managed HSM Pools section on [Azure Key Vaul
### Integrated with Azure and Microsoft PaaS/SaaS services -- Generate (or import using [BYOK](hsm-protected-keys-byok.md)) keys and use them to encrypt your data at rest in Azure services such as [Azure Storage](../../storage/common/customer-managed-keys-overview.md), [Azure SQL](../../azure-sql/database/transparent-data-encryption-byok-overview.md), and [Azure Information Protection](/azure/information-protection/byok-price-restrictions).
+- Generate (or import using [BYOK](hsm-protected-keys-byok.md)) keys and use them to encrypt your data at rest in Azure services such as [Azure Storage](../../storage/common/customer-managed-keys-overview.md), [Azure SQL](../../azure-sql/database/transparent-data-encryption-byok-overview.md), [Azure Information Protection](/azure/information-protection/byok-price-restrictions), and [Customer Key for Microsoft 365](/microsoft-365/compliance/customer-key-set-up). For a more complete list of Azure services that work with Managed HSM, see [Data Encryption Models](/azure/security/fundamentals/encryption-models#supporting-services).
### Uses same API and management interfaces as Key Vault
machine-learning Azure Machine Learning Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/azure-machine-learning-release-notes.md
Previously updated : 02/21/2022 Last updated : 03/28/2022 # Azure Machine Learning Python SDK release notes
In this article, learn about Azure Machine Learning Python SDK releases. For th
__RSS feed__: Get notified when this page is updated by copying and pasting the following URL into your feed reader: `https://docs.microsoft.com/api/search/rss?search=%22Azure+machine+learning+release+notes%22&locale=en-us`
+## 2022-03-28
+
+### Azure Machine Learning SDK for Python v1.40.0
+ + **azureml-automl-dnn-nlp**
+ + The Long Range Text feature is now optional and is used only if customers explicitly opt in to it, using the kwarg `enable_long_range_text`
+ + Adding a data validation layer for the multi-class classification scenario, which leverages the same base class as multilabel for common validations, plus a derived class for additional task-specific data validation checks.
+ + **azureml-automl-dnn-vision**
+ + Fixing KeyError while computing class weights.
+ + **azureml-contrib-reinforcementlearning**
+ + SDK warning message for upcoming deprecation of RL service
+ + **azureml-core**
+ + Return logs for runs that went through our new runtime when calling any of the get-logs functions on the run object, including `run.get_details`, `run.get_all_logs`, etc.
+ + Added experimental method Datastore.register_onpremises_hdfs to allow users to create datastores pointing to on-premises HDFS resources.
+ + Updating the CLI documentation in the help command
+ + **azureml-interpret**
+ + For azureml-interpret package, remove shap pin with packaging update. Remove numba and numpy pin after CE env update.
+ + **azureml-mlflow**
+ + Bugfix for MLflow deployment client run_local failing when config object wasn't provided.
+ + **azureml-pipeline-steps**
+ + Remove broken link of deprecated pipeline EstimatorStep
+ + **azureml-responsibleai**
+ + update azureml-responsibleai package to raiwidgets and responsibleai 0.17.0 release
+ + **azureml-train-automl-runtime**
+ + Code generation for automated ML now supports ForecastTCN models (experimental).
+ + Models created via code generation will now have all metrics calculated by default (except normalized mean absolute error, normalized median absolute error, normalized RMSE, and normalized RMSLE in the case of forecasting models). The list of metrics to be calculated can be changed by editing the return value of `get_metrics_names()`. Cross validation will now be used by default for forecasting models created via code generation.
+ + **azureml-training-tabular**
+ + The list of metrics to be calculated can be changed by editing the return value of `get_metrics_names()`. Cross validation will now be used by default for forecasting models created via code generation.
+ + Converted decimal-type y-test values to float so that metrics computation proceeds without errors.
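As a minimal sketch of the NLP opt-in noted above (the workspace config, dataset names, label column, and compute name are hypothetical; only the `enable_long_range_text` kwarg comes from the release note):

```python
from azureml.core import Dataset, Workspace
from azureml.train.automl import AutoMLConfig

ws = Workspace.from_config()  # assumes a config.json for an existing workspace
train_dataset = Dataset.get_by_name(ws, "text-train")  # hypothetical dataset name
valid_dataset = Dataset.get_by_name(ws, "text-valid")  # hypothetical dataset name

automl_config = AutoMLConfig(
    task="text-classification",
    training_data=train_dataset,
    validation_data=valid_dataset,
    label_column_name="label",     # hypothetical label column
    compute_target="gpu-cluster",  # hypothetical compute name
    enable_long_range_text=True,   # opt in to the Long Range Text feature
)
```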
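Similarly, a short hedged sketch of the azureml-core log-retrieval change; `run.get_details` and `run.get_all_logs` are the `Run` methods named in the note, while the experiment name and run ID are placeholders:

```python
from azureml.core import Experiment, Run, Workspace

ws = Workspace.from_config()
run = Run(Experiment(ws, "my-experiment"), run_id="my-run-id")  # placeholder names

details = run.get_details()     # run metadata, now including logs from the new runtime
log_files = run.get_all_logs()  # downloads all log files locally and returns their paths
```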
+
## 2022-02-28

### Azure Machine Learning SDK for Python v1.39.0
machine-learning How To Secure Training Vnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-secure-training-vnet.md
Previously updated : 03/01/2022 Last updated : 03/29/2022 ms.devlang: azurecli
When you enable **No public IP**, your compute cluster doesn't use a public IP f
A compute cluster with **No public IP** enabled has **no inbound communication requirements** from public internet. Specifically, neither inbound NSG rule (`BatchNodeManagement`, `AzureMachineLearning`) is required. You still need to allow inbound from source of **VirtualNetwork** and any port source, to destination of **VirtualNetwork**, and destination port of **29876, 29877**.
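As a hedged sketch of that remaining inbound rule with the `azure-mgmt-network` Python SDK (the subscription ID, resource group, NSG name, rule name, and priority are all placeholders; the source, destination, and ports come from the paragraph above):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import SecurityRule

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Allow inbound VirtualNetwork -> VirtualNetwork traffic on ports 29876-29877.
rule = SecurityRule(
    name="AllowVnetBatchPorts",  # hypothetical rule name
    protocol="*",
    source_address_prefix="VirtualNetwork",
    source_port_range="*",
    destination_address_prefix="VirtualNetwork",
    destination_port_ranges=["29876", "29877"],
    access="Allow",
    direction="Inbound",
    priority=1010,  # placeholder priority
)
client.security_rules.begin_create_or_update(
    "my-rg", "my-nsg", rule.name, rule
).result()
```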
-> [!IMPORTANT]
-> When creating a compute instance with no public IP, the managed identity for your workspace must be assigned the __Owner__ role on the virtual network. For more information on assigning roles, see [Steps to assign an Azure role](../role-based-access-control/role-assignments-steps.md).
- **No public IP** clusters are dependent on [Azure Private Link](how-to-configure-private-link.md) for Azure Machine Learning workspace. A compute cluster with **No public IP** also requires you to disable private endpoint network policies and private link service network policies. These requirements come from Azure private link service and private endpoints and aren't Azure Machine Learning specific. Follow instruction from [Disable network policies for Private Link service](../private-link/disable-private-link-service-network-policy.md) to set the parameters `disable-private-endpoint-network-policies` and `disable-private-link-service-network-policies` on the virtual network subnet.
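A minimal sketch of disabling both subnet policies with the `azure-mgmt-network` Python SDK (subscription ID, resource group, virtual network, and subnet names are placeholders; the two policy settings are the ones named above):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Placeholder resource group, virtual network, and subnet names.
subnet = client.subnets.get("my-rg", "my-vnet", "training-subnet")
subnet.private_endpoint_network_policies = "Disabled"
subnet.private_link_service_network_policies = "Disabled"
client.subnets.begin_create_or_update(
    "my-rg", "my-vnet", "training-subnet", subnet
).result()
```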
machine-learning Tutorial Create Secure Workspace Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/tutorial-create-secure-workspace-template.md
The template consists of multiple files. The following table describes what each
# [Bicep](#tab/bicep)
-To run the Terraform template, use the following commands from the `machine-learning-end-to-end-secure` where the `main.bicep` file is:
+To run the Bicep template, use the following commands from the `machine-learning-end-to-end-secure` directory where the `main.bicep` file is:
1. To create a new Azure Resource Group, use the following command. Replace `exampleRG` with your resource group name, and `eastus` with the Azure region you want to use:
After the template completes, use the following steps to connect to the DSVM:
To continue learning how to use the secured workspace from the DSVM, see [Tutorial: Get started with a Python script in Azure Machine Learning](tutorial-1st-experiment-hello-world.md).
-To learn more about common secure workspace configurations and input/output requirements, see [Azure Machine Learning secure workspace traffic flow](concept-secure-network-traffic-flow.md).
+To learn more about common secure workspace configurations and input/output requirements, see [Azure Machine Learning secure workspace traffic flow](concept-secure-network-traffic-flow.md).
marketplace Azure Private Plan Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/azure-private-plan-troubleshooting.md
Title: Troubleshoot private plans in the commercial marketplace description: Troubleshoot private plans in the commercial marketplace-+
While troubleshooting the Azure Subscription Hierarchy, keep these things in min
## Troubleshooting Checklist

-- ISV to ensure the SaaS private plan is using the correct tenant ID for the customer - [How to find your Azure Active Directory tenant ID](../active-directory/fundamentals/active-directory-how-to-find-tenant.md). For VMs use the [Azure Subscription ID. (video guide)](../media-services/latest/setup-azure-subscription-how-to.md?tabs=portal)
+- ISV to ensure the SaaS private plan is using the correct tenant ID for the customer - [How to find your Azure Active Directory tenant ID](../active-directory/fundamentals/active-directory-how-to-find-tenant.md). For VMs, use the [Azure subscription ID (video guide)](/media-services/latest/setup-azure-subscription-how-to?tabs=portal).
- ISV to ensure that the customer is not buying through a CSP. Private plans are not available on a CSP-managed subscription.
- Customer to ensure they're logging in with an email ID that is registered under the same tenant ID (use the same user ID they used in step #1 above).
- ISV to ask the customer to find the Private Plan in Azure Marketplace: [Private plans in Azure Marketplace](/marketplace/private-plans)
marketplace Marketplace Metering Service Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/marketplace-metering-service-apis.md
description: The usage event API allows you to emit usage events for SaaS offers
Previously updated : 12/06/2021 Last updated : 03/30/2022
Bad request. The batch contained more than 25 usage events.
Code: 403<br> Forbidden. The authorization token isn't provided, is invalid or expired. Or the request is attempting to access a subscription for an offer that was published with a different Azure AD App ID from the one used to create the authorization token.
+## Metered billing retrieve usage events
+
+You can call the usage events API to get the list of usage events. ISVs can use this API to see the usage events posted over a configurable period of time, and the state of those events at the time the API is called.
+
+GET: https://marketplaceapi.microsoft.com/api/usageEvents
+
+*Query parameters*:
+
+| Parameter | Recommendation |
+| | - |
+| ApiVersion | Use this format: 2018-08-31 |
+| usageStartDate | DateTime in ISO8601 format. For example, 2020-12-03T15:00 or 2020-12-03 |
+| UsageEndDate (optional) | DateTime in ISO8601 format. Default = current date |
+| offerId (optional) | Default = all available |
+| planId (optional) | Default = all available |
+| dimension (optional) | Default = all available |
+| azureSubscriptionId (optional) | Default = all available |
+| reconStatus (optional) | Default = all available |
+|||
+
+*Possible values of reconStatus*:
+
+| ReconStatus | Description |
+| | - |
+| Submitted | Not yet processed by PC Analytics |
+| Accepted | Matched with PC Analytics |
+| Rejected | Rejected in the pipeline. Contact Microsoft support to investigate the cause. |
+| Mismatch | MarketplaceAPI and Partner Center Analytics quantities are both non-zero but don't match |
+| TestHeaders | Subscription listed with test headers, and therefore not in PC Analytics |
+| DryRun | Submitted with SessionMode=DryRun, and therefore not in PC |
+|||
+
+*Request headers*:
+
+| Content type | Use application/json |
+| | - |
+| x-ms-requestid | Unique string value (preferably a GUID), for tracking the request from the client. If this value is not provided, one will be generated and provided in the response headers. |
+| x-ms-correlationid | Unique string value for operation on the client. This parameter correlates all events from client operation with events on the server side. If this value isn't provided, one will be generated and provided in the response headers. |
+| authorization | A unique access token that identifies the ISV that is making this API call. The format is `Bearer <access_token>` when the token value is retrieved by the publisher. For more information, see:<br><ul><li>SaaS in [Get the token with an HTTP POST](./partner-center-portal/pc-saas-registration.md#get-the-token-with-an-http-post)</li><li>Managed application in [Authentication strategies](marketplace-metering-service-authentication.md)</li></ul> |
+|||
+
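Putting the tables above together, here's a minimal sketch of calling this endpoint with Python's `requests`; the access token and request ID values are placeholders, and the parameter names and casing follow the query parameter table above:

```python
import requests

token = "<access_token>"  # placeholder; see the authorization header row above

response = requests.get(
    "https://marketplaceapi.microsoft.com/api/usageEvents",
    params={
        "ApiVersion": "2018-08-31",
        "usageStartDate": "2020-12-03",  # ISO8601; usageEndDate defaults to today
        # Optional filters: offerId, planId, dimension,
        # azureSubscriptionId, reconStatus.
    },
    headers={
        "content-type": "application/json",
        "authorization": f"Bearer {token}",
        "x-ms-requestid": "11111111-2222-3333-4444-555555555555",  # placeholder GUID
    },
)
response.raise_for_status()
for event in response.json():
    print(event["usageResourceId"], event["reconStatus"], event["submittedQuantity"])
```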
+### Responses
+
+Response payload examples:
+
+*Accepted*
+
+```json
+[
+ {
+ "usageDate": "2020-11-30T00:00:00Z",
+ "usageResourceId": "11111111-2222-3333-4444-555555555555",
+ "dimension": "tokens",
+ "planId": "silver",
+ "planName": "Silver",
+ "offerId": "mycooloffer",
+ "offerName": "My Cool Offer",
+ "offerType": "SaaS",
+ "azureSubscriptionId": "12345678-9012-3456-7890-123456789012",
+ "reconStatus": "Accepted",
+ "submittedQuantity": 17.0,
+ "processedQuantity": 17.0,
+ "submittedCount": 17
+ }
+]
+```
+
+*Submitted*
+
+```json
+[
+ {
+ "usageDate": "2020-11-30T00:00:00Z",
+ "usageResourceId": "11111111-2222-3333-4444-555555555555",
+ "dimension": "tokens",
+ "planId": "silver",
+ "planName": "",
+ "offerId": "mycooloffer",
+ "offerName": "",
+ "offerType": "SaaS",
+ "azureSubscriptionId": "12345678-9012-3456-7890-123456789012",
+ "reconStatus": "Submitted",
+ "submittedQuantity": 17.0,
+ "processedQuantity": 0.0,
+ "submittedCount": 17
+ }
+]
+```
+
+*Mismatch*
+
+```json
+[
+ {
+ "usageDate": "2020-11-30T00:00:00Z",
+ "usageResourceId": "11111111-2222-3333-4444-555555555555",
+ "dimension": "tokens",
+ "planId": "silver",
+ "planName": "Silver",
+ "offerId": "mycooloffer",
+ "offerName": "My Cool Offer",
+ "offerType": "SaaS",
+ "azureSubscriptionId": "12345678-9012-3456-7890-123456789012",
+ "reconStatus": "Mismatch",
+ "submittedQuantity": 17.0,
+ "processedQuantity": 16.0,
+ "submittedCount": 17
+ }
+]
+```
+
+*Rejected*
+
+```json
+[
+ {
+ "usageDate": "2020-11-30T00:00:00Z",
+ "usageResourceId": "11111111-2222-3333-4444-555555555555",
+ "dimension": "tokens",
+ "planId": "silver",
+ "planName": "",
+ "offerId": "mycooloffer",
+ "offerName": "",
+ "offerType": "SaaS",
+ "azureSubscriptionId": "12345678-9012-3456-7890-123456789012",
+ "reconStatus": "Rejected",
+ "submittedQuantity": 17.0,
+ "processedQuantity": 0.0,
+ "submittedCount": 17
+ }
+]
+```
+
+**Status codes**
+
+Code: 403
+Forbidden. The authorization token isn't provided, is invalid or expired. Or the request is attempting to access a subscription for an offer that was published with a different Azure AD App ID from the one used to create the authorization token.
+
## Development and testing best practices

To test the custom meter emission, implement the integration with the metering API, create a plan for your published SaaS offer with custom dimensions defined in it with zero price per unit, and publish this offer as preview so that only a limited set of users can access and test the integration.
Follow the instruction in [Support for the commercial marketplace program in Par
## Next steps
-For more information on metering service APIs , see [Marketplace metering service APIs FAQ](marketplace-metering-service-apis-faq.yml).
+For more information on metering service APIs, see [Marketplace metering service APIs FAQ](marketplace-metering-service-apis-faq.yml).
media-services Azure Media Player Accessibility https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-accessibility.md
- Title: Azure Media Player Accessibility
-description: Learn more about the Azure Media Player's accessibility settings.
---- Previously updated : 04/05/2021--
-# Accessibility #
-
-Azure Media Player works with screen reader capabilities such as Windows Narrator and Apple OSX/iOS VoiceOver. Alternative tags are available for the UI buttons, and the screen reader is capable of reading these alternative tags when the user navigates to them. Additional configurations can be set at the Operating System level.
-
-## Captions and subtitles ##
-
-Azure Media Player currently supports captions only for on-demand events in WebVTT format and for live events using CEA 708. TTML format is currently unsupported. Captions and subtitles allow the player to cater to and empower a broader audience, including people with hearing disabilities or those who want to read along in a different language. Captions also increase video engagement, improve comprehension, and make it possible to view the video in sound-sensitive environments, like a workplace.
-
-## High contrast mode ##
-
-The default UI in Azure Media Player is compliant with most device/browser high contrast view modes. Configurations can be set at the Operating System level.
-
-## Mobility options ##
-
-### Tabbing focus ###
-
-Tabbing focus, provided by general HTML standards, is available in Azure Media Player. In order to enable tab focusing, you must add `tabindex=0` (or another value if you understand how tab ordering is affected in HTML) to the HTML `<video>` like so: `<video ... tabindex=0>...</video>`. On some platforms, the focus for the controls may only be present if the controls are visible and if the platform supports these capabilities.
-
-Once tabbing focus is enabled, the end user can effectively navigate and control the video player without depending on their mouse. Each context menu or controllable element can be navigated to by hitting the tab button and selected with enter or spacebar. Hitting enter or spacebar on a context menu will expand it so the end user can continue tabbing through to select a menu item. Once you have context of the item you wish to select, hit enter or spacebar again to complete the selection.
-
-### HotKeys ###
-
-Azure Media Player supports control through keyboard hotkeys. In a web browser, the only way to control the underlying video element is by having focus on the player. Once the player has focus, hotkeys can control the player functionality. The table below describes the various hotkeys and their associated behavior:
-
-| Hot key | Behavior |
-|-|-|
-| F/f | Player will enter/exit FullScreen mode |
-| M/m | Player volume will mute/unmute |
-| Up and Down Arrow | Player volume will increase/decrease |
-| Left and Right Arrow | Video progress will increase /decrease |
-| 0,1,2,3,4,5,6,7,8,9 | Video progress will be changed to 0%-90% depending on the key pressed |
-| Click Action | Video will play/pause |
-
-## Next steps
-
-- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Api Methods https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-api-methods.md
- Title: Azure Media Player API Methods
-description: The Azure Media Player API allows you to interact with the video through JavaScript, whether the browser is playing the video through HTML5 video, Flash, Silverlight, or any other supported playback technologies.
---- Previously updated : 04/05/2021---
-# API #
-
-The Azure Media Player API allows you to interact with the video through JavaScript, whether the browser is playing the video through HTML5 video, Flash, Silverlight, or any other supported playback technologies.
-
-## Referencing the player ##
-
-To use the API functions, you need access to the player object. Luckily it is easy to get. You just need to make sure your video tag has an ID. The example embed code has an ID of `vid1`. If you have multiple videos on one page, make sure every video tag has a unique ID.
-
-`var myPlayer = amp('vid1');`
-
-> [!NOTE]
-> If the player hasn't been initialized yet via the data-setup attribute or another method, this will also initialize the player.
-
-## Wait until the player is ready ##
-
-The time it takes Azure Media Player to set up the video and API will vary depending on the playback technology being used. HTML5 will often be much faster to load than Flash or Silverlight. For that reason, the player's 'ready' function should be used to trigger any code that requires the player's API.
-
-```javascript
- amp("vid_1").ready(function(){
- var myPlayer = this;
-
- // EXAMPLE: Start playing the video.
- myPlayer.play();
- });
-```
-
-OR
-
-```javascript
- var myPlayer = amp("vid_1", myOptions, function(){
- //this is the ready function and will only execute after the player is loaded
- });
-```
-
-## API methods ##
-
-Now that you have access to a ready player, you can control the video, get values, or respond to video events. The Azure Media Player API function names attempt to follow the [HTML5 media API](http://www.whatwg.org/specs/web-apps/current-work/multipage/the-video-element.html). The main difference is that getter/setter functions are used for video properties.
-
-```javascript
- // setting a property on a bare HTML5 video element
- myVideoElement.currentTime = 120;
-
- // setting a property with Azure Media Player
- myPlayer.currentTime(120);
-```
-
-## Registering for events ##
-Events should be registered directly after initializing the player for the first time to ensure all events are appropriately reported to the application, and should be done outside of the ready event.
-
-```javascript
- var myPlayer = amp("vid_1", myOptions, function(){
- //this is the ready function and will only execute after the player is loaded
- });
- myPlayer.addEventListener(amp.eventName.error, _ampEventHandler);
- //add other event listeners
-```
-
-## Next steps ##
-
-- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Error Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-error-codes.md
- Title: Azure Media Player error codes
-description: An error code reference for Azure Media Player.
---- Previously updated : 04/05/2021--
-# Error codes #
-
-When playback can't start or has stopped, an error event will be fired and the `error()` function will return a code and an optional message to help the app developer get more details. `error().message` isn't the message displayed to the user. The message displayed to the user is based on `error().code` bits 27-20, see table below.
-
-```javascript
-
- var myPlayer = amp('vid1');
- myPlayer.addEventListener('error', function() {
- var errorDetails = myPlayer.error();
- var code = errorDetails.code;
- var message = errorDetails.message;
- });
-```
-
-## Error codes, bits [31-28] (4 bits) ##
-
-Describe the area of the error.
-- 0 - Unknown
-- 1 - AMP
-- 2 - AzureHtml5JS
-- 3 - FlashSS
-- 4 - SilverlightSS
-- 5 - Html5
-- 6 - Html5FairPlayHLS
-
-## Error codes, bits [27-0] (28 bits) ##
-
-Describe details of the error: bits 27-20 provide a high-level category, and bits 19-0 provide more detail if available.
--
-| amp.errorCode.[name] | Codes, Bits [27-0] (28 bits) | Description |
-||:||
-| **MEDIA_ERR_ABORTED errors range (0x0100000 - 0x01FFFFF)** | | |
-| abortedErrUnknown | 0x0100000 | Generic abort error |
-| abortedErrNotImplemented | 0x0100001 | Abort error, not implemented |
-| abortedErrHttpMixedContentBlocked | 0x0100002 | Abort error, mixed content blocked - generally occurs when loading an `http://` stream from an `https://` page |
-| **MEDIA_ERR_NETWORK errors start value (0x0200000 - 0x02FFFFF)** | | |
-| networkErrUnknown | 0x0200000 | Generic network error |
-| networkErrHttpBadUrlFormat | 0x0200190 | Http 400 error response |
-| networkErrHttpUserAuthRequired | 0x0200191 | Http 401 error response |
-| networkErrHttpUserForbidden | 0x0200193 | Http 403 error response |
-| networkErrHttpUrlNotFound | 0x0200194 | Http 404 error response |
-| networkErrHttpNotAllowed | 0x0200195 | Http 405 error response |
-| networkErrHttpGone | 0x020019A | Http 410 error response |
-| networkErrHttpPreconditionFailed | 0x020019C | Http 412 error response |
-| networkErrHttpInternalServerFailure | 0x02001F4 | Http 500 error response
-| networkErrHttpBadGateway | 0x02001F6 | Http 502 error response |
-| networkErrHttpServiceUnavailable | 0x02001F7 | Http 503 error response |
-| networkErrHttpGatewayTimeout | 0x02001F8 | Http 504 error response |
-| networkErrTimeout | 0x0200258 | Network timeout error
-| networkErrErr | 0x0200259 | Network connection error response |
-| **MEDIA_ERR_DECODE errors (0x0300000 - 0x03FFFFF)** | | |
-| decodeErrUnknown | 0x0300000 | Generic decode error |
-| **MEDIA_ERR_SRC_NOT_SUPPORTED errors (0x0400000 - 0x04FFFFF)** | | |
-| srcErrUnknown | 0x0400000 | Generic source not supported error |
-| srcErrParsePresentation | 0x0400001 | Presentation parse error |
-| srcErrParseSegment | 0x0400002 | Segment parse error |
-| srcErrUnsupportedPresentation | 0x0400003 | Presentation not supported |
-| srcErrInvalidSegment | 0x0400004 | Invalid segment |
-| srcErrLiveNoSegments | 0x0400005 | Segments not available yet |
-| **MEDIA_ERR_ENCRYPTED errors start value(0x0500000 - 0x05FFFFF)** | | |
-| encryptErrUnknown | 0x0500000 | Generic encrypted error |
-| encryptErrDecrypterNotFound | 0x0500001 | Decrypter not found |
-| encryptErrDecrypterInit | 0x0500002 | Decrypter initialization error |
-| encryptErrDecrypterNotSupported | 0x0500003 | Decrypter not supported |
-| encryptErrKeyAcquire | 0x0500004 | Key acquire failed |
-| encryptErrDecryption | 0x0500005 | Decryption of segment failed |
-| encryptErrLicenseAcquire | 0x0500006 | License acquire failed |
-| **SRC_PLAYER_MISMATCH errors start value(0x0600000 - 0x06FFFFF)** | | |
-| srcPlayerMismatchUnknown | 0x0600000 | Generic no matching tech player to play the source |
-| srcPlayerMismatchFlashNotInstalled | 0x0600001 |Flash plugin isn't installed, if installed the source may play. *OR* Flash 30 is installed and playing back AES content. If this is the case, please try a different browser. Flash 30 is unsupported today as of June 7th. See [known issues](azure-media-player-known-issues.md) for more details. Note: If 0x00600003, both Flash and Silverlight are not installed, if specified in the techOrder.|
-| srcPlayerMismatchSilverlightNotInstalled | 0x0600002 | Silverlight plugin is not installed, if installed the source may play. Note: If 0x00600003, both Flash and Silverlight are not installed, if specified in the techOrder. |
-| | 0x00600003 | Both Flash and Silverlight are not installed, if specified in the techOrder. |
-| **Unknown errors (0x0FF00000)** | | |
-| errUnknown | 0xFF00000 | Unknown errors |
-
-## User error messages ##
-
-The user message displayed is based on bits 27-20 of the error code.
-- MEDIA_ERR_ABORTED (1) - "You aborted the video playback"
-- MEDIA_ERR_NETWORK (2) - "A network error caused the video download to fail part-way."
-- MEDIA_ERR_DECODE (3) - "The video playback was aborted due to a corruption problem or because the video used features your browser did not support."
-- MEDIA_ERR_SRC_NOT_SUPPORTED (4) - "The video could not be loaded, either because the server or network failed or because the format is not supported."
-- MEDIA_ERR_ENCRYPTED (5) - "The video is encrypted and we do not have the keys to decrypt it."
-- SRC_PLAYER_MISMATCH (6) - "No compatible source was found for this video."
-- MEDIA_ERR_UNKNOWN (0xFF) - "An unknown error occurred."
-
-## Examples ##
-
-### 0x10600001 ###
-
-"No compatible source was found for this video." is displayed to the end user.
-
-There is no tech player that can play the requested sources, but if the Flash plugin is installed, it is likely that a source could be played.
-
-### 0x20200194 ###
-
-"A network error caused the video download to fail part-way." is displayed to the end user.
-
-AzureHtml5JS failed to playback from an http 404 response.
-
-### Categorizing errors ###
-
-```javascript
- // mask off the tech bits [31-28] so each comparison matches only its own error range
- var errorCategory = myPlayer.error().code & 0x0FF00000;
- if(errorCategory === amp.errorCode.abortedErrStart) {
- // MEDIA_ERR_ABORTED errors
- }
- else if(errorCategory === amp.errorCode.networkErrStart) {
- // MEDIA_ERR_NETWORK errors
- }
- else if(errorCategory === amp.errorCode.decodeErrStart) {
- // MEDIA_ERR_DECODE errors
- }
- else if(errorCategory === amp.errorCode.srcErrStart) {
- // MEDIA_ERR_SRC_NOT_SUPPORTED errors
- }
- else if(errorCategory === amp.errorCode.encryptErrStart) {
- // MEDIA_ERR_ENCRYPTED errors
- }
- else if(errorCategory === amp.errorCode.srcPlayerMismatchStart) {
- // SRC_PLAYER_MISMATCH errors
- }
- else {
- // unknown errors
- }
-```
-
-### Catching a specific error ###
-
-The following code catches just 404 errors:
-
-```javascript
- // mask off the tech bits [31-28] before comparing against the full error code
- if((myPlayer.error().code & 0x0FFFFFFF) === amp.errorCode.networkErrHttpUrlNotFound) {
- // all http 404 errors
- }
-```
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Feature List https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-feature-list.md
- Title: Azure Media Player feature list
-description: A feature reference for Azure Media Player.
---- Previously updated : 04/05/2021--
-# Feature list #
-Here is the list of tested features and unsupported features:
-
-| Feature | TESTED | PARTIALLY TESTED | UNTESTED | UNSUPPORTED | NOTES |
-| - | - | - | - | - | - |
-| **Playback** | | | | | |
-| Basic On-Demand Playback | X | | | | Supports streams from Azure Media Services only |
-| Basic Live Playback | X | | | | Supports streams from Azure Media Services only |
-| AES | X | | | | Supports Azure Media Services Key Delivery Service |
-| Multi-DRM | | X | | | |
-| PlayReady | X | | | | Supports Azure Media Services Key Delivery Service |
-| Widevine | | X | | | Supports Widevine PSSH boxes outlined in manifest |
-| FairPlay | | X | | | Supports Azure Media Services Key Delivery Service |
-| **Techs** | | | | | |
-| MSE/EME (AzureHtml5JS) | X | | | | |
-| Flash Fallback (FlashSS) | X | | | | Not all features are available on this tech. |
-| Silverlight Fallback SilverlightSS | X | | | | Not all features are available on this tech. |
-| Native HLS Pass-through (Html5) | | X | | | Not all features are available on this tech due to platform restrictions. |
-| **Features** | | | | | |
-| API Support | X | | | | See known issues list |
-| Basic UI | X | | | | |
-| Initialization through JavaScript | X | | | | |
-| Initialization through video tag | | X | | | |
-| Segment addressing - Time Based | X | | | | |
-| Segment addressing - Index Based | | | | X | |
-| Segment addressing - Byte Based | | | | X | |
-| Azure Media Services URL rewriter | | X | | | |
-| Accessibility - Captions and Subtitles | X | | | | WebVTT (on demand), CEA 708 (on demand and live) and IMSC1 (on demand and live) |
-| Accessibility - Hotkeys | X | | | | |
-| Accessibility - High Contrast | | X | | | |
-| Accessibility - Tab Focus | | X | | | |
-| Error Messaging | | X | | | Error messages are inconsistent across techs |
-| Event Triggering | X | | | | |
-| Diagnostics | | X | | | Diagnostic information is only available on the AzureHtml5JS tech and partially available on the SilverlightSS tech. |
-| Customizable Tech Order | | X | | | |
-| Heuristics - Basic | X | | | | |
-| Heuristics - Customization | | | X | | Customization is only available with the AzureHtml5JS tech. |
-| Discontinuities | X | | | | |
-| Select Bitrate | X | | | | This API is only available on the AzureHtml5JS and FlashSS techs. |
-| Multi-Audio Stream | | X | | | Programmatic audio switch is supported on AzureHtml5JS and FlashSS techs, and is available through UI selection on AzureHtml5JS, FlashSS, and native Html5 (in Safari). Most platforms require the same codec private data to switch audio streams (same codec, channel, sampling rate, etc.). |
-| UI Localization | | X | | | |
-| Multi-instance Playback | | | | X | This scenario may work for some techs but is currently unsupported and untested. You may also get this to work using iframes |
-| Ads Support | | X | | | AMP supports the insertion of pre- mid- and post-roll linear ads from VAST-compliant ad servers for VOD in the AzureHtml5JS tech |
-| Analytics | | X | | | AMP provides the ability to listen to analytics and diagnostic events in order to send to an Analytics backend of your choice. All events and properties are not available across techs due to platform limitations. |
-| Custom Skins | | | X | | This scenario can be achieved by setting controls to false in AMP and using your own HTML and CSS. |
-| Seek Bar Scrubbing | | | | X | |
-| Trick-Play | | | | X | |
-| Audio Only | X | | | | Supported in AzureHtml5JS. Progressive MP3 playback can work with the HTML5 tech if the platform supports it. |
-| Video Only | X | | | | Supported in AzureHtml5JS. |
-| Multi-period Presentation | | | | X | |
-| Multiple camera angles | | | | X | |
-| Playback Speed | | X | | | Playback speed is supported in most scenarios except the mobile case due to a partial bug in Chrome |
-
-## Next steps ##
-- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Full Setup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-full-setup.md
- Title: Azure Media Player Full Setup
-description: Learn how to set up the Azure Media Player.
---- Previously updated : 04/05/2021---
-# Azure Media Player full setup #
-
-Azure Media Player is easy to set up. It only takes a few moments to get basic playback of media content right from your Azure Media Services account. [Samples](https://github.com/Azure-Samples/azure-media-player-samples) are also provided in the samples directory of the release.
-
-## Step 1: Include the JavaScript and CSS files in the head of your page ##
-
-With Azure Media Player, you can access the scripts from the CDN-hosted version. It's now often recommended to put JavaScript just before the closing body tag `</body>` instead of in the `<head>`, but Azure Media Player includes an 'HTML5 Shiv', which needs to be in the head for older IE versions to respect the video tag as a valid element.
-
-> [!NOTE]
-> If you're already using an HTML5 shiv like [Modernizr](https://modernizr.com/) you can include the Azure Media Player JavaScript anywhere. However make sure your version of Modernizr includes the shiv for video.
-
-### CDN Version ###
-
-```html
- <link href="//amp.azure.net/libs/amp/latest/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
- <script src= "//amp.azure.net/libs/amp/latest/azuremediaplayer.min.js"></script>
-```
-
-> [!IMPORTANT]
-> You should **NOT** use the `latest` version in production, as this is subject to change on demand. Replace `latest` with a version of Azure Media Player. For example, replace `latest` with `2.1.1`. Azure Media Player versions can be queried from [here](https://amp.azure.net/libs/amp/latest/docs/changelog.html).
-
-> [!NOTE]
-> Since the `1.2.0` release, it is no longer required to include the location to the fallback techs (it will automatically pick up the location from the relative path of the azuremediaplayer.min.js file). You can modify the location of the fallback techs by adding the following script in the `<head>` after the above scripts.
-
-> [!NOTE]
-> Due to the nature of Flash and Silverlight plugins, the swf and xap files should be hosted on a domain without any sensitive information or data - this is automatically taken care of for you with the Azure CDN hosted version.
-
-```javascript
- <script>
- amp.options.flashSS.swf = "//amp.azure.net/libs/amp/latest/techs/StrobeMediaPlayback.2.0.swf"
- amp.options.silverlightSS.xap = "//amp.azure.net/libs/amp/latest/techs/SmoothStreamingPlayer.xap"
- </script>
-```
-
-## Step 2: Add an HTML5 video tag to your page ##
-
-With Azure Media Player, you can use an HTML5 video tag to embed a video. Azure Media Player will then read the tag and make it work in all browsers, not just ones that support HTML5 video. Beyond the basic markup, Azure Media Player needs a few extra pieces.
-
-1. The `<data-setup>` attribute on the `<video>` tells Azure Media Player to automatically set up the video when the page is ready, and to read any options (in JSON format) from the attribute.
-1. The `id` attribute: Should be used and unique for every video on the same page.
-1. The `class` attribute contains two classes:
- - `azuremediaplayer` applies styles that are required for Azure Media Player UI functionality
- - `amp-default-skin` applies the default skin to the HTML5 controls
-1. The `<source>` includes two required attributes
- - `src` attribute can include a *.ism/manifest* file from Azure Media Services. When it's added, Azure Media Player automatically adds the URLs for DASH, SMOOTH, and HLS to the player
- - `type` attribute is the required MIME type of the stream. The MIME type associated with *".ism/manifest"* is *"application/vnd.ms-sstr+xml"*
-1. The *optional* `<data-setup>` attribute on the `<source>` tells Azure Media Player if there are any unique delivery policies for the stream from Azure Media Services, including, but not limited to, encryption type (AES or PlayReady, Widevine, or FairPlay) and token.
-
-Include/exclude attributes, settings, sources, and tracks exactly as you would for HTML5 video.
-
-```html
- <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" poster="poster.jpg" data-setup='{"techOrder": ["azureHtml5JS", "flashSS", "html5FairPlayHLS","silverlightSS", "html5"], "nativeControlsForTouch": false}'>
- <source src="http://amssamples.streaming.mediaservices.windows.net/91492735-c523-432b-ba01-faba6c2206a2/AzureMediaServicesPromo.ism/manifest" type="application/vnd.ms-sstr+xml" />
- <p class="amp-no-js">
- To view this video please enable JavaScript, and consider upgrading to a web browser that supports HTML5 video
- </p>
- </video>
-```
-
-By default, the large play button is located in the upper left-hand corner so it doesn't cover up the interesting parts of the poster. If you'd prefer to center the large play button, you can add an additional `amp-big-play-centered` `class` to your `<video>` element.
-
-### Alternative Setup for dynamically loaded HTML ###
-
-If your web page or application loads the video tag dynamically (ajax, appendChild, etc.), so that it may not exist when the page loads, you'll want to manually set up the player instead of relying on the data-setup attribute. To do this, first remove the data-setup attribute from the tag so there's no confusion around when the player is initialized. Next, run the following JavaScript some time after the Azure Media Player JavaScript has loaded, and after the video tag has been loaded into the DOM.
-
-```javascript
- var myPlayer = amp('vid1', { /* Options */
- techOrder: ["azureHtml5JS", "flashSS", "html5FairPlayHLS","silverlightSS", "html5"],
- "nativeControlsForTouch": false,
- autoplay: false,
- controls: true,
- width: "640",
- height: "400",
- poster: ""
- }, function() {
- console.log('Good to go!');
- // add an event listener
- this.addEventListener('ended', function() {
- console.log('Finished!');
- });
- }
- );
- myPlayer.src([{
- src: "http://samplescdn.origin.mediaservices.windows.net/e0e820ec-f6a2-4ea2-afe3-1eed4e06ab2c/AzureMediaServices_Overview.ism/manifest",
- type: "application/vnd.ms-sstr+xml"
- }]);
-```
-
-The first argument in the `amp` function is the ID of your video tag. Replace it with your own.
-
-The second argument is an options object. It allows you to set additional options like you can with the data-setup attribute.
-
-The third argument is a 'ready' callback. Once Azure Media Player has initialized, it will call this function. In the ready callback, 'this' object refers to the player instance.
-
-Instead of using an element ID, you can also pass a reference to the element itself.
-
-```javascript
-
- var myPlayer = amp(document.getElementById('example_video_1'), {/*Options*/}, function() {
- //This is functionally the same as the previous example.
- });
- myPlayer.src([{ src: "//example/path/to/myVideo.ism/manifest", type: "application/vnd.ms-sstr+xml" }]);
-```
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Known Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-known-issues.md
- Title: Azure Media Player Known Issues
-description: The current release has the following known issues.
---- Previously updated : 04/05/2021--
-# Known Issues #
-
-The current release has the following known issues:
-
-## Azure Media Player ##
-- Incorrectly configured encoders may cause issues with playback
-- Streams with timestamps greater than 2^53 may have playback issues.
- - Mitigation: Use 90-kHz video and 44.1-kHz audio timescales
-- No autoplay on mobile devices without user interaction. It's blocked by the platform.
-- Seeking near discontinuities may cause playback failure.
-- Download of large presentations may cause the UI to lock up.
-- Can't automatically play back the same source again after the presentation has ended.
- - To replay a source after it has ended, it's required to set the source again.
-- Empty manifests may cause issues with the player.
- - This issue can occur when starting a live stream and not enough chunks are found in the manifest.
-- Playback position may be outside the UI seekbar.
-- Event ordering isn't consistent across all techs.
-- Buffered property isn't consistent across techs.
-- nativeControlsForTouch must be false (default) to avoid "Object doesn't support property or method 'setControls'"
-- Posters must now be absolute URLs
-- Minor aesthetic issues may occur in the UI when using high contrast mode of the device
-- URLs containing "%" or "+" in the fully decoded string may have problems setting the source
-
-## Ad insertion ##
-- Ads may have issues being inserted (on demand or live) when an ad-blocker is installed in the browser
-- Mobile devices may have issues playing back ads.
-- MP4 Midroll ads aren't currently supported by Azure Media Player.
-
-## AzureHtml5JS ##
-- In the DVR window of live content, when the content finishes, the timeline will continue to grow until seeking to the area or reaching the end of the presentation.
-- Live presentations in Firefox with MSE enabled have some issues
-
-- Assets that are audio only will not play back via the AzureHtml5JS tech.
- If you'd like to play back assets without audio, you can do so by inserting blank audio using the [Azure Media Services Explorer tool](https://aka.ms/amse)
- - Instructions on how to insert silent audio can be found [here](../previous/media-services-advanced-encoding-with-mes.md#silent_audio)
-
-## Flash ##
-- AES content does not play back in Flash version 30.0.0.134 due to a bug in Adobe's caching logic. Adobe has fixed the issue and released it in 30.0.0.154
-- On tech and HTTP failures (such as 404 network timeouts), the player will take longer to recover than on other techs.
-- Clicking on the video area with the flashSS tech won't play/pause the player.
-- If the user has Flash installed but doesn't give permission to load it on the site, infinite spinning can occur. This is because the player thinks the plugin is installed and available and that the plugin is running the content. JavaScript code has been sent, but the browser settings have blocked the plugin from executing until the user accepts the prompt to allow the plugin. This can occur in all browsers.
-
-## Silverlight ##
-- Missing features
-- On tech and HTTP failures (such as 404 network timeouts), the player will take longer to recover than on other techs.
-- Safari and Firefox on Mac playback with Silverlight requires explicitly defining `http://` or `https://` for the source.
-- If an API is missing for this tech, it will generally return null.
-- If the user has Flash installed but doesn't give permission to load it on the site, infinite spinning can occur. This is because the player thinks the plugin is installed and available and that the plugin is running the content. JavaScript code has been sent, but the browser settings have blocked the plugin from executing until the user accepts the prompt to allow the plugin. This can occur in all browsers.
-
-## Native HTML5 ##
-- The Html5 tech only sends the canplaythrough event for the first set source.
-- This tech only supports what the browser has implemented. Some features may be missing in this tech.
-- If an API is missing for this tech, it will generally return null.
-- There are issues with captions and subtitles on this tech. They may or may not be available or viewable on this tech.
-- Having limited bandwidth in the HLS/Html5 tech scenario results in audio playing without video.
-
-### Microsoft ###
-- IE8 playback does not currently work due to incompatibility with ECMAScript 5
-- In IE and some versions of Edge, fullscreen cannot be entered by tabbing to the button and selecting it or using the F/f hotkey.
-
-## Google ##
-- Multiple encoding profiles in the same manifest have some playback issues in Chrome and are not recommended.
-- Chrome cannot play back HE-AAC with AzureHtml5JS. Follow details on the [bug tracker](https://bugs.chromium.org/p/chromium/issues/detail?id=534301).
-- As of Chrome v58, Widevine content must be loaded/played back via the https:// protocol, otherwise playback will fail.
-
-## Mozilla ##
-- Audio stream switch requires audio streams to have the same codec private data when using AzureHtml5JS. Firefox platform requires this.
-
-## Apple ##
-- Safari on Mac often enables Power Saver mode by default with the setting "Stop plug-ins to save power", which blocks plugins like Flash and Silverlight when it believes they don't benefit the user. This block doesn't remove the plugin's existence, only its capabilities. Given the default techOrder, this may cause issues when attempting to play back
- - Mitigation 1: If the video player is 'front and center' (within a 3000 x 3000 pixel boundary starting at the top-left corner of the document), it should still play.
- - Mitigation 2: Change the techOrder for Safari to be ["azureHtml5JS", "html5"]. This mitigation has implication of not getting all the features that are available in the FlashSS tech.
-- PlayReady content via Silverlight may have issues playing back in Safari.
-- AES and restricted token content does not play back using iOS and older Android devices.
- - In order to achieve this scenario, a proxy must be added to your service.
-- Default skin for iOS Phone shows through.
-- iPhone playback always occurs in the native player fullscreen.
- - Features, including captions, may not persist in this non-inline playback.
-- When a live presentation ends, iOS devices will not properly end the presentation.
-- iOS does not allow for DVR capabilities.
-- iOS audio stream switch can only be done through the iOS native player UI and requires audio streams to have the same codec private data
-- Older versions of Safari may potentially have issues playing back live streams.
-
-## Older Android ##
--- AES and restricted token content does not play back using iOS and older Android devices.
- - In order to achieve this scenario, a proxy must be added to your service.
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Localization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-localization.md
- Title: Azure Media Player Localization
-description: Multiple language support for users of non-English locales.
---- Previously updated : 04/05/2021--
-# Azure Media Player localization #
-
-Multiple language support allows for users of non-English locales to natively interact with the player. Azure Media Player will instantiate with a global dictionary of languages, which will localize the error messages based on the page language. A developer can also create a player instance with a forced set language. The default language is English (en).
-
-> [!NOTE]
-> This feature is still going through some testing and as such is subject to bugs.
-
-```html
- <video id="vid1" class="azuremediaplayer amp-default-skin" data-setup='{"language":"es"}'>...</video>
-```
-
-Azure Media Player currently supports the following languages with their corresponding language codes:
-
-| Language | Code | Language | Code | Language | Code |
-|||-|--|-|--|
-| English {default} | en | Croatian | hr | Romanian | ro |
-| Arabic | ar | Hungarian | hu | Slovak | sk |
-| Bulgarian | bg | Indonesian | id | Slovene | sl |
-| Catalan | ca | Icelandic | is | Serbian - Cyrillic | sr-cyrl-cs |
-| Czech | cs | Italian | it | Serbian - Latin | sr-latn-rs |
-| Danish | da | Japanese | ja | Russian | ru |
-| German | de | Kazakh | kk | Swedish | sv |
-| Greek | el | Korean | ko | Thai | th |
-| Spanish | es | Lithuanian | lt | Tagalog | tl |
-| Estonian | et | Latvian | lv | Turkish | tr |
-| Basque | eu | Malaysian | ms | Ukrainian | uk |
-| Farsi | fa | Norwegian - Bokmål | nb | Urdu | ur |
-| Finnish | fi | Dutch | nl | Vietnamese | vi |
-| French | fr | Norwegian - Nynorsk | nn | Chinese - simplified | zh-hans |
-| Galician | gl | Polish | pl | Chinese - traditional | zh-hant |
-| Hebrew | he | Portuguese - Brazil | pt-br | | |
-| Hindi | hi | Portuguese - Portugal | pt-pt | | |
--
-> [!NOTE]
-> If you do not want any localization to occur, you must force the language to English.
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Options https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-options.md
- Title: Azure Media Player Options
-description: The Azure Media Player embed code is simply an HTML5 video tag, so for many of the options you can use the standard tag attributes to set the options.
---- Previously updated : 04/05/2021--
-# Options #
-
-## Setting options ##
-
-The Azure Media Player embed code is simply an HTML5 video tag, so for many of the options you can use the standard tag attributes to set the options.
-
-`<video controls autoplay ...>`
-
-Alternatively, you can use the data-setup attribute to provide options in the [JSON](http://json.org/example.html) format. This is also how you would set options that aren't standard to the video tag.
-
-`<video data-setup='{ "controls": true, "autoplay": false }'...>`
-
-Finally, if you're not using the data-setup attribute to trigger the player setup, you can pass in an object with the player options as the second argument in the JavaScript setup function.
-
-`amp("vid1", { "controls": true, "autoplay": false });`
-
-> [!NOTE]
-> Options in the constructor are only set on the first initialization before the source is set. If you wish to modify the options on the same initialized Azure Media Player element, you must update the options before changing the source. You can update the options in JavaScript by using `myPlayer.options({/*updated options*/});`. Note that only changed options will be affected, all other previously set options will persist.
-
-## Individual options ##
-
-> [!NOTE]
->Video Tag Attributes can only be true or false (boolean), you simply include the attribute (no equals sign) to turn it on, or exclude it to turn it off. For example, to turn controls on:
-> WRONG `<video controls="true" ...>`
-> RIGHT `<video controls ...>`
-> The biggest issue people run into is trying to set these values to false using false as the value (e.g. controls="false") which actually does the opposite and sets the value to true because the attribute is still included.
-
-### controls ###
-
-The controls option sets whether or not the player has controls that the user can interact with. Without controls the only way to start the video playing is with the autoplay attribute or through the API.
-
-`<video controls ...>`
-or
-`{ "controls": true }`
-
-### autoplay ###
-
-If autoplay is true, the video will start playing as soon as page is loaded (without any interaction from the user).
-
-> [!NOTE]
-> This option is not supported by mobile devices such as Windows Phone, Apple iOS and Android. Mobile devices block the autoplay functionality to prevent over usage of consumer's monthly data plans (often expensive). A user touch/click is required to start the video in this case.
-
-`<video autoplay ...>`or `{ "autoplay": true }`
-
-### poster ###
-The poster attribute sets the image that displays before the video begins playing. This is often a frame of the video or a custom title screen. As soon as the user clicks play the image will go away.
-
-`<video poster="myPoster.jpg" ...>` or `{ "poster": "myPoster.jpg" }`
-
-### width ###
-
-The width attribute sets the display width of the video.
-
-`<video width="640" ...>` or `{ "width": 640 }`
-
-### height ###
-
-The height attribute sets the display height of the video.
-
-`<video height="480" ...>` or `{ "height": 480 }`
-
-### plugins ###
-
-The plugins JSON determines which plugins get loaded with that instance of AMP and lets you configure any options those plugins may have.
-
- `<video... data-setup='{plugins: { "contentTitle": {"name": "Azure Media Services Overview"}}}'...>`
-
-For more information on plugin development and usage, see [writing plugins](azure-media-player-writing-plugins.md)
-
-### other options ###
-
-Other options can be set on the `<video>` tag by using the `data-setup` parameter that takes a JSON.
-`<video ... data-setup='{"nativeControlsForTouch": false}'>`
-
-#### nativeControlsForTouch ####
-
-This is explicitly set to false. Setting it to false allows the Azure Media Player skin to be rendered the same across platforms. Furthermore, contrary to the name, touch will still be enabled.
-
-### fluid ###
-
-By setting this option to true, the video element will take the full width of the parent container, and the height will be adjusted to fit a video with a standard 16:9 aspect ratio.
-
-`<video ... data-setup='{"fluid": true}'>`
-
-`fluid` option overrides explicit `width` and `height` settings. This option is only available in Azure Media Player version `2.0.0` and later.
-
-### playbackSpeed ###
-
-The `playbackSpeed` option controls the playback speed control and the set of playback speed settings available to the user. `playbackSpeed` takes an object. To enable the playback speed control on the control bar, the object's `enabled` property needs to be set to true. An example of enabling playback speed in markup:
-
-`<video ... data-setup='{"playbackSpeed": {"enabled": true}}'>`
-
-Other properties of the `playbackSpeed` setting are given by `PlaybackSpeedOptions` object.
-
-Example of setting playback speed options in JavaScript:
-
-```javascript
- var myPlayer = amp('vid1', {
- fluid: true,
- playbackSpeed: {
- enabled: true,
- initialSpeed: 1.0,
- speedLevels: [
- { name: "x4.0", value: 4.0 },
- { name: "x3.0", value: 3.0 },
- { name: "x2.0", value: 2.0 },
- { name: "x1.75", value: 1.75 },
- { name: "x1.5", value: 1.5 },
- { name: "x1.25", value: 1.25 },
- { name: "normal", value: 1.0 },
- { name: "x0.75", value: 0.75 },
- { name: "x0.5", value: 0.5 },
- ]
- }
- });
-```
-
-This option is only available in Azure Media Player version 2.0.0 and later.
-
-### staleDataTimeLimitInSec ###
-
-The `staleDataTimeLimitInSec` option is an optimization that lets you configure how many seconds' worth of stale data you'd like to keep in the mediaSource buffers. This is disabled by default.
-
-### cea708CaptionsSettings ###
-
-Setting enabled to true allows you to display live CEA captioning in your live streams and live archives. The label attribute is not required; if it's not included, the player falls back to a default label.
-
-```javascript
- cea708CaptionsSettings: {
- enabled: true,
- srclang: 'en',
- label: 'Live CC'
- }
-```
-
-This option is only available in Azure Media Player version 2.1.1 and later.
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-overview.md
- Title: Azure Media Player Overview
-description: Learn more about the Azure Media Player.
---- Previously updated : 04/05/2021--
-# Azure Media Player overview #
-
-Azure Media Player is a web video player that plays media content from [Microsoft Azure Media Services](https://azure.microsoft.com/services/media-services/) on a wide variety of browsers and devices. Azure Media Player uses industry standards, such as HTML5, Media Source Extensions (MSE), and Encrypted Media Extensions (EME) to provide an enriched adaptive streaming experience. When these standards are not available on a device or in a browser, Azure Media Player uses Flash and Silverlight as fallback technology. Regardless of the playback technology used, developers have a unified JavaScript interface to access APIs, allowing content served by Azure Media Services to be played across a wide range of devices and browsers without any extra effort.
-
-Microsoft Azure Media Services allows for content to be served up with DASH, Smooth Streaming and HLS streaming formats to play back content. Azure Media Player takes into account these various formats and automatically plays the best link based on the platform/browser capabilities. Microsoft Azure Media Services also provides dynamic encryption of assets with common encryption (PlayReady or Widevine) or AES-128 bit envelope encryption. Azure Media Player allows for decryption of PlayReady and AES-128 bit encrypted content when appropriately configured. To understand how to configure the player, see the [Protected Content](azure-media-player-protected-content.md) section.
-
-If you have any specific issues, questions, or find any bugs, please [file a support ticket](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/overview) using the Client Playback category.
-
-> [!NOTE]
-> Please note that Azure Media Player only supports media streams from Azure Media Services.
-
-## License ##
-
-Azure Media Player is licensed and subject to the terms outlined in the Microsoft Software License Terms for Azure Media Player. See [license file](/legal/azure-media-player/azure-media-player-license) for full terms. For more information, see the [Privacy Statement](https://www.microsoft.com/en-us/privacystatement/default.aspx).
-
-Copyright 2015 Microsoft Corporation.
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Playback Technology https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-playback-technology.md
- Title: Azure Media Player Playback Technology
-description: Learn more about the playback technology used to play the video or audio.
---- Previously updated : 04/05/2021---
-# Playback technology ("tech") #
-
-Playback Technology refers to the specific browser or plugin technology used to play the video or audio.
-- **azureHtml5JS**: utilizes MSE and EME standards in conjunction with the video element for plugin-less playback of DASH content with support for AES-128 bit envelope encrypted content or DRM common encrypted content (via PlayReady and Widevine when the browser supports it) from Azure Media Services
-- **flashSS**: utilizes Flash player technology to play back Smooth content with support for AES-128 bit envelope decryption from Azure Media Services - requires Flash version 11.4 or higher
-- **html5FairPlayHLS**: utilizes Safari-specific in-browser playback technology via HLS with the video element. This tech is required to play back FairPlay-protected content from Azure Media Services and was added to the techOrder as of 10/19/16
-- **SilverlightSS**: utilizes Silverlight technology to play back Smooth content with support for PlayReady protected content from Azure Media Services.
-- **html5**: utilizes in-browser playback technology with the video element. When on an Apple iOS or Android device, this tech allows playback of HLS streams with some basic support for AES-128 bit envelope encryption or DRM content (via FairPlay when the browser supports it).
-
-## Tech Order ##
-
-In order to ensure that your asset is playable on a wide variety of devices, the following tech order is recommended and is the default: `techOrder: ["azureHtml5JS", "flashSS", "html5FairPlayHLS","silverlightSS", "html5"]`. It can be set directly on the `<video>` tag or programmatically in the options:
-
-`<video data-setup='{"techOrder": ["azureHtml5JS", "flashSS", "html5FairPlayHLS", "silverlightSS", "html5"]}`
-
-or
-
-```javascript
- amp("vid1", {
- techOrder: ["azureHtml5JS", "flashSS", "html5FairPlayHLS", "silverlightSS", "html5"]
- });
-```
-
-## Compatibility Matrix ##
-
-Given the recommended tech order with streaming content from Azure Media Services, the following playback compatibility matrix is expected:
-
-| Browser | OS | Expected Tech (Clear) | Expected Tech (AES) | Expected Tech (DRM) |
-|---|---|---|---|---|
-| Edge/IE 11 | Windows 10, Windows 8.1, Windows Phone 10<sup>1</sup> | azureHtml5JS | azureHtml5JS | azureHtml5JS (PlayReady) |
-| IE 11 | Windows 7, Windows Vista<sup>1</sup> | flashSS | flashSS | SilverlightSS (PlayReady) |
-| IE 11 | Windows Phone 8.1 | azureHtml5JS | azureHtml5JS | not supported |
-| Edge | Xbox One<sup>1</sup> (Nov 2015 update) | azureHtml5JS | azureHtml5JS | not supported |
-| Chrome 37+ | Windows 10, Windows 8.1, macOS X Yosemite<sup>1</sup> | azureHtml5JS | azureHtml5JS | azureHtml5JS (Widevine) |
-| Firefox 47+ | Windows 10, Windows 8.1, macOS X Yosemite+<sup>1</sup> | azureHtml5JS | azureHtml5JS | azureHtml5JS (Widevine) |
-| Firefox 42-46 | Windows 10, Windows 8.1, macOS X Yosemite+<sup>1</sup> | azureHtml5JS | azureHtml5JS | SilverlightSS (PlayReady) |
-| Firefox 35-41 | Windows 10, Windows 8.1 | flashSS | flashSS | SilverlightSS (PlayReady) |
-| Safari | iOS 6+ | html5 | html5 (no token)<sup>3</sup> | not supported |
-| Safari 8+ | OS X Yosemite+ | azureHtml5JS | azureHtml5JS | html5FairPlayHLS (FairPlay) |
-| Safari 6 | OS X Mountain Lion<sup>1</sup> | flashSS | flashSS | SilverlightSS (PlayReady) |
-| Chrome 37+ | Android 4.4.4+<sup>2</sup> | azureHtml5JS | azureHtml5JS | azureHtml5JS (Widevine) |
-| Chrome 37+ | Android 4.0<sup>2</sup> | html5 | html5 (no token)<sup>3</sup> | not supported |
-| Firefox 42+ | Android 5.0+<sup>2</sup> | azureHtml5JS | azureHtml5JS | not supported |
-| IE 8, IE 9, IE 10 | Windows | not supported | not supported | not supported |
-
-<sup>1</sup> Configuration not supported or tested; listed as reference for completion.
-
-<sup>2</sup> Successful playback on Android devices requires a combination of device capabilities, graphics support, codec rendering, OS support, and more. Because Android is an open-source platform that allows phone manufacturers to change the vanilla Android OS provided by Google, there is some fragmentation in the Android space, and some devices may not be supported because of a lack of features. Also, some Android devices do not support all codecs.
-
-<sup>3</sup> In the cases where there is no support for token, a proxy can be used to add this functionality. Check out this [blog](https://azure.microsoft.com/blog/2015/03/06/how-to-make-token-authorized-aes-encrypted-hls-stream-working-in-safari/) to learn more about this solution.
-
-> [!NOTE]
-> If the expected tech requires a plugin, like Flash, and that plugin is not installed on the user's machine, AMP will continue checking the capabilities of the next tech in the list, in conjunction with the source types and protection info. For example, when attempting to view an unprotected on-demand stream in Safari 8 on OS X Yosemite with neither Flash nor Silverlight installed, AMP will select the native html5 tech for playback.<br/><br/>New browser technologies are emerging daily, and they could affect this matrix.
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Plugin Gallery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-plugin-gallery.md
- Title: Azure Media Player Plugin Gallery
-description: This article contains a list of plugins available for Azure Media Player.
---- Previously updated : 04/05/2021--
-# Azure Media Player Plugin Gallery #
-
-## Plugins ##
-
-| Plugin Name | Demo URL | Source Code | Description |
-|---|---|---|---|
-| Additional Features | | | |
-| **New!** AMP360Video | [Demo](http://www.babylonjs.com/demos/amp360video/) | [GitHub](https://github.com/BabylonJS/Extensions/tree/master/Amp360Video) | This plugin lets you visualize 360 video in AMP, either on your desktop or in VR-compatible devices. The full documentation is available [here](https://doc.babylonjs.com/extensions/amp360video). |
-| Sprite Tip | [Demo](https://www.smwcentral.net/?p=section&a=details&id=10301) | [GitHub](https://github.com/RickShahid/SpriteTip) | Azure Media Player (AMP) plugin for timeline rendering of a video thumbnail image sprite that is generated from Azure Media Services (AMS) Media Encoder Standard (MES). |
-| Diagnostics Overlay | [Demo](https://openidconnectweb.azurewebsites.net/Diagnoverlay.html) | [GitHub](https://github.com/willzhan/diagnoverlay) | This plugin displays: All key parameters, stats of video, all events in the video playback lifecycle, and DRM protection info, such as key ID, license acquisition URLs, if protected. |
-| Frame rate and Timecode calculator | No demo available | [GitHub](https://github.com/mconverti/media-services-javascript-azure-media-player-framerate-timecode-calculator-plugin) | This plugin calculates the frame rate of video based on the `tfhd`/`trun` MP4 boxes of the first MPEG-DASH video fragment, parses the time scale value from the MPEG-DASH client manifest, and also provides a way to generate the timecode for a given absolute time from the player (as well as provides the player absolute time given the timecode) |
-| <strike>Playback Speed</strike> | [Demo](https://azure-samples.github.io/media-services-javascript-Azure-Media-Player-playback-rate-plugin/) | [GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-time-tip-plugin) | This plugin enables viewers to control the playback speed of the video. *Note: This functionality is automatically available in AMP v2.0.0+ but is disabled by default.* To learn how to enable it, check out our samples [here](https://github.com/Azure-Samples/azure-media-player-samples). |
-| Hover Time Tip | [Demo](http://sr-test.azurewebsites.net/Tests/Plugin%20Gallery/plugins/timetip/example.html) | [GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-time-tip-plugin) | Displays a time tip over the progress bar on mouse hover for time-accurate seeking. *Note: This plugin is already integrated into AMP*, but if you're interested in seeing how it's programmed, feel free to take a look. |
-| Title Overlay | [Demo](https://azure-samples.github.io/media-services-javascript-azure-media-player-title-overlay-plugin) | [GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-title-overlay-plugin) | Overlays a configurable video title over the screen. |
-| Timeline Markers | [Demo](http://sr-test.azurewebsites.net/Tests/Plugin%20Gallery/plugins/timelinemarkers/example.html) | [GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-timeline-markers-plugin) | This plugin takes in an array of times and overlays tiny markers over the progress bar at those times. |
-| Analytics | | | |
-| Application Insights | [Blog Post](https://azure.microsoft.com/blog/player-analytics-azure-media-player-plugin/) | [GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-application-insights-plugin) | Plugin that tracks your player metrics and ports them to Power BI for an intuitive graphical representation of your viewers' player experience. |
-| Google Analytics | N/A | [GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-google-analytics-plugin) | Google Analytics plugin for Azure Media Player |
-| Diagnostics | | | |
-| Diagnostics Output | [Demo](http://sr-test.azurewebsites.net/Tests/Plugin%20Gallery/plugins/diagnosticslogger/example.html) | [GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-diagnostic-logger-plugin) | This plugin outputs an array of diagnostics from your player, to see it in action go to the demo link and open up your JavaScript console. |
-| Ease of Access | | | |
-| Zoom In | [Demo](http://sr-test.azurewebsites.net/Tests/Plugin%20Gallery/plugins/zoom/example.html) | [GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-zoom-plugin) | This plugin displays a draggable zoom-in scale on the player's screen so viewers can zoom in on your content. |
-| Live Captions | [Azure Blog Post](https://azure.microsoft.com/blog/live-real-time-captions-with-azure-media-services-and-player/), [SubPly Post](http://www.subply.com/en/Products/AzureLiveCaptions.htm) | N/A | *See posts for more info.* An end-to-end workflow for live captioning, built as a plugin for Azure Media Player; click the left-most link to go to SubPly's site and learn more about the solution. |
-| Hot Keys | <strike>[Demo](http://sr-test.azurewebsites.net/Tests/Plugin%20Gallery/plugins/hotkeys/example.html)</strike> | <strike>[GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-hot-keys-plugin)</strike> | The hot keys plugin enables viewers to control various aspects of the player with generic key combinations, like F for fullscreen, M for mute, and arrow keys for progress bar control. *Note: This plugin has already been integrated into AMP, but feel free to use it as a resource.* |
-| Social | | | |
-| Share | [Demo](http://sr-test.azurewebsites.net/Tests/Plugin%20Gallery/plugins/share/example.html) | [GitHub](https://github.com/Azure-Samples/media-services-javascript-azure-media-player-social-share-plugin) | This plugin adds a share button to the player's control bar so that your viewers can share the video they're watching with their friends via Facebook, Twitter, or LinkedIn. |
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Protected Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-protected-content.md
- Title: Azure Media Player Protected Content
-description: Azure Media Player currently supports AES-128 bit envelope encrypted content and common encrypted content.
---- Previously updated : 04/05/2021---
-# Protected content #
-
-Azure Media Player currently supports AES-128 bit envelope encrypted content, common encrypted content (through PlayReady and Widevine), and content encrypted via FairPlay. To play back protected content correctly, you must tell Azure Media Player the `protectionInfo`. This information exists per source and can be added directly on the `<source>` tag via `data-setup`. You can also add `protectionInfo` directly as a parameter if setting the source dynamically.
-
-`protectionInfo` accepts a JSON object and includes:
-
-- `type`: `AES`, `PlayReady`, `Widevine`, or `FairPlay`
-- `certificateUrl`: a direct link to your hosted FairPlay certificate
-- `authenticationToken`: an optional field to add an unencoded authentication token
-
-> [!IMPORTANT]
-> The **certificateUrl** object is only needed for FairPlay DRM.
-
->[!NOTE]
-> The default techOrder has been changed to accommodate the new tech `html5FairPlayHLS`, specifically to play back FairPlay content natively on browsers that support it (Safari on OS X 8+). If you have FairPlay content to play back **AND** you've changed the default techOrder to a custom one in your application, you will need to add this new tech into your techOrder object. We recommend you include it before silverlightSS so your content doesn't play back via PlayReady.
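-
-For example, if your application overrides the default techOrder, a custom order that keeps FairPlay playback working might look like the following sketch (the exact ordering beyond placing `html5FairPlayHLS` before `silverlightSS` is up to you):
-
-```javascript
- // A sketch: a custom techOrder that keeps html5FairPlayHLS ahead of silverlightSS,
- // so FairPlay-capable browsers (Safari on OS X 8+) don't fall through to PlayReady.
- var myPlayer = amp("vid1", {
-     techOrder: ["azureHtml5JS", "html5FairPlayHLS", "flashSS", "silverlightSS", "html5"]
- });
-```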
-
-## Code sample ##
-
-```html
- <video id="vid1" class="azuremediaplayer amp-default-skin">
- <source
- src="//example/path/to/myVideo.ism/manifest"
- type="application/vnd.ms-sstr+xml"
- data-setup='{"protectionInfo": [{"type": "AES", "authenticationToken": "Bearer=urn%3amicrosoft%3aazure%3amediaservices%3acontentkeyidentifier=8130520b-c116-45a9-824e-4a0082f3cb3c&Audience=urn%3atest&ExpiresOn=1450207516&Issuer=http%3a%2f%2ftestacs.com%2f&HMACSHA256=eV7HDgZ9msp9H9bnEPGN91sBdU7XsZ9OyB6VgFhKBAU%3d"}]}'
- />
- </video>
-```
-
-or
-
-```javascript
- var myPlayer = amp("vid1", /* Options */);
- myPlayer.src([{
- src: "//example/path/to/myVideo.ism/manifest",
- type: "application/vnd.ms-sstr+xml",
- protectionInfo: [{
- type: "PlayReady",
- authenticationToken: "Bearer=urn%3amicrosoft%3aazure%3amediaservices%3acontentkeyidentifier=d5646e95-63ee-4fbe-ba4e-295c8d9502e0&Audience=urn%3atest&ExpiresOn=1450222961&Issuer=http%3a%2f%2ftestacs.com%2f&HMACSHA256=4Jop3kNJdzVI8L5IZLgFtPdImyE%2fHTRil0x%2bEikSdPs%3d"
- }]
- }]);
-```
-
-or, with multiple DRM
-
-```javascript
- var myPlayer = amp("vid1", /* Options */);
- myPlayer.src([{
- src: "//example/path/to/myVideo.ism/manifest",
- type: "application/vnd.ms-sstr+xml",
- protectionInfo: [{
- type: "PlayReady",
- authenticationToken: "Bearer=urn%3amicrosoft%3aazure%3amediaservices%3acontentkeyidentifier=d5646e95-63ee-4fbe-ba4e-295c8d9502e0&Audience=urn%3atest&ExpiresOn=1450222961&Issuer=http%3a%2f%2ftestacs.com%2f&HMACSHA256=4Jop3kNJdzVI8L5IZLgFtPdImyE%2fHTRil0x%2bEikSdPs%3d"
- },
- {
- type: "Widevine",
- authenticationToken: "Bearer=urn%3amicrosoft%3aazure%3amediaservices%3acontentkeyidentifier=d5646e95-63ee-4fbe-ba4e-295c8d9502e0&Audience=urn%3atest&ExpiresOn=1450222961&Issuer=http%3a%2f%2ftestacs.com%2f&HMACSHA256=4Jop3kNJdzVI8L5IZLgFtPdImyE%2fHTRil0x%2bEikSdPs%3d"
- },
- {
- type: "FairPlay",
- certificateUrl: "//example/path/to/myFairplay.der",
- authenticationToken: "Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1cm46bWljcm9zb2Z0OmF6dXJlOm1lZGlhc2VydmljZXM6Y29udGVudGtleWlkZW50aWZpZXIiOiIyMTI0M2Q2OC00Yjc4LTRlNzUtYTU5MS1jZWMzMDI0NDNhYWMiLCJpc3MiOiJodHRwOi8vY29udG9zbyIsImF1ZCI6InVybjp0ZXN0IiwiZXhwIjoxNDc0NTkyNDYzLCJuYmYiOjE0NzQ1ODg1NjN9.mE7UxgNhkieMMqtM_IiYQj-FK1KKIzB6lAptw4Mi67A"
- }]
- }]);
-```
-
-> [!NOTE]
-> Not all browsers/platforms are capable of playing back protected content. See the [Playback Technology](azure-media-player-playback-technology.md) section for more information on what is supported.
-
-> [!IMPORTANT]
-> The token passed into the player is meant for secured content and is only used for authenticated users. It is assumed that the application is using SSL or some other form of security measure, and that the end user is trusted not to misuse the token; if that is not the case, please involve your security experts.
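-
-A pattern consistent with this guidance is to have your own authenticated backend issue the token over HTTPS and only then set the protected source. The sketch below assumes a hypothetical `/api/playback-token` endpoint and response shape; both are illustrative, not part of Azure Media Player:
-
-```javascript
- // A sketch: fetch a token for the signed-in user from a hypothetical secured
- // backend endpoint, then set the protected source with it.
- var myPlayer = amp("vid1", { /* Options */ });
- fetch("/api/playback-token", { credentials: "include" }) // hypothetical endpoint
-     .then(function (response) { return response.json(); })
-     .then(function (data) {
-         myPlayer.src([{
-             src: "//example/path/to/myVideo.ism/manifest",
-             type: "application/vnd.ms-sstr+xml",
-             protectionInfo: [{
-                 type: "AES",
-                 authenticationToken: "Bearer=" + data.token // token format depends on your token service
-             }]
-         }]);
-     });
-```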
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-quickstart.md
- Title: Azure Media Player Quickstart
-description: Learn the basic steps to set up the Azure Media Player.
---- Previously updated : 04/05/2021---
-# Azure Media Player quickstart
-Azure Media Player is easy to set up. It only takes a few minutes to get basic playback of media content from your Azure Media Services account. This section shows you the basic steps without going into details. The sections that follow give you specifics on how to set up and configure Azure Media Player. Simply add the following includes to your document's `<head>`:
-
-```html
- <link href="//amp.azure.net/libs/amp/latest/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
- <script src= "//amp.azure.net/libs/amp/latest/azuremediaplayer.min.js"></script>
-```
-
-> [!IMPORTANT]
-> You should **NOT** use the `latest` version in production, as it is subject to change on demand. Replace `latest` with a specific version of Azure Media Player; for example, replace `latest` with `1.0.0`. Azure Media Player versions can be queried from [here](https://amp.azure.net/libs/amp/latest/docs/changelog.html).
-
-## Use the video element
-
-Next, simply use the `<video>` element as you normally would, but with an additional `data-setup` attribute containing any options. These options can include any Azure Media Player option in a valid JSON object.
-
-```html
- <video id="vid1" class="azuremediaplayer amp-default-skin" autoplay controls width="640" height="400" poster="poster.jpg" data-setup='{"nativeControlsForTouch": false}'>
- <source src="http://amssamples.streaming.mediaservices.windows.net/91492735-c523-432b-ba01-faba6c2206a2/AzureMediaServicesPromo.ism/manifest" type="application/vnd.ms-sstr+xml" />
- <p class="amp-no-js">
- To view this video please enable JavaScript, and consider upgrading to a web browser that supports HTML5 video
- </p>
- </video>
-```
-
-If you don't want to use auto-setup, you can omit the `data-setup` attribute and initialize a video element manually.
-
-```javascript
- var myPlayer = amp('vid1', { /* Options */
- "nativeControlsForTouch": false,
- autoplay: false,
- controls: true,
- width: "640",
- height: "400",
- poster: ""
- }, function() {
- console.log('Good to go!');
- // add an event listener
- this.addEventListener('ended', function() {
- console.log('Finished!');
- });
- }
- );
- myPlayer.src([{
- src: "http://amssamples.streaming.mediaservices.windows.net/91492735-c523-432b-ba01-faba6c2206a2/AzureMediaServicesPromo.ism/manifest",
- type: "application/vnd.ms-sstr+xml"
- }]);
-```
-
-## Next steps ##
--- [Azure Media Player Full Setup](./azure-media-player-full-setup.md)
media-services Azure Media Player Url Rewriter https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-url-rewriter.md
- Title: Azure Media Player URL Rewriter
-description: Azure Media Player will rewrite a given URL from Azure Media Services to provide streams for SMOOTH, DASH, HLS v3, and HLS v4.
---- Previously updated : 04/05/2021---
-# URL rewriter #
-
-By default, Azure Media Player will rewrite a given URL from Azure Media Services to provide streams for SMOOTH, DASH, HLS v3, and HLS v4. For example, if the source is given as follows, Azure Media Player will ensure that it attempts to play all of the above protocols:
-
-```html
- <video id="vid1" class="azuremediaplayer amp-default-skin">
- <source src="//example/path/to/myVideo.ism/manifest" type="application/vnd.ms-sstr+xml" />
- </video>
-```
-
-However, if you do not wish to use the URL rewriter, you can disable it by adding the `disableUrlRewriter` property to the source parameters. This means all the information passed in the sources is given directly to the player without modification. Here is an example of adding two sources to the player, one DASH and one Smooth Streaming:
-
-```html
- <video id="vid1" class="azuremediaplayer amp-default-skin">
- <source src="//example/path/to/myVideo.ism/manifest(format=mpd-time-csf)" type="application/dash+xml" data-setup='{"disableUrlRewriter": true}'/>
- <source src="//example/path/to/myVideo.ism/manifest" type="application/vnd.ms-sstr+xml" data-setup='{"disableUrlRewriter": true}'/>
- </video>
-```
-
-or
-
-```javascript
- myPlayer.src([
- { src: "//example/path/to/myVideo.ism/manifest(format=mpd-time-csf)", type: "application/dash+xml", disableUrlRewriter: true },
- { src: "//example/path/to/myVideo.ism/manifest", type: "application/vnd.ms-sstr+xml", disableUrlRewriter: true }
- ]);
-```
-
-You can also specify which streaming formats you would like Azure Media Player to rewrite to by using the `streamingFormats` parameter. Options include `DASH`, `SMOOTH`, `HLSv3`, `HLSv4`, and `HLS`. The difference between `HLS` and `HLSv3`/`HLSv4` is that the `HLS` format supports playback of FairPlay content, while v3 and v4 do not. This is useful if you do not have a delivery policy for a particular protocol available. Here is an example for when the DASH protocol is not enabled for your asset.
-
-```html
- <video id="vid1" class="azuremediaplayer amp-default-skin">
- <source src="//example/path/to/myVideo.ism/manifest" type="application/vnd.ms-sstr+xml" data-setup='{"streamingFormats": ["SMOOTH", "HLS","HLS-V3", "HLS-V4"] }'/>
- </video>
-```
-
-or
-
-```javascript
- myPlayer.src([
- { src: "//example/path/to/myVideo.ism/manifest", type: "application/vnd.ms-sstr+xml", streamingFormats: ["SMOOTH", "HLS","HLS-V3", "HLS-V4"]},
- ]);
-```
-
-These two options can be used in combination with each other, depending on the delivery policies of your particular asset; see the sketch below.
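-
-As an illustration, the following sketch combines them on a per-source basis, reusing the placeholder URLs from the examples above: the first source restricts the rewriter to Smooth and HLS, while the second bypasses the rewriter entirely.
-
-```javascript
- // A sketch: mix streamingFormats and disableUrlRewriter per source.
- myPlayer.src([
-     { src: "//example/path/to/myVideo.ism/manifest", type: "application/vnd.ms-sstr+xml", streamingFormats: ["SMOOTH", "HLS"] },
-     { src: "//example/path/to/myVideo.ism/manifest(format=mpd-time-csf)", type: "application/dash+xml", disableUrlRewriter: true }
- ]);
-```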
-
-> [!NOTE]
-> Widevine protection information only persists on the DASH protocol.
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Azure Media Player Writing Plugins https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/azure-media-player-writing-plugins.md
- Title: Writing plugins for Azure Media Player
-description: Learn how to write a plugin with Azure Media Player with JavaScript
---- Previously updated : 04/05/2021---
-# Writing plugins for Azure Media Player #
-
-A plugin is JavaScript written to extend or enhance the player. You can write plugins that change Azure Media Player's appearance or functionality, or even have it interface with other services. You can do this in two easy steps:
-
-## Step 1 ##
-
-Write your JavaScript in a function like so:
-
-```javascript
-
- (function () {
- amp.plugin('yourPluginName', function (options) {
- var myPlayer = this;
- myPlayer.addEventListener(amp.eventName.ready, function () {
- console.log("player is ready!");
- });
- });
- }).call(this);
-```
-
-You can write your code directly in your HTML page within `<script>` tags or in an external JavaScript file. If you do the latter, be sure to include the JavaScript file in the `<head>` of your HTML page *after* the AMP script.
-
-Example:
-
-```html
- <!--*****START OF Azure Media Player Scripts*****-->
- <script src="//amp.azure.net/libs/amp/latest/azuremediaplayer.min.js"></script>
- <link href="//amp.azure.net/libs/amp/latest/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
- <!--*****END OF Azure Media Player Scripts*****-->
- <!--Add Plugins-->
- <script src="yourPluginName.js"></script>
-```
-
-## Step 2 ##
-Initialize the plugin with JavaScript in one of two ways:
-
-Method 1:
-
-```javascript
- var myOptions = {
- autoplay: true,
- controls: true,
- width: "640",
- height: "400",
- poster: "",
- plugins: {
- yourPluginName: {
- [your plugin options]: [example options]
- }
- }
- };
- var myPlayer = amp([videotag id], myOptions);
-```
-
-Method 2:
-
-```javascript
- var video = amp([videotag id]);
- video.yourPluginName({[your plugins option]: [example option]});
-```
-
-Plugin options are not required; including them just allows the developers using your plugin to configure its behavior without having to change the source code.
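-
-For example, a plugin might read its options and fall back to sensible defaults, roughly like this sketch (the `greeting` option is made up for illustration):
-
-```javascript
- // A sketch: a plugin that accepts a hypothetical "greeting" option
- // and falls back to a default when the developer doesn't pass one.
- (function () {
-     amp.plugin('helloPlugin', function (options) {
-         var myPlayer = this;
-         var greeting = (options && options.greeting) || "player is ready!";
-         myPlayer.addEventListener(amp.eventName.ready, function () {
-             console.log(greeting);
-         });
-     });
- }).call(this);
-```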
-
-For inspiration and more examples on creating a plugin, take a look at our [gallery](azure-media-player-plugin-gallery.md).
-
->[!NOTE]
-> Plugin code dynamically changes items in the DOM during the lifetime of the viewer's player experience; it never makes permanent changes to the player's source code. This is where an understanding of your browser's developer tools comes in handy. For example, if you'd like to change the appearance of an element in the player, you can find its HTML element by its class name and then add or change attributes from there. Here's a great resource on [changing HTML attributes](http://www.w3schools.com/js/js_htmldom_html.asp).
-
-### Integrated Plugins ###
-
- There are currently two plugins baked into AMP: the [time-tip](http://sr-test.azurewebsites.net/Tests/Plugin%20Gallery/plugins/timetip/example.html) and [hotkeys](http://sr-test.azurewebsites.net/Tests/Plugin%20Gallery/plugins/hotkeys/example.html). These plugins were originally developed as modular plugins for the player but are now included in the player's source code.
-
-### Plugin Gallery ###
-
-The [plugin gallery](https://aka.ms/ampplugins) has several plugins that the community has contributed, covering features like timeline markers, zoom, analytics, and more. The page provides access to each plugin, instructions on how to set it up, and a demo that shows the plugin in action. If you create a cool plugin that you think should be included in our gallery, feel free to submit it so we can check it out.
-
-## Next steps ##
--- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Demos https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/azure-media-player/demos.md
- Title: Azure Media Player demos
-description: This page contains a listing of links to demos of the Azure Media Player.
---- Previously updated : 04/05/2021---
-# Azure Media Player demos
-
-The following is a list of links to demos of the Azure Media Player. You can download all of the [Azure Media Player samples](https://github.com/Azure-Samples/azure-media-player-samples) from GitHub.
-
-## Demo listing
-
-| Sample name | Programmatic via JavaScript | Static via HTML5 video element | Description |
-|---|---|---|---|
-| Basic | | | |
-| Set Source | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_setsource.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_setsource.html) |Playback unprotected content.|
-| Features | | | |
-| Playback Speed | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_playback_speed.html)| N/A | Enables viewers to control what speed they're watching their video at. |
-| AMP Flush Skin | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_flush_skin.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_flush_skin.html) | Enables new AMP skin. **Note:** AMP flush is only supported in AMP versions 2.1.0+ |
-| Captions and Subtitles | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_webvtt.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_webvtt.html) | Playback with WebVTT subtitles. |
-| Live CEA 708 Captions | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_live_captions.html) | N/A | Playback with live CEA 708 inbound captions with the captions left-aligned. |
-| Streaming with Progressive Fallback | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_progressiveFallback.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_progressiveFallback.html) | Basic setup of adaptive playback with fallback for progressive if streaming not supported on platform. |
-| Progressive Video MP4 | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_progressiveVideo.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_progressiveVideo.html) | Playback of progressive video MP4. |
-| Progressive Audio MP3 | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_progressiveAudio.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_progressiveAudio.html) | Playback of progressive audio MP3. |
-| DD+ | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_dolbyDigitalPlus.html) | N/A | Playback of content with DD+ audio. |
-| Options | | | |
-| Heuristic Profile | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_heuristicsProfile.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_heuristicsProfile.html) | Changing the heuristics profile |
-| Localization | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_localization.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_localization.html) | Setting localization. |
-| Audio Tracks Menu | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_multiAudio.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_multiAudio.html) | Options to show how to display the audio tracks menu on the default skin. |
-| Hotkeys | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_hotKeys.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_hotKeys.html) | This sample shows how to configure which hotkeys are enabled in the player |
-| Events, Logging and Diagnostics | | | |
-| Register Events | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_registerEvents.html) | N/A | Playback with event listeners. |
-| Logging | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_logging.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_logging.html) | Turning on verbose logging to console. |
-| Diagnostics | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_diagnostics.html) | N/A | Getting diagnostic data. This sample only works on some techs. |
-| AES | | | |
-| AES no token | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_aes_notoken.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_aes_notoken.html) | Playback of AES content with no token. |
-| AES token | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_aes_token.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_aes_token.html) | Playback of AES content with token. |
-| AES HLS proxy simulation | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_aes_token_withHLSProxy.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_aes_token_withHLSProxy.html) | Playback of AES content with token, showing a proxy for HLS so that token can be used with iOS devices. |
-| AES token force flash | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_aes_token_forceFlash.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_aes_token_forceFlash.html) | Playback of AES content with token, forcing the flashSS tech. |
-| DRM | | | |
-| Multi-DRM with PlayReady, Widevine, and FairPlay | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_multiDRM_PlayReadyWidevineFairPlay_notoken.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_multiDRM_PlayReadyWidevineFairPlay_notoken.html) | Playback of DRM content with no token, with PlayReady, Widevine, and FairPlay headers. |
-| PlayReady no token | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_playready_notoken.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_playready_notoken.html) | Playback of PlayReady content with no token. |
-| PlayReady no token force Silverlight | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_playready_notoken_forceSilverlight.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_playready_notoken_forceSilverlight.html) | Playback of PlayReady content with no token, forcing silverlightSS tech. |
-| PlayReady token | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_playready_token.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_playready_token.html) | Playback of PlayReady content with token. |
-| PlayReady token force Silverlight | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_playready_token_forceSilverlight.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_playready_token_forceSilverlight.html) | Playback of PlayReady content with token, forcing silverlightSS tech. |
-| Protocol and Tech | | | |
-| Change techOrder | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_techOrder.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_techOrder.html) | Changing the tech order |
-| Force MPEG-DASH only in UrlRewriter | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_forceDash.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_forceDash.html) | Playback of unprotected content only using the DASH protocol. |
-| Exclude MPEG-DASH in UrlRewriter | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_forceNoDash.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_forceNoDash.html) | Playback of unprotected content only using the Smooth and HLS protocols. |
-| Multiple delivery policy | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_multipleDeliveryPolicy.html) | [Static](https://amp.azure.net/libs/amp/latest/samples/videotag_multipleDeliveryPolicy.html) | Setting the source with content that has multiple delivery policies from Azure Media Services |
-| Programmatically Select | | | |
-| Select text track | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_selectTextTrack.html) | N/A | Selecting a WebVTT track from the track list. |
-| Select bitrate | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_selectBitrate.html) | N/A | Selecting a bitrate from the list of bitrates. This sample only works on some techs. |
-| Select audio stream | [Dynamic](https://amp.azure.net/libs/amp/latest/samples/dynamic_selectAudioStream.html) | N/A | Selecting an Audio Stream from the list of available audio streams. This sample only works on some techs. |
-
-## Next steps
-
-- [Azure Media Player Quickstart](azure-media-player-quickstart.md)
media-services Access Api Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/access-api-howto.md
- Title: Get started with Azure AD authentication
-description: Learn how to access Azure Active Directory (Azure AD) authentication to consume the Azure Media Services API.
------- Previously updated : 03/31/2021--
-# Get credentials to access Media Services API
--
-When you use Azure AD authentication to access the Azure Media Services API, you have two authentication options:
-
-- **Service principal authentication** (recommended)
-
-  Authenticate a service. Applications that commonly use this authentication method are apps that run daemon services, middle-tier services, or scheduled jobs: web apps, function apps, logic apps, APIs, or microservices.
-
-- **User authentication**
-
-  Authenticate a person who is using the app to interact with Media Services resources. The interactive application should first prompt the user for credentials. An example is a management console app used by authorized users to monitor encoding jobs or live streaming.
-
-This article describes steps for getting credentials to access Media Services API. Choose from the following tabs.
-
-## Prerequisites
-
-- An Azure account. If you don't have an account, start with an [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
-- A Media Services account. For more information, see [Create an Azure Media Services account by using the Azure portal](account-create-how-to.md).
-
-## [Portal](#tab/portal/)
-
-### API access
-
-The **API access** page lets you select the authentication method you want to use to connect to the API. The page also provides the values you need to connect to the API.
-
-1. In the [Azure portal](https://portal.azure.com/), select your Media Services account.
-2. Select the **API access** blade on the left navigation bar.
-3. Under **Connect to Media Services API**, select the Media Services API version you want to connect to (V3 is the latest version of the service).
-
-### Service principal authentication (recommended)
-
-Service principal authentication authenticates a service by using an Azure Active Directory (Azure AD) app and secret. This is the recommended authentication method for any middle-tier services that call the Media Services API, such as web apps, functions, logic apps, APIs, and microservices.
-
-#### Manage your Azure AD app and secret
-
-The **Manage your AAD app and secret** section lets you select or create a new Azure AD app and generate a secret. For security purposes, the secret cannot be shown after the blade is closed. The application uses the application ID and secret for authentication to obtain a valid token for Media Services.
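-
-As a rough sketch of how an application turns the application ID and secret into a token, the following uses the `@azure/identity` JavaScript library; the placeholder values and the choice of the Azure Resource Manager scope are assumptions to confirm against the current Media Services samples:
-
-```javascript
- // A sketch: acquire a token with a service principal's app ID and secret.
- // Requires the @azure/identity package; all angle-bracket values are placeholders.
- const { ClientSecretCredential } = require("@azure/identity");
-
- const credential = new ClientSecretCredential(
-     "<tenant-id>",      // Azure AD tenant ID
-     "<application-id>", // app (client) ID from the API access blade
-     "<client-secret>"   // secret generated on the API access blade
- );
-
- // Media Services v3 is managed through Azure Resource Manager, so request an ARM token.
- credential.getToken("https://management.azure.com/.default")
-     .then(function (token) { console.log("Token acquired, expires:", new Date(token.expiresOnTimestamp)); });
-```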
-
-Make sure that you have sufficient permissions to register an application with your Azure AD tenant and to assign the application to a role in your Azure subscription. For more information, see [Required permissions](../../active-directory/develop/howto-create-service-principal-portal.md#permissions-required-for-registering-an-app).
-
-#### Connect to Media Services API
-
-The **Connect to Media Services API** section provides the values you use to connect your service principal application. You can get text values or copy the JSON or XML blocks.
-
-### User authentication
-
-This option can be used to authenticate an employee or member of an Azure Active Directory tenant who is using an app to interact with Media Services resources. The interactive application should first prompt the user for credentials. This authentication method should only be used for management applications.
-
-#### Connect to Media Services API
-
-Copy your credentials to connect your user application from the **Connect to Media Services API** section. You can get text values or copy the JSON or XML blocks.
-
-## [CLI](#tab/cli/)
----
-## Next steps
-
-[Tutorial: Upload, encode, and stream videos with Media Services v3](stream-files-tutorial-with-api.md).
media-services Account Add Account Storage How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-add-account-storage-how-to.md
- Title: Add storage to a Media Services Account
-description: This article shows you how to add storage to a Media Services account.
----- Previously updated : 03/08/2022--
-# Add storage to a Media Services Account
--
-<!-- NOTE: The following are in the includes folder and are reused in other How To articles. All task based content should be in the includes folder with the task- prefix prepended to the file name. -->
-
-This article shows you how to add storage to a Media Services account.
-
-## Methods
-
-You can use the following methods to add storage to a Media Services account.
-
-## CLI
---
media-services Account Create How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-create-how-to.md
- Title: Create an Azure Media Services account
-description: This tutorial walks you through the steps of creating an Azure Media Services account.
----- Previously updated : 03/04/2022--
-# Create a Media Services account
--
-To start encrypting, encoding, analyzing, managing, and streaming media content in Azure, you need to create a Media Services account. The Media Services account needs to be associated with one or more storage accounts. This article describes steps for creating a new Azure Media Services account.
--
-## Prerequisites
-
-If you aren't familiar with the Azure managed identity platform, take some time to understand the platform and the differences between identity types. The default managed identity type for a Media Services account is a user-managed identity.
-
-- Read about the [Microsoft identity platform](../../active-directory/develop/app-objects-and-service-principals.md).
-- Read about [managed identities for Azure resources](../../active-directory/managed-identities-azure-resources/overview.md).
-- You might also want to take a few moments to read about [applications and service principals](../../active-directory/develop/app-objects-and-service-principals.md).
-
-## Create an account
-
-You can use either the Azure portal or the CLI to create a Media Services account. Choose the tab for the method you would like to use.
--
-<!-- NOTE: The following are in the includes folder and are reused in other How To articles. All task based content should be in the includes folder with the task- prefix prepended to the file name. -->
--
-## [Portal](#tab/portal/)
-----
-## [CLI](#tab/cli/)
-
-<!-- Set the subscription -->
--
-<!-- Create a storage account -->
--
-## Create a Media Services account
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/mediaservices/create-or-update).
--
media-services Account Delete How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-delete-how-to.md
- Title: Delete a Media Services account
-description: This article shows you how to delete a Media Services account.
----- Previously updated : 03/04/2022--
-# Delete a Media Services account
--
-This article shows you how to delete a Media Services account.
-
-## [Portal](#tab/portal/)
-
-You can find the Media Services accounts in the portal by navigating to your subscription and finding the resource groups within the subscription.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Service [REST API](/rest/api/media/mediaservices/delete).
--
media-services Account List Assets How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-list-assets-how-to.md
- Title: List the assets in a Media Services account
-description: This article shows you how to list the assets in a Media Services account.
----- Previously updated : 03/08/2022---
-# List the assets in a Media Services account
--
-This article shows you how to list the assets in a Media Services account.
-
-## Methods
-
-You can use the following methods to list the assets in a Media Services account.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
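-Whichever method you choose, a rough sketch of listing assets with the JavaScript SDK might look like the following; it assumes the `@azure/arm-mediaservices` and `@azure/identity` packages, and the exact client and method names should be confirmed against the SDK version you use:
-
-```javascript
- // A sketch: enumerate the assets in a Media Services account.
- // All angle-bracket values are placeholders.
- const { AzureMediaServices } = require("@azure/arm-mediaservices");
- const { DefaultAzureCredential } = require("@azure/identity");
-
- async function listAssets() {
-     const client = new AzureMediaServices(new DefaultAzureCredential(), "<subscription-id>");
-     for await (const asset of client.assets.list("<resource-group>", "<account-name>")) {
-         console.log(asset.name);
-     }
- }
-
- listAssets();
-```
-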
media-services Account List How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-list-how-to.md
- Title: List the Media Services accounts in a subscription
-description: This article shows you how to list the Media Service accounts in an Azure subscription.
----- Previously updated : 03/04/2022--
-# List the Media Services accounts in a subscription
--
-This article shows you how to list the Media Service accounts in an Azure subscription.
-
-## [Portal](#tab/portal/)
-
-You can find the Media Services accounts in the portal by navigating to your subscription and finding the resource groups within the subscription.
-
-## [CLI](#tab/cli/)
---
media-services Account List Transforms How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-list-transforms-how-to.md
- Title: List the transforms in a Media Services account
-description: This article shows you how to list the transforms in a Media Services account.
----- Previously updated : 03/08/2022---
-# List the transforms in a Media Services account
--
-This article shows you how to list the transforms in a Media Services account.
-
-## Methods
-
-You can use the following methods to list the transforms in a Media Services account.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/transforms/list).
--
media-services Account Move Account How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-move-account-how-to.md
- Title: Manage Azure Media Services v3 accounts
-description: To start managing, encrypting, encoding, analyzing, and streaming media content in Azure, you need to create a Media Services account. This article explains how to manage Azure Media Services v3 accounts.
------- Previously updated : 03/29/2021---
-# Manage Azure Media Services v3 accounts
--
-To start managing, encrypting, encoding, analyzing, and streaming media content in Azure, you need to create a Media Services account. When creating a Media Services account, you need to supply the name of an Azure Storage account resource. The specified storage account is attached to your Media Services account. The Media Services account and all associated storage accounts must be in the same Azure subscription. For more information, see [Storage accounts](storage-account-concept.md).
--
-## Media Services account names
-
-Media Services account names must be all lowercase letters or numbers with no spaces, and between 3 and 24 characters in length. Media Services account names must be unique within an Azure location.
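-
-Expressed as a quick validation sketch, the character and length rules amount to the following (the regular expression only encodes the rule above; uniqueness within an Azure location still has to be checked by Azure):
-
-```javascript
- // A sketch: validate the lowercase-letters-or-numbers, 3-24 character rule.
- function isValidMediaServicesAccountName(name) {
-     return /^[a-z0-9]{3,24}$/.test(name);
- }
-
- console.log(isValidMediaServicesAccountName("amsmediaaccountname")); // true
- console.log(isValidMediaServicesAccountName("Has-Caps"));            // false
-```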
-
-When a Media Services account is deleted, the account name is reserved for one year. For a year after the account is deleted, the account name may only be reused in the same Azure location by the subscription that contained the original account.
-
-Media Services account names are used in DNS names, including for Key Delivery, Live Events, and Streaming Endpoint names. If you have configured firewalls or proxies to allow Media Services DNS names, ensure these configurations are removed within a year of deleting a Media Services account.
-
-## Moving a Media Services account between subscriptions
-
-If you need to move a Media Services account to a new subscription, you need to first move the entire resource group that contains the Media Services account to the new subscription. You must move all attached resources: Azure Storage accounts, Azure CDN profiles, etc. For more information, see [Move resources to new resource group or subscription](../../azure-resource-manager/management/move-resource-group-and-subscription.md). As with any resources in Azure, resource group moves can take some time to complete.
-
-### Considerations
-
-* Create backups of all data in your account before migrating to a different subscription.
-* You need to stop all the Streaming Endpoints and live streaming resources. Your users will not be able to access your content for the duration of the resource group move.
-
-> [!IMPORTANT]
-> Do not start the Streaming Endpoint until the move completes successfully.
-
-### Troubleshoot
-
-If a Media Services account or an associated Azure Storage account becomes "disconnected" following the resource group move, try rotating the storage account keys. If rotating the storage account keys does not resolve the "disconnected" status of the Media Services account, file a new support request from the "Support + troubleshooting" menu in the Media Services account.
-
-## Next steps
-
-[Create an account](./account-create-how-to.md)
media-services Account Remove Account Storage How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-remove-account-storage-how-to.md
- Title: Remove a storage account from a Media Services account
-description: This article shows you how to remove a storage account from a Media Services account
----- Previously updated : 03/08/2022---
-# Remove a storage account from a Media Services account
--
-This article shows you how to remove a storage account from a Media Services account.
-
-## Methods
-
-You can use the following methods to remove a storage account from a Media Services account.
-
-## CLI
-
media-services Account Reset Account Credentials https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-reset-account-credentials.md
- Title: Reset your account credentials -CLI
-description: Use the Azure CLI script to reset your account credentials and get the app.config settings back.
------ Previously updated : 08/31/2020----
-# Azure CLI example: Reset the account credentials
--
-The Azure CLI script in this article shows how to reset your account credentials and get the app.config settings back.
-
-## Prerequisites
-
-[Create a Media Services account](./create-account-howto.md).
-
-## Example script
-
-```azurecli-interactive
-# Update the following variables for your own settings:
-resourceGroup=amsResourceGroup
-amsAccountName=amsmediaaccountname
-
-az ams account sp reset-credentials \
- --account-name $amsAccountName \
- --resource-group $resourceGroup
- ```
-
-## Next steps
-
-* [az ams](/cli/azure/ams)
-* [Reset credentials](/cli/azure/ams/account/sp#az-ams-account-sp-reset-credentials)
media-services Account Set Account Encryption Customer Managed Key How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-set-account-encryption-customer-managed-key-how-to.md
- Title: Set the Media Services account encryption with customer managed keys
-description: This article shows you how to set the Media Services account encryption with customer managed keys.
----- Previously updated : 03/08/2022---
-# Set the Media Services account encryption with customer managed keys
--
-This article shows you how to set the Media Services account encryption with customer managed keys.
-
-## Methods
-
-You can use the following methods to set the Media Services account encryption with customer managed keys.
-
-## CLI
-
media-services Account Set Account Encryption System Managed Key How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-set-account-encryption-system-managed-key-how-to.md
- Title: Set the Media Services account encryption with system managed keys
-description: This article shows you how to set the Media Services account encryption with system managed keys.
----- Previously updated : 03/08/2022---
-# Set the Media Services account encryption with system managed keys
--
-This article shows you how to set the Media Services account encryption with system managed keys.
-
-## Methods
-
-You can use the following methods to set the Media Services account encryption with system managed keys.
-
-## CLI
-
media-services Account Set Storage Authentication How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-set-storage-authentication-how-to.md
- Title: Set the Media Services storage authentication
-description: This article shows you how to set the Media Services storage authentication.
----- Previously updated : 03/08/2022---
-# Set the Media Services storage authentication
--
-This article shows you how to set the Media Services storage authentication.
-
-## Methods
-
-You can use the following methods to set the Media Services storage authentication.
-
-## CLI
-
media-services Account Show Encryption How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-show-encryption-how-to.md
- Title: Show the account encryption of a Media Services Account
-description: This article shows you how to show the account encryption of a Media Services Account.
----- Previously updated : 03/08/2022--
-# Show the account encryption of a Media Services Account
--
-This article shows you how to show the account encryption of a Media Services Account.
-
-<!-- NOTE: The following are in the includes folder and are reused in other How To articles. All task based content should be in the includes folder with the task- prefix prepended to the file name. -->
-
-## Methods
-
-You can use the following methods to show the account encryption of a Media Services Account.
-
-## CLI
---
media-services Account Show How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-show-how-to.md
- Title: Show the details of a Media Services account
-description: This article shows you how to show the details of a Media Services account.
----- Previously updated : 03/04/2022--
-# Show the details of a Media Services account
--
-This article shows you how to show the details of a Media Services account.
-
-## [Portal](#tab/portal/)
-
-You can find the Media Services accounts in the portal by navigating to your subscription and finding the resource groups within the subscription.
-
-## [CLI](#tab/cli/)
---
media-services Account Update How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/account-update-how-to.md
- Title: Update the details of a Media Services account
-description: This article shows you how to update a Media Services account.
----- Previously updated : 03/04/2022--
-# Update the details of a Media Services account
--
-This article shows you how to update a Media Services account.
-
-## [Portal](#tab/portal/)
-
-The Media Services account can be updated in the portal using the Media Services account navigation.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/mediaservices/update).
--
media-services Analyze Face Redaction Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/analyze-face-redaction-concept.md
- Title: Find and redact faces in Azure Media Services v3 API | Microsoft Docs
-description: Azure Media Services v3 provides a face detection and redaction (blur) preset that enables you to submit a video file, detect faces, and optionally apply redaction (blurring) to them in a single combined pass, or through a two-stage operation allowing for editing. This article demonstrates how to find and redact faces with the Face Detector preset in the v3 API.
------ Previously updated : 03/25/2021---
-# Find and redact (blur) faces with the Face Detector preset
--
-Azure Media Services v3 API includes a Face Detector preset that offers scalable face detection and redaction (blurring) in the cloud. Face redaction enables you to modify your video to blur the faces of selected individuals. You may want to use the face redaction service in public safety and news media scenarios. A few minutes of footage that contains multiple faces can take hours to redact manually, but with this preset the face redaction process requires just a few simple steps.
-
-This article gives details about **Face Detector Preset** and shows how to use it with Azure Media Services SDK for .NET.
-
-## Compliance, privacy, and security
-
-As an important reminder, you must comply with all applicable laws in your use of analytics in Azure Media Services. You must not use Azure Media Services or any other Azure service in a manner that violates the rights of others. Before uploading any videos, including any biometric data, to the Azure Media Services service for processing and storage, you must have all the proper rights, including all appropriate consents, from the individuals in the video. To learn about compliance, privacy, and security in Azure Media Services, review the Azure [Cognitive Services Terms](https://azure.microsoft.com/support/legal/cognitive-services-compliance-and-privacy/). For Microsoft's privacy obligations and handling of your data, review Microsoft's [Privacy Statement](https://privacy.microsoft.com/PrivacyStatement), the [Online Services Terms](https://www.microsoft.com/licensing/product-licensing/products) (OST), and the [Data Processing Addendum](https://www.microsoftvolumelicensing.com/DocumentSearch.aspx?Mode=3&DocumentTypeId=67) ("DPA"). More privacy information, including on data retention and deletion/destruction, is available in the OST and [here](../../azure-video-analyzer/video-analyzer-for-media-docs/faq.yml). By using Azure Media Services, you agree to be bound by the Cognitive Services Terms, the OST, the DPA, and the Privacy Statement.
-
-## Face redaction modes
-
-Facial redaction works by detecting faces in every frame of video and tracking the face object both forwards and backwards in time, so that the same individual can be blurred from other angles as well. The automated redaction process is complex and is not guaranteed to blur every face. For this reason, the preset can be used in a two-pass mode to improve the quality and accuracy of the blurring through an editing stage before the file is submitted for the final blur pass.
-
-In addition to the fully automatic **Combined** mode, the two-pass workflow allows you to choose the faces you wish to blur (or not blur) via a list of face IDs. To make arbitrary per-frame adjustments, the preset uses a metadata file in JSON format as input to the second pass. This workflow is split into **Analyze** and **Redact** modes.
-
-You can also combine the two modes in a single pass that runs both tasks in one job; this mode is called **Combined**. In this article, the sample code shows how to use the simplified single-pass **Combined** mode on a sample source file.
-
-### Combined mode
-
-The **Combined** mode produces a redacted MP4 video file in a single pass, without any manual editing of the JSON file. The output in the asset folder for the job is a single .mp4 file that contains blurred faces, using the selected blur effect. Set the resolution property to **SourceResolution** to achieve the best results for redaction.
-
-| Stage | File Name | Notes |
-| | | |
-| Input asset |"ignite-sample.mp4" |Video in WMV, MOV, or MP4 format |
-| Preset config |Face Detector configuration | **mode**: FaceRedactorMode.Combined, **blurType**: BlurType.Med, **resolution**: AnalysisResolution.SourceResolution |
-| Output asset |"ignite-redacted.mp4" |Video with blurring effect applied to faces |
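-
-As a sketch of what the preset configuration in the table above amounts to, a Transform output using the Face Detector preset in Combined mode can be described with an object shaped like the v3 REST schema; the `@odata.type` discriminator and enum strings below are assumptions to verify against the current Media Services REST reference:
-
-```javascript
- // A sketch of the Combined-mode Face Detector preset settings from the table above.
- const faceRedactionPreset = {
-     "@odata.type": "#Microsoft.Media.FaceDetectorPreset", // assumed discriminator; verify
-     mode: "Combined",              // analyze and redact in a single pass
-     blurType: "Med",               // Low | Med | High | Box | Black
-     resolution: "SourceResolution" // best redaction results per the guidance above
- };
-```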
-
-### Analyze mode
-
-The **Analyze** pass of the two-pass workflow takes a video input and produces a JSON file with a list of the face locations, plus face IDs and jpg images of each detected face. Be advised that the face IDs are not guaranteed to be identical on subsequent runs of the analysis pass.
-
-| Stage | File Name | Notes |
-| | | |
-| Input asset |"ignite-sample.mp4" |Video in WMV, MOV, or MP4 format |
-| Preset config |Face Detector configuration |**mode**: FaceRedactorMode.Analyze, **resolution**: AnalysisResolution.SourceResolution|
-| Output asset |ignite-sample_annotations.json |Annotation data of face locations in JSON format. Face IDs aren't guaranteed to be identical on subsequent runs of the analysis pass. You can edit this file to modify the blurring bounding boxes. See the sample below. |
-| Output asset |foo_thumb%06d.jpg [foo_thumb000001.jpg, foo_thumb000002.jpg] |A cropped jpg of each detected face, where the number indicates the labelId of the face |
-
-#### Output example
-
-```json
-{
- "version": 1,
- "timescale": 24000,
- "offset": 0,
- "framerate": 23.976,
- "width": 1280,
- "height": 720,
- "fragments": [
- {
- "start": 0,
- "duration": 48048,
- "interval": 1001,
- "events": [
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [
- {
- "index": 13,
- "id": 1138,
- "x": 0.29537,
- "y": -0.18987,
- "width": 0.36239,
- "height": 0.80335
- },
- {
- "index": 13,
- "id": 2028,
- "x": 0.60427,
- "y": 0.16098,
- "width": 0.26958,
- "height": 0.57943
- }
- ],
-
- ... truncated
-```
-
-### Redact (blur) mode
-
-The second pass of the workflow takes a larger number of inputs that must be combined into a single asset.
-
-This includes a list of IDs to blur, the original video, and the annotations JSON. This mode uses the annotations to apply blurring on the input video.
-
-The output from the Analyze pass does not include the original video. The video needs to be uploaded into the input asset for the Redact mode task and selected as the primary file.
-
-| Stage | File Name | Notes |
-|---|---|---|
-| Input asset |"ignite-sample.mp4" |Video in WMV, MPV, or MP4 format. Same video as in step 1. |
-| Input asset |"ignite-sample_annotations.json" |Annotations metadata file from phase one, with optional modifications if you wish to change the faces blurred. This must be edited in an external application, code, or text editor. |
-| Input asset | "ignite-sample_IDList.txt" (Optional) | Optional new line separated list of face IDs to redact. If left blank, all faces in the source will have blur applied. You can use the list to selectively choose not to blur specific faces. |
-| Face Detector preset |Preset configuration | **mode**: FaceRedactorMode.Redact, **blurType**: BlurType.Med |
-| Output asset |"ignite-sample-redacted.mp4" |Video with blurring applied based on annotations |
-
-#### Example output
-
-The following shows an example IDList file used to select which face IDs are redacted.
-Face IDs aren't guaranteed to be identical on subsequent runs of the analysis pass.
-
-Example foo_IDList.txt
-
-```
-1
-2
-3
-```
-
-## Blur types
-
-In the **Combined** or **Redact** mode, there are five different blur modes you can choose from via the JSON input configuration: **Low**, **Med**, **High**, **Box**, and **Black**. By default **Med** is used.
-
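-If you're configuring the preset in .NET rather than in JSON, the same choice maps to the `BlurType` enum. A sketch, assuming the v3 SDK models:
-
-```csharp
-using Microsoft.Azure.Management.Media.Models;
-
-// The five blur effects correspond to the BlurType enum values.
-// Example: override the default (Med) with a solid black box.
-var preset = new FaceDetectorPreset
-{
-    Mode = FaceRedactorMode.Combined,
-    BlurType = BlurType.Black   // or Low, Med, High, Box
-};
-```
-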
-You can find samples of the blur types below.
-
-#### Low
-
-![Low resolution blur setting example.](./media/media-services-face-redaction/blur-1.png)
-
-#### Med
-
-![Medium resolution blur setting example.](./media/media-services-face-redaction/blur-2.png)
-
-#### High
-
-![High resolution blur setting example.](./media/media-services-face-redaction/blur-3.png)
-
-#### Box
-
-![Box mode for use in debugging your output.](./media/media-services-face-redaction/blur-4.png)
-
-#### Black
-
-![Black box mode covers all faces with black boxes.](./media/media-services-face-redaction/blur-5.png)
-
-## Elements of the output JSON file
-
-The face redaction media processor provides high-precision face location detection and tracking that can detect up to 64 human faces in a video frame. Frontal faces provide the best results, while side faces and small faces (less than or equal to 24x24 pixels) are challenging.
--
-## .NET sample code
-
-The following program shows how to use the **Combined** single-pass redaction mode:
-
-- Create an asset and upload a media file into the asset.
-- Configure the Face Detector preset that uses the mode and blurType settings.
-- Create a new Transform using the Face Detector preset.
-- Download the output redacted video file.
-## Download and configure the sample
-
-Clone a GitHub repository that contains the .NET sample to your machine using the following command:
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-dotnet.git
- ```
-
-The sample is located in the [FaceRedactor](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoAnalytics/FaceRedactor) folder. Open [appsettings.json](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/VideoAnalytics/FaceRedactor/appsettings.json) in your downloaded project. Replace the values with the credentials you got from [accessing APIs](./access-api-howto.md).
-
-**Optionally**, you can copy the **[sample.env](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/sample.env)** file at the root of the repository, fill out the details there, and rename the file to **.env** (note the dot at the front) so that it can be used across all sample projects in the repository. This eliminates the need for a populated appsettings.json file in each sample, and also protects you from checking any settings into your own cloned GitHub repositories.
-
-### Examples
-
-This code shows how to set up the **FaceDetectorPreset** for a **Combined** mode blur.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/VideoAnalytics/FaceRedactor/Program.cs#FaceDetectorPreset)]
-
-This code sample shows how the preset is passed into a Transform object during creation. After creating the Transform, jobs may be submitted directly to it.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/VideoAnalytics/FaceRedactor/Program.cs#FaceDetectorPresetTransform)]
-
-## Next steps
--
-## Provide feedback
-
media-services Analyze Video Audio Files Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/analyze-video-audio-files-concept.md
- Title: Analyze video and audio files
-description: Learn how to analyze audio and video content using AudioAnalyzerPreset and VideoAnalyzerPreset in Azure Media Services.
- Previously updated: 07/26/2021
-# Analyze video and audio files with Azure Media Services
---
-Media Services lets you extract insights from your video and audio files using the audio and video analyzer presets. This article describes the analyzer presets used to extract insights. If you want more detailed insights from your videos, use the [Azure Video Analyzer for Media service](../../azure-video-analyzer/video-analyzer-for-media-docs/video-indexer-overview.md). To understand when to use Video Analyzer for Media vs. Media Services analyzer presets, check out the [comparison document](../../azure-video-analyzer/video-analyzer-for-media-docs/compare-video-indexer-with-media-services-presets.md).
-
-There are two modes for the Audio Analyzer preset, basic and standard. See the description of the differences in the table below.
-
-To analyze your content using Media Services v3 presets, you create a **Transform** and submit a **Job** that uses one of these presets: [VideoAnalyzerPreset](/rest/api/media/transforms/createorupdate#videoanalyzerpreset) or [AudioAnalyzerPreset](/rest/api/media/transforms/createorupdate#audioanalyzerpreset).
-
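-As a quick, hedged illustration (not this article's own sample code), creating such a Transform and Job with the .NET SDK might look like the following. It assumes an authenticated `IAzureMediaServicesClient` inside an async method; the resource group, account, and entity names are placeholders.
-
-```csharp
-using Microsoft.Azure.Management.Media;
-using Microsoft.Azure.Management.Media.Models;
-
-// Define a Transform whose single output runs the video analyzer preset.
-TransformOutput[] outputs =
-{
-    new TransformOutput(new VideoAnalyzerPreset("en-US"))
-};
-Transform transform = await client.Transforms.CreateOrUpdateAsync(
-    resourceGroup, accountName, "MyAnalyzerTransform", outputs);
-
-// Submit a Job that applies the Transform to an input asset.
-Job job = await client.Jobs.CreateAsync(
-    resourceGroup, accountName, "MyAnalyzerTransform", "MyAnalyzerJob",
-    new Job
-    {
-        Input = new JobInputAsset(assetName: "inputAsset"),
-        Outputs = new JobOutputAsset[] { new JobOutputAsset("outputAsset") }
-    });
-```
-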
-## Compliance, Privacy, and Security
-
-You must comply with all applicable laws in your use of Video Analyzer for Media, and you may not use Video Analyzer for Media or any other Azure service in a manner that violates the rights of others or may be harmful to others. Before uploading any videos, including any biometric data, to the Video Analyzer for Media service for processing and storage, you must have all the proper rights, including all appropriate consents, from the individual(s) in the video. To learn about compliance, privacy, and security in Video Analyzer for Media, see the Azure [Cognitive Services Terms](https://azure.microsoft.com/support/legal/cognitive-services-compliance-and-privacy/). For Microsoft's privacy obligations and handling of your data, review Microsoft's [Privacy Statement](https://privacy.microsoft.com/PrivacyStatement), the [Online Services Terms](https://www.microsoft.com/licensing/product-licensing/products) ("OST"), and the [Data Processing Addendum](https://www.microsoftvolumelicensing.com/DocumentSearch.aspx?Mode=3&DocumentTypeId=67) ("DPA"). More privacy information, including on data retention and deletion/destruction, is available in the OST and [here](../../azure-video-analyzer/video-analyzer-for-media-docs/faq.yml). By using Video Analyzer for Media, you agree to be bound by the Cognitive Services Terms, the OST, the DPA, and the Privacy Statement.
-
-## Built-in presets
-
-Media Services currently supports the following built-in analyzer presets:
-
-|**Preset name**|**Scenario / Mode**|**Details**|
-|---|---|---|
-|[AudioAnalyzerPreset](/rest/api/media/transforms/createorupdate#audioanalyzerpreset)|Analyzing audio Standard mode|The preset applies a predefined set of AI-based analysis operations, including speech transcription. Currently, the preset supports processing content with a single audio track that contains speech in a single language. Specify the language for the audio payload in the input using the BCP-47 format of 'language tag-region'. See the supported languages list below for available language codes. If the language isn't set, or is set to null, automatic language detection chooses the first language detected and continues with that language for the whole file. The automatic language detection feature currently supports: English, Chinese, French, German, Italian, Japanese, Spanish, Russian, and Brazilian Portuguese. It doesn't support dynamically switching between languages after the first language is detected. The automatic language detection feature works best with audio recordings with clearly discernible speech. If automatic language detection fails to find the language, the transcription falls back to English.|
-|[AudioAnalyzerPreset](/rest/api/media/transforms/createorupdate#audioanalyzerpreset)|Analyzing audio Basic mode|This preset mode performs speech-to-text transcription and generation of a VTT subtitle/caption file. The output of this mode includes an Insights JSON file including only the keywords, transcription, and timing information. Automatic language detection and speaker diarization are not included in this mode. The list of supported languages is identical to the Standard mode above.|
-|[VideoAnalyzerPreset](/rest/api/media/transforms/createorupdate#videoanalyzerpreset)|Analyzing audio and video|Extracts insights from both the audio and video tracks of a file. See the VideoAnalyzerPreset section below for the list of insights.|
-|[FaceDetectorPreset](/rest/api/media/transforms/createorupdate#facedetectorpreset)|Detecting faces present in video|Describes the settings to be used when analyzing a video to detect all the faces present.|
-
-## Supported languages
-
-* Arabic ('ar-BH', 'ar-EG', 'ar-IQ', 'ar-JO', 'ar-KW', 'ar-LB', 'ar-OM', 'ar-QA', 'ar-SA' and 'ar-SY')
-* Brazilian Portuguese ('pt-BR')
-* Chinese ('zh-CN')
-* Danish ('da-DK')
-* English ('en-US', 'en-GB' and 'en-AU')
-* Finnish ('fi-FI')
-* French ('fr-FR' and 'fr-CA')
-* German ('de-DE')
-* Hebrew ('he-IL')
-* Hindi ('hi-IN')
-* Italian ('it-IT')
-* Japanese ('ja-JP')
-* Korean ('ko-KR')
-* Norwegian ('nb-NO')
-* Persian ('fa-IR')
-* Portugal Portuguese ('pt-PT')
-* Russian ('ru-RU')
-* Spanish ('es-ES' and 'es-MX')
-* Swedish ('sv-SE')
-* Thai ('th-TH')
-* Turkish ('tr-TR')
-
-### AudioAnalyzerPreset standard mode
-
-The preset enables you to extract multiple audio insights from an audio or video file.
-
-The output includes a JSON file (with all the insights) and VTT file for the audio transcript. This preset accepts a property that specifies the language of the input file in the form of a [BCP47](https://tools.ietf.org/html/bcp47) string. The audio insights include:
-
-* **Audio transcription**: A transcript of the spoken words with timestamps. Multiple languages are supported.
-* **Speaker indexing**: A mapping of the speakers and the corresponding spoken words.
-* **Speech sentiment analysis**: The output of sentiment analysis performed on the audio transcription.
-* **Keywords**: Keywords that are extracted from the audio transcription.
-
-### AudioAnalyzerPreset basic mode
-
-The preset enables you to extract multiple audio insights from an audio or video file.
-
-The output includes a JSON file and VTT file for the audio transcript. This preset accepts a property that specifies the language of the input file in the form of a [BCP47](https://tools.ietf.org/html/bcp47) string. The output includes:
-
-* **Audio transcription**: A transcript of the spoken words with timestamps. Multiple languages are supported, but automatic language detection and speaker diarization are not included.
-* **Keywords**: Keywords that are extracted from the audio transcription.
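-
-A hedged .NET sketch of selecting between the two modes (assuming the `Microsoft.Azure.Management.Media` models; the language code is illustrative):
-
-```csharp
-using Microsoft.Azure.Management.Media.Models;
-
-// Standard mode (the default): full audio insights, including diarization.
-var standardAudio = new AudioAnalyzerPreset { AudioLanguage = "en-US" };
-
-// Basic mode: transcription, VTT captions, and keywords only.
-var basicAudio = new AudioAnalyzerPreset
-{
-    AudioLanguage = "en-US",
-    Mode = AudioAnalysisMode.Basic
-};
-```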
-
-### VideoAnalyzerPreset
-
-The preset enables you to extract multiple audio and video insights from a video file. The output includes a JSON file (with all the insights), a VTT file for the video transcript, and a collection of thumbnails. This preset also accepts a [BCP47](https://tools.ietf.org/html/bcp47) string (representing the language of the video) as a property. The video insights include all the audio insights mentioned above and the following extra items:
-
-* **Face tracking**: The time during which faces are present in the video. Each face has a face ID and a corresponding collection of thumbnails.
-* **Visual text**: The text that's detected via optical character recognition. The text is time stamped and also used to extract keywords (in addition to the audio transcript).
-* **Keyframes**: A collection of keyframes extracted from the video.
-* **Visual content moderation**: The portion of the videos flagged as adult or racy in nature.
-* **Annotation**: A result of annotating the videos based on a pre-defined object model.
-
-## insights.json elements
-
-The output includes a JSON file (insights.json) with all the insights found in the video or audio. The JSON may contain the following elements:
-
-### transcript
-
-|Name|Description|
-|---|---|
-|id|The line ID.|
-|text|The transcript itself.|
-|language|The transcript language. Intended to support transcripts where each line can have a different language.|
-|instances|A list of time ranges where this line appeared. For a transcript line, there's only one instance.|
-
-Example:
-
-```json
-"transcript": [
-{
- "id": 0,
- "text": "Hi I'm Doug from office.",
- "language": "en-US",
- "instances": [
- {
- "start": "00:00:00.5100000",
- "end": "00:00:02.7200000"
- }
- ]
-},
-{
- "id": 1,
- "text": "I have a guest. It's Michelle.",
- "language": "en-US",
- "instances": [
- {
- "start": "00:00:02.7200000",
- "end": "00:00:03.9600000"
- }
- ]
-}
-]
-```
-
-### ocr
-
-|Name|Description|
-|---|---|
-|id|The OCR line ID.|
-|text|The OCR text.|
-|confidence|The recognition confidence.|
-|language|The OCR language.|
-|instances|A list of time ranges where this OCR appeared (the same OCR can appear multiple times).|
-
-```json
-"ocr": [
- {
- "id": 0,
- "text": "LIVE FROM NEW YORK",
- "confidence": 0.91,
- "language": "en-US",
- "instances": [
- {
- "start": "00:00:26",
- "end": "00:00:52"
- }
- ]
- },
- {
- "id": 1,
- "text": "NOTICIAS EN VIVO",
- "confidence": 0.9,
- "language": "es-ES",
- "instances": [
- {
- "start": "00:00:26",
- "end": "00:00:28"
- },
- {
- "start": "00:00:32",
- "end": "00:00:38"
- }
- ]
- }
- ],
-```
-
-### faces
-
-|Name|Description|
-|---|---|
-|id|The face ID.|
-|name|The face name. It can be 'Unknown #0', an identified celebrity, or a customer-trained person.|
-|confidence|The face identification confidence.|
-|description|A description of the celebrity. |
-|thumbnailId|The ID of the thumbnail of that face.|
-|knownPersonId|The internal ID (if it's a known person).|
-|referenceId|The Bing ID (if it's a Bing celebrity).|
-|referenceType|Currently just Bing.|
-|title|The title (if it's a celebrity; for example, "Microsoft's CEO").|
-|imageUrl|The image URL, if it's a celebrity.|
-|instances|Instances where the face appeared in the given time range. Each instance also has a thumbnailsId. |
-
-```json
-"faces": [{
- "id": 2002,
- "name": "Xam 007",
- "confidence": 0.93844,
- "description": null,
- "thumbnailId": "00000000-aee4-4be2-a4d5-d01817c07955",
- "knownPersonId": "8340004b-5cf5-4611-9cc4-3b13cca10634",
- "referenceId": null,
- "title": null,
- "imageUrl": null,
- "instances": [{
- "thumbnailsIds": ["00000000-9f68-4bb2-ab27-3b4d9f2d998e",
- "cef03f24-b0c7-4145-94d4-a84f81bb588c"],
- "adjustedStart": "00:00:07.2400000",
- "adjustedEnd": "00:00:45.6780000",
- "start": "00:00:07.2400000",
- "end": "00:00:45.6780000"
- },
- {
- "thumbnailsIds": ["00000000-51e5-4260-91a5-890fa05c68b0"],
- "adjustedStart": "00:10:23.9570000",
- "adjustedEnd": "00:10:39.2390000",
- "start": "00:10:23.9570000",
- "end": "00:10:39.2390000"
- }]
-}]
-```
-
-### shots
-
-|Name|Description|
-|---|---|
-|id|The shot ID.|
-|keyFrames|A list of key frames within the shot (each has an ID and a list of instances time ranges). Key frame instances have a thumbnailId field with the keyFrame's thumbnail ID.|
-|instances|A list of time ranges of this shot (shots have only one instance).|
-
-```json
-"Shots": [
- {
- "id": 0,
- "keyFrames": [
- {
- "id": 0,
- "instances": [
- {
- "thumbnailId": "00000000-0000-0000-0000-000000000000",
-          "start": "00:00:00.1670000",
-          "end": "00:00:00.2000000"
- }
- ]
- }
- ],
- "instances": [
- {
- "thumbnailId": "00000000-0000-0000-0000-000000000000",
-        "start": "00:00:00.2000000",
-        "end": "00:00:05.0330000"
- }
- ]
- },
- {
- "id": 1,
- "keyFrames": [
- {
- "id": 1,
- "instances": [
- {
- "thumbnailId": "00000000-0000-0000-0000-000000000000",
-          "start": "00:00:05.2670000",
-          "end": "00:00:05.3000000"
- }
- ]
- }
- ],
- "instances": [
- {
- "thumbnailId": "00000000-0000-0000-0000-000000000000",
-        "start": "00:00:05.2670000",
-        "end": "00:00:10.3000000"
- }
- ]
- }
- ]
-```
-
-### statistics
-
-|Name|Description|
-|---|---|
-|CorrespondenceCount|Number of correspondences in the video.|
-|WordCount|The number of words per speaker.|
-|SpeakerNumberOfFragments|The number of fragments the speaker has in a video.|
-|SpeakerLongestMonolog|The speaker's longest monolog. Silences inside the monolog are included. Silence at the beginning and the end of the monolog is removed.|
-|SpeakerTalkToListenRatio|The calculation is based on the time spent on the speaker's monolog (without the silence in between) divided by the total time of the video. The time is rounded to the third decimal point.|
--
-### sentiments
-
-Sentiments are aggregated by their sentimentType field (Positive/Neutral/Negative). For example, 0-0.1, 0.1-0.2.
-
-|Name|Description|
-|---|---|
-|id|The sentiment ID.|
-|averageScore |The average of all scores of all instances of that sentiment type - Positive/Neutral/Negative|
-|instances|A list of time ranges where this sentiment appeared.|
-|sentimentType |The type can be 'Positive', 'Neutral', or 'Negative'.|
-
-```json
-"sentiments": [
-{
- "id": 0,
- "averageScore": 0.87,
- "sentimentType": "Positive",
- "instances": [
- {
- "start": "00:00:23",
- "end": "00:00:41"
- }
- ]
-}, {
- "id": 1,
- "averageScore": 0.11,
- "sentimentType": "Positive",
- "instances": [
- {
- "start": "00:00:13",
- "end": "00:00:21"
- }
- ]
-}
-]
-```
-
-### labels
-
-|Name|Description|
-|---|---|
-|id|The label ID.|
-|name|The label name (for example, 'Computer', 'TV').|
-|language|The label name language (when translated), in BCP-47 format.|
-|instances|A list of time ranges where this label appeared (a label can appear multiple times). Each instance has a confidence field. |
-
-```json
-"labels": [
- {
- "id": 0,
- "name": "person",
- "language": "en-US",
- "instances": [
- {
- "confidence": 1.0,
-        "start": "00:00:00.0000000",
-        "end": "00:00:25.6000000"
- },
- {
- "confidence": 1.0,
-        "start": "00:01:33.8670000",
-        "end": "00:01:39.2000000"
- }
- ]
- },
- {
- "name": "indoor",
- "language": "en-US",
- "id": 1,
- "instances": [
- {
- "confidence": 1.0,
-        "start": "00:00:06.4000000",
-        "end": "00:00:07.4670000"
- },
- {
- "confidence": 1.0,
-        "start": "00:00:09.6000000",
-        "end": "00:00:10.6670000"
- },
- {
- "confidence": 1.0,
-        "start": "00:00:11.7330000",
-        "end": "00:00:20.2670000"
- },
- {
- "confidence": 1.0,
-        "start": "00:00:21.3330000",
-        "end": "00:00:25.6000000"
- }
- ]
- }
- ]
-```
-
-### keywords
-
-|Name|Description|
-|---|---|
-|id|The keyword ID.|
-|text|The keyword text.|
-|confidence|The keyword's recognition confidence.|
-|language|The keyword language (when translated).|
-|instances|A list of time ranges where this keyword appeared (a keyword can appear multiple times).|
-
-```json
-"keywords": [
-{
- "id": 0,
- "text": "office",
- "confidence": 1.6666666666666667,
- "language": "en-US",
- "instances": [
- {
- "start": "00:00:00.5100000",
- "end": "00:00:02.7200000"
- },
- {
- "start": "00:00:03.9600000",
- "end": "00:00:12.2700000"
- }
- ]
-},
-{
- "id": 1,
- "text": "icons",
- "confidence": 1.4,
- "language": "en-US",
- "instances": [
- {
- "start": "00:00:03.9600000",
- "end": "00:00:12.2700000"
- },
- {
- "start": "00:00:13.9900000",
- "end": "00:00:15.6100000"
- }
- ]
-}
-]
-```
-
-### visualContentModeration
-
-The visualContentModeration block contains time ranges that Video Analyzer for Media found to potentially have adult content. If visualContentModeration is empty, no adult content was identified.
-
-Videos that are found to contain adult or racy content might be available for private view only. Users can submit a request for a human review of the content, in which case the `IsAdult` attribute will contain the result of the human review.
-
-|Name|Description|
-|---|---|
-|id|The visual content moderation ID.|
-|adultScore|The adult score (from content moderator).|
-|racyScore|The racy score (from content moderation).|
-|instances|A list of time ranges where this visual content moderation appeared.|
-
-```json
-"VisualContentModeration": [
-{
- "id": 0,
- "adultScore": 0.00069,
- "racyScore": 0.91129,
- "instances": [
- {
- "start": "00:00:25.4840000",
- "end": "00:00:25.5260000"
- }
- ]
-},
-{
- "id": 1,
- "adultScore": 0.99231,
- "racyScore": 0.99912,
- "instances": [
- {
- "start": "00:00:35.5360000",
- "end": "00:00:35.5780000"
- }
- ]
-}
-]
-```
-## Next steps
-
-[Tutorial: Analyze videos with Azure Media Services](analyze-videos-tutorial.md)
media-services Analyze Videos Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/analyze-videos-tutorial.md
- Title: Analyze videos with Media Services v3
-description: Learn how to analyze videos using Azure Media Services.
- Previously updated: 07/26/2021
-# Tutorial: Analyze videos with Media Services v3
--
-This tutorial shows you how to analyze videos with Azure Media Services. There are many scenarios in which you might want to gain deep insights into recorded videos or audio content. For example, to achieve higher customer satisfaction, organizations can run speech-to-text processing to convert customer support recordings into a searchable catalog, with indexes and dashboards.
-
-This tutorial shows you how to:
-
-> [!div class="checklist"]
-> * Download the sample app described in the topic.
-> * Examine the code that analyzes the specified video.
-> * Run the app.
-> * Examine the output.
-> * Clean up resources.
--
-## Compliance, Privacy, and Security
-
-As an important reminder, you must comply with all applicable laws in your use of Azure Video Analyzer for Media (formerly Video Indexer). You must not use Video Analyzer for Media or any other Azure service in a manner that violates the rights of others. Before uploading any videos, including any biometric data, to the Video Analyzer for Media service for processing and storage, you must have all the proper rights, including all appropriate consents, from the individuals in the video. To learn about compliance, privacy, and security in Video Analyzer for Media, see the Azure [Cognitive Services Terms](https://azure.microsoft.com/support/legal/cognitive-services-compliance-and-privacy/). For Microsoft's privacy obligations and handling of your data, review Microsoft's [Privacy Statement](https://privacy.microsoft.com/PrivacyStatement), the [Online Services Terms](https://www.microsoft.com/licensing/product-licensing/products) (OST), and the [Data Processing Addendum](https://www.microsoftvolumelicensing.com/DocumentSearch.aspx?Mode=3&DocumentTypeId=67) ("DPA"). More privacy information, including on data retention and deletion/destruction, is available in the OST and [here](../../azure-video-analyzer/video-analyzer-for-media-docs/faq.yml). By using Video Analyzer for Media, you agree to be bound by the Cognitive Services Terms, the OST, the DPA, and the Privacy Statement.
-
-## Prerequisites
-
-- Install [Visual Studio Code for Windows/macOS/Linux](https://code.visualstudio.com/) or [Visual Studio 2019 for Windows or Mac](https://visualstudio.microsoft.com/).
-- Install the [.NET 5.0 SDK](https://dotnet.microsoft.com/download).
-- [Create a Media Services account](./account-create-how-to.md). Be sure to copy the **API Access** details in JSON format or store the values needed to connect to the Media Services account in the *.env* file format used in this sample.
-- Follow the steps in [Access the Azure Media Services API with the Azure CLI](./access-api-howto.md) and save the credentials. You'll need to use them to access the API in this sample, or enter them into the *.env* file format.
-## Download and configure the sample
-
-Clone a GitHub repository that contains the .NET sample to your machine using the following command:
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials.git
- ```
-
-The sample is located in the [AnalyzeVideos](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/tree/main/AMSV3Tutorials/AnalyzeVideos) folder.
--
-## Examine the code that analyzes the specified video
-
-This section examines functions defined in the [Program.cs](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/AnalyzeVideos/Program.cs) file of the *AnalyzeVideos* project.
-
-The sample completes the following actions:
-
-1. Creates a **Transform** and a **Job** that analyzes your video.
-2. Creates an input **Asset** and uploads the video into it. The asset is used as the job's input.
-3. Creates an output asset that stores the job's output.
-4. Submits the job.
-5. Checks the job's status.
-6. Downloads the files that resulted from running the job.
-
-### Start using Media Services APIs with the .NET SDK
-
-To start using Media Services APIs with .NET, you need to create an `AzureMediaServicesClient` object. To create the object, you need to supply credentials for the client to connect to Azure by using Azure Active Directory. Another option is to use interactive authentication, which is implemented in `GetCredentialsInteractiveAuthAsync`.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#CreateMediaServicesClientAsync)]
-
-In the code that you cloned at the beginning of the article, the `GetCredentialsAsync` function creates the `ServiceClientCredentials` object based on the credentials supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#GetCredentialsAsync)]
-
-In the case of interactive authentication, the `GetCredentialsInteractiveAuthAsync` function creates the `ServiceClientCredentials` object based on an interactive authentication and the connection parameters supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository. In that case, AADCLIENTID and AADSECRET are not needed in the configuration or environment variables file.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#GetCredentialsInteractiveAuthAsync)]
-
-### Create an input asset and upload a local file into it
-
-The **CreateInputAsset** function creates a new input [Asset](/rest/api/media/assets) and uploads the specified local video file into it.
-
-In Media Services v3, you use Azure Storage APIs to upload files. The following .NET snippet shows how.
-
-The following function completes these actions:
-
-* Creates an Asset.
-* Gets a writable [SAS URL](../../storage/common/storage-sas-overview.md) to the Asset's [container in storage](../../storage/blobs/storage-quickstart-blobs-dotnet.md#upload-a-blob-to-a-container).
-
-  If using the asset's [ListContainerSas](/rest/api/media/assets/listcontainersas) function to get SAS URLs, note that the function returns multiple SAS URLs, as there are two storage account keys for each storage account. A storage account has two keys to allow for seamless rotation of storage account keys (for example, change one while using the other, then start using the new key and rotate the other key). The first SAS URL represents storage key1 and the second one storage key2.
-* Uploads the file into the container in storage using the SAS URL.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/AnalyzeVideos/Program.cs#CreateInputAsset)]
-
-### Create an output asset to store the result of the job
-
-The output [Asset](/rest/api/media/assets) stores the result of your job. The project defines the **DownloadResults** function that downloads the results from this output asset into the "output" folder, so you can see what you got.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/AnalyzeVideos/Program.cs#CreateOutputAsset)]
-
-### Create a transform and a job that analyzes videos
-
-When encoding or processing content in Media Services, it's a common pattern to set up the encoding settings as a recipe. You would then submit a **Job** to apply that recipe to a video. By submitting new Jobs for each new video, you're applying that recipe to all the videos in your library. A recipe in Media Services is called a **Transform**. For more information, see [Transforms and jobs](./transform-jobs-concept.md). The sample described in this tutorial defines a recipe that analyzes the specified video.
-
-#### Transform
-
-When creating a new [Transform](/rest/api/media/transforms) instance, you need to specify what you want it to produce as an output. **TransformOutput** is a required parameter. Each **TransformOutput** contains a **Preset**. **Preset** describes step-by-step instructions of video and/or audio processing operations that are to be used to generate the desired **TransformOutput**. In this example, the **VideoAnalyzerPreset** preset is used and the language ("en-US") is passed to its constructor (`new VideoAnalyzerPreset("en-US")`). This preset enables you to extract multiple audio and video insights from a video. You can use the **AudioAnalyzerPreset** preset if you need to extract multiple audio insights from a video.
-
-When creating a **Transform**, check first if one already exists using the **Get** method, as shown in the code that follows. In Media Services v3, **Get** methods on entities return **null** if the entity doesn't exist (a case-insensitive check on the name).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/AnalyzeVideos/Program.cs#EnsureTransformExists)]
-
-#### Job
-
-As mentioned above, the [Transform](/rest/api/media/transforms) object is the recipe and a [Job](/rest/api/media/jobs) is the actual request to Media Services to apply that **Transform** to a given input video or audio content. The **Job** specifies information like the location of the input video and the location for the output. You can specify the location of your video using: HTTPS URLs, SAS URLs, or Assets that are in your Media Services account.
-
-In this example, the job input is a local video.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/AnalyzeVideos/Program.cs#SubmitJob)]
-
-### Wait for the job to complete
-
-The job takes some time to complete. When it does, you want to be notified. There are different options to get notified about the [Job](/rest/api/media/jobs) completion. The simplest option (that's shown here) is to use polling.
-
-Polling isn't a recommended best practice for production apps because of potential latency. Polling can be throttled if overused on an account. Developers should instead use Event Grid.
-
-Event Grid is designed for high availability, consistent performance, and dynamic scale. With Event Grid, your apps can listen for and react to events from virtually all Azure services, as well as custom sources. Simple, HTTP-based reactive event handling helps you build efficient solutions through intelligent filtering and routing of events. For more information, see [Route events to a custom web endpoint](monitoring/job-state-events-cli-how-to.md).
-
-The **Job** usually goes through the following states: **Scheduled**, **Queued**, **Processing**, **Finished** (the final state). If the job encounters an error, you get the **Error** state. If the job is in the process of being canceled, you get **Canceling** and then **Canceled** when it's done.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/AnalyzeVideos/Program.cs#WaitForJobToFinish)]
-
-### Job error codes
-
-See [Error codes](/rest/api/media/jobs/get#joberrorcode).
-
-### Download the result of the job
-
-The following function downloads the results from the output [Asset](/rest/api/media/assets) into the "output" folder so you can examine the results of the job.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/AnalyzeVideos/Program.cs#DownloadResults)]
-
-### Clean up resources in your Media Services account
--
-Generally, you should clean up everything except objects that you're planning to reuse (typically, you'll reuse Transforms and persist StreamingLocators). If you want your account to be clean after experimenting, delete the resources that you don't plan to reuse. For example, the following code deletes the job and output asset:
-
-### Delete resources with code
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/AnalyzeVideos/Program.cs#CleanUp)]
-
-You can also use the CLI.
--
-## Run the sample app
-
-Press Ctrl+F5 to run the *AnalyzeVideos* app.
-
-When you run the program, the job produces thumbnails for each face that it finds in the video. It also produces the insights.json file.
-
-## Examine the output
-
-The output file of analyzing videos is called insights.json. This file contains insights about your video. You can find a description of the elements found in the JSON file in the [Media intelligence](./analyze-video-audio-files-concept.md) article.
-
-## Multithreading
-
-> [!WARNING]
-> The Azure Media Services v3 SDKs aren't thread-safe. When working with a multi-threaded app, you should generate a new AzureMediaServicesClient object per thread.
-
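-For example, a minimal sketch (names assumed, not part of the sample) that builds a separate client per worker instead of sharing one instance:
-
-```csharp
-using Microsoft.Azure.Management.Media;
-using Microsoft.Rest;
-
-// Each thread constructs its own client from shared credentials rather than
-// sharing a single AzureMediaServicesClient instance across threads.
-static IAzureMediaServicesClient CreateClientForThread(
-    ServiceClientCredentials credentials, string subscriptionId) =>
-    new AzureMediaServicesClient(credentials) { SubscriptionId = subscriptionId };
-```
-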
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Tutorial: upload, encode, and stream files](stream-files-tutorial-with-api.md)
media-services Architecture Azure Ad Content Protection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/architecture-azure-ad-content-protection.md
- Title: End-to-end content protection using Azure AD
-description: This article teaches you how to protect your content with Azure Media Services and Azure Active Directory
- Previously updated: 08/31/2020
-# Tutorial: End-to-End content protection using Azure AD
--
-With this tutorial and the provided player sample, you can set up an end-to-end media content protection subsystem on Azure Media Services (AMS) and Azure Active Directory (AAD) to stream media content with all AMS-supported DRM/AES-128 options, streaming protocols, codecs, and container formats. The sample is generic enough for secure access to any REST API protected by OAuth 2 through Authorization Code Flow with Proof Key for Code Exchange (PKCE). (The Azure Media Services license delivery service is just one of them.) It also works for the Microsoft Graph API or any custom-developed REST API secured with OAuth 2 Authorization Code Flow. This is the companion document to the [sample code](https://github.com/Azure-Samples/media-services-content-protection-azure-ad).
-
-In this tutorial, you will:
-
-> [!div class="checklist"]
->
-> * Consider the authentication requirements
-> * Understand how the app works
-> * Register a backend resource app
-> * Register a client app
-> * Set up the media services account content key policy and streaming policies
-> * Set up the player app
-
-If you don't have an Azure Media Services subscription, create an Azure [free trial account](https://azure.microsoft.com/free/), then create a Media Services account.
-
-### Duration
-The tutorial should take about two hours to complete after you have the prerequisite technology ready to go.
-
-## Prerequisites
-
-This tutorial uses the latest versions of the following technologies and concepts. It's recommended that you become familiar with them before beginning this tutorial.
-
-### Prerequisite knowledge
-
-It is optional but recommended that you are familiar with the following concepts before beginning this tutorial:
-
-* Digital Rights Management (DRM)
-* [Azure Media Services (AMS) v3](./media-services-overview.md)
-* AMS [content key policies](drm-content-key-policy-concept.md) using the AMS API v3, Azure portal, or the [Azure Media Services Explorer (AMSE) tool](https://github.com/Azure/Azure-Media-Services-Explorer)
-* Azure AD v2 endpoints on the [Microsoft Identity Platform](../../active-directory/develop/index.yml)
-* Modern cloud authentication such as [OAuth 2.0 and OpenID Connect](../../active-directory/develop/active-directory-v2-protocols.md)
- * [Authorization code flow in OAuth 2.0](../../active-directory/develop/v2-oauth2-auth-code-flow.md) and why PKCE is needed
- * [Delegated permission vs application permission](../../active-directory/develop/developer-glossary.md#permissions)
-* [JWT token](../../active-directory/develop/access-tokens.md), its claims, and signing key rollover (included in sample.)
-
-### Prerequisite code and installations
-
-* The sample code. Download the [sample code](https://github.com/Azure-Samples/media-services-content-protection-azure-ad).
-* An installation of Visual Studio Code. Download Visual Studio Code here [https://code.visualstudio.com/download](https://code.visualstudio.com/download).
-* An installation of Node.js. Download Node.js here [https://nodejs.org](https://nodejs.org). NPM comes with the install.
-* An [Azure subscription](https://azure.microsoft.com/free/).
-* An Azure Media Services (AMS) account.
-* @azure/msal-browser v2.0, one of the members in [Microsoft Authentication Library (MSAL)](../../active-directory/develop/msal-overview.md) SDK family for different client platforms
-* The latest version of [Azure Media Player](https://github.com/Azure-Samples/azure-media-player-samples)(included in sample.)
-* FPS credentials from Apple if you want to include FairPlay DRM and the application certificate hosted with CORS that is accessible via client-side JavaScript.
-
-> [!IMPORTANT]
-> This tutorial uses .NET to create the content key policy restriction. If you are not a .NET developer, but want to try Node.js to connect to Azure Media Services, read [Connect to Media Services v3 API - Node.js](configure-connect-nodejs-howto.md). There is also a Node.js module available to handle key rollover automatically, see Node.js [passport-ad module](https://github.com/AzureAD/passport-azure-ad).
-
-## Consider the authentication and authorization requirements
-
-Designing the subsystem presents a few challenges: it has multiple moving parts, the client app operates under constraints, and Azure AD key rollover occurs every six weeks.
-
-The Single-Page App (SPA) used in this tutorial takes into account challenges to authentication requirements and the restrictions that follow. It uses:
-
-* Azure AD v2 endpoints as Azure AD developer platform (v1 endpoints) are changing to Microsoft Identity Platform (v2 endpoints).
-* Authorization Code Flow because OAuth 2 implicit grant flow has been deprecated.
-* An app that is subject to the following constraints:
- * A public client can't hide the client secret. Authorization Code Flow alone requires hiding the client secret, so Authorization Code Flow with PKCE is used.
- * A browser-based app that is subject to a browser security sandbox (CORS/preflight constraint).
- * A browser-based app using modern JavaScript that is subject to JavaScript security constraints (custom header accessibility, correlation-id).
-
-## Understand the subsystem design
-
-The design of the subsystem is shown in the following diagram. It has three layers:
-
-* Back-office layer (in black) for configuring the content key policy and publishing content for streaming
-* Public endpoints (in blue) that are player/customer-facing endpoints providing authentication, authorization, DRM license, and encrypted content
-* Player app (in light blue) which integrates all components and
- * handles user authentication via Azure AD.
- * handles access_token acquisition from Azure AD.
- * receives manifest and encrypted content from AMS/CDN.
- * acquires DRM license from Azure Media Services.
- * handles content decryption, decode, and display.
-
-![Diagram of the subsystem design showing the back-office layer, public endpoints, and player app](media/aad-ams-content-protection/subsystem.svg)
-
-Read [Design of a multi-DRM content protection system with access control](./architecture-design-multi-drm-system.md) for more details about the subsystem.
-
-## Understand the Single-page app
-
-The player app is a Single-page application (SPA), developed in Visual Studio Code using:
-
-* Node.js with ES 6 JavaScript
-* @azure/msal-browser 2.0 beta
-* Azure Media Player SDK
-* OAuth 2 flow against Azure AD v2 endpoints (Microsoft Identity Platform)
-
-The SPA player app completes the following actions:
-
-* User authentication for users native to the tenant, and guest users from other AAD tenants or MSA accounts. Users can choose to sign in through either a browser popup or redirect (for browsers not allowing popups such as Safari).
-* Acquisition of `access_token` through Authorization Code Flow with PKCE.
-* Renewal of `access_token` (tokens issued by AAD are valid for 1 hour), for which `refresh_token` is also acquired.
-* Parsing JWT tokens (both `access_token` and `id_token`) for test/inspection.
-* Acquisition of DRM licenses for all three DRMs or AES-128 content key.
-* Streaming of content under various combinations of DRM vs Streaming Protocol vs Container Format. The correct format string is generated for each combination.
-* Decryption, decode, and display.
-* Microsoft Graph API calls for troubleshooting purposes. <!--See more details in the subsection Shortest path: testing my protected asset in my subscription with your hosted player app and underlying tenant. -->
-
-The screen for sign-in, token acquisition, token renewal, and token display:
-
- ![Screen for sign in, token acquisition, token renewal, and token display](media/aad-ams-content-protection/token-acquisition.png)
-
-The screen for parsing JWT tokens (access_token or id_token):
-
-![Screenshot that shows parsing J W T tokens.](media/aad-ams-content-protection/parsing-jwt-tokens.png)
-
-The screen for testing protected content with different combinations of DRM/AES vs Streaming Protocols vs Container Format:
-
-![Screenshot that shows testing protected content with different combinations of D R M or A E S versus Streaming Protocols versus Container Format](media/aad-ams-content-protection/testing-protected-content.png)
-
-<!-- You can see a hosted version of the sample at [https://aka.ms/ott](https://aka.ms/ott)-->
-
-## Choose an Azure AD tenant
-
-> [!NOTE]
-> From here forward, it is assumed that you have logged in to the Azure portal and have at least one Azure AD tenant.
-
-Choose an Azure AD tenant to use for our end-to-end sample. You have two options:
-
-* An existing Azure AD tenant. Any Azure subscription must have one Azure AD tenant, but an Azure AD tenant can be used by multiple Azure subscriptions.
-* A new Azure AD tenant that is *not* used by any Azure subscription. If you choose the new tenant option, the media service account and the sample player app must be in an Azure subscription that uses a separate Azure AD tenant. This provides some flexibility. For example, you could use your own Azure AD tenant but also the customer's media service account in the customer's Azure subscription.
-
-## Register the backend resource app
-
-1. Navigate to the Azure AD tenant you chose or created.
-1. Select **Azure Active Directory** from the menu.
-1. Select **App registrations** from the menu.
-1. Click **+ New Registration**.
-1. Name the app *LicenseDeliveryResource2* (where 2 indicates AAD v2 endpoints).
-1. Select **Accounts in this organizational directory only ([*your tenant name*] only - Single tenant)**. If you want to enable access to multiple tenants, select one of the other multitenant options.
-1. The **Redirect URI** is optional and can be changed later.
-1. Click **Register**. The App registrations view will appear.
-1. Select **Manifest** from the menu. The Manifest view will appear.
-1. Change the value of the `accessTokenAcceptedVersion` to *2* (no quotes).
-1. Change the value of the `groupMembershipClaims` to *"SecurityGroup"* (with quotes).
-1. Click **Save**.
-1. Select **Expose an API** from the menu. The Add a scope view will appear. (Azure provides an Application ID URI, but if you want to change it, you can edit it in the Application ID URI field.)
-1. Click **Save and continue**. The view will change. For each of the settings in the Setting column in the table below, enter the value in the Value column, then click **Add scope**.
-
-| Setting | Value | Description |
-| - | -- | -- |
-| Scope name | *DRM.License.Delivery* | How the scope will appear when access to this API is being requested, and in access tokens when the scope has been granted to a client application. This must be unique across this application. A best practice is to use "resource.operation.constraint" as a pattern to generate the name. |
-| Who can consent? | *Admins and users* | Determines whether users can consent to this scope in directories where user consent is enabled. |
-| Admin consent display name | *DRM license delivery* | What the scope will be called in the consent screen when admins consent to this scope. |
-| Admin consent description | *DRM license delivery backend resource scope* | A detailed description of the scope that is displayed when tenant admins expand a scope on the consent screen. |
-| User consent display name | *DRM.License.Delivery* | What the scope will be called in the consent screen when users consent to this scope. |
-| User consent description | *DRM license delivery backend resource scope* | This is a detailed description of the scope that is displayed when users expand a scope on the consent screen. |
-| State | *Enabled* | Determines whether this scope is available for clients to request. Set it to "Disabled" for scopes that you do not want to be visible to clients. Only disabled scopes can be deleted, and we recommend waiting at least a week after a scope has been disabled before deleting it to ensure no clients are still using it. |
-
-## Register the client app
-
-1. Navigate to the Azure AD tenant you chose or created.
-1. Select **Azure Active Directory** from the menu.
-1. Select **App registrations** from the menu.
-1. Click **+ New Registration**.
-1. Give the client app a name, for example, *AMS AAD Content Protection*.
-1. Select **Accounts in this organizational directory only ([*your tenant name*] only - Single tenant)**. If you want to enable access to multiple tenants, select one of the other multitenant options.
-1. The **Redirect URI** is optional and can be changed later.
-1. Click **Register**.
-1. Select **API permissions** from the menu.
-1. Click **+ Add a permission**. The Request API permissions view will open.
-1. Click on the **My APIs** tab and select the *LicenseDeliveryResource2* app you created in the last section.
-1. Click on the DRM arrow and check the *DRM.License.Delivery* permission.
-1. Click **Add permissions**. The Add permissions view will close.
-1. Select **Manifest** from the menu. The Manifest view will appear.
-1. Find and add the following value pairs to the `replyUrlsWithType` attribute:
-
- ```json
- "replyUrlsWithType": [
- {
- "url": "https://npmwebapp.azurewebsites.net/",
- "type": "SPA"
- },
- {
- "url": "http://localhost:3000/",
- "type": "SPA"
- }
- ],
- ```
-
- > [!NOTE]
-    > At this point, you do not yet have the URL for your player app. If you are running the app from your localhost webserver, you can use just the localhost value pair. Once you deploy your player app, you can add the entry here with the deployed URL. If you forget to do so, you'll see an error message during Azure AD sign-in.
-
-1. Click **Save**.
-1. Finally to verify that your configuration is correct, select **Authentication**. The Authentication view will appear. Your client application will be listed as a Single Page App (SPA), the redirect URI will be listed, and the grant type will be Authorization Code Flow with PKCE.
-
-## Set up the Media Services account content key policy and streaming policies
-
-**This section assumes that you are a .NET developer and are familiar with using the AMS v3 API.**
-
-> [!NOTE]
-> As of this writing, you can't use the Azure portal for the media services account key policy setup because it doesn't support using an asymmetric token signing key with OpenID-Config. The setup must support Azure AD key rollover because the Azure AD issued token is signed by an asymmetric key and the key rolls over every six weeks. Therefore, this tutorial uses .NET and the AMS v3 API.
-
-Configuration of the [content key policy](drm-content-key-policy-concept.md) and [streaming policies](stream-streaming-policy-concept.md) for DRM and AES-128 apply. Change the `ContentKeyPolicyRestriction` in the content key policy.
-
-Below is the .NET code for creating the content key policy restriction.
-
-```csharp
-ContentKeyPolicyRestriction objContentKeyPolicyRestriction;
-
-//use Azure Active Directory OpenId discovery document, supporting key rollover
-objContentKeyPolicyRestriction = new ContentKeyPolicyTokenRestriction()
- {
- OpenIdConnectDiscoveryDocument = ConfigAccessor.AppSettings["ida_AADOpenIdDiscoveryDocument"]
- };
-
-string audience = ConfigAccessor.AppSettings["ida_audience"];
-string issuer = ConfigAccessor.AppSettings["ida_issuer"];
-
-ContentKeyPolicyTokenRestriction objContentKeyPolicyTokenRestriction = (ContentKeyPolicyTokenRestriction)objContentKeyPolicyRestriction;
-objContentKeyPolicyTokenRestriction.Audience = audience;
-objContentKeyPolicyTokenRestriction.Issuer = issuer;
-objContentKeyPolicyTokenRestriction.RestrictionTokenType = ContentKeyPolicyRestrictionTokenType.Jwt;
-
-objContentKeyPolicyRestriction = objContentKeyPolicyTokenRestriction;
-
-return objContentKeyPolicyRestriction;
-```
-
-Change the `ida_AADOpenIdDiscoveryDocument`, `ida_audience`, and `ida_issuer` values in the above code. To find the values for these items in the Azure portal:
-
-1. Select the AAD tenant you used earlier, click on **App registrations** in the menu, then click on the **Endpoints** link.
-1. Select and copy the value of the **OpenIdConnect metadata document** field and paste it into the code as the `ida_AADOpenIdDiscoveryDocument` value.
-1. The `ida_audience` value is the Application (client) ID of the registered app *LicenseDeliveryResource2*.
-1. The `ida_issuer` value is the URL `https://login.microsoftonline.com/[tenant_id]/v2.0`. Replace [*tenant_id*] with your tenant ID.
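-
-For illustration only, the three values might look like the following; every ID here is a placeholder, not a real value:
-
-```csharp
-// Hypothetical example values; substitute the ones from your own tenant.
-string ida_AADOpenIdDiscoveryDocument =
-    "https://login.microsoftonline.com/<tenant_id>/v2.0/.well-known/openid-configuration";
-string ida_audience = "<application (client) ID of LicenseDeliveryResource2>";
-string ida_issuer = "https://login.microsoftonline.com/<tenant_id>/v2.0";
-```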
-
-## Set up the sample player app
-
-If you haven't done so already, clone or download the app from the source repo: [https://github.com/Azure-Samples/media-services-content-protection-azure-ad](https://github.com/Azure-Samples/media-services-content-protection-azure-ad).
-
-You have two options to set up the player app:
-
-* Minimal customization (only replacing some constant string values such as client_id, tenant_id, and streaming URL), but you must use Visual Studio Code and Node.js.
-* If you prefer to use another IDE and web platform such as ASP.NET Core, you can put the web page files, JavaScript files, and CSS file into your project since the player app itself does not use any server-side code.
-
-### Option 1
-
-1. Start Visual Studio Code.
-1. To open the project, click File -> Open Folder -> browse to and select the parent folder of the *package.json* file.
-1. Open the JavaScript file *public/javascript/constants.js*.
-1. Replace `OAUTH2_CONST.CLIENT_ID` with the `client_id` of your registered client app in the AAD tenant. You can find the `client_id` in the Overview section of the registered app in the Azure portal. Note: it's the client ID, not the object ID.
-1. Replace `OAUTH2_CONST.TENANT_ID` with the `tenant_id` of your Azure AD tenant. You can find your `tenant_id` by clicking on the Azure Active Directory menu. The tenant_id appears in the Overview section.
-1. Replace `OAUTH2_CONST.SCOPE` with the scope you added to your registered client app. You can find the scope by navigating to the registered client app from the **App registrations** menu then selecting your client app:
-    1. Select your client app, click on the **API permissions** menu, then select the scope *DRM.License.Delivery* under the API permission *LicenseDeliveryResource2*. The permission should be in a format like *api://df4ed433-dbf0-4da6-b328-e1fe05786db5/DRM.License.Delivery*. **Important**: Keep the space in front of `offline_access` in `OAUTH2_CONST.SCOPE`.
-1. Replace the two constant strings for `AMS_CONST` as shown below. One is the protected streaming URL of your test asset, and the other is FPS Application Certificate URL if you would like to include the FairPlay test case. Otherwise you can leave it as is for `AMS_CONST.APP_CERT_URL`. Then, click **Save**.
-
-```javascript
-//defaults in ams.js
-class AMS_CONST {
- static get DEFAULT_URL() {
- return "https://eventgridmediaservice-usw22.streaming.media.azure.net/9591e337-ae90-420e-be30-1da36c06665b/MicrosoftElite01.ism/manifest(format=mpd-time-csf,encryption=cenc)";
- }
- //FairPlay application cert URL
- static get APP_CERT_URL() {
- return `${window.location.href}cert/FPSAC.cer`;
- }
-}
-```
-
-To test locally:
-
-1. In Visual Studio Code (VSC), select **View** from the main menu then **Terminal**.
-1. If you haven't already installed the dependencies, enter `npm install` at the command prompt.
-1. Enter `npm start` at the command prompt. (If npm doesn't start, try changing the directory to `npmweb` by entering `cd npmweb` at the command prompt.)
-1. Use a browser to browse to `http://localhost:3000`.
-
-Depending on the browser you use, pick the correct combination of DRM/AES vs Streaming Protocol vs Container Format to test after sign in (`access_token` acquisition). If you are testing in Safari on macOS, check the Redirect API option since Safari does not allow popups. Most other browsers allow both popups and redirect options.
-
-### Option 2
-
-If you plan to use another IDE/web platform and/or a webserver such as IIS running on your development machine, copy the following files into a new directory in your local webserver. The paths below are where you will find them in the code you downloaded.
-
-* *views/index.ejs* (change suffix to .html)
-* *views/jwt.ejs* (change suffix to .html)
-* *views/info.ejs* (change suffix to .html)
-* *public/** (JavaScript files, CSS, images as shown below)
-
-1. Copy the files found in the *view* folder to the root of the new directory.
-1. Copy the *folders* found in the *public* folder to the root of the new directory.
-1. Change the extensions of the `.ejs` files to `.html`. (No server-side variable is used so you can safely change it.)
-1. Open *index.html* in VSC (or another code editor) and change the `<script>` and `<link>` paths so that they reflect where the files are located. If you followed the previous steps, you only have to delete the leading `/` in the path. For example, `<script type="text/javascript" src="/javascript/constants.js"></script>` becomes `<script type="text/javascript" src="javascript/constants.js"></script>`.
-1. Customize the constants in the *javascript/constants.js* file as in Option 1.
-
-## Common customer scenarios
-
-Now that you've completed the tutorial and have a working subsystem, you can try modifying it for the following customer scenarios:
-
-### Azure role-based access control (Azure RBAC) for license delivery via Azure AD group membership
-
-So far, the system allows any user who can sign in to get a valid license and play the protected content.
-
-A common customer requirement is that only a subset of authenticated users can watch certain content. For example, a customer may offer basic and premium subscriptions for its video content; customers who paid for a basic subscription shouldn't be able to watch content that requires a premium subscription. The following additional steps are required to meet this requirement:
-
-#### Set up the Azure AD tenant
-
-1. Set up two accounts in your tenant. They could be named *premium_user* and *basic_user*.
-1. Create a user group and call it *PremiumGroup*.
-1. Add the *premium_user* to the *PremiumGroup* as a member, but don't add the *basic_user* to the group.
-1. Take note of the **Object ID** of the *PremiumGroup*. (One way to script these steps with the Azure CLI is sketched after this list.)
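-
-Here's a possible way to script this setup with the Azure CLI. The domain and passwords are placeholders, and output property names vary by CLI version.
-
-```azurecli
-# Create the two test users (replace the domain and passwords with your own).
-az ad user create --display-name premium_user --user-principal-name premium_user@mycustomer.com --password "<password-1>"
-az ad user create --display-name basic_user --user-principal-name basic_user@mycustomer.com --password "<password-2>"
-
-# Create the group and add only premium_user as a member.
-az ad group create --display-name PremiumGroup --mail-nickname PremiumGroup
-az ad group member add --group PremiumGroup --member-id $(az ad user show --id premium_user@mycustomer.com --query objectId -o tsv)
-
-# Note the object ID of the group; newer CLI versions return the property as 'id' instead of 'objectId'.
-az ad group show --group PremiumGroup --query objectId -o tsv
-```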
-
-#### Set up the Media Services account
-
-Modify the `ContentKeyPolicyRestriction` (as shown earlier in the Setup in Media Services Account section) by adding a required claim named *groups*, where `ida_EntitledGroupObjectId` has the object ID of *PremiumGroup* as its value:
-
-```csharp
-var tokenClaims = new ContentKeyPolicyTokenClaim[]
-{
-    // Require a "groups" claim whose value is the PremiumGroup object ID.
-    new ContentKeyPolicyTokenClaim("groups", ConfigAccessor.AppSettings["ida_EntitledGroupObjectId"])
-    // Add other claims here, if any.
-};
-
-if (tokenClaims != null && tokenClaims.Length > 0)
-{
-    objContentKeyPolicyTokenRestriction.RequiredClaims = new List<ContentKeyPolicyTokenClaim>(tokenClaims);
-}
-```
-
-The *groups* claim is a member of a [Restricted Claim Set](../../active-directory/develop/reference-claims-mapping-policy-type.md#claim-sets) in Azure AD.
-
-#### Test
-
-1. Sign in with the *premium_user* account. You should be able to play the protected content.
-1. Sign in with the *basic_user* account. You should get an error indicating that the video is encrypted but there's no key to decrypt it. If you view **Events, errors, and downloads** with the dropdown at the bottom of the player diagnostics overlay, the error message should indicate that license acquisition failed because the *groups* claim value is missing from the JWT issued by the Azure AD token endpoint.
-
-### Supporting multiple media service accounts (across multiple subscriptions)
-
-A customer may have multiple Media Services accounts across a single subscription or multiple Azure subscriptions. For example, a customer may use one Media Services account as the production primary, another as the secondary/backup, and another for dev/test.
-
-All you need to do is ensure that you use the same set of parameters you used earlier (in the Setup in Media Services Account section) when creating the `ContentKeyPolicyRestriction` in all of the Media Services accounts.
-
-### Supporting a customer, its vendors, and/or subsidiaries across multiple AAD tenants
-
-As users of the solution, a customer's subsidiaries, vendors, and partners may reside in different AAD tenants, such as `mycustomer.com`, `mysubsidiary.com`, and `mypartner.com`. While this solution is built on a single specific AAD tenant, such as `mycustomer.com`, you can make it work for users from other tenants.
-
-Using `mycustomer.com` for this solution, add a user from `mypartner.com` as a guest user to the `mycustomer.com` tenant. Make sure the `mypartner.com` user activates the guest account. The guest account can be either from another AAD tenant or an `outlook.com` account.
-
-Notice that the guest users from `mypartner.com`, after being activated in `mycustomer.com`, are still authenticated through their own/original AAD tenant, `mypartner.com`, but the `access_token` is issued by `mycustomer.com`.
-
-### Supporting a customer tenant/subscription with a setup in your subscription/tenant
-
-You can use your setup to test protected content in your customer's Media Services account/subscription. Your setup would have an Azure AD tenant and a Media Services account in the same subscription; the customer's Media Services account would be in their own Azure subscription with their own Azure AD tenant.
-
-1. Add your customer's account into your tenant as a guest account.
-1. Work with your customer to prepare protected content in your customer's media service account by providing the three parameters as listed in the Setup in Media Service Account section.
-
-Your customer can then browse to your setup, sign in with the guest account, and test their own protected content. You can also sign in with your own account and test your customer's content.
-
-Your sample solution may be set up in a Microsoft tenant with a Microsoft subscription, or in a custom tenant with a Microsoft subscription. The Media Services account can be from another subscription with its own tenant.
-
-## Clean up resources
-
-> [!WARNING]
-> If you're not going to continue to use this application, delete the resources you created while following this tutorial. Otherwise, you will be charged for them.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Quickstart: Encrypt content](drm-encrypt-content-how-to.md)
media-services Architecture Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/architecture-concept.md
- Title: Media Services architectures
-description: This article describes architectures for Media Services.
- Previously updated : 11/20/2020
-# Media Services architectures
-
-## Live streaming digital media
-
-A live streaming solution lets you capture video in real time and broadcast it to consumers, such as streaming interviews, conferences, and sporting events online. In this solution, video is captured by a video camera and sent to a channel input endpoint. The channel receives the live input stream and makes it available for streaming through a streaming endpoint to a web browser or mobile app. The channel also provides a preview monitoring endpoint so you can preview and validate your stream before further processing and delivery. The channel can also record and store the ingested content so it can be streamed later (video on demand).
-
-This solution is built on Azure managed services.
-
-See [Live streaming digital media](/azure/architecture/solution-ideas/articles/digital-media-live-stream) in the Azure Architecture center.
-
-## Video-on-demand digital media
-
-A basic video-on-demand solution gives you the capability to stream recorded video content, such as movies, news clips, sports segments, training videos, and customer support tutorials, to any video-capable endpoint device, mobile application, or desktop browser. Video files are uploaded to Azure Blob storage, encoded to a multi-bitrate standard format, and then distributed via all major adaptive bitrate streaming protocols (HLS, MPEG-DASH, Smooth) to the Azure Media Player client.
-
-This solution is built on Azure managed services.
-
-See [Video-on-demand digital media](/azure/architecture/solution-ideas/articles/digital-media-video) in the Azure Architecture center.
-
-## Gridwich media processing system
-
-The Gridwich system demonstrates best practices for processing and delivering media assets on Azure. Although the Gridwich system is media-specific, the message processing and eventing framework can apply to any stateless event processing workflow.
-
-See [Gridwich media processing system](/azure/architecture/reference-architectures/media-services/gridwich-architecture) in the Azure Architecture center.
-
-## Next steps
-
-> [Azure Media Services overview](media-services-overview.md)
media-services Architecture High Availability Encoding Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/architecture-high-availability-encoding-concept.md
- Title: High Availability with Media Services Video on Demand
-description: This article is an overview of the Azure services you can use to facilitate high availability for the VOD application.
- Previously updated : 08/31/2020
-# High Availability with Media Services and Video on Demand (VOD)
-
-## High availability for VOD
-
-The Azure Architecture documentation describes a high availability design pattern called [Geodes](/azure/architecture/patterns/geodes), in which duplicate resources are deployed to different geographic regions to provide scalability and resiliency. You can use Azure services to build such an architecture and cover high availability design considerations such as redundancy, health monitoring, load balancing, and data backup and recovery. One such architecture is described below, with details on each service used in the solution and how the individual services can be combined into a high availability architecture for your VOD application.
-
-### Sample
-
-A sample is available to help you become familiar with high availability for Media Services and Video on Demand (VOD). It also goes into more detail about how the services are used in a VOD scenario. The sample isn't intended to be used in production in its current form. Carefully review the sample code and the readme, particularly the section on [Failure Modes](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/master/HighAvailabilityEncodingStreaming), before integrating it into a production application. A production implementation of high availability for VOD should also carefully review its Content Delivery Network (CDN) strategy. Check out the [code on GitHub](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/master/HighAvailabilityEncodingStreaming).
-
-## Overview of services
-
-The services used in this example architecture include:
-
-* Media Services account
-* Azure Storage account
-* Azure Storage Queue
-* Azure Cosmos DB
-* Managed identity
-* Azure Key Vault
-* Azure Functions
-* Azure App Service
-* Azure Front Door
-* Azure Event Grid
-* Application Insights
-
-**VOD use of Azure Functions:** Azure Functions can host the modules of your VOD application. Modules for a VOD application could include:
-
-* **Job Scheduling Module**: submits new jobs to a Media Services cluster (two or more instances in different regions). It tracks the health status of each Media Services instance and submits a new job to the next healthy instance.
-* **Job Status Module**: listens for job output status events from Azure Event Grid. It stores events in an event store to minimize the number of calls the other modules make to Media Services APIs.
-* **Instance Health Module**: tracks submitted jobs and determines the health status of each Media Services instance. It tracks finished jobs, failed jobs, and jobs that never finished.
-* **Provisioning Module**: provisions processed assets. It copies asset data to all Media Services instances and sets up Azure Front Door to ensure that assets can be streamed even if some Media Services instances aren't available. It also sets up streaming locators.
-* **Job Verification Module**: tracks each submitted job, resubmits failed jobs, and cleans up job data once a job successfully finishes.
-
-## Architecture
-
-This high-level diagram shows the architecture of the sample provided to get you started with high availability and media services.
-
-[ ![Video on Demand (VOD) High Level Architecture Diagram](media/architecture-high-availability-encoding-concept/high-availability-architecture.svg) ](./media/architecture-high-availability-encoding-concept/high-availability-architecture.svg#lightbox)
-
-## Best practices
-
-### Regions
-
-* [Create](./account-create-how-to.md) two (or more) Azure Media Services accounts. The two accounts need to be in different regions. For more information, see [Regions in which the Azure Media Services service is deployed](https://azure.microsoft.com/global-infrastructure/services/?products=media-services).
-* Upload your media to the same region from which you are planning to submit the job. For more information about how to start encoding, see [Create a job input from an HTTPS URL](./job-input-from-http-how-to.md) or [Create a job input from a local file](./job-input-from-local-file-how-to.md).
-* If you then need to resubmit the [job](./transform-jobs-concept.md) to another region, you can use `JobInputHttp` or copy the blobs from the source Asset container to an Asset container in the alternate region (a sketch of such a copy follows this list).
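-
-Here's a minimal sketch of such a cross-region blob copy using the Azure.Storage.Blobs .NET SDK. Both SAS URIs are placeholders, and a real implementation would loop over every blob in the source asset container.
-
-```csharp
-// Minimal sketch: server-side copy of one blob between asset containers in
-// different regions. Both SAS URIs below are placeholders.
-using System;
-using Azure.Storage.Blobs;
-
-var sourceBlob = new BlobClient(new Uri("<source blob SAS URI>"));
-var destinationBlob = new BlobClient(new Uri("<destination blob SAS URI>"));
-
-// Starts an asynchronous, server-side copy and waits for it to finish.
-var copyOperation = await destinationBlob.StartCopyFromUriAsync(sourceBlob.Uri);
-await copyOperation.WaitForCompletionAsync();
-```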
-
-### Monitoring
-
-* Subscribe for `JobStateChange` messages in each account via Azure Event Grid.
- * [Register for events](./reacting-to-media-services-events.md) via the Azure portal or the CLI (you can also do it with the Event Grid Management SDK)
- * Use the [Microsoft.Azure.EventGrid SDK](https://www.nuget.org/packages/Microsoft.Azure.EventGrid/) (which supports Media Services events natively).
- * You can also consume Event Grid events via Azure Functions.
-
- For more information:
-
- * See the [Audio Analytics sample](./transform-jobs-concept.md) which shows how to monitor a job with Azure Event Grid including adding a fallback in case the Azure Event Grid messages are delayed for some reason.
- * Take a look at the [Azure Event Grid schemas for Media Services events](./media-services-event-schemas.md).
-
-* When you create a [job](./transform-jobs-concept.md):
- * Randomly select an account from the list of currently used accounts (this list will normally contain both accounts but if issues are detected it may contain only one account). If the list is empty, raise an alert so an operator can investigate.
- * Create a record to keep track of each inflight job and the region/account used.
-* When your `JobStateChange` handler gets a notification that a job has reached the scheduled state, record the time it enters the scheduled state and the region/account used.
-* When your `JobStateChange` handler gets a notification that a job has reached the processing state, mark the record for the job as processing and record the time it enters the processing state.
-* When your `JobStateChange` handler gets a notification that a job has reached a final state (Finished/Errored/Canceled), mark the record for the job appropriately.
-* Have a separate process that periodically reviews your job records (a minimal sketch of such records follows this list):
- * If you have jobs in the scheduled state that haven't advanced to the processing state in a reasonable amount of time for a given region, remove that region from your list of currently used accounts. Depending on your business requirements, you could cancel those jobs right away and resubmit them to the other region, or you could give them more time to move to the next state.
- * If a region was removed from the account list, monitor it for recovery before adding it back to the list. You can monitor regional health through the existing jobs in the region (if they weren't canceled and resubmitted), by adding the account back to the list after a period of time, and by having operators monitor Azure communications about outages that may affect Azure Media Services.
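-
-The following is a minimal .NET sketch of what such job-tracking records and state handling might look like. All type and member names here are hypothetical; your schema and storage will differ.
-
-```csharp
-// Hypothetical job-tracking types; persist records durably in production
-// (for example, in Azure Cosmos DB).
-using System;
-
-public enum JobTrackingState { Submitted, Scheduled, Processing, Finished, Errored, Canceled }
-
-public class JobRecord
-{
-    public string JobName { get; set; }
-    public string Region { get; set; }           // region/account the job was submitted to
-    public JobTrackingState State { get; set; }
-    public DateTime? ScheduledAt { get; set; }   // when the job entered the Scheduled state
-    public DateTime? ProcessingAt { get; set; }  // when the job entered the Processing state
-}
-
-public static class JobTracking
-{
-    // Call this from your Event Grid JobStateChange handler.
-    public static void OnStateChange(JobRecord record, JobTrackingState newState)
-    {
-        record.State = newState;
-        if (newState == JobTrackingState.Scheduled) record.ScheduledAt = DateTime.UtcNow;
-        if (newState == JobTrackingState.Processing) record.ProcessingAt = DateTime.UtcNow;
-    }
-
-    // Used by the periodic review process: a job stuck in the Scheduled state
-    // longer than the threshold suggests a regional issue.
-    public static bool IsStuckInScheduled(JobRecord record, TimeSpan threshold) =>
-        record.State == JobTrackingState.Scheduled &&
-        record.ScheduledAt.HasValue &&
-        DateTime.UtcNow - record.ScheduledAt.Value > threshold;
-}
-```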
-
-## Next steps
-
-* Check out [code samples](/samples/browse/?products=azure-media-services)
media-services Asset Create Asset Filter How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-create-asset-filter-how-to.md
- Title: Create a Media Services asset filter
-description: This article shows you how to create a Media Services asset filter.
- Previously updated : 03/08/2022
-# Create an asset filter
-
-This article shows you how to create a Media Services asset filter.
-
-
-## Methods
-
-You can use the following methods to create a Media Services asset filter.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/asset-filters/create-or-update).
--
media-services Asset Create Asset How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-create-asset-how-to.md
- Title: Upload content to an asset CLI
-description: The Azure CLI script in this topic shows how to create a Media Services Asset to upload content to.
- Previously updated : 03/01/2022
-# Create an Asset
-
-This article shows how to create a Media Services Asset. You'll use an asset to hold media content for encoding and streaming. To learn more about Media Services assets, read [Assets in Azure Media Services v3](assets-concept.md).
-
-## Prerequisites
-
-Follow the steps in [Create a Media Services account](./account-create-how-to.md) to create the needed Media Services account and resource group to create an asset.
-
-## Methods
-
-Use the following methods to create a Media Services asset.
-
-## [Portal](#tab/portal/)
-
-Creating assets in the portal is as simple as uploading a file.
--
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-### Using REST
--
-### Using cURL
--
-## [.NET](#tab/net/)
---
media-services Asset Delete Asset Filter How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-delete-asset-filter-how-to.md
- Title: Delete a Media Services asset filter
-description: This article shows you how to delete a Media Services asset filter.
- Previously updated : 03/01/2022
-# Delete an asset filter
-
-This article shows how to delete a Media Services asset filter.
-
-## Methods
-
-You can use the following methods to delete a Media Services asset filter.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/asset-filters/delete).
media-services Asset Delete Asset How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-delete-asset-how-to.md
- Title: Delete a Media Services asset
-description: This article shows you how to delete a Media Services asset.
- Previously updated : 03/01/2022
-# Delete an asset
-
-This article shows how to delete a Media Services asset.
-
-## Methods
-
-You can use the following methods to delete a Media Services asset.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/assets/delete).
media-services Asset Get Asset Sas Urls How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-get-asset-sas-urls-how-to.md
- Title: Get a Media Services asset SAS URLs
-description: This article shows you how to get a Media Services asset's SAS URLs.
- Previously updated : 03/08/2022
-# Get an asset's SAS URLs
-
-This article shows you how to get a Media Services asset's SAS URLs.
-
-## Methods
-
-You can use the following methods to get a Media Services asset's SAS URLs.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/assets/list-container-sas).
--
media-services Asset Get Encryption Key How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-get-encryption-key-how-to.md
- Title: Get a Media Services asset encryption key
-description: This article shows you how to get a Media Services asset encryption key.
- Previously updated : 03/08/2022
-# Get an asset encryption key
-
-This article shows you how to get a Media Services asset encryption key.
-
-## Methods
-
-You can use the following methods to get a Media Services asset encryption key.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/assets/get-encryption-key).
--
media-services Asset List Asset Filters How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-list-asset-filters-how-to.md
- Title: List the asset filters of a Media Services asset
-description: This article shows you how to list the asset filters of a Media Services asset.
- Previously updated : 03/08/2022
-# List the asset filters of an asset
-
-This article shows you how to list the asset filters of a Media Services asset.
-
-## Methods
-
-You can use the following methods to list the asset filters of a Media Services asset.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/asset-filters/list).
--
media-services Asset List Asset Streaming Locators How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-list-asset-streaming-locators-how-to.md
- Title: List the streaming locators of a Media Services asset
-description: This article shows you how to list the streaming locators of a Media Services asset.
- Previously updated : 03/08/2022
-# List the streaming locators of an asset
-
-This article shows you how to list the streaming locators of a Media Services asset.
-
-## Methods
-
-You can use the following methods to list the streaming locators of a Media Services asset.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/assets/list-streaming-locators).
--
media-services Asset Update Asset Filter How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-update-asset-filter-how-to.md
- Title: Update a Media Services asset filter
-description: This article shows you how to update a Media Services asset filter.
- Previously updated : 03/08/2022
-# Update an asset filter
-
-This article shows you how to update a Media Services asset filter.
-
-## Methods
-
-You can use the following methods to update a Media Services asset filter.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/asset-filters/update).
--
media-services Asset Upload Media How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/asset-upload-media-how-to.md
- Title: Upload media - Azure Media Services
-description: Learn how to upload media for streaming or encoding.
- Previously updated : 08/31/2020
-# Upload media for streaming or encoding
-
-In Media Services, you upload your digital files (media) into a blob container associated with an asset. The [Asset](/rest/api/media/operations/asset) entity can contain video, audio, images, thumbnail collections, text tracks and closed caption files (and the metadata about these files). Once the files are uploaded into the asset's container, your content is stored securely in the cloud for further processing and streaming.
-
-Before you get started, you'll need to collect a few values:
-
-1. The local file path to the file you want to upload
-1. The asset ID for the asset (container)
-1. The name you want to use for the uploaded file including the extension
-1. The name of the storage account you are using
-1. The access key for your storage account
-
-## [Portal](#tab/portal/)
--
-## [CLI](#tab/cli/)
--
-## [Python](#tab/python)
-
-Assuming that your code has already established authentication and you have already created an input Asset, use the following code snippet to upload local files to that asset (in_container).
-
-```python
-# The storage objects
-from azure.storage.blob import BlobServiceClient, BlobClient
-
-# Establish storage variables
-storage_account_name = '<your storage account name>'
-storage_account_key = '<your storage account key>'
-storage_blob_url = 'https://<your storage account name>.blob.core.windows.net/'
-
-# An asset's container is named "asset-" plus the asset ID
-in_container = 'asset-' + inputAsset.asset_id
-
-# The file path of the local file you want to upload
-source_file = "ignite.mp4"
-
-# Use the Storage SDK to upload the video
-blob_service_client = BlobServiceClient(account_url=storage_blob_url, credential=storage_account_key)
-blob_client = blob_service_client.get_blob_client(in_container, source_file)
-
-# Upload the video to storage as a block blob
-with open(source_file, "rb") as data:
-    blob_client.upload_blob(data, blob_type="BlockBlob")
-```
--
-<!-- add these to the tabs when available -->
-For other methods see the [Azure Storage documentation](../../storage/blobs/index.yml) for working with blobs in [.NET](../../storage/blobs/storage-quickstart-blobs-dotnet.md), [Java](../../storage/blobs/storage-quickstart-blobs-java.md), [Python](../../storage/blobs/storage-quickstart-blobs-python.md), and [JavaScript (Node.js)](../../storage/blobs/storage-quickstart-blobs-nodejs.md).
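-
-For example, here's a minimal .NET sketch of the same upload flow as the Python sample above, using the Azure.Storage.Blobs package. All values are placeholders, and the asset container name follows the same `asset-<asset ID>` convention.
-
-```csharp
-// Minimal sketch: upload a local file to an asset's blob container with the
-// Azure.Storage.Blobs SDK. All values below are placeholders.
-using System;
-using Azure.Storage;
-using Azure.Storage.Blobs;
-
-var storageAccountName = "<your storage account name>";
-var storageAccountKey = "<your storage account key>";
-var assetContainer = "asset-<your asset ID>";
-var sourceFile = "ignite.mp4";
-
-var serviceClient = new BlobServiceClient(
-    new Uri($"https://{storageAccountName}.blob.core.windows.net/"),
-    new StorageSharedKeyCredential(storageAccountName, storageAccountKey));
-
-var blobClient = serviceClient
-    .GetBlobContainerClient(assetContainer)
-    .GetBlobClient(sourceFile);
-
-// Uploads the file as a block blob.
-blobClient.Upload(sourceFile, overwrite: true);
-```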
-
-## Next steps
-
-> [Media Services overview](media-services-overview.md)
media-services Assets Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/assets-concept.md
-
Title: Assets in Azure Media Services
-description: Learn about what assets are and how they're used by Azure Media Services.
- Previously updated : 01/06/2022
-# Assets in Azure Media Services v3
-
-In Azure Media Services, an [Asset](/rest/api/media/assets) is a core concept. It is where you input media (for example, through upload or live ingest), output media (from a job output), and publish media (for streaming).
-
-An Asset is mapped to a blob container in the [Azure Storage account](storage-account-concept.md) and the files in the Asset are stored as block blobs in that container. Assets contain information about digital files stored in Azure Storage (including video, audio, images, thumbnail collections, text tracks, and closed caption files).
-
-Media Services supports Blob tiers when the account uses General-purpose v2 (GPv2) storage. With GPv2, you can move files to [Cool or Archive storage](../../storage/blobs/access-tiers-overview.md). **Archive** storage is suitable for archiving source files when no longer needed (for example, after they've been encoded).
-
-The **Archive** storage tier is only recommended for very large source files that have already been encoded and the encoding Job output was put in an output blob container. The blobs in the output container that you want to associate with an Asset and use to stream or analyze your content must exist in a **Hot** or **Cool** storage tier.
-
-## Naming
-
-### Assets
-
-Asset names must be unique. Media Services v3 resource names (for example, Assets, Jobs, Transforms) are subject to Azure Resource Manager naming constraints. For more information, see [Naming conventions](media-services-apis-overview.md#naming-conventions).
-
-### Blobs
-
-The names of files/blobs within an asset must follow both the [blob name requirements](/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata) and the [NTFS name requirements](/windows/win32/fileio/naming-a-file). The reason for these requirements is that the files can be copied from blob storage to a local NTFS disk for processing.
-
-## Next steps
-
-[Media Services Overview](media-services-overview.md)
-
-## See also
-
-[Differences between Media Services v2 and v3](migrate-v-2-v-3-migration-introduction.md)
media-services Azure Clouds Regions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/azure-clouds-regions.md
- Title: Azure Media Services v3 regions availability
-description: This article talks about Azure clouds and regions in which Azure Media Services v3 is available.
- Previously updated : 12/9/2021
-# Azure Media Services v3 clouds and regions availability
-
-Azure Media Services v3 is available via Azure Resource Manager. However, not all Media Services features are available in all Azure clouds. This article outlines the availability of the main Media Services v3 components. The following tables show which Media Services features are available in each region.
--
-Use the navigation on the right to find the region you are interested in.
-
-Availability tables are provided for the following geographies: US and US Gov, Africa, APAC, Australia, Brazil, Canada, China, Europe, Germany, India, Japan, Korea, Norway, Sweden, Switzerland, UAE, and UK.
-
-## Regions/geographies/locations
-
-[Regions in which the Azure Media Services service is deployed](https://azure.microsoft.com/global-infrastructure/services/?products=media-services)
-
-## See also
-
-* [Azure regions](https://azure.microsoft.com/global-infrastructure/regions/)
-* [Regional code names and endpoints](azure-regions-code-names.md)
-* [Azure geographies](https://azure.microsoft.com/global-infrastructure/geographies/)
-* [Azure locations](https://azure.microsoft.com/global-infrastructure/locations/)
-
-## Next steps
-
-[Media Services v3 overview](media-services-overview.md)
media-services Azure Regions Code Names https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/azure-regions-code-names.md
- Title: Clouds and regions for Azure Media Services v3
-description: This article talks about the URLs used for endpoints and code for regions.
- Previously updated : 10/28/2020
-# Regional code names and endpoints
-
-## Region code name
-
-When the **location** parameter is used in a command or request, you need to provide the region code name as the **location** value. To get the code name of the region that your account is in and that your call should be routed to, you can run the following line in Azure CLI.
-
-```azurecli-interactive
-az account list-locations
-```
-
-Once you run the line shown above, you get a list of all Azure regions. Navigate to the Azure region that has the *displayName* you are looking for, and use its *name* value for the **location** parameter.
-
-For example, for the Azure region West US 2 (displayed below), you will use "westus2" for the **location** parameter.
-
-```json
- {
- "displayName": "West US 2",
- "id": "/subscriptions/00000000-23da-4fce-b59c-f6fb9513eeeb/locations/westus2",
- "latitude": "47.233",
- "longitude": "-119.852",
- "name": "westus2",
- "subscriptionId": null
- }
-```
-
-## Endpoints
-
-The following endpoints are important to know when connecting to Media Services accounts from different national Azure clouds.
-
-### Global Azure
-
-| Service | Endpoint |
-| - | -- |
-| Azure Resource Manager | `https://management.azure.com/` |
-| Authentication | `https://login.microsoftonline.com/` |
-| Token audience | `https://management.core.windows.net/` |
-
-### Azure Government
-
-| Service | Endpoint |
-| - | -- |
-| Azure Resource Manager | `https://management.usgovcloudapi.net/` |
-| Authentication | `https://login.microsoftonline.us/` |
-| Token audience | `https://management.core.usgovcloudapi.net/` |
--
-### Azure Germany
-
-> [!NOTE]
-> The Azure Germany endpoints only apply to the Sovereign clouds in Germany.
-
-| Service | Endpoint |
-| - | -- |
-| Azure Resource Manager | `https://management.cloudapi.de/` |
-| Authentication | `https://login.microsoftonline.de/` |
-| Token audience | `https://management.core.cloudapi.de/`|
-
-### Azure China 21Vianet
-
-| Service | Endpoint |
-| - | -- |
-| Azure Resource Manager | `https://management.chinacloudapi.cn/` |
-| Authentication | `https://login.chinacloudapi.cn/` |
-| Token audience | `https://management.core.chinacloudapi.cn/` |
-
-## See also
-
-* [Azure regions](https://azure.microsoft.com/global-infrastructure/regions/)
-* [Regional code names and endpoints](azure-regions-code-names.md)
-* [Azure geographies](https://azure.microsoft.com/global-infrastructure/geographies/)
-* [Azure locations](https://azure.microsoft.com/global-infrastructure/locations/)
-
-## Next steps
-
-[Media Services v3 overview](media-services-overview.md)
media-services Compliance Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/compliance-concept.md
- Title: Media Services regulatory compliance
-description: Azure Media Services helps Azure Government customers meet their compliance obligations.
- Previously updated : 09/11/2021
-# Media Services regulatory compliance
-
-Media Services meets the demanding requirements of the US Federal Risk & Authorization Management Program (FedRAMP) and of the US Department of Defense (DoD) Cloud Computing Security Requirements Guide (SRG) Impact Level (IL) 2, IL4, and IL5. By deploying authorized services in Azure Government, Office 365 GCC High and DoD, and Dynamics 365 US Government, federal and defense agencies can use a rich array of cloud services while meeting their compliance obligations.
-
-## FedRAMP and DoD compliance
-
-Media Services in Azure Public maintains:
-
-- FedRAMP High Provisional Authorization to Operate (P-ATO)
-- DoD IL2 Provisional Authorization (PA)
-
-Media Services in Azure Government maintains:
-
-- FedRAMP High P-ATO
-- DoD IL2 PA
-- DoD IL4 PA
-- DoD IL5 PA
-
-For more information about Azure compliance coverage for US government, see Azure [FedRAMP High](/azure/compliance/offerings/offering-fedramp), [DoD IL2](/azure/compliance/offerings/offering-dod-il2), [DoD IL4](/azure/compliance/offerings/offering-dod-il4), and [DoD IL5](/azure/compliance/offerings/offering-dod-il5) documentation. For FedRAMP and DoD audit scope, see [Cloud services by audit scope](../../azure-government/compliance/azure-services-in-fedramp-auditscope.md).
-
-## Azure compliance documentation
-
-To help you meet your own compliance obligations across regulated industries and markets worldwide, Azure maintains the largest compliance portfolio in the industry both in terms of breadth (total number of [compliance offerings](/azure/compliance/offerings/)) and depth (number of [customer-facing services](https://azure.microsoft.com/services/) in assessment scope). For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/).
-
-Azure compliance offerings are grouped into four segments - globally applicable, US Government, industry specific, and region/country specific. Compliance offerings are based on various types of assurances, including formal certifications, attestations, validations, authorizations, and assessments produced by independent third-party auditing firms, as well as contractual amendments, self-assessments, and customer guidance documents produced by Microsoft. For more information, see [Azure compliance documentation](../../compliance/index.yml). You will also find additional compliance resources such as audit reports, a checklist for privacy and General Data Protection Regulation (GDPR), compliance blueprints, country and regional guidelines, implementation and mappings, as well as white papers and analyst reports.
-
-## Next steps
-
-> [Azure Media Services overview](media-services-overview.md)
media-services Concept Availability Zones https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/concept-availability-zones.md
- Title: Availability Zones
-description: Media Services supports Availability Zones providing fault-isolation
- Previously updated : 05/27/2021
-# Availability Zones
-
-Azure Media Services uses [Availability Zones](../../availability-zones/az-overview.md), providing fault-isolated locations within the same Azure region. Media Services is zone-redundant by default in the [available locations](../../availability-zones/az-region.md#azure-regions-with-availability-zones) and no extra configuration on the account is required to enable this capability. Media Services stores media data in the associated storage account(s). These storage accounts should be created as zone-redundant storage (ZRS) or Geo-zone-redundant storage (GZRS) to provide the same level of redundancy as the Media Services account. For details on how to configure replication on the associated storage account(s), see the article [Change how a storage account is replicated](../../storage/common/redundancy-migration.md).
-
-## How Media Services components handle an Availability Zone fault
-
-| Component | Behavior on Availability Zone fault |
-|-- |-|
-| Control Plane (Azure Resource Management) | Continues to work as normal |
-| Key delivery | Continues to work as normal |
-| Jobs | Jobs are rescheduled in another Availability Zone. There will be a delay in processing time as in-flight processing jobs are rescheduled to start over in the Availability Zone |
-| Live Events | Streaming and ingest to the live event continue to work as normal. Calling "reset" on a Live Event is currently not supported during an Availability Zone fault; it's recommended to stop and restart the live event instead. If a live event was transitioning to the "Running" state during a zone-down event, it may appear stuck in the "Starting" state. In this case, it's recommended to start a new live event and clean up the "Starting" live events after the zone recovers. |
-| Streaming Endpoints | Continues to work as normal |
--
-## High Availability Streaming and Encoding for VOD
-
-Availability Zones increase the fault-isolation in a single region. To provide high availability for on-demand streaming and encoding you can use other Azure services to create an architecture that covers concerns like redundancy, health monitoring, load balancing, and data backup and recovery. One such architecture is provided in the [High Availability with Media Services Video on Demand](architecture-high-availability-encoding-concept.md) article.
-The article and sample code provides a solution for how individual regional Media Services accounts can be used to create a high availability architecture for your VOD application.
-
-## Media Services support for Availability Zones by region
-
-Availability Zones are currently only supported in certain Azure regions. To learn more about Availability Zones region support, see [Azure Regions with Availability Zones](../../availability-zones/az-region.md#azure-regions-with-availability-zones)
-
-## Further reading
-
-To learn more about Availability Zones, see [Regions and Availability Zones in Azure](../../availability-zones/az-overview.md).
-
-To learn more about High Availability encoding and streaming, see [High Availability with Media Services Video on Demand](architecture-high-availability-encoding-concept.md).
-
-To learn how to properly configure storage account replication to support Availability Zones, see [Change how a storage account is replicated](../../storage/common/redundancy-migration.md).
media-services Concept Managed Identities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/concept-managed-identities.md
- Title: Managed identities
-description: Media Services can be used with Azure Managed Identities.
- Previously updated : 02/17/2022
-# Managed identities
-
-A common challenge for developers is the management of secrets and credentials to secure communication between services. On Azure, managed identities eliminate the need for developers to manage credentials by providing an identity for the Azure resource in Azure Active Directory (Azure AD) and using it to obtain Azure AD tokens.
--
-## Media Services Managed Identity scenarios
-
-There are three scenarios where Managed Identities can be used with Media Services:
-
-- [Granting a Media Services account access to Key Vault to enable Customer Managed Keys](security-encrypt-data-managed-identity-cli-tutorial.md)
-- [Granting a Media Services account access to storage accounts to allow Media Services to bypass Azure Storage Network ACLs](security-access-storage-managed-identity-cli-tutorial.md)
-- Allowing other services (for example, VMs or [Azure Functions](security-function-app-managed-identity-cli-tutorial.md)) to access Media Services
-
-In the first two scenarios, the Managed Identity is used to grant the *Media Services account* access to other services. In the third scenario, *the service* has a Managed Identity that is used to access Media Services.
--
-> [!NOTE]
-> These scenarios can be combined. You could create Managed Identities for both the Media Services account (for example, to access customer managed keys) and the Azure Functions resource to access to Media Services account.
-
-## Tutorials and How-tos
-
-Try these tutorials to get some hands-on experience with using a Managed Identity with Media Services.
-
-- [CLI: Encrypt data into a Media Services account using a key in Key Vault](security-encrypt-data-managed-identity-cli-tutorial.md)
-- [CLI: Allow Media Services to access a storage account that is configured to block requests from unknown IP addresses](security-access-storage-managed-identity-cli-tutorial.md)
-- [CLI: Give a Function App access to a Media Services account](security-function-app-managed-identity-cli-tutorial.md)
-- [PORTAL: Use the Azure portal to use customer-managed keys or BYOK with Media Services](security-customer-managed-keys-portal-tutorial.md)
-- [POSTMAN/REST: Use customer-managed keys or BYOK with Media Services REST API](security-customer-managed-keys-rest-postman-tutorial.md)
-## Further reading
-
-To learn more about what managed identities can do for you and your Azure applications, see [Azure AD Managed Identities](../../active-directory/managed-identities-azure-resources/overview.md).
-
-To learn more about Azure Functions, see [About Azure Functions](../../azure-functions/functions-overview.md)
media-services Concept Media Reserved Units https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/concept-media-reserved-units.md
- Title: Media reserved units - Azure
-description: Media reserved units allow you to scale media process and determine the speed of your media processing tasks.
- Previously updated : 08/25/2021
-# Media Reserved Units
-
-Media Reserved Units (MRUs) were previously used in Azure Media Services v2 to control encoding concurrency and performance. You no longer need to manage MRUs or request quota increases for any Media Services account, as the system automatically scales up and down based on load. You'll also see performance that is equal to or better than what MRUs provided.
-
-If you have an account that was created using a version prior to the 2020-05-01 API, you'll still have access to APIs for managing MRUs; however, none of the MRU configuration that you set will be used to control encoding concurrency or performance. If you don't see the option to manage MRUs in the Azure portal, you have an account that was created with the 2020-05-01 API or later.
-
-## Billing
-
-While there were previously charges for Media Reserved Units, as of April 17, 2021, there are no longer any charges for accounts that have a configuration for Media Reserved Units. For more information on billing for encoding jobs, see [Encoding video and audio with Media Services](encoding-concept.md).
-
-For accounts created with the **2020-05-01** version of the API (that is, the v3 version) or through the Azure portal, scaling and Media Reserved Units are no longer required. Scaling is now handled automatically by the service. Media Reserved Units are no longer needed or supported for any Azure Media Services account.
-
-## See also
-
-* [Migrate from Media Services v2 to v3](migrate-v-2-v-3-migration-introduction.md)
-* [Scale Media Reserved Units with CLI](media-reserved-units-how-to.md)
media-services Concept Trusted Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/concept-trusted-storage.md
- Title: Trusted storage for Media Services
-description: Managed Identities authentication allows Media Services to access the storage account that has been configured with a firewall or a VNet restriction through trusted storage access.
-keywords: trusted storage, managed identities
- Previously updated : 1/29/2020
-# Trusted storage for Media Services
-
-When you create a Media Services account, you must associate it with a storage account. Media Services can access that storage account using system authentication or Managed Identity authentication. Media Services validates that the Media Services account and the storage account are in the same subscription, and it validates that the user adding the association has access to the storage account via Azure Resource Manager RBAC.
-
->[!NOTE]
->Trusted storage is only available in the API, and is not currently enabled in the Azure portal.
-
-## Trusted storage with a firewall
-
-If you want to use a firewall to secure your storage account and enable trusted storage, [Managed Identities](concept-managed-identities.md) authentication is the preferred option. It allows Media Services to access a storage account that has been configured with a firewall or a VNet restriction through trusted storage access.
-
-## Tutorial
-
-You can learn more about enabling trusted storage with the [Media Services trusted storage](security-trusted-storage-rest-tutorial.md) tutorial.
-
-> [!NOTE]
-> You need to grant the AMS Managed Identity the Storage Blob Data Contributor role in order for Media Services to be able to read and write to the storage account. Granting the generic Contributor role won't work, as it doesn't enable the correct permissions on the data plane.
-
-## Further reading
-
-To understand the methods of creating trusted storage with Managed Identities, read [Managed Identities and Media Services](concept-managed-identities.md).
-
-For more information about Trusted Microsoft Services, see [Configure Azure Storage firewalls and virtual networks](../../storage/common/storage-network-security.md#trusted-microsoft-services).
-
-## Next steps
-
-To learn more about what managed identities can do for you and your Azure applications, see [Azure AD Managed Identities](../../active-directory/managed-identities-azure-resources/overview.md).
media-services Concept Use Customer Managed Keys Byok https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/concept-use-customer-managed-keys-byok.md
- Title: Bring your own key (customer managed keys)
-description: You can use a customer managed key (that is, bring your own key) with Media Services.
- Previously updated : 1/28/2020
-# Bring your own key (customer-managed keys) with Media Services
-
-Bring Your Own Key (BYOK) is an Azure-wide initiative to help customers move their workloads to the cloud. Customer-managed keys allow customers to adhere to industry compliance regulations and improve tenant isolation of a service. Giving customers control of encryption keys is a way to minimize unnecessary access and build confidence in Microsoft services.
-
-## Keys and key management
-
-You can use your own key with Media Services when you use the Media Services 2020-05-01 or later API. A default account key is created for all accounts and is encrypted by a system key owned by Media Services. When you use your own key, the account key is encrypted with your key. Content keys are encrypted by the account key. JobInputHttp URLs and symmetric token validation keys are also encrypted.
--
-Media Services uses the Managed Identity of the Media Services account to read your key from a Key Vault owned by you. Media Services requires that the Key Vault is in the same region as the account, and that it has soft-delete and purge protection enabled.
-
-Your key can be a 2048, 3072, or a 4096 RSA key, and both HSM and software keys are supported.
-
-> [!NOTE]
-> EC keys are not supported.
-
-You can specify a key name and key version, or just a key name. When you use only a key name, Media Services will use the latest key version. New versions of customer keys are automatically detected, and the account key is re-encrypted.
-
-> [!WARNING]
-> Media Services monitors access to the customer key. If the customer key becomes inaccessible (for example, the key or the Key Vault has been deleted, or the access grant has been removed), Media Services will transition the account to the Customer Key Inaccessible state (effectively disabling the account). However, the account can be deleted in this state. The only supported operations are account GET, LIST, and DELETE; all other requests (encoding, streaming, and so on) will fail until access to the account key is restored.
-
-## Double encryption
-
-Media Services automatically supports double encryption. For data at rest, the first layer of encryption uses a customer managed key or a Microsoft managed key depending on the `AccountEncryption` setting on the account. The second layer of encryption for data at rest is provided automatically using a separate Microsoft managed key. To learn more about double encryption, see [Azure double encryption](../../security/fundamentals/double-encryption.md).
-
-> [!NOTE]
-> Double encryption is enabled automatically on the Media Services account. However, you need to configure the customer managed key and double encryption on your storage account separately. To learn more, see [Storage encryption](../../storage/common/storage-service-encryption.md).
-
-## Tutorials
-
-- [Use the Azure portal to use customer-managed keys or BYOK with Media Services](security-customer-managed-keys-portal-tutorial.md)
-- [Use customer-managed keys or BYOK with Media Services REST API](security-customer-managed-keys-rest-postman-tutorial.md)
-## Next steps
-
-[Protect your content with Media Services dynamic encryption](drm-content-protection-concept.md)
media-services Concepts Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/concepts-overview.md
-
Title: Media Services terminology and concepts
-description: Learn about terminology and concepts for Azure Media Services.
- Previously updated : 08/31/2020
-# Media Services terminology and concepts
-
-This topic gives a brief overview of Azure Media Services terminology and concepts. The article also provides links to articles with an in-depth explanation of Media Services v3 concepts and functionality.
-
-Review the fundamental concepts described in these topics before you start development.
--
-## Media Services v3 terminology
-
-|Term|Description|
-|||
-|Live Event|A **Live Event** represents a pipeline for ingesting, transcoding (optionally), and packaging live streams of video, audio, and real-time metadata.<br/><br/>For customers migrating from Media Services v2 APIs, the **Live Event** replaces the **Channel** entity in v2. For more information, see [Migrating from v2 to v3](migrate-v-2-v-3-migration-introduction.md).|
-|Streaming Endpoint/Packaging/Origin|A **Streaming Endpoint** represents a dynamic (just-in-time) packaging and origin service that can deliver your live and on-demand content directly to a client player application. It uses one of the common streaming media protocols (HLS or DASH). In addition, the **Streaming Endpoint** provides dynamic (just-in-time) encryption to industry-leading digital rights management systems (DRMs).<br/><br/>In the media streaming industry, this service is commonly referred to as a **Packager** or **Origin**. Other common terms in the industry for this capability include JITP (just-in-time-packager) or JITE (just-in-time-encryption).
-
-## Media Services v3 concepts
-
-|Concepts|Description|Links|
-||||
-|Assets and uploading content|To start managing, encrypting, encoding, analyzing, and streaming media content in Azure, you need to create a Media Services account and upload your digital files into **Assets**.|[Cloud upload and storage](storage-account-concept.md)<br/><br/>[Assets concept](assets-concept.md)|
-|Encoding content|Once you upload your high-quality digital media files into Assets, you can encode them into formats that can be played on a wide variety of browsers and devices. <br/><br/>To encode with Media Services v3, you need to create **Transforms** and **Jobs**.|[Transforms and Jobs](transform-jobs-concept.md)<br/><br/>[Encoding with Media Services](encode-concept.md)|
-|Analyzing content (Video Analyzer for Media)|Media Services v3 lets you extract insights from your video and audio files using Media Services v3 presets. To analyze your content using Media Services v3 presets, you need to create **Transforms** and **Jobs**.<br/><br/>If you want more detailed insights, use [Video Analyzer for Media](../../azure-video-analyzer/video-analyzer-for-media-docs/index.yml) directly.|[Analyzing video and audio files](analyze-video-audio-files-concept.md)|
-|Packaging and delivery|Once your content is encoded, you can take advantage of **Dynamic Packaging**. In Media Services, a **Streaming Endpoint** is the dynamic packaging service used to deliver media content to client players. To make videos in the output asset available to clients for playback, you have to create a **Streaming Locator** and then build streaming URLs. <br/><br/>When creating the **Streaming Locator**, in addition to the asset's name, you need to specify **Streaming Policy**. **Streaming Policies** enable you to define streaming protocols and encryption options (if any) for your **Streaming Locators**. Dynamic Packaging is used whether you stream your content live or on-demand. <br/><br/>You can use Media Services **Dynamic Manifests** to stream only a specific rendition or subclips of your video. In addition, if you have pre-encoded content, or content that is already encoded by a 3rd party encoder you can stream the content with the AMS origin services. For an example of using a pre-encoded source file, see the sample - [Streaming an existing Mp4](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Streaming/StreamExistingMp4) |[Dynamic packaging](encode-dynamic-packaging-concept.md)<br/><br/>[Streaming Endpoints](stream-streaming-endpoint-concept.md)<br/><br/>[Streaming Locators](stream-streaming-locators-concept.md)<br/><br/>[Streaming Policies](stream-streaming-policy-concept.md)<br/><br/>[Dynamic manifests](filters-dynamic-manifest-concept.md)<br/><br/>[Filters](filters-concept.md)|
-|Content protection|With Media Services, you can deliver your live and on-demand content encrypted dynamically with Advanced Encryption Standard (AES-128) or/and any of the three major DRM systems: Microsoft PlayReady, Google Widevine, and Apple FairPlay. Media Services also provides a service for delivering AES keys and DRM (PlayReady, Widevine, and FairPlay) licenses to authorized clients. <br/><br/>If specifying encryption options on your stream, create the **Content Key Policy** and associate it with your **Streaming Locator**. The **Content Key Policy** enables you to configure how the content key is delivered to end clients.<br/><br/> Try to reuse policies whenever the same options are needed.| [Content Key Policies](drm-content-key-policy-concept.md)<br/><br/>[Content protection](drm-content-protection-concept.md)|
-|Live streaming|Media Services enables you to deliver live events to your customers on the Azure cloud. **Live Events** are responsible for ingesting and processing the live video feeds. When you create a **Live Event**, an input endpoint is created that you can use to send a live signal from a remote encoder. Once you have the stream flowing into the **Live Event**, you can begin the streaming event by creating an **Asset**, **Live Output**, and **Streaming Locator**. **Live Output** will archive the stream into the **Asset** and make it available to viewers through the **Streaming Endpoint**. A live event can be set to either a *pass-through* (an on-premises live encoder sends a multiple bitrate stream) or *live encoding* (an on-premises live encoder sends a single bitrate stream). |[Live streaming overview](stream-live-streaming-concept.md)<br/><br/>[Live Events and Live Outputs](live-event-outputs-concept.md)|
-|Monitoring with Event Grid|To see the progress of the job, use **Event Grid**. Media Services also emits the live event types. With Event Grid, your apps can listen for and react to events from virtually all Azure services, as well as custom sources. |[Handling Event Grid events](monitoring/reacting-to-media-services-events.md)<br/><br/>[Schemas](monitoring/media-services-event-schemas.md)|
-|Monitoring with Azure Monitor|Monitor metrics and diagnostic logs that help you understand how your apps are performing with Azure Monitor.|[Metrics and diagnostic logs](monitoring/monitor-media-services-data-reference.md)<br/><br/>[Diagnostic logs schemas](monitoring/monitor-media-services-data-reference.md)|
-|Player clients|You can use any player framework that supports the HLS or DASH streaming protocol. There are many open source and commercial players available on the market (Shaka, Hls.js, Video.js, Theo Player, Bitmovin Player, etc.) as well as built-in native browser and OS level streaming support for HLS and DASH. The Azure Media Player is also available to play back media content streamed by Media Services on a wide variety of browsers. The Azure Media Player uses industry standards, such as HTML5, Media Source Extensions (MSE), and Encrypted Media Extensions (EME) to provide an adaptive streaming experience. |[Azure Media Player overview](player-use-azure-media-player-how-to.md)|
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Next steps
-
-* [Encode remote file and stream video - REST](stream-files-tutorial-with-rest.md)
-* [Encode uploaded file and stream video - .NET](stream-files-tutorial-with-api.md)
-* [Stream live - .NET](stream-live-tutorial-with-api.md)
-* [Analyze your video - .NET](analyze-videos-tutorial.md)
-* [AES-128 dynamic encryption - .NET](drm-playready-license-template-concept.md)
-* [Encrypt dynamically with multi-DRM - .NET](drm-protect-with-drm-tutorial.md)
media-services Configure Connect Dotnet Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/configure-connect-dotnet-howto.md
- Title: Connect to Azure Media Services v3 API - .NET
-description: This article demonstrates how to connect to Media Services v3 API with .NET.
- Previously updated: 11/17/2020
-# Connect to Media Services v3 API - .NET
--
-This article shows you how to connect to the Azure Media Services v3 .NET SDK using the service principal login method.
-
-## Prerequisites
-- [Create a Media Services account](./account-create-how-to.md). Make sure to remember the resource group name and the Media Services account name.
-- Install a tool that you would like to use for .NET development. The steps in this article show how to use [Visual Studio 2019 Community Edition](https://www.visualstudio.com/downloads/). You can use Visual Studio Code; see [Working with C#](https://code.visualstudio.com/docs/languages/csharp). Or, you can use a different code editor.
-
-> [!IMPORTANT]
-> Review [naming conventions](media-services-apis-overview.md#naming-conventions).
-
-## Create a console application
-
-1. Start Visual Studio.
-1. From the **File** menu, click **New** > **Project**.
-1. Create a **.NET Core** console application.
-
-The sample app in this topic targets `netcoreapp2.0`. The code uses 'async main', which is available starting with C# 7.1. See this [blog](/archive/blogs/benwilli/async-main-is-available-but-hidden) for more details.
-
-## Add required NuGet packages/assemblies
-
-1. In Visual Studio, select **Tools** > **NuGet Package Manager** > **Package Manager Console**.
-2. In the **Package Manager Console** window, use the `Install-Package` command to add the following NuGet packages. For example, `Install-Package Microsoft.Azure.Management.Media`.
-
-|Package|Description|
-|||
-|`Microsoft.Azure.Management.Media`|Azure Media Services SDK. <br/>To make sure you are using the latest Azure Media Services package, check [Microsoft.Azure.Management.Media](https://www.nuget.org/packages/Microsoft.Azure.Management.Media).|
-
-### Other required assemblies
-- Azure.Storage.Blobs
-- Microsoft.Extensions.Configuration
-- Microsoft.Extensions.Configuration.EnvironmentVariables
-- Microsoft.Extensions.Configuration.Json
-- Microsoft.Rest.ClientRuntime.Azure.Authentication
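-
-If you prefer the console, these can also be added from the same **Package Manager Console**. A minimal sketch (package names as listed above; choose the versions appropriate for your project):
-
-```powershell
-# Install the additional assemblies used by this sample
-Install-Package Azure.Storage.Blobs
-Install-Package Microsoft.Extensions.Configuration
-Install-Package Microsoft.Extensions.Configuration.EnvironmentVariables
-Install-Package Microsoft.Extensions.Configuration.Json
-Install-Package Microsoft.Rest.ClientRuntime.Azure.Authentication
-```
-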
-## Create and configure the app settings file
-
-### Create appsettings.json
-
-1. Go to **General** > **Text file**.
-1. Name it "appsettings.json".
-1. Set the "Copy to Output Directory" property of the .json file to "Copy if newer" (so that the application is able to access it when published).
-
-### Set values in appsettings.json
-
-Run the `az ams account sp create` command as described in [access APIs](./access-api-howto.md). The command returns JSON that you should copy into your "appsettings.json" file.
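-
-For reference, a sketch of what the finished "appsettings.json" might look like. The key names match what the `ConfigWrapper` class below reads; all values shown here are placeholders that you replace with the values returned by the CLI command:
-
-```json
-{
-  "AadClientId": "00000000-0000-0000-0000-000000000000",
-  "AadSecret": "00000000-0000-0000-0000-000000000000",
-  "AadTenantId": "00000000-0000-0000-0000-000000000000",
-  "AadEndpoint": "https://login.microsoftonline.com",
-  "AccountName": "amsaccount",
-  "ArmAadAudience": "https://management.core.windows.net/",
-  "ArmEndpoint": "https://management.azure.com/",
-  "Location": "West US 2",
-  "ResourceGroup": "amsResourceGroup",
-  "SubscriptionId": "00000000-0000-0000-0000-000000000000"
-}
-```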
-
-## Add configuration file
-
-For convenience, add a configuration file that is responsible for reading values from "appsettings.json".
-
-1. Add a new .cs class to your project. Name it `ConfigWrapper`.
-1. Paste the following code in this file (this example assumes the namespace is `ConsoleApp1`).
-
-```csharp
-using System;
-
-using Microsoft.Extensions.Configuration;
-
-namespace ConsoleApp1
-{
- public class ConfigWrapper
- {
- private readonly IConfiguration _config;
-
- public ConfigWrapper(IConfiguration config)
- {
- _config = config;
- }
-
- public string SubscriptionId
- {
- get { return _config["SubscriptionId"]; }
- }
-
- public string ResourceGroup
- {
- get { return _config["ResourceGroup"]; }
- }
-
- public string AccountName
- {
- get { return _config["AccountName"]; }
- }
-
- public string AadTenantId
- {
- get { return _config["AadTenantId"]; }
- }
-
- public string AadClientId
- {
- get { return _config["AadClientId"]; }
- }
-
- public string AadSecret
- {
- get { return _config["AadSecret"]; }
- }
-
- public Uri ArmAadAudience
- {
- get { return new Uri(_config["ArmAadAudience"]); }
- }
-
- public Uri AadEndpoint
- {
- get { return new Uri(_config["AadEndpoint"]); }
- }
-
- public Uri ArmEndpoint
- {
- get { return new Uri(_config["ArmEndpoint"]); }
- }
-
- public string Location
- {
- get { return _config["Location"]; }
- }
- }
-}
-```
-
-## Connect to the .NET client
-
-To start using Media Services APIs with .NET, you need to create an **AzureMediaServicesClient** object. To create the object, you need to supply the credentials that the client needs to connect to Azure using Azure AD. In the code below, the `GetCredentialsAsync` function creates the `ServiceClientCredentials` object based on the credentials supplied in the local configuration file.
-
-1. Open `Program.cs`.
-1. Paste the following code:
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.IO;
-using System.Linq;
-using System.Threading.Tasks;
-
-using Microsoft.Azure.Management.Media;
-using Microsoft.Azure.Management.Media.Models;
-using Microsoft.Extensions.Configuration;
-using Microsoft.IdentityModel.Clients.ActiveDirectory;
-using Microsoft.Rest;
-using Microsoft.Rest.Azure.Authentication;
-
-namespace ConsoleApp1
-{
- class Program
- {
- public static async Task Main(string[] args)
- {
-
- ConfigWrapper config = new ConfigWrapper(new ConfigurationBuilder()
- .SetBasePath(Directory.GetCurrentDirectory())
- .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
- .AddEnvironmentVariables()
- .Build());
-
- try
- {
- IAzureMediaServicesClient client = await CreateMediaServicesClientAsync(config);
- Console.WriteLine("connected");
- }
- catch (Exception exception)
- {
- if (exception.Source.Contains("ActiveDirectory"))
- {
- Console.Error.WriteLine("TIP: Make sure that you have filled out the appsettings.json file before running this sample.");
- }
-
- Console.Error.WriteLine($"{exception.Message}");
-
- ApiErrorException apiException = exception.GetBaseException() as ApiErrorException;
- if (apiException != null)
- {
- Console.Error.WriteLine(
- $"ERROR: API call failed with error code '{apiException.Body.Error.Code}' and message '{apiException.Body.Error.Message}'.");
- }
- }
-
- Console.WriteLine("Press Enter to continue.");
- Console.ReadLine();
- }
-
- private static async Task<ServiceClientCredentials> GetCredentialsAsync(ConfigWrapper config)
- {
- // Use ApplicationTokenProvider.LoginSilentWithCertificateAsync or UserTokenProvider.LoginSilentAsync to get a token using service principal with certificate
- //// ClientAssertionCertificate
- //// ApplicationTokenProvider.LoginSilentWithCertificateAsync
-
- // Use ApplicationTokenProvider.LoginSilentAsync to get a token using a service principal with symmetric key
- ClientCredential clientCredential = new ClientCredential(config.AadClientId, config.AadSecret);
- return await ApplicationTokenProvider.LoginSilentAsync(config.AadTenantId, clientCredential, ActiveDirectoryServiceSettings.Azure);
- }
-
- private static async Task<IAzureMediaServicesClient> CreateMediaServicesClientAsync(ConfigWrapper config)
- {
- var credentials = await GetCredentialsAsync(config);
-
- return new AzureMediaServicesClient(config.ArmEndpoint, credentials)
- {
- SubscriptionId = config.SubscriptionId,
- };
- }
-
- }
-}
-```
-
-## Next steps
-- [Tutorial: Upload, encode, and stream videos - .NET](stream-files-tutorial-with-api.md)
-- [Tutorial: Stream live with Media Services v3 - .NET](stream-live-tutorial-with-api.md)
-- [Tutorial: Analyze videos with Media Services v3 - .NET](analyze-videos-tutorial.md)
-- [Create a job input from a local file - .NET](job-input-from-local-file-how-to.md)
-- [Create a job input from an HTTPS URL - .NET](job-input-from-http-how-to.md)
-- [Encode with a custom Transform - .NET](transform-custom-transform-how-to.md)
-- [Use AES-128 dynamic encryption and the key delivery service - .NET](drm-playready-license-template-concept.md)
-- [Use DRM dynamic encryption and license delivery service - .NET](drm-protect-with-drm-tutorial.md)
-- [Get a signing key from the existing policy - .NET](drm-get-content-key-policy-how-to.md)
-- [Create filters with Media Services - .NET](filters-dynamic-manifest-dotnet-how-to.md)
-- [Advanced video on-demand examples of Azure Functions v2 with Media Services v3](https://aka.ms/ams3functions)
-
-## See also
-
-* [.NET reference](/dotnet/api/overview/azure/mediaservices/management)
-* For more code examples, see the [.NET SDK samples](https://github.com/Azure-Samples/media-services-v3-dotnet) repo.
media-services Configure Connect Java Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/configure-connect-java-howto.md
- Title: Connect to Azure Media Services v3 API - Java
-description: This article describes how to connect to Azure Media Services v3 API with Java.
- Previously updated: 11/17/2020
-# Connect to Media Services v3 API - Java
--
-This article shows you how to connect to the Azure Media Services v3 Java SDK using the service principal sign in method.
-
-In this article, Visual Studio Code is used to develop the sample app.
-
-## Prerequisites
-- Follow [Writing Java with Visual Studio Code](https://code.visualstudio.com/docs/java/java-tutorial) to install:
- - JDK
- - Apache Maven
- - Java Extension Pack
-- Make sure to set the `JAVA_HOME` and `PATH` environment variables.
-- [Create a Media Services account](./account-create-how-to.md). Be sure to remember the resource group name and the Media Services account name.
-- Follow the steps in the [Access APIs](./access-api-howto.md) topic. Record the subscription ID, application ID (client ID), the authentication key (secret), and the tenant ID that you need in a later step.
-
-Also review:
-- [Java in Visual Studio Code](https://code.visualstudio.com/docs/languages/java)
-- [Java Project Management in VS Code](https://code.visualstudio.com/docs/java/java-project)
-
-> [!IMPORTANT]
-> Review [naming conventions](media-services-apis-overview.md#naming-conventions).
-
-## Create a Maven project
-
-Open a command-line tool and `cd` to a directory where you want to create the project.
-
-```bash
-mvn archetype:generate -DgroupId=com.azure.ams -DartifactId=testAzureApp -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false
-```
-
-When you run the command, the `pom.xml`, `App.java`, and other files are created.
-
-## Add dependencies
-
-1. In Visual Studio Code, open the folder that contains your project.
-1. Find and open the `pom.xml` file.
-1. Add the needed dependencies.
-
- See `pom.xml` in the [Video encoding](https://github.com/Azure-Samples/media-services-v3-java/blob/master/VideoEncoding/EncodingWithMESCustomPreset/pom.xml) sample.
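-
-   As a rough sketch, the added dependencies might look like the following. The artifact IDs match the imports used later in this article; the versions are placeholders, so check the sample `pom.xml` for current ones:
-
-   ```xml
-   <dependencies>
-     <!-- Azure Media Services resource manager client -->
-     <dependency>
-       <groupId>com.azure.resourcemanager</groupId>
-       <artifactId>azure-resourcemanager-mediaservices</artifactId>
-       <version><!-- replace with the current version --></version>
-     </dependency>
-     <!-- Azure AD credential types such as ClientSecretCredential -->
-     <dependency>
-       <groupId>com.azure</groupId>
-       <artifactId>azure-identity</artifactId>
-       <version><!-- replace with the current version --></version>
-     </dependency>
-   </dependencies>
-   ```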
-
-## Connect to the Java client
-
-1. Open the `App.java` file under `src\main\java\com\azure\ams` and make sure your package is included at the top:
-
- ```java
- package com.azure.ams;
- ```
-1. Under the package statement, add these import statements:
-
- ```java
- import com.azure.core.management.AzureEnvironment;
- import com.azure.core.management.profile.AzureProfile;
- import com.azure.identity.ClientSecretCredential;
- import com.azure.identity.ClientSecretCredentialBuilder;
- import com.azure.resourcemanager.mediaservices.MediaServicesManager;
- ```
-1. To create the Active Directory credentials that you need to make requests, add the following code to the main method of the App class and set the values that you got from [Access APIs](./access-api-howto.md):
-
- ```java
- try {
- AzureProfile azureProfile = new AzureProfile("<YOUR_TENANT_ID>", "<YOUR_SUBSCRIPTION_ID>", AzureEnvironment.AZURE);
- ClientSecretCredential clientSecretCredential = new ClientSecretCredentialBuilder()
- .clientId("<YOUR_CLIENT_ID>")
- .clientSecret("<YOUR_CLIENT_SECRET>")
- .tenantId("<YOUR_TENANT_ID>")
- // authority host is optional
- .authorityHost("<AZURE_AUTHORITY_HOST>")
- .build();
- MediaServicesManager mediaServicesManager = MediaServicesManager.authenticate(clientSecretCredential, azureProfile);
- System.out.println("Hello Azure");
- }
- catch (Exception e) {
- System.out.println("Exception encountered.");
- System.out.println(e.toString());
- }
- ```
-1. Run the app.
-
-## See also
-- [Media Services concepts](concepts-overview.md)
-- [Java SDK](https://aka.ms/ams-v3-java-sdk)
-- [Java reference](/java/api/overview/azure/mediaservices/management)
-- [com.azure.resourcemanager.mediaservices](https://mvnrepository.com/artifact/com.azure.resourcemanager/azure-resourcemanager-mediaservices)
-
-## Next steps
-
-You can now include `import com.azure.resourcemanager.mediaservices.*` and start manipulating entities.
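-
-For example, a minimal sketch that lists the Assets in the account, assuming the `mediaServicesManager` object created earlier and placeholder resource group and account names:
-
-```java
-// List the assets in the Media Services account and print their names
-mediaServicesManager.assets()
-    .list("<YOUR_RESOURCE_GROUP>", "<YOUR_ACCOUNT_NAME>")
-    .forEach(asset -> System.out.println(asset.name()));
-```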
-
-For more code examples, see the [Java SDK samples](/samples/azure-samples/media-services-v3-java/azure-media-services-v3-samples-using-java/) repo.
media-services Configure Connect Nodejs Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/configure-connect-nodejs-howto.md
- Title: Connect to Azure Media Services v3 API - Node.js
-description: This article demonstrates how to connect to Media Services v3 API with Node.js.
- Previously updated: 12/13/2021
-# Connect to Media Services v3 API - Node.js
--
-This article shows you how to connect to the Azure Media Services v3 Node.js SDK using the service principal sign-in method. You will work with files in the *media-services-v3-node-tutorials* samples repository. The *HelloWorld-ListAssets* sample contains the code for connecting and then listing the Assets in the account.
-
-## Prerequisites
-- An installation of Visual Studio Code.
-- Install [Node.js](https://nodejs.org/en/download/).
-- Install [TypeScript](https://www.typescriptlang.org/download).
-- [Create a Media Services account](./account-create-how-to.md). Be sure to remember the resource group name and the Media Services account name.
-- Create a service principal for your application. See [access APIs](./access-api-howto.md).<br/>**Pro tip!** Keep this window open or copy everything in the JSON tab to Notepad.
-- Make sure to get the latest version of the [AzureMediaServices SDK for JavaScript](https://www.npmjs.com/package/@azure/arm-mediaservices).
-
-> [!IMPORTANT]
-> Review the Azure Media Services [naming conventions](media-services-apis-overview.md#naming-conventions) to understand the important naming restrictions on entities.
-
-## Clone the Node.js samples repo
-
-You will work with some files in Azure Samples. Clone the Node.js samples repository.
-
-```git
-git clone https://github.com/Azure-Samples/media-services-v3-node-tutorials.git
-```
-
-## Install the Node.js packages
-
-### Install @azure/arm-mediaservices
-
-```bash
-npm install @azure/arm-mediaservices
-```
-
-For this example, you will use the following packages in the `package.json` file.
-
-|Package|Description|
-|||
-|`@azure/arm-mediaservices`|Azure Media Services SDK. <br/>To make sure you are using the latest Azure Media Services package, check [npm install @azure/arm-mediaservices](https://www.npmjs.com/package/@azure/arm-mediaservices).|
-|`@azure/identity` | Required for Azure AD authentication using Service Principal or Managed Identity|
-|`@azure/storage-blob`|Storage SDK. Used when uploading files into assets.|
-|`@azure/abort-controller`| Used along with the storage client to time out long running download operations|
-
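-If you would rather install everything in one step instead of adding each package separately, a minimal sketch using the package names from the table above:
-
-```bash
-npm install @azure/arm-mediaservices @azure/identity @azure/storage-blob @azure/abort-controller
-```
-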
-### Create the package.json file
-
-1. Create a `package.json` file using your favorite editor.
-1. Open the file and paste the following code:
-
-```json
-{
- "name": "media-services-node-sample",
- "version": "0.1.0",
- "description": "",
- "main": "./index.ts",
- "dependencies": {
- "@azure/arm-mediaservices": "^10.0.0",
- "@azure/abort-controller": "^1.0.2",
- "@azure/identity": "^2.0.0",
- "@azure/storage-blob": "^12.4.0"
- }
-}
-```
-
-## Connect to Node.js client using TypeScript
-
-### Sample *.env* file
-
-Copy the content of this file to a file named *.env*. It should be stored at the root of your working repository. These are the values you got from the API Access page for your Media Services account in the portal.
-
-To get the values needed for the *.env* file, first read and review the how-to article [Access the API](./access-api-howto.md).
-You can use either the Azure portal or the CLI to get the values to enter into this sample's environment variables file.
-
-Once you have created the *.env* file, you can start working with the samples.
-
-```ini
-# Values from the API Access page in the portal
-AADCLIENTID="00000000-0000-0000-0000-000000000000"
-AADSECRET="00000000-0000-0000-0000-000000000000"
-AADTENANTID="00000000-0000-0000-0000-000000000000"
-
-# Change this to match your Azure AD Tenant domain name.
-AADTENANTDOMAIN="microsoft.onmicrosoft.com"
-
-# Set this to your Media Services Account name, resource group it is contained in, and location
-ACCOUNTNAME="amsaccount"
-RESOURCEGROUP="amsResourceGroup"
-
-# Set this to your Azure Subscription ID
-SUBSCRIPTIONID="00000000-0000-0000-0000-000000000000"
-
-# You must change this if you are using Gov Cloud, China, or other non standard cloud regions
-AADENDPOINT="https://login.microsoftonline.com"
-
-# DRM Testing
-DRMSYMMETRICKEY="add random base 64 encoded string here"
-```
-
-## Run the sample application *HelloWorld-ListAssets*
--
-1. Launch Visual Studio Code from the root folder.
-
-```bash
-cd media-services-v3-node-tutorials
-code .
-```
-
-2. Install the packages used in the *package.json* file from a terminal:
-
-```bash
-npm install
-```
-3. Make a copy of the *sample.env* file, rename it to *.env* and update the values in the file to match your account and subscription information. You may need to gather this information from the Azure portal first.
-
-4. Change directory into the *HelloWorld-ListAssets* folder
-
-```bash
-cd HelloWorld-ListAssets
-```
-
-5. Open the *list-assets.ts* file in the *HelloWorld-ListAssets* folder and press the F5 key in Visual Studio Code to begin running the script. You should see a list of assets displayed if you already have assets in the account. If the account is empty, you will see an empty list.
-
-To quickly see assets listed, use the portal to upload a few video files. An asset will automatically be created for each one, and running this script again will then return their names.
-
-### A closer look at the *HelloWorld-ListAssets* sample
-
-The *HelloWorld-ListAssets* sample shows you how to connect to the Media Services client with a Service Principal and list Assets in the account. See the comments in the code for a detailed explanation of what it does.
-
-```ts
-import { DefaultAzureCredential } from "@azure/identity";
-import {
- AzureMediaServices
-} from '@azure/arm-mediaservices';
-
-// Load the .env file if it exists
-import * as dotenv from "dotenv";
-dotenv.config();
-
-export async function main() {
-    // Copy the samples.env file and rename it to .env first, then populate its values with the values obtained
- // from your Media Services account's API Access page in the Azure portal.
- const clientId: string = process.env.AADCLIENTID as string;
- const secret: string = process.env.AADSECRET as string;
- const tenantDomain: string = process.env.AADTENANTDOMAIN as string;
- const subscriptionId: string = process.env.SUBSCRIPTIONID as string;
- const resourceGroup: string = process.env.RESOURCEGROUP as string;
- const accountName: string = process.env.ACCOUNTNAME as string;
-
- // This sample uses the default Azure Credential object, which relies on the environment variable settings.
- // If you wish to use User assigned managed identity, see the samples for v2 of @azure/identity
- // Managed identity authentication is supported via either the DefaultAzureCredential or the ManagedIdentityCredential classes
- // https://docs.microsoft.com/javascript/api/overview/azure/identity-readme?view=azure-node-latest
- // See the following examples for how to authenticate in Azure with managed identity
- // https://github.com/Azure/azure-sdk-for-js/blob/@azure/identity_2.0.1/sdk/identity/identity/samples/AzureIdentityExamples.md#authenticating-in-azure-with-managed-identity
-
- // const credential = new ManagedIdentityCredential("<USER_ASSIGNED_MANAGED_IDENTITY_CLIENT_ID>");
- const credential = new DefaultAzureCredential();
-
-    let mediaServicesClient = new AzureMediaServices(credential, subscriptionId);
-
- // List Assets in Account
- console.log("Listing assets in account:")
- for await (const asset of mediaServicesClient.assets.list(resourceGroup, accountName, { top:1000 })){
- console.log(asset.name);
- }
-
-}
-
-main().catch((err) => {
- console.error("Error running sample:", err.message);
-});
-```
-
-## More samples
-
-Many more samples are available in the [repository](https://github.com/Azure-Samples/media-services-v3-node-tutorials). Please review the readme file for the latest updated samples.
-
-## References for Media Services JavaScript/TypeScript developers
-- [npm install @azure/arm-mediaservices](https://www.npmjs.com/package/@azure/arm-mediaservices)
-- [Reference documentation for Azure Media Services modules for Node.js](/javascript/api/overview/azure/media-services)
-- [Azure for JavaScript & Node.js developers](/azure/developer/javascript/)
-- [Media Services source code in the @azure/azure-sdk-for-js GitHub repo](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/mediaservices/arm-mediaservices)
-- [Azure Package Documentation for Node.js developers](/javascript/api/overview/azure/)
-- [Media Services concepts](concepts-overview.md)
-
-## Next steps
-
-Explore the Media Services [Node.js ref](/javascript/api/overview/azure/arm-mediaservices-readme) documentation and check out [samples](https://github.com/Azure-Samples/media-services-v3-node-tutorials) that show how to use Media Services API with Node.js.
media-services Configure Connect Python Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/configure-connect-python-howto.md
- Title: Connect to Azure Media Services v3 API - Python
-description: This article demonstrates how to connect to Media Services v3 API with Python.
- Previously updated: 11/18/2020
-# Connect to Media Services v3 API - Python
--
-This article shows you how to connect to the Azure Media Services v3 Python SDK using the service principal sign in method.
-
-## Prerequisites
-- Download Python from [python.org](https://www.python.org/downloads/).
-- Make sure to set the `PATH` environment variable.
-- [Create a Media Services account](./account-create-how-to.md). Be sure to remember the resource group name and the Media Services account name.
-- Follow the steps in the [Access APIs](./access-api-howto.md) topic, selecting the service principal authentication method. Record the subscription ID, application client ID, the authentication key, and the tenant ID that you need in the later steps.
-
-> [!IMPORTANT]
-> Review [naming conventions](media-services-apis-overview.md#naming-conventions).
-
-## Install the modules
-
-To work with Azure Media Services using Python, you need to install these modules.
-
-* The `azure-identity` module, which includes Azure modules for Active Directory.
-* The `azure-mgmt-media` module, which includes the Media Services entities.
-
- Make sure to get [the latest version of the Media Services SDK for Python](https://pypi.org/project/azure-mgmt-media/).
-
-Open a command-line tool and use the following commands to install the modules.
-
-```cmd
-pip3 install azure-identity
-pip3 install azure-mgmt-media
-```
-
-## Connect to the Python client
-
-1. Create a file with a `.py` extension
-1. Open the file in your favorite editor
-1. Add the following code to the file. The code imports the required modules and creates the Active Directory credentials object you need to connect to Media Services.
-
- Set the variables' values to the values you got from [Access APIs](./access-api-howto.md). Update the `ACCOUNT_NAME` and `RESOURCE_GROUP_NAME` variables to the Media Services account name and Resource Group names used when creating those resources.
-
- ```python
- from azure.identity import ClientSecretCredential
- from azure.mgmt.media import AzureMediaServices
-
- # Tenant ID for your Azure Subscription
- TENANT_ID = "(update-this-value)"
-
- # Your Application Client ID of your Service Principal
- CLIENT_ID = "(update-this-value)"
-
- # Your Service Principal secret key
- CLIENT_SECRET = "(update-this-value)"
-
- # Your Azure Subscription ID
- SUBSCRIPTION_ID = "(update-this-value)"
-
- # Your Resource Group name
- RESOURCE_GROUP_NAME = "(update-this-value)"
-
- # Your Azure Media Service account name
- ACCOUNT_NAME = "(update-this-value)"
-
- credentials = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
-
- # The Azure Media Services Client
- client = AzureMediaServices(credentials, SUBSCRIPTION_ID)
-
- # Now that you are authenticated, you can manipulate the entities.
- # For example, list assets in your Media Services account
- assets = client.assets.list(RESOURCE_GROUP_NAME, ACCOUNT_NAME)
-
- for i, r in enumerate(assets):
- print(r)
- ```
-
-1. Run the file
--
-## Additional samples
-
-Additional samples are available in GitHub in the [Azure Media Services v3 Python Samples](https://github.com/Azure-Samples/media-services-v3-python) repo.
-
-## Next steps
-- Use the [Python SDK](https://aka.ms/ams-v3-python-sdk).
-- Review the Media Services [Python ref](/python/api/overview/azure/mediaservices/management) documentation.
media-services Create Streaming Locator Build Url https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/create-streaming-locator-build-url.md
- Title: Create a streaming locator and build URLs
-description: This article demonstrates how to create a streaming locator and build URLs.
- Previously updated: 03/01/2022
-# Create a streaming locator and build URLs
--
-In Azure Media Services, to build a streaming URL, you need to first create a [Streaming Locator](stream-streaming-locators-concept.md). You then concatenate the [Streaming Endpoint](/rest/api/media/streamingendpoints) host name and the **Streaming Locator** path. In this sample, the *default* **Streaming Endpoint** is used. When you first create a Media Services account, this *default* **Streaming Endpoint** is in a stopped state, so you need to call **Start** before streaming.
-
-This article demonstrates how to create a streaming locator and build a streaming URL using Java and .NET SDKs.
-
-## Prerequisites
-
-Review [Dynamic packaging](encode-dynamic-packaging-concept.md).
-
-## Create a streaming locator
-
-## [Portal](#tab/portal/)
--
-## [.NET](#tab/net/)
-
-## Using .NET
-
-```csharp
-/// <summary>
-/// Creates a StreamingLocator for the specified asset and with the specified streaming policy name.
-/// Once the StreamingLocator is created the output asset is available to clients for playback.
-/// </summary>
-/// <param name="client">The Media Services client.</param>
-/// <param name="resourceGroup">The name of the resource group within the Azure subscription.</param>
-/// <param name="accountName"> The Media Services account name.</param>
-/// <param name="assetName">The name of the output asset.</param>
-/// <param name="locatorName">The StreamingLocator name (unique in this case).</param>
-/// <returns>A task.</returns>
-private static async Task<StreamingLocator> CreateStreamingLocatorAsync(
- IAzureMediaServicesClient client,
- string resourceGroup,
- string accountName,
- string assetName,
- string locatorName)
-{
- Console.WriteLine("Creating a streaming locator...");
- StreamingLocator locator = await client.StreamingLocators.CreateAsync(
- resourceGroup,
- accountName,
- locatorName,
- new StreamingLocator
- {
- AssetName = assetName,
- StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly
- });
-
- return locator;
-}
-
-/// <summary>
-/// Checks if the streaming endpoint is in the running state,
-/// if not, starts it. Then, builds the streaming URLs.
-/// </summary>
-/// <param name="client">The Media Services client.</param>
-/// <param name="resourceGroupName">The name of the resource group within the Azure subscription.</param>
-/// <param name="accountName"> The Media Services account name.</param>
-/// <param name="locatorName">The name of the StreamingLocator that was created.</param>
-/// <param name="streamingEndpoint">The streaming endpoint.</param>
-/// <returns>A task.</returns>
-private static async Task<IList<string>> GetStreamingUrlsAsync(
- IAzureMediaServicesClient client,
- string resourceGroupName,
- string accountName,
-    string locatorName,
- StreamingEndpoint streamingEndpoint)
-{
- IList<string> streamingUrls = new List<string>();
-
- ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(resourceGroupName, accountName, locatorName);
-
- foreach (StreamingPath path in paths.StreamingPaths)
- {
- UriBuilder uriBuilder = new UriBuilder
- {
- Scheme = "https",
- Host = streamingEndpoint.HostName,
-
- Path = path.Paths[0]
- };
- streamingUrls.Add(uriBuilder.ToString());
- }
-
- return streamingUrls;
-}
-```
-
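-For illustration, a hedged sketch of how these two helpers might be tied together, assuming the `client` object from the connect article and placeholder resource group, account, asset, and locator names:
-
-```csharp
-// Create the locator for the output asset
-StreamingLocator locator = await CreateStreamingLocatorAsync(
-    client, "myResourceGroup", "amsaccount", "myOutputAsset", "myStreamingLocator");
-
-// Make sure the default streaming endpoint is running before building URLs
-StreamingEndpoint streamingEndpoint = await client.StreamingEndpoints.GetAsync(
-    "myResourceGroup", "amsaccount", "default");
-if (streamingEndpoint.ResourceState != StreamingEndpointResourceState.Running)
-{
-    await client.StreamingEndpoints.StartAsync("myResourceGroup", "amsaccount", "default");
-}
-
-// Build and print the streaming URLs
-IList<string> urls = await GetStreamingUrlsAsync(
-    client, "myResourceGroup", "amsaccount", locator.Name, streamingEndpoint);
-foreach (string url in urls)
-{
-    Console.WriteLine(url);
-}
-```
-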
-See the full code sample: [EncodingWithMESPredefinedPreset](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/VideoEncoding/Encoding_PredefinedPreset/Program.cs)
--
media-services Drm Add Option Content Key Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-add-option-content-key-policy-how-to.md
- Title: Add an option to a content key policy
-description: This article shows how to add an option to a content key policy.
- Previously updated: 03/10/2022
-# Add an option to a content key policy
--
-## Methods
-
-Use the following methods to add an option to a content key policy.
-
-## [CLI](#tab/cli/)
---
media-services Drm Content Key Policy Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-content-key-policy-concept.md
-
Title: Content Key Policies in Media Services - Azure
-description: This article gives an explanation of what Content Key Policies are, and how they are used by Azure Media Services.
- Previously updated: 08/31/2020
-# Content Key Policies
--
-With Media Services, you can deliver your live and on-demand content encrypted dynamically with Advanced Encryption Standard (AES-128) or any of the three major digital rights management (DRM) systems: Microsoft PlayReady, Google Widevine, and Apple FairPlay. Media Services also provides a service for delivering AES keys and DRM (PlayReady, Widevine, and FairPlay) licenses to authorized clients.
-
-To specify encryption options on your stream, you need to create a [Streaming Policy](stream-streaming-policy-concept.md) and associate it with your [Streaming Locator](stream-streaming-locators-concept.md). You create the [Content Key Policy](/rest/api/media/contentkeypolicies) to configure how the content key (which provides secure access to your assets) is delivered to end clients. You need to set the requirements (restrictions) on the Content Key Policy that must be met in order for keys with the specified configuration to be delivered to clients. The content key policy is not needed for clear streaming or downloading.
-
-Usually, you associate your content key policy with your [Streaming Locator](stream-streaming-locators-concept.md). Alternatively, you can specify the content key policy inside a [Streaming Policy](stream-streaming-policy-concept.md) (when creating a custom streaming policy for advanced scenarios).
-
-## Best practices and considerations
-
-> [!IMPORTANT]
-> Please review the following recommendations.
-
-* You should design a limited set of policies for your Media Service account and reuse them for your streaming locators whenever the same options are needed. For more information, see [Quotas and limits](limits-quotas-constraints-reference.md).
-* Content key policies are updatable. It can take up to 15 minutes for the key delivery caches to update and pick up the updated policy.
-
- By updating the policy, you are overwriting your existing CDN cache, which could cause playback issues for customers that are using cached content.
-* We recommend that you do not create a new content key policy for each asset. The main benefits of sharing the same content key policy between assets that need the same policy options are:
-
- * It is easier to manage a small number of policies.
- * If you need to make updates to the content key policy, the changes go into effect on all new license requests almost right away.
-* If you do need to create a new policy, you have to create a new streaming locator for the asset.
-* It is recommended to let Media Services autogenerate the content key.
-
- Typically, you would use a long-lived key and check for the existence of the content key policy with [Get](/rest/api/media/contentkeypolicies/get). To get the key, you need to call a separate action method to get secrets or credentials, see the example that follows.
-
-## Example
-
-To get to the key, use `GetPolicyPropertiesWithSecretsAsync`, as shown in the [Get a signing key from the existing policy](drm-get-content-key-policy-how-to.md#get-contentkeypolicy-with-secrets) example.
-
-## Filtering, ordering, paging
-
-See [Filtering, ordering, paging of Media Services entities](filter-order-page-entities-how-to.md).
-
-## Additional notes
-
-* Properties of the Content Key Policies that are of the `Datetime` type are always in UTC format.
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-
-* [Use AES-128 dynamic encryption and the key delivery service](drm-playready-license-template-concept.md)
-* [Use DRM dynamic encryption and license delivery service](drm-protect-with-drm-tutorial.md)
-* [Basic AES Clear Key Encryption and streaming sample code](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/ContentProtection/BasicAESClearKey)
media-services Drm Content Protection Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-content-protection-concept.md
- Title: Protect your content with Media Services v3 dynamic encryption
-description: Learn about content protection with dynamic encryption, streaming protocols, and encryption types in Azure Media Services.
- Previously updated: 05/25/2021
-# Protect your content with Media Services dynamic encryption
--
-Use Azure Media Services to help secure your media from the time it leaves your computer all the way through storage, processing, and delivery. With Media Services, you can deliver your live and on-demand content encrypted dynamically with Advanced Encryption Standard (AES-128) or any of the three major digital rights management (DRM) systems: Microsoft PlayReady, Google Widevine, and Apple FairPlay. Media Services also provides a service for delivering AES keys and DRM (PlayReady, Widevine, and FairPlay) licenses to authorized clients. If content is encrypted with an AES clear key and is sent over HTTPS, it is not in the clear until it reaches the client.
--
-In Media Services v3, a content key is associated with a Streaming Locator (see [this example](drm-playready-license-template-concept.md)). If you're using the Media Services key delivery service, you can let Azure Media Services generate the content key for you. You should generate the content key yourself if you're using your own key delivery service, or if you need to handle a high-availability scenario where you need the same content key in two data centers.
-
-When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content by using AES clear key or DRM encryption. To decrypt the stream, the player requests the key from Media Services key delivery service or the key delivery service you specified. To decide if the user is authorized to get the key, the service evaluates the content key policy that you specified for the key.
-
-You can use the REST API or a Media Services client library to configure authorization and authentication policies for your licenses and keys.
-
-The following image illustrates the workflow for Media Services content protection:
-
-![Workflow for Media Services content protection](./media/content-protection/content-protection.svg)
-
-&#42; *Dynamic encryption supports AES-128 clear key, CBCS, and CENC. For details, see the [support matrix](#streaming-protocols-and-encryption-types).*
-
-This article explains concepts and terminology that help you understand content protection with Media Services.
-
-## Main components of a content protection system
-
-To successfully complete your content protection system, you need to fully understand the scope of the effort. The following sections give an overview of three parts that you need to implement.
-
-> [!NOTE]
-> We highly recommended that you focus and fully test each part in the following sections before you move on to the next part. To test your content protection system, use the tools specified in the sections.
-
-### Media Services code
-
-The [DRM sample](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithDRM/Program.cs) shows you how to implement a multi-DRM system with Media Services v3 by using .NET. It also shows how to use the Media Services license/key delivery service.
-
-You can encrypt each asset with multiple encryption types (AES-128, PlayReady, Widevine, FairPlay). To see what makes sense to combine, see [Streaming protocols and encryption types](#streaming-protocols-and-encryption-types).
-
-The example shows how to:
-
-1. Create and configure a [content key policy](drm-content-key-policy-concept.md).
-
- You create a content key policy to configure how the content key (which provides secure access to your assets) is delivered to end clients:
-
- * Define license delivery authorization. Specify the logic of the authorization check based on claims in JSON Web Token (JWT).
- * Configure [PlayReady](drm-playready-license-template-concept.md), [Widevine](drm-widevine-license-template-concept.md), and/or [FairPlay](drm-fairplay-license-overview.md) licenses. The templates let you configure rights and permissions for each of the DRMs.
-
-    ```csharp
- ContentKeyPolicyPlayReadyConfiguration playReadyConfig = ConfigurePlayReadyLicenseTemplate();
- ContentKeyPolicyWidevineConfiguration widevineConfig = ConfigureWidevineLicenseTempate();
- ContentKeyPolicyFairPlayConfiguration fairPlayConfig = ConfigureFairPlayPolicyOptions();
- ```
-
-2. Create a [streaming locator](stream-streaming-locators-concept.md) that's configured to stream the encrypted asset.
-
- The streaming locator has to be associated with a [streaming policy](stream-streaming-policy-concept.md). In the example, we set `StreamingLocator.StreamingPolicyName` to the "Predefined_MultiDrmCencStreaming" policy.
-
- The PlayReady and Widevine encryptions are applied, and the key is delivered to the playback client based on the configured DRM licenses. If you also want to encrypt your stream with CBCS (FairPlay), use the "Predefined_MultiDrmStreaming" policy.
-
- The streaming locator is also associated with the content key policy that you defined.
-
-3. Create a test token.
-
- The `GetTokenAsync` method shows how to create a test token.
-4. Build the streaming URL.
-
- The `GetDASHStreamingUrlAsync` method shows how to build the streaming URL. In this case, the URL streams the DASH content.
-
-### Player with an AES or DRM client
-
-A video player app based on a player SDK (either native or browser-based) needs to meet the following requirements:
-
-* The player SDK supports the needed DRM clients.
-* The player SDK supports the required streaming protocols: Smooth, DASH, and/or HTTP Live Streaming (HLS).
-* The player SDK can handle passing a JWT token in a license acquisition request.
-
-You can create a player by using the [Azure Media Player API](https://amp.azure.net/libs/amp/latest/docs/). Use the [Azure Media Player ProtectionInfo API](https://amp.azure.net/libs/amp/latest/docs/) to specify which DRM technology to use on different DRM platforms.
-
-For testing AES or CENC (Widevine and/or PlayReady) encrypted content, you can use [Azure Media Player](https://aka.ms/azuremediaplayer). Make sure that you select **Advanced options** and check your encryption options.
-
-If you want to test FairPlay encrypted content, use [this test player](https://aka.ms/amtest). The player supports Widevine, PlayReady, and FairPlay DRMs, along with AES-128 clear key encryption.
-
-Choose the right browser to test different DRMs:
-
-* Chrome, Opera, or Firefox for Widevine.
-* Microsoft Edge or Internet Explorer 11 for PlayReady.
-* Safari on macOS for FairPlay.
-
-### Security token service
-
-A security token service (STS) issues JWT as the access token for back-end resource access. You can use the Azure Media Services license/key delivery service as the back-end resource. An STS has to define the following things:
-
-* Issuer and audience (or scope).
-* Claims, which are dependent on business requirements in content protection.
-* Symmetric or asymmetric verification for signature verification.
-* Key rollover support (if necessary).
-
-You can use [this STS tool](https://openidconnectweb.azurewebsites.net/DRMTool/Jwt) to test the STS. It supports all three types of verification keys: symmetric, asymmetric, or Azure Active Directory (Azure AD) with key rollover.
-
-## Streaming protocols and encryption types
-
-You can use Media Services to deliver your content encrypted dynamically with AES clear key or DRM encryption by using PlayReady, Widevine, or FairPlay. Currently, you can encrypt the HLS, MPEG DASH, and Smooth Streaming formats. Each protocol supports the following encryption methods.
-
-### HLS
-
-The HLS protocol supports the following container formats and encryption schemes:
-
-|Container format|Encryption scheme|URL example|
-||||
-|All|AES|`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(format=m3u8-aapl,encryption=cbc)`|
-|MPG2-TS |CBCS (FairPlay) |`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(format=m3u8-aapl,encryption=cbcs-aapl)`|
-|CMAF(fmp4) |CBCS (FairPlay) |`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(format=m3u8-cmaf,encryption=cbcs-aapl)`|
-|MPG2-TS |CENC (PlayReady) |`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(format=m3u8-aapl,encryption=cenc)`|
-|CMAF(fmp4) |CENC (PlayReady) |`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(format=m3u8-cmaf,encryption=cenc)`|
-
-HLS/CMAF + FairPlay (including HEVC/H.265) is supported on the following devices:
-
-* iOS 11 or later.
-* iPhone 8 or later.
-* macOS High Sierra with Intel 7th Generation CPU.
-
-### MPEG-DASH
-
-The MPEG-DASH protocol supports the following container formats and encryption schemes:
-
-|Container format|Encryption scheme|URL examples|
-||||
-|All|AES|`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(format=mpd-time-csf,encryption=cbc)`|
-|CSF(fmp4) |CENC (Widevine + PlayReady) |`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(format=mpd-time-csf,encryption=cenc)`|
-|CMAF(fmp4)|CENC (Widevine + PlayReady)|`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(format=mpd-time-cmaf,encryption=cenc)`|
-
-### Smooth Streaming
-
-The Smooth Streaming protocol supports the following container formats and encryption schemes.
-
-|Container format|Encryption scheme|URL example|
-||||
-|fMP4|AES|`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(encryption=cbc)`|
-|fMP4 | CENC (PlayReady) |`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(encryption=cenc)`|
-|fMP4 | PIFF 1.1 (PlayReady) |`https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(encryption=piff)`|
-
-> [!NOTE]
-> PIFF 1.1 support is provided as a backward-compatible solution for smart TVs (Samsung, LG) that implemented the early "Silverlight" version of Common Encryption. It is recommended to only use the PIFF format where needed for support of legacy Samsung or LG smart TVs shipped between 2009-2015 that supported the PIFF 1.1 version of PlayReady encryption.
-
-### Browsers
-
-Common browsers support the following DRM clients:
-
-|Browser|Encryption|
-|||
-|Chrome|Widevine|
-|Microsoft Edge, Internet Explorer 11|PlayReady|
-|Firefox|Widevine|
-|Opera|Widevine|
-|Safari|FairPlay|
-
-## Controlling content access
-
-You can control who has access to your content by configuring the content key policy. Media Services supports multiple ways of authorizing users who make key requests. The client (player) must meet the policy before the key can be delivered to the client. The content key policy can have *open* or *token* restriction.
-
-An open-restricted content key policy may be used when you want to issue licenses to anyone without authorization. For example, if your revenue is ad-based and not subscription-based.
-
-With a token-restricted content key policy, the content key is sent only to a client that presents a valid JWT token or a simple web token (SWT) in the license/key request. This token must be issued by an STS.
-
-You can use Azure AD as an STS or deploy a [custom STS](#using-a-custom-sts). The STS must be configured to create a token signed with the specified key and issue claims that you specified in the token restriction configuration. The Media Services license/key delivery service returns the requested license or key to the client if both of these conditions exist:
-
-* The token is valid.
-* The claims in the token match those configured for the license or key.
-
-When you configure the token-restricted policy, you must specify the primary verification key, issuer, and audience parameters. The primary verification key contains the key that the token was signed with. The issuer is the STS that issues the token. The audience, sometimes called scope, describes the intent of the token or the resource that the token authorizes access to. The Media Services license/key delivery service validates that these values in the token match the values in the template.
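-
-As a minimal sketch of configuring such a restriction with the .NET SDK, where the issuer, audience, and signing key are placeholder values:
-
-```csharp
-using System;
-using Microsoft.Azure.Management.Media.Models;
-
-// Restrict key/license delivery to clients presenting a JWT signed with this symmetric key
-byte[] tokenSigningKey = Convert.FromBase64String("<BASE64_SYMMETRIC_KEY>");
-
-ContentKeyPolicyTokenRestriction restriction = new ContentKeyPolicyTokenRestriction
-{
-    Issuer = "https://my-sts.example.com",          // the STS that issues the token
-    Audience = "my-media-audience",                 // the intent/scope of the token
-    PrimaryVerificationKey = new ContentKeyPolicySymmetricTokenKey(tokenSigningKey),
-    RestrictionTokenType = ContentKeyPolicyRestrictionTokenType.Jwt
-};
-```
-
-The restriction is then set on a content key policy option; see the [DRM sample](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithDRM/Program.cs) linked earlier for a complete implementation.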
-
-### Token replay prevention
-
-The *Token Replay Prevention* feature allows Media Services customers to set a limit on how many times the same token can be used to request a key or a license. The customer can add a claim of type `urn:microsoft:azure:mediaservices:maxuses` to the token, where the claim indicates the maximum number of times the token can be used to acquire a key or license. A sample token payload is sketched after the considerations below.
-
-#### Considerations
-
-* Customers must have control over token generation. The claim needs to be placed in the token itself.
-* When using this feature, requests with tokens whose expiry time is more than one hour away from the time the request is received are rejected with an unauthorized response.
-* Tokens are uniquely identified by their signature. Any change to the payload (for example, update to the expiry time or the claim) changes the signature of the token and it will count as a new token that Key Delivery hasn't come across before.
-* Playback fails if the token has exceeded the `maxuses` value set by the customer.
-* This feature can be used for all existing protected content (only the token issued needs to be changed).
-* This feature works with both JWT and SWT.
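-
-For illustration only, a decoded JWT payload carrying this claim might look like the following sketch (issuer, audience, and expiry values are placeholders):
-
-```json
-{
-  "iss": "https://my-sts.example.com",
-  "aud": "my-media-audience",
-  "exp": 1735689600,
-  "urn:microsoft:azure:mediaservices:maxuses": 2
-}
-```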
-
-## Using a custom STS
-
-A customer might choose to use a custom STS to provide tokens. Reasons include:
-
-* The identity provider (IDP) used by the customer doesn't support STS. In this case, a custom STS might be an option.
-* The customer might need more flexible or tighter control to integrate STS with the customer's subscriber billing system.
-
- For example, an [OTT](https://en.wikipedia.org/wiki/Over-the-top_media_services) service operator might offer multiple subscriber packages, such as premium, basic, and sports. The operator might want to match the claims in a token with a subscriber's package so that only the contents in a specific package are made available. In this case, a custom STS provides the needed flexibility and control.
-
-* To include custom claims in the token to select between different ContentKeyPolicyOptions with different DRM license parameters (a subscription license versus a rental license).
-* To include a claim representing the content key identifier of the key that the token grants access to.
-
-When you use a custom STS, two changes must be made:
-
-* When you configure license delivery service for an asset, you need to specify the security key used for verification by the custom STS instead of the current key from Azure AD.
-* When a JWT is generated, a security key is specified instead of the private key of the current X509 certificate in Azure AD.
-
-There are two types of security keys:
-
-* Symmetric key: The same key is used to generate and to verify a JWT.
-* Asymmetric key: A public-private key pair in an X509 certificate is used with a private key to encrypt/generate a JWT and with the public key to verify the token.
-
-If you use .NET Framework/C# as your development platform, the X509 certificate used for an asymmetric security key must have a key length of at least 2048. This key length is a requirement of the class System.IdentityModel.Tokens.X509AsymmetricSecurityKey in .NET Framework. Otherwise, the following exception is thrown: IDX10630: The 'System.IdentityModel.Tokens.X509AsymmetricSecurityKey' for signing can't be smaller than '2048' bits.
-
-## Custom key and license acquisition URL
-
-Use the following templates if you want to specify a different license/key delivery service (not Media Services). The two replaceable fields in the templates are there so that you can share your streaming policy across many assets instead of creating a streaming policy per asset.
-
-* `EnvelopeEncryption.CustomKeyAcquisitionUrlTemplate`: Template for the URL of the custom service that delivers keys to end-user players. It isn't required when you're using Azure Media Services for issuing keys.
-
- The template supports replaceable tokens that the service will update at runtime with the value specific to the request. The currently supported token values are:
- * `{AlternativeMediaId}`, which is replaced with the value of StreamingLocatorId.AlternativeMediaId.
- * `{ContentKeyId}`, which is replaced with the value of the identifier of the requested key.
-* `StreamingPolicyPlayReadyConfiguration.CustomLicenseAcquisitionUrlTemplate`: Template for the URL of the custom service that delivers licenses to end-user players. It isn't required when you're using Azure Media Services for issuing licenses.
-
- The template supports replaceable tokens that the service will update at runtime with the value specific to the request. The currently supported token values are:
- * `{AlternativeMediaId}`, which is replaced with the value of StreamingLocatorId.AlternativeMediaId.
- * `{ContentKeyId}`, which is replaced with the value of the identifier of the requested key.
-* `StreamingPolicyWidevineConfiguration.CustomLicenseAcquisitionUrlTemplate`: Same as the previous template, only for Widevine.
-* `StreamingPolicyFairPlayConfiguration.CustomLicenseAcquisitionUrlTemplate`: Same as the previous template, only for FairPlay.
-
-For example:
-
-```csharp
-streamingPolicy.EnvelopeEncryption.CustomKeyAcquisitionUrlTemplate = "https://mykeyserver.hostname.com/envelopekey/{AlternativeMediaId}/{ContentKeyId}";
-```
-
-`ContentKeyId` has a value of the requested key. You can use `AlternativeMediaId` if you want to map the request to an entity on your side. For example, `AlternativeMediaId` can be used to help you look up permissions.
-
-For REST examples that use custom license/key acquisition URLs, see [Streaming Policies - Create](/rest/api/media/streamingpolicies/create).
-
-> [!NOTE]
-> Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
---
-## Troubleshoot
-
-If you get the `MPE_ENC_ENCRYPTION_NOT_SET_IN_DELIVERY_POLICY` error, make sure that you specify the appropriate streaming policy.
-
-If you get errors that end with `_NOT_SPECIFIED_IN_URL`, make sure that you specify the encryption format in the URL. An example is `…/manifest(format=m3u8-cmaf,encryption=cbcs-aapl)`. See [Streaming protocols and encryption types](#streaming-protocols-and-encryption-types).
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
media-services Drm Content Protection Key Delivery Ip Allow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-content-protection-key-delivery-ip-allow.md
- Title: Restrict access to DRM license and AES key delivery using IP allowlists
-description: Learn how to restrict access to DRM and AES Keys by using IP allowlists.
- Previously updated: 06/11/2021
-# Restrict access to DRM license and AES key delivery using IP allowlists
--
-When securing media with the [content protection](./drm-content-protection-concept.md) and DRM features of Media Services, you could encounter scenarios where you need to limit the delivery of licenses or key requests to a specific IP range of client devices on your network. To restrict content playback and delivery of keys, you can use the IP allowlist for Key Delivery.
-
-In addition, you can also use the allowlist to completely block all public internet access to Key Delivery traffic and only allow traffic from your private network endpoints.
-
-The IP allowlist for Key Delivery restricts the delivery of both DRM licenses and AES-128 keys to clients within the supplied IP allowlist range.
-
-## Setting the allowlist for key delivery
-
-The settings for the Key Delivery IP allowlist are on the Media Services account resource. When creating a new Media Services account, you can restrict the allowed IP ranges through the **KeyDelivery** property on the [Media Services account resource](/rest/api/media/mediaservices/create-or-update).
-
-The **defaultAction** property can be set to "Allow" or "Deny" to control delivery of licenses and keys to clients in the allowlist range.
-
-The **ipAllowList** property is an array of single IPv4 addresses and/or IPv4 ranges using [CIDR notation](https://en.wikipedia.org/wiki/Classless_Inter-Domain_Routing#CIDR_notation).
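-
-A hedged sketch of how these properties might appear in the account resource JSON (shape per the REST reference linked above; the addresses are placeholders):
-
-```json
-"keyDelivery": {
-  "accessControl": {
-    "defaultAction": "Allow",
-    "ipAllowList": [
-      "192.168.1.10",
-      "203.0.113.0/24"
-    ]
-  }
-}
-```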
-
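-For illustration, a hedged .NET sketch of this configuration follows. It assumes the Media Services .NET SDK model names (`KeyDelivery`, `AccessControl`, `DefaultAction`), which may vary by SDK version, and uses placeholder resource names and IP ranges:
-
-```csharp
-// A sketch: deny key/license delivery by default, then allow only the listed ranges.
-MediaService account = await client.Mediaservices.GetAsync("myResourceGroup", "myMediaServicesAccount");
-
-account.KeyDelivery = new KeyDelivery
-{
-    AccessControl = new AccessControl
-    {
-        DefaultAction = DefaultAction.Deny,
-        IPAllowList = new List<string> { "203.0.113.0/24", "198.51.100.7" } // placeholder ranges
-    }
-};
-
-await client.Mediaservices.CreateOrUpdateAsync("myResourceGroup", "myMediaServicesAccount", account);
-```
-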
-## Setting the allowlist in the portal
-
-The Azure portal provides a method for configuring and updating the IP allowlist for key delivery. Navigate to your Media Services account and access the **Key delivery** menu under **Settings**.
-
-## Next steps
-
-- [Create an account](./account-create-how-to.md)
-- [DRM and content protection overview](./drm-content-protection-concept.md)
media-services Drm Create Content Key Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-create-content-key-policy-how-to.md
- Title: Create a content key policy
-description: This article shows how to create a content key policy.
- Previously updated: 03/10/2022
-# Create a content key policy
--
-## Methods
-
-Use the following methods to create a content key policy.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
---
media-services Drm Delete Content Key Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-delete-content-key-policy-how-to.md
- Title: Delete a content key policy
-description: This article shows how to delete a content key policy.
- Previously updated: 03/10/2022
-# Delete a content key policy
--
-## Methods
-
-Use the following methods to delete a content key policy.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
---
media-services Drm Encrypt Content How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-encrypt-content-how-to.md
- Title: Encrypt content with the Azure portal
-description: This quickstart shows you how to configure encryption for your content using Azure Media Services in the Azure portal.
- Previously updated: 08/31/2020
-# Quickstart: Use portal to encrypt content
--
-Use Azure Media Services to help secure your media from the time it leaves your computer all the way through storage, processing, and delivery. With Media Services, you can deliver your live and on-demand content encrypted dynamically with Advanced Encryption Standard (AES-128) or any of the three major digital rights management (DRM) systems: Microsoft PlayReady, Google Widevine, and Apple FairPlay. Media Services also provides a service for delivering AES keys and DRM (PlayReady, Widevine, and FairPlay) licenses to authorized clients.
-
-To specify encryption options (if any) on your stream, you use a **streaming policy** and associate it with your streaming locator. You create the **content key policy** to configure how the content key (that provides secure access to your **assets**) is delivered to end clients. You need to set the requirements (restrictions) on the content key policy that must be met in order for keys with the specified configuration to be delivered to clients.
-
-> [!NOTE]
-> The content key policy is not needed for clear streaming or downloading.
-
-When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content by using AES clear key or DRM encryption. To decrypt the stream, the player requests the key from Media Services key delivery service or the key delivery service you specified. To decide if the user is authorized to get the key, the service evaluates the **content key policy** that you specified for the key.
-
-This quickstart shows you how to create a content key policy where you specify what encryption should be applied to your asset when it is streamed. The quickstart also shows how to set the configured encryption on your asset.
-
-### Suggested pre-reading
-
-* [Dynamic encryption and key delivery](drm-content-protection-concept.md)
-* [Streaming locators](stream-streaming-locators-concept.md)
-* [Streaming policies](stream-streaming-policy-concept.md)
-* [Content key policies](drm-content-key-policy-concept.md)
-
-## Prerequisites
-
-Upload and process your content as described in [Manage assets in the Azure portal](asset-create-asset-upload-portal-quickstart.md).
-
-## Create a content key policy
-
-Create the **content key policy** to configure how the content key (that provides secure access to your **assets**) is delivered to end clients.
-
-1. Sign in at the [Azure portal](https://portal.azure.com/).
-1. Locate and click on your Media Services account.
-1. Select **Content key policies (new)**.
-1. Press **+ Add content key policy** at the top of the window.
-
-The **Create a content key policy** window appears. In this window, you choose encryption options. You can choose to protect your media by choosing digital rights management (DRM), the advanced encryption standard (AES), or both.
-
-![Create a content key policy](./media/drm-encrypt-content-how-to/create-content-key-policy.png)
-
-Whether you choose one of the DRM options or the AES-128 clear key option, you'll be prompted to specify how you want to configure restrictions. You can choose an open or token restriction. For a detailed explanation, see [Controlling content access](drm-content-protection-concept.md#controlling-content-access).
-
-### Add a DRM content key
-
-You can choose to protect your content with Microsoft PlayReady and/or Google Widevine, or with Apple FairPlay. Each license delivery type verifies the content keys based on your credentials and delivers them in an encrypted format.
-
-#### License templates
-
-For details about license templates, see:
-
-* [Google Widevine license template](drm-widevine-license-template-concept.md)
-
- > [!NOTE]
- > You can create an empty license template with no values, just "{}". A license template is then created with defaults. The defaults work for most cases.
-* [Apple FairPlay license requirements and configuration](drm-fairplay-license-overview.md)
-* [PlayReady license template](drm-playready-license-template-concept.md)
-
-### Add AES clear key
-
-You can also add an AES-128 clear key encryption to your content. The content key is transmitted to the client in an unencrypted format.
-
-![AES clear key](./media/drm-encrypt-content-how-to/aes-clear-key-policy.png)
-
-## Create a streaming locator for your asset
-
-1. Locate and click on your Media Services account.
-1. Select **Assets (new)**.
-1. From the list of assets, select the one you want to encrypt.
-1. In the **Streaming locator** section for the selected asset, press **+ Add a streaming locator**.
-1. Select a **streaming policy** that is appropriate for the **content key policy** that you configured.
-
- The [Streaming policies](stream-streaming-policy-concept.md) topic gives details on what streaming policy matches what content key policy.
-1. Once you select the appropriate streaming policy, you can select the content key policy from the drop-down list.
-1. Press **Add** to add the streaming locator to your asset.
-
- This publishes the asset and generates the streaming URLs.
-
-![A streaming locator](./media/drm-encrypt-content-how-to/multi-drm.png)
-
-## Clean up resources
-
-If you intend to try the other quickstarts, you should hold on to the resources created. Otherwise, go to the Azure portal, browse to your resource groups, select the resource group under which you ran this quickstart, and delete all the resources.
----
-## Next steps
-
-[Manage assets](asset-create-asset-upload-portal-quickstart.md)
media-services Drm Fairplay License Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-fairplay-license-overview.md
- Title: Media Services Apple FairPlay license support
-description: This topic gives an overview of an Apple FairPlay license requirements and configuration.
- Previously updated: 08/31/2020
-# Apple FairPlay license requirements and configuration
--
-Azure Media Services enables you to encrypt your HLS content with **Apple FairPlay** (AES-128 CBC). Media Services also provides a service for delivering FairPlay licenses. When a player tries to play your FairPlay-protected content, a request is sent to the license delivery service to obtain a license. If the license service approves the request, it issues the license that is sent to the client and is used to decrypt and play the specified content.
-
-Media Services also provides APIs that you can use to configure your FairPlay licenses. This topic discusses FairPlay license requirements and demonstrates how you can configure a **FairPlay** license using Media Services APIs.
-
-## Requirements
-
-The following are required when using Media Services to encrypt your HLS content with **Apple FairPlay** and use Media Services to deliver FairPlay licenses:
-
-* Sign up for the [Apple Development Program](https://developer.apple.com/).
-* Apple requires the content owner to obtain the [deployment package](https://developer.apple.com/contact/fps/). State that you already implemented the Key Security Module (KSM) with Media Services, and that you are requesting the final FPS package. The final FPS package contains instructions for generating the certificate and obtaining the Application Secret Key (ASK). You use the ASK to configure FairPlay.
-* The following things must be set on Media Services key/license delivery side:
-
- * **App Cert (AC)**: This is a .pfx file that contains the private key. You create this file and encrypt it with a password. The .pfx file should be in Base64 format.
-
- The following steps describe how to generate a .pfx certificate file for FairPlay:
-
- 1. Install OpenSSL from https://slproweb.com/products/Win32OpenSSL.html.
-
- Go to the folder where the FairPlay certificate and the other files delivered by Apple are located.
- 2. Run the following command from the command line. This converts the .cer file to a .pem file.
-
- "C:\OpenSSL-Win32\bin\openssl.exe" x509 -inform der -in FairPlay.cer -out FairPlay-out.pem
- 3. Run the following command from the command line. This converts the .pem file to a .pfx file with the private key. OpenSSL then prompts you for the .pfx file's password.
-
- "C:\OpenSSL-Win32\bin\openssl.exe" pkcs12 -export -out FairPlay-out.pfx -inkey privatekey.pem -in FairPlay-out.pem -passin file:privatekey-pem-pass.txt
-
- * **App Cert password**: The password for creating the .pfx file.
- * **ASK**: You receive this key when you generate the certificate by using the Apple Developer portal. Each development team receives a unique ASK. Save a copy of the ASK, and store it in a safe place. You need to configure the ASK as FairPlayAsk with Media Services.
-
-* The following things must be set by the FPS client side:
-
- * **App Cert (AC)**: This is a .cer/.der file that contains the public key, which the operating system uses to encrypt some payload. Media Services needs to know about it because it is required by the player. The key delivery service decrypts it using the corresponding private key.
-
-* To play back a FairPlay encrypted stream, get a real ASK first, and then generate a real certificate. That process creates all three parts:
-
- * .der file
- * .pfx file
- * password for the .pfx
-
-> [!NOTE]
-> Azure Media Services doesn't check the certificate expiration date during packaging or key delivery. It will continue to work after the certificate expires.
-
-## FairPlay and player apps
-
-When your content is encrypted with **Apple FairPlay**, the individual video and audio samples are encrypted by using the **AES-128 CBC** mode. **FairPlay Streaming** (FPS) is integrated into the device operating systems, with native support on iOS and Apple TV. Safari on OS X enables FPS by using the Encrypted Media Extensions (EME) interface support.
-
-Azure Media Player also supports FairPlay playback. For more information, see [Azure Media Player documentation](https://amp.azure.net/libs/amp/latest/docs/https://docsupdatetracker.net/index.html).
-
-You can develop your own player apps by using the iOS SDK. To be able to play FairPlay content, you have to implement the license exchange protocol. This protocol is not specified by Apple; it is up to each app how to send key delivery requests. The Media Services FairPlay key delivery service expects the SPC to come as an `application/x-www-form-urlencoded` POST message, in the following form:
-
-```
-spc=<Base64 encoded SPC>
-```
-
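-For illustration, a hedged C# sketch of issuing that POST from a test client follows; the license URL and the SPC bytes are placeholder assumptions:
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.Net.Http;
-using System.Threading.Tasks;
-
-public static class FairPlayLicenseClient
-{
-    // Posts a Base64-encoded SPC as application/x-www-form-urlencoded and
-    // returns the raw response body (the CKC) from the key delivery service.
-    public static async Task<string> RequestLicenseAsync(string licenseUrl, byte[] spcBytes)
-    {
-        using var http = new HttpClient();
-        var form = new FormUrlEncodedContent(new Dictionary<string, string>
-        {
-            ["spc"] = Convert.ToBase64String(spcBytes)
-        });
-
-        HttpResponseMessage response = await http.PostAsync(licenseUrl, form);
-        response.EnsureSuccessStatusCode();
-        return await response.Content.ReadAsStringAsync();
-    }
-}
-```
-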
-## FairPlay configuration .NET example
-
-You can use Media Services API to configure FairPlay licenses. When the player tries to play your FairPlay-protected content, a request is sent to the license delivery service to obtain the license. If the license service approves the request, the service issues the license. It's sent to the client and is used to decrypt and play the specified content.
-
-> [!NOTE]
-> Usually, you configure FairPlay policy options only once, because you'll only have one set of a certificate and an ASK.
-
-The following example uses [Media Services .NET SDK](/dotnet/api/microsoft.azure.management.media.models) to configure the license.
-
-```csharp
-private static ContentKeyPolicyFairPlayConfiguration ConfigureFairPlayPolicyOptions()
-{
-
-    // Hex-encoded Application Secret Key (ASK) received from Apple
-    string askHex = "";
-    // Password protecting the FairPlay .pfx certificate file
-    string FairPlayPfxPassword = "";
-
-    // Load the FairPlay application certificate (.pfx)
-    var appCert = new X509Certificate2("FairPlayPfxPath", FairPlayPfxPassword, X509KeyStorageFlags.Exportable);
-
-    // Convert the ASK hex string into its byte representation
-    byte[] askBytes = Enumerable
-        .Range(0, askHex.Length)
-        .Where(x => x % 2 == 0)
-        .Select(x => Convert.ToByte(askHex.Substring(x, 2), 16))
-        .ToArray();
-
- ContentKeyPolicyFairPlayConfiguration fairPlayConfiguration =
- new ContentKeyPolicyFairPlayConfiguration
- {
- Ask = askBytes,
- FairPlayPfx =
- Convert.ToBase64String(appCert.Export(X509ContentType.Pfx, FairPlayPfxPassword)),
- FairPlayPfxPassword = FairPlayPfxPassword,
- RentalAndLeaseKeyType =
- ContentKeyPolicyFairPlayRentalAndLeaseKeyType
- .PersistentUnlimited,
- RentalDuration = 2249 // in seconds
- };
-
- return fairPlayConfiguration;
-}
-```
-
-## Next steps
-
-Check out how to [protect with DRM](drm-protect-with-drm-tutorial.md)
media-services Drm Get Content Key Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-get-content-key-policy-how-to.md
- Title: Get a signing key from a policy
-description: This topic shows how to get a signing key from the existing policy using Media Services v3.
- Previously updated: 03/09/2022
-# Get a signing key from the existing policy
--
-One of the key design principles of the v3 API is to make the API more secure. v3 APIs do not return secrets or credentials on **Get** or **List** operations. For a detailed explanation, see [Azure RBAC and Media Services accounts](security-rbac-concept.md).
-
-The example in this article shows how to get a signing key from the existing policy.
-
-## Download
-
-Clone a GitHub repository that contains the full .NET sample to your machine using the following command:
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials.git
- ```
-
-The ContentKeyPolicy with secrets example is located in the [EncryptWithDRM](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/tree/main/AMSV3Tutorials/EncryptWithDRM) folder.
-
-## [.NET](#tab/net/)
-
-## Get ContentKeyPolicy with secrets
-
-To get the key, use **GetPolicyPropertiesWithSecretsAsync**, as shown in the example below.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#GetOrCreateContentKeyPolicy)]
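-
-If you prefer to see the call in isolation, here's a minimal sketch, assuming an authenticated `IAzureMediaServicesClient` named `client` and placeholder resource names:
-
-```csharp
-// Unlike Get/List, this operation returns the policy options with their secrets.
-ContentKeyPolicyProperties policyProperties =
-    await client.ContentKeyPolicies.GetPolicyPropertiesWithSecretsAsync(
-        "myResourceGroup", "myMediaServicesAccount", "myContentKeyPolicy");
-
-foreach (ContentKeyPolicyOption option in policyProperties.Options)
-{
-    // A token restriction carries the verification (signing) key.
-    if (option.Restriction is ContentKeyPolicyTokenRestriction tokenRestriction &&
-        tokenRestriction.PrimaryVerificationKey is ContentKeyPolicySymmetricTokenKey symmetricKey)
-    {
-        byte[] tokenSigningKey = symmetricKey.KeyValue;
-    }
-}
-```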
--
media-services Drm List Content Key Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-list-content-key-policy-how-to.md
- Title: List the content key policies
-description: This article shows how to list the content key policies.
- Previously updated: 03/10/2022
-# List the content key policies
--
-## Methods
-
-Use the following methods to list the content key policies.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
---
media-services Drm Offline Fairplay For Ios Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-offline-fairplay-for-ios-concept.md
- Title: Media Services v3 offline FairPlay Streaming for iOS
-description: This topic gives an overview and shows how to use Azure Media Services v3 to dynamically encrypt your HTTP Live Streaming (HLS) content with Apple FairPlay in offline mode.
- Previously updated: 03/09/2022
-# Offline FairPlay Streaming for iOS with Media Services v3
--
- Azure Media Services provides a set of well-designed [content protection services](https://azure.microsoft.com/services/media-services/content-protection/) that cover:
-
-- Microsoft PlayReady
-- Google Widevine
-
- Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-- Apple FairPlay
-- AES-128 encryption
-
-Digital rights management (DRM)/Advanced Encryption Standard (AES) encryption of content is performed dynamically upon request for various streaming protocols. DRM license/AES decryption key delivery services also are provided by Media Services.
-
-Besides protecting content for online streaming over various streaming protocols, offline mode for protected content is also an often-requested feature. Offline-mode support is needed for the following scenarios:
-
-* Playback when internet connection isn't available, such as during travel.
-* Some content providers might disallow DRM license delivery beyond a country/region's border. If users want to watch content while traveling outside of the country/region, offline download is needed.
-* In some countries/regions, internet availability and/or bandwidth is still limited. Users might choose to download first to be able to watch content in a resolution that is high enough for a satisfactory viewing experience. In this case, the issue typically isn't network availability but limited network bandwidth. Over-the-top (OTT)/online video platform (OVP) providers request offline-mode support.
-
-This article covers FairPlay Streaming (FPS) offline-mode support that targets devices running iOS 10 or later. This feature isn't supported for other Apple platforms, such as watchOS, tvOS, or Safari on macOS.
-
-> [!NOTE]
-> Offline DRM is only billed for making a single request for a license when you download the content. Any errors are not billed.
-
-## Prerequisites
-
-Before you implement offline DRM for FairPlay on an iOS 10+ device:
-
-* Review online content protection for FairPlay:
-
- - [Apple FairPlay license requirements and configuration](drm-fairplay-license-overview.md)
- - [Use DRM dynamic encryption and license delivery service](drm-protect-with-drm-tutorial.md)
- - A .NET sample that includes configuration of online FPS streaming: [ConfigureFairPlayPolicyOptions](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithDRM/Program.cs#L493)
-* Obtain the FPS SDK from the Apple Developer Network. The FPS SDK contains two components:
-
- - The FPS Server SDK, which contains the Key Security Module (KSM), client samples, a specification, and a set of test vectors.
- - The FPS Deployment Pack, which contains the D function specification, along with instructions about how to generate the FPS Certificate, customer-specific private key, and Application Secret Key. Apple issues the FPS Deployment Pack only to licensed content providers.
-* Clone https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials.git.
-
- You will need to modify the code in [Encrypt with DRM using .NET](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/tree/main/AMSV3Tutorials/EncryptWithDRM) to add FairPlay configurations.
-
-## [.NET](#tab/net/)
-
-## Configure content protection in Azure Media Services
-
-In the [GetOrCreateContentKeyPolicyAsync](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithDRM/Program.cs#L192) method, do the following:
-
-Uncomment the code that configures the FairPlay policy option:
-
-```csharp
-ContentKeyPolicyFairPlayConfiguration fairplayConfig = ConfigureFairPlayPolicyOptions();
-```
-
-Also, uncomment the code that adds the CBCS ContentKeyPolicyOption into the list of ContentKeyPolicyOptions:
-
-```csharp
-options.Add(
- new ContentKeyPolicyOption()
- {
- Configuration = fairplayConfig,
- Restriction = restriction,
- Name = "ContentKeyPolicyOption_CBCS"
- });
-```
-
-## Enable offline mode
-
-To enable offline mode, create a custom StreamingPolicy and use its name when creating a StreamingLocator in [CreateStreamingLocatorAsync](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithDRM/Program.cs#L538).
-
-```csharp
-CommonEncryptionCbcs objStreamingPolicyInput = new CommonEncryptionCbcs()
-{
- Drm = new CbcsDrmConfiguration()
- {
- FairPlay = new StreamingPolicyFairPlayConfiguration()
- {
- AllowPersistentLicense = true // This enables offline mode
- }
- },
- EnabledProtocols = new EnabledProtocols()
- {
- Hls = true,
- Dash = true // Even though DASH under CBCS is not supported for either CSF or CMAF, HLS-CMAF-CBCS uses DASH-CBCS fragments in its HLS playlist
- },
-
- ContentKeys = new StreamingPolicyContentKeys()
- {
- // Default key must be specified if keyToTrackMappings is present
- DefaultKey = new DefaultKey()
- {
- Label = "CBCS_DefaultKeyLabel"
- }
- }
-};
-
-```
-
-Now your Media Services account is configured to deliver offline FairPlay licenses.
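-
-For reference, a hedged sketch of saving that custom policy and referencing it from a new streaming locator, assuming an authenticated `client` and placeholder names:
-
-```csharp
-// Persist the custom streaming policy, then point a streaming locator at it.
-StreamingPolicy streamingPolicy = await client.StreamingPolicies.CreateAsync(
-    "myResourceGroup", "myMediaServicesAccount", "myOfflineFairPlayPolicy",
-    new StreamingPolicy
-    {
-        CommonEncryptionCbcs = objStreamingPolicyInput,
-        DefaultContentKeyPolicyName = "myFairPlayContentKeyPolicy" // placeholder policy name
-    });
-
-StreamingLocator locator = await client.StreamingLocators.CreateAsync(
-    "myResourceGroup", "myMediaServicesAccount", "myStreamingLocator",
-    new StreamingLocator
-    {
-        AssetName = "myAsset",
-        StreamingPolicyName = streamingPolicy.Name
-    });
-```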
-
-## Sample iOS Player
-
-FPS offline-mode support is available only on iOS 10 and later. The FPS Server SDK (version 3.0 or later) contains the document and sample for FPS offline mode.
-Specifically, it contains the following two items related to offline mode:
-
-* Document: "Offline Playback with FairPlay Streaming and HTTP Live Streaming." Apple, September 14, 2016. In FPS Server SDK version 4.0, this document is merged into the main FPS document.
-* Sample code: the HLSCatalog sample (part of Apple's FPS Server SDK) for FPS offline mode, in \FairPlay Streaming Server SDK version 3.1\Development\Client\HLSCatalog_With_FPS\HLSCatalog\.
-
-In the HLSCatalog sample app, the following code files are used to implement offline-mode features:
-
- - AssetPersistenceManager.swift code file: AssetPersistenceManager is the main class in this sample that demonstrates how to:
-
- - Manage downloading HLS streams, such as the APIs used to start and cancel downloads and to delete existing assets off devices.
- - Monitor the download progress.
- - AssetListTableViewController.swift and AssetListTableViewCell.swift code files: AssetListTableViewController is the main interface of this sample. It provides a list of assets the sample can use to play, download, delete, or cancel a download.
-
-These steps show how to set up a running iOS player. Assuming you start from the HLSCatalog sample in FPS Server SDK version 4.0.1, make the following code changes:
-
-In HLSCatalog\Shared\Managers\ContentKeyDelegate.swift, implement the method `requestContentKeyFromKeySecurityModule(spcData: Data, assetID: String)` by using the following code. Let `drmUrl` be a variable assigned to the FairPlay license acquisition URL.
-
-```swift
- var ckcData: Data? = nil
-
- let semaphore = DispatchSemaphore(value: 0)
- let postString = "spc=\(spcData.base64EncodedString())&assetId=\(assetIDString)"
-
- if let postData = postString.data(using: .ascii, allowLossyConversion: true), let drmServerUrl = URL(string: self.drmUrl) {
- var request = URLRequest(url: drmServerUrl)
- request.httpMethod = "POST"
- request.setValue(String(postData.count), forHTTPHeaderField: "Content-Length")
- request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
- request.httpBody = postData
-
- URLSession.shared.dataTask(with: request) { (data, _, error) in
- if let data = data, var responseString = String(data: data, encoding: .utf8) {
- responseString = responseString.replacingOccurrences(of: "<ckc>", with: "").replacingOccurrences(of: "</ckc>", with: "")
- ckcData = Data(base64Encoded: responseString)
- } else {
- print("Error encountered while fetching FairPlay license for URL: \(self.drmUrl), \(error?.localizedDescription ?? "Unknown error")")
- }
-
- semaphore.signal()
- }.resume()
- } else {
- fatalError("Invalid post data")
- }
-
- semaphore.wait()
- return ckcData
-```
-
-In HLSCatalog\Shared\Managers\ContentKeyDelegate.swift, implement the method `requestApplicationCertificate()`. This implementation depends on whether you embed the certificate (public key only) with the device or host the certificate on the web. The following implementation uses the hosted application certificate used in the test samples. Let "certUrl" be a variable that contains the URL of the application certificate.
-
-```swift
-func requestApplicationCertificate() throws -> Data {
-
-    var applicationCertificate: Data? = nil
-    do {
-        applicationCertificate = try Data(contentsOf: URL(string: certUrl)!)
-    } catch {
-        print("Error loading FairPlay application certificate: \(error)")
-        throw error
-    }
-
-    // applicationCertificate is guaranteed non-nil here; a failed load throws above.
-    return applicationCertificate!
-}
-```
-
-For the final integrated test, both the video URL and the application certificate URL are provided in the section "Integrated Test."
-
-In HLSCatalog\Shared\Resources\Streams.plist, add your test video URL. For the content key ID, use the FairPlay license acquisition URL with the skd protocol as the unique value.
-
-![Offline FairPlay iOS App Streams](media/drm-offline-fairplay-for-ios-concept/offline-fairplay-ios-app-streams.png)
-
-Use your own test video URL, FairPlay license acquisition URL, and application certificate URL, if you have them set up. Or you can continue to the next section, which contains test samples.
-
-## Integrated test
-
-Three test samples in Media Services cover the following three scenarios:
-
-* FPS protected, with video, audio, and alternate audio track
-* FPS protected, with video and audio, but no alternate audio track
-* FPS protected, with video only and no audio
-
-You can find these samples at [this demo site](https://aka.ms/poc#22), with the corresponding application certificate hosted in an Azure web app.
-With either the version 3 or version 4 sample of the FPS Server SDK, if a master playlist contains alternate audio, offline playback is audio only. Therefore, you need to strip the alternate audio track. In other words, the second and third samples listed previously work in both online and offline mode. The first sample plays audio only in offline mode, while online streaming works properly.
---
-## Offline FairPlay questions
-
-See [offline FairPlay questions in the FAQ](frequently-asked-questions.yml).
media-services Drm Offline Playready Streaming For Windows 10 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-offline-playready-streaming-for-windows-10.md
- Title: Configure offline PlayReady streaming
-description: This article shows how to configure your Azure Media Services v3 account for streaming PlayReady for Windows 10 offline.
-
-keywords: DASH, DRM, Widevine Offline Mode, ExoPlayer, Android
- Previously updated: 03/09/2022
-# Offline PlayReady Streaming for Windows 10 with Media Services v3
--
-Azure Media Services supports offline download/playback with DRM protection. This article covers offline support of Azure Media Services for Windows 10/PlayReady clients. You can read about offline-mode support for iOS/FairPlay and Android/Widevine devices in the following articles:
-
-- [Offline FairPlay Streaming for iOS](drm-offline-fairplay-for-ios-concept.md)
-- [Offline Widevine Streaming for Android](drm-offline-widevine-for-android.md)
-
-> [!NOTE]
-> Offline DRM is only billed for making a single request for a license when you download the content. Any errors are not billed.
-
-## Background on offline mode playback
-
-This section gives some background on offline-mode playback, especially why it's needed:
-
-* In some countries/regions, internet availability and/or bandwidth is still limited. Users may choose to download first so they can watch content in a resolution high enough for a satisfactory viewing experience. In this case, the issue more often isn't network availability but limited network bandwidth. OTT/OVP providers are asking for offline-mode support.
-* As disclosed at the Netflix 2016 Q3 shareholder conference, downloading content is an "oft-requested feature," and "we are open to it," said Reed Hastings, Netflix CEO.
-* Some content providers may disallow DRM license delivery beyond a country/region's border. If a user needs to travel abroad and still wants to watch content, offline download is needed.
-
-The challenge we face in implementing offline mode is the following:
-
-* MP4 is supported by many players and encoder tools, but there is no binding between the MP4 container and DRM;
-* In the long term, CFF with CENC is the way to go. However, today, the tools/player support ecosystem is not there yet. We need a solution today.
-
-The idea is this: the smooth streaming ([PIFF](/iis/media/smooth-streaming/protected-interoperable-file-format)) file format with H264/AAC has a binding with PlayReady (AES-128 CTR). An individual smooth streaming .ismv file (assuming audio is muxed in video) is itself an fMP4 and can be used for playback. If smooth streaming content goes through PlayReady encryption, each .ismv file becomes a PlayReady-protected fragmented MP4. We can choose an .ismv file with the preferred bitrate and rename it to .mp4 for download.
-
-There are two options for hosting the PlayReady protected MP4 for progressive download:
-
-* One can put this MP4 in the same container/media service asset and leverage Azure Media Services streaming endpoint for progressive download;
-* One can use SAS locator for progressive download directly from Azure Storage, bypassing Azure Media Services.
-
-You can use two types of PlayReady license delivery:
-
-* PlayReady license delivery service in Azure Media Services;
-* PlayReady license servers hosted anywhere.
-
-Below are two sets of test assets. The first uses PlayReady license delivery in AMS, while the second uses a PlayReady license server hosted on an Azure VM:
-
-## Asset #1
-
-* Progressive download URL: [https://willzhanmswest.streaming.mediaservices.windows.net/8d078cf8-d621-406c-84ca-88e6b9454acc/20150807-bridges-2500_H264_1644kbps_AAC_und_ch2_256kbps.mp4](https://willzhanmswest.streaming.mediaservices.windows.net/8d078cf8-d621-406c-84ca-88e6b9454acc/20150807-bridges-2500_H264_1644kbps_AAC_und_ch2_256kbps.mp4)
-* PlayReady LA_URL (AMS): `https://willzhanmswest.keydelivery.mediaservices.windows.net/PlayReady/`
-
-## Asset #2
-
-* Progressive download URL: [https://willzhanmswest.streaming.mediaservices.windows.net/7c085a59-ae9a-411e-842c-ef10f96c3f89/20150807-bridges-2500_H264_1644kbps_AAC_und_ch2_256kbps.mp4](https://willzhanmswest.streaming.mediaservices.windows.net/7c085a59-ae9a-411e-842c-ef10f96c3f89/20150807-bridges-2500_H264_1644kbps_AAC_und_ch2_256kbps.mp4)
-* PlayReady LA_URL (on premises): `https://willzhan12.cloudapp.net/playready/rightsmanager.asmx`
-
-For playback testing, we used a Universal Windows Application on Windows 10. In [Windows 10 Universal samples](https://github.com/Microsoft/Windows-universal-samples), there is a basic player sample called [Adaptive Streaming Sample](https://github.com/Microsoft/Windows-universal-samples/tree/master/Samples/AdaptiveStreaming). All we have to do is add code to pick a downloaded video and use it as the source instead of the adaptive streaming source. The changes are in the button click event handler:
-
-## [.NET](#tab/net/)
-
-```csharp
-private async void LoadUri_Click(object sender, RoutedEventArgs e)
-{
- //Uri uri;
- //if (!Uri.TryCreate(UriBox.Text, UriKind.Absolute, out uri))
- //{
- // Log("Malformed Uri in Load text box.");
- // return;
- //}
- //LoadSourceFromUriTask = LoadSourceFromUriAsync(uri);
- //await LoadSourceFromUriTask;
-
- //willzhan change start
- // Create and open the file picker
- FileOpenPicker openPicker = new FileOpenPicker();
- openPicker.ViewMode = PickerViewMode.Thumbnail;
- openPicker.SuggestedStartLocation = PickerLocationId.ComputerFolder;
- openPicker.FileTypeFilter.Add(".mp4");
- openPicker.FileTypeFilter.Add(".ismv");
- //openPicker.FileTypeFilter.Add(".mkv");
- //openPicker.FileTypeFilter.Add(".avi");
-
- StorageFile file = await openPicker.PickSingleFileAsync();
-
- if (file != null)
- {
- //rootPage.NotifyUser("Picked video: " + file.Name, NotifyType.StatusMessage);
- this.mediaPlayerElement.MediaPlayer.Source = MediaSource.CreateFromStorageFile(file);
- this.mediaPlayerElement.MediaPlayer.Play();
- UriBox.Text = file.Path;
- }
- else
- {
- // rootPage.NotifyUser("Operation cancelled.", NotifyType.ErrorMessage);
- }
-
- // On small screens, hide the description text to make room for the video.
- DescriptionText.Visibility = (ActualHeight < 500) ? Visibility.Collapsed : Visibility.Visible;
-}
-```
-
-![Offline mode playback of PlayReady protected fMP4](./media/offline-playready-for-windows/offline-playready1.jpg)
-
-Because the video is under PlayReady protection, the screenshot can't capture the video itself.
-
-In summary, we have achieved offline mode with Azure Media Services:
-
-* Content transcoding and PlayReady encryption can be done in Azure Media Services or other tools;
-* Content can be hosted in Azure Media Services or Azure Storage for progressive download;
-* PlayReady license delivery can be from Azure Media Services or elsewhere;
-* The prepared smooth streaming content can still be used for online streaming via DASH or smooth with PlayReady as the DRM.
--
media-services Drm Offline Widevine For Android https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-offline-widevine-for-android.md
- Title: Stream Widevine Android offline
-description: This topic shows how to configure your Azure Media Services v3 account for offline streaming of Widevine protected content.
- Previously updated: 03/09/2022
-# Offline Widevine streaming for Android with Media Services v3
--
-In addition to protecting content for online streaming, media content subscription and rental services offer downloadable content that works when you are not connected to the internet. You might need to download content onto your phone or tablet for playback in airplane mode when flying disconnected from the network. Additional scenarios in which you might want to download content:
-
-- Some content providers may disallow DRM license delivery beyond a country/region's border. If a user wants to watch content while traveling abroad, offline download is needed.
-- In some countries/regions, internet availability and/or bandwidth is limited. Users may choose to download content to be able to watch it in a resolution high enough for a satisfactory viewing experience.
-
-This article discusses how to implement offline mode playback for DASH content protected by Widevine on Android devices. Offline DRM allows you to provide subscription, rental, and purchase models for your content, enabling customers of your services to easily take content with them when disconnected from the internet.
-
-For building the Android player apps, we outline three options:
-
-> [!div class="checklist"]
-> * Build a player using the Java API of ExoPlayer SDK
-> * Build a player using Xamarin binding of ExoPlayer SDK
-> * Build a player using Encrypted Media Extension (EME) and Media Source Extension (MSE) in Chrome mobile browser v62 or later
-
-The article also answers some common questions related to offline streaming of Widevine protected content.
-
-> [!NOTE]
-> Offline DRM is only billed for making a single request for a license when you download the content. Any errors are not billed.
-
-## Prerequisites
-
-Before implementing offline DRM for Widevine on Android devices, you should first:
--- Become familiar with the concepts introduced for online content protection using Widevine DRM. This is covered in detail in the following documents/samples:
- - [Design of a multi-DRM content protection system with access control](architecture-design-multi-drm-system.md)
- - [Use DRM dynamic encryption and license delivery service](drm-protect-with-drm-tutorial.md)
-- Clone https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials.git.
-
- You will need to modify the code in [Encrypt with DRM using .NET](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/tree/main/AMSV3Tutorials/EncryptWithDRM) to add Widevine configurations.
-- Become familiar with the Google ExoPlayer SDK for Android, an open-source video player SDK capable of supporting offline Widevine DRM playback.
- - [ExoPlayer SDK](https://github.com/google/ExoPlayer)
- - [ExoPlayer Developer Guide](https://google.github.io/ExoPlayer/guide.html)
- [ExoPlayer Developer Blog](https://medium.com/google-exoplayer)
-
-## [.NET](#tab/net/)
-
-## Configure content protection in Azure Media Services
-
-In the [GetOrCreateContentKeyPolicyAsync](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithDRM/Program.cs#L192) method, the following necessary steps are present:
-
-1. Specify how content key delivery is authorized in the license delivery service:
-
- ```csharp
- ContentKeyPolicySymmetricTokenKey primaryKey = new ContentKeyPolicySymmetricTokenKey(tokenSigningKey);
- List<ContentKeyPolicyTokenClaim> requiredClaims = new List<ContentKeyPolicyTokenClaim>()
- {
- ContentKeyPolicyTokenClaim.ContentKeyIdentifierClaim
- };
- List<ContentKeyPolicyRestrictionTokenKey> alternateKeys = null;
- ContentKeyPolicyTokenRestriction restriction
- = new ContentKeyPolicyTokenRestriction(Issuer, Audience, primaryKey, ContentKeyPolicyRestrictionTokenType.Jwt, alternateKeys, requiredClaims);
- ```
-2. Configure Widevine license template:
-
- ```csharp
- ContentKeyPolicyWidevineConfiguration widevineConfig = ConfigureWidevineLicenseTempate();
- ```
-
-3. Create ContentKeyPolicyOptions:
-
- ```csharp
- options.Add(
- new ContentKeyPolicyOption()
- {
- Configuration = widevineConfig,
- Restriction = restriction
- });
- ```
-
-## Enable offline mode
-
-To enable **offline** mode for Widevine licenses, you need to configure [Widevine license template](drm-widevine-license-template-concept.md). In the **policy_overrides** object, set the **can_persist** property to **true** (default is false), as shown in [ConfigureWidevineLicenseTemplate](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithDRM/Program.cs#L452).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#ConfigureWidevineLicenseTempate)]
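-
-If you don't have the sample open, the configuration has roughly the following shape. This is a sketch, not the sample's exact template; the `allowed_track_types` value and other template values are illustrative assumptions:
-
-```csharp
-// The Widevine license template is passed to Media Services as a JSON string.
-ContentKeyPolicyWidevineConfiguration widevineConfig = new ContentKeyPolicyWidevineConfiguration
-{
-    WidevineTemplate = @"{
-        ""allowed_track_types"": ""SD_HD"",
-        ""policy_overrides"": {
-            ""can_play"": true,
-            ""can_persist"": true
-        }
-    }"
-};
-```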
-
-## Configuring the Android player for offline playback
-
-The easiest way to develop a native player app for Android devices is to use the [Google ExoPlayer SDK](https://github.com/google/ExoPlayer), an open-source video player SDK. ExoPlayer supports features not currently supported by Android's native MediaPlayer API, including MPEG-DASH and Microsoft Smooth Streaming delivery protocols.
-
-ExoPlayer version 2.6 and higher includes many classes that support offline Widevine DRM playback. In particular, the OfflineLicenseHelper class provides utility functions to facilitate the use of the DefaultDrmSessionManager for downloading, renewing, and releasing offline licenses. The classes provided in the SDK folder "library/core/src/main/java/com/google/android/exoplayer2/offline/" support offline video content downloading.
-
-The following list of classes facilitates offline mode in the ExoPlayer SDK for Android:
-
-- `library/core/src/main/java/com/google/android/exoplayer2/drm/OfflineLicenseHelper.java`
-- `library/core/src/main/java/com/google/android/exoplayer2/drm/DefaultDrmSession.java`
-- `library/core/src/main/java/com/google/android/exoplayer2/drm/DefaultDrmSessionManager.java`
-- `library/core/src/main/java/com/google/android/exoplayer2/drm/DrmSession.java`
-- `library/core/src/main/java/com/google/android/exoplayer2/drm/ErrorStateDrmSession.java`
-- `library/core/src/main/java/com/google/android/exoplayer2/drm/ExoMediaDrm.java`
-- `library/core/src/main/java/com/google/android/exoplayer2/offline/SegmentDownloader.java`
-- `library/core/src/main/java/com/google/android/exoplayer2/offline/DownloaderConstructorHelper.java`
-- `library/core/src/main/java/com/google/android/exoplayer2/offline/Downloader.java`
-- `library/dash/src/main/java/com/google/android/exoplayer2/source/dash/offline/DashDownloader.java`
-
-Developers should reference the [ExoPlayer Developer Guide](https://google.github.io/ExoPlayer/guide.html) and the corresponding [Developer Blog](https://medium.com/google-exoplayer) during development of an application. Google has not released a fully documented reference implementation or sample code for the ExoPlayer app supporting Widevine offline at this time, so the information is limited to the developers' guide and blog.
-
-### Working with older Android devices
-
-For some older Android devices, you must set values for the following **policy_overrides** properties (defined in the [Widevine license template](drm-widevine-license-template-concept.md)): **rental_duration_seconds**, **playback_duration_seconds**, and **license_duration_seconds**. Alternatively, you can set them to zero, which means infinite/unlimited duration.
-
-The values must be set to avoid an integer overflow bug. For more explanation about the issue, see https://github.com/google/ExoPlayer/issues/3150 and https://github.com/google/ExoPlayer/issues/3112. If you do not set the values explicitly, very large values for **PlaybackDurationRemaining** and **LicenseDurationRemaining** are assigned (for example, 9223372036854775807, the maximum positive value for a 64-bit integer). As a result, the Widevine license appears expired, and decryption does not happen.
-
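-For illustration, a hedged sketch of a template that sets these durations explicitly; the 14-day and 12-hour values here are illustrative assumptions, not recommendations:
-
-```csharp
-// Explicit durations avoid the integer-overflow defaults on older devices.
-ContentKeyPolicyWidevineConfiguration widevineConfigForOlderDevices = new ContentKeyPolicyWidevineConfiguration
-{
-    WidevineTemplate = @"{
-        ""policy_overrides"": {
-            ""can_persist"": true,
-            ""rental_duration_seconds"": 1209600,
-            ""playback_duration_seconds"": 43200,
-            ""license_duration_seconds"": 1209600
-        }
-    }"
-};
-```
-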
-This issue does not occur on Android 5.0 Lollipop or later, because Android 5.0 is the first Android version designed to fully support ARMv8 ([Advanced RISC Machine](https://en.wikipedia.org/wiki/ARM_architecture)) and 64-bit platforms, whereas Android 4.4 KitKat, like older Android versions, was originally designed to support ARMv7 and 32-bit platforms.
-
-## Using Xamarin to build an Android playback app
-
-You can find Xamarin bindings for ExoPlayer using the following links:
-
-- [Xamarin bindings library for the Google ExoPlayer library](https://github.com/martijn00/ExoPlayerXamarin)
-- [Xamarin bindings for ExoPlayer NuGet](https://www.nuget.org/packages/Xam.Plugins.Android.ExoPlayer/)
-
-Also, see the following thread: [Xamarin binding](https://github.com/martijn00/ExoPlayerXamarin/pull/57).
-
-## Chrome player apps for Android
-
-Starting with the release of [Chrome for Android v. 62](https://developers.google.com/web/updates/2017/09/chrome-62-media-updates), persistent license in EME is supported. [Widevine L1](https://developers.google.com/web/updates/2017/09/chrome-62-media-updates#widevine_l1) is now also supported in Chrome for Android. This allows you to create offline playback applications in Chrome if your end users have this (or higher) version of Chrome.
-
-In addition, Google has produced a Progressive Web App (PWA) sample and open-sourced it:
-
-- [Source code](https://github.com/GoogleChromeLabs/sample-media-pwa)
-- [Google hosted version](https://biograf-155113.appspot.com/ttt/episode-2/) (only works in Chrome v62 and higher on Android devices)
-
-If you upgrade your mobile Chrome browser to v62 (or higher) on an Android phone and test the above hosted sample app, you will see that both online streaming and offline playback work.
-
-The above open-source PWA app is authored in Node.js. If you want to host your own version on an Ubuntu server, keep in mind the following commonly encountered issues that can prevent playback:
-
-1. CORS issue: The sample video in the sample app is hosted in https://storage.googleapis.com/biograf-video-files/videos/. Google has set up CORS for all its test samples hosted in this Google Cloud Storage bucket. They're served with CORS headers that explicitly specify the CORS entry `https://biograf-155113.appspot.com` (the domain where Google hosts its sample), preventing access by any other sites. If you try, you'll see the following HTTP error: `Failed to load https://storage.googleapis.com/biograf-video-files/videos/poly-sizzle-2015/mp4/dash.mpd: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https:\//13.85.80.81:8080' is therefore not allowed access. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.`
-2. Certificate issue: Starting from Chrome v58, EME for Widevine requires HTTPS. Therefore, you need to host the sample app over HTTPS with an X.509 certificate. A typical test certificate doesn't work. You need to obtain a certificate meeting the following minimum requirements:
- Chrome and Firefox require a SAN (Subject Alternative Name) setting to exist in the certificate
- The certificate must be issued by a trusted CA; a self-signed development certificate does not work
- The certificate must have a CN matching the DNS name of the web server or gateway
---
-## More information
-
-For more information, see [Content Protection in the FAQ](frequently-asked-questions.yml).
-
-Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
media-services Drm Playready License Template Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-playready-license-template-concept.md
- Title: Media Services Microsoft PlayReady license template
-description: Learn about Azure Media Services v3 with the PlayReady license template and how to configure PlayReady licenses.
- Previously updated: 08/31/2020
-# Media Services v3 with PlayReady license template
--
-Azure Media Services enables you to encrypt your content with **Microsoft PlayReady**. Media Services also provides a service for delivering PlayReady licenses. You can use Media Services APIs to configure PlayReady licenses. When a player tries to play your PlayReady-protected content, a request is sent to the license delivery service to obtain a license. If the license service approves the request, it issues the license that is sent to the client and is used to decrypt and play the specified content.
-
-PlayReady licenses contain the rights and restrictions that you want the PlayReady digital rights management (DRM) runtime to enforce when a user tries to play back protected content. Here are some examples of PlayReady license restrictions that you can specify:
-
-* The date and time from which the license is valid.
-* The DateTime value when the license expires.
-* Whether the license can be saved in persistent storage on the client. Persistent licenses are typically used to allow offline playback of the content.
-* The minimum security level that a player must have to play your content.
-* The output protection level for the output controls for audio/video content.
-* For more information, see the "Output Controls" section (3.5) in the [PlayReady Compliance Rules](https://www.microsoft.com/playready/licensing/compliance/) document.
-
-> [!NOTE]
-> Currently, you can only configure the PlayRight of the PlayReady license. This right is required. The PlayRight gives the client the ability to play back the content. You also can use the PlayRight to configure restrictions specific to playback.
->
-
-This topic describes how to configure PlayReady licenses with Media Services.
-
-## Basic streaming license example
-
-The following example shows the simplest (and most common) template that configures a basic streaming license. With this license, your clients can play back your PlayReady-protected content.
-
-The XML conforms to the PlayReady license template XML schema defined in the [PlayReady license template XML schema](#schema) section.
-
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<PlayReadyLicenseResponseTemplate xmlns:i="https://www.w3.org/2001/XMLSchema-instance"
- xmlns="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1">
- <LicenseTemplates>
- <PlayReadyLicenseTemplate>
- <ContentKey i:type="ContentEncryptionKeyFromHeader" />
- <PlayRight />
- </PlayReadyLicenseTemplate>
- </LicenseTemplates>
-</PlayReadyLicenseResponseTemplate>
-```
-
-## <a id="classes"></a>Use Media Services APIs to configure license templates
-
-Media Services provides types that you can use to configure a PlayReady license template.
-
-The snippet that follows uses Media Services .NET classes to configure the PlayReady license template. The classes are defined in the [Microsoft.Azure.Management.Media.Models](/dotnet/api/microsoft.azure.management.media.models) namespace. The snippet configures the PlayRight of the PlayReady license. The PlayRight grants the user the ability to play back the content, subject to any restrictions configured in the license and on the PlayRight itself (for playback-specific policy). Much of the policy on a PlayRight concerns output restrictions that control the types of outputs the content can be played over, including any restrictions that must be put in place when a given output is used. For example, if DigitalVideoOnlyContentRestriction is enabled, the DRM runtime only allows the video to be displayed over digital outputs. (Analog video outputs aren't allowed to pass the content.)
-
-> [!IMPORTANT]
-> A PlayReady license can carry powerful restrictions. If the output protections are too restrictive, the content might be unplayable on some clients. For more information, see the [PlayReady Compliance Rules](https://www.microsoft.com/playready/licensing/compliance/).
-
-### Configure PlayReady license template with .NET
-
-```csharp
-ContentKeyPolicyPlayReadyLicense objContentKeyPolicyPlayReadyLicense;
-objContentKeyPolicyPlayReadyLicense = new ContentKeyPolicyPlayReadyLicense
-{
- AllowTestDevices = true,
- BeginDate = new DateTime(2016, 1, 1),
- ContentKeyLocation = new ContentKeyPolicyPlayReadyContentEncryptionKeyFromHeader(),
- ContentType = ContentKeyPolicyPlayReadyContentType.UltraVioletStreaming,
- LicenseType = drmSettings.EnableOfflineMode ? ContentKeyPolicyPlayReadyLicenseType.Persistent : ContentKeyPolicyPlayReadyLicenseType.NonPersistent,
- PlayRight = new ContentKeyPolicyPlayReadyPlayRight
- {
- ImageConstraintForAnalogComponentVideoRestriction = true,
- ExplicitAnalogTelevisionOutputRestriction = new ContentKeyPolicyPlayReadyExplicitAnalogTelevisionRestriction(true, 2),
- AllowPassingVideoContentToUnknownOutput = ContentKeyPolicyPlayReadyUnknownOutputPassingOption.Allowed,
- FirstPlayExpiration = TimeSpan.FromSeconds(20.0),
- }
-};
-```
-
-## <a id="schema"></a>PlayReady license template XML schema
-
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<xs:schema xmlns:tns="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1" xmlns:ser="http://schemas.microsoft.com/2003/10/Serialization/" elementFormDefault="qualified" targetNamespace="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1" xmlns:xs="https://www.w3.org/2001/XMLSchema">
- <xs:import namespace="http://schemas.microsoft.com/2003/10/Serialization/" />
- <xs:complexType name="AgcAndColorStripeRestriction">
- <xs:sequence>
- <xs:element minOccurs="0" name="ConfigurationData" type="xs:unsignedByte" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="AgcAndColorStripeRestriction" nillable="true" type="tns:AgcAndColorStripeRestriction" />
- <xs:simpleType name="ContentType">
- <xs:restriction base="xs:string">
- <xs:enumeration value="Unspecified" />
- <xs:enumeration value="UltravioletDownload" />
- <xs:enumeration value="UltravioletStreaming" />
- </xs:restriction>
- </xs:simpleType>
- <xs:element name="ContentType" nillable="true" type="tns:ContentType" />
- <xs:complexType name="ExplicitAnalogTelevisionRestriction">
- <xs:sequence>
- <xs:element minOccurs="0" name="BestEffort" type="xs:boolean" />
- <xs:element minOccurs="0" name="ConfigurationData" type="xs:unsignedByte" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ExplicitAnalogTelevisionRestriction" nillable="true" type="tns:ExplicitAnalogTelevisionRestriction" />
- <xs:complexType name="PlayReadyContentKey">
- <xs:sequence />
- </xs:complexType>
- <xs:element name="PlayReadyContentKey" nillable="true" type="tns:PlayReadyContentKey" />
- <xs:complexType name="ContentEncryptionKeyFromHeader">
- <xs:complexContent mixed="false">
- <xs:extension base="tns:PlayReadyContentKey">
- <xs:sequence />
- </xs:extension>
- </xs:complexContent>
- </xs:complexType>
- <xs:element name="ContentEncryptionKeyFromHeader" nillable="true" type="tns:ContentEncryptionKeyFromHeader" />
- <xs:complexType name="ContentEncryptionKeyFromKeyIdentifier">
- <xs:complexContent mixed="false">
- <xs:extension base="tns:PlayReadyContentKey">
- <xs:sequence>
- <xs:element minOccurs="0" name="KeyIdentifier" type="ser:guid" />
- </xs:sequence>
- </xs:extension>
- </xs:complexContent>
- </xs:complexType>
- <xs:element name="ContentEncryptionKeyFromKeyIdentifier" nillable="true" type="tns:ContentEncryptionKeyFromKeyIdentifier" />
- <xs:complexType name="PlayReadyLicenseResponseTemplate">
- <xs:sequence>
- <xs:element name="LicenseTemplates" nillable="true" type="tns:ArrayOfPlayReadyLicenseTemplate" />
- <xs:element minOccurs="0" name="ResponseCustomData" nillable="true" type="xs:string">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- </xs:sequence>
- </xs:complexType>
- <xs:element name="PlayReadyLicenseResponseTemplate" nillable="true" type="tns:PlayReadyLicenseResponseTemplate" />
- <xs:complexType name="ArrayOfPlayReadyLicenseTemplate">
- <xs:sequence>
- <xs:element minOccurs="0" maxOccurs="unbounded" name="PlayReadyLicenseTemplate" nillable="true" type="tns:PlayReadyLicenseTemplate" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ArrayOfPlayReadyLicenseTemplate" nillable="true" type="tns:ArrayOfPlayReadyLicenseTemplate" />
- <xs:complexType name="PlayReadyLicenseTemplate">
- <xs:sequence>
- <xs:element minOccurs="0" name="AllowTestDevices" type="xs:boolean" />
- <xs:element minOccurs="0" name="BeginDate" nillable="true" type="xs:dateTime">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element name="ContentKey" nillable="true" type="tns:PlayReadyContentKey" />
- <xs:element minOccurs="0" name="ContentType" type="tns:ContentType">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ExpirationDate" nillable="true" type="xs:dateTime">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="GracePeriod" nillable="true" type="ser:duration">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="LicenseType" type="tns:PlayReadyLicenseType" />
- <xs:element minOccurs="0" name="PlayRight" nillable="true" type="tns:PlayReadyPlayRight" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="PlayReadyLicenseTemplate" nillable="true" type="tns:PlayReadyLicenseTemplate" />
- <xs:simpleType name="PlayReadyLicenseType">
- <xs:restriction base="xs:string">
- <xs:enumeration value="Nonpersistent" />
- <xs:enumeration value="Persistent" />
- </xs:restriction>
- </xs:simpleType>
- <xs:element name="PlayReadyLicenseType" nillable="true" type="tns:PlayReadyLicenseType" />
- <xs:complexType name="PlayReadyPlayRight">
- <xs:sequence>
- <xs:element minOccurs="0" name="AgcAndColorStripeRestriction" nillable="true" type="tns:AgcAndColorStripeRestriction">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="AllowPassingVideoContentToUnknownOutput" type="tns:UnknownOutputPassingOption">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="AnalogVideoOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="CompressedDigitalAudioOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="CompressedDigitalVideoOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="DigitalVideoOnlyContentRestriction" type="xs:boolean">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ExplicitAnalogTelevisionOutputRestriction" nillable="true" type="tns:ExplicitAnalogTelevisionRestriction">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="FirstPlayExpiration" nillable="true" type="ser:duration">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ImageConstraintForAnalogComponentVideoRestriction" type="xs:boolean">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ImageConstraintForAnalogComputerMonitorRestriction" type="xs:boolean">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ScmsRestriction" nillable="true" type="tns:ScmsRestriction">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="UncompressedDigitalAudioOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="UncompressedDigitalVideoOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- </xs:sequence>
- </xs:complexType>
- <xs:element name="PlayReadyPlayRight" nillable="true" type="tns:PlayReadyPlayRight" />
- <xs:simpleType name="UnknownOutputPassingOption">
- <xs:restriction base="xs:string">
- <xs:enumeration value="NotAllowed" />
- <xs:enumeration value="Allowed" />
- <xs:enumeration value="AllowedWithVideoConstriction" />
- </xs:restriction>
- </xs:simpleType>
- <xs:element name="UnknownOutputPassingOption" nillable="true" type="tns:UnknownOutputPassingOption" />
- <xs:complexType name="ScmsRestriction">
- <xs:sequence>
- <xs:element minOccurs="0" name="ConfigurationData" type="xs:unsignedByte" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ScmsRestriction" nillable="true" type="tns:ScmsRestriction" />
-</xs:schema>
-```
-
-## Next steps
-
-Check out how to [protect with DRM](drm-protect-with-drm-tutorial.md)
media-services Drm Protect With Aes128 Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-protect-with-aes128-tutorial.md
- Title: Encrypt video with AES-128
-description: Learn how to encrypt video with AES 128-bit encryption and how to use the key delivery service in Azure Media Services.
- Previously updated: 05/25/2021
-# Tutorial: Encrypt video with AES-128 and use the key delivery service
--
-> [!NOTE]
-> Even though the tutorial uses the [.NET SDK](/dotnet/api/microsoft.azure.management.media.models.liveevent) examples, the general steps are the same for [REST API](/rest/api/medi#sdks).
-
-You can use Media Services to deliver HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming encrypted with AES by using 128-bit encryption keys. Media Services also provides the key delivery service that delivers encryption keys to authorized users. If you want Media Services to dynamically encrypt your video, you associate the encryption key with a Streaming Locator and configure the content key policy. When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content with AES-128. To decrypt the stream, the player requests the key from the key delivery service. To determine whether the user is authorized to get the key, the service evaluates the content key policy that you specified for the key.
-
-You can encrypt each asset with multiple encryption types (AES-128, PlayReady, Widevine, FairPlay). See [Streaming protocols and encryption types](drm-content-protection-concept.md#streaming-protocols-and-encryption-types) to learn which combinations make sense. Also, see [How to protect with DRM](drm-protect-with-drm-tutorial.md).
-
-The output from the sample in this article includes a URL to the Azure Media Player, the manifest URL, and the AES token needed to play back the content. The sample sets the expiration of the JSON Web Token (JWT) to 1 hour. You can open a browser and paste the resulting URL to launch the Azure Media Player demo page with the URL and token already filled out for you, in the following format: ```https://ampdemo.azureedge.net/?url= {dash Manifest URL} &aes=true&aestoken=Bearer%3D{ JWT Token here}```.
-
-This tutorial shows you how to:
-
-> [!div class="checklist"]
-> * Download the [EncryptWithAES](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithAES) sample described in the article.
-> * Start using Media Services APIs with .NET SDK.
-> * Create an output asset.
-> * Create an encoding Transform.
-> * Submit a Job.
-> * Wait for the Job to complete.
-> * Create a Content Key Policy
-> * Configure the policy to use JWT token restriction.
-> * Create a Streaming Locator.
-> * Configure the Streaming Locator to encrypt the video with AES (ClearKey).
-> * Get a test token.
-> * Build a streaming URL.
-> * Clean up resources.
--
-## Prerequisites
-
-The following are required to complete the tutorial.
-
-* Review the [Content protection overview](drm-content-protection-concept.md) article.
-* Install Visual Studio Code or Visual Studio.
-* [Create a Media Services account](./account-create-how-to.md).
-* Get credentials needed to use Media Services APIs by following [Access APIs](./access-api-howto.md).
-* Set the appropriate values in the app configuration file (appsettings.json or .env file).
-
-## Download and configure the sample
-
-Clone a GitHub repository that contains the full .NET sample discussed in this article to your machine using the following command:
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials.git
- ```
-
-The "Encrypt with AES-128" sample is located in the [EncryptWithAES](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithAES) folder.
--
-> [!NOTE]
-> The sample creates unique resources every time you run the app. Typically, you'll reuse existing resources like transforms and policies (if existing resources have the required configurations).
-
-### Start using Media Services APIs with the .NET SDK
-
-To start using Media Services APIs with .NET, you need to create an `AzureMediaServicesClient` object. To create the object, you need to supply credentials for the client to connect to Azure by using Azure Active Directory. Another option is to use interactive authentication, which is implemented in `GetCredentialsInteractiveAuthAsync`.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#CreateMediaServicesClientAsync)]
-
-In the code that you cloned at the beginning of the article, the `GetCredentialsAsync` function creates the `ServiceClientCredentials` object based on the credentials supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#GetCredentialsAsync)]
-
-In the case of interactive authentication, the `GetCredentialsInteractiveAuthAsync` function creates the `ServiceClientCredentials` object based on an interactive authentication and the connection parameters supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository. In that case, AADCLIENTID and AADSECRET are not needed in the configuration or environment variables file.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#GetCredentialsInteractiveAuthAsync)]
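-
-For reference, the following is a minimal sketch of what such a client-creation helper can look like when a service principal is used. The parameter names are illustrative; the sample's `Common_Utils/Authentication.cs` is the authoritative version.
-
-```csharp
-using Microsoft.Azure.Management.Media;
-using Microsoft.Rest;
-using Microsoft.Rest.Azure.Authentication;
-using System.Threading.Tasks;
-
-public static class MediaClientFactory
-{
-    // Creates an AzureMediaServicesClient from service principal values
-    // (typically read from appsettings.json or a .env file).
-    public static async Task<IAzureMediaServicesClient> CreateClientAsync(
-        string aadTenantId, string aadClientId, string aadSecret, string subscriptionId)
-    {
-        ServiceClientCredentials credentials = await ApplicationTokenProvider.LoginSilentAsync(
-            aadTenantId, aadClientId, aadSecret, ActiveDirectoryServiceSettings.Azure);
-
-        return new AzureMediaServicesClient(credentials)
-        {
-            SubscriptionId = subscriptionId
-        };
-    }
-}
-```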
-
-## Create an output asset
-
-The output [Asset](/rest/api/media/assets) stores the result of your encoding job.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithAES/Program.cs#CreateOutputAsset)]
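-
-A minimal sketch of this step, assuming the `client`, resource group, and account names from the authentication step (the asset name is illustrative):
-
-```csharp
-// Create (or update) an empty Asset that will receive the encoder's output files.
-Asset outputAsset = await client.Assets.CreateOrUpdateAsync(
-    resourceGroupName,
-    accountName,
-    "myOutputAsset",
-    new Asset());
-```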
-
-## Get or create an encoding Transform
-
-When creating a new [Transform](/rest/api/media/transforms) instance, you need to specify what you want it to produce as an output. The required parameter is a **TransformOutput** object, as shown in the code below. Each **TransformOutput** contains a **Preset**. **Preset** describes the step-by-step instructions of video and/or audio processing operations that are to be used to generate the desired **TransformOutput**. The sample described in this article uses a built-in Preset called **AdaptiveStreaming**. The Preset encodes the input video into an autogenerated bitrate ladder (bitrate-resolution pairs) based on the input resolution and bitrate, and then produces ISO MP4 files with H.264 video and AAC audio corresponding to each bitrate-resolution pair.
-
-Before creating a new [Transform](/rest/api/media/transforms), first check if one already exists using the **Get** method, as shown in the code that follows. In Media Services v3, **Get** methods on entities return **null** if the entity doesn't exist (a case-insensitive check on the name).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithAES/Program.cs#EnsureTransformExists)]
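-
-A condensed sketch of the get-or-create pattern, assuming the `client` object from earlier (the transform name is illustrative; usings omitted):
-
-```csharp
-Transform transform = await client.Transforms.GetAsync(resourceGroupName, accountName, "myTransform");
-
-if (transform == null)
-{
-    // Request the built-in AdaptiveStreaming preset for the single output.
-    var outputs = new TransformOutput[]
-    {
-        new TransformOutput(new BuiltInStandardEncoderPreset()
-        {
-            PresetName = EncoderNamedPreset.AdaptiveStreaming
-        })
-    };
-
-    transform = await client.Transforms.CreateOrUpdateAsync(
-        resourceGroupName, accountName, "myTransform", outputs);
-}
-```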
-
-## Submit Job
-
-As mentioned above, the [Transform](/rest/api/media/transforms) object is the recipe and a [Job](/rest/api/media/jobs) is the actual request to Media Services to apply that **Transform** to a given input video or audio content. The **Job** specifies information like the location of the input video and the location for the output.
-
-In this tutorial, we create the job's input based on a file that's ingested directly from an [HTTPS source URL](job-input-from-http-how-to.md).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithAES/Program.cs#SubmitJob)]
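-
-A sketch of submitting such a job, assuming the transform and output asset created earlier (the source URL and job name are placeholders):
-
-```csharp
-// The job input is pulled directly from an HTTPS URL; no input asset is needed.
-JobInputHttp input = new JobInputHttp(
-    files: new[] { "https://example.com/Ignite-short.mp4" }); // placeholder URL
-
-Job job = await client.Jobs.CreateAsync(
-    resourceGroupName,
-    accountName,
-    "myTransform",
-    "myJob",
-    new Job
-    {
-        Input = input,
-        Outputs = new List<JobOutput> { new JobOutputAsset(outputAsset.Name) }
-    });
-```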
-
-## Wait for the Job to complete
-
-The job takes some time to complete. When it does, you want to be notified. The code sample below shows how to poll the service for the status of the [Job](/rest/api/medi).
-
-The **Job** usually goes through the following states: **Scheduled**, **Queued**, **Processing**, **Finished** (the final state). If the job encounters an error, you get the **Error** state. If the job is in the process of being canceled, you get **Canceling**, and then **Canceled** when it's done.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithAES/Program.cs#WaitForJobToFinish)]
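-
-A minimal polling loop looks roughly like the following sketch (for production apps, prefer Event Grid notifications over polling):
-
-```csharp
-Job job;
-do
-{
-    // Wait between polls to avoid being throttled.
-    await Task.Delay(TimeSpan.FromSeconds(10));
-    job = await client.Jobs.GetAsync(resourceGroupName, accountName, "myTransform", "myJob");
-    Console.WriteLine($"Job state: {job.State}");
-}
-while (job.State != JobState.Finished
-    && job.State != JobState.Error
-    && job.State != JobState.Canceled);
-```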
-
-## Create a Content Key Policy
-
-A content key provides secure access to your Assets. You need to create a **Content Key Policy** that configures how the content key is delivered to end clients. The content key is associated with the **Streaming Locator**. Media Services also provides the key delivery service that delivers encryption keys to authorized users.
-
-When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content (in this case, by using AES encryption.) To decrypt the stream, the player requests the key from the key delivery service. To determine whether the user is authorized to get the key, the service evaluates the content key policy that you specified for the key.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithAES/Program.cs#GetOrCreateContentKeyPolicy)]
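-
-A sketch of a token-restricted ClearKey policy. The issuer, audience, and policy name are illustrative values; keep the signing key, because the same key is needed later to issue the test token:
-
-```csharp
-// Generate a random symmetric key used to sign and verify JWTs.
-byte[] tokenSigningKey = new byte[40];
-System.Security.Cryptography.RandomNumberGenerator.Create().GetBytes(tokenSigningKey);
-
-var restriction = new ContentKeyPolicyTokenRestriction(
-    issuer: "myIssuer",
-    audience: "myAudience",
-    primaryVerificationKey: new ContentKeyPolicySymmetricTokenKey(tokenSigningKey),
-    restrictionTokenType: ContentKeyPolicyRestrictionTokenType.Jwt,
-    requiredClaims: new List<ContentKeyPolicyTokenClaim>
-    {
-        // Require the content key identifier to be present in the token.
-        ContentKeyPolicyTokenClaim.ContentKeyIdentifierClaim
-    });
-
-ContentKeyPolicy policy = await client.ContentKeyPolicies.CreateOrUpdateAsync(
-    resourceGroupName, accountName, "myContentKeyPolicy",
-    new List<ContentKeyPolicyOption>
-    {
-        // ClearKey configuration = AES envelope ("clear key") delivery over HTTPS.
-        new ContentKeyPolicyOption(new ContentKeyPolicyClearKeyConfiguration(), restriction)
-    });
-```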
-
-## Create a Streaming Locator
-
-After the encoding is complete, and the content key policy is set, the next step is to make the video in the output Asset available to clients for playback. You make the video available in two steps:
-
-1. Create a [Streaming Locator](/rest/api/media/streaminglocators).
-2. Build the streaming URLs that clients can use.
-
-The process of creating the **Streaming Locator** is called publishing. By default, the **Streaming Locator** is valid immediately after you make the API calls. It lasts until it's deleted, unless you configure the optional start and end times.
-
-When creating a [Streaming Locator](/rest/api/media/streaminglocators), you'll need to specify the desired **StreamingPolicyName**. In this tutorial, we're using one of the PredefinedStreamingPolicies, which tells Azure Media Services how to publish the content for streaming. In this example, the AES Envelope encryption is applied (this encryption is also known as ClearKey encryption because the key is delivered to the playback client via HTTPS and not a DRM license).
-
-> [!IMPORTANT]
-> When using a custom [StreamingPolicy](/rest/api/media/streamingpolicies), you should design a limited set of such policies for your Media Services account, and reuse them for your Streaming Locators whenever the same encryption options and protocols are needed. Your Media Services account has a quota for the number of StreamingPolicy entries. You shouldn't create a new StreamingPolicy for each Streaming Locator.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithAES/Program.cs#CreateStreamingLocator)]
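-
-A sketch of this call, assuming the output asset and content key policy from the previous steps (the locator name is illustrative):
-
-```csharp
-StreamingLocator locator = await client.StreamingLocators.CreateAsync(
-    resourceGroupName,
-    accountName,
-    "myStreamingLocator",
-    new StreamingLocator(
-        assetName: outputAsset.Name,
-        // Predefined ClearKey policy: AES envelope encryption, key delivered over HTTPS.
-        streamingPolicyName: PredefinedStreamingPolicy.ClearKey,
-        defaultContentKeyPolicyName: "myContentKeyPolicy"));
-```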
-
-## Get a test token
-
-In this tutorial, we specify that the content key policy should have a token restriction. The token-restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the [JWT](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_3) format, and that's what we configure in the sample.
-
-The ContentKeyIdentifierClaim is used in the **Content Key Policy**, which means that the token presented to the Key Delivery service must have the identifier of the content key in it. In the sample, we didn't specify a content key when creating the Streaming Locator, so the system created a random one for us. To generate the test token, we must get the ContentKeyId to put in the ContentKeyIdentifierClaim claim.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithAES/Program.cs#GetToken)]
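-
-The following sketch shows one way to mint such a test token with the `System.IdentityModel.Tokens.Jwt` NuGet package, assuming the same signing key, issuer, and audience that were put on the policy earlier:
-
-```csharp
-// Look up the content key the service generated for the locator.
-var keys = await client.StreamingLocators.ListContentKeysAsync(
-    resourceGroupName, accountName, "myStreamingLocator");
-string keyIdentifier = keys.ContentKeys.First().Id.ToString();
-
-var signingCredentials = new SigningCredentials(
-    new SymmetricSecurityKey(tokenSigningKey),
-    SecurityAlgorithms.HmacSha256);
-
-var token = new JwtSecurityToken(
-    issuer: "myIssuer",
-    audience: "myAudience",
-    claims: new[]
-    {
-        // The policy requires the content key identifier claim.
-        new Claim(ContentKeyPolicyTokenClaim.ContentKeyIdentifierClaim.ClaimType, keyIdentifier)
-    },
-    notBefore: DateTime.UtcNow.AddMinutes(-5),
-    expires: DateTime.UtcNow.AddHours(1), // the sample uses a 1-hour expiration
-    signingCredentials: signingCredentials);
-
-string testToken = new JwtSecurityTokenHandler().WriteToken(token);
-```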
-
-## Build a DASH streaming URL
-
-Now that the [Streaming Locator](/rest/api/media/streaminglocators) has been created, you can get the streaming URLs. To build a URL, you need to concatenate the [StreamingEndpoint](/rest/api/media/streamingendpoints) host name and the **Streaming Locator** path. In this sample, the *default* **Streaming Endpoint** is used. When you first create a Media Services account, this *default* **Streaming Endpoint** is in a stopped state, so you need to call **Start**.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithAES/Program.cs#GetMPEGStreamingUrl)]
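-
-A sketch of assembling the URL, assuming the locator created earlier:
-
-```csharp
-// The default Streaming Endpoint must be running before it can serve content.
-StreamingEndpoint endpoint = await client.StreamingEndpoints.GetAsync(
-    resourceGroupName, accountName, "default");
-
-if (endpoint.ResourceState != StreamingEndpointResourceState.Running)
-{
-    await client.StreamingEndpoints.StartAsync(resourceGroupName, accountName, "default");
-}
-
-// Combine the endpoint host name with the locator's DASH path.
-ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(
-    resourceGroupName, accountName, "myStreamingLocator");
-
-string dashPath = paths.StreamingPaths
-    .First(p => p.StreamingProtocol == StreamingPolicyStreamingProtocol.Dash)
-    .Paths.First();
-
-string dashUrl = $"https://{endpoint.HostName}{dashPath}";
-```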
-
-## Clean up resources in your Media Services account
-
-Generally, you should clean up everything except objects that you're planning to reuse (typically, you'll reuse Transforms, Streaming Locators, and so on). If you want your account to be clean after experimenting, delete the resources that you don't plan to reuse. For example, the following code deletes the job, the created assets, and the content key policy:
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithAES/Program.cs#CleanUp)]
-
-## Clean up resources
-
-If you no longer need any of the resources in your resource group, including the Media Services and storage accounts you created for this tutorial, delete the resource group you created earlier.
-
-Execute the following CLI command:
-
-```azurecli
-az group delete --name amsResourceGroup
-```
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Protect with DRM](drm-protect-with-drm-tutorial.md)
media-services Drm Protect With Drm Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-protect-with-drm-tutorial.md
- Title: Media Services DRM encryption and license delivery
-description: Learn how to use DRM dynamic encryption and license delivery service to deliver streams encrypted with Microsoft PlayReady, Google Widevine, or Apple FairPlay licenses.
- Previously updated: 05/25/2021
-# Tutorial: Use DRM dynamic encryption and license delivery service
--
-> [!NOTE]
-> Even though this tutorial uses the [.NET SDK](/dotnet/api/microsoft.azure.management.media.models.liveevent) examples, the general steps are the same for [REST API](/rest/api/medi#sdks).
-
-You can use Azure Media Services to deliver your streams encrypted with Microsoft PlayReady, Google Widevine, or Apple FairPlay licenses. For an in-depth explanation, see [Content protection with dynamic encryption](drm-content-protection-concept.md).
-
-Media Services also provides a service for delivering PlayReady, Widevine, and FairPlay DRM licenses. When a user requests DRM-protected content, the player app requests a license from the Media Services license service. If the player app is authorized, the Media Services license service issues a license to the player. A license contains the decryption key that can be used by the client player to decrypt and stream the content.
-
-This article is based on the [Encrypting with DRM](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithDRM) sample.
-
-The sample described in this article produces the following result:
-
-![AMS with DRM protected video in Azure Media Player](./media/drm-protect-with-drm-tutorial/ams_player.png)
-
-This tutorial shows you how to:
-
-> [!div class="checklist"]
-> * Create an encoding Transform.
-> * Set the signing key used for verification of your token.
-> * Set requirements on the content key policy.
-> * Create a StreamingLocator with the specified streaming policy.
-> * Create a URL used to playback your file.
--
-## Prerequisites
-
-The following items are required to complete the tutorial:
-
-* Review the [Content protection overview](drm-content-protection-concept.md) article.
-* Review the [Design multi-DRM content protection system with access control](architecture-design-multi-drm-system.md).
-* Install Visual Studio Code or Visual Studio.
-* Create a new Azure Media Services account, as described in [this quickstart](./account-create-how-to.md).
-* Get credentials needed to use Media Services APIs by following [Access APIs](./access-api-howto.md).
-* Set the appropriate values in the app configuration file (appsettings.json or .env file).
-
-## Download the code and configure the sample
-
-Clone a GitHub repository that contains the full .NET sample discussed in this article to your machine using the following command:
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials.git
- ```
-
-The "Encrypt with DRM" sample is located in the [EncryptWithDRM](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithDRM) folder.
--
-> [!NOTE]
-> The sample creates unique resources every time you run the app. Typically, you'll reuse existing resources like transforms and policies (if existing resources have the required configurations).
-
-### Start using Media Services APIs with the .NET SDK
-
-To start using Media Services APIs with .NET, you need to create an `AzureMediaServicesClient` object. To create the object, you need to supply credentials for the client to connect to Azure by using Azure Active Directory. Another option is to use interactive authentication, which is implemented in `GetCredentialsInteractiveAuthAsync`.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#CreateMediaServicesClientAsync)]
-
-In the code that you cloned at the beginning of the article, the `GetCredentialsAsync` function creates the `ServiceClientCredentials` object based on the credentials supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#GetCredentialsAsync)]
-
-In the case of interactive authentication, the `GetCredentialsInteractiveAuthAsync` function creates the `ServiceClientCredentials` object based on an interactive authentication and the connection parameters supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository. In that case, AADCLIENTID and AADSECRET are not needed in the configuration or environment variables file.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#GetCredentialsInteractiveAuthAsync)]
-
-## Create an output asset
-
-The output [Asset](assets-concept.md) stores the result of your encoding job.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#CreateOutputAsset)]
-
-## Get or create an encoding Transform
-
-When creating a new [Transform](transform-jobs-concept.md) instance, you need to specify what you want it to produce as an output. The required parameter is a `transformOutput` object, as shown in the code below. Each TransformOutput contains a **Preset**. Preset describes the step-by-step instructions of video and/or audio processing operations that are to be used to generate the desired TransformOutput. The sample described in this article uses a built-in Preset called **AdaptiveStreaming**. The Preset encodes the input video into an autogenerated bitrate ladder (bitrate-resolution pairs) based on the input resolution and bitrate, and produces ISO MP4 files with H.264 video and AAC audio corresponding to each bitrate-resolution pair.
-
-Before creating a new **Transform**, you should first check if one already exists using the **Get** method, as shown in the code that follows. In Media Services v3, **Get** methods on entities return **null** if the entity doesn't exist (a case-insensitive check on the name).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#EnsureTransformExists)]
-
-## Submit Job
-
-As mentioned above, the **Transform** object is the recipe and a [Job](transform-jobs-concept.md) is the actual request to Media Services to apply that **Transform** to a given input video or audio content. The **Job** specifies information like the location of the input video and the location for the output.
-
-In this tutorial, we create the job's input based on a file that's ingested directly from an [HTTPS source URL](job-input-from-http-how-to.md).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#SubmitJob)]
-
-## Wait for the Job to complete
-
-The job takes some time to complete. When it does, you want to be notified. The code sample below shows how to poll the service for the status of the **Job**. Polling isn't a recommended best practice for production apps because of potential latency. Polling can be throttled if overused on an account. Developers should instead use Event Grid. See [Route events to a custom web endpoint](monitoring/job-state-events-cli-how-to.md).
-
-The **Job** usually goes through the following states: **Scheduled**, **Queued**, **Processing**, **Finished** (the final state). If the job encounters an error, you get the **Error** state. If the job is in the process of being canceled, you get **Canceling**, and then **Canceled** when it's done.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#WaitForJobToFinish)]
-
-## Create a Content Key Policy
-
-A content key provides secure access to your Assets. You need to create a [Content Key Policy](drm-content-key-policy-concept.md) when encrypting your content with a DRM. The policy configures how the content key is delivered to end clients. The content key is associated with a Streaming Locator. Media Services also provides the key delivery service that delivers encryption keys and licenses to authorized users.
-
-You need to set the requirements (restrictions) on the **Content Key Policy** that must be met to deliver keys with the specified configuration. In this example, we set the following configurations and requirements:
-
-* Configuration
-
- The [PlayReady](drm-playready-license-template-concept.md) and [Widevine](drm-widevine-license-template-concept.md) licenses are configured so they can be delivered by the Media Services license delivery service. Even though this sample app doesn't configure the [FairPlay](drm-fairplay-license-overview.md) license, it contains a method you can use to configure FairPlay. You can add FairPlay configuration as another option.
-
-* Restriction
-
- The app sets a JSON Web Token (JWT) token type restriction on the policy.
-
-When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content. To decrypt the stream, the player requests the key from the key delivery service. To determine whether the user is authorized to get the key, the service evaluates the content key policy that you specified for the key.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#GetOrCreateContentKeyPolicy)]
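-
-A condensed sketch of such a policy, with PlayReady and Widevine options sharing one JWT restriction (`restriction` built the same way as in the AES tutorial sketch; all license settings and names shown are illustrative defaults):
-
-```csharp
-var playReadyConfig = new ContentKeyPolicyPlayReadyConfiguration(
-    licenses: new List<ContentKeyPolicyPlayReadyLicense>
-    {
-        new ContentKeyPolicyPlayReadyLicense(
-            allowTestDevices: true, // for testing only
-            licenseType: ContentKeyPolicyPlayReadyLicenseType.NonPersistent,
-            contentKeyLocation: new ContentKeyPolicyPlayReadyContentEncryptionKeyFromHeader(),
-            contentType: ContentKeyPolicyPlayReadyContentType.Unspecified)
-    });
-
-// "{}" asks the Widevine license service to apply its default template.
-var widevineConfig = new ContentKeyPolicyWidevineConfiguration(widevineTemplate: "{}");
-
-ContentKeyPolicy policy = await client.ContentKeyPolicies.CreateOrUpdateAsync(
-    resourceGroupName, accountName, "myDrmContentKeyPolicy",
-    new List<ContentKeyPolicyOption>
-    {
-        new ContentKeyPolicyOption(playReadyConfig, restriction),
-        new ContentKeyPolicyOption(widevineConfig, restriction)
-    });
-```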
-
-## Create a Streaming Locator
-
-After the encoding is complete, and the content key policy is set, the next step is to make the video in the output Asset available to clients for playback. You make the video available in two steps:
-
-1. Create a [Streaming Locator](stream-streaming-locators-concept.md).
-2. Build the streaming URLs that clients can use.
-
-The process of creating the **Streaming Locator** is called publishing. By default, the **Streaming Locator** is valid immediately after you make the API calls. It lasts until it's deleted, unless you configure the optional start and end times.
-
-When creating a **Streaming Locator**, you need to specify the desired `StreamingPolicyName`. In this tutorial, we're using one of the predefined Streaming Policies, which tells Azure Media Services how to publish the content for streaming. In this example, we set StreamingLocator.StreamingPolicyName to the "Predefined_MultiDrmCencStreaming" policy. The PlayReady and Widevine encryptions are applied and the key is delivered to the playback client based on the configured DRM licenses. If you also want to encrypt your stream with CBCS (FairPlay), use "Predefined_MultiDrmStreaming".
-
-> [!IMPORTANT]
-> When using a custom [Streaming Policy](stream-streaming-policy-concept.md), you should design a limited set of such policies for your Media Services account, and reuse them for your StreamingLocators whenever the same encryption options and protocols are needed. Your Media Services account has a quota for the number of StreamingPolicy entries. You shouldn't create a new StreamingPolicy for each StreamingLocator.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#CreateStreamingLocator)]
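-
-A sketch of the call with the predefined multi-DRM CENC policy, assuming the output asset from the encoding step (names are illustrative):
-
-```csharp
-StreamingLocator locator = await client.StreamingLocators.CreateAsync(
-    resourceGroupName,
-    accountName,
-    "myDrmStreamingLocator",
-    new StreamingLocator(
-        assetName: outputAsset.Name,
-        // PlayReady + Widevine (CENC); use "Predefined_MultiDrmStreaming" to add FairPlay (CBCS).
-        streamingPolicyName: "Predefined_MultiDrmCencStreaming",
-        defaultContentKeyPolicyName: "myDrmContentKeyPolicy"));
-```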
-
-## Get a test token
-
-In this tutorial, we specify that the content key policy should have a token restriction. The token-restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the [JWT](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_3) format, and that's what we configure in the sample.
-
-The ContentKeyIdentifierClaim is used in the ContentKeyPolicy, which means that the token presented to the key delivery service must have the identifier of the ContentKey in it. In the sample, we don't specify a content key when creating the Streaming Locator, so the system creates a random one for us. To generate the test token, we must get the ContentKeyId to put in the ContentKeyIdentifierClaim claim.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#GetToken)]
-
-## Build a streaming URL
-
-Now that the [StreamingLocator](/rest/api/media/streaminglocators) has been created, you can get the streaming URLs. To build a URL, you need to concatenate the [StreamingEndpoint](/rest/api/media/streamingendpoints) host name and the **Streaming Locator** path. In this sample, the *default* **Streaming Endpoint** is used. When you first create a Media Services account, this *default* **Streaming Endpoint** is in a stopped state, so you need to call **Start**.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#GetMPEGStreamingUrl)]
-
-When you run the app, you see the following screen:
-
-![Protect with DRM](./media/drm-protect-with-drm-tutorial/playready_encrypted_url.png)
-
-You can open a browser and paste the resulting URL to launch the Azure Media Player demo page with the URL and token filled out for you already.
-
-## Clean up resources in your Media Services account
-
-Generally, you should clean up everything except objects that you're planning to reuse (typically, you'll reuse Transforms, StreamingLocators, and so on). If you want your account to be clean after experimenting, delete the resources that you don't plan to reuse. For example, the following code deletes the job, the created assets, and the content key policy:
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/EncryptWithDRM/Program.cs#CleanUp)]
-
-## Clean up resources
-
-If you no longer need any of the resources in your resource group, including the Media Services and storage accounts you created for this tutorial, delete the resource group you created earlier.
-
-Execute the following CLI command:
-
-```azurecli
-az group delete --name amsResourceGroup
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Next steps
-
-Check out the next tutorial:
-
-> [!div class="nextstepaction"]
-> [Protect with AES-128](drm-protect-with-aes128-tutorial.md)
media-services Drm Remove Option Content Key Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-remove-option-content-key-policy-how-to.md
- Title: Remove an option from a content key policy
-description: This article shows how to remove an option from a content key policy.
- Previously updated: 03/10/2022
-# Remove an option from a content key policy
--
-## Methods
-
-Use the following methods to remove an option from a content key policy.
-
-## [CLI](#tab/cli/)
---
media-services Drm Show Content Key Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-show-content-key-policy-how-to.md
- Title: Show an existing content key policy
-description: This article shows how to show an existing content key policy.
- Previously updated: 03/10/2022
-# Show an existing content key policy
--
-## Methods
-
-Use the following methods to show an existing content key policy.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
---
media-services Drm Update Content Key Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-update-content-key-policy-how-to.md
- Title: Update an existing content key policy
-description: This article shows how to update an existing content key policy.
- Previously updated: 03/10/2022
-# Update an existing content key policy
--
-## Methods
-
-Use the following methods to update an existing content key policy.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
---
media-services Drm Update Option Content Key Policy How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-update-option-content-key-policy-how-to.md
- Title: Update an option in a content key policy
-description: This article shows how to update an option in a content key policy.
- Previously updated: 03/10/2022
-# Update an option in a content key policy
--
-## Methods
-
-Use the following methods to update an option in a content key policy.
-
-## [CLI](#tab/cli/)
---
media-services Drm Widevine License Template Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/drm-widevine-license-template-concept.md
- Title: Media Services v3 Widevine license template overview
-description: Learn about Azure Media Services with the Widevine license template and how it is used to configure Widevine licenses.
- Previously updated: 05/07/2020
-# Media Services v3 with Widevine license template overview
-
-Azure Media Services enables you to encrypt your content with **Google Widevine**. Media Services also provides a service for delivering Widevine licenses. You can use Azure Media Services APIs to configure Widevine licenses. When a player tries to play your Widevine-protected content, a request is sent to the license delivery service to obtain the license. If the license service approves the request, the service issues the license. It's sent to the client and is used to decrypt and play the specified content.
--
-A Widevine license request is formatted as a JSON message.
---
-```json
-{
- "payload":"<license challenge>",
- "content_id": "<content id>"
- "provider": "<provider>"
- "allowed_track_types":"<types>",
- "content_key_specs":[
- {
- "track_type":"<track type 1>"
- },
- {
- "track_type":"<track type 2>"
- },
- …
- ],
- "policy_overrides":{
- "can_play":<can play>,
- "can persist":<can persist>,
- "can_renew":<can renew>,
- "rental_duration_seconds":<rental duration>,
- "playback_duration_seconds":<playback duration>,
- "license_duration_seconds":<license duration>,
- "renewal_recovery_duration_seconds":<renewal recovery duration>,
- "renewal_server_url":"<renewal server url>",
- "renewal_delay_seconds":<renewal delay>,
- "renewal_retry_interval_seconds":<renewal retry interval>,
- "renew_with_usage":<renew with usage>
- }
-}
-```
-
->[!NOTE]
-> You can create an empty message with no values, just "{}". Then a license template is created with defaults. The defaults work for most cases. Microsoft-based license-delivery scenarios should always use the defaults. If you need to set the "provider" and "content_id" values, the provider must match your Widevine credentials.
-
-## JSON message
-
-| Name | Value | Description |
-| | | |
-| payload |Base64-encoded string |The license request sent by a client. |
-| content_id |Base64-encoded string |Identifier used to derive the key ID and content key for each content_key_specs.track_type. |
-| provider |string |Used to look up content keys and policies. If Microsoft key delivery is used for Widevine license delivery, this parameter is ignored. |
-| policy_name |string |Name of a previously registered policy. Optional. |
-| allowed_track_types |enum |SD_ONLY or SD_HD. Controls which content keys are included in a license. |
-| content_key_specs |Array of JSON structures, see the section "Content key specs." |A finer-grained control on which content keys to return. For more information, see the section "Content key specs." Only one of the allowed_track_types and content_key_specs values can be specified. |
-| use_policy_overrides_exclusively |Boolean, true or false |Use policy attributes specified by policy_overrides, and omit all previously stored policy. |
-| policy_overrides |JSON structure, see the section "Policy overrides." |Policy settings for this license. In the event this asset has a predefined policy, these specified values are used. |
-| session_init |JSON structure, see the section "Session initialization." |Optional data is passed to the license. |
-| parse_only |Boolean, true or false |The license request is parsed, but no license is issued. However, values from the license request are returned in the response. |
-
-## Content key specs
-If a pre-existing policy exists, there is no need to specify any of the values in the content key spec. The pre-existing policy associated with this content is used to determine the output protection, such as High-bandwidth Digital Content Protection (HDCP) and the Copy General Management System (CGMS). If a pre-existing policy isn't registered with the Widevine license server, the content provider can inject the values into the license request.
-
-Each content_key_specs value must be specified for all tracks, regardless of the use_policy_overrides_exclusively option.
-
-| Name | Value | Description |
-| | | |
-| content_key_specs.track_type |string |A track type name. If content_key_specs is specified in the license request, make sure to specify all track types explicitly. Failure to do so results in failure to play back past 10 seconds. |
-| content_key_specs.security_level |uint32 |Defines client robustness requirements for playback: <br/> 1 - Software-based white-box cryptography is required. <br/> 2 - Software cryptography and an obfuscated decoder are required. <br/> 3 - The key material and cryptography operations must be performed within a hardware-backed trusted execution environment. <br/> 4 - The cryptography and decoding of content must be performed within a hardware-backed trusted execution environment. <br/> 5 - The cryptography, decoding, and all handling of the media (compressed and uncompressed) must be handled within a hardware-backed trusted execution environment. |
-| content_key_specs.required_output_protection.hdcp |string, one of HDCP_NONE, HDCP_V1, HDCP_V2 |Indicates whether HDCP is required. |
-| content_key_specs.key |Base64-encoded string |Content key to use for this track. If specified, the track_type or key_id is required. The content provider can use this option to inject the content key for this track instead of letting the Widevine license server generate or look up a key. |
-| content_key_specs.key_id |Base64-encoded string binary, 16 bytes |Unique identifier for the key. |
-
-## Policy overrides
-| Name | Value | Description |
-| | | |
-| policy_overrides&#46;can_play |Boolean, true or false |Indicates that playback of the content is allowed. Default is false. |
-| policy_overrides&#46;can_persist |Boolean, true or false |Indicates that the license might be persisted to nonvolatile storage for offline use. Default is false. |
-| policy_overrides&#46;can_renew |Boolean, true or false |Indicates that renewal of this license is allowed. If true, the duration of the license can be extended by heartbeat. Default is false. |
-| policy_overrides&#46;license_duration_seconds |int64 |Indicates the time window for this specific license. A value of 0 indicates that there is no limit to the duration. Default is 0. |
-| policy_overrides&#46;rental_duration_seconds |int64 |Indicates the time window while playback is permitted. A value of 0 indicates that there is no limit to the duration. Default is 0. |
-| policy_overrides&#46;playback_duration_seconds |int64 |The viewing window of time after playback starts within the license duration. A value of 0 indicates that there is no limit to the duration. Default is 0. |
-| policy_overrides&#46;renewal_server_url |string |All heartbeat (renewal) requests for this license is directed to the specified URL. This field is used only if can_renew is true. |
-| policy_overrides&#46;renewal_delay_seconds |int64 |How many seconds after license_start_time before renewal is first attempted. This field is used only if can_renew is true. Default is 0. |
-| policy_overrides&#46;renewal_retry_interval_seconds |int64 |Specifies the delay in seconds between subsequent license renewal requests, in case of failure. This field is used only if can_renew is true. |
-| policy_overrides&#46;renewal_recovery_duration_seconds |int64 |The window of time in which playback can continue while renewal is attempted, yet unsuccessful due to back-end problems with the license server. A value of 0 indicates that there is no limit to the duration. This field is used only if can_renew is true. |
-| policy_overrides&#46;renew_with_usage |Boolean, true or false |Indicates that the license is sent for renewal when usage starts. This field is used only if can_renew is true. |
-
-## Session initialization
-| Name | Value | Description |
-| | | |
-| provider_session_token |Base64-encoded string |This session token is passed back in the license and exists in subsequent renewals. The session token doesn't persist beyond sessions. |
-| provider_client_token |Base64-encoded string |Client token to send back in the license response. If the license request contains a client token, this value is ignored. The client token persists beyond license sessions. |
-| override_provider_client_token |Boolean, true or false |If false and the license request contains a client token, use the token from the request even if a client token was specified in this structure. If true, always use the token specified in this structure. |
-
-## Configure your Widevine license with .NET
-
-Media Services provides a class that lets you configure a Widevine license. To construct the license, pass JSON to [WidevineTemplate](/dotnet/api/microsoft.azure.management.media.models.contentkeypolicywidevineconfiguration.widevinetemplate#Microsoft_Azure_Management_Media_Models_ContentKeyPolicyWidevineConfiguration_WidevineTemplate).
-
-To configure the template, you can:
-
-### Directly construct a JSON string
-
-This method can be error-prone. It's recommended that you use the other method, described in [Define needed classes and serialize to JSON](#classes).
-
-```csharp
-ContentKeyPolicyWidevineConfiguration objContentKeyPolicyWidevineConfiguration = new ContentKeyPolicyWidevineConfiguration
-{
- WidevineTemplate = @"{""allowed_track_types"":""SD_HD"",""content_key_specs"":[{""track_type"":""SD"",""security_level"":1,""required_output_protection"":{""hdcp"":""HDCP_V2""}}],""policy_overrides"":{""can_play"":true,""can_persist"":true,""can_renew"":false}}"
-};
-```
-
-### <a id="classes"></a> Define needed classes and serialize to JSON
-
-#### Define classes
-
-The following example shows definitions of classes that map to the Widevine JSON schema. You can instantiate the classes before serializing them to a JSON string.
-
-```csharp
-/// <summary>
-/// Widevine PolicyOverrides class.
-/// </summary>
-public class PolicyOverrides
-{
- /// <summary>
- /// Gets or sets a value indicating whether playback of the content is allowed. Default is false.
- /// </summary>
- [JsonProperty("can_play")]
- public bool CanPlay { get; set; }
-
- /// <summary>
- /// Gets or sets a value indicating whether the license might be persisted to nonvolatile storage for offline use. Default is false.
- /// </summary>
- [JsonProperty("can_persist")]
- public bool CanPersist { get; set; }
-
- /// <summary>
- /// Gets or sets a value indicating whether renewal of this license is allowed. If true, the duration of the license can be extended by heartbeat. Default is false.
- /// </summary>
- [JsonProperty("can_renew")]
- public bool CanRenew { get; set; }
-
- /// <summary>
- /// Gets or sets the time window while playback is permitted. A value of 0 indicates that there is no limit to the duration. Default is 0.
- /// </summary>
- [JsonProperty("rental_duration_seconds")]
- public int RentalDurationSeconds { get; set; }
-
- /// <summary>
- /// Gets or sets the viewing window of time after playback starts within the license duration. A value of 0 indicates that there is no limit to the duration. Default is 0.
- /// </summary>
- [JsonProperty("playback_duration_seconds")]
- public int PlaybackDurationSeconds { get; set; }
-
- /// <summary>
- /// Gets or sets the time window for this specific license. A value of 0 indicates that there is no limit to the duration. Default is 0.
- /// </summary>
- [JsonProperty("license_duration_seconds")]
- public int LicenseDurationSeconds { get; set; }
-}
-
-/// <summary>
-/// Widevine ContentKeySpec class.
-/// </summary>
-public class ContentKeySpec
-{
- /// <summary>
- /// Gets or sets track type.
- /// If content_key_specs is specified in the license request, make sure to specify all track types explicitly.
- /// Failure to do so results in failure to play back past 10 seconds.
- /// </summary>
- [JsonProperty("track_type")]
- public string TrackType { get; set; }
-
- /// <summary>
- /// Gets or sets client robustness requirements for playback.
- /// 1 - Software-based white-box cryptography is required.
- /// 2 - Software cryptography and an obfuscated decoder are required.
- /// 3 - The key material and cryptography operations must be performed within a hardware-backed trusted execution environment.
- /// 4 - The cryptography and decoding of content must be performed within a hardware-backed trusted execution environment.
- /// 5 - The cryptography, decoding, and all handling of the media (compressed and uncompressed) must be handled within a hardware-backed trusted execution environment.
- /// </summary>
- [JsonProperty("security_level")]
- public int SecurityLevel { get; set; }
-
- /// <summary>
- /// Gets or sets the OutputProtection.
- /// </summary>
- [JsonProperty("required_output_protection")]
- public OutputProtection RequiredOutputProtection { get; set; }
-}
-
-/// <summary>
-/// OutputProtection Widevine class.
-/// </summary>
-public class OutputProtection
-{
- /// <summary>
- /// Gets or sets HDCP protection.
- /// Supported values : HDCP_NONE, HDCP_V1, HDCP_V2
- /// </summary>
- [JsonProperty("hdcp")]
- public string HDCP { get; set; }
-
- /// <summary>
- /// Gets or sets CGMS.
- /// </summary>
- [JsonProperty("cgms_flags")]
- public string CgmsFlags { get; set; }
-}
-
-/// <summary>
-/// Widevine template.
-/// </summary>
-public class WidevineTemplate
-{
- /// <summary>
- /// Gets or sets the allowed track types.
- /// SD_ONLY or SD_HD.
- /// Controls which content keys are included in a license.
- /// </summary>
- [JsonProperty("allowed_track_types")]
- public string AllowedTrackTypes { get; set; }
-
- /// <summary>
- /// Gets or sets a finer-grained control on which content keys to return.
- /// For more information, see the section "Content key specs."
- /// Only one of the allowed_track_types and content_key_specs values can be specified.
- /// </summary>
- [JsonProperty("content_key_specs")]
- public ContentKeySpec[] ContentKeySpecs { get; set; }
-
- /// <summary>
- /// Gets or sets policy settings for the license.
- /// In the event this asset has a predefined policy, these specified values are used.
- /// </summary>
- [JsonProperty("policy_overrides")]
- public PolicyOverrides PolicyOverrides { get; set; }
-}
-```
-
-#### Configure the license
-
-Use classes defined in the previous section to create JSON that is used to configure [WidevineTemplate](/dotnet/api/microsoft.azure.management.media.models.contentkeypolicywidevineconfiguration.widevinetemplate#Microsoft_Azure_Management_Media_Models_ContentKeyPolicyWidevineConfiguration_WidevineTemplate):
-
-```csharp
-private static ContentKeyPolicyWidevineConfiguration ConfigureWidevineLicenseTempate()
-{
- WidevineTemplate template = new WidevineTemplate()
- {
- AllowedTrackTypes = "SD_HD",
- ContentKeySpecs = new ContentKeySpec[]
- {
- new ContentKeySpec()
- {
- TrackType = "SD",
- SecurityLevel = 1,
- RequiredOutputProtection = new OutputProtection()
- {
- HDCP = "HDCP_V2"
- }
- }
- },
- PolicyOverrides = new PolicyOverrides()
- {
- CanPlay = true,
- CanPersist = true,
- CanRenew = false,
- RentalDurationSeconds = 2592000,
- PlaybackDurationSeconds = 10800,
- LicenseDurationSeconds = 604800,
- }
- };
-
- ContentKeyPolicyWidevineConfiguration objContentKeyPolicyWidevineConfiguration = new ContentKeyPolicyWidevineConfiguration
- {
- WidevineTemplate = Newtonsoft.Json.JsonConvert.SerializeObject(template)
- };
- return objContentKeyPolicyWidevineConfiguration;
-}
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-
-Check out how to [protect with DRM](drm-protect-with-drm-tutorial.md)
media-services Encode Autogen Bitrate Ladder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-autogen-bitrate-ladder.md
- Title: Encode videos with Standard Encoder in Media Services
-description: This topic shows how to use the Standard Encoder in Media Services to encode an input video with an auto-generated bitrate ladder, based on the input resolution and bitrate.
- Previously updated: 09/21/2021
-# Encode with an auto-generated bitrate ladder
--
-## Overview
-
-This article explains how to use the Standard Encoder in Media Services to encode an input video into an auto-generated bitrate ladder (bitrate-resolution pairs) based on the input resolution and bitrate. This built-in encoder setting, or preset, will never exceed the input resolution and bitrate. For example, if the input is 720p at 3 Mbps, output remains 720p at best, and will start at rates lower than 3 Mbps.
-
-### Encoding for streaming
-
-When you use the **AdaptiveStreaming** or **H265AdaptiveStreaming** preset in **Transform**, you get an output that is suitable for delivery via streaming protocols like HLS and DASH. When using one of these two presets, the service intelligently determines how many video layers to generate and at what bitrate and resolution. The output content contains MP4 files with AAC-encoded audio and either H.264-encoded video (in the case of the AdaptiveStreaming preset) or H.265/HEVC-encoded video (in the case of the H265AdaptiveStreaming preset). The output MP4 files are non-interleaved.
-
-To see an example of how this preset is used, see [Stream a file](stream-files-dotnet-quickstart.md).
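-
-For reference, a `TransformOutput` that selects one of these presets looks roughly like the following sketch (this assumes an SDK version that exposes the `H265AdaptiveStreaming` preset name; substitute `EncoderNamedPreset.AdaptiveStreaming` for H.264):
-
-```csharp
-// One output that requests the auto-generated HEVC bitrate ladder.
-var transformOutput = new TransformOutput(
-    new BuiltInStandardEncoderPreset()
-    {
-        PresetName = EncoderNamedPreset.H265AdaptiveStreaming
-    });
-```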
-
-## Output
-
-This section shows three examples of the output video layers produced by the Media Services encoder as a result of encoding with the **AdaptiveStreaming** (H.264) or the **H265AdaptiveStreaming** (HEVC) presets. In all cases, the output contains an audio-only MP4 file with stereo audio encoded at 128 kbps.
-
-### Example 1
-Source with height "1080" and framerate "29.970" produces 6 video layers:
-
-|Layer|Height|Width|Bitrate (kbps)|
-|||||
-|1|1080|1920|6780|
-|2|720|1280|3520|
-|3|540|960|2210|
-|4|360|640|1150|
-|5|270|480|720|
-|6|180|320|380|
-
-### Example 2
-Source with height "720" and framerate "23.970" produces 5 video layers:
-
-|Layer|Height|Width|Bitrate (kbps)|
-|||||
-|1|720|1280|2940|
-|2|540|960|1850|
-|3|360|640|960|
-|4|270|480|600|
-|5|180|320|320|
-
-### Example 3
-Source with height "360" and framerate "29.970" produces 3 video layers:
-
-|Layer|Height|Width|Bitrate (kbps)|
-|||||
-|1|360|640|700|
-|2|270|480|440|
-|3|180|320|230|
--
-## Content-aware encoding comparison
-
-The [content-aware encoding presets](./encode-content-aware-concept.md) offer a better solution than the adaptive streaming presets by analyzing the source content before deciding the right set of output bitrates and resolutions to use in the ladder.
-It's recommended that you try the [content-aware encoding presets](./encode-content-aware-concept.md) first, before using the more static, fixed ladder provided by the adaptive bitrate streaming presets.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Stream a file](stream-files-dotnet-quickstart.md)
-> [Using the content-aware encoding presets](./encode-content-aware-concept.md)
-> [How to use content-aware encoding](./encode-content-aware-how-to.md)
media-services Encode Basic Encoding Python Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-basic-encoding-python-quickstart.md
- Title: Media Services Python basic encoding quickstart
-description: This quickstart shows you how to do basic encoding with Python and Azure Media Services.
- Previously updated: 01/25/2022
-# Media Services basic encoding with Python
--
-## Introduction
-
-This quickstart shows you how to do basic encoding with Python and Azure Media Services. It uses the 2020-05-01 Media Services v3 API.
-
-## Prerequisites
-- If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-- [Create a resource group](../../azure-resource-manager/management/manage-resource-groups-portal.md#create-resource-groups) to use with this quickstart.
- > [!IMPORTANT]
- > When you create the storage account for your media services account, change the storage authentication type to *System authentication*. Otherwise, you will get authentication errors for this example.
-- [Create a Media Services v3 account](account-create-how-to.md).
-- [Get your storage account key](../../storage/common/storage-account-keys-manage.md#view-account-access-keys).
-- [Create a service principal and key](../../purview/create-service-principal-azure.md).
-## Get the sample
-
-Create a fork and clone the sample located in the [Python samples repository](https://github.com/Azure-Samples/media-services-v3-python). For this quickstart, we're working with the BasicEncoding sample.
-
-## Create the .env file
-
-Get the values from your account to create a *.env* file. Save it without a name and just the extension. Use *sample.env* as a template for your *.env* file. Save the *.env* file to the *BasicEncoding* folder in your local clone.
-
-## Use Python virtual environments
-
-For samples, we recommend you always create and activate a Python virtual environment using the following steps:
-
-1. Open the sample folder in VSCode or other editor.
-2. Create a virtual environment.
-
- ``` bash
- # py -3 uses the global Python interpreter. You can also use python -m venv .venv.
- py -3 -m venv .venv
- ```
-
- This command runs the Python `venv` module and creates a virtual environment in a folder named *.venv*.
-
-3. Activate the virtual environment:
-
- ``` bash
- . .venv/Scripts/activate
- ```
-
- A virtual environment is a folder within a project that isolates a copy of a specific Python interpreter. Once you activate that environment (which Visual Studio Code does automatically), running `pip install` installs a library into that environment only. When you then run your Python code, it runs in the environment's exact context with specific versions of every library. And when you run `pip freeze`, you get the exact list of those libraries. (In many of the samples, you create a requirements.txt file for the libraries you need, then use `pip install -r requirements.txt`. A requirements file is usually needed when you deploy code to Azure.)
-
-## Set up
-
-1. Set up and [configure your local Python dev environment for Azure](/azure/developer/python/configure-local-development-environment).
-
-1. Install the `python-dotenv` library. This will enable you to load the environment variables quickly and easily.
-
- ```bash
- pip install python-dotenv
- ```
-
-1. Install the `azure-identity` library for Python. This module is needed for Azure Active Directory authentication. See the details at [Azure Identity client library for Python](/python/api/overview/azure/identity-readme#environment-variables).
-
- ``` bash
- pip install azure-identity
- ```
-
-1. Install the Python SDK for [Azure Media Services](/python/api/overview/azure/media-services).
-
- The PyPI page for the Media Services Python SDK, with the latest version details, is located at [azure-mgmt-media](https://pypi.org/project/azure-mgmt-media/).
-
- ``` bash
- pip install azure-mgmt-media
- ```
-
-1. Install the [Azure Storage SDK for Python](https://pypi.org/project/azure-storage-blob/).
-
- ``` bash
- pip install azure-storage-blob
- ```
-
-You can optionally install all of the requirements for a given sample by using the *requirements.txt* file in the samples folder. A short sketch showing the installed libraries in use follows.
-
- ``` bash
- pip install -r requirements.txt
- ```
-
-## Try the code
-
-The code below is thoroughly commented. Use the whole script or use parts of it for your own script.
-
-In this sample, a random number is generated for naming things so that you can identify the resources that get created together when you run the script. The random number is optional and can be removed when you're done testing the script.
-
-We're not using the SAS URL for the input asset in this sample.
-
-[!code-python[Main](../../../media-services-v3-python/BasicEncoding/basic-encoding.py)]
-
-## Delete resources
-
-Once you successfully complete the quickstart, delete the resources created in the resource group.
-
-## Next steps
-
-Get familiar with the [Media Services Python SDK](/python/api/azure-mgmt-media/).
-
-## Resources
-
-- See the Azure Media Services [management API](/python/api/overview/azure/mediaservices/management).
-- Learn how to use the [Storage APIs with Python](/azure/developer/python/azure-sdk-example-storage-use?tabs=cmd).
-- Learn more about the [Azure Identity client library for Python](/python/api/overview/azure/identity-readme#environment-variables).
-- Learn more about [Azure Media Services v3](./media-services-overview.md).
-- Learn about the [Azure Python SDKs](/azure/developer/python).
-- Learn more about [usage patterns for Azure Python SDKs](/azure/developer/python/azure-sdk-library-usage-patterns).
-- Find more Azure Python SDKs in the [Azure Python SDK index](/azure/developer/python/azure-sdk-library-package-index).
-- [Azure Storage Blob Python SDK reference](/python/api/azure-storage-blob/)
media-services Encode Concept Preset Overrides https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-concept-preset-overrides.md
-
Title: Using preset overrides to change transform settings on job submission
-description: This article explains how to use preset overrides to adjust transform settings on a per-job basis.
- Previously updated: 09/20/2021
-# Using preset overrides to control per-job settings
-## Transforms and jobs overview
-
-To encode with Media Services v3, you need to create a [Transform](/rest/api/medi).
-
-When encoding or using analytics with Media Services, you can define custom presets in a transform to specify the settings to use. Sometimes you need to override the transform settings on a per-job basis to avoid creating a custom transform for every scenario. To override any setting on your transform preset, you can use the preset override property of the [job output asset](/dotnet/api/microsoft.azure.management.media.models.joboutputasset) before submitting the job to the transform.
-
-## Preset overrides
-
-Preset overrides let you pass in a customized preset that overrides the settings supplied to a transform object after it was first created. This property is available on the [job output asset](/dotnet/api/microsoft.azure.management.media.models.joboutputasset) when submitting a new job to a transform.
-
-This can be useful for situations where you need to override some properties of your custom-defined transforms, or a property on a built-in preset. For example, consider the scenario where you have created a custom transform that uses the [audio analyzer built-in preset](/rest/api/media/transforms/create-or-update#audioanalyzerpreset), but you initially set up that preset to use the audio language setting of "en-us" for English. Every job submitted to that transform would then be locked to the "en-us" language setting for speech-to-text transcription. You could work around this by defining a transform for every language, but that would be much more difficult to manage and you could hit transform quota limitations in your account.
-To best solve this scenario, you use a preset override on the job output asset before submitting the job to the transform. You can then define a single "audio transcription" transform and pass in the required language settings per job.
-
-The preset override provides a way to pass in a new custom preset definition with each job submitted to the transform. This property is available on the [job output](/dotnet/api/microsoft.azure.management.media.models.joboutput) entity in all SDK versions based on the 2021-06-01 version of the API; a Python sketch follows the note below.
-
-For reference, see the [presetOverride](https://github.com/Azure/azure-rest-api-specs/blob/ce90f9b45945c73b8f38649ee6ead390ff6efe7b/specification/mediaservices/resource-manager/Microsoft.Media/stable/2021-06-01/Encoding.json#L1960) property on the job output entity in the REST documentation.
-
-> [!NOTE]
-> You can only use preset overrides to override the settings on a defined preset in the transform. You cannot switch from one specific preset to another type. For example, attempting to override a transform created with the built-in content-aware encoding preset to use another preset like the audio analyzer would result in an error message.
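-
-Before the full .NET example below, here is a rough illustration of the same idea using the Python SDK. This is a hedged sketch, not the official sample: it assumes an `azure-mgmt-media` client (`client`) built on the 2021-06-01 API or later, hypothetical transform, job, and asset names, and a transform that was created with `AudioAnalyzerPreset(audio_language="en-US")`.
-
-```python
-from azure.mgmt.media.models import (AudioAnalyzerPreset, Job, JobInputAsset,
-                                     JobOutputAsset)
-
-# The transform itself was created once with audio_language="en-US".
-# For this job only, override the preset to transcribe Spanish instead.
-job = client.jobs.create(
-    resource_group,
-    account_name,
-    "audio-transcription-transform",  # hypothetical transform name
-    "transcribe-es-job-1",            # hypothetical job name
-    parameters=Job(
-        input=JobInputAsset(asset_name="input-asset"),
-        outputs=[
-            JobOutputAsset(
-                asset_name="output-asset",
-                # preset_override replaces the transform's preset settings
-                # for this job submission only.
-                preset_override=AudioAnalyzerPreset(audio_language="es-ES"),
-            )
-        ],
-    ),
-)
-```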
-## Example of preset override in .NET
-
-A complete example using the .NET SDK for Media Services, showing how to use a preset override with a basic audio analyzer transform, is available on GitHub.
-See the [Analyze a media file with an audio analyzer preset](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/AudioAnalytics/AudioAnalyzer) sample for details on how to use the preset override property of the job output.
-
-## Sample code of preset override in .NET
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/AudioAnalytics/AudioAnalyzer/program.cs#PresetOverride)]
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Next steps
-
-* [Upload, encode, and stream using Media Services](stream-files-tutorial-with-api.md).
-* [Encode from an HTTPS URL using built-in presets](job-input-from-http-how-to.md).
-* [Encode a local file using built-in presets](job-input-from-local-file-how-to.md).
-* [Build a custom preset to target your specific scenario or device requirements](transform-custom-transform-how-to.md).
media-services Encode Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-concept.md
-
Title: Encoding video and audio with Media Services
-description: This article explains encoding video and audio with Azure Media Services.
- Previously updated: 08/31/2020
-# Encoding video and audio with Media Services
-The term encoding in Media Services applies to the process of converting files containing digital video and/or audio from one standard format to another, with the purpose of (a) reducing the size of the files, and/or (b) producing a format that's compatible with a broad range of devices and apps. This process is also referred to as video compression or transcoding. See [Data compression](https://en.wikipedia.org/wiki/Data_compression) and [What Is Encoding and Transcoding?](https://www.streamingmedia.com/Articles/Editorial/What-Is-/What-Is-Encoding-and-Transcoding-75025.aspx) for further discussion of these concepts.
-
-Videos are typically delivered to devices and apps by [progressive download](https://en.wikipedia.org/wiki/Progressive_download) or through [adaptive bitrate streaming](https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming).
-
-> [!IMPORTANT]
-> Media Services does not bill for canceled or errored jobs. For example, a job that has reached 50% progress and is canceled is not billed at 50% of the job minutes. You are only charged for finished jobs.
-
-* To deliver by progressive download, you can use Azure Media Services to convert a digital media file (mezzanine) into an [MP4](https://en.wikipedia.org/wiki/MPEG-4_Part_14) file, which contains video that's been encoded with the [H.264](https://en.wikipedia.org/wiki/H.264/MPEG-4_AVC) codec, and audio that's been encoded with the [AAC](https://en.wikipedia.org/wiki/Advanced_Audio_Coding) codec. This MP4 file is written to an Asset in your storage account. You can use the Azure Storage APIs or SDKs (for example, [Storage REST API](../../storage/common/storage-rest-api-auth.md) or [.NET SDK](../../storage/blobs/storage-quickstart-blobs-dotnet.md)) to download the file directly. If you created the output Asset with a specific container name in storage, use that location. Otherwise, you can use Media Services to [list the asset container URLs](/rest/api/media/assets/listcontainersas), as sketched after this list.
-* To prepare content for delivery by adaptive bitrate streaming, the mezzanine file needs to be encoded at multiple bitrates (high to low). To ensure graceful transition of quality, the resolution of the video is lowered as the bitrate is lowered. This results in a so-called encoding ladder: a table of resolutions and bitrates (see the [auto-generated adaptive bitrate ladder](encode-autogen-bitrate-ladder.md) or use the recommended content-aware encoding preset). You can use Media Services to encode your mezzanine files at multiple bitrates. In doing so, you'll get a set of MP4 files and associated streaming configuration files written to an Asset in your storage account. You can then use the [Dynamic Packaging](encode-dynamic-packaging-concept.md) capability in Media Services to deliver the video via streaming protocols like [MPEG-DASH](https://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP) and [HLS](https://en.wikipedia.org/wiki/HTTP_Live_Streaming). This requires you to create a [Streaming Locator](stream-streaming-locators-concept.md) and build streaming URLs corresponding to the supported protocols, which can then be handed off to devices/apps based on their capabilities.
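-
-As an illustration of the progressive-download option above, the following sketch uses the Python SDK to request a read-only SAS URL for an asset's storage container. It assumes an authenticated `azure-mgmt-media` client (`client`) and a hypothetical asset name; the article's official samples use other SDKs.
-
-```python
-from datetime import datetime, timedelta, timezone
-
-from azure.mgmt.media.models import ListContainerSasInput
-
-# Request read-only SAS URLs for the asset's storage container, valid for 4 hours.
-sas_response = client.assets.list_container_sas(
-    resource_group,
-    account_name,
-    "encoded-output-asset",  # hypothetical asset name
-    parameters=ListContainerSasInput(
-        permissions="Read",
-        expiry_time=datetime.now(timezone.utc) + timedelta(hours=4),
-    ),
-)
-
-# Download the MP4 directly from any of the returned container URLs.
-print(sas_response.asset_container_sas_urls[0])
-```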
-
-The following diagram shows the workflow for on-demand encoding with dynamic packaging.
-
-![Workflow for on-demand encoding with dynamic packaging](./media/encode-dynamic-packaging-concept/media-services-dynamic-packaging.svg)
-
-This topic gives you guidance on how to encode your content with Media Services v3.
-
-## Transforms and jobs
-
-To encode with Media Services v3, you need to create a [Transform](/rest/api/medi).
-
-When encoding with Media Services, you use presets to tell the encoder how the input media files should be processed. In Media Services v3, you use the Standard Encoder to encode your files. For example, you can specify the video resolution and/or the number of audio channels you want in the encoded content.
-
-You can get started quickly with one of the recommended built-in presets based on industry best practices or you can choose to build a custom preset to target your specific scenario or device requirements. For more information, see [Encode with a custom Transform](transform-custom-transform-how-to.md).
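-
-For instance, creating a transform that uses a recommended built-in preset takes only a few lines. The following is a sketch using the Python SDK, assuming an authenticated `AzureMediaServices` client (`client`) and a hypothetical transform name; the same pattern exists in the .NET, CLI, and REST APIs.
-
-```python
-from azure.mgmt.media.models import (BuiltInStandardEncoderPreset, Transform,
-                                     TransformOutput)
-
-# Create (or update) a transform that encodes with the recommended
-# content-aware built-in preset.
-transform = client.transforms.create_or_update(
-    resource_group,
-    account_name,
-    "my-encoding-transform",  # hypothetical transform name
-    parameters=Transform(outputs=[
-        TransformOutput(
-            preset=BuiltInStandardEncoderPreset(preset_name="ContentAwareEncoding")
-        )
-    ]),
-)
-```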
-
-Starting in January 2019, when encoding with the Standard Encoder to produce MP4 file(s), a new .mpi file is generated and added to the output Asset. This MPI file is intended to improve performance for [dynamic packaging](encode-dynamic-packaging-concept.md) and streaming scenarios.
-
-> [!NOTE]
-> You shouldn't modify or remove the MPI file, or take any dependency in your service on the existence (or not) of such a file.
-
-### Creating job input from an HTTPS URL
-
-When you submit Jobs to process your videos, you have to tell Media Services where to find the input video. One of the options is to specify an HTTPS URL as a job input. Currently, Media Services v3 doesn't support chunked transfer encoding over HTTPS URLs.
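-
-As a sketch of what this looks like in code (here with the Python SDK, assuming an authenticated client, an existing transform, and a pre-created output asset; the names and URL are hypothetical):
-
-```python
-from azure.mgmt.media.models import Job, JobInputHttp, JobOutputAsset
-
-# Submit a job whose input is a publicly reachable HTTPS URL.
-job = client.jobs.create(
-    resource_group,
-    account_name,
-    "my-encoding-transform",   # hypothetical transform name
-    "encode-from-url-job-1",   # hypothetical job name
-    parameters=Job(
-        input=JobInputHttp(files=["https://example.com/videos/mezzanine.mp4"]),
-        # The output asset must already exist before the job is submitted.
-        outputs=[JobOutputAsset(asset_name="encoded-output-asset")],
-    ),
-)
-```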
-
-#### Examples
-
-* [Encode from an HTTPS URL with .NET](stream-files-dotnet-quickstart.md)
-* [Encode from an HTTPS URL with REST](stream-files-tutorial-with-rest.md)
-* [Encode from an HTTPS URL with CLI](stream-files-cli-quickstart.md)
-* [Encode from an HTTPS URL with Node.js](stream-files-nodejs-quickstart.md)
-
-### Creating job input from a local file
-
-The input video can be stored as a Media Services Asset, in which case you create an input asset based on a file (stored locally or in Azure Blob storage).
-
-#### Examples
-
-[Encode a local file using built-in presets](job-input-from-local-file-how-to.md)
-
-### Creating job input with subclipping
-
-When encoding a video, you can specify to also trim or clip the source file and produce an output that has only a desired portion of the input video. This functionality works with any [Transform](/rest/api/media/transforms) that's built using either the [BuiltInStandardEncoderPreset](/rest/api/media/transforms/createorupdate#builtinstandardencoderpreset) presets, or the [StandardEncoderPreset](/rest/api/media/transforms/createorupdate#standardencoderpreset) presets.
-
-You can specify to create a [Job](/rest/api/media/jobs/create) with a single clip of a video on-demand or live archive (a recorded event). The job input could be an Asset or an HTTPS URL; a sketch follows the tip below.
-
-> [!TIP]
-> If you want to stream a subclip of your video without re-encoding the video, consider using [Pre-filtering manifests with Dynamic Packager](filters-dynamic-manifest-concept.md).
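-
-As an illustration of subclipping on the job input, here is a sketch with the Python SDK, assuming an authenticated client and hypothetical transform, job, and asset names; the .NET sample linked below is the official reference.
-
-```python
-from datetime import timedelta
-
-from azure.mgmt.media.models import (AbsoluteClipTime, Job, JobInputAsset,
-                                     JobOutputAsset)
-
-# Encode only the portion of the input between 10 and 40 seconds.
-job = client.jobs.create(
-    resource_group,
-    account_name,
-    "my-encoding-transform",  # hypothetical transform name
-    "subclip-job-1",          # hypothetical job name
-    parameters=Job(
-        input=JobInputAsset(
-            asset_name="input-asset",
-            start=AbsoluteClipTime(time=timedelta(seconds=10)),
-            end=AbsoluteClipTime(time=timedelta(seconds=40)),
-        ),
-        outputs=[JobOutputAsset(asset_name="subclip-output-asset")],
-    ),
-)
-```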
-
-#### Examples
-
-See examples:
-
-* [Subclip a video with .NET](transform-subclip-video-how-to.md)
-
-## Built-in presets
-
-Media Services supports the following built-in encoding presets:
-
-### BuiltInStandardEncoderPreset
-
-[BuiltInStandardEncoderPreset](/rest/api/media/transforms/createorupdate#builtinstandardencoderpreset) is used to set a built-in preset for encoding the input video with the Standard Encoder.
-
-The following built-in presets are currently supported:
-
-- **EncoderNamedPreset.AACGoodQualityAudio**: Produces a single MP4 file containing only stereo audio encoded at 192 kbps.
-- **EncoderNamedPreset.AdaptiveStreaming** (recommended): Supports H.264 adaptive bitrate encoding. For more information, see [auto-generating a bitrate ladder](encode-autogen-bitrate-ladder.md).
-- **EncoderNamedPreset.H265AdaptiveStreaming**: Similar to the AdaptiveStreaming preset, but uses the HEVC (H.265) codec. Produces a set of GOP-aligned MP4 files with H.265 video and stereo AAC audio. Auto-generates a bitrate ladder based on the input resolution, bitrate, and frame rate. The auto-generated preset never exceeds the input resolution. For example, if the input is 720p, the output remains 720p at best.
-- **EncoderNamedPreset.ContentAwareEncoding**: Exposes a preset for H.264 content-aware encoding. Produces a set of GOP-aligned MP4s by using content-aware encoding. Given any input content, the service performs an initial lightweight analysis of the input content, and uses the results to determine the optimal number of layers, and the appropriate bitrate and resolution settings, for delivery by adaptive streaming. This preset is particularly effective for low- and medium-complexity videos, where the output files will be at lower bitrates but at a quality that still delivers a good experience to viewers. The output will contain MP4 files with video and audio interleaved. This preset only produces output up to 1080p HD. If 4K output is required, you can configure the preset with [PresetConfigurations](https://github.com/Azure/azure-rest-api-specs/blob/7026463801584950d4ccbaa6b05b15d29555dd3a/specification/mediaservices/resource-manager/Microsoft.Media/stable/2021-06-01/Encoding.json#L2397).
-- **EncoderNamedPreset.H265ContentAwareEncoding**: Exposes a preset for HEVC (H.265) content-aware encoding. Produces a set of GOP-aligned MP4s by using content-aware encoding. Given any input content, the service performs an initial lightweight analysis of the input content, and uses the results to determine the optimal number of layers, and the appropriate bitrate and resolution settings, for delivery by adaptive streaming. This preset is particularly effective for low- and medium-complexity videos, where the output files will be at lower bitrates but at a quality that still delivers a good experience to viewers. The output will contain MP4 files with video and audio interleaved. This preset produces output up to 4K. If 8K output is required, you can configure the preset with [PresetConfigurations](https://github.com/Azure/azure-rest-api-specs/blob/7026463801584950d4ccbaa6b05b15d29555dd3a/specification/mediaservices/resource-manager/Microsoft.Media/stable/2021-06-01/Encoding.json#L2397) by using the "maxBitrateBps" property.
-
- > [!NOTE]
- > Make sure to use **ContentAwareEncoding**, not ContentAwareEncodingExperimental, which is now deprecated.
-
-- **EncoderNamedPreset.H264MultipleBitrate1080p**: Produces a set of eight GOP-aligned MP4 files, ranging from 6000 kbps to 400 kbps, and stereo AAC audio. Resolution starts at 1080p and goes down to 360p.
-- **EncoderNamedPreset.H264MultipleBitrate720p**: Produces a set of six GOP-aligned MP4 files, ranging from 3400 kbps to 400 kbps, and stereo AAC audio. Resolution starts at 720p and goes down to 360p.
-- **EncoderNamedPreset.H264MultipleBitrateSD**: Produces a set of five GOP-aligned MP4 files, ranging from 1600 kbps to 400 kbps, and stereo AAC audio. Resolution starts at 480p and goes down to 360p.
-- **EncoderNamedPreset.H264SingleBitrate1080p**: Produces an MP4 file where the video is encoded with the H.264 codec at 6750 kbps and a picture height of 1080 pixels, and the stereo audio is encoded with the AAC-LC codec at 64 kbps.
-- **EncoderNamedPreset.H264SingleBitrate720p**: Produces an MP4 file where the video is encoded with the H.264 codec at 4500 kbps and a picture height of 720 pixels, and the stereo audio is encoded with the AAC-LC codec at 64 kbps.
-- **EncoderNamedPreset.H264SingleBitrateSD**: Produces an MP4 file where the video is encoded with the H.264 codec at 2200 kbps and a picture height of 480 pixels, and the stereo audio is encoded with the AAC-LC codec at 64 kbps.
-- **EncoderNamedPreset.H265SingleBitrate720p**: Produces an MP4 file where the video is encoded with the HEVC (H.265) codec at 1800 kbps and a picture height of 720 pixels, and the stereo audio is encoded with the AAC-LC codec at 128 kbps.
-- **EncoderNamedPreset.H265SingleBitrate1080p**: Produces an MP4 file where the video is encoded with the HEVC (H.265) codec at 3500 kbps and a picture height of 1080 pixels, and the stereo audio is encoded with the AAC-LC codec at 128 kbps.
-- **EncoderNamedPreset.H265SingleBitrate4K**: Produces an MP4 file where the video is encoded with the HEVC (H.265) codec at 9500 kbps and a picture height of 2160 pixels, and the stereo audio is encoded with the AAC-LC codec at 128 kbps.
-
-To see the most up-to-date presets list, see [built-in presets to be used for encoding videos](/rest/api/media/transforms/createorupdate#encodernamedpreset).
-
-To see how the presets are used, see [Uploading, encoding, and streaming files](stream-files-tutorial-with-api.md).
-
-### StandardEncoderPreset
-
-[StandardEncoderPreset](/rest/api/media/transforms/createorupdate#standardencoderpreset) describes settings to be used when encoding the input video with the Standard Encoder. Use this preset when customizing Transform presets.
-
-#### Considerations
-
-When creating custom presets, the following considerations apply:
-
-- All values for height and width on AVC content must be a multiple of four.
-- In Azure Media Services v3, all of the encoding bitrates are in bits per second. This is different from the presets with our v2 APIs, which used kilobits/second as the unit. For example, if the bitrate in v2 was specified as 128 (kilobits/second), in v3 it would be set to 128000 (bits/second).
-
-### Customizing presets
-
-Media Services fully supports customizing all values in presets to meet your specific encoding needs and requirements. For examples that show how to customize encoder presets, see the list below:
-
-#### Examples
-
-- [Customize presets with .NET](transform-custom-transform-how-to.md)
-
-## Preset schema
-
-In Media Services v3, presets are strongly typed entities in the API itself. You can find the "schema" definition for these objects in [Open API Specification (or Swagger)](https://github.com/Azure/azure-rest-api-specs/tree/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2018-07-01). You can also view the preset definitions (like **StandardEncoderPreset**) in the [REST API](/rest/api/media/transforms/createorupdate#standardencoderpreset), [.NET SDK](/dotnet/api/microsoft.azure.management.media.models.standardencoderpreset) (or other Media Services v3 SDK reference documentation).
-
-## Scaling encoding in v3
-
-To scale media processing, see [Scale with CLI](media-reserved-units-how-to.md).
-For accounts created with the **2020-05-01** or later version of the API or through the Azure portal, scaling and media reserved units are no longer required. Scaling will be automatic and handled by the service internally.
-
-## Billing
-
-Media Services does not bill for canceled or errored jobs. For example, a job that has reached 50% progress and is canceled is not billed at 50% of the job minutes. You are only charged for finished jobs.
-
-For more information, see [pricing](https://azure.microsoft.com/pricing/details/media-services/).
media-services Encode Content Aware Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-content-aware-concept.md
- Title: Content-aware encoding preset
-description: This article discusses content-aware encoding in Microsoft Azure Media Services v3.
- Previously updated: 09/16/2021
-# Content-aware encoding
-## Overview of the content-aware encoding preset
-
-To prepare content for delivery using [adaptive bitrate streaming](https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming), video needs to be encoded at multiple bitrates (high to low) and multiple resolutions. This technique allows today's modern video players on Apple iOS, Android, Windows, and Mac to use streaming protocols that smoothly stream content without buffering. These different renditions of display size (resolution) and quality (bitrate) allow the player to select the best version of the video that the current network conditions can support. The network could vary widely from LTE, 4G, 5G, public Wi-Fi, or a home network.
-
-The process of encoding content into multiple renditions requires the generation of an "encoding ladder": a table of resolutions and bitrates that tells the encoder what to generate. For an example of such a ladder, see the Media Services [built-in encoding presets](/rest/api/media/transforms/createorupdate#encodernamedpreset).
-
-Ideally, you want to be aware of the type of content you're encoding. Using this information, you can tune the encoding ladder to match the complexity and motion in your source video. This means that at each display size (resolution) in the ladder, there should be a bitrate beyond which any increase in quality is not perceptible; the encoder operates at this optimal bitrate value.
-
-The next level of optimization is to select the resolutions based on the content: for example, a video of a PowerPoint presentation with small text would look blurry when encoded below 720 pixel lines in height. In addition, you may have a video that changes in motion and complexity throughout, based on how it was shot and edited. This provides an opportunity to tune and adjust the encoding settings at each scene or shot boundary. A smart encoder can be tasked to optimize encoding settings for each shot within the video.
-
-Azure Media Services provides an [Adaptive Streaming](encode-autogen-bitrate-ladder.md) preset that partially addresses the problem of the variability in the bitrate and resolution of the source videos. However, this preset does not analyze the source content to see how complex it is, or how much motion it contains.
-
-The content-aware encoding preset improves on the more static "adaptive bitrate streaming" encoding preset by adding logic that allows the encoder to seek an optimal bitrate value for a given resolution without requiring extensive computational analysis. This preset outputs a unique "ladder" of GOP-aligned MP4s based on the source file. Given a source video, the preset performs an initial fast analysis of the input content and uses the results to determine the optimal number of layers, bitrates, and resolutions needed to deliver the highest-quality adaptive bitrate streaming experience. This preset is effective with low-to-medium complexity videos, where the output files will be at lower bitrates than the more static Adaptive Streaming preset but at a quality that still delivers a good experience to audiences. The output folder will contain several MP4 files with video and audio ready for streaming.
-
-## Configure output settings
-
-In addition, developers can also control the range of outputs that the content-aware encoding preset uses when deciding the optimal settings for encoding the adaptive bitrate streaming ladder.
-
-By using the **PresetConfigurations** class, developers can pass in a set of constraints and options to the content-aware encoding preset to control the resulting files generated by the encoder. The properties are especially useful for situations where you want to limit all encoding to a specific maximum resolution to control the experience or costs of your encoding jobs. It is also useful to be able to control the maximum and minimum bitrates that your audience may be able to support on a mobile network or in a global region that has bandwidth constraints.
-
-## Supported codecs
-
-The content-aware encoding preset is available for use with the following codecs:
-
-- H.264
-- HEVC (H.265)
-
-## How-to use
-
-See the [content-aware encoding how-to](./encode-content-aware-how-to.md) for details on using the preset in your code and links to complete samples.
-
-## Technical details on content-aware preset
-
-Let's now dig a bit deeper into how the content-aware encoding preset works. The following sample graphs show the comparison using quality metrics like [PSNR](https://en.wikipedia.org/wiki/Peak_signal-to-noise_ratio) and [VMAF](https://en.wikipedia.org/wiki/Video_Multimethod_Assessment_Fusion). The source was created by concatenating short clips of high-complexity shots from movies and TV shows, intended to stress the encoder. By definition, this preset produces results that vary from content to content. It also means that for some content, there may not be a significant reduction in bitrate or improvement in quality.
-
-![Rate-distortion (RD) curve using PSNR](media/encode-content-aware-concept/msrv1.png)
-
-**Figure 1: Rate-distortion (RD) curve using PSNR metric for high complexity source**
-
-![Rate-distortion (RD) curve using VMAF](media/encode-content-aware-concept/msrv2.png)
-
-**Figure 2: Rate-distortion (RD) curve using VMAF metric for high complexity source**
-
-Below are the results for another category of source content, where the encoder was able to determine that the input was of poor quality (many compression artifacts because of the low bitrate). With the content-aware preset, the encoder decided to produce just one output layer, at a low enough bitrate so that most clients would be able to play the stream without stalling.
-
-![RD curve using PSNR](media/encode-content-aware-concept/msrv3.png)
-
-**Figure 3: RD curve using PSNR for low-quality input (at 1080p)**
-
-![RD curve using VMAF](media/encode-content-aware-concept/msrv4.png)
-
-**Figure 4: RD curve using VMAF for low-quality input (at 1080p)**
-
-
-## Next steps
-* [How to use the content-aware encoding presets](encode-content-aware-how-to.md)
-* [Tutorial: Upload, encode, and stream videos with Media Services v3](stream-files-tutorial-with-api.md)
media-services Encode Content Aware How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-content-aware-how-to.md
- Title: How to use the content-aware encoding preset
-description: This article discusses how to use the content-aware encoding preset in Microsoft Azure Media Services v3.
- Previously updated: 09/17/2021
-# How to use the content-aware encoding preset
-## Introduction
-The [content-aware encoding presets](encode-content-aware-concept.md) better prepare content for delivery by [adaptive bitrate streaming](encode-autogen-bitrate-ladder.md), where videos need to be encoded at multiple bitrates. They include custom logic that lets the encoder render the optimal bitrate value for a given resolution without requiring extensive computational analysis. The presets determine the optimal number of layers, and the appropriate bitrate and resolution settings, for delivery by adaptive streaming.
-
-Compared to the adaptive streaming preset, content-aware encoding is more effective for low and medium complexity videos, where the output files will be at a lower bitrate but still at a quality that delivers a good viewing experience.
-
-### Using the content-aware encoding presets
-
-To encode with the content-aware preset in your code, use the [BuiltInStandardEncoderPreset](/dotnet/api/microsoft.azure.management.media.models.builtinstandardencoderpreset) object and set the [PresetName](/dotnet/api/microsoft.azure.management.media.models.builtinstandardencoderpreset.presetname) property to one of the following built-in preset names.
-
-- **ContentAwareEncoding**: This preset supports H.264.
-- **H265ContentAwareEncoding**: This preset supports HEVC (H.265).
-
-> [!NOTE]
-> Make sure to use the **ContentAwareEncoding** preset, not ContentAwareEncodingExperimental. Or, if you'd like to encode with HEVC, you can use **H265ContentAwareEncoding**.
-
-```csharp
-TransformOutput[] output = new TransformOutput[]
-{
- new TransformOutput
- {
- // The preset for the Transform is set to one of Media Services built-in sample presets.
- // You can customize the encoding settings by changing this to use "StandardEncoderPreset" class.
- Preset = new BuiltInStandardEncoderPreset()
- {
- // This sample uses the new preset for content-aware encoding
- PresetName = EncoderNamedPreset.ContentAwareEncoding
- }
- }
-};
-```
-
-## Configure output settings
-
-In addition, developers can also control the range of outputs that the content-aware encoding preset uses when deciding the optimal settings for encoding the adaptive bitrate streaming ladder.
-
-By using the **PresetConfigurations** class, developers can pass in a set of constraints and options to the content-aware encoding preset to control the resulting files generated by the encoder. The properties are especially useful for situations where you want to limit all encoding to a specific maximum resolution to control the experience or costs of your encoding jobs. It is also useful to be able to control the maximum and minimum bitrates that your audience may be able to support on a mobile network or in a global region that has bandwidth constraints.
-
-The PresetConfigurations class allows you to adjust the following settings on the content-aware encoding presets.
-| Property | Description |
-| - | -- |
-| complexity | Speed, Balanced, or Quality. Allows you to configure the encoder settings to control the balance between speed and quality. Example: set Complexity as Speed for faster encoding but less compression efficiency |
-| interleaveOutput | Controls how the output MP4 files are formatted. The files can be interleaved, with both audio and video in each file, to make them easier to share individually; or audio and video can be output as separate files, which makes it easier to late-bind extra language audio tracks or modify the audio later. |
-| keyFrameIntervalInSeconds | Sets the key-frame distance used for each group of pictures (GOP) when encoding the files. Typical modern adaptive streaming protocols use 2 seconds. Depending on what network conditions you are delivering over, it can sometimes help to use longer intervals. Check with your CDN provider or do some experimentation on your own networks. |
-| maxBitrateBps | Constrains the top bitrate of the adaptive streaming ladder. This constraint is useful for controlling storage costs and network delivery costs. In addition, this can also keep your delivered bitrates within a range that is supported by your clients. |
-| maxHeight | The top resolution that is allowed to be encoded in the adaptive ladder. This is useful for also controlling costs, or controlling the experience of your audience. You can limit all encodes to be Standard Definition, or limit them to be max 720P if desired. |
-| maxLayers | This property controls the number of layers in the adaptive streaming ladder, which by definition also controls the number of MP4 files output into the storage container. Lowering this value reduces costs as well: because you produce fewer output renditions, the total number of encoding minutes is reduced. Use this in combination with maxHeight and maxBitrateBps to keep costs in control during encoding and make your billing more predictable. |
-| minBitrateBps | This controls the lowest bitrate that the encoder will choose. Use this to optimize for quality on your low bitrate and low-resolution renditions. |
-| minHeight | This setting controls the smallest resolution that the adaptive bitrate ladder will produce. This helps to avoid clients selecting a very low-resolution rendition that could be too blurry for the type of content that you are encoding. This helps a lot with the quality of experience you are delivering. |
-
-### Example code for configuring content-aware preset
-
-The following code snippet is from the sample [Encode with content-aware, H.264, and constraints](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_H264_ContentAware_Constrained).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/VideoEncoding/Encoding_H264_ContentAware_Constrained/Program.cs#PresetConfigurations)]
-
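-For comparison only, a roughly equivalent configuration with the Python SDK might look like the following. This is a hedged sketch, assuming an authenticated `azure-mgmt-media` client (`client`) built on the 2021-06-01 API or later and a hypothetical transform name; the constraint values are examples, not recommendations.
-
-```python
-from azure.mgmt.media.models import (BuiltInStandardEncoderPreset,
-                                     PresetConfigurations, Transform,
-                                     TransformOutput)
-
-# Constrain the content-aware ladder: max 720p, at most 4 layers,
-# bitrates between 1 Mbps and 3 Mbps, and 2-second GOPs.
-configurations = PresetConfigurations(
-    complexity="Balanced",
-    interleave_output="NonInterleavedOutput",
-    key_frame_interval_in_seconds=2.0,
-    max_bitrate_bps=3_000_000,
-    max_height=720,
-    max_layers=4,
-    min_bitrate_bps=1_000_000,
-    min_height=240,
-)
-
-transform = client.transforms.create_or_update(
-    resource_group,
-    account_name,
-    "cae-constrained-transform",  # hypothetical transform name
-    parameters=Transform(outputs=[
-        TransformOutput(
-            preset=BuiltInStandardEncoderPreset(
-                preset_name="ContentAwareEncoding",
-                configurations=configurations,
-            )
-        )
-    ]),
-)
-```
-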
-## Samples
-
-The following samples demonstrate the various codecs and constraints that can be used, how to instantiate the preset, and how to submit and monitor an encoding job.
-
-### .NET
-
-| Sample | Description|
-|||
-| [Encode with content-aware and H.264](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_H264_ContentAware) | Demonstrates the most basic use of H.264 content-aware encoding without any constraints |
-| [Encode with content-aware, H.264, and constraints](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_H264_ContentAware_Constrained) | Demonstrates how to use the PresetConfigurations class to constrain the output behavior of the preset |
-| [Encode with content-aware and HEVC (H.265)](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_HEVC_ContentAware) | Shows basic usage of the HEVC codec with content-aware encoding and no constraints. The PresetConfigurations class is also supported for HEVC and can be added to this sample|
-
-> [!NOTE]
-> Encoding jobs using the `ContentAwareEncoding` preset are billed based solely on the output minutes. AMS uses two-pass encoding, and there are no additional charges associated with using any of the presets beyond what is listed on our [pricing page](https://azure.microsoft.com/pricing/details/media-services/#overview).
-
-## Next steps
-
-* [Tutorial: Upload, encode, and stream videos with Media Services v3](stream-files-tutorial-with-api.md)
-* [Tutorial: Encode a remote file based on URL and stream the video - REST](stream-files-tutorial-with-rest.md)
-* [Tutorial: Encode a remote file based on URL and stream the video - CLI](stream-files-cli-quickstart.md)
-* [Tutorial: Encode a remote file based on URL and stream the video - .NET](stream-files-dotnet-quickstart.md)
-* [Tutorial: Encode a remote file based on URL and stream the video - Node.js](stream-files-nodejs-quickstart.md)
media-services Encode Dynamic Packaging Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-dynamic-packaging-concept.md
- Title: Dynamic packaging in Azure Media Services v3
-description: This article gives an overview of dynamic packaging in Azure Media Services.
- Previously updated: 09/30/2020
-# Dynamic packaging in Media Services v3
-Azure Media Services provides built-in origin server and packaging capabilities to deliver content in the HLS and MPEG-DASH streaming protocol formats. In AMS, the [streaming endpoint](stream-streaming-endpoint-concept.md) acts as the "origin" server, sending formatted HLS and DASH content to client players that support adaptive bitrate streaming using those popular formats. The streaming endpoint also supports many features, such as just-in-time dynamic packaging with or without content protection, to reach all major devices (like iOS and Android devices).
-
-Most browsers and mobile devices on the market today support and understand the HLS or DASH streaming protocols. For example, iOS requires streams to be delivered in HTTP Live Streaming (HLS) format, and Android devices support HLS as well as MPEG-DASH on certain models (or through the use of the application-level player [ExoPlayer](https://exoplayer.dev/) for Android devices).
-
-In Media Services, a [streaming endpoint](stream-streaming-endpoint-concept.md) (origin) represents a dynamic (just-in-time) packaging and origin service that can deliver your live and on-demand content directly to a client player app. It uses one of the common streaming media protocols mentioned in the following section. *Dynamic packaging* is a feature that comes standard on all streaming endpoints.
-
-The advantages of just-in-time packaging are the following:
-
-* You can store all your files in standard MP4 file format
-* You do not need to store multiple copies of static packaged HLS and DASH formats in blob storage, reducing the amount of video content stored and lowering your overall costs of storage
-* You can instantly take advantage of new protocol updates and changes to the specifications as they evolve over time without need of re-packaging the static content in your catalog
-* You can deliver content with or without encryption and DRM using the same MP4 files in storage
-* You can dynamically filter or alter the manifests with simple asset-level or global filters to remove specific tracks, resolutions, languages, or provide shorter highlight clips from the same MP4 files without re-encoding or re-rendering the content.
-
-## To prepare your source files for delivery
-
-To take advantage of dynamic packaging, you need to [encode](encode-concept.md) your mezzanine (source) file into a set of single or multiple bitrate MP4 (ISO Base Media 14496-12) files. You need to have an [asset](assets-concept.md) with the encoded MP4 and streaming configuration files needed by Media Services dynamic packaging. From this set of MP4 files, you can use dynamic packaging to deliver video via the streaming media protocols described below.
-
-Typically, you will use the Azure Media Services standard encoder to generate this content using the content-aware encoding presets or the adaptive bitrate presets. Both generate a set of MP4 files ready for streaming and dynamic packaging. Alternatively, you can choose to encode in an external service, on-premises, or on your own VMs or serverless function apps. Content encoded externally can be uploaded into an asset for streaming provided that it meets the encoding requirements for adaptive bitrate streaming formats. An example project of uploading a pre-encoded MP4 for streaming is available in the .NET SDK samples; see [Stream Existing Mp4 files](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Streaming/StreamExistingMp4).
-Azure Media Services dynamic packaging only supports video and audio files in the MP4 container format. Audio files must be encoded into an MP4 container as well when using alternate codecs like Dolby.
-
-> [!TIP]
-> One way to get the MP4 and streaming configuration files is to [encode your mezzanine file with Media Services](#encode-to-adaptive-bitrate-mp4s). We recommend using the [content aware encoding preset](encode-content-aware-concept.md) to generate the best adaptive streaming layers and settings for your content. See code samples for encoding with the [.NET SDK in the VideoEncoding folder](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding)
-
-To make videos in the encoded asset available to clients for playback, you have to publish the asset using a [Streaming Locator](stream-streaming-locators-concept.md) and build the appropriate HLS and DASH streaming URLs. By changing the protocol used in the URL format query, the service will deliver the appropriate streaming manifest (HLS or MPEG-DASH).
-
-As a result, you only need to store and pay for the files in a single storage format (MP4), and Media Services will generate and serve the appropriate HLS or DASH manifests based on requests from your client players.
-
-If you plan to protect your content by using Media Services dynamic encryption, see [Streaming protocols and encryption types](drm-content-protection-concept.md#streaming-protocols-and-encryption-types).
-
-## Deliver HLS
-### HLS dynamic packaging
-
-Your streaming client can specify the following HLS formats. We recommend using the CMAF format for compatibility with the latest players and iOS devices. For legacy devices, the v4 and v3 formats are available as well by simply changing the format query string; a URL-building sketch follows the table.
-
-|Protocol| Format string| Example|
-||||
-|HLS CMAF (recommended)| format=m3u8-cmaf | `https://amsv3account-usw22.streaming.media.azure.net/21b17732-0112-4d76-b526-763dcd843449/ignite.ism/manifest(format=m3u8-cmaf)`|
-|HLS V4 | format=m3u8-aapl | `https://amsv3account-usw22.streaming.media.azure.net/21b17732-0112-4d76-b526-763dcd843449/ignite.ism/manifest(format=m3u8-aapl)`|
-|HLS V3 | format=m3u8-aapl-v3 | `https://amsv3account-usw22.streaming.media.azure.net/21b17732-0112-4d76-b526-763dcd843449/ignite.ism/manifest(format=m3u8-aapl-v3)`|
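-
-Because the three variants differ only in the format token, building these URLs is simple string composition. The helper below is a sketch in Python; the host name, locator ID, and manifest name are hypothetical and in practice come from your streaming endpoint and streaming locator paths.
-
-```python
-def build_manifest_url(host_name: str, locator_id: str,
-                       manifest_name: str, format_string: str) -> str:
-    """Build a dynamic packaging URL, e.g. with format_string set to
-    "m3u8-cmaf", "m3u8-aapl", or "m3u8-aapl-v3"."""
-    return (f"https://{host_name}/{locator_id}/{manifest_name}.ism/"
-            f"manifest(format={format_string})")
-
-# Hypothetical values for illustration.
-print(build_manifest_url("amsv3account-usw22.streaming.media.azure.net",
-                         "21b17732-0112-4d76-b526-763dcd843449",
-                         "ignite", "m3u8-cmaf"))
-```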
-> [!NOTE]
-> Previous guidelines from Apple recommended that the fallback for low bandwidth networks was to provide an audio-only stream. At present, the Media Services encoder automatically generates an audio-only track. Apple guidelines now state that the audio-only track should *not* be included, especially for Apple TV distribution. In order to prevent the player from defaulting to an audio-only track, we suggest using the "audio-only=false" tag in the URL, which removes the audio-only rendition in HLS, or simply use HLS-V3. For example, `http://host/locator/asset.ism/manifest(format=m3u8-aapl,audio-only=false)`.
-### HLS packing ratio for VOD
-
-To control the packing ratio of VOD content for older HLS formats, you can set the **fragmentsPerHLSSegment** metadata tag in the .ism file to override the default 3:1 packing ratio for TS segments delivered from the older v3 and v4 HLS format manifests. This setting change requires you to directly modify the .ism file in storage.
-
-Example .ism server manifest with **fragmentsPerHLSSegment** set to 1.
-``` xml
- <?xml version="1.0" encoding="utf-8" standalone="yes"?>
- <smil xmlns="http://www.w3.org/2001/SMIL20/Language">
- <head>
- <meta name="formats" content="mp4" />
- <meta name="fragmentsPerHLSSegment" content="1"/>
- </head>
- <body>
- <switch>
- ...
- </switch>
- </body>
- </smil>
-```
-
-## Deliver DASH
-### DASH dynamic packaging
-
-Your streaming client can specify the following MPEG-DASH formats:
-
-|Protocol| Format string| Example|
-||||
-|MPEG-DASH CMAF (recommended)| format=mpd-time-cmaf | `https://amsv3account-usw22.streaming.media.azure.net/21b17732-0112-4d76-b526-763dcd843449/ignite.ism/manifest(format=mpd-time-cmaf)` |
-|MPEG-DASH CSF (legacy)| format=mpd-time-csf | `https://amsv3account-usw22.streaming.media.azure.net/21b17732-0112-4d76-b526-763dcd843449/ignite.ism/manifest(format=mpd-time-csf)` |
-
-## Deliver Smooth Streaming manifests
-### Smooth Streaming dynamic packaging
-
-Your streaming client can specify the following Smooth Streaming formats:
-
-|Protocol|Notes/examples|
-|||
-|Smooth Streaming| `https://amsv3account-usw22.streaming.media.azure.net/21b17732-0112-4d76-b526-763dcd843449/ignite.ism/manifest`|
-|Smooth Streaming 2.0 (legacy manifest)|By default, Smooth Streaming manifest format contains the repeat tag (r-tag). However, some players do not support the `r-tag`. Clients with these players can use a format that disables the r-tag:<br/><br/>`https://amsv3account-usw22.streaming.media.azure.net/21b17732-0112-4d76-b526-763dcd843449/ignite.ism/manifest(format=fmp4-v20)`|
-
-> [!NOTE]
-> Smooth Streaming requires that both audio and video should be present in your stream.
-
-## On-demand streaming workflow
-
-The following steps show a common Media Services streaming workflow where dynamic packaging is used along with the Standard Encoder in Azure Media Services.
-
-1. [Upload an input file](job-input-from-http-how-to.md) such as an MP4, QuickTime/MOV, or other supported file format. This file is also referred to as the mezzanine or source file. For the list of supported formats, see [Formats Supported by the Standard Encoder](encode-media-encoder-standard-formats-reference.md).
-1. [Encode](#encode-to-adaptive-bitrate-mp4s) your mezzanine file into an H.264/AAC MP4 adaptive bitrate set.
-
- If you already have encoded files and just want to copy and stream them, use the [CopyVideo](/rest/api/media/transforms/createorupdate#copyvideo) and [CopyAudio](/rest/api/media/transforms/createorupdate#copyaudio) APIs. A new MP4 file with a streaming manifest (.ism file) will be created as a result.
-
- In addition, you can just generate the .ism and .ismc files for a pre-encoded file, as long as it was encoded using the right settings for adaptive bitrate streaming (typically 2-second GOPs, key frame distances of 2 seconds minimum and maximum, and constant bitrate (CBR) mode encoding).
-
- See the [stream existing Mp4 .NET SDK sample](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Streaming/StreamExistingMp4) for details on how to generate the .ism (server manifest) and .ismc (client manifests) for streaming from an existing, pre-encoded MP4 file.
-
-1. Publish the output asset that contains the adaptive bitrate MP4 set. You publish by creating a [streaming locator](stream-streaming-locators-concept.md).
-1. Build URLs that target different formats (HLS, MPEG-DASH, and Smooth Streaming), as sketched after this list. The *streaming endpoint* would take care of serving the correct manifest and requests for all these different formats.
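-
-The following sketch condenses steps 3 and 4 using the Python SDK. It assumes an authenticated client, an existing encoded asset, the default streaming endpoint, and hypothetical locator and asset names; it publishes with the predefined clear (unencrypted) streaming policy.
-
-```python
-from azure.mgmt.media.models import StreamingLocator
-
-# Step 3: publish the encoded asset with a clear (unencrypted) streaming policy.
-client.streaming_locators.create(
-    resource_group,
-    account_name,
-    "my-streaming-locator",  # hypothetical locator name
-    parameters=StreamingLocator(
-        asset_name="encoded-output-asset",
-        streaming_policy_name="Predefined_ClearStreamingOnly",
-    ),
-)
-
-# Step 4: combine the streaming endpoint host name with the locator paths.
-endpoint = client.streaming_endpoints.get(resource_group, account_name, "default")
-paths = client.streaming_locators.list_paths(
-    resource_group, account_name, "my-streaming-locator")
-for streaming_path in paths.streaming_paths:
-    for path in streaming_path.paths:
-        print(f"{streaming_path.streaming_protocol}: "
-              f"https://{endpoint.host_name}{path}")
-```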
-
-The following diagram shows the on-demand streaming with dynamic packaging workflow.
-
-![Diagram of a workflow for on-demand streaming with dynamic packaging](./media/encode-dynamic-packaging-concept/media-services-dynamic-packaging.svg)
-
-The download path is present in the above image just to show you that you can download an MP4 file directly through the *streaming endpoint* (origin) (you specify the downloadable [streaming policy](stream-streaming-policy-concept.md) on the streaming locator).<br/>The dynamic packager does not alter the file. You can optionally use the Azure Blob storage APIs to access an MP4 directly for progressive downloading if you wish to bypass the *streaming endpoint* (origin) features.
-
-### Encode to adaptive bitrate MP4s
-
-The following articles show examples of [how to encode a video with Media Services](encode-concept.md):
-
-* [Use content aware encoding](encode-content-aware-concept.md).
-* [Encode from an HTTPS URL by using built-in presets](job-input-from-http-how-to.md).
-* [Encode a local file by using built-in presets](job-input-from-local-file-how-to.md).
-* [Build a custom preset to target your specific scenario or device requirements](transform-custom-transform-how-to.md).
-* [Code samples for encoding with Standard Encoder using .NET](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding)
-
-See the list of supported Standard Encoder input [formats and codecs](encode-media-encoder-standard-formats-reference.md).
-
-## Live streaming workflow
-
-A live event can be set to either a *pass-through* (an on-premises live encoder sends a multiple bitrate stream) or *live encoding* (an on-premises live encoder sends a single bitrate stream).
-
-Here's a common workflow for live streaming with *dynamic packaging*:
-
-1. Create a [live event](live-event-outputs-concept.md); a sketch follows this list.
-1. Get the ingest URL and configure your on-premises encoder to use the URL to send the contribution feed.
-1. Get the preview URL and use it to verify that the input from the encoder is being received.
-1. Create a new asset.
-1. Create a live output and use the asset name that you created.<br />The live output archives the stream into the asset.
-1. Create a streaming locator with the built-in streaming policy types.<br />If you intend to encrypt your content, review [Content protection overview](drm-content-protection-concept.md).
-1. List the paths on the streaming locator to get the URLs to use.
-1. Get the host name for the streaming endpoint you want to stream from.
-1. Build URLs that target different formats (HLS, MPEG-DASH, and Smooth Streaming). The *streaming endpoint* takes care of serving the correct manifest and requests for the different formats.
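-
-As a sketch of step 1 with the Python SDK (hedged: it assumes an authenticated client, a hypothetical live event name, and a region matching your account; live event creation is a long-running operation):
-
-```python
-from azure.mgmt.media.models import LiveEvent, LiveEventInput
-
-# Create a pass-through live event with an RTMP ingest (long-running operation).
-poller = client.live_events.begin_create(
-    resource_group,
-    account_name,
-    "my-live-event",  # hypothetical live event name
-    parameters=LiveEvent(
-        location="westus2",  # must match your Media Services account's region
-        input=LiveEventInput(streaming_protocol="RTMP"),
-    ),
-    auto_start=False,
-)
-live_event = poller.result()
-
-# Once the event is started, live_event.input.endpoints lists the ingest
-# URLs to configure in your on-premises encoder (step 2 above).
-```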
-
-This diagram shows the workflow for live streaming with *dynamic packaging*:
-
-![Diagram of a workflow for pass-through encoding with dynamic packaging](./media/live-streaming/pass-through.svg)
-
-For information about live streaming in Media Services v3, see [Live streaming overview](stream-live-streaming-concept.md).
-
-## Video codecs supported by Dynamic Packaging
-
-Dynamic packaging supports video files that are in the MP4 container file format and contain video that is encoded with [H.264](https://en.m.wikipedia.org/wiki/H.264/MPEG-4_AVC) (MPEG-4 AVC or AVC1) or [H.265](https://en.m.wikipedia.org/wiki/High_Efficiency_Video_Coding) (HEVC, hev1, or hvc1).
-
-> [!NOTE]
-> Resolutions of up to 4K and frame rates of up to 60 frames/second have been tested with *dynamic packaging*.
-
-## Audio codecs supported by dynamic packaging
-
-Dynamic packaging also supports audio files that are stored in the MP4 container format and contain an encoded audio stream in one of the following codecs:
-
-* [AAC](https://en.wikipedia.org/wiki/Advanced_Audio_Coding) (AAC-LC, HE-AAC v1, or HE-AAC v2).
-* [Dolby Digital Plus](https://en.wikipedia.org/wiki/Dolby_Digital_Plus) (Enhanced AC-3 or E-AC3). The encoded audio must be stored in the MP4 container format to work with Dynamic Packaging.
-* Dolby Atmos
-
- Streaming Dolby Atmos content is supported for standards like the MPEG-DASH protocol with either Common Streaming Format (CSF) or Common Media Application Format (CMAF) fragmented MP4, and via HTTP Live Streaming (HLS) with CMAF.
-* [DTS](https://en.wikipedia.org/wiki/DTS_%28sound_system%29)<br />
- DTS codecs supported by DASH-CSF, DASH-CMAF, HLS-M2TS, and HLS-CMAF packaging formats are:
-
- * DTS Digital Surround (dtsc)
- * DTS-HD High Resolution and DTS-HD Master Audio (dtsh)
- * DTS Express (dtse)
- * DTS-HD Lossless (no core) (dtsl)
-
-Dynamic packaging supports multiple audio tracks with DASH or HLS (version 4 or later) for streaming assets that have multiple audio tracks with multiple codecs and languages.
-
-For all of the above audio codecs, the encoded audio must be stored in the MP4 container format to work with dynamic packaging. The service does not support raw elementary stream file formats on blob storage (for example, .dts and .ac3 files would not be supported).
-
-Only files with the .mp4 or .mp4a extension are supported for audio packaging.
-
-### Limitations
-
-#### iOS limitation on AAC 5.1 audio
-
-Apple iOS devices do not support the 5.1 AAC audio codec. Multi-channel audio must be encoded using the Dolby Digital or Dolby Digital Plus codecs.
-
-For detailed information, see [HLS authoring specification for apple devices](https://developer.apple.com/documentation/http_live_streaming/hls_authoring_specification_for_apple_devices).
-
-> [!NOTE]
-> Media Services does not support encoding of Dolby Digital, Dolby Digital Plus or Dolby Digital Plus with Dolby Atmos multi-channel audio formats.
-
-#### Dolby Digital audio
-
-Media Services dynamic packaging does not currently support files that contain [Dolby Digital](https://en.wikipedia.org/wiki/Dolby_Digital) (AC3) audio (as this is considered a legacy codec by Dolby).
-
-## Manifests
-
-In Media Services *dynamic packaging*, the streaming client manifests for HLS, MPEG-DASH, and Smooth Streaming are dynamically generated based on the **format** query in the URL.
-
-A manifest file includes streaming metadata such as track type (audio, video, or text), track name, start and end time, bitrate (qualities), track languages, presentation window (sliding window of fixed duration), and video codec (FourCC). It also instructs the player to retrieve the next fragment by providing information about the next playable video fragments that are available and their location. Fragments (or segments) are the actual "chunks" of video content.
-
-### Examples
-
-#### HLS
-
-Here's an example of an HLS manifest file, also called an HLS master playlist:
-
-```
-#EXTM3U
-#EXT-X-VERSION:4
-#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="aac_eng_2_128041_2_1",LANGUAGE="eng",DEFAULT=YES,AUTOSELECT=YES,URI="QualityLevels(128041)/Manifest(aac_eng_2_128041_2_1,format=m3u8-aapl)"
-#EXT-X-STREAM-INF:BANDWIDTH=536608,RESOLUTION=320x180,CODECS="avc1.64000d,mp4a.40.2",AUDIO="audio"
-QualityLevels(381048)/Manifest(video,format=m3u8-aapl)
-#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=536608,RESOLUTION=320x180,CODECS="avc1.64000d",URI="QualityLevels(381048)/Manifest(video,format=m3u8-aapl,type=keyframes)"
-#EXT-X-STREAM-INF:BANDWIDTH=884544,RESOLUTION=480x270,CODECS="avc1.640015,mp4a.40.2",AUDIO="audio"
-QualityLevels(721495)/Manifest(video,format=m3u8-aapl)
-#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=884544,RESOLUTION=480x270,CODECS="avc1.640015",URI="QualityLevels(721495)/Manifest(video,format=m3u8-aapl,type=keyframes)"
-#EXT-X-STREAM-INF:BANDWIDTH=1327398,RESOLUTION=640x360,CODECS="avc1.64001e,mp4a.40.2",AUDIO="audio"
-QualityLevels(1154816)/Manifest(video,format=m3u8-aapl)
-#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=1327398,RESOLUTION=640x360,CODECS="avc1.64001e",URI="QualityLevels(1154816)/Manifest(video,format=m3u8-aapl,type=keyframes)"
-#EXT-X-STREAM-INF:BANDWIDTH=2413312,RESOLUTION=960x540,CODECS="avc1.64001f,mp4a.40.2",AUDIO="audio"
-QualityLevels(2217354)/Manifest(video,format=m3u8-aapl)
-#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=2413312,RESOLUTION=960x540,CODECS="avc1.64001f",URI="QualityLevels(2217354)/Manifest(video,format=m3u8-aapl,type=keyframes)"
-#EXT-X-STREAM-INF:BANDWIDTH=3805760,RESOLUTION=1280x720,CODECS="avc1.640020,mp4a.40.2",AUDIO="audio"
-QualityLevels(3579827)/Manifest(video,format=m3u8-aapl)
-#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=3805760,RESOLUTION=1280x720,CODECS="avc1.640020",URI="QualityLevels(3579827)/Manifest(video,format=m3u8-aapl,type=keyframes)"
-#EXT-X-STREAM-INF:BANDWIDTH=139017,CODECS="mp4a.40.2",AUDIO="audio"
-QualityLevels(128041)/Manifest(aac_eng_2_128041_2_1,format=m3u8-aapl)
-```
-
-#### MPEG-DASH
-
-Here's an example of an MPEG-DASH manifest file, also called an MPEG-DASH Media Presentation Description (MPD):
-
-```xml
-<?xml version="1.0" encoding="UTF-8"?>
-<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="static" mediaPresentationDuration="PT1M10.315S" minBufferTime="PT7S">
- <Period>
- <AdaptationSet id="1" group="5" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="audio" mimeType="audio/mp4" codecs="mp4a.40.2" lang="en">
- <SegmentTemplate timescale="10000000" media="QualityLevels($Bandwidth$)/Fragments(aac_eng_2_128041_2_1=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(aac_eng_2_128041_2_1=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S d="60160000" r="10" />
- <S d="41386666" />
- </SegmentTimeline>
- </SegmentTemplate>
- <Representation id="5_A_aac_eng_2_128041_2_1_1" bandwidth="128041" audioSamplingRate="48000" />
- </AdaptationSet>
- <AdaptationSet id="2" group="1" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="video" mimeType="video/mp4" codecs="avc1.640020" maxWidth="1280" maxHeight="720" startWithSAP="1">
- <SegmentTemplate timescale="10000000" media="QualityLevels($Bandwidth$)/Fragments(video=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(video=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S d="60060000" r="10" />
- <S d="42375666" />
- </SegmentTimeline>
- </SegmentTemplate>
- <Representation id="1_V_video_1" bandwidth="3579827" width="1280" height="720" />
- <Representation id="1_V_video_2" bandwidth="2217354" codecs="avc1.64001F" width="960" height="540" />
- <Representation id="1_V_video_3" bandwidth="1154816" codecs="avc1.64001E" width="640" height="360" />
- <Representation id="1_V_video_4" bandwidth="721495" codecs="avc1.640015" width="480" height="270" />
- <Representation id="1_V_video_5" bandwidth="381048" codecs="avc1.64000D" width="320" height="180" />
- </AdaptationSet>
- </Period>
-</MPD>
-```
-#### Smooth Streaming
-
-Here's an example of a Smooth Streaming manifest file:
-
-```xml
-<?xml version="1.0" encoding="UTF-8"?>
-<SmoothStreamingMedia MajorVersion="2" MinorVersion="2" Duration="703146666" TimeScale="10000000">
- <StreamIndex Chunks="12" Type="audio" Url="QualityLevels({bitrate})/Fragments(aac_eng_2_128041_2_1={start time})" QualityLevels="1" Language="eng" Name="aac_eng_2_128041_2_1">
- <QualityLevel AudioTag="255" Index="0" BitsPerSample="16" Bitrate="128041" FourCC="AACL" CodecPrivateData="1190" Channels="2" PacketSize="4" SamplingRate="48000" />
- <c t="0" d="60160000" r="11" />
- <c d="41386666" />
- </StreamIndex>
- <StreamIndex Chunks="12" Type="video" Url="QualityLevels({bitrate})/Fragments(video={start time})" QualityLevels="5">
- <QualityLevel Index="0" Bitrate="3579827" FourCC="H264" MaxWidth="1280" MaxHeight="720" CodecPrivateData="0000000167640020ACD9405005BB011000003E90000EA600F18319600000000168EBECB22C" />
- <QualityLevel Index="1" Bitrate="2217354" FourCC="H264" MaxWidth="960" MaxHeight="540" CodecPrivateData="000000016764001FACD940F0117EF01100000303E90000EA600F1831960000000168EBECB22C" />
- <QualityLevel Index="2" Bitrate="1154816" FourCC="H264" MaxWidth="640" MaxHeight="360" CodecPrivateData="000000016764001EACD940A02FF9701100000303E90000EA600F162D960000000168EBECB22C" />
- <QualityLevel Index="3" Bitrate="721495" FourCC="H264" MaxWidth="480" MaxHeight="270" CodecPrivateData="0000000167640015ACD941E08FEB011000003E90000EA600F162D9600000000168EBECB22C" />
- <QualityLevel Index="4" Bitrate="381048" FourCC="H264" MaxWidth="320" MaxHeight="180" CodecPrivateData="000000016764000DACD941419F9F011000003E90000EA600F14299600000000168EBECB22C" />
- <c t="0" d="60060000" r="11" />
- <c d="42375666" />
- </StreamIndex>
-</SmoothStreamingMedia>
-```
-
-### Naming of tracks in the manifest
-
-If an audio track name is specified in the .ism file, Media Services adds a `Label` element within an `AdaptationSet` to specify the textual information for the specific audio track. An example of the output DASH manifest:
-
-```xml
-<AdaptationSet codecs="mp4a.40.2" contentType="audio" lang="en" mimeType="audio/mp4" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
- <Label>audio_track_name</Label>
- <Role schemeIdUri="urn:mpeg:dash:role:2011" value="main"/>
- <Representation audioSamplingRate="48000" bandwidth="131152" id="German_Forest_Short_Poem_english-en-68s-2-lc-128000bps_seg">
- <BaseURL>German_Forest_Short_Poem_english-en-68s-2-lc-128000bps_seg.mp4</BaseURL>
- </Representation>
-</AdaptationSet>
-```
-
-The player can use the `Label` element to display the track name in its UI.
-
-### Signaling audio description tracks
-
-You can add a narration track to your video to help visually impaired clients follow the video recording by listening to the narration. You need to annotate an audio track as audio description in the manifest. To do that, add "accessibility" and "role" parameters to the .ism file. It's your responsibility to set these parameters correctly to signal an audio track as audio description. For example, add `<param name="accessibility" value="description" />` and `<param name="role" value="alternate" />` to the .ism file for a specific audio track.
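-
-A minimal sketch of how these parameters might sit on the audio track in the .ism (SMIL) server manifest; the track name and bitrate are hypothetical:
-
-```xml
-<audio src="narration.isma" systemBitrate="128000">
-  <param name="trackName" value="audio_description" valuetype="data" />
-  <param name="accessibility" value="description" valuetype="data" />
-  <param name="role" value="alternate" valuetype="data" />
-</audio>
-```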
-
-For more information, see the [How to signal a descriptive audio track](signal-descriptive-audio-howto.md) example.
-
-#### Smooth Streaming manifest
-
-If you're playing a Smooth Streaming stream, the manifest would carry values in `Accessibility` and `Role` attributes for that audio track. For example, `Role="alternate" Accessibility="description"` would be added in the `StreamIndex` element to indicate it's an audio description.
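-
-For example (a sketch; the track name and URL template are hypothetical):
-
-```xml
-<StreamIndex Type="audio" Name="audio_description" Language="eng" Role="alternate" Accessibility="description" Url="QualityLevels({bitrate})/Fragments(audio_description={start time})">
-```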
-
-#### DASH manifest
-
-For a DASH manifest, the following two elements would be added to signal the audio description:
-
-```xml
-<Accessibility schemeIdUri="urn:mpeg:dash:role:2011" value="description"/>
-<Role schemeIdUri="urn:mpeg:dash:role:2011" value="alternate"/>
-```
-
-#### HLS playlist
-
-For HLS v7 and above (`format=m3u8-cmaf`), the playlist carries `AUTOSELECT=YES,CHARACTERISTICS="public.accessibility.describes-video"` when the audio description track is signaled.
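-
-For example, the corresponding `EXT-X-MEDIA` entry might look like this (a sketch; the track name, bitrate, and URI are hypothetical):
-
-```
-#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="audio_description",LANGUAGE="eng",AUTOSELECT=YES,CHARACTERISTICS="public.accessibility.describes-video",URI="QualityLevels(128000)/Manifest(audio_description,format=m3u8-cmaf)"
-```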
---
-#### Example
-
-For more information, see [How to signal audio description tracks](signal-descriptive-audio-howto.md).
-
-## Dynamic Manifest filtering
-
-To control the number of tracks, formats, bitrates, and presentation time windows that are sent to players, you can use dynamic filtering with the Media Services dynamic packager. For more information, see [Pre-filtering manifests with the dynamic packager](filters-dynamic-manifest-concept.md).
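-
-For example, a client can request a manifest that includes only the tracks selected by a pre-created filter by adding `filter=` to the manifest URL (the host name, locator path, and filter name below are hypothetical):
-
-```
-https://<streaming-endpoint-hostname>/<locator-id>/video.ism/manifest(format=mpd-time-csf,filter=myFilter)
-```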
-
-## Dynamic encryption for DRM
-
-You can use *dynamic encryption* to dynamically encrypt your live or on-demand content with AES-128 or any of the three major digital rights management (DRM) systems: Microsoft PlayReady, Google Widevine, and Apple FairPlay. Media Services also provides a service for delivering AES keys and DRM licenses to authorized clients. For more information, see [dynamic encryption](drm-content-protection-concept.md).
-
-> [!NOTE]
-> Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## More information
-
-Check out [Azure Media Services community](media-services-community.md) to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Need help?
-
-You can open a support ticket by navigating to [New support request](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest).
-
-## Next steps
-
-[Upload, encode, and stream videos](stream-files-tutorial-with-api.md)
media-services Encode Media Encoder Standard Formats Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-media-encoder-standard-formats-reference.md
- Title: Standard Encoder formats and codecs - Azure
-description: This article contains a list of the most common import and export file formats that you can use with StandardEncoderPreset.
------- Previously updated : 11/18/2021----
-# Standard Encoder formats and codecs
--
-This article contains a list of the most common import and export file formats that you can use with [StandardEncoderPreset](/rest/api/medi).
-
-## Input container/file formats
-
-| File formats (file extensions) | Supported |
-| | |
-| FLV (with H.264 and AAC codecs) (.flv) |Yes |
-| MXF (.mxf) |Yes |
-| GXF (.gxf) |Yes |
-| MPEG2-PS, MPEG2-TS, 3GP (.ts, .ps, .3gp, .3gpp, .mpg) |Yes |
-| Windows Media Video (WMV)/ASF (.wmv, .asf) |Yes |
-| AVI (Uncompressed 8bit/10bit) (.avi) |Yes |
-| MP4 (.mp4, .m4a, .m4v)/ISMV (.isma, .ismv) |Yes |
-| [Microsoft Digital Video Recording(DVR-MS)](/previous-versions/windows/desktop/mstv/about-the-dvr-ms-file-format) (.dvr-ms) |Yes |
-| Matroska/WebM (.mkv) |Yes |
-| WAVE/WAV (.wav) |Yes |
-| QuickTime (.mov) |Yes |
-
-### Audio formats in input containers
-
-Standard Encoder supports carrying the following audio formats in input containers:
-
-* MXF, GXF, and QuickTime files, which have audio tracks with interleaved stereo or 5.1 samples
-
-or
-
-* MXF, GXF, and QuickTime files where the audio is carried as separate PCM tracks but the channel mapping (to stereo or 5.1) can be deduced from the file metadata
-
-## Input video codecs
-| Input video codecs | Supported |
-| | |
-| AVC 8-bit/10-bit, up to 4:2:2, including AVCIntra |8 bit 4:2:0 and 4:2:2 |
-| Sony XAVC / XAVC S (in MXF container)| Yes|
-| Avid DNxHD (in MXF container) |Yes |
-| DVCPro/DVCProHD (in MXF container) |Yes |
-| Digital video (DV) (in AVI files) |Yes |
-| JPEG 2000 |Yes |
-| MPEG-2 (up to 422 Profile and High Level; including variants such as Sony XDCAM, Sony XDCAM HD, Sony XDCAM IMX, CableLabs®, and D10) |Up to 422 Profile |
-| MPEG-1 |Yes |
-| VC-1/WMV9 |Yes |
-| Canopus HQ/HQX |No |
-| MPEG-4 Part 2 |Yes |
-| [Theora](https://en.wikipedia.org/wiki/Theora) |Yes |
-| YUV420 uncompressed, or mezzanine |Yes |
-| Apple ProRes 422 |Yes |
-| Apple ProRes 422 LT |Yes |
-| Apple ProRes 422 HQ |Yes |
-| Apple ProRes Proxy |Yes |
-| Apple ProRes 4444 |Yes |
-| Apple ProRes 4444 XQ |Yes |
-| HEVC/H.265| Main Profile|
-
-## Input audio codecs
-| Input Audio Codecs | Supported |
-| | |
-| AAC (AAC-LC, AAC-HE, and AAC-HEv2; up to 5.1) |Yes |
-| MPEG Layer 2 |Yes |
-| MP3 (MPEG-1 Audio Layer 3) |Yes |
-| Windows Media Audio |Yes |
-| WAV/PCM |Yes |
-| [FLAC](https://en.wikipedia.org/wiki/FLAC) |Yes |
-| [Opus](https://go.microsoft.com/fwlink/?LinkId=822667) |Yes |
-| [Vorbis](https://en.wikipedia.org/wiki/Vorbis) |Yes |
-| AMR (adaptive multi-rate) |Yes |
-| AES (SMPTE 331M and 302M, AES3-2003) |No |
-| Dolby® E |No |
-| Dolby® Digital (AC3) |No |
-| Dolby® Digital Plus (E-AC3) |No |
-
-## Output formats and codecs
-The following table lists the codecs and file formats that are supported for export.
-
-| File Format | Video Codec | Audio Codec |
-| | | |
-| MP4 <br/><br/>(including multi-bitrate MP4 containers) |H.264 (High, Main, and Baseline Profiles), HEVC (H.265) 8-bit |AAC-LC, HE-AAC v1, HE-AAC v2 |
-| MPEG2-TS |H.264 (High, Main, and Baseline Profiles) |AAC-LC, HE-AAC v1, HE-AAC v2 |
media-services Encode On Premises Encoder Partner https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-on-premises-encoder-partner.md
- Title: Become an on-premises encoder partner
-description: This article discusses how to verify your on-premises live streaming encoders.
---- Previously updated : 08/31/2020---
-
-# How to verify your on-premises live streaming encoder
--
-If you're an Azure Media Services on-premises encoder partner, Media Services promotes your product by recommending your encoder to enterprise customers. To become an on-premises encoder partner, you must verify the compatibility of your on-premises encoder with Media Services. To do so, complete the following verifications.
--
-## Pass-through Live Event verification
-
-1. In your Media Services account, make sure that the **Streaming Endpoint** is running.
-2. Create and start the **pass-through** Live Event (basic or standard). <br/> For more information, see [Live Event states and billing](live-event-states-billing-concept.md).
-3. Get the ingest URLs and configure your on-premises encoder to use the URL to send a multi-bitrate live stream to Media Services.
-4. Get the preview URL and use it to verify that the input from the encoder is actually being received.
-5. Create a new **Asset** object.
-6. Create a **Live Output** and use the asset name that you created.
-7. Create a **Streaming Locator** with the built-in **Streaming Policy** types.
-8. List the paths on the **Streaming Locator** to get back the URLs to use.
-9. Get the host name for the **Streaming Endpoint** that you want to stream from.
-10. Combine the URL from step 8 with the host name in step 9 to get the full URL.
-11. Run your live encoder for approximately 10 minutes.
-12. Stop the Live Event.
-13. Use a player such as [Azure Media Player](https://aka.ms/azuremediaplayer) to watch the archived asset to ensure that playback has no visible glitches at all quality levels. Or, watch and validate via the preview URL during the live session.
-14. Record the asset ID, the published streaming URL for the live archive, and the settings and version used from your live encoder.
-15. Reset the Live Event state after creating each sample.
-16. Repeat steps 5 through 15 for all configurations supported by your encoder (with and without ad signaling, captions, or different encoding speeds).
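-
-A condensed sketch of steps 5 through 10 using the Azure CLI (the account, group, and entity names are placeholders, and the streaming policy shown assumes clear streaming):
-
-```azurecli
-# Step 5: create the asset that will hold the live archive.
-az ams asset create -a amsAccount -g resourceGroup -n archiveAsset
-
-# Step 6: create the live output against that asset.
-az ams live-output create -a amsAccount -g resourceGroup \
-    --live-event-name liveEventName -n liveOutput \
-    --asset-name archiveAsset --archive-window-length PT1H
-
-# Step 7: create a streaming locator with a built-in streaming policy.
-az ams streaming-locator create -a amsAccount -g resourceGroup -n liveLocator \
-    --asset-name archiveAsset --streaming-policy-name Predefined_ClearStreamingOnly
-
-# Steps 8-10: list the locator paths and the streaming endpoint host name,
-# then combine them into the full playback URL.
-az ams streaming-locator get-paths -a amsAccount -g resourceGroup -n liveLocator
-az ams streaming-endpoint show -a amsAccount -g resourceGroup -n default --query hostName
-```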
-
-## Live encoding Live Event verification
-
-1. In your Media Services account, make sure that the **Streaming Endpoint** is running.
-2. Create and start the **live encoding** Live Event. <br/> For more information, see [Live Event states and billing](live-event-states-billing-concept.md).
-3. Get the ingest URLs and configure your encoder to push a single-bitrate live stream to Media Services.
-4. Get the preview URL and use it to verify that the input from the encoder is actually being received.
-5. Create a new **Asset** object.
-6. Create a **Live Output** and use the asset name that you created.
-7. Create a **Streaming Locator** with the built-in **Streaming Policy** types.
-8. List the paths on the **Streaming Locator** to get back the URLs to use.
-9. Get the host name for the **Streaming Endpoint** that you want to stream from.
-10. Combine the URL from step 8 with the host name in step 9 to get the full URL.
-11. Run your live encoder for approximately 10 minutes.
-12. Stop the Live Event.
-13. Use a player such as [Azure Media Player](https://aka.ms/azuremediaplayer) to watch the archived asset to ensure that playback has no visible glitches for all quality levels. Or, watch and validate via the preview URL during the live session.
-14. Record the asset ID, the published streaming URL for the live archive, and the settings and version used from your live encoder.
-15. Reset the Live Event state after creating each sample.
-16. Repeat steps 5 through 15 for all configurations supported by your encoder (with and without ad signaling, captions, or different encoding speeds).
-
-## Longevity verification
-
-Follow the same steps as in [Pass-through Live Event verification](#pass-through-live-event-verification) except for step 11. <br/>Instead of 10 minutes, run your live encoder for one week or longer. Use a player such as [Azure Media Player](https://aka.ms/azuremediaplayer) to watch the live streaming from time to time (or an archived asset) to ensure that playback has no visible glitches.
-
-## Email your recorded settings
-
-Finally, email your recorded settings and live archive parameters to Azure Media Services at amshelp@microsoft.com as a notification that all self-verification checks have passed. Also, include your contact information for any follow-ups. You can contact the Azure Media Services team with any questions about this process.
-
-## See also
-
-[Tested on-premises encoders](encode-recommended-on-premises-live-encoders.md)
-
-## Next steps
-
-[Live streaming with Media Services v3](stream-live-streaming-concept.md)
media-services Encode Recommended On Premises Live Encoders https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/encode-recommended-on-premises-live-encoders.md
- Title: Live streaming encoders recommended by Media Services
-description: Learn about live streaming on-premises encoders recommended by Media Services
-
-keywords: encoding;encoders;media
--- Previously updated : 03/04/2022---
-
-# Verified on-premises live streaming encoders
--
-In Azure Media Services, a [Live Event](/rest/api/media/liveevents) (channel) represents a pipeline for processing live-streaming content. The Live Event receives live input streams in one of two ways.
-
-* An on-premises live encoder sends a multi-bitrate RTMP or Smooth Streaming (fragmented MP4) stream to a Live Event that is not enabled to perform live encoding with Media Services. The ingested streams pass through Live Events without any further processing. This method is called **pass-through**. We recommend that the live encoder send multi-bitrate streams instead of a single-bitrate stream to a pass-through live event, to allow for adaptive bitrate streaming to the client.
-
-  If you are using multi-bitrate streams for the pass-through live event, the video GOP size and the video fragments on different bitrates must be synchronized to avoid unexpected behavior on the playback side.
-
- > [!TIP]
- > Using a pass-through method is the most economical way to do live streaming.
- > The service supports a basic and standard pass-through mode depending on your ingest requirements.
- > The basic pass-through is the most economical, but is limited in capabilities.
-
-* An on-premises live encoder sends a single-bitrate stream to the Live Event that is enabled to perform live encoding with Media Services in one of the following formats: RTMP or Smooth Streaming (fragmented MP4). The Live Event then performs live encoding of the incoming single-bitrate stream to a multi-bitrate (adaptive) video stream.
-
-This article discusses verified on-premises live streaming encoders. The verification is done through vendor self-verification or customer verification. Microsoft Azure Media Services does not do full or rigorous testing of each encoder, and does not continually re-verify on updates. For instructions on how to verify your on-premises live encoder, see [Verify your on-premises encoder](encode-on-premises-encoder-partner.md).
-
-For detailed information about live encoding with Media Services, see [Live streaming with Media Services v3](stream-live-streaming-concept.md).
-
-## Encoder requirements
-
-Encoders must support TLS 1.2 when using HTTPS or RTMPS protocols.
-
-## Live encoders that output RTMP
-
-Media Services recommends using one of the following live encoders that have RTMP as output. The supported URL schemes are `rtmp://` or `rtmps://`.
-
-When streaming via RTMP, check firewall and/or proxy settings to confirm that outbound TCP ports 1935 and 1936 are open.<br/><br/>
-When streaming via RTMPS, check firewall and/or proxy settings to confirm that outbound TCP ports 2935 and 2936 are open.
-
-> [!NOTE]
-> Encoders must support TLS 1.2 when using the RTMPS protocols.
-
-- Adobe Flash Media Live Encoder 3.2
-- [Antix Digital](http://www.antixdigital.com/) StreamZ Live (previously Imagine Communication SelenioFlex Live)
-- [Blackmagic ATEM Mini and ATEM Mini PRO](https://www.blackmagicdesign.com/products/atemmini)
-- [Cambria Live 4.3](https://www.capellasystems.net/products/cambria-live/)
-- Elemental Live (version 2.14.15 and higher)
-- [Ffmpeg](https://www.ffmpeg.org)
-- [GoPro](https://gopro.com/help/articles/block/getting-started-with-live-streaming) Hero 7 and Hero 8
-- Haivision KB
-- Haivision Makito X HEVC
-- [OBS Studio](https://obsproject.com/download)
-- [Osprey Talon hardware encoders](https://www.ospreyvideo.com/talon-encoders), Talon 4K-SC, Talon UHD-SC
-- [Restream.io](https://restream.io/)
-- [Streamlabs](https://streamlabs.com/)
-- [Switcher Studio (iOS)](https://www.switcherstudio.com/)
-- Telestream Wirecast (version 13.0.2 or higher due to the TLS 1.2 requirement)
-- Telestream Wirecast S (only RTMP is supported; no RTMPS support due to lack of TLS 1.2+)
-- Teradek Slice 756
-- VMIX
-- xStream
-
-> [!WARNING]
-> The above list of encoders is just a recommendation list. Encoders are not tested or validated by Microsoft on a continual basis and updates or breaking changes can be introduced by encoder vendors or open source projects that could break compatibility.
-
-## Live encoders that output fragmented MP4 (Smooth Streaming ingest)
-
-Media Services recommends using one of the following live encoders that have multi-bitrate Smooth Streaming (fragmented MP4) as output. The supported URL schemes are `http://` or `https://`.
-
-> [!NOTE]
-> Encoders must support TLS 1.2 when using HTTPS protocols.
-
-- Ateme TITAN Live
-- [Antix Digital](http://www.antixdigital.com/) StreamZ Live (previously Imagine Communication SelenioFlex Live)
-- Cisco Digital Media Encoder 2200
-- Elemental Live (version 2.14.15 and higher due to the TLS 1.2 requirement)
-- Envivio 4Caster C4 Gen III
-- [Ffmpeg](https://www.ffmpeg.org)
-- Media Excel Hero Live and Hero 4K (UHD/HEVC)
-
-> [!TIP]
-> If you are streaming live events in multiple languages (for example, one English audio track and one Spanish audio track), you can accomplish this with the Media Excel live encoder configured to send the live feed to a pass-through Live Event.
-
-> [!WARNING]
-> The above list of encoders is just a recommendation list. Encoders are not tested or validated by Microsoft on a continual basis and support or bugs can be introduced by the encoder vendors or open source projects that break compatibility at any time.
-
-## Configuring on-premises live encoder settings
-
-For information about what settings are valid for your live event type, see [Live Event types comparison](live-event-types-comparison-reference.md).
-
-### Playback requirements
-
-To play back content, both an audio and video stream must be present. Playback of the video-only stream is not supported.
-
-### Configuration tips
-
-- Whenever possible, use a hardwired internet connection.
-- When you're determining bandwidth requirements, double the streaming bitrates. Although not mandatory, this simple rule helps to mitigate the impact of network congestion.
-- When using software-based encoders, close out any unnecessary programs.
-- Changing your encoder configuration after it has started pushing has negative effects on the event. Configuration changes can cause the event to become unstable. If you change your encoder configuration, you need to reset the [Live Event](/rest/api/media/live-events/reset) and restart it for the change to take effect. If you stop and start the live event without resetting it, the live event will preserve the previous configuration.
-- Always test and validate newer versions of encoder software for continued compatibility with Azure Media Services. Microsoft does not re-validate encoders on this list, and most validations are done by the software vendors directly as a "self-certification."
-- Ensure that you give yourself ample time to set up your event. For high-scale events, we recommend starting the setup an hour before your event.
-- Use the H.264 video and AAC-LC audio codec output.
-- Stick to supported resolutions and frame rates for the type of Live Event you are broadcasting to (for example, 60 fps is currently rejected).
-- Ensure that there is key frame or GOP temporal alignment across video qualities.
-- Make sure there is a unique stream name for each video quality.
-- Use strict CBR encoding, which is recommended for optimum adaptive bitrate performance.
-
-> [!IMPORTANT]
-> Watch the physical condition of the machine (CPU, memory, and so on), as uploading fragments to the cloud involves CPU and I/O operations.
-> If you change any encoder configuration, reset the [Live Event](/rest/api/media/live-events/reset) for the change to take effect. If you stop and start the live event without resetting it, the live event will preserve the previous configuration.
-
-## See also
-
-[Live streaming with Media Services v3](stream-live-streaming-concept.md)
-
-## Next steps
-
-[How to verify your encoder](encode-on-premises-encoder-partner.md)
media-services Face Redaction Event Based Python Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/face-redaction-event-based-python-quickstart.md
- Title: Event-based Face Redaction
-description: This quickstart shows how to deploy an event-based solution on Azure, where incoming videos will be transformed using a Job in Azure Media Services.
------ Previously updated : 5/21/2021----
-# Event-based Face Redaction
--
-## Introduction
-
-In some scenarios or use cases, Azure Media Services should process or analyze videos the moment they land on a data store. An example use case is a team that wants to analyze videos of a site or plant to see whether people on-site follow security instructions (for example, wear helmets). For this use case, an Edge device on-site could capture the videos when motion is detected and then send them to Azure. To comply with privacy standards, the faces of people captured in the videos should be redacted before the team can analyze them. To share the enriched videos with the team as soon as possible, the face redaction step should run the moment a video lands on Azure. This quickstart shows how to use Azure Media Services in such an event-based scenario on Azure. Videos uploaded to a storage account are transformed using a Job in Azure Media Services. It uses the Media Services v3 API.
-
-The specific transformation that will be used is called [Face Redactor](./analyze-face-redaction-concept.md). This Azure Media Analytics preset allows you to modify your video by blurring the faces of selected individuals.
-
-By the end of the quickstart, you will be able to redact faces in a video.
--
-## Solution Overview
--
-This quickstart shows how to deploy the solution outlined above. It starts with a storage account (Azure Data Lake Storage Gen2) with an event listener (Event Grid) connected to it that triggers an Azure Function when new .mp4 files are uploaded to the storage account. The Azure Function submits a job to a pre-configured Transform in Azure Media Services. The resulting redacted video is stored in a Blob storage account.
-
-## Prerequisites
-
-- If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-- Create a resource group to use with this quickstart.
-
-## Get the sample and understand its deployment
-
-Create a fork of the [Python samples repository](https://github.com/Azure-Samples/media-services-v3-python). For this quickstart, we're working with the FaceRedactorEventBased sample.
-
-The deployment of this sample consists of three separate steps: deploying the Azure services to set up the overall solution, deploying the Function App that submits a job to Azure Media Services when a new file is uploaded, and configuring the Event Grid trigger. We have created a GitHub Actions workflow that performs these steps. Therefore, this solution can be deployed by adding the necessary variables to your GitHub environment, which means that no local development tools are required.
-
-## Create a Service Principal
-
-Before the GitHub Actions workflow can run, a Service Principal has to be created with the *Contributor* and *Storage Blob Data Reader* roles on the Resource Group. This Service Principal is the app that provisions and configures all Azure services on behalf of GitHub Actions. The Service Principal is also used after the solution is deployed to generate a SAS token for videos that need to be processed.
-
-To create the Service Principal and give it the roles that are needed on the Resource Group, fill in the variables in the following bash command and run it in the Cloud Shell:
-```bash
-# Replace <subscription-id>, <name-of-resource-group> and <name-of-app> with the corresponding values.
-# Make sure to use a unique name for the app name parameter.
-
-app_name="<name-of-app>"
-resource_group="<name-of-resource-group>"
-subscription_id="<subscription-id>"
-
-az ad sp create-for-rbac --name $app_name --role contributor \
- --scopes /subscriptions/$subscription_id/resourceGroups/$resource_group \
- --sdk-auth
-
-object_id=$(az ad sp list --display-name $app_name --query [0].objectId -o tsv)
-
-az role assignment create --assignee $object_id --role "Storage Blob Data Reader" \
- --scope /subscriptions/$subscription_id/resourceGroups/$resource_group
-```
-
-The command should output a JSON object similar to this:
-
-```json
-{
- "clientId": "<GUID>",
- "clientSecret": "<GUID>",
- "subscriptionId": "<GUID>",
- "tenantId": "<GUID>",
- (...)
-}
-```
-Make sure to copy the output and have it available for the next step.
-
-## Add Service Principal details to GitHub Secrets
-
-The Service Principal details should be stored as a [GitHub Secret](https://docs.github.com/en/actions/reference/encrypted-secrets) so that GitHub Actions can deploy and configure the necessary services within Azure. Go to the Repo Settings -> Secrets of your forked repo and click on 'Create New Secrets'. Create the following secrets:
-
-## Create the .env file
-
-Copy the contents of the sample.env file in the VideoAnalytics/FaceRedactorEventBased folder of your forked repo. Then, create your own .env file by clicking Add file -> Create new file. Name the file *.env* and fill in the variables. When you're done, click 'Commit new file'. We are now ready to deploy the solution, but first we will examine the code files that we will be using.
-
-## Examine the code for provisioning the Azure Resources
-
-The bash script below provisions the Azure services used in this solution. The bash script uses the Azure CLI and executes the following actions:
-- Load environment variables into local variables.
-- Define names for ADLSgen2, a generic Azure Storage account, Azure Media Services, an Azure Function App, and an Event Grid System Topic and Subscription.
-- Provision the Azure services defined.
-
-[!code-bash[Main](../../../media-services-v3-python/VideoAnalytics/FaceRedactorEventBased/AzureServicesProvisioning/deploy_resources.azcli)]
-
-## Examine Azure Function code
-
-After successfully provisioning the Azure resources, we are ready to deploy the Python code to our Azure Function. The **/azure-function/EventGrid_AMSJob/__init__.py** file contains the logic to trigger an AMS job whenever a file lands in the Azure Data Lake Gen2 file system. The script performs the following steps:
-- Import dependencies and libraries.
-- Use the Function binding to listen to Azure Event Grid.
-- Grab and define variables from the event schema.
-- Create the input and output assets for the AMS job.
-- Connect to Azure Data Lake Gen2 using the DataLakeServiceClient, and generate a SAS token to use as authentication for the AMS job input.
-- Configure and create the job.
-
-[!code-python[Main](../../../media-services-v3-python/VideoAnalytics/FaceRedactorEventBased/AzureFunction/EventGrid_AMSJob/__init__.py)]
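-
-The referenced sample file doesn't render here, but a heavily condensed, hypothetical sketch of that pattern (using the azure-mgmt-media and azure-identity packages; the environment variable names, transform name, and SAS token handling are illustrative only) looks like this:
-
-```python
-import os
-import azure.functions as func
-from azure.identity import DefaultAzureCredential
-from azure.mgmt.media import AzureMediaServices
-from azure.mgmt.media.models import Asset, Job, JobInputHttp, JobOutputAsset
-
-TRANSFORM_NAME = "FaceRedactorTransform"  # assumed name of the pre-configured transform
-
-def main(event: func.EventGridEvent):
-    # Grab the URL of the newly uploaded video from the Event Grid event.
-    blob_url = event.get_json()["url"]
-    video_name = blob_url.rsplit("/", 1)[-1].split(".")[0]
-
-    client = AzureMediaServices(DefaultAzureCredential(),
-                                os.environ["SUBSCRIPTION_ID"])
-    rg, account = os.environ["RESOURCE_GROUP"], os.environ["AMS_ACCOUNT"]
-
-    # Create the output asset that will hold the redacted video.
-    output_asset = f"{video_name}-redacted"
-    client.assets.create_or_update(rg, account, output_asset, Asset())
-
-    # The sample generates a SAS token for the blob with the DataLakeServiceClient;
-    # a placeholder is used here.
-    sas_url = blob_url + "?<sas-token>"
-
-    # Submit the job to the pre-configured face redaction transform.
-    job = Job(input=JobInputHttp(files=[sas_url]),
-              outputs=[JobOutputAsset(asset_name=output_asset)])
-    client.jobs.create(rg, account, TRANSFORM_NAME, f"job-{video_name}", job)
-```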
-
-## Examine the code for configuring the Azure Resources
-
-The bash script below is used for configuring the Resources after they have been provisioned. Executing this script is the last step of the deployment of the solution, after deploying our Function code. The script executes the following steps:
-- Configure App Settings for the Function App.
-- Create an Azure Event Grid System Topic.
-- Create the Event subscription, so that when a Blob is created in the ADLSg2 Raw folder, the Azure Function is triggered.
-- Create the Azure Media Services Transform using a REST API call. This transform will be called in the Azure Function.
-
-> [!NOTE]
-> Currently, neither the Azure Media Services v3 Python SDK nor the Azure CLI supports the creation of a FaceRedaction Transform. We therefore use the REST API method to create the transform.
-
-[!code-bash[Main](../../../media-services-v3-python/VideoAnalytics/FaceRedactorEventBased/AzureServicesProvisioning/configure_resources.azcli)]
-
-## Enable GitHub Actions pipeline
-The workflow file in this repository contains the steps to execute the deployment of this solution. To start the workflow, it needs to be enabled for your own repo. To enable it, go to the Actions tab in your repo and select 'I understand my workflows, go ahead and enable them'.
-
-After enabling the GitHub Actions, you can find the workflow file here: [.github/workflows/main.yml](https://github.com/Azure-Samples/media-services-v3-python/blob/main/.github/workflows/main.yml). Aside from the triggers, there is a build job with the following steps:
-- **Env**: Multiple environment variables are defined here, referring to the GitHub Secrets that we added earlier.
-- **Read Environment file**: The environment file is read for the build job.
-- **Resolve Project Dependencies using Pip**: The libraries needed by our Azure Function are loaded into the GitHub Actions environment.
-- **Azure Login**: This step uses the GitHub Secret for logging in to the Azure CLI with the Service Principal details.
-- **Deploy Azure Resources using Azure CLI script file**: Runs the deployment script that provisions the Azure resources.
-- **Deploy Azure Function code**: This step packages and deploys the Azure Function in the './azure-function' directory. When the Azure Function is deployed successfully, it should be visible in the Azure portal under the name 'EventGrid_AMSJob'.
-- **Configure Azure Resources using Azure CLI script file**: If everything is correct, the last step configures the deployed Azure services to activate the event listener.
-
-After enabling the workflows, select the 'Deploy Azure Media Service FaceRedaction solution' workflow and select 'Run workflow'. Now, the solution will be deployed using the variables added in the previous steps. Wait a couple of minutes and verify that it has run successfully.
--
-## Test your solution
-Go to the storage explorer of your ADLS Gen2 account in the Azure portal. Upload a video to the Raw container. If you're looking for a test video, download one from [this website](https://www.pexels.com/search/videos/group/).
--
-Verify in your Azure Media Services account that a job was created by selecting Transforms + Jobs from the menu, and then selecting the face redactor transform.
--
-This page should show the job that was fired by the Azure Function. The job can either be finished or still processing.
--
-By selecting the job, you'll see some details about the specific job. If you select the Output asset name and then use the link to the storage container that is linked to it, you can see your processed video when the job is finished.
--
-## Clean up Resources
-
-When you're finished with the quickstart, delete the Resources created in the resource group. Additionally, you can delete the forked repo.
-
-## Next steps
-
-If you would like to modify this example, chances are you would like to run the code locally. For local development, the variables in the sample.env file are sufficient, because the Service Principal is not needed when a user account is logged in to the locally installed Azure CLI. For guidance on working locally with your Azure Function, see [these docs](../../azure-functions/create-first-function-vs-code-python.md).
media-services Filter Order Page Entities How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/filter-order-page-entities-how-to.md
- Title: Filtering, ordering, and paging of entities
-description: Learn about filtering, ordering, and paging of Azure Media Services v3 entities.
------ Previously updated : 08/31/2020----
-# Filtering, ordering, and paging entities
--
-This topic discusses the OData query options and pagination support available when you're listing Azure Media Services v3 entities.
-
-## Considerations
-
-* Properties of entities that are of the `Datetime` type are always in UTC format.
-* White space in the query string should be URL-encoded before you send a request.
-
-## Comparison operators
-
-You can use the following operators to compare a field to a constant value:
-
-Equality operators:
-
-- `eq`: Test whether a field is *equal to* a constant value.
-- `ne`: Test whether a field is *not equal to* a constant value.
-
-Range operators:
-
-- `gt`: Test whether a field is *greater than* a constant value.
-- `lt`: Test whether a field is *less than* a constant value.
-- `ge`: Test whether a field is *greater than or equal to* a constant value.
-- `le`: Test whether a field is *less than or equal to* a constant value.
-
-## Filter
-
-Use `$filter` to supply an OData filter parameter to find only the objects you're interested in.
-
-The following REST example filters on the `alternateId` value of an asset:
-
-```
-GET https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mediaresources/providers/Microsoft.Media/mediaServices/amstestaccount/assets?api-version=2018-07-01&$filter=properties/alternateId%20eq%20'unique identifier'
-```
-
-The following C# example filters on the asset's created date:
-
-```csharp
-var odataQuery = new ODataQuery<Asset>("properties/created lt 2018-05-11T17:39:08.387Z");
-var firstPage = await MediaServicesArmClient.Assets.ListAsync(CustomerResourceGroup, CustomerAccountName, odataQuery);
-```
-
-## Order by
-
-Use `$orderby` to sort the returned objects by the specified parameter. For example:
-
-```
-GET https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mediaresources/providers/Microsoft.Media/mediaServices/amstestaccount/assets?api-version=2018-07-01&$orderby=properties/created%20desc
-```
-
-To sort the results in ascending or descending order, append either `asc` or `desc` to the field name, separated by a space. For example: `$orderby=properties/created desc`.
-
-## Skip token
-
-If a query response contains many items, the service returns a `$skiptoken` (`@odata.nextLink`) value that you use to get the next page of results. Use it to page through the entire result set.
-
-In Media Services v3, you can't configure the page size. The page size varies by the type of entity. Read the individual sections that follow for details.
-
-If entities are created or deleted while you're paging through the collection, the changes are reflected in the returned results (if those changes are in the part of the collection that hasn't been downloaded).
-
-> [!TIP]
-> Always use `nextLink` to enumerate the collection and don't depend on a particular page size.
->
-> The `nextLink` value will be present only if there's more than one page of entities.
-
-Consider the following example of where `$skiptoken` is used. Make sure you replace *amstestaccount* with your account name and set the *api-version* value to the latest version.
-
-If you request a list of assets like this:
-
-```
-GET https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mediaresources/providers/Microsoft.Media/mediaServices/amstestaccount/assets?api-version=2018-07-01 HTTP/1.1
-x-ms-client-request-id: dd57fe5d-f3be-4724-8553-4ceb1dbe5aab
-Content-Type: application/json; charset=utf-8
-```
-
-You'll get back a response similar to this one:
-
-```
-HTTP/1.1 200 OK
-
-{
-"value":[
-{
-"name":"Asset 0","id":"/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mediaresources/providers/Microsoft.Media/mediaservices/amstestaccount/assets/Asset 0","type":"Microsoft.Media/mediaservices/assets","properties":{
-"assetId":"00000000-0000-0000-0000-000000000000","created":"2018-12-11T22:12:44.98Z","lastModified":"2018-12-11T22:15:48.003Z","container":"asset-00000000-0000-0000-0000-0000000000000","storageAccountName":"amsacctname","storageEncryptionFormat":"None"
-}
-},
-// lots more assets
-{
-"name":"Asset 517","id":"/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mediaresources/providers/Microsoft.Media/mediaservices/amstestaccount/assets/Asset 517","type":"Microsoft.Media/mediaservices/assets","properties":{
-"assetId":"00000000-0000-0000-0000-000000000000","created":"2018-12-11T22:14:08.473Z","lastModified":"2018-12-11T22:19:29.657Z","container":"asset-00000000-0000-0000-0000-000000000000","storageAccountName":"amsacctname","storageEncryptionFormat":"None"
-}
-}
-],"@odata.nextLink":"https:// management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mediaresources/providers/Microsoft.Media/mediaServices/amstestaccount/assets?api-version=2018-07-01&$skiptoken=Asset+517"
-}
-```
-
-You would then request the next page by sending a GET request for:
-
-```
-https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mediaresources/providers/Microsoft.Media/mediaServices/amstestaccount/assets?api-version=2018-07-01&$skiptoken=Asset+517
-```
-
-The following C# example shows how to enumerate through all streaming locators in the account.
-
-```csharp
-// Get the first page of streaming locators.
-var firstPage = await MediaServicesArmClient.StreamingLocators.ListAsync(CustomerResourceGroup, CustomerAccountName);
-
-// Follow NextPageLink until the service stops returning one.
-var currentPage = firstPage;
-while (currentPage.NextPageLink != null)
-{
-    currentPage = await MediaServicesArmClient.StreamingLocators.ListNextAsync(currentPage.NextPageLink);
-}
-```
-
-## Using logical operators to combine query options
-
-Media Services v3 supports **OR** and **AND** logical operators.
-
-The following REST example checks the job's state:
-
-```
-https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/qbtest/providers/Microsoft.Media/mediaServices/qbtest/transforms/VideoAnalyzerTransform/jobs?$filter=properties/state%20eq%20Microsoft.Media.JobState'Scheduled'%20or%20properties/state%20eq%20Microsoft.Media.JobState'Processing'&api-version=2018-07-01
-```
-
-You construct the same query in C# like this:
-
-```csharp
-var odataQuery = new ODataQuery<Job>("properties/state eq Microsoft.Media.JobState'Scheduled' or properties/state eq Microsoft.Media.JobState'Processing'");
-client.Jobs.List(config.ResourceGroup, config.AccountName, VideoAnalyzerTransformName, odataQuery);
-```
-
-## Filtering and ordering options of entities
-
-The following table shows how you can apply the filtering and ordering options to different entities:
-
-|Entity name|Property name|Filter|Order|
-|||||
-|[Assets](/rest/api/media/assets/)|name|`eq`, `gt`, `lt`, `ge`, `le`|`asc` and `desc`|
-||properties.alternateId |`eq`||
-||properties.assetId |`eq`||
-||properties.created| `eq`, `gt`, `lt`| `asc` and `desc`|
-|[Content key policies](/rest/api/media/contentkeypolicies)|name|`eq`, `ne`, `ge`, `le`, `gt`, `lt`|`asc` and `desc`|
-||properties.created |`eq`, `ne`, `ge`, `le`, `gt`, `lt`|`asc` and `desc`|
-||properties.description |`eq`, `ne`, `ge`, `le`, `gt`, `lt`||
-||properties.lastModified|`eq`, `ne`, `ge`, `le`, `gt`, `lt`|`asc` and `desc`|
-||properties.policyId|`eq`, `ne`||
-|[Jobs](/rest/api/media/jobs)| name | `eq` | `asc` and `desc`|
-||properties.state | `eq`, `ne` | |
-||properties.created | `gt`, `ge`, `lt`, `le`| `asc` and `desc`|
-||properties.lastModified | `gt`, `ge`, `lt`, `le` | `asc` and `desc`|
-|[Streaming locators](/rest/api/media/streaminglocators)|name|`eq`, `ne`, `ge`, `le`, `gt`, `lt`|`asc` and `desc`|
-||properties.created |`eq`, `ne`, `ge`, `le`, `gt`, `lt`|`asc` and `desc`|
-||properties.endTime |`eq`, `ne`, `ge`, `le`, `gt`, `lt`|`asc` and `desc`|
-|[Streaming policies](/rest/api/media/streamingpolicies)|name|`eq`, `ne`, `ge`, `le`, `gt`, `lt`|`asc` and `desc`|
-||properties.created |`eq`, `ne`, `ge`, `le`, `gt`, `lt`|`asc` and `desc`|
-|[Transforms](/rest/api/media/transforms)| name | `eq` | `asc` and `desc`|
-|| properties.created | `gt`, `ge`, `lt`, `le`| `asc` and `desc`|
-|| properties.lastModified | `gt`, `ge`, `lt`, `le`| `asc` and `desc`|
-
-## Next steps
-
-* [List Assets](/rest/api/media/assets/list)
-* [List Content Key Policies](/rest/api/media/contentkeypolicies/list)
-* [List Jobs](/rest/api/media/jobs/list)
-* [List Streaming Policies](/rest/api/media/streamingpolicies/list)
-* [List Streaming Locators](/rest/api/media/streaminglocators/list)
-* [Stream a file](stream-files-dotnet-quickstart.md)
-* [Quotas and limits](limits-quotas-constraints-reference.md)
media-services Filters Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/filters-concept.md
- Title: Defining filters in Azure Media Services
-description: This topic describes how to create filters so your client can use them to stream specific sections of a stream. Media Services creates dynamic manifests to achieve this selective streaming.
------ Previously updated : 08/31/2020---
-# Filters
--
-When delivering your content to customers (Live Streaming events or Video on Demand), your client might need more flexibility than what's described in the default asset's manifest file. Azure Media Services offers [Dynamic Manifests](filters-dynamic-manifest-concept.md) based on pre-defined filters.
-
-Filters are server-side rules that allow your customers to do things like:
--- Play back only a section of a video (instead of playing the whole video). For example:
- - Reduce the manifest to show a sub-clip of a live event ("sub-clip filtering"), or
- - Trim the start of a video ("trimming a video").
-- Deliver only the specified renditions and/or specified language tracks that are supported by the device that is used to play back the content ("rendition filtering").
-- Adjust Presentation Window (DVR) in order to provide a limited length of the DVR window in the player ("adjusting presentation window").
-
-Media Services enables you to create **Account filters** and **Asset filters** for your content. In addition, you can associate your pre-created filters with a **Streaming Locator**.
-
-## Defining filters
-
-There are two types of filters:
-
-* [Account Filters](/rest/api/media/accountfilters) (global) - can be applied to any asset in the Azure Media Services account and have the lifetime of the account.
-* [Asset Filters](/rest/api/media/assetfilters) (local) - can be applied only to the asset with which the filter was associated upon creation and have the lifetime of that asset.
-
-**Account Filters** and **Asset Filters** have exactly the same properties for defining/describing the filter, except that when creating an **Asset Filter**, you need to specify the name of the asset with which you want to associate the filter.
-
-Depending on your scenario, you decide which type of filter is more suitable (Asset Filter or Account Filter). Account Filters are suitable for device profiles (rendition filtering), whereas Asset Filters could be used to trim a specific asset.
-
-You use the following properties to describe the filters.
-
-|Name|Description|
-|||
-|firstQuality|The first quality bitrate of the filter.|
-|presentationTimeRange|The presentation time range. This property is used for filtering manifest start/end points, presentation window length, and the live start position. <br/>For more information, see [PresentationTimeRange](#presentationtimerange).|
-|tracks|The tracks selection conditions. For more information, see [tracks](#tracks)|
-
-### presentationTimeRange
-
-Use this property with **Asset Filters**. It is not recommended to set the property with **Account Filters**.
-
-|Name|Description|
-|||
-|**endTimestamp**|Applies to Video on Demand (VoD).<br/>For the Live Streaming presentation, it is silently ignored and applied when the presentation ends and the stream becomes VoD.<br/>This is a long value that represents an absolute end point of the presentation, rounded to the closest next GOP start. The unit is the timescale, so an endTimestamp of 1800000000 would be for 3 minutes.<br/>Use startTimestamp and endTimestamp to trim the fragments that will be in the playlist (manifest).<br/>For example, startTimestamp=40000000 and endTimestamp=100000000 using the default timescale will generate a playlist that contains fragments from between 4 seconds and 10 seconds of the VoD presentation. If a fragment straddles the boundary, the entire fragment will be included in the manifest.|
-|**forceEndTimestamp**|Applies to Live Streaming only.<br/>Indicates whether the endTimestamp property must be present. If true, endTimestamp must be specified or a bad request code is returned.<br/>Allowed values: false, true.|
-|**liveBackoffDuration**|Applies to Live Streaming only.<br/> This value defines the latest live position that a client can seek to.<br/>Using this property, you can delay live playback position and create a server-side buffer for players.<br/>The unit for this property is timescale (see below).<br/>The maximum live back off duration is 300 seconds (3000000000).<br/>For example, a value of 2000000000 means that the latest available content is 20 seconds delayed from the real live edge.|
-|**presentationWindowDuration**|Applies to Live Streaming only.<br/>Use presentationWindowDuration to apply a sliding window of fragments to include in a playlist.<br/>The unit for this property is timescale (see below).<br/>For example, set presentationWindowDuration=1200000000 to apply a two-minute sliding window. Media within 2 minutes of the live edge will be included in the playlist. If a fragment straddles the boundary, the entire fragment will be included in the playlist. The minimum presentation window duration is 60 seconds.|
-|**startTimestamp**|Applies to Video on Demand (VoD) or Live Streaming.<br/>This is a long value that represents an absolute start point of the stream. The value gets rounded to the closest next GOP start. The unit is the timescale, so a startTimestamp of 150000000 would be for 15 seconds.<br/>Use startTimestamp and endTimestamp to trim the fragments that will be in the playlist (manifest).<br/>For example, startTimestamp=40000000 and endTimestamp=100000000 using the default timescale will generate a playlist that contains fragments from between 4 seconds and 10 seconds of the VoD presentation. If a fragment straddles the boundary, the entire fragment will be included in the manifest.|
-|**timescale**|Applies to all timestamps and durations in a Presentation Time Range, specified as the number of increments in one second.<br/>Default is 10000000 - ten million increments in one second, where each increment would be 100 nanoseconds long.<br/>For example, if you want to set a startTimestamp at 30 seconds, you would use a value of 300000000 when using the default timescale.|
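-
-For example, a minimal Asset Filter body (using the `properties` wrapper shown in the full example below) that trims a VoD presentation to the 4-10 second window described in the table might look like this:
-
-```json
-{
-  "properties": {
-    "presentationTimeRange": {
-      "startTimestamp": 40000000,
-      "endTimestamp": 100000000,
-      "timescale": 10000000
-    }
-  }
-}
-```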
-
-### Tracks
-
-You specify a list of filter track property conditions (FilterTrackPropertyConditions) based on which the tracks of your stream (Live Streaming or Video on Demand) should be included into dynamically created manifest. The filters are combined using a logical **AND** and **OR** operation.
-
-Filter track property conditions describe track types, values (described in the following table), and operations (Equal, NotEqual).
-
-|Name|Description|
-|||
-|**Bitrate**|Use the bitrate of the track for filtering.<br/><br/>The recommended value is a range of bitrates, in bits per second. For example, "0-2427000".<br/><br/>Note: while you can use a specific bitrate value, like 250000 (bits per second), this approach is not recommended, as the exact bitrates can fluctuate from one Asset to another.|
-|**FourCC**|Use the FourCC value of the track for filtering.<br/><br/>The value is the first element of codecs format, as specified in [RFC 6381](https://tools.ietf.org/html/rfc6381). Currently, the following codecs are supported: <br/>For Video: "avc1", "hev1", "hvc1"<br/>For Audio: "mp4a", "ec-3"<br/><br/>To determine the FourCC values for tracks in an Asset, get and examine the manifest file.|
-|**Language**|Use the language of the track for filtering.<br/><br/>The value is the tag of a language you want to include, as specified in RFC 5646. For example, "en".|
-|**Name**|Use the name of the track for filtering.|
-|**Type**|Use the type of the track for filtering.<br/><br/>The following values are allowed: "video", "audio", or "text".|
-
-### Example
-
-The following example defines a Live Streaming filter:
-
-```json
-{
- "properties": {
- "presentationTimeRange": {
- "startTimestamp": 0,
- "endTimestamp": 170000000,
- "presentationWindowDuration": 9223372036854776000,
- "liveBackoffDuration": 0,
- "timescale": 10000000,
- "forceEndTimestamp": false
- },
- "firstQuality": {
- "bitrate": 128000
- },
- "tracks": [
- {
- "trackSelections": [
- {
- "property": "Type",
- "operation": "Equal",
- "value": "Audio"
- },
- {
- "property": "Language",
- "operation": "NotEqual",
- "value": "en"
- },
- {
- "property": "FourCC",
- "operation": "NotEqual",
- "value": "EC-3"
- }
- ]
- },
- {
- "trackSelections": [
- {
- "property": "Type",
- "operation": "Equal",
- "value": "Video"
- },
- {
- "property": "Bitrate",
- "operation": "Equal",
- "value": "3000000-5000000"
- }
- ]
- }
- ]
- }
-}
-```
-
-## Associating filters with Streaming Locator
-
-You can specify a list of [asset or account filters](filters-concept.md) on your [Streaming Locator](/rest/api/medi). The Dynamic Packager applies this list of filters together with those your client specifies in the URL, so the resulting dynamic manifest is based on both the filters in the URL and the filters you specify on the Streaming Locator.
-
-See the following examples:
-
-* [Associate filters with Streaming Locator - .NET](filters-dynamic-manifest-dotnet-how-to.md#associate-filters-with-streaming-locator)
-* [Associate filters with Streaming Locator - CLI](filters-dynamic-manifest-cli-how-to.md#associate-filters-with-streaming-locator)
-
-## Updating filters
-
-**Streaming Locators** can't be updated, but filters can be.
-
-It is not recommended to update the definition of filters associated with an actively published **Streaming Locator**, especially when a CDN is enabled. Streaming servers and CDNs can have internal caches that may result in stale cached data being returned.
-
-If the filter definition needs to be changed, consider creating a new filter and adding it to the **Streaming Locator** URL, or publishing a new **Streaming Locator** that references the filter directly.
-
-## Next steps
-
-The following articles show how to create filters programmatically.
-
-- [Create filters with REST APIs](filters-dynamic-manifest-rest-howto.md)
-- [Create filters with .NET](filters-dynamic-manifest-dotnet-how-to.md)
-- [Create filters with CLI](filters-dynamic-manifest-cli-how-to.md)
media-services Filters Dynamic Manifest Cli How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/filters-dynamic-manifest-cli-how-to.md
- Title: Use CLI to create filters with Azure Media Services
-description: This article shows how to use CLI to create filters with Azure Media Services v3.
------- Previously updated : 08/31/2020----
-# Creating filters with CLI
--
-When delivering your content to customers (streaming Live events or Video on Demand), your client might need more flexibility than what's described in the default asset's manifest file. Azure Media Services enables you to define account filters and asset filters for your content.
-
-For a detailed description of this feature and the scenarios where it is used, see [Dynamic Manifests](filters-dynamic-manifest-concept.md) and [Filters](filters-concept.md).
-
-This topic shows how to configure a filter for a Video on Demand asset and use the CLI for Media Services v3 to create [Account Filters](/cli/azure/ams/account-filter) and [Asset Filters](/cli/azure/ams/asset-filter).
-
-> [!NOTE]
-> Make sure to review [presentationTimeRange](filters-concept.md#presentationtimerange).
-
-## Prerequisites
-
-- [Create a Media Services account](./account-create-how-to.md). Make sure to remember the resource group name and the Media Services account name.
-
-## Define a filter
-
-The following example defines the track selection conditions that are added to the final manifest. This filter includes any audio tracks that are not EC-3 and any video tracks that have a bitrate in the 0-1000000 range.
-
-> [!TIP]
-> If you plan to define **Filters** in REST, notice that you need to include the "Properties" wrapper JSON object.
-
-```json
-[
- {
- "trackSelections": [
- {
- "property": "Type",
- "value": "Audio",
- "operation": "Equal"
- },
- {
- "property": "FourCC",
- "value": "EC-3",
- "operation": "NotEqual"
- }
- ]
- },
- {
- "trackSelections": [
- {
- "property": "Type",
- "value": "Video",
- "operation": "Equal"
- },
- {
- "property": "Bitrate",
- "value": "0-1000000",
- "operation": "Equal"
- }
- ]
- }
-]
-```
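-
-For comparison, if you define the same filter through the REST API, the tracks array above must be wrapped in the "Properties" object mentioned in the tip (a trimmed sketch; the REST how-to article shows a complete request body):
-
-```json
-{
-    "properties": {
-        "tracks": [
-            {
-                "trackSelections": [
-                    {
-                        "property": "Type",
-                        "value": "Audio",
-                        "operation": "Equal"
-                    }
-                ]
-            }
-        ]
-    }
-}
-```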
-
-## Create account filters
-
-The following [az ams account-filter](/cli/azure/ams/account-filter) command creates an account filter with filter track selections that were [defined earlier](#define-a-filter).
-
-The command allows you to pass an optional `--tracks` parameter that contains JSON representing the track selections. Use `@{file}` to load the JSON from a file. If you're using the Azure CLI locally, specify the whole file path:
-
-```azurecli
-az ams account-filter create -a amsAccount -g resourceGroup -n filterName --tracks @tracks.json
-```
-
-Also, see [JSON examples for filters](/rest/api/media/accountfilters/createorupdate#create-an-account-filter).
-
-## Create asset filters
-
-The following [az ams asset-filter](/cli/azure/ams/asset-filter) command creates an asset filter with filter track selections that were [defined earlier](#define-a-filter).
-
-```azurecli
-az ams asset-filter create -a amsAccount -g resourceGroup -n filterName --asset-name assetName --tracks @tracks.json
-```
-
-Also, see [JSON examples for filters](/rest/api/media/assetfilters/createorupdate#create-an-asset-filter).
-
-## Associate filters with Streaming Locator
-
-You can specify a list of asset or account filters, which would apply to your Streaming Locator. The [Dynamic Packager (Streaming Endpoint)](encode-dynamic-packaging-concept.md) applies this list of filters together with those your client specifies in the URL. This combination generates a [Dynamic Manifest](filters-dynamic-manifest-concept.md), which is based on filters in the URL + filters you specify on Streaming Locator. We recommend that you use this feature if you want to apply filters but do not want to expose the filter names in the URL.
-
-The following CLI code shows how to create a Streaming Locator and specify `filters`. This is an optional property that takes a space-separated list of asset filter names and/or account filter names.
-
-```azurecli
-az ams streaming-locator create -a amsAccount -g resourceGroup -n streamingLocatorName \
- --asset-name assetName \
- --streaming-policy-name policyName \
- --filters filterName1 filterName2
-```
-
-## Stream using filters
-
-Once you define filters, your clients can use them in the streaming URL. Filters can be applied to adaptive bitrate streaming protocols: Apple HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming.
-
-The following table shows some examples of URLs with filters:
-
-|Protocol|Example|
-|||
-|HLS|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(format=m3u8-aapl,filter=myAccountFilter)`|
-|MPEG DASH|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(format=mpd-time-csf,filter=myAssetFilter)`|
-|Smooth Streaming|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(filter=myAssetFilter)`|
-
-## Next step
-
-[Stream videos](stream-files-tutorial-with-api.md)
-
-## See also
-
-[Azure CLI](/cli/azure/ams)
media-services Filters Dynamic Manifest Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/filters-dynamic-manifest-concept.md
- Title: Filter your manifests using Dynamic Packager
-description: Learn how to create filters using Dynamic Packager to filter and selectively stream your manifests.
-Previously updated : 08/31/2020
-#Customer intent: As a developer or a content provider, when delivering adaptive bitrate streaming content to devices, you often need to target specific device capabilities or available network bandwidth. Pre-filtering manifests with Dynamic Packager allows your clients to manipulate the streaming of your content without you needing to create multiple copies of the same media file.
-
-# Filter your manifests using Dynamic Packager
-
-When you're delivering adaptive bitrate streaming content to devices, you sometimes need to publish multiple versions of a manifest to target specific device capabilities or available network bandwidth. The [Dynamic Packager](encode-dynamic-packaging-concept.md) lets you specify filters that filter out specific codecs, resolutions, bitrates, and audio track combinations on the fly. This filtering removes the need to create multiple copies. You simply publish a new URL with a specific set of filters configured for your target devices (iOS, Android, SmartTV, or browsers) and network capabilities (high-bandwidth, mobile, or low-bandwidth scenarios). Clients can then manipulate the streaming of your content through the query string (by specifying available [Asset filters or Account filters](filters-concept.md)) and use filters to stream specific sections of a stream.
-
-Some delivery scenarios require that you make sure a customer can't access specific tracks. For example, maybe you don't want to publish a manifest that contains HD tracks to a specific subscriber tier. Or, maybe you want to remove specific adaptive bitrate (ABR) tracks to reduce cost of delivery to a specific device that wouldn't benefit from the additional tracks. In this case, you could associate a list of pre-created filters with your [Streaming Locator](stream-streaming-locators-concept.md) on creation. Clients then can't manipulate how the content is streamed because it's defined by the **Streaming Locator**.
-
-You can combine filtering through specifying [filters on Streaming Locator](filters-concept.md#associating-filters-with-streaming-locator) + additional device-specific filters that your client specifies in the URL. This combination is useful to restrict additional tracks like metadata or event streams, audio languages, or descriptive audio tracks.
-
-This ability to specify different filters on your stream provides a powerful **Dynamic Manifest** manipulation solution to target multiple use-case scenarios for your target devices. This topic explains concepts related to **Dynamic Manifests** and gives examples of scenarios in which you can use this feature.
-
-> [!NOTE]
-> Dynamic Manifests don't change the asset and the default manifest for that asset.
-
-## Overview of manifests
-
-Azure Media Services supports HLS, MPEG DASH, and Smooth Streaming protocols. As part of [Dynamic Packaging](encode-dynamic-packaging-concept.md), the streaming client manifests (HLS Master Playlist, DASH Media Presentation Description [MPD], and Smooth Streaming) are dynamically generated based on the format selector in the URL. For more information, see the delivery protocols in [Common on-demand workflow](encode-dynamic-packaging-concept.md#to-prepare-your-source-files-for-delivery).
-
-### Get and examine manifest files
-
-You specify a list of filter track property conditions based on which tracks of your stream (live or video on-demand [VOD]) should be included in a dynamically created manifest. To get and examine the properties of the tracks, you have to load the Smooth Streaming manifest first.
-
-The [Upload, encode, and stream files with .NET](stream-files-tutorial-with-api.md#get-streaming-urls) tutorial shows you how to build the streaming URLs with .NET. If you run the app, one of the URLs points to the Smooth Streaming manifest: `https://amsaccount-usw22.streaming.media.azure.net/00000000-0000-0000-0000-0000000000000/ignite.ism/manifest`.<br/> Copy and paste the URL into the address bar of a browser. The file will be downloaded. You can open it in any text editor.
-
-For a REST example, see [Upload, encode, and stream files with REST](stream-files-tutorial-with-rest.md#list-paths-and-build-streaming-urls).
-
-### Monitor the bitrate of a video stream
-
-You can use the [Azure Media Player demo page](https://aka.ms/azuremediaplayer) to monitor the bitrate of a video stream. The demo page displays diagnostics info on the **Diagnostics** tab:
-
-![Azure Media Player diagnostics][amp_diagnostics]
-
-### Examples: URLs with filters in query string
-
-You can apply filters to ABR streaming protocols: HLS, MPEG-DASH, and Smooth Streaming. The following table shows some examples of URLs with filters:
-
-|Protocol|Example|
-|||
-|HLS|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(format=m3u8-aapl,filter=myAccountFilter)`|
-|MPEG DASH|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(format=mpd-time-csf,filter=myAssetFilter)`|
-|Smooth Streaming|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(filter=myAssetFilter)`|
-
-## Rendition filtering
-
-You can choose to encode your asset to multiple encoding profiles (H.264 Baseline, H.264 High, AACL, AACH, Dolby Digital Plus) and multiple quality bitrates. However, not all client devices will support all your asset's profiles and bitrates. For example, older Android devices support only H.264 Baseline+AACL. Sending higher bitrates to a device that can't get the benefits wastes bandwidth and device computation. Such a device must decode all the given information, only to scale it down for display.
-
-With Dynamic Manifest, you can create device profiles (such as mobile, console, or HD/SD) and include the tracks and qualities that you want to be a part of each profile. That's called rendition filtering. The following diagram shows an example of it.
-
-![Example of rendition filtering with Dynamic Manifest][renditions2]
-
-In the following example, an encoder was used to encode a mezzanine asset into seven ISO MP4s video renditions (from 180p to 1080p). The encoded asset can be [dynamically packaged](encode-dynamic-packaging-concept.md) into any of the following streaming protocols: HLS, MPEG DASH, and Smooth.
-
-The top of the following diagram shows the HLS manifest for the asset with no filters. (It contains all seven renditions.) In the lower left, the diagram shows an HLS manifest to which a filter named "ott" was applied. The "ott" filter specifies the removal of all bitrates below 1 Mbps, so the bottom two quality levels were stripped off in the response. In the lower right, the diagram shows the HLS manifest to which a filter named "mobile" was applied. The "mobile" filter specifies the removal of renditions where the resolution is larger than 720p, so the two 1080p renditions were stripped off.
-
-![Rendition filtering with Dynamic Manifest][renditions1]
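-
-As a concrete illustration, the "ott" filter described above could be defined with a track selection like the following (a sketch only; the 20 Mbps upper bound is an assumed value chosen to keep all of the higher renditions):
-
-```json
-[
-    {
-        "trackSelections": [
-            {
-                "property": "Type",
-                "value": "Video",
-                "operation": "Equal"
-            },
-            {
-                "property": "Bitrate",
-                "value": "1000000-20000000",
-                "operation": "Equal"
-            }
-        ]
-    }
-]
-```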
-
-## Removing language tracks
-Your assets might include multiple audio languages such as English, Spanish, French, and so on. Usually, the player SDK manages the default audio track selection and the audio tracks available for user selection.
-
-Developing such Player SDKs is challenging because it requires different implementations across device-specific player frameworks. Also, on some platforms, Player APIs are limited and don't include the audio selection feature where users can't select or change the default audio track. With asset filters, you can control the behavior by creating filters that only include desired audio languages.
-
-![Filtering of language tracks with Dynamic Manifest][language_filter]
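-
-For example, a filter that keeps only English audio might use a track selection like this (a sketch; "en" is an assumed language tag for your content):
-
-```json
-[
-    {
-        "trackSelections": [
-            {
-                "property": "Type",
-                "value": "Audio",
-                "operation": "Equal"
-            },
-            {
-                "property": "Language",
-                "value": "en",
-                "operation": "Equal"
-            }
-        ]
-    }
-]
-```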
-
-## Trimming the start of an asset
-
-In most live streaming events, operators run some tests before the actual event. For example, they might include a slate like this before the start of the event: "Program will begin momentarily."
-
-If the program is being archived, the test and slate data are also archived and included in the presentation. However, this information shouldn't be shown to the clients. With Dynamic Manifest, you can create a start-time filter and remove the unwanted data from the manifest.
-
-![Trimming start of an asset with Dynamic Manifest][trim_filter]
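-
-Trimming the start maps to the filter's **presentationTimeRange** (see [presentationTimeRange](filters-concept.md#presentationtimerange)). A minimal sketch that removes the first 30 seconds, assuming the manifest starts at 0 and a timescale of 10000000 (100-nanosecond units):
-
-```json
-{
-    "presentationTimeRange": {
-        "startTimestamp": 300000000,
-        "timescale": 10000000
-    }
-}
-```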
-
-## Creating subclips (views) from a live archive
-
-Many live events are long running, and a live archive might include multiple events. After the live event ends, broadcasters may want to break up the live archive into logical program start and stop sequences.
-
-You can publish these virtual programs separately, without post-processing the live archive and without creating separate assets (which wouldn't benefit from the existing cached fragments in the CDNs). Examples of such virtual programs are the quarters of a football or basketball game, innings in baseball, or individual events of any sports program.
-
-With Dynamic Manifest, you can create filters by using start/end times and create virtual views over the top of your live archive.
-
-![Subclip filter with Dynamic Manifest][subclip_filter]
-
-Here's the filtered asset:
-
-![Filtered asset with Dynamic Manifest][skiing]
-
-## Adjusting the presentation window (DVR)
-
-Currently, Azure Media Services offers a circular archive whose duration can be configured from 1 minute to 25 hours. Manifest filtering can be used to create a rolling DVR window over the top of the archive, without deleting media. There are many scenarios where broadcasters want to provide a limited DVR window that moves with the live edge while keeping a bigger archiving window. A broadcaster may want to use the data that's outside the DVR window to highlight clips, or may want to provide different DVR windows for different devices. For example, most mobile devices don't handle large DVR windows (you might use a 2-minute DVR window for mobile devices and one hour for desktop clients).
-
-![DVR window with Dynamic Manifest][dvr_filter]
-
-## Adjusting LiveBackoff (live position)
-
-Manifest filtering can be used to remove several seconds from the live edge of a live program. Filtering allows broadcasters to watch the presentation on the preview publication point and create advertisement insertion points before the viewers receive the stream (backed off by 30 seconds). Broadcasters can then push these advertisements to their client frameworks in time for them to receive and process the information before the advertisement opportunity.
-
-In addition to the advertisement support, the live back-off setting can be used to adjust the viewers' position so that when clients drift and hit the live edge, they can still get fragments from the server. That way, clients won't get an HTTP 404 or 412 error.
-
-![Filter for live back-off with Dynamic Manifest][livebackoff_filter]
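-
-Both of the live scenarios above map to **presentationTimeRange** fields on a live filter. A sketch of a 2-minute DVR window combined with a 30-second live back-off, again assuming a timescale of 10000000:
-
-```json
-{
-    "presentationTimeRange": {
-        "presentationWindowDuration": 1200000000,
-        "liveBackoffDuration": 300000000,
-        "timescale": 10000000
-    }
-}
-```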
-
-## Combining multiple rules in a single filter
-
-You can combine multiple filtering rules in a single filter. For example, you can define a "range rule" to remove slates from a live archive and also filter out available bitrates. When you're applying multiple filtering rules, the end result is the intersection of all rules.
-
-![Multiple filtering rules with Dynamic Manifest][multiple-rules]
-
-## Combining multiple filters (filter composition)
-
-You can also combine multiple filters in a single URL. The following scenario demonstrates why you might want to combine filters:
-
-1. You need to filter your video qualities for mobile devices, like Android or iPad (in order to limit video qualities). To remove the unwanted qualities, you'll create an account filter suitable for the device profiles. You can use account filters for all your assets under the same Media Services account without any further association.
-1. You also want to trim the start and end time of an asset. To do the trimming, you'll create an asset filter and set the start/end time.
-1. You want to combine both of these filters. Without combination, you would need to add quality filtering to the trimming filter, which would make filter usage more difficult.
-
-To combine filters, append the filter names to the manifest/playlist URL in semicolon-delimited format. Let's assume you have a filter named *MyMobileDevice* that filters qualities, and another named *MyStartTime* that sets a specific start time. You can combine up to three filters, as shown below.
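-
-The combined request might then look like this (a hypothetical URL that reuses the sample host from the tables above):
-
-```
-https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(format=m3u8-aapl,filter=MyMobileDevice;MyStartTime)
-```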
-
-For more information, see [this blog post](https://azure.microsoft.com/blog/azure-media-services-release-dynamic-manifest-composition-remove-hls-audio-only-track-and-hls-i-frame-track-support/).
-
-## Considerations and limitations
-
-- The values for **forceEndTimestamp**, **presentationWindowDuration**, and **liveBackoffDuration** shouldn't be set for a VOD filter. They're used only for live filter scenarios.
-- A dynamic manifest operates in GOP boundaries (key frames), so trimming has GOP accuracy.
-- You can use the same filter name for account and asset filters. Asset filters have higher precedence and will override account filters.
-- If you update a filter, it can take up to 2 minutes for the streaming endpoint to refresh the rules. If you used filters to serve the content (and you cached the content in proxies and CDN caches), updating these filters can result in player failures. We recommend that you clear the cache after updating the filter. If this option isn't possible, consider using a different filter.
-- Customers need to manually download the manifest and parse the exact start timestamp and timescale.
-
- - To determine properties of the tracks in an asset, [get and examine the manifest file](#get-and-examine-manifest-files).
- - The formula to set the asset filter time-stamp properties is: <br/>startTimestamp = &lt;start time in the manifest&gt; + &lt;expected filter start time in seconds&gt; * timescale
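- - For example, if the manifest reports a start time of 0, you want to trim the first 30 seconds, and the timescale is 10000000, then startTimestamp = 0 + 30 * 10000000 = 300000000.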
-
-## Next steps
-
-The following articles show how to create filters programmatically:
-
-- [Create filters with REST APIs](filters-dynamic-manifest-rest-howto.md)
-- [Create filters with .NET](filters-dynamic-manifest-dotnet-how-to.md)
-- [Create filters with CLI](filters-dynamic-manifest-cli-how-to.md)
-
-[renditions1]: ./media/filters-dynamic-manifest-concept/media-services-rendition-filter.png
-[renditions2]: ./media/filters-dynamic-manifest-concept/media-services-rendition-filter2.png
-
-
-[multiple-rules]:./media/filters-dynamic-manifest-concept/media-services-multiple-rules-filters.png
-
-[subclip_filter]: ./media/filters-dynamic-manifest-concept/media-services-subclips-filter.png
-[trim_filter]: ./media/filters-dynamic-manifest-concept/media-services-trim-filter.png
-[livebackoff_filter]: ./media/filters-dynamic-manifest-concept/media-services-livebackoff-filter.png
-[language_filter]: ./media/filters-dynamic-manifest-concept/media-services-language-filter.png
-[dvr_filter]: ./media/filters-dynamic-manifest-concept/media-services-dvr-filter.png
-[skiing]: ./media/filters-dynamic-manifest-concept/media-services-skiing.png
-[amp_diagnostics]: ./media/filters-dynamic-manifest-concept/amp_diagnostics.png
media-services Filters Dynamic Manifest Dotnet How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/filters-dynamic-manifest-dotnet-how-to.md
- Title: Creating filters with Azure Media Services v3 .NET SDK
-description: This topic describes how to create filters so your client can use them to stream specific sections of a stream. Media Services v3 .NET SDK creates dynamic manifests to achieve this selective streaming.
-Previously updated : 08/31/2020
-# Create filters with Media Services .NET SDK
-
-When delivering your content to customers (streaming Live events or Video on Demand), your client might need more flexibility than what's described in the default asset's manifest file. Azure Media Services enables you to define account filters and asset filters for your content.
-
-For a detailed description of this feature and the scenarios where it's used, see [Dynamic Manifests](filters-dynamic-manifest-concept.md) and [Filters](filters-concept.md).
-
-This topic shows how to use the Media Services .NET SDK to define a filter for a Video on Demand asset and to create [Account Filters](/dotnet/api/microsoft.azure.management.media.models.accountfilter) and [Asset Filters](/dotnet/api/microsoft.azure.management.media.models.assetfilter).
-
-> [!NOTE]
-> Make sure to review [presentationTimeRange](filters-concept.md#presentationtimerange).
-
-## Prerequisites
-
-- Review [Filters and dynamic manifests](filters-dynamic-manifest-concept.md).
-- [Create a Media Services account](./account-create-how-to.md). Make sure to remember the resource group name and the Media Services account name.
-- Get information needed to [access APIs](./access-api-howto.md).
-- Review [Upload, encode, and stream using Azure Media Services](stream-files-tutorial-with-api.md) to see how to [start using the .NET SDK](stream-files-tutorial-with-api.md#start-using-media-services-apis-with-the-net-sdk).
-
-## Define a filter
-
-In .NET, you configure track selections with [FilterTrackSelection](/dotnet/api/microsoft.azure.management.media.models.filtertrackselection) and [FilterTrackPropertyCondition](/dotnet/api/microsoft.azure.management.media.models.filtertrackpropertycondition) classes.
-
-The following code defines a filter that includes any audio tracks that are EC-3 and any video tracks that have bitrate in the 0-1000000 range.
-
-```csharp
-var audioConditions = new List<FilterTrackPropertyCondition>()
-{
- new FilterTrackPropertyCondition(FilterTrackPropertyType.Type, "Audio", FilterTrackPropertyCompareOperation.Equal),
- new FilterTrackPropertyCondition(FilterTrackPropertyType.FourCC, "EC-3", FilterTrackPropertyCompareOperation.Equal)
-};
-
-var videoConditions = new List<FilterTrackPropertyCondition>()
-{
- new FilterTrackPropertyCondition(FilterTrackPropertyType.Type, "Video", FilterTrackPropertyCompareOperation.Equal),
- new FilterTrackPropertyCondition(FilterTrackPropertyType.Bitrate, "0-1000000", FilterTrackPropertyCompareOperation.Equal)
-};
-
-List<FilterTrackSelection> includedTracks = new List<FilterTrackSelection>()
-{
- new FilterTrackSelection(audioConditions),
- new FilterTrackSelection(videoConditions)
-};
-```
-
-## Create account filters
-
-The following code shows how to use .NET to create an account filter that includes all track selections [defined above](#define-a-filter).
-
-```csharp
-AccountFilter accountFilterParams = new AccountFilter(tracks: includedTracks);
-client.AccountFilters.CreateOrUpdate(config.ResourceGroup, config.AccountName, "accountFilterName1", accountFilterParams);
-```
-
-## Create asset filters
-
-The following code shows how to use .NET to create an asset filter that includes all track selections [defined above](#define-a-filter).
-
-```csharp
-AssetFilter assetFilterParams = new AssetFilter(tracks: includedTracks);
-client.AssetFilters.CreateOrUpdate(config.ResourceGroup, config.AccountName, encodedOutputAsset.Name, "assetFilterName1", assetFilterParams);
-```
-
-## Associate filters with Streaming Locator
-
-You can specify a list of asset or account filters, which would apply to your Streaming Locator. The [Dynamic Packager (Streaming Endpoint)](encode-dynamic-packaging-concept.md) applies this list of filters together with those your client specifies in the URL. This combination generates a [Dynamic Manifest](filters-dynamic-manifest-concept.md), which is based on filters in the URL + filters you specify on Streaming Locator. We recommend that you use this feature if you want to apply filters but do not want to expose the filter names in the URL.
-
-The following C# code shows how to create a Streaming Locator and specify `StreamingLocator.Filters`. This is an optional property that takes an `IList<string>` of filter names.
-
-```csharp
-IList<string> filters = new List<string>();
-filters.Add("filterName");
-
-StreamingLocator locator = await client.StreamingLocators.CreateAsync(
- resourceGroup,
- accountName,
- locatorName,
- new StreamingLocator
- {
- AssetName = assetName,
- StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly,
- Filters = filters
- });
-```
-
-## Stream using filters
-
-Once you define filters, your clients can use them in the streaming URL. Filters can be applied to adaptive bitrate streaming protocols: Apple HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming.
-
-The following table shows some examples of URLs with filters:
-
-|Protocol|Example|
-|||
-|HLS|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(format=m3u8-aapl,filter=myAccountFilter)`|
-|MPEG DASH|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(format=mpd-time-csf,filter=myAssetFilter)`|
-|Smooth Streaming|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(filter=myAssetFilter)`|
-
-## Next steps
-
-[Stream videos](stream-files-tutorial-with-api.md)
media-services Filters Dynamic Manifest Rest Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/filters-dynamic-manifest-rest-howto.md
- Title: Creating filters with Azure Media Services v3 REST API
-description: This topic describes how to create filters so your client can use them to stream specific sections of a stream. Media Services v3 REST API creates dynamic manifests to achieve this selective streaming.
-Previously updated : 08/31/2020
-# Creating filters with Media Services REST API
-
-When delivering your content to customers (streaming Live events or Video on Demand), your client might need more flexibility than what's described in the default asset's manifest file. Azure Media Services enables you to define account filters and asset filters for your content.
-
-For a detailed description of this feature and the scenarios where it's used, see [Dynamic Manifests](filters-dynamic-manifest-concept.md) and [Filters](filters-concept.md).
-
-This topic shows how to define a filter for a Video on Demand asset and how to use the REST API to create [Account Filters](/rest/api/media/accountfilters) and [Asset Filters](/rest/api/media/assetfilters).
-
-> [!NOTE]
-> Make sure to review [presentationTimeRange](filters-concept.md#presentationtimerange).
-
-## Prerequisites
-
-To complete the steps described in this topic, you have to:
-
-- Review [Filters and dynamic manifests](filters-dynamic-manifest-concept.md).
-- [Configure Postman for Azure Media Services REST API calls](setup-postman-rest-how-to.md).
-
- Make sure to follow the last step in the topic [Get Azure AD Token](setup-postman-rest-how-to.md#get-azure-ad-token).
-
-## Define a filter
-
-The following **Request body** example defines the track selection conditions that are added to the manifest. This filter includes any audio tracks that are EC-3 and any video tracks that have a bitrate in the 0-1000000 range.
-
-```json
-{
- "properties": {
- "tracks": [
- {
- "trackSelections": [
- {
- "property": "Type",
- "value": "Audio",
- "operation": "Equal"
- },
- {
- "property": "FourCC",
- "value": "EC-3",
- "operation": "Equal"
- }
- ]
- },
- {
- "trackSelections": [
- {
- "property": "Type",
- "value": "Video",
- "operation": "Equal"
- },
- {
- "property": "Bitrate",
- "value": "0-1000000",
- "operation": "Equal"
- }
- ]
- }
- ]
- }
-}
-```
-
-## Create account filters
-
-In the Postman's collection that you downloaded, select **Account Filters**->**Create or update an Account Filter**.
-
-The **PUT** HTTP request method is similar to:
-
-```
-PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Media/mediaServices/{accountName}/accountFilters/{filterName}?api-version=2018-07-01
-```
-
-Select the **Body** tab and paste the JSON code you [defined earlier](#define-a-filter).
-
-Select **Send**.
-
-The filter has been created.
-
-For more information, see [Create or update](/rest/api/media/accountfilters/createorupdate). Also, see [JSON examples for filters](/rest/api/media/accountfilters/createorupdate#create-an-account-filter).
-
-## Create asset filters
-
-In the "Media Services v3" Postman collection that you downloaded, select **Assets**->**Create or update Asset Filter**.
-
-The **PUT** HTTP request method is similar to:
-
-```
-PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Media/mediaServices/{accountName}/assets/{assetName}/assetFilters/{filterName}?api-version=2018-07-01
-```
-
-Select the **Body** tab and paste the JSON code you [defined earlier](#define-a-filter).
-
-Select **Send**.
-
-The asset filter has been created.
-
-For details on how to create or update asset filters, see [Create or update](/rest/api/media/assetfilters/createorupdate). Also, see [JSON examples for filters](/rest/api/media/assetfilters/createorupdate#create-an-asset-filter).
-
-## Associate filters with Streaming Locator
-
-You can specify a list of asset or account filters, which would apply to your Streaming Locator. The [Dynamic Packager (Streaming Endpoint)](encode-dynamic-packaging-concept.md) applies this list of filters together with those your client specifies in the URL. This combination generates a [Dynamic Manifest](filters-dynamic-manifest-concept.md), which is based on filters in the URL + filters you specify on Streaming Locator. We recommend that you use this feature if you want to apply filters but do not want to expose the filter names in the URL.
-
-To create and associate filters with a Streaming Locator using REST, use the [Streaming Locators - Create](/rest/api/media/streaminglocators/create) API and specify `properties.filters` in the [Request Body](/rest/api/media/streaminglocators/create#request-body).
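-
-A minimal request body sketch (the asset, policy, and filter names are placeholders):
-
-```json
-{
-    "properties": {
-        "assetName": "assetName",
-        "streamingPolicyName": "Predefined_ClearStreamingOnly",
-        "filters": [
-            "filterName1",
-            "filterName2"
-        ]
-    }
-}
-```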
-
-## Stream using filters
-
-Once you define filters, your clients can use them in the streaming URL. Filters can be applied to adaptive bitrate streaming protocols: Apple HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming.
-
-The following table shows some examples of URLs with filters:
-
-|Protocol|Example|
-|||
-|HLS|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(format=m3u8-aapl,filter=myAccountFilter)`|
-|MPEG DASH|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(format=mpd-time-csf,filter=myAssetFilter)`|
-|Smooth Streaming|`https://amsv3account-usw22.streaming.media.azure.net/fecebb23-46f6-490d-8b70-203e86b0df58/bigbuckbunny.ism/manifest(filter=myAssetFilter)`|
-
-## Next steps
-
-[Stream videos](stream-files-tutorial-with-rest.md)
media-services Input Metadata Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/input-metadata-schema.md
- Title: Azure Media Services v3 input metadata schema
-description: This article gives an overview of Azure Media Services v3 input metadata schema.
-Previously updated : 08/31/2020
-# Input metadata
-
-An encoding job is associated with an input asset (or assets) on which you want to perform some encoding tasks. Upon completion of a task, an output asset is produced. The output asset contains video, audio, thumbnails, manifest, and other files.
-
-The output asset also contains a file with metadata about the input asset. The name of the metadata JSON file contains a random ID; don't use it to identify the input asset that the output asset belongs to. To identify the input asset it belongs to, use the `Uri` field (for more information, see [Other child elements](#other-child-elements)).
-
-Media Services does not preemptively scan input assets to generate metadata. Input metadata is generated only as an artifact when an input asset is processed in a Job. Hence this artifact is written to the output asset. Different tools are used to generate metadata for input assets and output assets. Therefore, the input metadata has a slightly different schema than the output metadata.
-
-This article discusses the elements and types of the JSON schema on which the input metadata (&lt;asset_id&gt;_metadata.json) is based. For information about the file that contains metadata about the output asset, see [Output metadata](output-metadata-schema.md).
-
-You can find the JSON schema example at the end of this article.
-
-## AssetFile
-
-Contains a collection of AssetFile elements for the encoding job.
-
-> [!NOTE]
-> The following child elements must appear in a sequence.
-
-| Name | Description |
-| | |
-| **VideoTracks**|Each physical asset file can contain zero or more video tracks interleaved into an appropriate container format. For more information, see [VideoTracks](#videotracks). |
-| **AudioTracks**|Each physical asset file can contain zero or more audio tracks interleaved into an appropriate container format. For more information, see [AudioTracks](#audiotracks). |
-| **Metadata** |Asset file's metadata represented as key/value strings. <br />For example: `<Metadata key="language" value="eng" />` |
-
-### Other child elements
-
-| Name | Description |
-| | |
-| **Name**<br />Required |Asset file name. <br /><br />Example: `"Name": "Ignite-short.mp4"` |
-| **Uri**<br />Required |The URL where the input asset is located. To identify the input asset the output asset belongs to, use the `Uri` field instead of ID.|
-| **Size**<br />Required |Size of the asset file in bytes. <br /><br />Example: `"Size": 75739259`|
-| **Duration**<br />Required |Content play back duration. <br /><br />Example: `"Duration": "PT1M10.304S"`. |
-| **NumberOfStreams**<br />Required |Number of streams in the asset file. <br /><br />Example: `"NumberOfStreams": 2`|
-| **FormatNames**<br />Required |Format names. <br /><br />Example: `"FormatNames": "mov,mp4,m4a,3gp,3g2,mj2"`|
-| **FormatVerboseName**<br /> Required |Format verbose names. <br /><br />Example: `"FormatVerboseName": "QuickTime / MOV"` |
-| **StartTime** |Content start time. <br /><br />Example: `"StartTime": "PT0S"` |
-| **OverallBitRate** |Average bitrate of the asset file in bits per second. <br /><br />Example: `"OverallBitRate": 8618539`|
-
-## VideoTracks
-
-| Name | Description |
-| | |
-| **FourCC**<br />Required |Video codec FourCC code that is reported by ffmpeg.<br /><br />Example: `"FourCC": "avc1" | "hev1" | "hvc1"` |
-| **Profile** |Video track's profile. <br /><br />Example: `"Profile": "Main"`|
-| **Level** |Video track's level. <br /><br />Example: `"Level": "3.2"`|
-| **PixelFormat** |Video track's pixel format. <br /><br />Example: `"PixelFormat": "yuv420p"`|
-| **Width**<br />Required |Encoded video width in pixels. <br /><br />Example: `"Width": "1280"`|
-| **Height**<br />Required |Encoded video height in pixels.<br /><br />Example: `"Height": "720"` |
-| **DisplayAspectRatioNumerator**<br />Required |Video display aspect ratio numerator.<br /><br />Example: `"DisplayAspectRatioNumerator": 16.0` |
-| **DisplayAspectRatioDenominator**<br />Required |Video display aspect ratio denominator. <br /><br />Example: `"DisplayAspectRatioDenominator": 9.0`|
-| **SampleAspectRatioNumerator** |Video sample aspect ratio numerator. <br /><br />Example: `"SampleAspectRatioNumerator": 1.0`|
-| **SampleAspectRatioDenominator**|Example: `"SampleAspectRatioDenominator": 1.0`|
-| **FrameRate**<br />Required |Measured video frame rate in .3f format. <br /><br />Example: `"FrameRate": 29.970`|
-| **Bitrate** |Average video bit rate in bits per second, as calculated from the asset file. Only the elementary stream payload is counted, and the packaging overhead is not included. <br /><br />Example: `"Bitrate": 8421583`|
-| **HasBFrames** |Video track number of B frames. <br /><br />Example: `"HasBFrames": 2`|
-| **Metadata** |Generic key/value strings that can be used to hold a variety of information. <br />See the full example at the end of the article. |
-| **Id**<br />Required |Zero-based index of this audio or video track.<br /><br /> This **Id** is not necessarily the TrackID as used in an MP4 file. <br /><br />Example: `"Id": 2`|
-| **Codec** |Video track codec string. <br /><br />Example: `"Codec": "h264 | hev1"`|
-| **CodecLongName** |Audio or video track codec long name. <br /><br />Example: `"CodecLongName": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10"`|
-| **TimeBase**<br />Required |Time base.<br /><br />Example: `"TimeBase": "1/30000"`|
-| **NumberOfFrames** |Number of frames (present for video tracks). <br /><br />Example: `"NumberOfFrames": 2107`|
-| **StartTime** |Track start time.<br /><br />Example: `"StartTime": "PT0.033S"` |
-| **Duration** |Track duration. <br /><br />Example: `"Duration": "PT1M10.304S"`|
-
-## AudioTracks
-
-| Name | Description |
-| | |
-| **SampleFormat** |Sample format. <br /><br />Example: `"SampleFormat": "fltp"`|
-| **ChannelLayout** |Channel layout. <br /><br />Example: `"ChannelLayout": "stereo"`|
-| **Channels**<br />Required |Number (0 or more) of audio channels. <br /><br />Example: `"Channels": 2`|
-| **SamplingRate**<br />Required |Audio sampling rate in samples/sec or Hz. <br /><br />Example: `"SamplingRate": 48000`|
-| **Bitrate** |Average audio bit rate in bits per second, as calculated from the asset file. Only the elementary stream payload is counted, and the packaging overhead is not included in this count. <br /><br />Example: `"Bitrate": 192080`|
-| **Metadata** |Generic key/value strings that can be used to hold a variety of information. <br />See the full example at the end of the article. |
-| **Id**<br />Required |Zero-based index of this audio or video track.<br /><br /> This **Id** is not necessarily the TrackID as used in an MP4 file. <br /><br />Example: `"Id": 1`|
-| **Codec** |Audio track codec string. <br /><br />Example: `"Codec": "aac"`|
-| **CodecLongName** |Audio or video track codec long name. <br /><br />Example: `"CodecLongName": "AAC (Advanced Audio Coding)"`|
-| **TimeBase**<br />Required |Time base.<br /><br />Example: `"TimeBase": "1/48000"` |
-| **NumberOfFrames** |Number of frames. <br /><br />Example: `"NumberOfFrames": 3294`|
-| **StartTime** |Track start time. For more information, see [ISO8601](https://www.iso.org/iso-8601-date-and-time-format.html). <br /><br />Example: `"StartTime": "PT0S"` |
-| **Duration** |Track duration. <br /><br />Example: `"Duration": "PT1M10.272S"` |
-
-## Metadata
-
-| Name | Description |
-| | |
-| **key**<br />Required |The key in the key/value pair. |
-| **value**<br /> Required |The value in the key/value pair. |
-
-## Schema example
-
-```json
-{
- "AssetFile": [
- {
- "VideoTracks": [
- {
- "FourCC": "avc1",
- "Profile": "Main",
- "Level": "3.2",
- "PixelFormat": "yuv420p",
- "Width": "1280",
- "Height": "720",
- "DisplayAspectRatioNumerator": 16.0,
- "DisplayAspectRatioDenominator": 9.0,
- "SampleAspectRatioNumerator": 1.0,
- "SampleAspectRatioNumeratorSpecified": true,
- "SampleAspectRatioDenominator": 1.0,
- "SampleAspectRatioDenominatorSpecified": true,
- "FrameRate": 29.970,
- "Bitrate": 8421583,
- "BitrateSpecified": true,
- "HasBFrames": 2,
- "HasBFramesSpecified": true,
- "Disposition": {
- "Default": 1
- },
- "Metadata": [
- {
- "key": "creation_time",
- "value": "2018-02-21T21:42:08.000000Z"
- },
- {
- "key": "language",
- "value": "eng"
- },
- {
- "key": "handler_name",
- "value": "Video Media Handler"
- },
- {
- "key": "encoder",
- "value": "AVC Coding"
- }
- ],
- "Id": 2,
- "Codec": "h264",
- "CodecLongName": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
- "TimeBase": "1/30000",
- "NumberOfFrames": 2107,
- "NumberOfFramesSpecified": true,
- "StartTime": "PT0.033S",
- "Duration": "PT1M10.304S"
- }
- ],
- "AudioTracks": [
- {
- "SampleFormat": "fltp",
- "ChannelLayout": "stereo",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 192080,
- "BitrateSpecified": true,
- "BitsPerSampleSpecified": true,
- "Disposition": {
- "Default": 1
- },
- "Metadata": [
- {
- "key": "creation_time",
- "value": "2018-02-21T21:42:08.000000Z"
- },
- {
- "key": "language",
- "value": "eng"
- },
- {
- "key": "handler_name",
- "value": "Sound Media Handler"
- }
- ],
- "Id": 1,
- "Codec": "aac",
- "CodecLongName": "AAC (Advanced Audio Coding)",
- "TimeBase": "1/48000",
- "NumberOfFrames": 3294,
- "NumberOfFramesSpecified": true,
- "StartTime": "PT0S",
- "Duration": "PT1M10.272S"
- }
- ],
- "Metadata": [
- {
- "key": "major_brand",
- "value": "mp42"
- },
- {
- "key": "minor_version",
- "value": "19529854"
- },
- {
- "key": "compatible_brands",
- "value": "mp42isom"
- },
- {
- "key": "creation_time",
- "value": "2018-02-21T21:42:08.000000Z"
- }
- ],
- "Name": "Ignite-short.mp4",
- "Uri": "https://amsstorageacct.blob.core.windows.net/asset-00000000-0000-0000-000000000000/ignite.mp4",
- "Size": 75739259,
- "Duration": "PT1M10.304S",
- "NumberOfStreams": 2,
- "FormatNames": "mov,mp4,m4a,3gp,3g2,mj2",
- "FormatVerboseName": "QuickTime / MOV",
- "StartTime": "PT0S",
- "OverallBitRate": 8618539,
- "OverallBitRateSpecified": true
- }
- ]
-}
-```
-
-## Next steps
-
-[Output metadata](output-metadata-schema.md)
media-services Integrate Azure Functions Dotnet How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/integrate-azure-functions-dotnet-how-to.md
- Title: Develop Azure Functions with Media Services v3
-description: This article shows how to start developing Azure Functions with Media Services v3 using Visual Studio Code.
-Previously updated : 06/09/2021
-# Develop Azure Functions with Media Services v3
-
-This article shows you how to get started with creating Azure Functions that use Media Services. The Azure Function defined in this article encodes a video file with Media Encoder Standard. As soon as the encoding job has been created, the function returns the job name and output asset name. To review Azure Functions, see [Overview](../../azure-functions/functions-overview.md) and other topics in the **Azure Functions** section.
-
-If you want to explore and deploy existing Azure Functions that use Azure Media Services, check out [Media Services Azure Functions](https://github.com/Azure-Samples/media-services-v3-dotnet-core-functions-integration). This repository contains examples that use Media Services to show workflows related to ingesting content directly from blob storage, encoding, and live streaming operations.
-
-## Prerequisites
-
-- Before you can create your first function, you need to have an active Azure account. If you don't already have an Azure account, [free accounts are available](https://azure.microsoft.com/free/).
-- If you are going to create Azure Functions that perform actions on your Azure Media Services (AMS) account or listen to events sent by Media Services, you should create an AMS account, as described [here](account-create-how-to.md).
-- Install [Visual Studio Code](https://code.visualstudio.com/) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
-
-This article explains how to create a C# .NET 5 function that communicates with Azure Media Services. To create a function in another language, see this [article](../../azure-functions/functions-develop-vs-code.md).
-
-### Run local requirements
-
-These prerequisites are only required to run and debug your functions locally. They aren't required to create or publish projects to Azure Functions.
-
-- [.NET Core 3.1 and .NET 5 SDKs](https://dotnet.microsoft.com/download/dotnet).
-
-- The [Azure Functions Core Tools](../../azure-functions/functions-run-local.md#install-the-azure-functions-core-tools) version 3.x or later. The Core Tools package is downloaded and installed automatically when you start the project locally. Core Tools includes the entire Azure Functions runtime, so download and installation might take some time.
-
-- The [C# extension](https://marketplace.visualstudio.com/items?itemName=ms-dotnettools.csharp) for Visual Studio Code.
-
-## Install the Azure Functions extension
-
-You can use the Azure Functions extension to create and test functions and deploy them to Azure.
-
-1. In Visual Studio Code, open **Extensions** and search for **Azure functions**, or select this link in Visual Studio Code: [`vscode:extension/ms-azuretools.vscode-azurefunctions`](vscode:extension/ms-azuretools.vscode-azurefunctions).
-
-1. Select **Install** to install the extension for Visual Studio Code:
-
- ![Install the extension for Azure Functions](./Media/integrate-azure-functions-dotnet-how-to/vscode-install-extension.png)
-
-1. After installation, select the Azure icon on the Activity bar. You should see an Azure Functions area in the Side Bar.
-
- ![Azure Functions area in the Side Bar](./Media/integrate-azure-functions-dotnet-how-to/azure-functions-window-vscode.png)
-
-## Create an Azure Functions project
-
-The Functions extension lets you create a function app project, along with your first function. The following steps show how to create an HTTP-triggered function in a new Functions project. HTTP trigger is the simplest function trigger template to demonstrate.
-
-1. From **Azure: Functions**, select the **Create Function** icon:
-
- ![Create a function](./Media/integrate-azure-functions-dotnet-how-to/create-function.png)
-
-1. Select the folder for your function app project, then select **C#** for your function project language and **.NET 5 Isolated** for the runtime.
-
-1. Select the **HTTP trigger** function template.
-
- ![Choose the HTTP trigger template](./Media/integrate-azure-functions-dotnet-how-to/create-function-choose-template.png)
-
-1. Type **HttpTriggerEncode** for the function name and select Enter. Accept **Company.Function** for the namespace, then select **Function** for the access rights. This authorization level requires you to provide a [function key](../../azure-functions/functions-bindings-http-webhook-trigger.md#authorization-keys) when you call the function endpoint.
-
- ![Select Function authorization](./Media/integrate-azure-functions-dotnet-how-to/create-function-auth.png)
-
- A function is created in your chosen language and in the template for an HTTP-triggered function.
-
- ![HTTP-triggered function template in Visual Studio Code](./Media/integrate-azure-functions-dotnet-how-to/new-function-full.png)
-
-## Install Media Services and other extensions
-
-Run the `dotnet add package` command in the Terminal window to install the extension packages that you need in your project. The following commands install the Media Services package and the other extensions needed by the sample.
-
-```bash
-dotnet add package Azure.Storage.Blobs
-dotnet add package Microsoft.Azure.Management.Media
-dotnet add package Azure.Identity
-```
-
-## Generated project files
-
-The project template creates a project in your chosen language and installs required dependencies. The new project has these files:
-
-* **host.json**: Lets you configure the Functions host. These settings apply when you're running functions locally and when you're running them in Azure. For more information, see [host.json reference](./../../azure-functions/functions-host-json.md).
-
-* **local.settings.json**: Maintains settings used when you're running functions locally. These settings are used only when you're running functions locally.
-
- >[!IMPORTANT]
- >Because the local.settings.json file can contain secrets, you need to exclude it from your project source control.
-
-* **HttpTriggerEncode.cs** class file that implements the function.
-
-### HttpTriggerEncode.cs
-
-This is the C# code for your function. Its role is to take a Media Services asset or a source URL and launch an encoding job with Media Services. It uses a Transform that is created if it doesn't exist; when the Transform is created, it uses the preset provided in the input body.
-
->[!IMPORTANT]
->Replace the full content of HttpTriggerEncode.cs file with [`HttpTriggerEncode.cs` from this repository](https://github.com/Azure-Samples/media-services-v3-dotnet-core-functions-integration/blob/main/Tutorial/HttpTriggerEncode.cs).
-
-Once you are done defining your function, select **Save and Run**.
-
-The source code for the **Run** method of the function is:
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-core-functions-integration/Tutorial/HttpTriggerEncode.cs#Run)]
-
-### local.settings.json
-
-Update the file with the following content (and replace the values).
-
-```json
-{
- "IsEncrypted": false,
- "Values": {
- "AzureWebJobsStorage": "",
- "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
- "AadClientId": "00000000-0000-0000-0000-000000000000",
- "AadEndpoint": "https://login.microsoftonline.com",
- "AadSecret": "00000000-0000-0000-0000-000000000000",
- "AadTenantId": "00000000-0000-0000-0000-000000000000",
- "AccountName": "amsaccount",
- "ArmAadAudience": "https://management.core.windows.net/",
- "ArmEndpoint": "https://management.azure.com/",
- "ResourceGroup": "amsResourceGroup",
- "SubscriptionId": "00000000-0000-0000-0000-000000000000"
- }
-}
-```
-
-## Test your function
-
-When you run the function locally in VS Code, the function should be exposed as:
-
-```url
-http://localhost:7071/api/HttpTriggerEncode
-```
-
-To test it, you can use Postman to do a POST on this URL using a JSON input body.
-
-JSON input body example:
-
-```json
-{
- "inputUrl":"https://nimbuscdn-nimbuspm.streaming.mediaservices.windows.net/2b533311-b215-4409-80af-529c3e853622/Ignite-short.mp4",
- "transformName" : "TransformAS",
- "builtInPreset" :"AdaptiveStreaming"
- }
-```
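-
-For example, with curl (assuming the function is running locally; when it's deployed to Azure with **Function** authorization, you also need to append the function key as a `code` query parameter):
-
-```bash
-curl -X POST http://localhost:7071/api/HttpTriggerEncode \
-  -H "Content-Type: application/json" \
-  -d '{"inputUrl":"https://nimbuscdn-nimbuspm.streaming.mediaservices.windows.net/2b533311-b215-4409-80af-529c3e853622/Ignite-short.mp4","transformName":"TransformAS","builtInPreset":"AdaptiveStreaming"}'
-```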
-
-The function should return 200 OK with an output body containing the job and output asset names.
-
-![Test the function with Postman](./Media/integrate-azure-functions-dotnet-how-to/postman.png)
-
-## Next steps
-
-At this point, you're ready to start developing functions that call the Media Services API.
-
-For more information and a complete sample of using Azure Functions with Azure Media Services v3, see the [Media Services v3 Azure Functions sample](https://github.com/Azure-Samples/media-services-v3-dotnet-core-functions-integration/tree/main/Functions).
media-services Integrate Event Grid Log Analytics Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/integrate-event-grid-log-analytics-tutorial.md
- Title: Store Media Services events in Azure Log Analytics
-description: Learn how to store Azure Media Services events in Azure Log Analytics.
-Previously updated : 08/24/2020
-# Tutorial: Store Azure Media Services events in Azure Log Analytics
-
-## Azure Media Services Events
-
-Azure Media Services v3 emits events on [Azure Event Grid](monitoring/media-services-event-schemas.md). You can subscribe to events in many ways and store them in data stores. In this tutorial, you'll subscribe to Media Services events using a [Logic App](https://azure.microsoft.com/services/logic-apps/) flow. The Logic App will be triggered for each event and store the body of the event in Azure Log Analytics. Once the events are in Azure Log Analytics, you can use other Azure services to create a dashboard, monitor, and alert on these events, though we won't be covering that in this tutorial.
-
-> [!NOTE]
-> It would be helpful if you're already familiar with using FFmpeg as your on-premises encoder. If not, that's okay. The command line and instructions for streaming a video are included below.
-
-You will learn how to:
-
-> [!div class="checklist"]
-> * Create a no-code Logic App flow
-> * Subscribe to Azure Media Services event topics
-> * Parse events and store to Azure Log Analytics
-> * Query events from Azure Log Analytics
-
-If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-
-## Prerequisites
-
-> * An [Azure subscription](setup-azure-subscription-how-to.md)
-> * A [Media Services](account-create-how-to.md) account and resource group.
-> * An installation of [FFmpeg](https://ffmpeg.org/download.html) for your OS.
-> * A [Log Analytics](../../azure-monitor/logs/quick-create-workspace.md) workspace
-
-## Subscribe to a Media Services event with Logic App
-
-1. In the Azure portal, if you haven't done so already, create a [Log Analytics](../../azure-monitor/logs/quick-create-workspace.md) workspace. You'll need the Workspace ID and one of the keys, so keep that browser window open. Then, open the portal in another tab or window.
-
-1. Navigate to your Azure Media Services account and select **Events**. This will show all the methods for subscribing to Azure Media Services events.
- > [!div class="mx-imgBorder"]
- > ![Azure Media Services Portal](media/tutorial-events-log-analytics/select-events-01a.png)
-
-1. Select the **Logic Apps icon** to create a Logic App. This will open the Logic App Designer where you can create the flow to capture the events and push them to Log Analytics.
- > [!div class="mx-imgBorder"]
- > ![Create Logic App](media/tutorial-events-log-analytics/select-logic-app-02.png)
-
-1. Select the **+ icon**, select the tenant you want to use, then select Sign in. You will see a Microsoft sign-in prompt.
- > [!div class="mx-imgBorder"]
- > ![Connect to Azure Event Grid](media/tutorial-events-log-analytics/select-event-add-grid-03.png)
-![Select the tenant](media/tutorial-events-log-analytics/select-tenant-03a.png)
-
-1. Select **Continue** to subscribe to the Media Services Events.
- > [!div class="mx-imgBorder"]
- > ![Connected to Azure Event Grid](media/tutorial-events-log-analytics/select-continue-04.png)
-
-1. In the **Resource Type** list, locate "Microsoft.Media.MediaServices".
- > [!div class="mx-imgBorder"]
- >![Azure Media Services Resource Events](media/tutorial-events-log-analytics/locate-azure-media-services-events-05.png)
-
-1. Select the **Event Type item**. There will be a list of all the events Azure Media Services emits. You can select the events you would like to track. You can add multiple event types. (Later, you will make a small change to the Logic App flow to store each event type in a separate Log Analytics Log and propagate the Event Type name to the Log Analytics Log name dynamically.)
- > [!div class="mx-imgBorder"]
- > ![Azure Media Services Event Type](media/tutorial-events-log-analytics/select-azure-media-services-event-type-06.png)
-
-1. Select **Save As**.
-
-1. Give your Logic App a name. The resource group is selected by default. Leave the other settings the way they are, then select **Create**. You will be returned to the Azure home screen.
- > [!div class="mx-imgBorder"]
- > ![Logic app naming interface](media/tutorial-events-log-analytics/give-logic-app-name-06a.png)
-
-## Create an action
-
-Now that you are subscribed to the event(s), create an action.
-
-1. If the portal has taken you back to the home screen, navigate back to the Logic App you just created by searching All resources for the app name.
-
-1. Select your app, then select **Logic app designer**. The designer pane will open.
-
-1. Select **+ New Step**.
-
-1. Since you want to push the events to the Azure Log Analytics service, search for "Azure Log Analytics" and select the "Azure Log Analytics Data Collector".
- > [!div class="mx-imgBorder"]
- > ![Azure Log Analytics Data Collector](media/tutorial-events-log-analytics/select-azure-log-analytics-data-collector-07.png)
-
-1. To connect to the Log Analytics Workspace, you need the Workspace ID and an Agent Key. Open the Azure portal in a new tab or window, navigate to the Log Analytics Workspace you created before the start of this tutorial.
- > [!div class="mx-imgBorder"]
- > ![Azure Log Analytics Workspace ID](media/tutorial-events-log-analytics/log-analytics-workspace-id-08.png)
-
-1. On the left menu, locate **Agents Management** and select it. This will show you the agent keys that have been generated.
- > [!div class="mx-imgBorder"]
- > ![Azure Log Analytics Agents management](media/tutorial-events-log-analytics/select-agents-management-09.png)
-
-1. Copy the *Workspace ID*.
- > [!div class="mx-imgBorder"]
- > ![Copy Workspace ID](media/tutorial-events-log-analytics/copy-workspace-id.png)
-
-1. In the other browser tab or window, under the Azure Log Analytics Data Collector select **Send Data**, give your connection a name, then paste the *Workspace ID* in the **Workspace ID** field.
-
-1. Return to the Workspace browser tab or window and copy the *Workspace Key*.
- > [!div class="mx-imgBorder"]
- > ![Agents management primary key](media/tutorial-events-log-analytics/agents-management-primary-key-10.png)
-
-1. In the other browser tab or window, paste the *Workspace Key* in the **Workspace Key** field.
-
-1. Select **Create**. Now you will create the JSON request body and the Custom Log Name.
-
-1. Select the **JSON Request body** field. A link to **Add dynamic content** will appear.
-
-1. Select **Add dynamic content** and then select **Topic**.
-
-1. Do the same for **Custom Log Name**.
- > [!div class="mx-imgBorder"]
- > ![Topic selected](media/tutorial-events-log-analytics/topic-selected.png)
-
-1. Select **Code View** of the Logic App. Look for the Inputs and Log-Type lines.
- > [!div class="mx-imgBorder"]
- > ![Code view of two lines](media/tutorial-events-log-analytics/code-view-two-lines.png)
-
-1. Change the `body` value from `"@triggerBody()?['topic']"` to `"@{triggerBody()}"`. This passes the entire event message to Log Analytics rather than only the topic.
-
-1. Change the `Log-Type` value from `"@triggerBody()?['topic']"` to `"@replace(triggerBody()?['eventType'],'.','')"`. This strips out the "." characters, which aren't allowed in Log Analytics log names. (A sketch of the resulting action definition follows this list.)
- > [!div class="mx-imgBorder"]
- > ![Logic App json after change](media/tutorial-events-log-analytics/changed-lines.png)
-
-1. Select **Save**.
-
-1. To verify, select **Logic app designer**.
- > [!div class="mx-imgBorder"]
- > ![Verify Body and Function steps](media/tutorial-events-log-analytics/verify-changes-to-json.png)
-
-1. When you examine all the resources in the resource group, there will be a Logic App and two Logic App API connectors listed, one for the Events and one for Log Analytics. For more information about Event Grid system topics, read [Event Grid System Topics](../../event-grid/system-topics.md).
- > [!div class="mx-imgBorder"]
- > ![See all new resources in Resource Group](media/tutorial-events-log-analytics/contoso-rg-listing.png)
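-
-For reference, here is a minimal sketch of what the edited **Send Data** action can look like in code view. This is illustrative only; the action and connection names come from your own Logic App, and the key lines are the `body` and `Log-Type` values changed above.
-
-```json
-"Send_Data": {
-    "type": "ApiConnection",
-    "inputs": {
-        "host": {
-            "connection": {
-                "name": "@parameters('$connections')['azureloganalyticsdatacollector']['connectionId']"
-            }
-        },
-        "method": "post",
-        "path": "/api/logs",
-        "body": "@{triggerBody()}",
-        "headers": {
-            "Log-Type": "@replace(triggerBody()?['eventType'],'.','')"
-        }
-    }
-}
-```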
-
-## Test
-
-To test how this works, create a live event in Azure Media Services. Create an RTMP live event and use FFmpeg to push a "live" stream based on an .mp4 sample file. After the event is created, get the RTMP ingest URL.
-
-1. From your Media Services account, select **Live streaming**.
- > [!div class="mx-imgBorder"]
- > ![Create an Azure Media Services Live Event](media/tutorial-events-log-analytics/live-event.png)
-
-1. Select **Add live event**.
-
-1. Enter a name into the **Live event name** field. (The **Description** field is optional.)
-
-1. Select **Standard** cloud encoding.
-
-1. Select **Default 720p** for the encoding preset.
-
-1. Select **RTMP** input protocol.
-
-1. Select **Yes** for the persistent input URL.
-
-1. Select **Yes** to start the event when the event is created.
-
- > [!WARNING]
- > Billing will start if you select Yes. If you want to wait to start the stream until *just before* you start streaming with FFmpeg, select **No** and remember to start your live event then.
-
- > [!div class="mx-imgBorder"]
- > ![Live event settings](media/tutorial-events-log-analytics/live-event-settings.png)
-
-1. Select **I have all the rights to use the content/file...** checkbox.
-
-1. Select **Review + create**.
-
-1. Review your settings, then select **Create**. The live event listing will appear and the live event Ingest URL will be shown.
-
-1. Copy the **Ingest URL** to your clipboard.
-
-1. Select the **live event** in the listing to see the Producer view.
-
-### Stream with FFmpeg CLI
-
-1. Use the following command line.
-
- ```AzureCLI
- ffmpeg -i <localpathtovideo> -map 0 -c:v libx264 -c:a copy -f flv <ingestURL>/mystream
- ```
-
-1. Change `<ingestURL>` to the Ingest URL you copied to your clipboard.
-1. Change `<localpathtovideo>` to the local path of the file you want to stream with FFmpeg.
-1. Replace `mystream` at the end of the URL with a unique stream name.
-1. Adjust the command line to reflect your test source file and any other system variables. (A filled-in example follows these steps.)
-1. Run the command. After a couple of seconds, the "Producer view" player should start streaming. (Refresh the player if the video doesn't show up automatically.)
-
- > [!div class="mx-imgBorder"]
- > ![Verify proper video ingest in Producer Preview Player](media/tutorial-events-log-analytics/live-event-producer-view.png)
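-
-For example, with hypothetical values filled in (a local `sample.mp4` and a made-up ingest URL), the command looks something like this:
-
-```bash
-ffmpeg -i sample.mp4 -map 0 -c:v libx264 -c:a copy -f flv "rtmps://mylive-example.channel.media.azure.net:2935/live/0123456789abcdef0123456789abcdef/mystream"
-```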
-
-## Verify the events
-
-With the live stream, Azure Media Services is emitting various events that are triggering the Logic App flow. To verify, navigate to the Logic App and determine if there are any triggers being fired by the events from Media Services.
-
-1. Navigate to the Logic App Overview page, where the **Run History** section lists jobs that have completed successfully.
- > [!div class="mx-imgBorder"]
- > ![Verify successful job execution in Logic App](media/tutorial-events-log-analytics/run-history.png)
-
-1. Select a successful job. The details of the job during runtime are shown.
-1. Select **Send Data** to expand it. In this case, the `MicrosoftMediaLiveEventEncoderConnected` event shows that it was captured as well as the parsed body. This is what is pushed to the Azure Log Analytics Workspace.
- > [!div class="mx-imgBorder"]
- > ![See send data](media/tutorial-events-log-analytics/verify-send-data.png)
-
-## Verify the logs
-
-1. Navigate to the Log Analytics Workspace you created earlier.
-
-1. Select **Logs**.
-1. Close the Example queries popup.
-1. There will be a Custom Logs listing. Select the down arrow to expand it. There you will see the event name `MicrosoftMediaLiveEventEncoderConnected`.
-1. Select the event name to expand it.
-1. When you select the "eye" icon, it will show a preview of the query result.
-1. Select **See in query editor** and then select the item under **TimeGenerated UTC** listing to expand it and view the raw data.
-
-![See detailed Event output in Log Analytics](media/tutorial-events-log-analytics/raw-data.png)
-
-## Delete resources
-
-If you don't want to continue to use the resources you created during this tutorial, make sure you delete all of the resources in the resource group or you will continue to be charged.
-
-## Next steps
-
-You can create different queries and save them. These can be added to [Azure Dashboard](../../azure-monitor/visualize/tutorial-logs-dashboards.md).
media-services Job Cancel How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-cancel-how-to.md
- Title: Cancel a job
-description: This article shows how to cancel a job.
----- Previously updated : 03/10/2022---
-# Cancel a job
--
-## Methods
-
-Use the following methods to cancel a job.
-
-## [CLI](#tab/cli/)
--
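-
-A minimal sketch, assuming the `az ams job` command group and hypothetical resource names:
-
-```azurecli
-az ams job cancel \
-    --resource-group myResourceGroup \
-    --account-name myAmsAccount \
-    --transform-name myTransform \
-    --name myJob
-```
-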
-## [REST](#tab/rest/)
--
-## [Python](#tab/python/)
---
media-services Job Create How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-create-how-to.md
- Title: Create a job with Media Services
-description: The article shows how to create a Media Services job using different methods.
----- Previously updated : 03/11/2022---
-# Create a job
--
-The article shows how to create a Media Services job using different methods.
--
-## Prerequisites
-
-[Create a Media Services account](./account-create-how-to.md).
-
-## [Portal](#tab/portal/)
--
-## [CLI](#tab/cli/)
--
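-
-A minimal sketch, assuming the `az ams job` command group; in the CLI, a job is created and queued with `az ams job start` (the names are hypothetical, and the trailing `=` marks an output asset without a label):
-
-```azurecli
-az ams job start \
-    --resource-group myResourceGroup \
-    --account-name myAmsAccount \
-    --transform-name myTransform \
-    --name myJob \
-    --input-asset-name myInputAsset \
-    --output-assets myOutputAsset=
-```
-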
-## [REST](#tab/rest/)
--
-## [Python](#tab/python/)
---
media-services Job Delete How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-delete-how-to.md
- Title: Delete a job
-description: This article shows how to delete a job.
----- Previously updated : 03/10/2022---
-# Delete a job
---
-## Methods
-
-Use the following methods to delete a job.
-
-## [CLI](#tab/cli/)
--
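-
-A minimal sketch, assuming the `az ams job` command group and hypothetical resource names:
-
-```azurecli
-az ams job delete \
-    --resource-group myResourceGroup \
-    --account-name myAmsAccount \
-    --transform-name myTransform \
-    --name myJob
-```
-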
-## [REST](#tab/rest/)
--
-## [Python](#tab/python/)
---
media-services Job Download Results How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-download-results-how-to.md
- Title: Download the results of a job - Azure Media Services
-description: This article demonstrates how to download the results of a job.
---- Previously updated : 03/09/2022---
-# Download the results of a job
--
-In Azure Media Services, when processing your videos (for example, encoding or analyzing), you need to create an output [asset](assets-concept.md) to store the result of your [job](transform-jobs-concept.md). You can then download these results to a local folder using the Media Services and Storage APIs.
-
-This article demonstrates how to download the results using the .NET SDK.
-
-## Methods
-
-## [.NET](#tab/net/)
-
-```csharp
-/// <summary>
-/// Use Media Service and Storage APIs to download the output files to a local folder
-/// </summary>
-/// <param name="client">The Media Services client.</param>
-/// <param name="resourceGroupName">The name of the resource group within the Azure subscription.</param>
-/// <param name="accountName">The Media Services account name.</param>
-/// <param name="assetName">The asset name.</param>
-/// <param name="resultsFolder">The output folder name for downloaded files.</param>
-/// <returns>A task.</returns>
-private async static Task DownloadResults(IAzureMediaServicesClient client, string resourceGroupName, string accountName, string assetName, string resultsFolder)
-{
- AssetContainerSas assetContainerSas = client.Assets.ListContainerSas(
- resourceGroupName,
- accountName,
- assetName,
- permissions: AssetContainerPermission.Read,
- expiryTime: DateTime.UtcNow.AddHours(1).ToUniversalTime()
- );
-
- Uri containerSasUrl = new Uri(assetContainerSas.AssetContainerSasUrls.FirstOrDefault());
- CloudBlobContainer container = new CloudBlobContainer(containerSasUrl);
-
- string directory = Path.Combine(resultsFolder, assetName);
- Directory.CreateDirectory(directory);
-
- Console.WriteLine("Downloading results to {0}.", directory);
-
-    var blobs = await container.ListBlobsSegmentedAsync(null, true, BlobListingDetails.None, 200, null, null, null);
-
- foreach (var blobItem in blobs.Results)
- {
-        if (blobItem is CloudBlockBlob blob)
-        {
- string filename = Path.Combine(directory, blob.Name);
-
- await blob.DownloadToFileAsync(filename, FileMode.Create);
- }
- }
-
- Console.WriteLine("Download complete.");
-}
-```
-
-## Code sample
-
-See the full code sample: [EncodingWithMESPredefinedPreset](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/VideoEncoding/Encoding_PredefinedPreset/Program.cs)
--
media-services Job Error Codes Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-error-codes-reference.md
- Title: Job (encoding and analyzing) error codes
-description: This article links to job error codes reference topic and gives useful links to related topics.
------ Previously updated : 08/31/2020---
-# Media Services job error codes
--
-This topic links to a REST reference document for detailed descriptions of [Job](transform-jobs-concept.md) error codes and messages.
-
-## Job error codes
-
-The following REST document gives detailed explanations about [Job error codes](/rest/api/media/jobs/get#joberrorcode).
-
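-In the v3 .NET SDK, failed API calls surface these codes through an `ApiErrorException`. As a rough sketch (the client and names here are hypothetical):
-
-```csharp
-try
-{
-    Job job = await client.Jobs.GetAsync(resourceGroupName, accountName, transformName, jobName);
-}
-catch (ApiErrorException ex)
-{
-    // The same error code and message described in the REST reference
-    // are available on the exception body.
-    Console.Error.WriteLine($"Code: {ex.Body.Error.Code}, Message: {ex.Body.Error.Message}");
-}
-```
-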
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## See also
-
-- [Streaming Endpoint error codes](stream-streaming-endpoint-error-codes-reference.md)
-- [Azure Media Services concepts](concepts-overview.md)
-- [Quotas and limits](limits-quotas-constraints-reference.md)
-
-## Next steps
-
-[Example: access ErrorCode and Message from ApiException with .NET](configure-connect-dotnet-howto.md#connect-to-the-net-client)
media-services Job Input From Http How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-input-from-http-how-to.md
- Title: Create a job input from an HTTPS URL
-description: This topic demonstrates how to create an Azure Media Services Job input from an HTTPS URL.
------ Previously updated : 08/31/2020---
-# Create a job input from an HTTPS URL
--
-In Media Services v3, when you submit Jobs to process your videos, you have to tell Media Services where to find the input video. One of the options is to specify an HTTPS URL as a job input (as shown in this example). Note that currently, AMS v3 does not support chunked transfer encoding over HTTPS URLs. For a full example, see this [GitHub sample](https://github.com/Azure-Samples/media-services-v3-dotnet-quickstarts/blob/master/AMSV3Quickstarts/EncodeAndStreamFiles/Program.cs).
-
-> [!TIP]
-> Before you start developing, review [Developing with Media Services v3 APIs](media-services-apis-overview.md) (includes information on accessing APIs, naming conventions, etc.)
-
-## Methods
-
-## [.NET](#tab/net/)
-
-## .NET sample
-
-The following code shows how to create a job with an HTTPS URL input.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-quickstarts/AMSV3Quickstarts/EncodeAndStreamFiles/Program.cs#SubmitJob)]
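-
-If the referenced sample isn't at hand, a minimal sketch of the same idea looks like this (the URL and names are hypothetical, and `client` is an authenticated `IAzureMediaServicesClient`):
-
-```csharp
-// Point the job input directly at a publicly reachable HTTPS URL.
-JobInputHttp jobInput = new JobInputHttp(files: new[] { "https://example.com/videos/ignite.mp4" });
-
-Job job = await client.Jobs.CreateAsync(
-    resourceGroupName,
-    accountName,
-    transformName,
-    jobName,
-    new Job
-    {
-        Input = jobInput,
-        Outputs = new JobOutput[] { new JobOutputAsset(outputAssetName) },
-    });
-```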
--
media-services Job Input From Local File How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-input-from-local-file-how-to.md
- Title: Create a job input from a local file
-description: This article demonstrates how to create an Azure Media Services job input from a local file.
---- Previously updated : 03/09/2022---
-# Create a job input from a local file
--
-In Media Services v3, when you submit Jobs to process your videos, you have to tell Media Services where to find the input video. The input video can be stored as a Media Service Asset, in which case you create an input asset based on a file (stored locally or in Azure Blob storage). This topic shows how to create a job input from a local file. For a full example, see this [GitHub sample](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs).
-
-## Prerequisites
-
-* [Create a Media Services account](./account-create-how-to.md).
-
-## [.NET](#tab/net/)
-
-## .NET sample
-
-The following code shows how to create an input asset and use it as the input for the job. The CreateInputAsset function performs the following actions:
-
-* Creates the Asset
-* Gets a writable [SAS URL](../../storage/common/storage-sas-overview.md) to the Asset's [container in storage](../../storage/blobs/storage-quickstart-blobs-dotnet.md#upload-a-blob-to-a-container)
-* Uploads the file into the container in storage using the SAS URL
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#CreateInputAsset)]
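-
-A condensed sketch of those three actions, assuming the v3 .NET SDK and the legacy `WindowsAzure.Storage` blob client used elsewhere in these samples (the names are hypothetical):
-
-```csharp
-// 1) Create (or update) the input Asset.
-Asset asset = await client.Assets.CreateOrUpdateAsync(resourceGroupName, accountName, assetName, new Asset());
-
-// 2) Get a writable SAS URL to the Asset's storage container.
-AssetContainerSas sas = await client.Assets.ListContainerSasAsync(
-    resourceGroupName, accountName, assetName,
-    permissions: AssetContainerPermission.ReadWrite,
-    expiryTime: DateTime.UtcNow.AddHours(4).ToUniversalTime());
-
-// 3) Upload the local file into the container using the SAS URL.
-var container = new CloudBlobContainer(new Uri(sas.AssetContainerSasUrls.First()));
-CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(fileToUpload));
-await blob.UploadFromFileAsync(fileToUpload);
-```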
-
-The following code snippet creates an output asset if it doesn't already exist:
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#CreateOutputAsset)]
-
-The following code snippet submits an encoding job:
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#SubmitJob)]
-
-## Job error codes
-
-See [Error codes](/rest/api/media/jobs/get#joberrorcode).
--
media-services Job List How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-list-how-to.md
- Title: List jobs
-description: This article shows how to list jobs.
----- Previously updated : 03/11/2022---
-# List jobs
---
-## Methods
-
-Use the following methods to list jobs.
-
-## [CLI](#tab/cli/)
--
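-
-A minimal sketch, assuming the `az ams job` command group and hypothetical resource names:
-
-```azurecli
-az ams job list \
-    --resource-group myResourceGroup \
-    --account-name myAmsAccount \
-    --transform-name myTransform
-```
-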
-## [REST](#tab/rest/)
--
-## [Python](#tab/python/)
---
media-services Job Multiple Transform Outputs How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-multiple-transform-outputs-how-to.md
- Title: Create a job with multiple transform outputs
-description: This topic demonstrates how to create an Azure Media Services job with multiple transform outputs.
------ Previously updated : 08/31/2020----
-# Create a job with multiple transform outputs
--
-This topic shows how to create a Transform with two Transform Outputs. The first one calls for the input to be encoded for adaptive bitrate streaming with a built-in [AdaptiveStreaming](encode-concept.md#builtinstandardencoderpreset) preset. The second one calls for the audio signal in the input video to be processed with the [AudioAnalyzerPreset](analyze-video-audio-files-concept.md#built-in-presets). After the Transform is created, you can submit a job that will process your video accordingly. Since this example specifies two Transform Outputs, we must specify two Job Outputs. You can choose to direct both Job Outputs to the same Asset (as shown below), or you can have the results written to separate Assets.
-
-> [!TIP]
-> Before you start developing, review [Developing with Media Services v3 APIs](media-services-apis-overview.md) (includes information on accessing APIs, naming conventions, etc.)
-
-## [.NET](#tab/net/)
-
-## Create a transform
-
-The following code shows how to create a transform that produces two outputs.
-
-```csharp
-private static async Task<Transform> GetOrCreateTransformAsync(
- IAzureMediaServicesClient client,
- string resourceGroupName,
- string accountName,
- string transformName)
-{
- // Does a Transform already exist with the desired name? Assume that an existing Transform with the desired name
- // also uses the same recipe or Preset for processing content.
- Transform transform = await client.Transforms.GetAsync(resourceGroupName, accountName, transformName);
-
- if (transform == null)
- {
- // You need to specify what you want it to produce as an output
- TransformOutput[] output = new TransformOutput[]
- {
- new TransformOutput
- {
- Preset = new BuiltInStandardEncoderPreset()
- {
- // This sample uses the built-in encoding preset for Adaptive Bitrate Streaming.
- PresetName = EncoderNamedPreset.AdaptiveStreaming
- }
- },
-            // Create an audio analyzer preset to process the audio signal in the input.
- new TransformOutput(new AudioAnalyzerPreset("en-US"))
- };
-
- // Create the Transform with the output defined above
- transform = await client.Transforms.CreateOrUpdateAsync(resourceGroupName, accountName, transformName, output);
- }
-
- return transform;
-}
-```
-
-## Submit a job
-
-Create a job with an HTTPS URL input and with two job outputs.
-
-```csharp
-private static async Task<Job> SubmitJobAsync(IAzureMediaServicesClient client,
- string resourceGroup,
- string accountName,
- string transformName)
-{
- // Output from the encoding Job must be written to an Asset, so let's create one
- string outputAssetName1 = $"output-" + Guid.NewGuid().ToString("N");
- Asset outputAsset = await client.Assets.CreateOrUpdateAsync(resourceGroup, accountName, outputAssetName1, new Asset());
-
- // This example shows how to encode from any HTTPs source URL - a new feature of the v3 API.
- // Change the URL to any accessible HTTPs URL or SAS URL from Azure.
- JobInputHttp jobInput =
- new JobInputHttp(files: new[] { "https://nimbuscdn-nimbuspm.streaming.mediaservices.windows.net/2b533311-b215-4409-80af-529c3e853622/Ignite-short.mp4" });
-
- JobOutput[] jobOutputs =
- {
- // Since we are specifying two Transform Outputs, two Job Outputs are needed.
- // In this example, the first Job Output is for the results from adaptive bitrate encoding,
- // and the second is for the results from audio analysis. In this example, both are written to the
- // same output Asset. Or, you can specify different Assets.
-
- new JobOutputAsset(outputAsset.Name),
- new JobOutputAsset(outputAsset.Name)
-
- };
-
- // In this example, we are using a unique job name.
- //
- // If you already have a job with the desired name, use the Jobs.Get method
- // to get the existing job. In Media Services v3, Get methods on entities returns null
- // if the entity doesn't exist (a case-insensitive check on the name).
- Job job;
- try
- {
- string jobName = $"job-" + Guid.NewGuid().ToString("N");
- job = await client.Jobs.CreateAsync(
- resourceGroup,
- accountName,
- transformName,
- jobName,
- new Job
- {
- Input = jobInput,
- Outputs = jobOutputs,
- });
- }
- catch (Exception exception)
- {
- if (exception.GetBaseException() is ApiErrorException apiException)
- {
- Console.Error.WriteLine(
- $"ERROR: API call failed with error code '{apiException.Body.Error.Code}' and message '{apiException.Body.Error.Message}'.");
- }
-        // Rethrow, preserving the original stack trace.
-        throw;
- }
-
- return job;
-}
-```
--
media-services Job Show How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-show-how-to.md
- Title: Show or get a job
-description: This article shows how to show or get a job.
----- Previously updated : 03/11/2022---
-# Show the details of a job
---
-## Methods
-
-Use the following methods to show or get a job.
-
-## [CLI](#tab/cli/)
--
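-
-A minimal sketch, assuming the `az ams job` command group and hypothetical resource names:
-
-```azurecli
-az ams job show \
-    --resource-group myResourceGroup \
-    --account-name myAmsAccount \
-    --transform-name myTransform \
-    --name myJob
-```
-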
-## [REST](#tab/rest/)
--
-## [Python](#tab/python/)
---
media-services Job Update How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/job-update-how-to.md
- Title: Update a job
-description: This article shows how to update a job.
----- Previously updated : 03/11/2022---
-# Update a job
---
-## Methods
-
-Use the following methods to update a job.
-
-## [CLI](#tab/cli/)
--
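-
-A minimal sketch, assuming `az ams job update` accepts a new priority for a queued job (the resource names are hypothetical):
-
-```azurecli
-az ams job update \
-    --resource-group myResourceGroup \
-    --account-name myAmsAccount \
-    --transform-name myTransform \
-    --name myJob \
-    --priority High
-```
-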
-## [REST](#tab/rest/)
--
-## [Python](#tab/python/)
---
media-services Limits Quotas Constraints Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/limits-quotas-constraints-reference.md
- Title: Quotas and limits in Azure Media Services
-description: This topic describes quotas and limits in Microsoft Azure Media Services.
------- Previously updated : 08/25/2021---
-<!-- If you update limits in this topic, make sure to also update https://docs.microsoft.com/azure/azure-resource-manager/management/azure-subscription-service-limits#media-services-limits -->
-# Azure Media Services quotas and limits
--
-This article lists some of the most common Microsoft Azure Media Services limits, which are also sometimes called quotas.
-
-> [!NOTE]
-> For resources that aren't fixed, open a support ticket to ask for an increase in the quotas. Don't create additional Azure Media Services accounts in an attempt to obtain higher limits.
-
-## Account limits
-
-| Resource | Default Limit |
-| | |
-| [Media Services accounts](account-move-account-how-to.md) in a single subscription | 100 (fixed) |
-
-## Asset limits
-
-| Resource | Default Limit |
-| | |
-| [Assets](assets-concept.md) per Media Services account | 1,000,000|
-
-## Storage limits
-
-Azure Storage block blob limits apply to storage accounts used with Media Services. See [Azure Blob Storage limits](../../azure-resource-manager/management/azure-subscription-service-limits.md#azure-blob-storage-limits).
-
-These limits include the total stored size of the files that you upload for encoding and the sizes of the encoded output files. The maximum size of a single file for encoding is a separate limit. See [File size for encoding](#file-size-for-encoding-limit).
-
-### Storage account limit
-You can have up to 100 storage accounts. All storage accounts must be in the same Azure subscription.
-
-## File size for encoding limit
-An individual file that you upload to be encoded should be no larger than 260 GB.
-
-## Jobs (encoding & analyzing) limits
-
-| Resource | Default Limit |
-| | |
-| [Jobs](transform-jobs-concept.md) per Media Services account | 500,000 <sup>(3)</sup> (fixed)|
-| Job inputs per Job | 50 (fixed)|
-| Job outputs per Job | 20 (fixed) |
-| [Transforms](transform-jobs-concept.md) per Media Services account | 100 (fixed)|
-| Transform outputs in a Transform | 20 (fixed) |
-| Files per job input|10 (fixed)|
-
-<sup>3</sup> This number includes queued, finished, active, and canceled Jobs. It does not include deleted Jobs.
-
-Any Job record in your account older than 90 days will be automatically deleted, even if the total number of records is below the maximum quota.
-
-## Live streaming limits
-
-| Resource | Default Limit |
-| | |
-| [Live Events](live-event-outputs-concept.md) <sup>(4)</sup> per Media Services account |5|
-| Live Outputs per Live Event |3 <sup>(5)</sup> |
-| Max Live Output duration | [Size of the DVR window](live-event-cloud-dvr-time-how-to.md) |
-
-<sup>4</sup> For detailed information about Live Event limits, see [Live Event types comparison and limits](live-event-types-comparison-reference.md). Depending on your streaming use case and regional datacenter of choice, AMS is able to accommodate more than 5 Live Events per Media Services account. Please file a support request to increase your account quota.
-
-<sup>5</sup> Live Outputs start on creation and stop when deleted.
-
-## Packaging & delivery limits
-
-| Resource | Default Limit |
-| | |
-| [Streaming Endpoints](stream-streaming-endpoint-concept.md) (stopped or running) per Media Services account | 2 |
-| Premium streaming units | 10 |
-| [Dynamic Manifest Filters](filters-dynamic-manifest-concept.md)|100|
-| [Streaming Policies](stream-streaming-policy-concept.md) | 100 <sup>(6)</sup> |
-| Unique [Streaming Locators](stream-streaming-locators-concept.md) associated with an Asset at one time | 100<sup>(7)</sup> (fixed) |
-
-<sup>6</sup> When using a custom [Streaming Policy](/rest/api/media/streamingpolicies), you should design a limited set of such policies for your Media Service account, and re-use them for your StreamingLocators whenever the same encryption options and protocols are needed. You should not be creating a new Streaming Policy for each Streaming Locator.
-
-<sup>7</sup> Streaming Locators are not designed for managing per-user access control. To give different access rights to individual users, use Digital Rights Management (DRM) solutions.
-
-## Protection limits
-
-| Resource | Default Limit |
-| | |
-| Options per [Content Key Policy](drm-content-key-policy-concept.md) |30 |
-| Licenses per month for each of the DRM types on Media Services key delivery service per account|1,000,000|
-
-## Support ticket
-
-For resources that are not fixed, you may ask for the quotas to be raised, by opening a [support ticket](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest). Include detailed information in the request on the desired quota changes, use-case scenarios, and regions required. <br/>Do **not** create additional Azure Media Services accounts in an attempt to obtain higher limits.
-
-## Next steps
-
-[Overview](media-services-overview.md)
media-services Live Event Cloud Dvr Time How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-cloud-dvr-time-how-to.md
- Title: Use time-shifting to create on-demand video playback
-description: This article describes how to use time-shifting and Live Outputs to record Live Streams and create on-demand playback.
------ Previously updated : 08/31/2020---
-# Use time-shifting and Live Outputs to create on-demand video playback
--
-In Azure Media Services, a [Live Output](/rest/api/media/liveoutputs) object is like a digital video recorder that will catch and record your live stream into an asset in your Media Services account. The recorded content is persisted into the container defined by the [Asset](/rest/api/media/assets) resource (the container is in the Azure Storage account attached to your account). The Live Output also allows you to control some properties of the outgoing live stream, like how much of the stream is kept in the archive recording (for example, the capacity of the cloud DVR) or when viewers can start watching the live stream. The archive on disk is a circular archive "window" that only holds the amount of content that's specified in the **archiveWindowLength** property of the Live Output. Content that falls outside of this window is automatically discarded from the storage container and isn't recoverable. The archiveWindowLength value represents an ISO-8601 timespan duration (for example, PTHH:MM:SS), which specifies the capacity of the DVR. The value can be set from a minimum of one minute to a maximum of 25 hours.
-
-The relationship between a Live Event and its Live Outputs is similar to traditional TV broadcast, in that a channel (Live Event) represents a constant stream of video and a recording (Live Output) is scoped to a specific time segment (for example, the evening news from 6:30 PM to 7:00 PM). Once you have the stream flowing into the Live Event, you can begin the streaming event by creating an asset, a Live Output, and a streaming locator. The Live Output archives the stream and makes it available to viewers through the [Streaming Endpoint](/rest/api/medi#general-steps).
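-
-As a rough sketch in the v3 .NET SDK (the names and window length here are illustrative), creating a Live Output with a 30-minute DVR window looks something like this:
-
-```csharp
-// A Live Output that records the live stream into an asset,
-// keeping a rolling 30-minute archive (the cloud DVR window).
-LiveOutput liveOutput = new LiveOutput(
-    assetName: asset.Name,
-    archiveWindowLength: TimeSpan.FromMinutes(30));
-
-await client.LiveOutputs.CreateAsync(
-    resourceGroupName, accountName, liveEventName, liveOutputName, liveOutput);
-```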
-
-## Using a DVR during an event
-
-This section discusses how to use a DVR during an event to control which portions of the stream are available for 'rewind'.
-
-The `archiveWindowLength` value determines how far back in time a viewer can go from the current live position. The `archiveWindowLength` value also determines how long the client manifests can grow.
-
-Suppose you're streaming a football game, and it has an `ArchiveWindowLength` of only 30 minutes. A viewer who starts watching your event 45 minutes after the game started can seek back to at most the 15-minute mark. Your Live Outputs for the game will continue until the Live Event is stopped. Content that falls outside of archiveWindowLength is continuously discarded from storage and is non-recoverable. In this example, the video between the start of the event and the 15-minute mark would have been purged from your DVR and from the container in blob storage for the asset. The archive isn't recoverable and is removed from the container in Azure blob storage.
-
-A Live Event supports up to three concurrently running Live Outputs (you can create at most three recordings/archives from one live stream at the same time). This support allows you to publish and archive different parts of an event as needed. Suppose you need to broadcast a 24x7 live linear feed and create "recordings" of the different programs throughout the day to offer to customers as on-demand content for catch-up viewing. For this scenario, you first create a primary Live Output with a short archive window of 1 hour or less; this is the primary live stream that your viewers would tune into. You would create a Streaming Locator for this Live Output and publish it to your app or web site as the "Live" feed. While the Live Event is running, you can programmatically create a second concurrent Live Output at the beginning of a program (or 5 minutes early to provide some handles to trim later). This second Live Output can be deleted 5 minutes after the program ends. With this second asset, you can create a new Streaming Locator to publish this program as an on-demand asset in your app's catalog. You can repeat this process multiple times for other program boundaries or highlights that you wish to share as on-demand videos, all while the "Live" feed from the first Live Output continues to broadcast the linear feed.
-
-## Creating an archive for on-demand playback
-
-The asset that the Live Output is archiving to automatically becomes an on-demand asset when the Live Output is deleted. You must delete all Live Outputs before a Live Event can be stopped. You can use an optional flag [removeOutputsOnStop](/rest/api/media/liveevents/stop#request-body) to automatically remove Live Outputs on stop.
-
-Even after you stop and delete the event, users can stream your archived content as a video on-demand, for as long as you don't delete the asset. An asset shouldn't be deleted if it's used by an event; the event must be deleted first.
-
-If you've published the asset of your Live Output using a streaming locator, the Live Event (up to the DVR window length) will continue to be viewable until the streaming locator's expiry or deletion, whichever comes first.
-
-For more information, see:
-
-- [Live streaming overview](stream-live-streaming-concept.md)
-- [Live streaming tutorial](stream-live-tutorial-with-api.md)
-
-> [!NOTE]
-> When you delete the Live Output, you're not deleting the underlying asset and content in the asset.
-
-## Next steps
-
-* [Subclip your videos](transform-subclip-video-how-to.md).
-* [Define filters for your assets](filters-dynamic-manifest-rest-howto.md).
media-services Live Event Error Codes Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-error-codes-reference.md
- Title: Azure Media Services live event error codes
-description: This article lists live event error codes.
------- Previously updated : 03/26/2021----
-# Media Services Live Event error codes
--
-The following tables list the [Live Event](live-event-outputs-concept.md) error codes.
-
-## LiveEventConnectionRejected
-
-When you subscribe to the [Event Grid](../../event-grid/index.yml) events for a
-live event, you may see one of the following errors from the
-[LiveEventConnectionRejected](monitoring/media-services-event-schemas.md#liveeventconnectionrejected)
-event.
-> [!div class="mx-tdCol2BreakAll"]
->| Error | Information |
->|--|--|
->|**MPE_RTMP_APPID_AUTH_FAILURE** ||
->|Description | Incorrect ingest URL |
->|Suggested solution| APPID is a GUID token in the RTMP ingest URL. Make sure it matches the ingest URL returned by the API. |
->|**MPE_INGEST_ENCODER_CONNECTION_DENIED** ||
->| Description |Encoder IP isn't present in the configured IP allow list |
->| Suggested solution| Make sure the encoder's IP is in the IP Allow List. Use an online tool such as *whoismyip* or *CIDR calculator* to set the proper value. Make sure the encoder can reach the server before the actual live event. |
->|**MPE_INGEST_RTMP_SETDATAFRAME_NOT_RECEIVED** ||
->| Description|The RTMP encoder did not send the `setDataFrame` command. |
->| Suggested solution|Most commercial encoders send stream metadata. For an encoder that pushes a single-bitrate ingest, this may not be an issue, because the LiveEvent can calculate the incoming bitrate when the stream metadata is missing. For a multi-bitrate ingest to a pass-through channel or a double-push scenario, you can try appending 'videodatarate' and 'audiodatarate' query strings to the ingest URL. An approximate value may work. The unit is Kbit. For example, `rtmp://hostname:1935/live/GUID_APPID/streamname?videodatarate=5000&audiodatarate=192` |
->|**MPE_INGEST_CODEC_NOT_SUPPORTED** ||
->| Description|The codec specified isn't supported.|
->| Suggested solution| The LiveEvent received unsupported codec. For example, an RTMP ingest, LiveEvent received non-AVC video codec. Check encoder preset. |
->|**MPE_INGEST_DESCRIPTION_INFO_NOT_RECEIVED** ||
->| Description |The media description information was not received before the actual media data was delivered. |
->| Suggested solution|The LiveEvent did not receive the stream description (header or FLV tag) from the encoder. This is a protocol violation. Contact the encoder vendor. |
->|**MPE_INGEST_MEDIA_QUALITIES_EXCEEDED** ||
->| Description| The count of quality levels for audio or video type exceeded the maximum allowed limit. Quality levels beyond the limit were ignored by the service.|
->| Suggested solution|When Live Event mode is Live Encoding, the encoder should push a single bitrate of video and audio. Note that a redundant push from the same bitrate is allowed. Check the encoder preset or output settings to make sure it outputs a single bitrate stream. |
->|**MPE_INGEST_BITRATE_AGGREGATED_EXCEEDED** ||
->| Description|The total incoming bitrate in a live event or channel service exceeded the maximum allowed limit. |
->| Suggested solution|The encoder exceeded the maximum incoming bitrate. This limit aggregates all incoming data from the contributing encoder. Check encoder preset or output settings to reduce bitrate. |
->|**MPE_RTMP_FLV_TAG_TIMESTAMP_INVALID** ||
->| Description|The timestamp of the video or audio FLV tag from the RTMP encoder is invalid. |
->| Suggested solution|Deprecated. |
->|**MPE_INGEST_FRAMERATE_EXCEEDED** ||
->| Description|The encoder ingested streams whose frame rate exceeded the maximum allowed 30 fps for encoding live events/channels. |
->| Suggested solution|Check the encoder preset to lower the frame rate to under 36 fps. |
->|**MPE_INGEST_VIDEO_RESOLUTION_NOT_SUPPORTED** ||
->| Description|The ingested streams exceeded the allowed resolutions: 1920x1088 for encoding live events/channels and 4096x2160 for basic and standard pass-through live events/channels. |
->| Suggested solution|Check the encoder preset to lower the video resolution so it doesn't exceed the limit. |
->|**MPE_INGEST_RTMP_TOO_LARGE_UNPROCESSED_FLV** |
->| Description|The live event received a large amount of audio data at once, or a large amount of video data without any key frames. We have disconnected the encoder to give it a chance to retry with correct data. |
->| Suggested solution|Ensure that the encoder sends a key frame for every key frame interval (GOP). Enable settings like "Constant bitrate (CBR)" or "Align Key Frames". Sometimes, resetting the contributing encoder may help. If it doesn't help, contact the encoder vendor. |
-
-## LiveEventEncoderDisconnected
-
-You may see one of the following errors from the
-[LiveEventEncoderDisconnected](monitoring/media-services-event-schemas.md#liveeventencoderdisconnected)
-event.
-
-> [!div class="mx-tdCol2BreakAll"]
->| Error | Information |
->|--|--|
->|**MPE_RTMP_SESSION_IDLE_TIMEOUT** |
->| Description|RTMP session timed out after being idle for allowed time limit. |
->|Suggested solution|This typically happens when the encoder stops receiving the input feed, leaving the session idle with no data to push out. Check whether the encoder or the input feed is in a healthy state. |
->|**MPE_RTMP_FLV_TAG_TIMESTAMP_INVALID** |
->|Description| The timestamp of the video or audio FLV tag from the RTMP encoder is invalid. |
->| Suggested solution| Deprecated. |
->|**MPE_CAPACITY_LIMIT_REACHED** |
->| Description|The encoder is sending data too fast. |
->| Suggested solution|This happens when the encoder bursts out a large set of fragments in a brief period. It can happen when the encoder can't push data for a while due to a network issue and then bursts out the backlog once the network is available. Find the reason in the encoder log or system log. |
->|**Unknown error codes** |
->| Description| These error codes can range from memory errors to duplicate entries in a hash map. This can happen when the encoder sends out a large set of fragments in a brief period. It can also happen when the encoder couldn't push data for a while due to a network issue and then sends all the delayed fragments at once when the network becomes available. |
->|Suggested solution| Check the encoder logs.|
-
-## Other error codes
-
-> [!div class="mx-tdCol2BreakAll"]
->| Error | Information |Rejected/Disconnected Event|
->|--|--|--|
->|**ERROR_END_OF_MEDIA** ||Yes|
->| Description|This is a general error. ||
->|Suggested solution| None.||
->|**MPI_SYSTEM_MAINTENANCE** ||Yes|
->| Description|The encoder disconnected due to service update or system maintenance. ||
->|Suggested solution|Make sure the encoder enables 'auto connect'. It allows the encoder to reconnect to the redundant live event endpoint that is not in maintenance. ||
->|**MPE_BAD_URL_SYNTAX** ||Yes|
->| Description|The ingest URL is incorrectly formatted. ||
->|Suggested solution|Make sure the ingest URL is correctly formatted. For RTMP, it should be `rtmp[s]://hostname:port/live/GUID_APPID/streamname` ||
->|**MPE_CLIENT_TERMINATED_SESSION** ||Yes|
->| Description|The encoder disconnected the session. ||
->|Suggested solution|This is not an error. The encoder initiated the disconnection, including a graceful disconnection. If the disconnect was unexpected, check the encoder logs. |
->|**MPE_INGEST_BITRATE_NOT_MATCH** ||No|
->| Description|The incoming data rate does not match the expected bitrate. ||
->|Suggested solution|This is a warning that occurs when the incoming data rate is too slow or too fast. Check the encoder log or system log.||
->|**MPE_INGEST_DISCONTINUITY** ||No|
->| Description| There is a discontinuity in the incoming data.||
->|Suggested solution| This is a warning that the encoder dropped data due to a network issue or a system resource issue. Check the encoder log or system log, and monitor system resources (CPU, memory, and network) as well. If the system CPU is too high, try lowering the bitrate or using the hardware encoder option of the system graphics card.||
-
-## See also
-
-[Streaming Endpoint (Origin) error codes](stream-streaming-endpoint-error-codes-reference.md)
-
-## Next steps
-
-[Tutorial: Stream live with Media Services](stream-live-tutorial-with-api.md)
media-services Live Event Latency Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-latency-reference.md
- Title: LiveEvent low latency settings in Azure Media Services
-description: This topic gives an overview of LiveEvent low latency settings and shows how to set low latency.
------- Previously updated : 08/31/2020-----
-# Live Event low latency settings
--
-This article shows how to set low latency on a [Live Event](/rest/api/media/liveevents). It also discusses typical results that you see when using the low latency settings in various players. The results vary based on CDN and network latency.
-
-To use the new **LowLatency** feature, you set the **StreamOptionsFlag** to **LowLatency** on the **LiveEvent**. When creating [LiveOutput](/rest/api/media/liveoutputs) for HLS playback, set [LiveOutput.Hls.fragmentsPerTsSegment](/rest/api/media/liveoutputs/create#hls) to 1. Once the stream is up and running, you can use the [Azure Media Player](https://ampdemo.azureedge.net/) (AMP demo page), and set the playback options to use the "Low Latency Heuristics Profile".
-
-> [!NOTE]
-> Currently, the LowLatency HeuristicProfile in Azure Media Player is designed for playing back streams in MPEG-DASH protocol, with either CSF or CMAF format (for example, `format=mdp-time-csf` or `format=mdp-time-cmaf`).
-
-The following .NET example shows how to set **LowLatency** on the **LiveEvent**:
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Live/LiveEventWithDVR/Program.cs#NewLiveEvent)]
-
-See the full example: [Live Event with DVR](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/Live/LiveEventWithDVR/Program.cs).
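-
-If the full sample isn't at hand, a rough sketch of the relevant settings follows (the model names are from the v3 .NET SDK; everything else is illustrative):
-
-```csharp
-// Enable the LowLatency stream option on the live event...
-LiveEvent liveEvent = new LiveEvent(
-    location: mediaServicesAccount.Location,
-    input: new LiveEventInput(LiveEventInputProtocol.Rtmp),
-    streamOptions: new List<StreamOptionsFlag?> { StreamOptionsFlag.LowLatency });
-
-// ...and use one fragment per TS segment on the live output for HLS playback.
-LiveOutput liveOutput = new LiveOutput(assetName, TimeSpan.FromHours(1))
-{
-    Hls = new Hls(fragmentsPerTsSegment: 1)
-};
-```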
-
-## Live Events latency
-
-The following tables show typical results for latency (when the LowLatency flag is enabled) in Media Services, measured from the time the contribution feed reaches the service to when a viewer sees the playback on the player. To use low latency optimally, tune your encoder settings down to a 1-second "group of pictures" (GOP) length. A higher GOP length minimizes bandwidth consumption and reduces the bitrate at the same frame rate, which is especially beneficial for videos with less motion.
-
-### Pass-through
-
-||2s GOP low latency enabled|1s GOP low latency enabled|
-||||
-|**DASH in AMP**|10s|8s|
-|**HLS on native iOS player**|14s|10s|
-
-### Live encoding
-
-||2s GOP low latency enabled|1s GOP low latency enabled|
-||||
-|**DASH in AMP**|14s|10s|
-|**HLS on native iOS player**|18s|13s|
-
-> [!NOTE]
-> The end-to-end latency can vary depending on local network conditions or by introducing a CDN caching layer. You should test your exact configurations.
-
-## Next steps
-
-- [Live streaming overview](stream-live-streaming-concept.md)
-- [Live streaming tutorial](stream-live-tutorial-with-api.md)
media-services Live Event Live Transcription How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-live-transcription-how-to.md
- Title: Live transcription - Azure Media Services
-description: Learn about Azure Media Services live transcription.
------ Previously updated : 08/31/2020----
-# Live transcription (preview)
--
-Azure Media Services delivers video, audio, and text in different protocols. When you publish your live stream using MPEG-DASH or HLS/CMAF, then along with video and audio, our service delivers the transcribed text in IMSC1.1-compatible TTML. The delivery is packaged into MPEG-4 Part 30 (ISO/IEC 14496-30) fragments. If you use delivery via HLS/TS, the text is delivered as chunked VTT.
-
-Additional charges apply when live transcription is turned on. Please review the pricing information in the Live Video section of the [Media Services pricing page](https://azure.microsoft.com/pricing/details/media-services/).
-
-This article describes how to enable live transcription when streaming a Live Event with Azure Media Services. Before you continue, make sure you're familiar with the use of Media Services v3 REST APIs (see [this tutorial](stream-files-tutorial-with-rest.md) for details). You should also be familiar with the [live streaming](stream-live-streaming-concept.md) concept. It's recommended to complete the [Stream live with Media Services](stream-live-tutorial-with-api.md) tutorial.
-
-## Live transcription preview regions and languages
-
-Live transcription is available in the regions documented [here](azure-clouds-regions.md).
-
-The following languages are available for transcription. Use the language code in the API.
-
-| Language | Language code |
-| -- | - |
-| Catalan | ca-ES |
-| Danish (Denmark) | da-DK |
-| German (Germany) | de-DE |
-| English (Australia) | en-AU |
-| English (Canada) | en-CA |
-| English (United Kingdom) | en-GB |
-| English (India) | en-IN |
-| English (New Zealand) | en-NZ |
-| English (United States) | en-US |
-| Spanish (Spain) | es-ES |
-| Spanish (Mexico) | es-MX |
-| Finnish (Finland) | fi-FI |
-| French (Canada) | fr-CA |
-| French (France) | fr-FR |
-| Italian (Italy) | it-IT |
-| Dutch (Netherlands) | nl-NL |
-| Portuguese (Brazil) | pt-BR |
-| Portuguese (Portugal) | pt-PT |
-| Swedish (Sweden) | sv-SE |
-
-## Create the live event with live transcription
-
-To create a live event with transcription turned on, send the PUT operation with the 2019-05-01-preview API version, for example:
-
-```http
-PUT https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaServices/:accountName/liveEvents/:liveEventName?api-version=2019-05-01-preview&autoStart=true
-```
-
-The operation has the following body (where a basic pass-through Live Event is created with RTMP as the ingest protocol). Note the addition of a transcriptions property.
-
-```json
-{
- "properties": {
- "description": "Demonstrate how to enable live transcriptions",
- "input": {
- "streamingProtocol": "RTMP",
- "accessControl": {
- "ip": {
- "allow": [
- {
- "name": "Allow All",
- "address": "0.0.0.0",
- "subnetPrefixLength": 0
- }
- ]
- }
- }
- },
- "preview": {
- "accessControl": {
- "ip": {
- "allow": [
- {
- "name": "Allow All",
- "address": "0.0.0.0",
- "subnetPrefixLength": 0
- }
- ]
- }
- }
- },
- "encoding": {
- "encodingType": "PassthroughBasic"
- },
- "transcriptions": [
- {
- "language": "en-US"
- }
- ],
- "useStaticHostname": false,
- "streamOptions": [
- "Default"
- ]
- },
- "location": "West US 2"
-}
-```
-
-## Start or stop transcription after the live event has started
-
-You can start and stop live transcription while the live event is in running state. For more information about starting and stopping live events, read the Long-running operations section at [Develop with Media Services v3 APIs](media-services-apis-overview.md#long-running-operations).
-
-To turn on live transcriptions or to update the transcription language, patch the live event to include a "transcriptions" property. To turn off live transcriptions, remove the "transcriptions" property from the live event object.
-
-> [!NOTE]
-> Turning the transcription on or off **more than once** during the live event is not a supported scenario.
-
-This is the sample call to turn on live transcriptions.
-
-PATCH: ```https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaServices/:accountName/liveEvents/:liveEventName?api-version=2019-05-01-preview```
-
-```json
-{
- "properties": {
- "description": "Demonstrate how to enable live transcriptions",
- "input": {
- "streamingProtocol": "RTMP",
- "accessControl": {
- "ip": {
- "allow": [
- {
- "name": "Allow All",
- "address": "0.0.0.0",
- "subnetPrefixLength": 0
- }
- ]
- }
- }
- },
- "preview": {
- "accessControl": {
- "ip": {
- "allow": [
- {
- "name": "Allow All",
- "address": "0.0.0.0",
- "subnetPrefixLength": 0
- }
- ]
- }
- }
- },
- "encoding": {
- "encodingType": "None"
- },
- "transcriptions": [
- {
- "language": "en-US"
- }
- ],
- "useStaticHostname": false,
- "streamOptions": [
- "Default"
- ]
- },
- "location": "West US 2"
-}
-```
-
-## Transcription delivery and playback
-
-Review the [Dynamic packaging overview](encode-dynamic-packaging-concept.md#to-prepare-your-source-files-for-delivery) article to learn how our service uses dynamic packaging to deliver video, audio, and text in different protocols. When you publish your live stream using MPEG-DASH or HLS/CMAF, then along with video and audio, our service delivers the transcribed text in IMSC1.1-compatible TTML. This delivery is packaged into MPEG-4 Part 30 (ISO/IEC 14496-30) fragments. If you use delivery via HLS/TS, the text is delivered as chunked VTT. You can use a web player such as the [Azure Media Player](player-use-azure-media-player-how-to.md) to play the stream.
-
-> [!NOTE]
-> If using Azure Media Player, use version 2.3.3 or later.
-
-## Known issues
-
-For preview, the following are known issues with live transcription:
-
-- Apps need to use the preview APIs, described in the [Media Services v3 OpenAPI Specification](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/mediaservices/resource-manager/Microsoft.Media/preview/2019-05-01-preview/streamingservice.json).
-- Digital rights management (DRM) protection does not apply to the text track; only AES envelope encryption is possible.
-
-## Next steps
-
-* [Media Services overview](media-services-overview.md)
media-services Live Event Obs Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-obs-quickstart.md
- Title: Create a live stream with OBS Studio
-description: Learn how to create an Azure Media Services live stream by using the portal and OBS Studio
----- Previously updated : 03/20/2021---
-# Create an Azure Media Services live stream with OBS
--
-This quickstart will help you create a Media Services Live Event by using the Azure portal and broadcast to it by using Open Broadcaster Software (OBS) Studio. It assumes that you have an Azure subscription and have created a Media Services account.
-
-In this quickstart, we'll cover:
-
-- Setting up an on-premises encoder with OBS.
-- Setting up a live stream.
-- Setting up live stream outputs.
-- Running a default streaming endpoint.
-- Using Azure Media Player to view the live stream and on-demand output.
-
-## Prerequisites
-
-If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/) before you begin.
-
-## Sign in to the Azure portal
-
-Open your web browser, and go to the [Microsoft Azure portal](https://portal.azure.com/). Enter your credentials to sign in to the portal. The default view is your service dashboard.
-
-## Set up an on-premises encoder by using OBS
-
-1. Download and install OBS for your operating system on the [Open Broadcaster Software website](https://obsproject.com/).
-1. Start the application and keep it open.
-
-## Run the default streaming endpoint
-
-1. Select **Streaming endpoints** in the Media Services listing.
-
- ![Streaming endpoints menu item.](media/live-events-obs-quickstart/streaming-endpoints.png)
-1. If the default streaming endpoint status is stopped, select it. This step takes you to the page for that endpoint.
-1. Select **Start**.
-
- ![Start button for the streaming endpoint.](media/live-events-obs-quickstart/start.png)
-
-## Set up an Azure Media Services live stream
-
-1. Go to the Azure Media Services account within the portal, and then select **Live streaming** from the **Media Services** listing.
-
- ![Live streaming link.](media/live-events-obs-quickstart/select-live-streaming.png)
-1. Select **Add live event** to create a new live streaming event.
-
- ![Add live event icon.](media/live-events-obs-quickstart/add-live-event.png)
-1. Enter a name for your new event, such as *TestLiveEvent*, in the **Live event name** box.
-
- ![Live event name box.](media/live-events-obs-quickstart/live-event-name.png)
-1. Enter an optional description of the event in the **Description** box.
-1. Select the basic **Pass-through – no cloud encoding** option.
-
- ![Cloud encoding option.](media/live-events-obs-quickstart/cloud-encoding.png)
-1. Select the **RTMP** option.
-1. Make sure that the **No** option is selected for **Start live event**, to avoid being billed for the live event before it's ready. (Billing will begin when the live event is started.)
-
- ![Start live event option.](media/live-events-obs-quickstart/start-live-event-no.png)
-1. Select the **Review + create** button to review the settings.
-1. Select the **Create** button to create the live event. You're then returned to the live event listing.
-1. Select the link to the live event that you created. Notice that your event is stopped.
-1. Keep this page open in your browser. We'll come back to it later.
-
-## Set up a live stream by using OBS Studio
-
-OBS starts with a default scene but with no inputs selected.
-
- ![OBS default screen](media/live-events-obs-quickstart/live-event-obs-default-screen.png)
-
-### Add a video source
-
-1. From the **Sources** panel, select the **add** icon to select a new source device. The **Sources** menu will open.
-
-1. Select **Video Capture Device** from the source device menu. The **Create/Select Source** menu will open.
-
- ![OBS sources menu with video device selected.](media/live-events-obs-quickstart/live-event-obs-video-device-menu.png)
-
-1. Select the **Add Existing** radio button, then select **OK**. The **Properties for Video Device** menu will open.
-
- ![OBS new video source menu with add existing selected.](media/live-events-obs-quickstart/live-event-obs-new-video-source.png)
-
-1. From the **Device** dropdown list, select the video input you want to use for your broadcast. Leave the rest of the settings alone for now, and select **OK**. The input source will be added to the **Sources** panel, and the video input view will show up in the **Preview** area.
-
- ![OBS camera settings](media/live-events-obs-quickstart/live-event-surface-camera.png)
-
-### Add an audio source
-
-1. From the **Sources** panel, select the **add** icon to select a new source device. The Source Device menu will open.
-
-1. Select **Audio Input Capture** from the source device menu. The **Create/Select Source** menu will open.
-
- ![OBS sources menu with audio device selected.](media/live-events-obs-quickstart/live-event-obs-audio-device-menu.png)
-
-1. Select the **Add Existing** radio button, then select **OK**. The **Properties for Audio Input Capture** menu will open.
-
- ![OBS audio source with add existing selected.](media/live-events-obs-quickstart/live-event-obs-new-audio-source.png)
-
-1. From the **Device** dropdown list, select the audio capture device you want to use for your broadcast. Leave the rest of the settings alone for now, and select OK. The audio capture device will be added to the audio mixer panel.
-
- ![OBS audio device selection dropdown list](media/live-events-obs-quickstart/live-event-select-audio-device.png)
-
-### Set up streaming and advanced encoding settings in OBS
-
-In the next procedure, you'll go back to Azure Media Services in your browser to copy the input URL to enter into the output settings:
-
-1. On the Azure Media Services page of the portal, select **Start** to start the live stream event. (Billing starts now.)
-
- ![Start icon.](media/live-events-obs-quickstart/start.png)
-1. Set the **RTMP** toggle to **RTMPS**.
-1. In the **Input URL** box, copy the URL to your clipboard.
-
- ![Input URL.](media/live-events-obs-quickstart/input-url.png)
-
-1. Switch to the OBS application.
-
-1. Select the **Settings** button in the **Controls** panel. The Settings options will open.
-
- ![OBS Controls panel with settings selected.](media/live-events-obs-quickstart/live-event-obs-settings.png)
-
-1. Select **Stream** from the **Settings** menu.
-
-1. From the **Service** dropdown list, select Show all, then select **Custom...**.
-
-1. In the **Server** field, paste the RTMPS URL you copied to your clipboard.
-
-1. Enter a value into the **Stream key** field. The exact value doesn't matter, but the field can't be left empty.
-
- ![OBS stream settings.](media/live-events-obs-quickstart/live-event-obs-stream-settings.png)
-
-1. Select **Output** from the **Settings** menu.
-
-1. Select the **Output Mode** dropdown at the top of the page and choose **Advanced** to access all of the available encoder settings.
-
-1. Select the **Streaming** tab to set up the encoder.
-
-1. Select the right encoder for your system. If your hardware supports GPU acceleration, choose NVIDIA **NVENC** H.264 or Intel **QuickSync** H.264. If your system doesn't have a supported GPU, select the **X264** software encoder option.
-
-#### X264 Encoder settings
-
-1. If you selected the **X264** encoding option, check the **Rescale Output** box. Select 1920x1080 if you're using a Premium Live Event in Media Services, or 1280x720 if you're using a Standard (720P) Live Event. If you're using a basic or standard pass-through live event, you can choose any available resolution.
-
-1. Set the **Bitrate** to anywhere between 1500 Kbps and 4000 Kbps. We recommend 2500 Kbps if you are using a Standard encoding Live Event at 720P. If you are using a 1080P Premium Live Event, 4000 Kbps is recommended. You may wish to adjust the bitrate based on available CPU capabilities and bandwidth on your network to achieve the desired quality setting.
-
-1. Enter *2* into the **Keyframe interval** field. This sets the key frame interval to 2 seconds, which controls the final size of the fragments delivered over HLS or DASH from Media Services. Never set the key frame interval higher than 4 seconds. If you see high latency when broadcasting, double-check that this value is set to 2 seconds, and advise your application users to do the same. When attempting to achieve lower-latency live delivery, you can set this value as low as 1 second.
-
-1. OPTIONAL: Set the CPU Usage Preset to **veryfast** and run some experiments to see if your local CPU can handle the combination of bitrate and preset with enough overhead. Try to avoid settings that would result in an average CPU higher than 80% to avoid any issues during live streaming. To improve quality, you can test with **faster** and **fast** preset settings until you reach your CPU limitations.
-
- ![OBS X264 encoder settings](media/live-events-obs-quickstart/live-event-obs-x264-settings.png)
-
-1. Leave the rest of the settings unchanged and select **OK**.
-
-#### Nvidia NVENC Encoder settings
-
-1. If you have selected the **NVENC** GPU encoding option, check the **Rescale Output** box and select either 1920x1080 if you are using a Premium Live Event in Media Services, or 1280x720 if you are using a Standard (720P) Live Event. If you are using a basic or standard pass-through live event, you can choose any available resolution.
-
-1. Set the **Rate Control** to CBR for Constant Bitrate rate control.
-
-1. Set the **Bitrate** anywhere between 1500 Kbps and 4000 Kbps. We recommend 2500 Kbps if you are using a Standard encoding Live Event at 720P. If you are using a 1080P Premium Live Event, 4000 Kbps is recommended. You may choose to adjust this based on available CPU capabilities and bandwidth on your network to achieve the desired quality setting.
-
-1. Set the **Keyframe Interval** to 2 seconds as noted above under the X264 options. Do not exceed 4 seconds, as this can significantly impact the latency of your live broadcast.
-
-1. Set the **Preset** to Low-Latency, Low-Latency Performance, or Low-Latency Quality depending on the CPU speed on your local machine. Experiment with these settings to achieve the best balance between quality and CPU utilization on your own hardware.
-
-1. Set the **Profile** to "main" or "high" if you are using a more powerful hardware configuration.
-
-1. Leave the **Look-ahead** unchecked. If you have a very powerful machine, you can check this.
-
-1. Leave the **Psycho Visual Tuning** unchecked. If you have a very powerful machine, you can check this.
-
-1. Set the **GPU** to 0 to automatically decide which GPUs to allocate. If desired, you can restrict GPU usage.
-
-1. Set the **Max B-frames** to 2.
-
-    ![OBS Nvidia NVENC GPU encoder settings.](media/live-events-obs-quickstart/live-event-obs-nvidia-settings.png)
-
-#### Intel QuickSync Encoder settings
-
-1. If you have selected the Intel **QuickSync** GPU encoding option, check the **Rescale Output** box and select either 1920x1080 if you are using a Premium Live Event in Media Services, or 1280x720 if you are using a Standard (720P) Live Event. If you are using a basic or standard pass-through live event, you can choose any available resolution.
-
-1. Set the **Target Usage** to "balanced" or adjust as needed based on your CPU and GPU combined load. Adjust as necessary and experiment to achieve an 80% max CPU utilization on average with the quality that your hardware is capable of producing. If you are on more constrained hardware, test with "fast" or drop to "very fast" if you are having performance issues.
-
-1. Set the **Profile** to "main" or "high" if you are using a more powerful hardware configuration.
-
-1. Set the **Keyframe Interval** to 2 seconds as noted above under the X264 options. Do not exceed 4 seconds, as this can significantly impact the latency of your live broadcast.
-
-1. Set the **Rate Control** to CBR for Constant Bitrate rate control.
-
-1. Set the **Bitrate** anywhere between 1500 and 4000 Kbps. We recommend 2500 Kbps if you are using a Standard encoding Live Event at 720P. If you are using a 1080P Premium Live Event, 4000 Kbps is recommended. You may choose to adjust this based on available CPU capabilities and bandwidth on your network to achieve the desired quality setting.
-
-1. Set the **Latency** to "low".
-
-1. Set the **B frames** to 2.
-
-1. Leave the **Subjective Video Enhancements** unchecked.
-
- ![OBS Intel QuickSync GPU encoder settings.](media/live-events-obs-quickstart/live-event-obs-intel-settings.png)
-
-### Set Audio settings
-
-In the next procedure, you will adjust the audio encoding settings.
-
-1. Select the **Output** > **Audio** tab in **Settings**.
-
-1. Set the Track 1 **Audio Bitrate** to 128 Kbps.
-
- ![OBS Audio Bitrate settings.](media/live-events-obs-quickstart/live-event-obs-audio-output-panel.png)
-
-1. Select the Audio tab in Settings.
-
-1. Set the **Sample Rate** to 44.1 kHz.
-
- ![OBS Audio Sample Rate settings.](media/live-events-obs-quickstart/live-event-obs-audio-sample-rate-settings.png)
-
-### Start streaming
-
-1. In the **Controls** panel, select **Start Streaming**.
-
- ![OBS start streaming button.](media/live-events-obs-quickstart/live-event-obs-start-streaming.png)
-
-2. Switch to the Azure Media Services Live event screen in your browser and select the **Reload Player** link. You should now see your stream in the Preview player.
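-
-If you're scripting this flow rather than clicking through the portal, you can confirm the event is running before reloading the player. The following is a minimal .NET sketch, not part of the quickstart's own steps: `client` is an assumed, already-authenticated `IAzureMediaServicesClient`, `resourceGroup` and `accountName` are your account details, and the code runs inside an async method.
-
-```csharp
-// Check the live event's state before playing the preview.
-LiveEvent liveEvent = await client.LiveEvents.GetAsync(resourceGroup, accountName, "myLiveEvent");
-Console.WriteLine($"Live event state: {liveEvent.ResourceState}"); // Expect: Running
-```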
-
-## Set up outputs
-
-This part will set up your outputs and enable you to save a recording of your live stream.
-
-> [!NOTE]
-> For you to stream this output, the streaming endpoint must be running. See the later [Run the default streaming endpoint](#run-the-default-streaming-endpoint) section.
-
-1. Select the **Create outputs** link below the **Outputs** video viewer.
-1. If you like, edit the name of the output in the **Name** box to something more user-friendly so it's easy to find later.
-
- ![Output name box.](media/live-events-wirecast-quickstart/output-name.png)
-1. Leave all the rest of the boxes alone for now.
-1. Select **Next** to add a streaming locator.
-1. Change the name of the locator to something more user-friendly, if you want.
-
- ![Locator name box.](media/live-events-wirecast-quickstart/live-event-locator.png)
-1. Leave everything else on this screen alone for now.
-1. Select **Create**.
-
-## Play the output broadcast by using Azure Media Player
-
-1. Copy the streaming URL under the **Output** video player.
-1. In a web browser, open the [Azure Media Player demo](https://ampdemo.azureedge.net/azuremediaplayer.html).
-1. Paste the streaming URL into the **URL** box of Azure Media Player.
-1. Select the **Update Player** button.
-1. Select the **Play** icon on the video to see your live stream.
-
-## Stop the broadcast
-
-When you think you've streamed enough content, stop the broadcast.
-
-1. In the portal, select **Stop**.
-
-1. In OBS, select the **Stop Streaming** button in the **Controls** panel. This step stops the broadcast from OBS.
-
-## Play the on-demand output by using Azure Media Player
-
-The output that you created is now available for on-demand streaming as long as your streaming endpoint is running.
-
-1. Go to the Media Services listing and select **Assets**.
-1. Find the event output that you created earlier and select the link to the asset. The asset output page opens.
-1. Copy the streaming URL under the video player for the asset.
-1. Return to Azure Media Player in the browser and paste the streaming URL into the URL box.
-1. Select **Update Player**.
-1. Select the **Play** icon on the video to view the on-demand asset.
-
-## Clean up resources
-
-> [!IMPORTANT]
-> Stop the services! After you've completed the steps in this quickstart, be sure to stop the live event and the streaming endpoint, or you'll be billed for the time they remain running. To stop the live event, see the [Stop the broadcast](#stop-the-broadcast) procedure.
-
-To stop the streaming endpoint:
-
-1. From the Media Services listing, select **Streaming endpoints**.
-2. Select the default streaming endpoint that you started earlier. This step opens the endpoint's page.
-3. Select **Stop**.
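-
-If you manage these resources from code, the same cleanup can be done with the Media Services .NET SDK. This is a sketch under assumptions: `client`, `resourceGroup`, and `accountName` are hypothetical variables for an authenticated `IAzureMediaServicesClient` and your account, and the resource names are the ones you chose earlier.
-
-```csharp
-// Stop both billable resources: the streaming endpoint and the live event.
-await client.StreamingEndpoints.StopAsync(resourceGroup, accountName, "default");
-await client.LiveEvents.StopAsync(resourceGroup, accountName, "myLiveEvent");
-```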
-
-> [!TIP]
-> If you don't want to keep the assets from this event, be sure to delete them so you're not billed for storage.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Live events and live outputs in Media Services](./live-event-outputs-concept.md)
media-services Live Event Outputs Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-outputs-concept.md
- Title: Live events and live outputs concepts
-description: This topic provides an overview of live events and live outputs in Azure Media Services v3.
- Previously updated: 10/23/2020
-# Live events and live outputs in Media Services
-
-Azure Media Services lets you deliver live events to your customers on the Azure cloud. To set up your live streaming events in Media Services v3, you need to understand the concepts discussed in this article.
-
-> [!TIP]
-> For customers migrating from Media Services v2 APIs, the **live event** entity replaces **Channel** in v2 and **live output** replaces **program**.
-
-## Live events
-
-[Live events](/rest/api/media/liveevents) are responsible for ingesting and processing the live video feeds. When you create a live event, primary and secondary input endpoints are created that you can use to send a live signal from a remote encoder. The remote live encoder sends the contribution feed to those input endpoints using either the [RTMP](https://helpx.adobe.com/adobe-media-server/dev/stream-live-media-rtmp.html) or [Smooth Streaming](/openspecs/windows_protocols/ms-sstr/8383f27f-7efe-4c60-832a-387274457251) (fragmented-MP4) input protocol. For the RTMP ingest protocol, the content can be sent in the clear (`rtmp://`) or securely encrypted on the wire (`rtmps://`). For the Smooth Streaming ingest protocol, the supported URL schemes are `http://` or `https://`.
-
-## Live event types
-
-A [live event](/rest/api/media/liveevents) can be set to either a basic or standard *pass-through* (an on-premises live encoder sends a multiple bitrate stream) or *live encoding* (an on-premises live encoder sends a single bitrate stream). The types are set during creation using [LiveEventEncodingType](/rest/api/media/liveevents/create#liveeventencodingtype):
-
-* **LiveEventEncodingType.PassthroughBasic**: An on-premises live encoder sends a multiple bitrate stream. The basic pass-through is limited to a peak ingress of 5 Mbps, 8-hour DVR window, and live transcription is not supported.
-* **LiveEventEncodingType.PassthroughStandard**: An on-premises live encoder sends a multiple bitrate stream. The standard pass-through has higher ingest limits, 25-hour DVR window, and support for live transcriptions.
-* **LiveEventEncodingType.Standard**: An on-premises live encoder sends a single bitrate stream to the live event and Media Services creates multiple bitrate streams. If the contribution feed is of 720p or higher resolution, the **Default720p** preset will encode a set of 6 resolution/bitrate pairs.
-* **LiveEventEncodingType.Premium1080p**: An on-premises live encoder sends a single bitrate stream to the live event and Media Services creates multiple bitrate streams. The Default1080p preset specifies the output set of resolution/bitrate pairs.
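-
-For reference, here's a minimal .NET sketch of setting the type at creation time. It assumes an authenticated `IAzureMediaServicesClient` named `client`, plus `resourceGroup` and `accountName` variables, and uses hypothetical resource names:
-
-```csharp
-LiveEvent liveEvent = new LiveEvent(
-    location: "West US 2",
-    input: new LiveEventInput(streamingProtocol: LiveEventInputProtocol.RTMP),
-    // One of: PassthroughBasic, PassthroughStandard, Standard, Premium1080p.
-    encoding: new LiveEventEncoding(encodingType: LiveEventEncodingType.Standard));
-
-liveEvent = await client.LiveEvents.CreateAsync(
-    resourceGroup, accountName, "myLiveEvent", liveEvent, autoStart: false);
-```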
-
-### Pass-through
-
-![pass-through live event with Media Services example diagram](./media/live-streaming/pass-through.svg)
-
-When using the basic or standard pass-through **live event**, you rely on your on-premises live encoder to generate a multiple bitrate video stream and send that as the contribution feed to the live event (using the RTMP or fragmented-MP4 protocol). The live event then carries the incoming video streams through without any further processing. Such a pass-through live event is optimized for long-running live events or 24x365 linear live streaming. When creating this type of live event, specify basic (LiveEventEncodingType.PassthroughBasic) or standard (LiveEventEncodingType.PassthroughStandard) pass-through.
-
-You can send the contribution feed at resolutions up to 4K and at a frame rate of 60 frames/second, with either H.264/AVC or H.265/HEVC (Smooth ingest only) video codecs, and AAC (AAC-LC, HE-AACv1, or HE-AACv2) audio codec. For more information, see [Live event types comparison](live-event-types-comparison-reference.md).
-
-> [!NOTE]
-> Using a pass-through method is the most economical way to do live streaming when you're doing multiple events over a long period of time, and you have already invested in on-premises encoders. See [Pricing](https://azure.microsoft.com/pricing/details/media-services/) details.
->
-
-See the .NET code example for creating a pass-through Live Event in [Live Event with DVR](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/4a436376e77bad57d6cbfdc02d7df6c615334574/Live/LiveEventWithDVR/Program.cs#L214).
-
-### Live encoding
-
-![live encoding with Media Services example diagram](./media/live-streaming/live-encoding.svg)
-
-When using live encoding with Media Services, you configure your on-premises live encoder to send a single bitrate video as the contribution feed to the live event (using the RTMP or fragmented-MP4 protocol). You then set up a live event so that it encodes that incoming single bitrate stream to a [multiple bitrate video stream](https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming), and makes the output available for delivery to playback devices via protocols like MPEG-DASH, HLS, and Smooth Streaming.
-
-When you use live encoding, you can send the contribution feed only at resolutions up to 1080p at a frame rate of 30 frames/second, with the H.264/AVC video codec and the AAC (AAC-LC, HE-AACv1, or HE-AACv2) audio codec. Note that pass-through live events can support resolutions up to 4K at 60 frames/second. For more information, see [Live event types comparison](live-event-types-comparison-reference.md).
-
-The resolutions and bitrates contained in the output from the live encoder are determined by the preset. If using a **Standard** live encoder (LiveEventEncodingType.Standard), then the *Default720p* preset specifies a set of six resolution/bitrate pairs, going from 720p at 3.5 Mbps down to 192p at 200 kbps. Otherwise, if using a **Premium1080p** live encoder (LiveEventEncodingType.Premium1080p), then the *Default1080p* preset specifies a set of six resolution/bitrate pairs, going from 1080p at 3.5 Mbps down to 180p at 200 kbps. For more information, see [System presets](live-event-types-comparison-reference.md#system-presets).
-
-> [!NOTE]
-> If you need to customize the live encoding preset, open a support ticket via Azure portal. Specify the desired table of resolution and bitrates. Verify that there's only one layer at 720p (if requesting a preset for a Standard live encoder) or at 1080p (if requesting a preset for a Premium1080p live encoder), and 6 layers at most.
-
-## Creating live events
-
-### Options
-
-When creating a live event, you can specify the following options:
-
-* You can give the live event a name and a description.
-* Cloud encoding includes Pass-through (no cloud encoding), Standard (up to 720p), or Premium (up to 1080p). For Standard and Premium encoding, you can choose the stretch mode of the encoded video.
- * None: Strictly respects the output resolution specified in the encoding preset without considering the pixel aspect ratio or display aspect ratio of the input video.
- * AutoSize: Overrides the output resolution and changes it to match the display aspect ratio of the input, without padding. For example, if the input is 1920x1080 and the encoding preset asks for 1280x1280, then the value in the preset is overridden, and the output will be at 1280x720, which maintains the input aspect ratio of 16:9.
- * AutoFit: Pads the output (with either letterbox or pillar box) to honor the output resolution, while ensuring that the active video region in the output has the same aspect ratio as the input. For example, if the input is 1920x1080 and the encoding preset asks for 1280x1280, then the output will be at 1280x1280, which contains an inner rectangle of 1280x720 at aspect ratio of 16:9, with pillar box regions 280 pixels wide at the left and right.
-* Streaming protocol (currently, the RTMP and Smooth Streaming protocols are supported). You can't change the protocol option while the live event or its associated live outputs are running. If you require different protocols, create a separate live event for each streaming protocol.
-* Input ID, which is a globally unique identifier for the live event input stream.
-* Static hostname prefix, which can be none (in which case a random 128-bit hex string will be used), the live event name, or a custom name. When you choose a custom name, this value is the Custom hostname prefix.
-* You can reduce end-to-end latency between the live broadcast and the playback by setting the input key frame interval, which is the duration (in seconds) of each media segment in the HLS output. The value should be a nonzero value in the range of 0.5 to 20 seconds. The value defaults to 2 seconds if neither the input nor the output key frame interval is set. The key frame interval is only allowed on pass-through events.
-* When creating the event, you can set it to autostart. When autostart is set to true, the live event will be started after creation. The billing starts as soon as the live event starts running. You must explicitly call Stop on the live event resource to halt further billing. Alternatively, you can start the event when you're ready to start streaming.
-
-> [!NOTE]
-> The max framerate is 30 fps for both Standard and Premium encoding.
-
-## StandBy mode
-
-When you create a live event, you can set it to StandBy mode. While the event is in StandBy mode, you can edit the description and the static hostname prefix, and restrict the input and preview access settings. StandBy mode is still a billable mode, but it's priced differently than when you start a live stream.
-
-For more information, see [Live event states and billing](live-event-states-billing-concept.md).
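-
-As a rough .NET sketch (with the same assumed `client`, `resourceGroup`, and `accountName` variables as elsewhere), you move a stopped live event into StandBy with the allocate action, then start it when you're ready to go live:
-
-```csharp
-// Stopped -> StandBy: resources are provisioned and billing begins.
-await client.LiveEvents.AllocateAsync(resourceGroup, accountName, "myLiveEvent");
-
-// StandBy -> Running: ingest and preview URLs become usable.
-await client.LiveEvents.StartAsync(resourceGroup, accountName, "myLiveEvent");
-```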
-
-* IP restrictions on the ingest and preview. You can define the IP addresses that are allowed to ingest a video to this live event. Allowed IP addresses can be specified as either a single IP address (for example '10.0.0.1'), an IP range using an IP address and a CIDR subnet mask (for example, '10.0.0.1/22'), or an IP range using an IP address and a dotted decimal subnet mask (for example, '10.0.0.1(255.255.252.0)').
-<br/><br/>
-If no IP addresses are specified and there's no rule definition, then no IP address will be allowed. To allow any IP address, create a rule and set 0.0.0.0/0.<br/>The IP addresses have to be in one of the following formats: IpV4 address with four numbers or CIDR address range.
-<br/><br/>
-If you want to enable certain IPs on your own firewalls or want to constrain inputs to your live events to Azure IP addresses, download a JSON file from [Azure Datacenter IP address ranges](https://www.microsoft.com/download/details.aspx?id=41653). For details about this file, select the **Details** section on the page.
-
-* When creating the event, you can choose to turn on live transcriptions. By default, live transcription is disabled. For more information about live transcription, read [Live transcription](live-event-live-transcription-how-to.md).
-
-### Naming rules
-
-* Max live event name is 32 characters.
-* The name should follow this [regex](/dotnet/standard/base-types/regular-expression-language-quick-reference) pattern: `^[a-zA-Z0-9]+(-*[a-zA-Z0-9])*$`.
-
-Also see [Streaming Endpoints naming conventions](stream-streaming-endpoint-concept.md#naming-convention).
-
-> [!TIP]
-> To guarantee uniqueness of your live event name, you can generate a GUID then remove all the hyphens and curly brackets (if any). The string will be unique across all live events and its length is guaranteed to be 32.
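-
-In .NET, for example, the `"N"` format string gives exactly that 32-character form:
-
-```csharp
-// A GUID without hyphens or braces: 32 hex characters, unique per call.
-string liveEventName = Guid.NewGuid().ToString("N");
-```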
-
-## Live event ingest URLs
-
-Once the live event is created, you can get ingest URLs that you'll provide to the live on-premises encoder. The live encoder uses these URLs to input a live stream. For more information, see [Recommended on-premises live encoders](encode-recommended-on-premises-live-encoders.md).
-
->[!NOTE]
-> As of the 2020-05-01 API release, "vanity" URLs are known as Static Host Names (useStaticHostname: true)
-
-> [!NOTE]
-> For an ingest URL to be static and predictable for use in a hardware encoder setup, set the **useStaticHostname** property to true and set the **accessToken** property to the same GUID on each creation.
-
-### Example LiveEvent and LiveEventInput configuration settings for a static (non-random) ingest RTMP URL
-
-```csharp
- LiveEvent liveEvent = new LiveEvent(
- location: mediaService.Location,
- description: "Sample LiveEvent from .NET SDK sample",
- // Set useStaticHostname to true to make the ingest and preview URL host name the same.
- // This can slow things down a bit.
- useStaticHostname: true,
-
- // 1) Set up the input settings for the Live event...
- input: new LiveEventInput(
- streamingProtocol: LiveEventInputProtocol.RTMP, // options are RTMP or Smooth Streaming ingest format.
- // This sets a static access token for use on the ingest path.
- // Combining this with useStaticHostname:true will give you the same ingest URL on every creation.
- // This is helpful when you only want to enter the URL into a single encoder one time for this Live Event name
- accessToken: "acf7b6ef-8a37-425f-b8fc-51c2d6a5a86a", // Use this value when you want to make sure the ingest URL is static and always the same. If omitted, the service will generate a random GUID value.
- accessControl: liveEventInputAccess, // controls the IP restriction for the source encoder.
- keyFrameIntervalDuration: "PT2S" // Set this to match the ingest encoder's settings
-        ),
-        // Additional LiveEvent settings (preview, encoding, and so on) are omitted
-        // in this snippet; see the linked .NET sample for the full configuration.
-    );
-```
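-
-The `liveEventInputAccess` value referenced above can be built as shown in this sketch (types are from `Microsoft.Azure.Management.Media.Models`; the allow-all range is for testing only and should be narrowed in production):
-
-```csharp
-LiveEventInputAccessControl liveEventInputAccess = new LiveEventInputAccessControl
-{
-    Ip = new IPAccessControl(
-        allow: new List<IPRange>
-        {
-            // 0.0.0.0/0 allows ingest from any source IP address.
-            new IPRange(name: "AllowAll", address: "0.0.0.0", subnetPrefixLength: 0)
-        })
-};
-```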
-
-* Non-static hostname
-
-    A non-static hostname is the default mode in Media Services v3 when creating a **LiveEvent**. You can get the live event allocated slightly more quickly, but the ingest URL that you need for your live encoding hardware or software will be randomized, and it will change if you stop and restart the live event. Non-static hostnames are only useful in scenarios where an end user wants to stream using an app that needs to get a live event very quickly and having a dynamic ingest URL isn't a problem.
-
- If a client app doesn't need to pre-generate an ingest URL before the live event is created, let Media Services auto-generate the Access Token for the live event.
-
-* Static hostname
-
-    Static hostname mode is preferred by most operators who wish to preconfigure their live encoding hardware or software with an RTMP ingest URL that never changes on creation or stop/start of a specific live event. These operators want a predictable RTMP ingest URL that doesn't change over time. This is also very useful when you need to push a static RTMP ingest URL into the configuration settings of a hardware encoding device like the BlackMagic Atem Mini Pro, or similar hardware encoding and production tools.
-
- > [!NOTE]
- > In the Azure portal, the static hostname URL is called "*Static hostname prefix*".
-
-    To specify this mode in the API, set `useStaticHostname` to `true` at creation time (default is `false`). When `useStaticHostname` is set to true, the `hostnamePrefix` specifies the first part of the hostname assigned to the live event preview and ingest endpoints. The final hostname is a combination of this prefix, the Media Services account name, and a short code for the Azure Media Services data center.
-
- To avoid a random token in the URL, you also need to pass your own access token (`LiveEventInput.accessToken`) at creation time. The access token has to be a valid GUID string (with or without the hyphens). Once the mode is set, it can't be updated.
-
-    The access token needs to be unique in your Azure region and Media Services account. If your app needs to use a static hostname ingest URL, it's recommended to always create a fresh GUID for use with a specific combination of region, Media Services account, and live event.
-
- Use the following APIs to enable the static hostname URL and set the access token to a valid GUID (for example, `"accessToken": "1fce2e4b-fb15-4718-8adc-68c6eb4c26a7"`).
-
- |Language|Enable static hostname URL|Set access token|
- ||||
- |REST|[properties.useStaticHostname](/rest/api/media/liveevents/create#liveevent)|[LiveEventInput.useStaticHostname](/rest/api/media/liveevents/create#liveeventinput)|
- |CLI|[--use-static-hostname](/cli/azure/ams/live-event#az-ams-live-event-create)|[--access-token](/cli/azure/ams/live-event#optional-parameters)|
- |.NET|[LiveEvent.useStaticHostname](/dotnet/api/microsoft.azure.management.media.models.liveevent.usestatichostname?view=azure-dotnet&preserve-view=true#Microsoft_Azure_Management_Media_Models_LiveEvent_UseStaticHostname)|[LiveEventInput.AccessToken](/dotnet/api/microsoft.azure.management.media.models.liveeventinput.accesstoken#Microsoft_Azure_Management_Media_Models_LiveEventInput_AccessToken)|
-
-### Live ingest URL naming rules
-
-* The *random* string below is a 128-bit hex number (which is composed of 32 characters of 0-9 a-f).
-* *your access token*: The valid GUID string you set when using the static hostname setting. For example, `"1fce2e4b-fb15-4718-8adc-68c6eb4c26a7"`.
-* *stream name*: Indicates the stream name for a specific connection. The stream name value is usually added by the live encoder you use. You can configure the live encoder to use any name to describe the connection, for example: "video1_audio1", "video2_audio1", "stream".
-
-#### Non-static hostname ingest URL
-
-##### RTMP
-
-`rtmp://<random 128bit hex string>.channel.media.azure.net:1935/live/<auto-generated access token>/<stream name>`<br/>
-`rtmp://<random 128bit hex string>.channel.media.azure.net:1936/live/<auto-generated access token>/<stream name>`<br/>
-`rtmps://<random 128bit hex string>.channel.media.azure.net:2935/live/<auto-generated access token>/<stream name>`<br/>
-`rtmps://<random 128bit hex string>.channel.media.azure.net:2936/live/<auto-generated access token>/<stream name>`<br/>
-
-##### Smooth streaming
-
-`http://<random 128bit hex string>.channel.media.azure.net/<auto-generated access token>/ingest.isml/streams(<stream name>)`<br/>
-`https://<random 128bit hex string>.channel.media.azure.net/<auto-generated access token>/ingest.isml/streams(<stream name>)`<br/>
-
-#### Static hostname ingest URL
-
-In the following paths, `<live-event-name>` means either the name given to the event or the custom name used in the creation of the live event.
-
-##### RTMP
-
-`rtmp://<live event name>-<ams account name>-<region abbrev name>.channel.media.azure.net:1935/live/<your access token>/<stream name>`<br/>
-`rtmp://<live event name>-<ams account name>-<region abbrev name>.channel.media.azure.net:1936/live/<your access token>/<stream name>`<br/>
-`rtmps://<live event name>-<ams account name>-<region abbrev name>.channel.media.azure.net:2935/live/<your access token>/<stream name>`<br/>
-`rtmps://<live event name>-<ams account name>-<region abbrev name>.channel.media.azure.net:2936/live/<your access token>/<stream name>`<br/>
-
-##### Smooth streaming
-
-`http://<live event name>-<ams account name>-<region abbrev name>.channel.media.azure.net/<your access token>/ingest.isml/streams(<stream name>)`<br/>
-`https://<live event name>-<ams account name>-<region abbrev name>.channel.media.azure.net/<your access token>/ingest.isml/streams(<stream name>)`<br/>
-
-## Live event preview URL
-
-Once the live event starts receiving the contribution feed, you can use its preview endpoint to preview and validate that you're receiving the live stream before further publishing. After you've checked that the preview stream is good, you can use the live event to make the live stream available for delivery through one or more (pre-created) Streaming Endpoints. To accomplish this, create a new [live output](/rest/api/media/liveoutputs) on the live event.
-
-> [!IMPORTANT]
-> Make sure that the video is flowing to the preview URL before continuing!
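-
-A minimal .NET sketch for reading both the ingest and preview URLs off a live event (assuming the usual hypothetical `client`, `resourceGroup`, and `accountName` variables):
-
-```csharp
-LiveEvent liveEvent = await client.LiveEvents.GetAsync(resourceGroup, accountName, "myLiveEvent");
-
-// Ingest URLs: one entry per protocol/port combination.
-foreach (LiveEventEndpoint endpoint in liveEvent.Input.Endpoints)
-    Console.WriteLine($"Ingest  {endpoint.Protocol}: {endpoint.Url}");
-
-// Preview URLs: use these to validate the feed before publishing.
-foreach (LiveEventEndpoint endpoint in liveEvent.Preview.Endpoints)
-    Console.WriteLine($"Preview {endpoint.Protocol}: {endpoint.Url}");
-```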
-
-## Live event long-running operations
-
-For details, see [long-running operations](media-services-apis-overview.md#long-running-operations).
-
-## Live outputs
-
-Once you have the stream flowing into the live event, you can begin the streaming event by creating an [Asset](/rest/api/media/assets), [live output](/rest/api/media/liveoutputs), and [Streaming Locator](/rest/api/media/streaminglocators). Live output will archive the stream and make it available to viewers through the [Streaming Endpoint](/rest/api/media/streamingendpoints).
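-
-A compressed .NET sketch of that flow, with hypothetical names throughout (and the same assumed `client`, `resourceGroup`, and `accountName`):
-
-```csharp
-// 1) An asset to archive the stream into.
-Asset asset = await client.Assets.CreateOrUpdateAsync(
-    resourceGroup, accountName, "myLiveArchiveAsset", new Asset());
-
-// 2) A live output that records the live event into the asset (8-hour DVR window here).
-await client.LiveOutputs.CreateAsync(
-    resourceGroup, accountName, "myLiveEvent", "myLiveOutput",
-    new LiveOutput(assetName: asset.Name, archiveWindowLength: TimeSpan.FromHours(8)));
-
-// 3) A streaming locator so the streaming endpoint can deliver the stream.
-await client.StreamingLocators.CreateAsync(
-    resourceGroup, accountName, "myStreamingLocator",
-    new StreamingLocator(assetName: asset.Name, streamingPolicyName: "Predefined_ClearStreamingOnly"));
-```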
-
-AMS's default allocation is 5 live events per Media Services account. If you would like to increase this limit, please file a support ticket in the Azure portal. AMS is able to increase your live event limit depending on your streaming situation and regional datacenter availability.
-
-For detailed information about live outputs, see [Using a cloud DVR](live-event-cloud-dvr-time-how-to.md).
-
-## Live event output questions
-
-See the [live event questions in the FAQ](frequently-asked-questions.yml). For information on live event quotas, see [quotas and limits](limits-quotas-constraints-reference.md).
media-services Live Event States Billing Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-states-billing-concept.md
- Title: Live event states and billing in Azure Media Services
-description: This topic gives an overview of Azure Media Services live event states and billing.
- Previously updated: 10/26/2020
-# Live event states and billing
-
-In Azure Media Services, a live event begins billing as soon as its state transitions to **Running** or **StandBy**. You will be billed even if there is no video flowing through the service. To stop the live event from billing, you have to stop the live event. Live Transcription is billed the same way as the live event.
-
-When **LiveEventEncodingType** on your [live event](/rest/api/media/liveevents) is set to Standard or Premium1080p, Media Services automatically shuts off any live event that is still in the **Running** state 12 hours after the input feed is lost, if there are no **live outputs** running. However, you will still be billed for the time the live event was in the **Running** state.
-
-> [!NOTE]
-> Pass-through (basic or standard) live events are NOT automatically shut off and must be explicitly stopped through the API to avoid excessive billing.
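-
-As a minimal .NET sketch (hypothetical `client`, `resourceGroup`, and `accountName` variables; the flattened `removeOutputsOnStop` parameter shape can vary by SDK version):
-
-```csharp
-// Stopping the live event halts billing. removeOutputsOnStop also deletes any
-// live outputs; the archived assets themselves are kept.
-await client.LiveEvents.StopAsync(
-    resourceGroup, accountName, "myLiveEvent", removeOutputsOnStop: true);
-```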
-
-## States
-
-The live event can be in one of the following states.
-
-|State|Description|
-|||
-|**Stopped**| This is the initial state of the live event after creation (unless autostart was set to true.) No billing occurs in this state. No input can be received by the live event. |
-|**Starting**| The live event is starting and resources getting allocated. No billing occurs in this state. If an error occurs, the live event returns to the Stopped state.|
-| **Allocating** | The allocate action was called on the live event and resources are being provisioned for this live event. Once this operation completes successfully, the live event will transition to the StandBy state.|
-|**StandBy**| Live event resources have been provisioned and the live event is ready to start. Billing occurs in this state. Most properties can still be updated; however, ingest and streaming aren't allowed in this state.|
-|**Running**| The live event resources have been allocated, ingest and preview URLs have been generated, and it is capable of receiving live streams. At this point, billing is active. You must explicitly call Stop on the live event resource to halt further billing.|
-|**Stopping**| The live event is being stopped and resources are being de-provisioned. No billing occurs in this transient state. |
-|**Deleting**| The live event is being deleted. No billing occurs in this transient state. |
-
-You can choose to enable live transcriptions when you create the live event. If you do so, you will be billed for Live Transcriptions whenever the live event is in the **Running** state. Note that you will be billed even if there is no audio flowing through the live event.
-
-## Next steps
-- [Live streaming overview](stream-live-streaming-concept.md)
-- [Live streaming tutorial](stream-live-tutorial-with-api.md)
media-services Live Event Streaming Best Practices Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-streaming-best-practices-guide.md
- Title: Media Services live streaming best practices guide
-description: This article describes best practices for achieving low-latency live streams with Azure Media Services.
- Previously updated: 02/14/2022
-# Media Services live streaming best practices guide
-
-Customers often ask how they can reduce the latency of their live stream. There are many factors that
-determine the end-to-end latency of a stream. Here are some that you should consider:
-
-1. Delays on the contribution encoder side. When customers use encoding
-   software such as OBS Studio, Wirecast, or others to send an RTMP live
-   stream to Media Services, the settings in that software are critical
-   in determining the end-to-end latency of a live stream.
-
-2. Delays in the live streaming pipeline within Azure Media Services.
-
-3. CDN performance
-
-4. Buffering algorithms of the video player and network conditions on
- the client side
-
-5. Timing of provisioning
-
-## Contribution encoder
-
-As a customer, you are in control of the settings of the source encoder
-settings before the RTMP stream reaches Media Services. Here are some
-recommendations for the settings that would give you the lowest possible
-latency:
-
-1. **Pick the region physically closest to your contribution encoder
-   for your Media Services account.** This ensures that you have a good
-   network connection to the Media Services account.
-
-2. **Use a consistent fragment size.** We recommend a GOP size of 2
- seconds. The default on some encoders, such as OBS, is 8 seconds.
- Make sure that you change this setting.
-
-3. **Use the GPU encoder if your encoding software allows you to do
-   that.** This allows you to offload CPU work to the GPU.
-
-4. **Use an encoding profile that is optimized for low latency.** For
-   example, with OBS Studio, if you use the Nvidia H.264 encoder, you
-   may see the "zero latency" preset.
-
-5. **Send content that is no higher in resolution than what you plan to
-   stream.** For example, if you're using 720p standard encoding live
-   events, send a stream that is already at 720p.
-
-6. **Keep your framerate at 30 fps or lower unless using pass-through
-   live events.** While we support 60 fps input for live events, the
-   encoded live event output doesn't exceed 30 fps.
-
-## Configuration of the Azure Media Services live event
-
-Here are some configurations that will help you reduce the latency in
-our pipeline:
-
-1. **Use the 'LowLatency' StreamOption on the live event.** (See the sketch after this list.)
-
-2. **We recommend that you choose CMAF output for both HLS and DASH
- playback.** This allows you to share the same fragments for both
- formats. It increases your cache hit ratio when CDN is used. For example:
-
-
-| Type | Format | URL example |
-||||
-|HLS CMAF (recommended) | format=m3u8-cmaf | `https://amsv3account-usw22.streaming.media.azure.net/21b17732-0112-4d76-b526-763dcd843449/ignite.ism/manifest(format=m3u8-cmaf)` |
-| MPEG-DASH CMAF (recommended) | format=mpd-time-cmaf | `https://amsv3account-usw22.streaming.media.azure.net/21b17732-0112-4d76-b526-763dcd843449/ignite.ism/manifest(format=mpd-time-cmaf)` |
-
-3. **If you must choose TS output, use an HLS packing ratio of 1.** This
-allows us to pack only one fragment into one HLS segment. However, you won't
-get the full benefits of LL-HLS in native Apple players.
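-
-For item 1, the LowLatency stream option is set at creation time. A minimal .NET sketch, with the usual assumed `client`, `resourceGroup`, and `accountName` variables and hypothetical names:
-
-```csharp
-LiveEvent liveEvent = new LiveEvent(
-    location: "West US 2",
-    input: new LiveEventInput(streamingProtocol: LiveEventInputProtocol.RTMP),
-    streamOptions: new List<StreamOptionsFlag?> { StreamOptionsFlag.LowLatency });
-
-liveEvent = await client.LiveEvents.CreateAsync(
-    resourceGroup, accountName, "myLowLatencyEvent", liveEvent, autoStart: false);
-```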
-
-## Player optimizations
-
-**When choosing and configuring a video player, make sure you use settings that are optimized for lower latency.**
-
-Media Services supports different streaming output formats: DASH,
-HLS with TS output, and HLS with CMAF fragments. Depending on the
-player's implementation, buffering decisions impact the latency a
-viewer observes. Poor network conditions, or default algorithms that
-favor quality and stability of playback, can cause players to
-buffer more content upfront to prevent interruptions during playback.
-These buffers before and during the playback session add to the
-end-to-end latency.
-
-When Azure Media Player is used, the *Low Latency Heuristics* profile
-optimizes the player to have the lowest possible latency on the player
-side.
-
-## CDN choice
-
-Streaming endpoints are the origin servers that deliver the live and VOD
-streaming content to the CDN or to the customer directly. If a live
-event expects a large audience, or the audience is geographically
-located far away from the streaming endpoint (origin) serving the
-content, it's *important* for the customer to shield the origin using a
-Content Delivery Network (CDN).
-
-We recommend using Azure CDN, which is provided by Verizon (Standard or
-Premium). We've optimized the integration experience so that a
-customer can configure this CDN with a single selection in the Azure portal. Be sure to turn on Origin Shield and Streaming Optimizations for
-your CDN endpoint whenever you start your streaming endpoint.
-
-Our customers also have good experiences bringing their own CDN. Ensure that measures are taken on the CDN to shield the origin from
-excessive traffic.
-
-## Streaming endpoint scaling
-
-> [!NOTE]
-> A **standard streaming endpoint/origin** is a *shared* resource
-that allows customers with low traffic volumes to stream content at
-a lower cost. You would **not** use a standard streaming endpoint to
-scale streaming units if you expect large traffic volumes or you plan to
-use a CDN.
-
-A **premium streaming endpoint/origin** offers more flexibility and
-isolation for customers to scale by adding or removing *dedicated*
-streaming units. A *streaming unit* is a compute resource allocated to a
-streaming endpoint. Each streaming unit can stream approximately 200
-Mbps of traffic.
-
-While you can concurrently stream many live events at once using
-the same streaming endpoint, the default maximum number of streaming units
-for one streaming endpoint is 10. You can open a support ticket to
-request more than the default 10.
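-
-Scaling is a single API call. A sketch with the usual assumed `client`, `resourceGroup`, and `accountName` variables (the exact parameter shape can vary by SDK version):
-
-```csharp
-// Scale a premium streaming endpoint to 2 streaming units (~400 Mbps of egress).
-await client.StreamingEndpoints.ScaleAsync(
-    resourceGroup, accountName, "premiumendpoint", scaleUnit: 2);
-```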
-
-## Determine the premium streaming units needed
-
-There are two steps to determine the number of streaming endpoints and
-streaming units needed:
-
-1. Determine the total egress needed.
-
-2. Divide the total egress by 200, which is the maximum Mbps each streaming unit can stream.
-
-### Determine the total egress needed
-
-Determine the total egress needed by using the following formula.
-
-*Total egress needed = average bandwidth x number of concurrent viewers x percent handled by the streaming endpoint.*
-
-Let's take a look at each of the multipliers in turn.
-
-**Average bandwidth.** What is the *average* bitrate you plan to stream?
-In other words, if you're going to have multiple bitrates available,
-what bitrate is the average of all the bitrates you're planning for?
-You can estimate this using one of the following methods:
-
-For a live event that *includes encoding*:
-
-  - If you don't know what your *average* bandwidth is going to be, you
-    could use our top bitrates as an estimate. Our *top* bitrate is
-    5.5 Mbps for 1080p encoded live events, so your average bitrate is
-    going to be somewhere around 3.5 Mbps.
-
-  - Look at the encoding preset used for encoding the live event, for
-    example, the AdaptiveStreaming(H.264) preset. See this [output
-    example](encode-autogen-bitrate-ladder.md#output).
-
-For a live event that is simply using pass-through and not encoding:
-
- - Check the encoding bitrate ladder used by your local encoder.
-
-**Number of concurrent viewers.** How many concurrent viewers are
-expected? This could be hard to estimate, but do your best based on your
-customer data. Are you streaming a conference to a global audience? Are
-you planning to live stream to sell a set of products to your customers?
-
-**Percent of traffic handled by the streaming endpoint.** This
-can also be expressed as "the percent of traffic NOT handled by the CDN"
-since that is the number that actually goes into the formula. So, with
-that in mind, what is the CDN offload you expect? If the CDN is expected
-to handle 90% of the live traffic, then only 10% of the traffic would be
-expected on the streaming endpoint. The number used in the formula is
-.10, which is the percentage of traffic expected on the streaming
-endpoint.
-
-### Determine the number of premium streaming units needed
-
-*Premium streaming units needed = average bandwidth x number of viewers x percentage of traffic not handled by the CDN / 200 Mbps*
-
-### Example
-
-You've recently released a new product and want to present it to your
-established customers. You want low latency because you don't want to
-frustrate your already busy audience, so you'll use premium streaming
-endpoints and a CDN.
-
-You have approximately 100,000 customers, but they probably aren't all
-going to watch your live event. You guess that in the best case, only 1%
-of them will attend, which brings your expected concurrent viewers to
-1,000.
-
-*Number of concurrent viewers = 1,000*
-
-You've decided that you're going to use Media Services to encode your
-live stream and won't be using pass-through. You don't know what the
-average bandwidth is going to be, but you do know that you'll deliver
-in 1080p (*top* bitrate of 5.5 Mbps), so your *average* bandwidth is
-estimated to be 3.5 Mbps for your calculations.
-
-*Average bandwidth = 3.5 Mbps*
-
-Since your audience is dispersed worldwide, you expect that the CDN will
-handle most (90%) of the live traffic. Therefore, the premium streaming
-endpoints will only handle 10% of the traffic.
-
-*Percent handled by the streaming endpoint = 10% = 0.1*
-
-Using the formula provided above:
-
-*Total egress needed = average bandwidth x number of concurrent viewers x percent handled by the streaming endpoint.*
-
-*total egress needed* = 3.5 x 1,000 x 0.1
-
-*total egress needed* = 350 Mbps
-
-Dividing the total egress by 200, you determine that you need 1.75
-premium streaming units.
-
-*premium streaming units needed* = *total egress needed* / 200 Mbps
-
-*premium streaming units needed* = 1.75
-
-We'll round up this number to 2, giving us 2 units needed.
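-
-The same arithmetic as a small .NET sketch, using the example's assumed inputs:
-
-```csharp
-double averageBandwidthMbps = 3.5;
-int concurrentViewers = 1000;
-double fractionOnStreamingEndpoint = 0.1; // CDN offloads 90%
-
-// Total egress needed: 3.5 x 1,000 x 0.1 = 350 Mbps.
-double totalEgressMbps = averageBandwidthMbps * concurrentViewers * fractionOnStreamingEndpoint;
-
-// Each premium streaming unit handles ~200 Mbps; round up to get 2 units.
-int streamingUnits = (int)Math.Ceiling(totalEgressMbps / 200.0);
-```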
-
-### Use the portal to estimate your needs
-
-The Azure portal can help you simplify the calculations. On the
-streaming page, you can use the calculator provided to see the estimated
-audience reach when you change the average bandwidth, CDN hit ratio and
-number of streaming units.
-
-1. From the media services account page, select **Streaming endpoints** from
-   the menu.
-
-2. Add a new streaming endpoint by selecting **Add streaming endpoint**.
-
-3. Give the streaming endpoint a name.
-
-4. Select **Premium streaming endpoint** for the streaming endpoint type.
-
-5. Since you're just getting an estimate at this point, don't start
-   the streaming endpoint after creation. Select **No**.
-
-6. Select *Standard Verizon* or *Premium Verizon* for your CDN pricing
- tier. The profile name will change accordingly. Leave the name as it
- is for this exercise.
-
-7. For the CDN profile, select **Create New**.
-
-8. Select **Create**. Once the endpoint has been deployed, the streaming
- endpoints screen will appear.
-
-9. Select the streaming endpoint you just created. The streaming
- endpoint screen will appear with audience reach estimates.
-
-10. The default setting for the streaming endpoint with 1 streaming unit
-    shows that it's estimated to stream to 571 concurrent viewers at
-    3.5 Mbps, with the CDN handling 90% of the traffic and the streaming endpoint 10%.
-
-11. Change the percentage of the **Egress source** from 90% (CDN cache)
-    to 0%. The calculator will estimate that you'll be able to stream
-    to 57 concurrent viewers at 3.5 Mbps with 200 Mbps of egress, **without** a CDN.
-
-12. Now change the **Egress source** back to 90%.
-
-13. Then, change the **streaming units** to 2. The calculator will estimate
-    that you'll be able to stream to 1,143 concurrent viewers at
-    3.5 Mbps with 4,000 Mbps of effective egress, with the CDN handling 90% of the traffic.
-
-14. Select **Save**.
-
-15. You can start the streaming endpoint and try sending traffic to it.
- The metrics at the bottom of the screen will track actual traffic.
-
-## Timing
-
-You may want to provision streaming units 1 hour ahead of the expected
-peak usage to ensure they're ready when the event starts.
media-services Live Event Types Comparison Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-types-comparison-reference.md
- Title: Azure Media Services LiveEvent types
-description: In Azure Media Services, a live event can be set to either a *pass-through* or *live encoding*. This article shows a detailed table that compares Live Event types.
- Previously updated: 02/17/2022
-# Live Event types comparison
-
-In Azure Media Services, a [Live Event](/rest/api/media/liveevents) can be set to either a *pass-through* (an on-premises live encoder sends a multiple bitrate stream) or *live encoding* (an on-premises live encoder sends a single bitrate stream).
-
-This article compares features of the live event types.
-
-## Types comparison
-
-The following table compares features of the Live Event types. The types are set during creation using [LiveEventEncodingType](/rest/api/media/liveevents/create#liveeventencodingtype):
-
-* **LiveEventEncodingType.PassthroughBasic**: An on-premises live encoder sends a multiple bitrate stream. The basic pass-through is limited to a peak ingress of 5Mbps, up to 8-hour DVR window, and live transcription is not supported.
-* **LiveEventEncodingType.PassthroughStandard**: An on-premises live encoder sends a multiple bitrate stream. The standard pass-through has higher ingest limits, up to 25-hour DVR window, and support for live transcriptions.
-* **LiveEventEncodingType.Standard** - An on-premises live encoder sends a single bitrate stream to the Live Event and Media Services creates multiple bitrate streams. If the contribution feed is of 720p or higher resolution, the **Default720p** preset will encode a set of 6 resolution/bitrate pairs (details follow later in the article).
-* **LiveEventEncodingType.Premium1080p** - An on-premises live encoder sends a single bitrate stream to the Live Event and Media Services creates multiple bitrate streams. The Default1080p preset specifies the output set of resolution/bitrate pairs (details follow later in the article).
-
-| Feature | Basic pass-through | Standard pass-through | Standard 720P or Premium 1080P Encoding Event |
-| | -- | -- | |
-| Single bitrate input is transcoded into multiple bitrates in the cloud | No | No | Yes |
-| Maximum video resolution for contribution feed | 4K (4096x2160 at 60 frames/sec) | 4K (4096x2160 at 60 frames/sec) | 1080p (1920x1088 at 30 frames/sec) |
-| Recommended maximum layers in contribution feed (within ingest bandwidth limits) | Limited to maximum aggregate bandwidth of 5 Mbps | Limited to maximum aggregate bandwidth of 60 Mbps | 1 video track and 1 audio track (any additional tracks are silently dropped) |
-| Maximum layers in output | Same as input | Same as input | Up to 6 (see System Presets below) |
-| Maximum aggregate bandwidth of contribution feed | Supports combined input up to 5 Mbps, individual bitrates not to exceed 4 Mbps. No video frame rate restriction. | Supports combined input up to 60 Mbps, individual bitrates not to exceed 20Mbps. No video frame rate restriction. | Supports single bitrate input. Individual input bandwidth cannot exceed 20Mbps. Video frame rate cannot exceed 60 frames/second. |
-| Maximum DVR (time shift) window duration allowed | up to 8 hours | up to 25 hours | up to 25 hours |
-| Maximum number of live outputs allowed | only 1 live output | up to 3 live outputs | up to 3 live outputs |
-| Maximum bitrate for a single layer in the contribution | Up to 4 Mbps | 20 Mbps | 20 Mbps |
-| Support for multiple language audio tracks | Yes | Yes | No |
-| Supported input video codecs | H.264/AVC (RTMP and Smooth), or H.265/HEVC (Smooth Streaming ingest only) | H.264/AVC (RTMP and Smooth), or H.265/HEVC (Smooth Streaming ingest only) | H.264/AVC (RTMP and Smooth Streaming ingest) |
-| Supported output video codecs | Same as input | Same as input | H.264/AVC |
-| Supported video bit depth, input, and output | Up to 10-bit including HDR 10/HLG | Up to 10-bit including HDR 10/HLG | 8-bit |
-| Supported input audio codecs | AAC-LC, HE-AAC v1, HE-AAC v2 | AAC-LC, HE-AAC v1, HE-AAC v2 | AAC-LC, HE-AAC v1, HE-AAC v2 |
-| Supported output audio codecs | Same as input | Same as input | AAC-LC |
-| Maximum video resolution of output video | Same as input | Same as input | Standard - 720p, Premium1080p - 1080p |
-| Maximum frame rate of input video | 60 frames/second | 60 frames/second | Standard or Premium1080p - 60 frames/second - transcoded output will be reduced to 23.98, 24, 25, 29.97, or 30 fps only depending on the source frame rate. |
-| Input protocols | RTMP, fragmented-MP4 (Smooth Streaming) | RTMP, fragmented-MP4 (Smooth Streaming) | RTMP, fragmented-MP4 (Smooth Streaming) |
-| Price | See the [pricing page](https://azure.microsoft.com/pricing/details/media-services/) and click on "Live Video" tab | See the [pricing page](https://azure.microsoft.com/pricing/details/media-services/) and click on "Live Video" tab | See the [pricing page](https://azure.microsoft.com/pricing/details/media-services/) and click on "Live Video" tab |
-| Maximum run time | 24 hrs x 365 days, live linear | 24 hrs x 365 days, live linear | 24 hrs x 365 days, live linear (preview) |
-| Ability to pass through embedded CEA 608/708 captions data | Yes | Yes | Yes |
-| Live transcription support | No. Live transcriptions are not supported for basic pass-through. | Yes | Yes |
-| Support for ad signaling via SCTE-35 in-band messages | Yes | Yes | Yes |
-| Support for non-uniform input GOPs | Yes | Yes | Yes |
-| Auto-shutoff of Live Event when input feed is lost | No | No | After 12 hours, if there is no LiveOutput running |
-
-## System presets
-
-The resolutions and bitrates contained in the output from the live encoder are determined by the [presetName](/rest/api/media/liveevents/create#liveeventencoding). If using a **Standard** live encoder (LiveEventEncodingType.Standard), then the *Default720p* preset specifies a set of 6 resolution/bitrate pairs described below. Otherwise, if using a **Premium1080p** live encoder (LiveEventEncodingType.Premium1080p), then the *Default1080p* preset specifies the output set of resolution/bitrate pairs.
-
-> [!NOTE]
-> You cannot apply the Default1080p preset to a Live Event if it has been set up for Standard live encoding - you will get an error. You will also get an error if you try to apply the Default720p preset to a Premium1080p live encoder.
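-
-In the .NET model, the pairing looks like this sketch (the preset names are the documented `Default720p`/`Default1080p` strings):
-
-```csharp
-// Premium1080p encoder with its matching preset; use Default720p with Standard.
-var encoding = new LiveEventEncoding(
-    encodingType: LiveEventEncodingType.Premium1080p,
-    presetName: "Default1080p");
-```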
-
-### Output Video Streams for Default720p
-
-If the contribution feed is of 720p or higher resolution, the **Default720p** preset will encode the feed into the following 6 layers. In the table below, Bitrate is in kbps, MaxFPS represents the maximum allowed frame rate (in frames/second), and Profile represents the H.264 profile used.
-
-If the source frame rate on input is >30 fps, the frame rate will be reduced to half of the input frame rate. For example, 60 fps would be reduced to 30 fps, 50 fps to 25 fps, and so on.
-
-| Bitrate | Width | Height | MaxFPS | Profile |
-| - | -- | | | - |
-| 3500 | 1280 | 720 | 30 | High |
-| 2200 | 960 | 540 | 30 | High |
-| 1350 | 704 | 396 | 30 | High |
-| 850 | 512 | 288 | 30 | High |
-| 550 | 384 | 216 | 30 | High |
-| 200 | 340 | 192 | 30 | High |
-
-> [!NOTE]
-> If you need to customize the live encoding preset, please open a support ticket via Azure Portal. You should specify the desired table of video resolution and bitrates. Customization of the audio encoding bitrate is not supported. Do verify that there is only one layer at 720p, and at most 6 layers. Also do specify that you are requesting a preset.
-
-### Output Video Streams for Default1080p
-
-If the contribution feed is of 1080p resolution, the **Default1080p** preset will encode the feed into the following 6 layers.
-
-If the source frame rate on input is >30 fps, the frame rate will be reduced to half of the input frame rate. For example, 60 fps would be reduced to 30 fps, 50 fps to 25 fps, and so on.
-
-| Bitrate | Width | Height | MaxFPS | Profile |
-| - | -- | | | - |
-| 5500 | 1920 | 1080 | 30 | High |
-| 3000 | 1280 | 720 | 30 | High |
-| 1600 | 960 | 540 | 30 | High |
-| 800 | 640 | 360 | 30 | High |
-| 400 | 480 | 270 | 30 | High |
-| 200 | 320 | 180 | 30 | High |
-
-> [!NOTE]
-> If you need to customize the live encoding preset, please open a support ticket via Azure Portal. You should specify the desired table of resolution and bitrates. Verify that there is only one layer at 1080p, and at most 6 layers. Also, specify that you are requesting a preset for a Premium1080p live encoder. The specific values of the bitrates and resolutions may be adjusted over time.
-
-### Output Audio Stream for Default720p and Default1080p
-
-For both *Default720p* and *Default1080p* presets, audio is encoded to stereo AAC-LC at 128 kbps. The sampling rate follows that of the audio track in the contribution feed.
-
-> [!NOTE]
-> If the sampling rate is low, such as 8 kHz, the encoded output will be lower than 128 kbps.
-
-## Implicit properties of the live encoder
-
-The previous section describes the properties of the live encoder that can be controlled explicitly, via the preset - such as the number of layers, resolutions, and bitrates. This section clarifies the implicit properties.
-
-### Group of pictures (GOP) duration
-
-The live encoder follows the [GOP](https://en.wikipedia.org/wiki/Group_of_pictures) structure of the contribution feed, which means the output layers will have the same GOP duration. Hence, it is recommended that you configure the on-premises encoder to produce a contribution feed that has a fixed GOP duration (typically 2 seconds). This will ensure that the outgoing HLS and MPEG-DASH streams from the service also have fixed GOP durations. Small variations in GOP durations are likely to be tolerated by most devices.
-
-### Frame rate limits
-
-The live encoder also follows the durations of the individual video frames in the contribution feed, which means the output layers will have frames with the same durations. Hence, it is recommended that you configure the on-premises encoder to produce a contribution feed that has a fixed frame rate (at most 30 frames/second). This will ensure that the outgoing HLS and MPEG-DASH streams from the service also have fixed frame durations. Small variations in frame rates may be tolerated by most devices, but there is no guarantee that the live encoder will produce an output that will play correctly. Your on-premises live encoder should not drop frames (for example, under low battery conditions) or vary the frame rate in any way.
-
-If the source frame rate on input is >30 fps, the frame rate is reduced to half of the input frame rate. For example, 60 fps is reduced to 30 fps, 50 fps to 25 fps, and so on.
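-
-To illustrate both recommendations, here's a minimal sketch of a contribution encoder configured for a fixed 2-second GOP at a constant 30 fps. It assumes an ffmpeg-based encoder; the synthetic test sources and the RTMP ingest URL are placeholders, not values from this article:
-
-```bash
-# Synthetic video and audio test sources stand in for a real camera feed.
-# -r 30 pins the frame rate; -g 60 (30 fps x 2 s) with -keyint_min 60 and
-# -sc_threshold 0 keeps keyframes evenly spaced for a fixed 2-second GOP.
-ffmpeg \
-  -f lavfi -i testsrc=size=1280x720:rate=30 \
-  -f lavfi -i sine=frequency=440:sample_rate=48000 \
-  -c:v libx264 -profile:v high -b:v 3000k \
-  -r 30 -g 60 -keyint_min 60 -sc_threshold 0 \
-  -c:a aac -b:a 128k \
-  -f flv "rtmp://<your-ingest-url>/live/stream"
-```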
-
-### Resolution of contribution feed and output layers
-
-The live encoder is configured to avoid up-converting the contribution feed. As a result, the maximum resolution of the output layers won't exceed that of the contribution feed.
-
-For example, if you send a contribution feed at 720p to a Live Event configured for Default1080p live encoding, the output will only have 5 layers, starting with 720p at 3 Mbps and going down to 180p at 200 kbps. Or, if you send a contribution feed at 360p into a Live Event configured for Standard live encoding, the output will contain 3 layers (at resolutions of 288p, 216p, and 192p). In the degenerate case, if you send a contribution feed of, say, 160x90 pixels to a Standard live encoder, the output will contain one layer at 160x90 resolution at the same bitrate as that of the contribution feed.
-
-### Bitrate of contribution feed and output layers
-
-The live encoder is configured to honor the bitrate settings in the preset, irrespective of the bitrate of the contribution feed. As a result, the bitrate of the output layers may exceed that of the contribution feed. For example, if you send in a contribution feed at a resolution of 720p at 1 Mbps, the output layers will remain the same as in the [table](live-event-types-comparison-reference.md#output-video-streams-for-default720p) above.
--
-## Next steps
-
-[Live streaming overview](stream-live-streaming-concept.md)
media-services Live Event Wirecast Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/live-event-wirecast-quickstart.md
- Title: Create an Azure Media Services live stream
-description: Learn how to create an Azure Media Services live stream by using the portal and Wirecast
-Previously updated: 08/31/2020
-# Create an Azure Media Services live stream
--
-This quickstart will help you create an Azure Media Services live stream by using the Azure portal and Telestream Wirecast. It assumes that you have an Azure subscription and have created a Media Services account.
-
-If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/) before you begin.
-
-## Sign in to the Azure portal
-
-Open your web browser, and go to the [Microsoft Azure portal](https://portal.azure.com/). Enter your credentials to sign in to the portal. The default view is your service dashboard.
-
-In this quickstart, we'll cover:
-
-- Setting up an on-premises encoder with a free trial of Telestream Wirecast.
-- Setting up a live stream.
-- Setting up live stream outputs.
-- Running a default streaming endpoint.
-- Using Azure Media Player to view the live stream and on-demand output.
-
-To keep things simple, we'll use an encoding preset for Azure Media Services in Wirecast, pass-through cloud encoding, and RTMP.
-
-## Set up an on-premises encoder by using Wirecast
-
-1. Download and install Wirecast for your operating system on the [Telestream website](https://www.telestream.net).
-1. Start the application and use your favorite email address to register the product. Keep the application open.
-1. In the email that you receive, verify your email address. Then the application will start the free trial.
-1. Recommended: Watch the video tutorial in the opening application screen.
-
-## Set up an Azure Media Services live stream
-
-1. Go to the Azure Media Services account within the portal, and then select **Live streaming** from the **Media Services** listing.
-
- ![Live streaming link](media/live-events-wirecast-quickstart/select-live-streaming.png)
-1. Select **Add live event** to create a new live streaming event.
-
- ![Add live event icon](media/live-events-wirecast-quickstart/add-live-event.png)
-1. Enter a name for your new event, such as *TestLiveEvent*, in the **Live event name** box.
-
- ![Live event name box](media/live-events-wirecast-quickstart/live-event-name.png)
-1. Enter an optional description of the event in the **Description** box.
-1. Select the **Pass-through – no cloud encoding** option.
-
- ![Cloud encoding option](media/live-events-wirecast-quickstart/cloud-encoding.png)
-1. Select the **RTMP** option.
-1. Make sure that the **No** option is selected for **Start live event**, to avoid being billed for the live event before it's ready. (Billing will begin when the live event is started.)
-
- ![Start live event option](media/live-events-wirecast-quickstart/start-live-event-no.png)
-1. Select the **Review + create** button to review the settings.
-1. Select the **Create** button to create the live event. You're then returned to the live event listing.
-1. Select the link to the live event that you just created. Notice that your event is stopped.
-1. Keep this page open in your browser. We'll come back to it later.
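-
-If you prefer scripting, a roughly equivalent live event can be created with the Azure CLI. This is a sketch only; the resource group and account names (amsResourceGroup, amsaccount) and the permissive `--ips AllowAll` preview setting are assumptions, not values from this quickstart:
-
-```azurecli
-# Create a pass-through (no cloud encoding) live event that ingests RTMP.
-# The event isn't auto-started, so billing doesn't begin yet.
-az ams live-event create \
-  --resource-group amsResourceGroup \
-  --account-name amsaccount \
-  --name TestLiveEvent \
-  --streaming-protocol RTMP \
-  --ips AllowAll
-```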
-
-## Set up a live stream by using Wirecast Studio
-
-1. In the Wirecast application, select **Create Empty Document** from the main menu, and then select **Continue**.
-
- ![Wirecast start screen](media/live-events-wirecast-quickstart/open-empty-document.png)
-1. Hover over the first layer in the **Wirecast layers** area. Select the **Add** icon that appears, and select the video input that you want to stream.
-
- ![Wirecast add icon](media/live-events-wirecast-quickstart/add-icon.png)
-
- The **Master Layer 1** dialog box opens.
-1. Select **Video Capture** from the menu, and then select the camera that you want to use.
-
- ![Preview area for video capture](media/live-events-wirecast-quickstart/video-shot-selection.png)
-
- The view from the camera appears in the preview area.
-1. Hover over the second layer in the **Wirecast layers** area. Select the **Add** icon that appears, and select the audio input that you want to stream. The **Master Layer 2** dialog box opens.
-1. Select **Audio capture** from the menu, and then select the audio input that you want to use.
-
- ![Inputs for audio capture](media/live-events-wirecast-quickstart/audio-shot-select.png)
-1. From the main menu, select **Output settings**. The **Select an Output Destination** dialog box appears.
-1. Select **Azure Media Services** from the **Destination** drop-down list. The output setting for Azure Media Services automatically populates *most* of the output settings.
-
- ![Wirecast output settings screen](media/live-events-wirecast-quickstart/azure-media-services.png)
--
-In the next procedure, you'll go back to Azure Media Services in your browser to copy the input URL to enter into the output settings:
-
-1. On the Azure Media Services page of the portal, select **Start** to start the live stream event. (Billing starts now.)
-
- ![Start icon](media/live-events-wirecast-quickstart/start.png)
-2. Set the **Secure/Not secure** toggle to **Not secure**. This step sets the protocol to RTMP instead of RTMPS.
-3. In the **Input URL** box, copy the URL to your clipboard.
-
- ![Input URL](media/live-events-wirecast-quickstart/input-url.png)
-4. Switch to the Wirecast application and paste the **Input URL** into the **Address** box in the output settings.
-
- ![Wirecast input URL](media/live-events-wirecast-quickstart/input-url-wirecast.png)
-5. Select **OK**.
-
-## Set up outputs
-
-This part will set up your outputs and enable you to save a recording of your live stream.
-
-> [!NOTE]
-> For you to stream this output, the streaming endpoint must be running. See the later [Run the default streaming endpoint](#run-the-default-streaming-endpoint) section.
-
-1. Select the **Create outputs** link below the **Outputs** video viewer.
-1. If you like, edit the name of the output in the **Name** box to something more user-friendly so it's easy to find later.
-
- ![Output name box](media/live-events-wirecast-quickstart/output-name.png)
-1. Leave all the rest of the boxes alone for now.
-1. Select **Next** to add a streaming locator.
-1. Change the name of the locator to something more user-friendly, if you want.
-
- ![Locator name box](media/live-events-wirecast-quickstart/live-event-locator.png)
-1. Leave everything else on this screen alone for now.
-1. Select **Create**.
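-
-If you'd rather script this step, the output and streaming locator can be sketched with the Azure CLI. The asset, live output, and locator names, and the one-hour archive window, are illustrative assumptions:
-
-```azurecli
-# Create an asset to record into, a live output that archives the live
-# stream into that asset, and a streaming locator for playback.
-az ams asset create -g amsResourceGroup -a amsaccount -n TestLiveEventAsset
-
-az ams live-output create -g amsResourceGroup -a amsaccount \
-  --live-event-name TestLiveEvent -n TestLiveOutput \
-  --asset-name TestLiveEventAsset --archive-window-length PT1H
-
-az ams streaming-locator create -g amsResourceGroup -a amsaccount \
-  -n TestLiveEventLocator --asset-name TestLiveEventAsset \
-  --streaming-policy-name Predefined_ClearStreamingOnly
-```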
-
-## Start the broadcast
-
-1. In Wirecast, select **Output** > **Start / Stop Broadcasting** > **Start Azure Media Services**.
-
- ![Start broadcast menu items](media/live-events-wirecast-quickstart/start-broadcast.png)
-
   When the stream has been sent to the live event, the video in the **Live** window in Wirecast appears in the video player on the live event page in Azure Media Services.
-
-1. Select the **Go** button under the preview window to start broadcasting the video and audio that you selected for the Wirecast layers.
-
- ![Wirecast Go button](media/live-events-wirecast-quickstart/go-button.png)
-
- > [!TIP]
- > If there's an error, try reloading the player by selecting the **Reload player** link above the player.
-
-## Run the default streaming endpoint
-
-1. Select **Streaming endpoints** in the Media Services listing.
-
- ![Streaming endpoints menu item](media/live-events-wirecast-quickstart/streaming-endpoints.png)
-1. If the default streaming endpoint status is stopped, select it. This step takes you to the page for that endpoint.
-1. Select **Start**.
-
- ![Start button for the streaming endpoint](media/live-events-wirecast-quickstart/start.png)
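-
-The same can be done from the Azure CLI. This sketch assumes the default endpoint name and the locator name used earlier:
-
-```azurecli
-# Start the default streaming endpoint, then list the playback paths for
-# the streaming locator (append a path to the endpoint host name to play).
-az ams streaming-endpoint start -g amsResourceGroup -a amsaccount -n default
-
-az ams streaming-locator get-paths -g amsResourceGroup -a amsaccount \
-  -n TestLiveEventLocator
-```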
-
-## Play the output broadcast by using Azure Media Player
-
-1. Copy the streaming URL under the **Output** video player.
-1. In a web browser, open the [Azure Media Player demo](https://ampdemo.azureedge.net/azuremediaplayer.html).
-1. Paste the streaming URL into the **URL** box of Azure Media Player.
-1. Select the **Update Player** button.
-1. Select the **Play** icon on the video to see your live stream.
-
-## Stop the broadcast
-
-When you think you've streamed enough content, stop the broadcast.
-
-1. In Wirecast, select the **Broadcast** button. This step stops the broadcast from Wirecast.
-1. In the portal, select **Stop**. You then get a warning message that the live stream will stop but the output will now become an on-demand asset.
-1. Select **Stop** in the warning message. Azure Media Player now shows an error, because the live stream is no longer available.
-
-## Play the on-demand output by using Azure Media Player
-
-The output that you created is now available for on-demand streaming as long as your streaming endpoint is running.
-
-1. Go to the Media Services listing and select **Assets**.
-1. Find the event output that you created earlier and select the link to the asset. The asset output page opens.
-1. Copy the streaming URL under the video player for the asset.
-1. Return to Azure Media Player in the browser and paste the streaming URL into the URL box.
-1. Select **Update Player**.
-1. Select the **Play** icon on the video to view the on-demand asset.
-
-## Clean up resources
-
-> [!IMPORTANT]
-> Stop the services! After you've completed the steps in this quickstart, be sure to stop the live event and the streaming endpoint, or you'll be billed for the time they remain running. To stop the live event, see the [Stop the broadcast](#stop-the-broadcast) procedure, steps 2 and 3.
-
-To stop the streaming endpoint:
-
-1. From the Media Services listing, select **Streaming endpoints**.
-2. Select the default streaming endpoint that you started earlier. This step opens the endpoint's page.
-3. Select **Stop**.
-
-> [!TIP]
-> If you don't want to keep the assets from this event, be sure to delete them so you're not billed for storage.
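-
-For a scripted cleanup, a CLI sketch along these lines stops the billable resources and removes the recorded asset; the names are the same assumptions used earlier:
-
-```azurecli
-# Stop the live event (deleting its live outputs), stop the default
-# streaming endpoint, and delete the recorded asset if you don't need it.
-az ams live-event stop -g amsResourceGroup -a amsaccount \
-  -n TestLiveEvent --remove-outputs-on-stop
-
-az ams streaming-endpoint stop -g amsResourceGroup -a amsaccount -n default
-
-az ams asset delete -g amsResourceGroup -a amsaccount -n TestLiveEventAsset
-```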
-
-## Next steps
-> [!div class="nextstepaction"]
-> [Live events and live outputs in Media Services](./live-event-outputs-concept.md)
media-services Media Reserved Units How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/media-reserved-units-how-to.md
- Title: Scale Media Reserved Units (MRUs)
-description: This topic shows how to scale media processing with Azure Media Services.
-Previously updated: 08/25/2021
-# How to scale media reserved units (legacy)
--
-This article shows you how to scale Media Reserved Units (MRUs) for faster encoding.
-
-> [!WARNING]
-> This command no longer works for Media Services accounts that were created with the 2020-05-01 (or later) version of the API. For these accounts, media reserved units are no longer needed, because the system automatically scales up and down based on load. If you don't see the option to manage MRUs in the Azure portal, you're using an account that was created with the 2020-05-01 API or later.
-> The purpose of this article is to document the legacy process of using MRUs.
-
-## Prerequisites
-
-[Create a Media Services account](./account-create-how-to.md).
-
-Understand [Media Reserved Units](concept-media-reserved-units.md).
-
-## [CLI](#tab/cli/)
-
-## Scale Media Reserved Units with CLI
-
-Run the `mru` command.
-
-The following [az ams account mru](/cli/azure/ams/account/mru) command sets Media Reserved Units on the "amsaccount" account using the **count** and **type** parameters.
-
-```azurecli
-az ams account mru set -n amsaccount -g amsResourceGroup --count 10 --type S3
-```
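-
-To confirm the change, you can read the current MRU configuration back with the corresponding `show` command (same assumed account and resource group names as above):
-
-```azurecli
-az ams account mru show -n amsaccount -g amsResourceGroup
-```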
-
-## Billing
-
- While there were previously charges for Media Reserved Units, as of April 17, 2021 there are no longer any charges for accounts that have configuration for Media Reserved Units.
-
-## See also
-
-* [Migrate from Media Services v2 to v3](migrate-v-2-v-3-migration-introduction.md)
--
media-services Media Services Apis Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/media-services-apis-overview.md
-
Title: Develop with v3 APIs: Azure Media Services
-description: Learn about rules that apply to entities and APIs when developing with Media Services v3.
-Previously updated: 10/23/2020
-# Develop with Media Services v3 APIs
--
-As a developer, you can use client libraries for .NET, Python, Node.js, Java, Go, and Ruby that allow you to interact with the REST API to easily create, manage, and maintain custom media workflows. The [Media Services v3](https://aka.ms/ams-v3-rest-sdk) API is based on the OpenAPI specification (formerly known as Swagger).
-
-This article discusses rules that apply to entities and APIs when you develop with Media Services v3.
--
-## Accessing the Azure Media Services API
-
-To be authorized to access Media Services resources and the Media Services API, you must first be authenticated. Media Services supports [Azure Active Directory (Azure AD)-based](../../active-directory/fundamentals/active-directory-whatis.md) authentication. Two common authentication options are:
-
-* **Service principal authentication**: Used to authenticate a service (for example: web apps, function apps, logic apps, APIs, and microservices). Applications that commonly use this authentication method are apps that run daemon services, middle-tier services, or scheduled jobs. For example, for web apps there should always be a mid-tier that connects to Media Services with a service principal.
-* **User authentication**: Used to authenticate a person who is using the app to interact with Media Services resources. The interactive app should first prompt the user for the user's credentials. An example is a management console app used by authorized users to monitor encoding jobs or live streaming.
-
-The Media Services API requires that the user or app making the REST API requests have access to the Media Services account resource and use a **Contributor** or **Owner** role. The API can be accessed with the **Reader** role but only **Get** or **List** operations will be available. For more information, see [Azure role-based access control (Azure RBAC) for Media Services accounts](security-rbac-concept.md).
-
-Instead of creating a service principal, consider using managed identities for Azure resources to access the Media Services API through Azure Resource Manager. To learn more about managed identities for Azure resources, see [What are managed identities for Azure resources](../../active-directory/managed-identities-azure-resources/overview.md).
-
-### Azure AD service principal
-
-The Azure AD app and service principal should be in the same tenant. After you create the app, give the app **Contributor** or **Owner** role access to the Media Services account.
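-
-As a sketch, the Azure CLI can create the Azure AD app, the service principal, and the role assignment in one step. The account, service principal name, and role below are assumptions:
-
-```azurecli
-# Create an Azure AD app and service principal, and grant it the
-# Contributor role on the Media Services account. The output includes
-# the client ID, client secret, and tenant values needed to call the API.
-az ams account sp create \
-  --account-name amsaccount \
-  --resource-group amsResourceGroup \
-  --name myMediaServicesApp \
-  --role Contributor \
-  --years 2
-```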
-
-If you're not sure whether you have permissions to create an Azure AD app, see [Required permissions](../../active-directory/develop/howto-create-service-principal-portal.md#permissions-required-for-registering-an-app).
-
-In the following figure, the numbers represent the flow of the requests in chronological order:
-
-![Middle-tier app authentication with AAD from a web API](./media/use-aad-auth-to-access-ams-api/media-services-principal-service-aad-app1.png)
-
-1. A middle-tier app requests an Azure AD access token that has the following parameters:
-
- * Azure AD tenant endpoint.
- * Media Services resource URI.
- * Resource URI for REST Media Services.
- * Azure AD app values: the client ID and client secret.
-
   To get all the needed values, see [Access Azure Media Services API](./access-api-howto.md).
-
-2. The Azure AD access token is sent to the middle tier.
-3. The middle tier sends a request to the Azure Media REST API with the Azure AD token.
-4. The middle tier gets back the data from Media Services.
-
-### Samples
-
-See the following samples that show how to connect with Azure AD service principal:
-* [Connect with .NET](configure-connect-dotnet-howto.md)
-* [Connect with Node.js](configure-connect-nodejs-howto.md)
-* [Connect with Python](configure-connect-python-howto.md)
-* [Connect with Java](configure-connect-java-howto.md)
-* [Connect with REST](setup-postman-rest-how-to.md)
-
-## Naming conventions
-
-Azure Media Services v3 resource names (for example, Assets, Jobs, Transforms) are subject to Azure Resource Manager naming constraints. In accordance with Azure Resource Manager, the resource names are always unique. Thus, you can use any unique identifier strings (for example, GUIDs) for your resource names.
-
-Media Services resource names can't include: '<', '>', '%', '&', ':', '&#92;', '?', '/', '*', '+', '.', the single quote character, or any control characters. All other characters are allowed. The max length of a resource name is 260 characters.
-
-For more information about Azure Resource Manager naming, see [Naming requirements](https://github.com/Azure/azure-resource-manager-rpc/blob/master/v1.0/resource-api-reference.md#arguments-for-crud-on-resource) and [Naming conventions](/azure/cloud-adoption-framework/ready/azure-best-practices/naming-and-tagging).
-
-### Names of files/blobs within an asset
-
-The names of files/blobs within an asset must follow both the [blob name requirements](/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata) and the [NTFS name requirements](/windows/win32/fileio/naming-a-file). The reason for these requirements is the files can get copied from blob storage to a local NTFS disk for processing.
-
-## Long-running operations
-
-The operations marked with `x-ms-long-running-operation` in the Azure Media Services [swagger files](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2018-07-01/streamingservice.json) are long-running operations.
-
-For details about how to track asynchronous Azure operations, see [Async operations](../../azure-resource-manager/management/async-operations.md).
-
-Media Services has the following long-running operations:
-
-* [Create Live Events](/rest/api/media/liveevents/create)
-* [Update Live Events](/rest/api/media/liveevents/update)
-* [Delete Live Event](/rest/api/media/liveevents/delete)
-* [Start Live Event](/rest/api/media/liveevents/start)
-* [Stop LiveEvent](/rest/api/media/liveevents/stop)
-
- Use the `removeOutputsOnStop` parameter to delete all associated Live Outputs when stopping the event.
-* [Reset LiveEvent](/rest/api/media/liveevents/reset)
-* [Create LiveOutput](/rest/api/media/liveevents/create)
-* [Delete LiveOutput](/rest/api/media/liveevents/delete)
-* [Create StreamingEndpoint](/rest/api/media/streamingendpoints/create)
-* [Update StreamingEndpoint](/rest/api/media/streamingendpoints/update)
-* [Delete StreamingEndpoint](/rest/api/media/streamingendpoints/delete)
-* [Start StreamingEndpoint](/rest/api/media/streamingendpoints/start)
-* [Stop StreamingEndpoint](/rest/api/media/streamingendpoints/stop)
-* [Scale StreamingEndpoint](/rest/api/media/streamingendpoints/scale)
-
-On successful submission of a long-running operation, you receive a '201 Created' response and must poll for operation completion by using the returned operation ID.
-
-The [track asynchronous Azure operations](../../azure-resource-manager/management/async-operations.md) article explains in depth how to track the status of asynchronous Azure operations through values returned in the response.
-
-Only one long-running operation is supported for a given Live Event or any of its associated Live Outputs. Once started, a long-running operation must complete before you start a subsequent long-running operation on the same Live Event or any associated Live Outputs. For Live Events with multiple Live Outputs, you must await the completion of a long-running operation on one Live Output before triggering a long-running operation on another Live Output.
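-
-As a sketch of this pattern with the Azure CLI (the account and event names are assumptions, and `--no-wait` is the CLI's generic flag for not blocking on long-running operations):
-
-```azurecli
-# Kick off the long-running start operation without blocking.
-az ams live-event start -g amsResourceGroup -a amsaccount \
-  -n myLiveEvent --no-wait
-
-# Poll until resourceState transitions from Starting to Running.
-az ams live-event show -g amsResourceGroup -a amsaccount \
-  -n myLiveEvent --query resourceState -o tsv
-```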
-
-## SDKs
-
-> [!NOTE]
-> The Azure Media Services v3 SDKs aren't guaranteed to be thread-safe. When developing a multi-threaded app, you should add your own thread synchronization logic to protect the client or use a new AzureMediaServicesClient object per thread. You should also be careful of multi-threading issues introduced by optional objects provided by your code to the client (like an HttpClient instance in .NET).
-
-|SDK|Reference|
-|||
-|[.NET SDK](https://aka.ms/ams-v3-dotnet-sdk)|[.NET ref](/dotnet/api/overview/azure/mediaservices/management)|
-|[Java SDK](https://aka.ms/ams-v3-java-sdk)|[Java ref](/java/api/overview/azure/mediaservices/management)|
-|[Python SDK](https://aka.ms/ams-v3-python-sdk)|[Python ref](/python/api/overview/azure/mediaservices/management)|
-|[Node.js SDK](https://aka.ms/ams-v3-nodejs-sdk) |[Node.js ref](/javascript/api/overview/azure/arm-mediaservices-readme)|
-|[Go SDK](https://aka.ms/ams-v3-go-sdk) |[Go ref](https://aka.ms/ams-v3-go-ref)|
-|[Ruby SDK](https://aka.ms/ams-v3-ruby-sdk)||
-
-### See also
-
-- [EventGrid .NET SDK that includes Media Service events](https://www.nuget.org/packages/Microsoft.Azure.EventGrid/)
-- [Definitions of Media Services events](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/eventgrid/data-plane/Microsoft.Media/stable/2018-01-01/MediaServices.json)
-
-## Azure Media Services Explorer
-
-[Azure Media Services Explorer](https://github.com/Azure/Azure-Media-Services-Explorer) (AMSE) is a tool available to Windows customers who want to learn about Media Services. AMSE is a WinForms/C# application that uploads, downloads, encodes, and streams VOD and live content with Media Services. The AMSE tool is for clients who want to test Media Services without writing any code. The AMSE code is provided as a resource for customers who want to develop with Media Services.
-
-AMSE is an open-source project; support is provided by the community (issues can be reported to https://github.com/Azure/Azure-Media-Services-Explorer/issues). This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact opencode@microsoft.com with any other questions or comments.
-
-## Filtering, ordering, paging of Media Services entities
-
-See [Filtering, ordering, paging of Azure Media Services entities](filter-order-page-entities-how-to.md).
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## See also
-
-To get all the needed values, see [Access Azure Media Services API](./access-api-howto.md).
-
-## Next steps
-
-* [Connect to Media Services with Java](configure-connect-java-howto.md)
-* [Connect to Media Services with .NET](configure-connect-dotnet-howto.md)
-* [Connect to Media Services with Node.js](configure-connect-nodejs-howto.md)
-* [Connect to Media Services with Python](configure-connect-python-howto.md)
media-services Media Services Arm Template Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/media-services-arm-template-quickstart.md
- Title: Media Services account ARM template
-description: This article shows you how to use an ARM template to create a Media Services account.
-Previously updated: 03/23/2021
-# Quickstart: Media Services account ARM template
--
-This article shows you how to use an Azure Resource Manager template (ARM template) to create a Media Services account.
-
-## Introduction
--
-Readers who are experienced with ARM templates can continue to the [deployment section](#deploy-the-template).
-
-If your environment meets the prerequisites and you're familiar with using ARM templates, select the **Deploy to Azure** button. The template will open in the Azure portal.
-
-[![Deploy to Azure](../../media/template-deployments/deploy-to-azure.svg)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.media%2Fmedia-services-create%2Fazuredeploy.json)
-
-## Prerequisites
-
-If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-
-If you have never deployed an ARM template before, it is helpful to read about [Azure ARM templates](../../azure-resource-manager/templates/index.yml) and go through the [tutorial](../../azure-resource-manager/templates/template-tutorial-create-first-template.md?tabs=azure-powershell).
-
-## Review the template
-
-The template used in this quickstart is from [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/media-services-create/).
-
--
-Two Azure resource types are defined in the template:
-
-- [Microsoft.Storage/storageAccounts](/azure/templates/microsoft.storage/storageaccounts): create a storage account
-- [Microsoft.Media/mediaservices](/azure/templates/microsoft.media/mediaservices): create a Media Services account
-
-## Set the account
-
-```azurecli-interactive
-
-az account set --subscription {your subscription name or id}
-
-```
-
-## Create a resource group
-
-```azurecli-interactive
-
-az group create --name {the name you want to give your resource group} --location "{pick a location}"
-
-```
-
-## Assign a variable to your deployment file
-
-For convenience, create a variable that stores the path to the template file. This variable makes it easier for you to run the deployment commands because you don't have to retype the path every time you deploy.
-
-```azurecli-interactive
-
-templateFile="{provide the path to the template file}"
-
-```
-
-## Deploy the template
-
-You'll be prompted to enter the Media Services account name.
-
-```azurecli-interactive
-
-az deployment group create --name {the name you want to give to your deployment} --resource-group {the name of resource group you created} --template-file $templateFile
-
-```
-
-## Review deployed resources
-
-You should see a JSON response similar to the following:
-
-```json
-{
- "id": "/subscriptions/{subscriptionid}/resourceGroups/amsarmquickstartrg/providers/Microsoft.Resources/deployments/amsarmquickstartdeploy",
- "location": null,
- "name": "amsarmquickstartdeploy",
- "properties": {
- "correlationId": "{correlationid}",
- "debugSetting": null,
- "dependencies": [
- {
- "dependsOn": [
- {
- "id": "/subscriptions/{subscriptionid}/resourceGroups/amsarmquickstartrg/providers/Microsoft.Storage/storageAccounts/storagey44cfdmliwatk",
- "resourceGroup": "amsarmquickstartrg",
- "resourceName": "storagey44cfdmliwatk",
- "resourceType": "Microsoft.Storage/storageAccounts"
- }
- ],
- "id": "/subscriptions/35c2594a-23da-4fce-b59c-f6fb9513eeeb/resourceGroups/amsarmquickstartrg/providers/Microsoft.Media/mediaServices/{accountname}",
- "resourceGroup": "amsarmquickstartrg",
- "resourceName": "{accountname}",
- "resourceType": "Microsoft.Media/mediaServices"
- }
- ],
- "duration": "PT1M10.8615001S",
- "error": null,
- "mode": "Incremental",
- "onErrorDeployment": null,
- "outputResources": [
- {
- "id": "/subscriptions/{subscriptionid}/resourceGroups/amsarmquickstartrg/providers/Microsoft.Media/mediaServices/{accountname}",
- "resourceGroup": "amsarmquickstartrg"
- },
- {
- "id": "/subscriptions/{subscriptionid}/resourceGroups/amsarmquickstartrg/providers/Microsoft.Storage/storageAccounts/storagey44cfdmliwatk",
- "resourceGroup": "amsarmquickstartrg"
- }
- ],
- "outputs": null,
- "parameters": {
- "mediaServiceName": {
- "type": "String",
- "value": "{accountname}"
- }
- },
- "parametersLink": null,
- "providers": [
- {
- "id": null,
- "namespace": "Microsoft.Media",
- "registrationPolicy": null,
- "registrationState": null,
- "resourceTypes": [
- {
- "aliases": null,
- "apiVersions": null,
- "capabilities": null,
- "locations": [
- "eastus"
- ],
- "properties": null,
- "resourceType": "mediaServices"
- }
- ]
- },
- {
- "id": null,
- "namespace": "Microsoft.Storage",
- "registrationPolicy": null,
- "registrationState": null,
- "resourceTypes": [
- {
- "aliases": null,
- "apiVersions": null,
- "capabilities": null,
- "locations": [
- "eastus"
- ],
- "properties": null,
- "resourceType": "storageAccounts"
- }
- ]
- }
- ],
- "provisioningState": "Succeeded",
- "templateHash": "{templatehash}",
- "templateLink": null,
- "timestamp": "2020-11-24T23:25:52.598184+00:00",
- "validatedResources": null
- },
- "resourceGroup": "amsarmquickstartrg",
- "tags": null,
- "type": "Microsoft.Resources/deployments"
-}
-
-```
-
-In the Azure portal, confirm that your resources have been created.
-
-![quickstart resources created](./media/media-services-arm-template-quickstart/quickstart-arm-template-resources.png)
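-
-You can also confirm the deployment from the CLI rather than the portal; the resource group name here matches the JSON output above:
-
-```azurecli
-# List the resources created by the deployment.
-az resource list --resource-group amsarmquickstartrg --output table
-```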
-
-## Clean up resources
-
-If you aren't planning to use the resources you just created, you can delete the resource group.
-
-```azurecli-interactive
-
-az group delete --name {name of the resource group}
-
-```
-
-## Next steps
-
-To learn more about ARM templates through the process of creating a template with parameters, variables, and more, try:
-
-> [!div class="nextstepaction"]
-> [Tutorial: Create and deploy your first ARM template](../../azure-resource-manager/templates/template-tutorial-create-first-template.md)
media-services Media Services Community https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/media-services-community.md
- Title: Azure Media Services v3 community overview
-description: This Azure Media Services community page discusses different ways you can ask questions, give feedback, and get updates about Media Services.
-Previously updated: 08/31/2020
-# Azure Media Services v3 community
--
-This Azure Media Services community page discusses different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Provide feedback and make suggestions
--
-## Discussion
-
-### Twitter
-
-You can use the [@MSFTAzureMedia](https://twitter.com/MSFTAzureMedia) Twitter handle to contact us or follow updates on Twitter. Use the [@AzureSupport](https://twitter.com/azuresupport) Twitter handle to request support on Twitter.
-
-### Online forums
-
-The following forums can be used for asking questions about current products and features.
-
-Currently, MSDN is the Media Services team's primary community forum.
-
-[![Screenshot showing the logo for MSDN, the Media Services team's primary community forum.](./media/media-services-community/msdn.png)](/answers/topics/azure-media-services.html)
-
-The team also monitors questions tagged on Stack Overflow with 'azure-media-services'.
-
-[![StackOverflow](./media/media-services-community/stack-overflow.png)](https://stackoverflow.com/questions/tagged/azure-media-services)
-
-## Next steps
-
-[Azure Media Services overview](media-services-overview.md)
media-services Media Services Compliance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/media-services-compliance.md
- Title: Azure Media Services compliance, privacy, and security: Azure Media Services
-description: As an important reminder, you must comply with all applicable laws in your use of Azure Media Services, and you may not use Media Services or any Azure service in a manner that violates the rights of others, or that may be harmful to others.
-Previously updated: 2/17/2022
-#Customer intent: As a developer or a content provider, I want to encode, stream (on demand or live), analyze my media content so that my customers can: view the content on a wide variety of browsers and devices, gain valuable insights from recorded content.
--
-# Azure Media Services compliance, privacy and security
--
-## Compliance, privacy and security
-
-As an important reminder, you must comply with all applicable laws in your use of Azure Media Services, and you may not use Media Services or any Azure service in a manner that violates the rights of others, or that may be harmful to others.
-
-Before uploading any video/image to Media Services, you must have all the proper rights to use the video/image, including, where required by law, all the necessary consents from individuals (if any) in the video/image, for the use, processing, and storage of their data in Media Services and Azure. Some jurisdictions may impose special legal requirements for the collection, online processing, and storage of certain categories of data, such as biometric data. Before using Media Services and Azure for the processing and storage of any data subject to special legal requirements, you must ensure compliance with any such legal requirements that may apply to you.
-
-## Learn more about compliance
-
-To learn about compliance, privacy, and security in Media Services, visit the Microsoft [Trust Center](https://www.microsoft.com/trust-center/?rtc=1). For Microsoft's privacy obligations and data handling and retention practices, including how to delete your data, review Microsoft's [Privacy Statement](https://privacy.microsoft.com/PrivacyStatement), the [Online Services Terms](https://www.microsoft.com/licensing/product-licensing/products?rtc=1) ("OST"), and the [Data Processing Addendum](https://www.microsoftvolumelicensing.com/DocumentSearch.aspx?Mode=3&DocumentTypeId=67) ("DPA"). By using Media Services, you agree to be bound by the OST, DPA, and the Privacy Statement.
media-services Media Services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/media-services-overview.md
- Title: Azure Media Services v3 overview: Azure Media Services
-description: A high-level overview of Azure Media Services v3 with links to quickstarts, tutorials, and code samples.
-Previously updated: 03/09/2022
-#Customer intent: As a developer or a content provider, I want to encode, stream (on demand or live), analyze my media content so that my customers can: view the content on a wide variety of browsers and devices, gain valuable insights from recorded content.
--
-# Azure Media Services v3 overview
--
-Azure Media Services is a cloud-based platform that enables you to build solutions that achieve broadcast-quality video streaming, enhance accessibility and distribution, analyze content, and much more. Whether you're an app developer, a call center, a government agency, or an entertainment company, Media Services helps you create apps that deliver media experiences of outstanding quality to large audiences on today's most popular mobile devices and browsers.
-
-The Media Services v3 SDKs are based on [Media Services v3 OpenAPI Specification (Swagger)](https://aka.ms/ams-v3-rest-sdk).
-
-
-## What can I do with Media Services?
-
-Media Services lets you build a variety of media workflows in the cloud. Some examples of what you can do with Media Services include:
-
-* Deliver videos in various formats so they can be played on a wide variety of browsers and devices. For both on-demand and live streaming delivery to various clients (mobile devices, TV, PC, and so on), the video and audio content needs to be encoded and packaged appropriately. To see how to deliver and stream such content, see [Quickstart: Encode and stream files](stream-files-dotnet-quickstart.md).
-* Stream live sporting events to a large online audience, like soccer, baseball, college and high school sports, and more.
-* Broadcast public meetings and events, like town halls, city council meetings, and legislative bodies.
-* Analyze recorded videos or audio content. For example, to achieve higher customer satisfaction, organizations can extract speech-to-text and build search indexes and dashboards. Then, they can extract intelligence around common complaints, sources of complaints, and other relevant data.
-* Create a subscription video service and stream DRM protected content when a customer (for example, a movie studio) needs to restrict the access and use of proprietary copyrighted work.
-* Deliver offline content for playback on airplanes, trains, and automobiles. A customer might need to download content onto their phone or tablet for playback when they anticipate to be disconnected from the network.
-* Implement an educational e-learning video platform with Azure Media Services and [Azure Cognitive Services APIs](../../index.yml?pivot=products&panel=ai) for speech-to-text captioning, translating to multi-languages, and so on.
-* Use Azure Media Services together with [Azure Cognitive Services APIs](../../index.yml?pivot=products&panel=ai) to add subtitles and captions to videos to cater to a broader audience (for example, people with hearing disabilities or people who want to read along in a different language).
-* Enable Azure CDN to achieve large scaling to better handle instantaneous high loads (for example, the start of a product launch event).
-
-## How can I get started with v3?
-
-Learn how to encode and package content, stream videos on-demand, broadcast live, and analyze your videos with Media Services v3. Tutorials, API references, and other documentation show you how to securely deliver on-demand and live video or audio streams that scale to millions of users.
-
-> [!TIP]
-> Before you start developing, review: [Fundamental concepts](concepts-overview.md) which includes important concepts, like packaging, encoding, and protecting, and [Developing with Media Services v3 APIs](media-services-apis-overview.md) which includes information on accessing APIs, naming conventions, and so on.
-
-### SDKs
-
-Start developing with [Azure Media Services v3 client SDKs](media-services-apis-overview.md#sdks).
-
-### Quickstarts
-
-The quickstarts show fundamental day-1 instructions for new customers to quickly try out Media Services.
-
-* [Stream video files - .NET](stream-files-dotnet-quickstart.md)
-* [Stream video files - CLI](stream-files-cli-quickstart.md)
-* [Stream video files - Node.js](stream-files-nodejs-quickstart.md)
-
-### Tutorials
-
-The tutorials show scenario-based procedures for some of the top Media Services tasks.
-
-* [Encode remote file and stream video – REST](stream-files-tutorial-with-rest.md)
-* [Encode uploaded file and stream video - .NET](stream-files-tutorial-with-api.md)
-* [Stream live - .NET](stream-live-tutorial-with-api.md)
-* [Analyze your video - .NET](analyze-videos-tutorial.md)
-* [AES-128 dynamic encryption - .NET](drm-playready-license-template-concept.md)
-
-### Samples
-
-Use [this samples browser](/samples/browse/?products=azure-media-services) to browse Azure Media Services code samples.
-
-### How-to guides
-
-How-to guides contain code samples that demonstrate how to complete a task. In this section, you'll find many examples. Here are a few of them:
-
-* [Create an account - CLI](./account-create-how-to.md)
-* [Access APIs - CLI](./access-api-howto.md)
-* [Encode with HTTPS as job input - .NET](job-input-from-http-how-to.md)
-* [Monitor events - Portal](monitoring/monitor-events-portal-how-to.md)
-* [Encrypt dynamically with multi-DRM - .NET](drm-protect-with-drm-tutorial.md)
-* [How to encode with a custom transform - CLI](transform-custom-transform-how-to.md)
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Compliance, privacy and security
-
-> [!IMPORTANT]
-> Read the [Compliance, privacy and security document](media-services-compliance.md) before using Azure Media Services to deliver your media content.
media-services Migrate V 2 V 3 Differences Api Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-differences-api-access.md
- Title: Media Services V2 vs v3 API access
-description: This article describes the API access differences between Azure Media Services V2 to V3.
-Previously updated: 03/25/2021
-# API access differences between the Azure Media Services V2 and V3 APIs
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-2.svg)
-
-This article describes the API access differences between Azure Media Services V2 and V3.
-
-## API Access
-
-All Media Services accounts have access to the V3 API. However, we strongly recommend doing migration development on a fresh account before applying updated code to an existing V2 account, because V3 entities aren't backwards compatible with V2. Some V2 entities, like Assets, are forward compatible with V3. You can continue to use existing accounts if you don't mix the V2 and V3 APIs and then try to go back to V2, but this is discouraged.
-
-Access to the V2 API will be available until it is retired in 2024.
-
-## Create a V3 account
-
-While you are migrating, you can create a V3 account that still has access to V2. Creating the account can be done with:
-
-- The REST API with an older API version
-- Selecting the checkbox in the portal.
-
-> [!div class="mx-imgBorder"]
-> [ ![account creation in the portal](./media/migration-guide/v-3-v-2-access-account-creation-small.png) ](./media/migration-guide/v-3-v-2-access-account-creation.png#lightbox)
-
-All the .NET, CLI, and other SDKs target the latest 2020-05-01 API, so you'll need to find or configure older SDK versions that target the earlier API versions.
-
-> [!NOTE]
-> New accounts created with the 2020-05-01 (or later) API cannot use V2 APIs.
media-services Migrate V 2 V 3 Differences Feature Gaps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-differences-feature-gaps.md
- Title: Feature gaps between Azure Media Services V2 and V3
-description: This article describes the feature gaps between Azure Media Services V2 to v3.
-Previously updated: 01/31/2022
-# Feature gaps between Azure Media Services V2 and V3
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-2.svg)
-
-This part of the migration guidance gives you detailed information about the differences between the V2 and V3 APIs.
-
-## Feature gaps between V2 and V3 APIs
-
-The V3 API has the following feature gaps with the V2 API. Several advanced features of the Media Encoder Standard in the V2 APIs are currently not available in V3:
-
-- Inserting a silent audio track when input has no audio or inserting a monochrome video track when input has no video, as this is no longer required with the Azure Media Player.
-
-- Inserting a video track when input has no video.
-
-- The `InsertBlackIfNoVideoBottomLayerOnly` and `InsertBlackIfNoVideo` flags are no longer supported in v3.
-
-- Live Events with transcoding currently don't support Slate insertion mid-stream and ad marker insertion via API call.
-
-- Azure Media Premium Encoder will no longer be supported in V2. If you're using it for 8-bit HEVC encoding, use the new HEVC support in the Standard Encoder.
- - We added support for audio channel mapping to the Standard encoder. See [Audio in the Media Services Encoding Swagger documentation](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2020-05-01/Encoding.json).
- - If you were using advanced features or output formats of the third-party licensed product such as MXF or ProRes, use the Azure Partner solution from Telestream, which will be transactional by the time of the V2 retirement. Alternatively you can use Imagine Communications, or [Bitmovin](http://bitmovin.com).
-
-- The "availability set" property on the Streaming Endpoint in V2 is no longer supported. See the sample project and guidance for [High Availability VOD](./architecture-high-availability-encoding-concept.md) delivery in the V3 API.
-
-- In Media Services V3, FairPlay IV cannot be specified. While it doesn't impact customers using Media Services for both packaging and license delivery, it can be an issue when using a third-party DRM system to deliver the FairPlay licenses (hybrid mode).
-
-- Client-side storage encryption for protection of assets at rest has been removed in the V3 API and replaced by storage service encryption for data at rest. The V3 APIs continue to work with existing storage-encrypted assets but won't allow creation of new ones.
-
-## Terminology and entity changes
-
-See [Terminology and entity](migrate-v-2-v-3-differences-terminology.md) changes for additional changes to the API.
media-services Migrate V 2 V 3 Differences Terminology https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-differences-terminology.md
- Title: Media Services v3 terminology and entity changes
-description: This article describes the terminology differences between Azure Media Services v2 to v3.
-Previously updated: 03/25/2021
-# Terminology and entity changes between Media Services V2 and V3
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-2.svg)
-
-This article describes the terminology differences between Azure Media Services v2 and v3.
-
-## Naming conventions
-
-Review the naming conventions that are applied to Media Services V3 resources. Also review [naming blobs](assets-concept.md#naming).
-
-## Terminology changes
-
-- A *Locator* is now called a *Streaming Locator*.
-- A *Channel* is now called a *Live Event*.
-- A *Program* is now called a *Live Output*.
-- A *Task* is now called a *JobOutput*, which is part of a Job.
-
-## Entity changes
-| **V2 Entity**<!-- row --> | **V3 Entity** | **Guidance** | **Accessible to V3** | **Updated by V3** |
-|--|--|--|--|--|
-| `AccessPolicy`<!-- row --> | <!-- empty --> | The entity `AccessPolicies` doesn't exist in V3. | No | No |
-| `Asset`<!-- row --> | `Asset` | <!-- empty --> | Yes | Yes |
-| `AssetDeliveryPolicy`<!-- row --> | `StreamingPolicy` | <!-- empty --> | Yes | No |
-| `AssetFile`<!-- row --> | <!-- empty --> |The entity `AssetFiles` doesn't exist in V3. Although files (storage blobs) that you upload are still considered files.<br/><br/> Use the Azure Storage APIs to enumerate the blobs in a container instead. There are two ways to apply a transform to the files with a job:<br/><br/>Files already uploaded to storage: The URI would include the asset ID for jobs to be done on assets within a storage account.<br/><br/>Files to be uploaded during the transform and job process: The asset is created in storage, a SAS URL is returned, files are uploaded to storage, and then the transform is applied to the files. | No | No |
-| `Channel`<!-- row --> | `LiveEvent` | Live Events replace Channels from the v2 API. They carry over most features, and have more new features like live transcriptions, stand-by mode, and support for RTMPS ingest. <br/><br/>See [live event in scenario based live streaming](migrate-v-2-v-3-migration-scenario-based-live-streaming.md) | No | No |
-| `ContentKey`<!-- row --> | <!-- empty --> | `ContentKeys` is no longer an entity, it's now a property of a streaming locator.<br/><br/> In v3, the content key data is either associated with the `StreamingLocator` (for output encryption) or the Asset itself (for client side storage encryption). | Yes | No |
-| `ContentKeyAuthorizationPolicy`<!-- row --> | `ContentKeyPolicy` | <!-- empty --> | Yes | No |
-| `ContentKeyAuthorizationPolicyOption` <!-- row --> | <!-- empty --> | `ContentKeyPolicyOptions` are included in the `ContentKeyPolicy`. | Yes | No |
-| `IngestManifest`<!-- row --> | <!-- empty --> | The entity `IngestManifests` doesn't exist in V3. Uploading files in V3 involves the Azure storage API. Assets are first created and then files are uploaded to the associated storage container. There are many ways to get data into an Azure Storage container that can be used instead. `JobInputHttp` also provides a way to download a job input from a given url if desired. | No | No |
-| `IngestManifestAsset`<!-- row --> | <!-- empty --> | There are many ways to get data into an Azure Storage container that can be used instead. `JobInputHttp` also provides a way to download a job input from a given url if desired. | No | No |
-| `IngestManifestFile`<!-- row --> | <!-- empty --> | There are many ways to get data into an Azure Storage container that can be used instead. `JobInputHttp` also provides a way to download a job input from a given url if desired. | No | No |
-| `Job`<!-- row --> | `Job` | Create a `Transform` before creating a `Job`. | No | No |
| `JobTemplate`<!-- row --> | `Transform` | Use a `Transform` instead. A transform is a separate entity from a job and is reusable. | No | No |
-| `Locator`<!-- row --> | `StreamingLocator` | <!--empty --> | Yes | No |
| `MediaProcessor`<!-- row --> | <!-- empty --> | Instead of looking up the `MediaProcessor` to use by name, use the desired preset when defining a transform. The preset used will determine the media processor used by the job system. See encoding topics in [scenario based encoding](migrate-v-2-v-3-migration-scenario-based-encoding.md). | No | NA (readonly in V2) |
-| `NotificationEndPoint`<!-- row --> | <!--empty --> | Notifications in v3 are handled via Azure Event Grid. The `NotificationEndpoint` is replaced by the Event Grid subscription registration which also encapsulates the configuration for the types of notifications to receive (which in v2 was handled by the `JobNotificationSubscription` of the Job, the `TaskNotificationSubscription` of the Task, and the Telemetry `ComponentMonitoringSetting`). The v2 Telemetry was split between Azure Event Grid and Azure Monitor to fit into the enhancements of the larger Azure ecosystem. | No | No |
-| `Program`<!-- row --> | `LiveOutput` | Live Outputs now replace Programs in the v3 API. | No | No |
-| `StreamingEndpoint`<!-- row --> | `StreamingEndpoint` | Streaming Endpoints remain primarily the same. They're used for dynamic packaging, encryption, and delivery of HLS and DASH content for both live and on-demand streaming either direct from origin, or through the CDN. New features include support for better Azure Monitor integration and charting. | Yes | Yes |
-| `Task`<!-- row --> | `JobOutput` | Replaced by `JobOutput` (which is no longer a separate entity in the API). See encoding topics in [scenario based encoding](migrate-v-2-v-3-migration-scenario-based-encoding.md). | No | No |
-| `TaskTemplate`<!-- row --> | `TransformOutput` | Replaced by `TransformOutput` (which is no longer a separate entity in the API). See encoding topics in [scenario based encoding](migrate-v-2-v-3-migration-scenario-based-encoding.md). | No | No |
-| `Inputs`<!-- row --> | `Inputs` | Inputs and outputs are now at the Job level. See encoding topics in [scenario based encoding](migrate-v-2-v-3-migration-scenario-based-encoding.md) | No | No |
-| `Outputs`<!-- row --> | `Outputs` | Inputs and outputs are now at the Job level. In V3, the metadata format changed from XML to JSON. Live Outputs start on creation and stop when deleted. See encoding topics in [scenario based encoding](migrate-v-2-v-3-migration-scenario-based-encoding.md) | No | No |
--
-| **Other changes** | **V2** | **V3** |
-||||
-| **Storage** <!--new row --> |||
-| Storage <!--new row --> | | The V3 SDKs are now decoupled from the Storage SDK, which gives you more control over the version of Storage SDK you want to use and avoids versioning issues. |
-| **Encoding** <!--new row --> |||
-| Encoding bit rates <!--new row --> | bit rates measured in kbps ex: 128 (kbps)| bits per second ex: 128000 (bits/second)|
-| Encoding DRM FairPlay <!--new row --> | In Media Services V2, initialization vector (IV) can be specified. | In Media Services V3, the FairPlay IV cannot be specified.|
| Premium encoder <!--new row --> | Premium encoder and Legacy Indexer| The [Premium Encoder](../previous/media-services-encode-asset.md) and the legacy [media analytics processors](../previous/legacy-components.md) (Azure Media Services Indexer 2 Preview, Face Redactor, etc.) are not accessible via V3. We added support for audio channel mapping to the Standard encoder. See [Audio in the Media Services Encoding Swagger documentation](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2020-05-01/Encoding.json) |
-| **Transforms and jobs** <!--new row -->|||
-| Job based processing HTTPS <!--new row --> |<!-- empty -->| For file-based Job processing, you can use a HTTPS URL as the input. You don't need to have content already stored in Azure, nor do you need to create Assets. |
-| ARM templates for jobs <!--new row --> | ARM templates didn't exist in V2. | A transform can be used to build reusable configurations, to create Azure Resource Manager templates, and isolate processing settings between multiple customers or tenants. |
-| **Live events** <!--new row --> |||
-| Streaming endpoint <!--new row --> | A streaming endpoint represents a streaming service that can deliver content directly to a client player application, or to a Content Delivery Network (CDN) for further distribution. | Streaming Endpoints remain primarily the same. They're used for dynamic packaging, encryption, and delivery of HLS and DASH content for both live and on-demand streaming either direct from origin, or through the CDN. New features include support for better Azure Monitor integration and charting. |
-| Live event channels <!--new row --> | Channels are responsible for processing live streaming content. A Channel provides an input endpoint (ingest URL) that you then provide to a live transcoder. The channel receives live input streams from the live transcoder and makes it available for streaming through one or more streaming endpoints. Channels also provide a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.| Live Events replace Channels from the v2 API. They carry over most features, and have more new features like live transcriptions, stand-by mode, and support for RTMPS ingest. |
-| Live event programs <!--new row --> | A Program enables you to control the publishing and storage of segments in a live stream. Channels manage Programs. The Channel and Program relationship is similar to traditional media where a channel has a constant stream of content and a program is scoped to some timed event on that channel. You can specify the number of hours you want to keep the recorded content for the program by setting the `ArchiveWindowLength` property. This value can be set from a minimum of 5 minutes to a maximum of 25 hours.| Live Outputs now replace Programs in the v3 API. |
-| Live event length <!--new row --> |<!-- empty -->| You can stream Live Events 24/7 when using Media Services for transcoding a single bitrate contribution feed into an output stream that has multiple bitrates.|
-| Live event latency <!--new row --> |<!-- empty -->| New low latency live streaming support on live events. |
-| Live Event Preview <!--new row --> |<!-- empty -->| Live Event Preview supports Dynamic Packaging and Dynamic Encryption. This enables content protection on Preview as well as DASH and HLS packaging. |
-| Live event RTMPS <!--new row --> |<!-- empty-->| Improved RTMPS support with increased stability and more source encoder support. |
-| Live event RTMPS secure ingest <!--new row --> | | When you create a live event, you get four ingest URLs. The four ingest URLs are almost identical: they share the same streaming token (`AppId`), and only the port number differs. Two of the URLs are the primary and backup for RTMPS.|
-| Live event transcription <!--new row --> |<!-- empty--> | Azure Media Services delivers video, audio, and text in different protocols. When you publish your live stream using MPEG-DASH or HLS/CMAF, the service delivers the transcribed text in IMSC1.1-compatible TTML along with the video and audio.|
-| Live event standby mode <!--new row --> | There was no standby mode for V2. | Stand-by mode is a new v3 feature that helps manage hot pools of Live Events. Customers can now start a Live Event in stand-by mode at lower cost before transitioning it to the running state. This improves channel start times and reduces the cost of operating hot pools for faster startup. |
-| Live event billing <!--new row --> | <!-- empty-->| Live events billing is based on Live Channel meters. |
-| Live outputs <!--new row --> | Programs had to be started after creation. | Live Outputs start on creation and stop when deleted. |
media-services Migrate V 2 V 3 Migration Benefits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-migration-benefits.md
- Title: The benefits of migrating to Media Services API V3
-description: This article lists the benefits of migrating from Media Services v2 to v3.
- Previously updated: 03/25/2021
-# Step 1 - Understand the benefits of migrating to Media Services API V3
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-1.svg)
-
-## Use the latest API
-
-We encourage you to start using version 2020-05-01 (or later) of the Azure Media Services V3 API now, because new features, functionality, and performance optimizations are available only in the current V3 API.
-
-You can target the API version in the portal, the latest SDKs, the latest CLI, and the REST API by using the correct version string.
-
-There have been significant improvements to Media Services with V3.
-
-## Benefits of Media Services v3
-
-| **V3 feature** | **Benefit** |
-| | |
-| **Azure portal** | |
-| Azure portal updates | The Azure portal has been updated to include the management of V3 API entities. It allows customers to use the portal to start live streaming, submit V3 transform jobs, manage content protection policies and streaming endpoints, get API access, manage linked storage accounts, and perform monitoring tasks. |
-| **Accounts and Storage** | |
-| Azure role-based access control (RBAC) | Customers can now define their own roles and control access to each entity in the Media Services ARM API. This helps control access to resources by AAD accounts. |
-| Managed Identities | Managed identities eliminate the need for developers to manage credentials by providing an identity for the Azure resource in Azure AD. See details on managed identities [here](../../active-directory/managed-identities-azure-resources/overview.md). |
-| Private link support | Customers can access Media Services endpoints for Key Delivery, LiveEvents, and StreamingEndpoints via a PrivateEndpoint on their VNet. |
-| [Customer-managed keys](concept-use-customer-managed-keys-byok.md) or bring your own key (BYOK) support | Customers can encrypt the data in their Media Services account using a key in their Azure Key Vault. |
-| **Assets** | |
-| An Asset can have multiple [streaming locators](stream-streaming-locators-concept.md) each with different [dynamic packaging](encode-dynamic-packaging-concept.md) and dynamic encryption settings. | There's a limit of 100 streaming locators allowed on each asset. Customers can store a single copy of the media content in the asset, but share different streaming URLs with different streaming policies or content protection policies that are based on a targeted audience. |
-| **Job processing** ||
-| V3 introduces the concept of [Transforms](transform-jobs-concept.md) for file-based Job processing. | A Transform can be used to build reusable configurations, to create Azure Resource Manager Templates, and isolate processing settings between multiple customers or tenants. |
-| For file-based job processing, you can use a HTTP(S) URL as the input. | You don't need to have content already stored in Azure, nor do you need to create input Assets. |
-| **Live events** ||
-| Premium 1080p Live Events | New Live Event SKU allows customers to get cloud encoding with output up to 1080p in resolution. |
-| New [low latency](live-event-latency-reference.md) live streaming support on Live Events. | This allows users to watch live events closer to real time than if they didn't have this setting enabled. |
-| Live Event Preview supports [dynamic packaging](encode-dynamic-packaging-concept.md) and dynamic encryption. | This enables content protection on preview and DASH and HLS packaging. |
-| Live Outputs replace Programs | Live output is simpler to use than the program entity in the v2 APIs. |
-| RTMP ingest for Live Events is improved, with support for more encoders | Increases stability and provides source encoder flexibility. |
-| Live Events can stream 24x7 | You can host a Live Event and keep your audience engaged for longer periods. |
-| Live transcription on Live Events | Live transcription allows customers to automatically transcribe spoken language into text in real time during the live event broadcast. This significantly improves accessibility of live events. |
-| [Stand-by mode](live-event-outputs-concept.md#standby-mode) on Live Events | Live events that are in standby state are less costly than running live events. This allows customers to maintain a set of live events that are ready to start within seconds at a lower cost than maintaining a set of running live events. Reduced pricing for standby live events will become effective in February 2021 for most regions, with the rest to follow in April 2021. |
-|**Content protection** ||
-| [Content protection](drm-content-key-policy-concept.md) supports multi-key features. | Customers can now use multiple content encryption keys on their Streaming locators. |
-| **Monitoring** | |
-| [Azure EventGrid](monitoring/reacting-to-media-services-events.md) notification support | EventGrid notifications are more feature rich. There are more types of notifications, broader SDK support for receiving the notifications in your own application, and more existing Azure services that can act as event handlers. |
-| [Azure Monitor support and integration in the Azure portal](monitoring/monitor-events-portal-how-to.md) | This allows customers to visualize Media Services account quota usage, real-time statistics of streaming endpoints, and ingest and archive statistics for live events. Customers are now able to set alerts and perform necessary actions based on real-time metric data. |
media-services Migrate V 2 V 3 Migration Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-migration-introduction.md
- Title: Migrate from Media Services v2 to v3 introduction
-description: This article is an introduction to migrating from Media Services v2 to v3.
- Previously updated: 03/25/2021
-# Migrate from Media Services v2 to v3 introduction
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-The Media Services migration guide helps you migrate from the Media Services V2 APIs to the V3 APIs, with a migration path that takes advantage of the new features and functions now available. You should use your best judgment and determine what best fits your scenario.
--
-## How to use this guide
-
-### Navigating
-
-Throughout the guide, you will see the following graphic.
-
-![migration steps](./media/migration-guide/steps.svg)<br/>
-
-The step you are on will be indicated by a color change in the number of the step, like so:
-
-![migration steps 2](./media/migration-guide/steps-2.svg)<br/>
-
-At the end of each page, you will see links to the rest of the migration documents you can read underneath the **Next steps** heading.
-
-### Guidance
-
-The guidance provided here is *general*. It includes content to improve your awareness of what is now available in V3 as well as what has changed in the Media Services workflows.
-
-For more detailed guidance, including screenshots and conceptual graphics, there are links to concepts, tutorials, quickstarts, samples and API references in each scenario-based topic. We have also listed samples that help you compare the V2 API to the V3 API.
-
-## Migration guidance overview
-
-There are four general steps to follow during your migration:
-
-## Step 1 Benefits
-
-<hr color="#5ea0ef" size="10">
-
-[Understand the benefits](migrate-v-2-v-3-migration-benefits.md) of migrating to Media Services API V3.
-
-## Step 2 Differences
-
-<hr color="#5ea0ef" size="10">
-
-Understand the differences between Media Services V2 API and the V3 API.
-
-- [API access](migrate-v-2-v-3-differences-api-access.md)
-- [Feature gaps](migrate-v-2-v-3-differences-feature-gaps.md)
-- [Terminology and entity changes](migrate-v-2-v-3-differences-terminology.md)
-
-## Step 3 SDK setup
-
-<hr color="#5ea0ef" size="10">
-
-Understand the SDK differences and [set up to migrate to the V3 REST API or client SDK](migrate-v-2-v-3-migration-setup.md).
-
-## Step 4 Scenario-based guidance
-
-<hr color="#5ea0ef" size="10">
-
-Your application of Media Services V2 may be unique. Therefore, we have provided scenario-based guidance based on how you *may have* used Media Services in the past, with the steps for each feature of the service:
-
-- [Encoding](migrate-v-2-v-3-migration-scenario-based-encoding.md)
-- [Live streaming](migrate-v-2-v-3-migration-scenario-based-live-streaming.md)
-- [Packaging and delivery](migrate-v-2-v-3-migration-scenario-based-publishing.md)
-- [Content protection](migrate-v-2-v-3-migration-scenario-based-content-protection.md)
-- [Media Reserved Units (MRU)](migrate-v-2-v-3-migration-scenario-based-media-reserved-units.md)
media-services Migrate V 2 V 3 Migration Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-migration-samples.md
- Title: Media Services v2 to v3 migration samples comparison
-description: A set of samples to help you compare the code differences between Azure Media Services v2 to v3.
- Previously updated: 05/25/2021
-# Media Services migration code sample comparison
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-## Compare the SDKs
-
-You can use some of our code samples to compare the way things are done between SDKs.
-
-## Samples for comparison
-
-The following table is a listing of samples for comparison between v2 and v3 for common scenarios.
-
-|Scenario|v2 API|v3 API|
-||||
-|Create an asset and upload a file |[v2 .NET example](https://github.com/Azure-Samples/media-services-dotnet-dynamic-encryption-with-aes/blob/master/DynamicEncryptionWithAES/DynamicEncryptionWithAES/Program.cs#L113)|[v3 .NET example](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#L169)|
-|Submit a job|[v2 .NET example](https://github.com/Azure-Samples/media-services-dotnet-dynamic-encryption-with-aes/blob/master/DynamicEncryptionWithAES/DynamicEncryptionWithAES/Program.cs#L146)|[v3 .NET example](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#L298)<br/><br/>Shows how to first create a Transform and then submit a Job.|
-|Publish an asset with AES encryption |1. Create `ContentKeyAuthorizationPolicyOption`<br/>2. Create `ContentKeyAuthorizationPolicy`<br/>3. Create `AssetDeliveryPolicy`<br/>4. Create `Asset` and upload content OR submit `Job` and use `OutputAsset`<br/>5. Associate `AssetDeliveryPolicy` with `Asset`<br/>6. Create `ContentKey`<br/>7. Attach `ContentKey` to `Asset`<br/>8. Create `AccessPolicy`<br/>9. Create `Locator`<br/><br/>[v2 .NET example](https://github.com/Azure-Samples/media-services-dotnet-dynamic-encryption-with-aes/blob/master/DynamicEncryptionWithAES/DynamicEncryptionWithAES/Program.cs#L64)|1. Create `ContentKeyPolicy`<br/>2. Create `Asset`<br/>3. Upload content or use `Asset` as `JobOutput`<br/>4. Create `StreamingLocator`<br/><br/>[v3 .NET example](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/EncryptWithAES/Program.cs#L105)|
-|Get job details and manage jobs |[Manage jobs with v2](../previous/media-services-dotnet-manage-entities.md#get-a-job-reference) |[Manage jobs with v3](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#L546)|
media-services Migrate V 2 V 3 Migration Scenario Based Content Protection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-migration-scenario-based-content-protection.md
- Title: Content protection migration guidance
-description: This article gives you content protection scenario-based guidance to assist you in migrating from Azure Media Services v2 to v3.
- Previously updated: 04/05/2021
-# Content protection scenario-based migration guidance
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-4.svg)
-
-This article provides you with details and guidance on the migration of content protection use cases from the v2 API to the new Azure Media Services v3 API.
-
-## Protect content in v3 API
-
-Use the support for [Multi-key](architecture-design-multi-drm-system.md) features in the new v3 API.
-
-See content protection concepts, tutorials and how to guides at the end of this article for specific steps.
-
-> [!NOTE]
-> The rest of this article discusses how you can migrate your v2 content protection to v3 with .NET. If you need instructions or sample code for a different language or method, please create a GitHub issue for this page.
-
-## v3 visibility of v2 Assets, StreamingLocators, and properties
-
-In the v2 API, `Assets`, `StreamingLocators`, and `ContentKeys` were used to protect your streaming content. When migrating to the v3 API, your v2 API `Assets`, `StreamingLocators`, and `ContentKeys` are all exposed automatically in the v3 API and all of the data on them is available for you to access.
-
-However, you cannot *update* any properties on v2 entities through the v3 API that were created in v2.
-
-If you need to update or change content stored on v2 entities, update it with the v2 API, or create new v3 API entities to migrate it.
-
-## Asset identifier differences
-
-To migrate, you'll need to access properties or content keys from your v2 Assets. It's important to understand that the v2 API uses the `AssetId` as the primary identification key, but the new v3 API uses the *Azure Resource Manager name* of the entity as the primary identifier. (The v2 `Asset.Name` property is not used as a unique identifier.) With the v3 API, your v2 Asset name now appears as the `Asset.Description`.
-
-For example, if you previously had a v2 Asset with the ID `nb:cid:UUID:8cb39104-122c-496e-9ac5-7f9e2c2547b8`, the v3 identifier is now the GUID at the end, `8cb39104-122c-496e-9ac5-7f9e2c2547b8`. You'll see this when listing your v2 assets through the v3 API.
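If you're scripting the migration with the v3 .NET SDK, a quick way to see this mapping is to list the assets and print both identifiers. This is a minimal sketch; it assumes an authenticated `AzureMediaServicesClient` named `client` and placeholder resource group and account names:

```csharp
// Lists the first page of assets; page through with ListNextAsync for more.
var assets = await client.Assets.ListAsync("myResourceGroup", "myamsaccount");
foreach (var asset in assets)
{
    // For an asset created in v2, Name holds the GUID portion of the old
    // v2 AssetId, and Description holds the old v2 asset name.
    Console.WriteLine($"Name: {asset.Name}, AssetId: {asset.AssetId}, Description: {asset.Description}");
}
```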
-
-Any Assets that were created and published using the v2 API will have both a `ContentKeyPolicy` and a `ContentKey` in the v3 API instead of a default content key policy on the `StreamingPolicy`.
-
-For more information, see the [Content key policy](./drm-content-key-policy-concept.md) documentation and the [Streaming Policy](./stream-streaming-policy-concept.md) documentation.
-
-## Use Azure Media Services Explorer (AMSE) v2 and AMSE v3 tools side by side
-
-Use the [v2 Azure Media Services Explorer tool](https://github.com/Azure/Azure-Media-Services-Explorer/releases/tag/v4.3.15.0) along with the [v3 Azure Media Services Explorer tool](https://github.com/Azure/Azure-Media-Services-Explorer) to compare the data side by side for an Asset created and published via v2 APIs. The properties should all be visible, but in different locations.
-
-## Use the .NET content protection migration sample
-
-You can find a code sample to compare the differences in Asset identifiers using the [v2tov3MigrationSample](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/ContentProtection/v2tov3Migration) under ContentProtection in the Media Services code samples.
-
-## List the Streaming Locators
-
-You can query the `StreamingLocators` associated with the Assets created in the v2 API using the new v3 method [ListStreamingLocators](/rest/api/media/assets/liststreaminglocators) on the Asset entity. Also see the .NET client SDK version, [ListStreamingLocatorsAsync](/dotnet/api/microsoft.azure.management.media.assetsoperationsextensions.liststreaminglocatorsasync?preserve-view=true&view=azure-dotnet).
-
-The results of the `ListStreamingLocators` method will provide the `Name` and `StreamingLocatorId` of each locator, along with the `StreamingPolicyName`.
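A minimal .NET sketch of that call, under the same assumptions (an authenticated `AzureMediaServicesClient` named `client` and placeholder names):

```csharp
var locators = await client.Assets.ListStreamingLocatorsAsync(
    "myResourceGroup", "myamsaccount", "myAssetName");

foreach (var locator in locators.StreamingLocators)
{
    Console.WriteLine($"Name: {locator.Name}, " +
        $"StreamingLocatorId: {locator.StreamingLocatorId}, " +
        $"StreamingPolicyName: {locator.StreamingPolicyName}");
}
```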
-
-## Find the content keys
-
-To find the `ContentKeys` used with your `StreamingLocators`, you can call the [StreamingLocator.ListContentKeysAsync](/dotnet/api/microsoft.azure.management.media.streaminglocatorsoperationsextensions.listcontentkeysasync?preserve-view=true&view=azure-dotnet) method.
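And a matching sketch for reading the keys off a specific locator (placeholder names again):

```csharp
var keys = await client.StreamingLocators.ListContentKeysAsync(
    "myResourceGroup", "myamsaccount", "myStreamingLocatorName");

foreach (var key in keys.ContentKeys)
{
    Console.WriteLine($"Key ID: {key.Id}, Type: {key.Type}, PolicyName: {key.PolicyName}");
}
```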
-
-For more information on content protection in the v3 API, see the article [Protect your content with Media Services dynamic encryption.](./drm-content-protection-concept.md)
-
-## Change the v2 ContentKeyPolicy keeping the same ContentKey
-
-You should first unpublish (remove all Streaming Locators) on the Asset via the v2 SDK. Here's how:
-
-1. Delete the locator.
-1. Unlink the `ContentKeyAuthorizationPolicy`.
-1. Unlink the `AssetDeliveryPolicy`.
-1. Unlink the `ContentKey`.
-1. Delete the `ContentKey`.
-1. Create a new `StreamingLocator` in v3 using a v3 `StreamingPolicy` and `ContentKeyPolicy`, specifying the specific content key identifier and key value needed.
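A minimal .NET sketch of that last step follows. The names here are hypothetical, the label must match whatever your chosen streaming policy references, and the key ID and value are the ones you recorded from v2:

```csharp
await client.StreamingLocators.CreateAsync(
    "myResourceGroup", "myamsaccount", "myNewLocator",
    new StreamingLocator
    {
        AssetName = "myAssetName",
        StreamingPolicyName = "myStreamingPolicyName",
        DefaultContentKeyPolicyName = "myContentKeyPolicyName",
        ContentKeys = new List<StreamingLocatorContentKey>
        {
            new StreamingLocatorContentKey
            {
                Id = Guid.Parse("00000000-0000-0000-0000-000000000000"), // the existing key ID
                LabelReferenceInStreamingPolicy = "myKeyLabel",          // hypothetical label
                Value = "<base64-encoded key value>"                     // the existing key value
            }
        }
    });
```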
-
-> [!NOTE]
-> It is possible to delete the v2 locator using the v3 API, but this won't remove the content key or the content key policy if they were created in the v2 API.
-
-## Content protection concepts, tutorials and how to guides
-
-### Concepts
-
-- [Protect your content with Media Services dynamic encryption](drm-content-protection-concept.md)
-- [Design of a multi-DRM content protection system with access control](architecture-design-multi-drm-system.md)
-- [Media Services v3 with PlayReady license template](drm-playready-license-template-concept.md)
-- [Media Services v3 with Widevine license template overview](drm-widevine-license-template-concept.md)
-- [Apple FairPlay license requirements and configuration](drm-fairplay-license-overview.md)
-- [Streaming Policies](stream-streaming-policy-concept.md)
-- [Content Key Policies](drm-content-key-policy-concept.md)
-
-### Tutorials
-
-[Quickstart: Use portal to encrypt content](drm-encrypt-content-how-to.md)
-
-### How to guides
-
-- [Get a signing key from the existing policy](drm-get-content-key-policy-how-to.md)
-- [Offline FairPlay Streaming for iOS with Media Services v3](drm-offline-fairplay-for-ios-concept.md)
-- [Offline Widevine streaming for Android with Media Services v3](drm-offline-widevine-for-android.md)
-- [Offline PlayReady Streaming for Windows 10 with Media Services v3](drm-offline-playready-streaming-for-windows-10.md)
-
-## Samples
-
-- [v2tov3MigrationSample](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/ContentProtection/v2tov3Migration)
-- You can also [compare the V2 and V3 code in the code samples](migrate-v-2-v-3-migration-samples.md).
-
-## Tools
-
-- [v3 Azure Media Services Explorer tool](https://github.com/Azure/Azure-Media-Services-Explorer)
-- [v2 Azure Media Services Explorer tool](https://github.com/Azure/Azure-Media-Services-Explorer/releases/tag/v4.3.15.0)
media-services Migrate V 2 V 3 Migration Scenario Based Encoding https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-migration-scenario-based-encoding.md
- Title: Encoding migration guidance
-description: This article gives you encoding scenario-based guidance that will assist you in migrating from Azure Media Services v2 to v3.
- Previously updated: 03/25/2021
-# Encoding scenario-based migration guidance
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-4.svg)
-
-This article gives you encoding scenario-based guidance that will assist you in migrating from Azure Media Services v2 to v3.
-
-## Prerequisites
-
-Before you get into changing your encoding workflow, it's recommended that you understand the differences in the way storage is managed. In AMS V3, the Azure Storage API is used to manage the storage account(s) associated with your Media Services account.
-
-> [!NOTE]
-> Jobs and tasks created in v2 do not show up in v3 as they are not associated with a transform. The recommendation is to switch to v3 transforms and jobs.
-
-## Encoding workflow comparison
-
-Take a few minutes to look at the flowcharts below for a visual comparison of the encoding workflows for V2 and V3.
-
-### V2 encoding workflow
-
-Click on the image below to see a larger version.
-
-[![Encoding workflow for V2](./media/migration-guide/V2-pretty.svg) ](./media/migration-guide/V2-pretty.svg#lightbox)
-
-1. Setup
- 1. Create an asset or use an existing asset. If using a new asset, upload content to that asset. If using an existing asset, you should be encoding files that already exist in the asset.
- 2. Get the values of the following items:
- - Media processor ID or object
- - Encoder string (name) of the encoder you want to use
- - Asset ID of new asset OR the asset ID of the existing asset
- 3. For monitoring, create either a job-level or task-level notification subscription, or an SDK event handler.
-2. Create the job that contains the task or tasks. Each task should include the above items and:
- - A directive that an output asset needs to be created. The output asset is created by the system.
- - Optional name for the output asset
-3. Submit the job.
-4. Monitor the job.
-
-### V3 encoding workflow
-
-[![Encoding workflow for V3](./media/migration-guide/V3-pretty.svg)](./media/migration-guide/V3-pretty.svg#lightbox)
-
-1. Set up
- 1. Create an asset or use an existing asset. If using a new asset, upload content to that asset. If using an existing asset, you should be encoding files that already exist in the asset. You *shouldn't upload more content to that asset.*
- 1. Create an output asset. The output asset is where the encoded files and input and output metadata will be stored.
- 1. Get values for the transform:
- - Standard Encoder preset
- - AMS resource group
- - AMS account name
- 1. Create the transform or use an existing transform. Transforms are reusable. It isn't necessary to create a new transform each time you want to submit a job.
-1. Create a job
- 1. For the job, get the values for the following items:
- - Transform name
- - Base-URI for the SAS URL for your asset, the HTTPS source path of your file share, or the local path of the files. The `JobInputAsset` can also use an asset name as an input.
- - File name(s)
- - Output asset(s)
- - A resource group
- - AMS account name
-1. Use [Event Grid](monitoring/monitor-events-portal-how-to.md) for monitoring your job.
-1. Submit the job. A minimal .NET sketch of this workflow follows.
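Here's a minimal .NET sketch of the workflow above. It assumes an authenticated `AzureMediaServicesClient` named `client` (see the setup article in this guide); the resource group, account, transform, asset, and job names, and the input URL, are all placeholders:

```csharp
using System.Collections.Generic;
using Microsoft.Azure.Management.Media;
using Microsoft.Azure.Management.Media.Models;

// 1. Create (or reuse) a transform with a built-in Standard Encoder preset.
var outputs = new List<TransformOutput>
{
    new TransformOutput(new BuiltInStandardEncoderPreset(EncoderNamedPreset.AdaptiveStreaming))
};
await client.Transforms.CreateOrUpdateAsync(
    "myResourceGroup", "myamsaccount", "myTransform", outputs);

// 2. Create an output asset to receive the encoded files and metadata.
await client.Assets.CreateOrUpdateAsync(
    "myResourceGroup", "myamsaccount", "myOutputAsset", new Asset());

// 3. Submit a job; the input here comes directly from an HTTPS URL,
//    so no input asset is needed.
await client.Jobs.CreateAsync(
    "myResourceGroup", "myamsaccount", "myTransform", "myJob",
    new Job
    {
        Input = new JobInputHttp(files: new List<string> { "https://example.com/video.mp4" }),
        Outputs = new List<JobOutput> { new JobOutputAsset("myOutputAsset") }
    });
```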
-
-## Custom presets from V2 to V3 encoding
-
-If your V2 code called the Standard Encoder with a custom preset, you first need to create a new transform with the custom Standard Encoder preset before submitting a job.
-
-Custom presets are now JSON-based, no longer XML-based. Recreate your preset in JSON following the custom preset schema as defined in the [Transform Open API (Swagger)](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2020-05-01/examples/transforms-create.json) documentation.
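If you work with the .NET SDK rather than raw JSON, the same schema is exposed through the `StandardEncoderPreset` model. The following sketch shows a hypothetical single-layer custom preset; the codec, layer, and format values are illustrative only, not recommendations:

```csharp
var preset = new StandardEncoderPreset(
    codecs: new List<Codec>
    {
        new AacAudio(channels: 2, samplingRate: 48000, bitrate: 128000, profile: AacAudioProfile.AacLc),
        new H264Video(
            keyFrameInterval: TimeSpan.FromSeconds(2),
            layers: new List<H264Layer>
            {
                new H264Layer(bitrate: 1000000, width: "1280", height: "720")
            })
    },
    formats: new List<Format>
    {
        new Mp4Format(filenamePattern: "{Basename}_{Bitrate}{Extension}")
    });

await client.Transforms.CreateOrUpdateAsync(
    "myResourceGroup", "myamsaccount", "myCustomTransform",
    new List<TransformOutput> { new TransformOutput(preset) });
```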
-
-## Input and output metadata files from an encoding job
-
-In v2, XML input and output metadata files get generated as the result of an encoding job. In v3, the metadata format changed from XML to JSON. For more information about metadata, see [Input metadata](input-metadata-schema.md) and [Output metadata](output-metadata-schema.md).
-
-## Premium Encoder to v3 Standard Encoder or partner-based solutions
-
-The v3 API no longer supports the Premium Encoder. If you previously used the workflow-based Premium Encoder for HEVC encoding, you should migrate to the new v3 [Standard Encoder](encode-media-encoder-standard-formats-reference.md) with HEVC encoding support.
-
-If you require the advanced workflow features of the Premium Encoder, you're encouraged to start using an Azure advanced encoding partner solution from [Imagine Communications](https://imaginecommunications.com), [Telestream](https://www.telestream.net), or [Bitmovin](https://bitmovin.com).
-
-## Jobs with inputs that are on HTTPS hosted URLs
-
-In V3, you can submit jobs from files stored in Azure Storage, stored locally, or hosted on external web servers by using the [HTTP(S) job input support](job-input-from-http-how-to.md).
-
-If you previously used workflows to copy files from Azure blob files into empty assets before submitting jobs, you may be able to simplify your workflow by passing a SAS URL for the file in Azure blob storage directly into the job.
-
-## Indexer v1 audio transcription to the new AudioAnalyzer "basic mode"
-
-For customers using the Indexer v1 processor in the v2 API, you need to create a transform that invokes the new `AudioAnalyzer` in [basic mode](transform-create-basic-audio-how-to.md) prior to submitting a Job.
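A minimal .NET sketch of such a transform, assuming an authenticated `AzureMediaServicesClient` named `client` and placeholder names:

```csharp
var outputs = new List<TransformOutput>
{
    new TransformOutput(new AudioAnalyzerPreset
    {
        AudioLanguage = "en-US",       // optional; omit to let the service detect the language
        Mode = AudioAnalysisMode.Basic // basic mode replaces Indexer v1 transcription
    })
};

await client.Transforms.CreateOrUpdateAsync(
    "myResourceGroup", "myamsaccount", "myBasicAudioTransform", outputs);
```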
-
-## Encoding, transforms and jobs concepts, tutorials and how to guides
-
-### Concepts
-
-- [Encoding video and audio with Media Services](encode-concept.md)
-- [Standard Encoder formats and codecs](encode-media-encoder-standard-formats-reference.md)
-- [Encode with an autogenerated bitrate ladder](encode-autogen-bitrate-ladder.md)
-- [Use the content-aware encoding preset to find the optimal bitrate value for a given resolution](encode-content-aware-concept.md)
-- [Media Reserved Units](concept-media-reserved-units.md)
-- [Input metadata](input-metadata-schema.md)
-- [Output metadata](output-metadata-schema.md)
-- [Dynamic packaging in Media Services v3: audio codecs](encode-dynamic-packaging-concept.md#audio-codecs-supported-by-dynamic-packaging)
-
-### Tutorials
-
-- [Tutorial: Encode a remote file based on URL and stream the video - .NET](stream-files-dotnet-quickstart.md)
-- [Tutorial: Upload, encode, and stream videos with Media Services v3](stream-files-tutorial-with-api.md)
-
-### How to guides
-
-- [Create a job input from an HTTPS URL](job-input-from-http-how-to.md)
-- [Create a job input from a local file](job-input-from-local-file-how-to.md)
-- [Create a basic audio transform](transform-create-basic-audio-how-to.md)
-- [How to encode with a custom transform](transform-custom-transform-how-to.md)
-- [How to create an overlay with Media Encoder Standard](transform-create-overlay-how-to.md)
-- [How to generate thumbnails using Encoder Standard](transform-generate-thumbnails-dotnet-how-to.md)
-- [Subclip a video when encoding with Media Services - REST](transform-subclip-video-how-to.md)
-
-## Samples
-
-You can also [compare the V2 and V3 code in the code samples](migrate-v-2-v-3-migration-samples.md).
media-services Migrate V 2 V 3 Migration Scenario Based Live Streaming https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-migration-scenario-based-live-streaming.md
- Title: Media Services live streaming migration guidance
-description: This article gives you live streaming scenario-based guidance to assist you in migrating from Azure Media Services v2 to v3.
- Previously updated: 03/25/2021
-# Live streaming scenario-based migration guidance
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-4.svg)
-
-The Azure portal now supports live event setup and management. You are encouraged to try it out while testing your V2 to V3 migration.
-
-## Test the V3 live event workflow
-
-> [!NOTE]
-> Channels and Programs created with v2 (which are mapped to live events and live outputs in v3) can't be managed with the V3 API. The guidance is to switch over to V3 live events and live outputs at a convenient time, when you can stop your existing V2 channel. There is currently no support for seamlessly migrating a continuously running 24x7 live channel to the new V3 live events, so maintenance downtime needs to be coordinated at the best convenience for your audience and business.
-
-Test the new way of delivering Live events with Media Services before moving your content from V2 to V3. Here are the V3 features to work with and consider for migration; a minimal .NET sketch of the standby workflow follows the list.
-
-- Create a new v3 [Live Event](live-event-outputs-concept.md#live-events) for encoding. You can enable [1080P and 720P encoding presets](live-event-types-comparison-reference.md#system-presets).
-- Use the [Live Output](live-event-outputs-concept.md#live-outputs) entity instead of Programs.
-- Create [streaming locators](stream-streaming-locators-concept.md).
-- Consider your need for [HLS and DASH](encode-dynamic-packaging-concept.md) live streaming.
-- If you require fast start-up of live events, explore the new [Standby mode](live-event-outputs-concept.md#standby-mode) features.
-- If you want to transcribe your live event while it is happening, explore the new [live transcription](live-event-live-transcription-how-to.md) feature.
-- Create 24x7x365 live events in v3 if you need a longer streaming duration.
-- Use [Event Grid](monitoring/monitor-events-portal-how-to.md) to monitor your live events.
-
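The following minimal .NET sketch shows the standby flow called out above; the names, location, and encoding settings are placeholders, and an authenticated `AzureMediaServicesClient` named `client` is assumed:

```csharp
// Create the live event stopped (autoStart: false), then allocate it into
// the lower-cost StandBy state so it can start within seconds when needed.
var liveEvent = new LiveEvent
{
    Location = "West US 2",
    Input = new LiveEventInput(LiveEventInputProtocol.RTMP),
    Encoding = new LiveEventEncoding { EncodingType = LiveEventEncodingType.Standard }
};

await client.LiveEvents.CreateAsync(
    "myResourceGroup", "myamsaccount", "myLiveEvent", liveEvent, autoStart: false);
await client.LiveEvents.AllocateAsync("myResourceGroup", "myamsaccount", "myLiveEvent");

// Later, transition from StandBy to the running (billable) state.
await client.LiveEvents.StartAsync("myResourceGroup", "myamsaccount", "myLiveEvent");
```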
-See Live events concepts, tutorials and how to guides below for specific steps.
-
-## Live events concepts, tutorials and how to guides
-
-### Concepts
-
-- [Live streaming with Azure Media Services v3](stream-live-streaming-concept.md)
-- [Live events and live outputs in Media Services](live-event-outputs-concept.md)
-- [Verified on-premises live streaming encoders](encode-recommended-on-premises-live-encoders.md)
-- [Use time-shifting and Live Outputs to create on-demand video playback](live-event-cloud-dvr-time-how-to.md)
-- [Live transcription (preview)](live-event-live-transcription-how-to.md)
-- [Live Event types comparison](live-event-types-comparison-reference.md)
-- [Live event states and billing](live-event-states-billing-concept.md)
-- [Live Event low latency settings](live-event-latency-reference.md)
-- [Media Services Live Event error codes](live-event-error-codes-reference.md)
-
-### Tutorials and quickstarts
-
-- [Tutorial: Stream live with Media Services](stream-live-tutorial-with-api.md)
-- [Create an Azure Media Services live stream with OBS](live-event-obs-quickstart.md)
-- [Quickstart: Upload, encode, and stream content with portal](asset-create-asset-upload-portal-quickstart.md)
-- [Quickstart: Create an Azure Media Services live stream with Wirecast](live-event-wirecast-quickstart.md)
-
-## Samples
-
-You can also [compare the V2 and V3 code in the code samples](migrate-v-2-v-3-migration-samples.md).
media-services Migrate V 2 V 3 Migration Scenario Based Media Reserved Units https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-migration-scenario-based-media-reserved-units.md
- Title: Media Reserved Units (MRUs) migration guidance
-description: This article gives you MRU scenario-based guidance that will assist you in migrating from Azure Media Services V2 to V3.
- Previously updated: 08/25/2021
-# Media reserved units migration guidance
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-4.svg)
-
-This article gives you MRU scenario-based guidance that will assist you in migrating from Azure Media Services V2 to V3.
-
-> [!Important]
-> Media Reserved Units are no longer needed for any Media Services account, as the system will automatically scale up and down based on load.
-
-## Scenario guidance
-
-Please migrate your MRUs based on the following scenarios:
-
-* For all Media Services accounts, you no longer are required to set Media Reserved Units (MRUs). The system will now automatically scale up and down based on load.
-* If you have an account that was created before the 2020-05-01 version of the API, you can still have access to APIs for managing MRUs; however, none of the MRU configuration that you set will be used to control encoding concurrency or performance. For more information, see [Scaling Media Processing](../previous/media-services-scale-media-processing-overview.md). You can manage the MRUs using CLI 2.0 for Media Services V3, or using the Azure portal.
-* If you don't see the option to manage MRUs in the Azure portal, you're running an account that was created with the 2020-05-01 API or later.
-* If you are familiar with setting your MRU type to S3, your performance will improve or remain the same with the removal of MRUs.
-* If you are an existing V2 customer, you need to create a new V3 account to support your existing application prior to the completion of migration.
-* Indexer V1 or other media processors that are not fully deprecated yet may need to be enabled again.
-
-For more information about MRUs, see [Media Reserved Units](concept-media-reserved-units.md) and [How to scale media reserved units](media-reserved-units-how-to.md).
-
-## MRU concepts, tutorials and how to guides
-
-### Concepts
-
-[Media Reserved Units](concept-media-reserved-units.md)
-
-### How to guides
-
-[How to scale media reserved units](media-reserved-units-how-to.md)
-
-## Samples
-
-You can also [compare the V2 and V3 code in the code samples](migrate-v-2-v-3-migration-samples.md).
media-services Migrate V 2 V 3 Migration Scenario Based Publishing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-migration-scenario-based-publishing.md
- Title: Packaging and delivery migration guidance
-description: This article gives you scenario-based guidance for packaging and delivery to assist you in migrating from Azure Media Services v2 to v3.
- Previously updated: 03/25/2021
-# Packaging and delivery scenario-based migration guidance
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-4.svg)
-
-This article gives you scenario-based guidance for packaging and delivery that will assist you in migrating from Azure Media Services v2 to v3.
-
-There are major changes to the way content is published in the v3 API. The new publishing model is simplified and uses fewer entities to create a Streaming Locator: the API is reduced to just two entities, versus the four previously required. Content Key Policies and Streaming Locators now replace the need for `ContentKeyAuthorizationPolicy`, `AssetDeliveryPolicy`, `ContentKey`, and `AccessPolicy`.
-
-## Packaging and delivery in v3
-
-1. Create [Content Key Policies](drm-content-key-policy-concept.md).
-1. Create [Streaming Locators](stream-streaming-locators-concept.md).
-1. Get the [Streaming paths](create-streaming-locator-build-url.md)
- 1. Configure it for a [DASH](encode-dynamic-packaging-concept.md#deliver-dash) or [HLS](encode-dynamic-packaging-concept.md#deliver-hls) player.
-
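A minimal .NET sketch of steps 2 and 3, assuming an authenticated `AzureMediaServicesClient` named `client`, a running default streaming endpoint, and placeholder names:

```csharp
// Step 2: publish the asset with a predefined clear streaming policy.
await client.StreamingLocators.CreateAsync(
    "myResourceGroup", "myamsaccount", "myLocator",
    new StreamingLocator
    {
        AssetName = "myOutputAsset",
        StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly
    });

// Step 3: build the streaming URLs from the paths and the endpoint host name.
var paths = await client.StreamingLocators.ListPathsAsync(
    "myResourceGroup", "myamsaccount", "myLocator");
var endpoint = await client.StreamingEndpoints.GetAsync(
    "myResourceGroup", "myamsaccount", "default");

foreach (var path in paths.StreamingPaths)   // one entry per protocol (DASH, HLS, Smooth)
{
    foreach (var relativePath in path.Paths)
    {
        Console.WriteLine($"{path.StreamingProtocol}: https://{endpoint.HostName}{relativePath}");
    }
}
```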
-See publishing concepts, tutorials and how to guides below for specific steps.
-
-## Will v2 streaming locators continue to work after February 2024?
-
-Streaming locators created with v2 API will continue to work after our v2 API is turned off. Once the Streaming Locator data is created in the Media Services backend database, there is no dependency on the v2 REST API for streaming. We will not remove v2 specific records from the database when v2 is turned off in February 2024.
-
-There are some properties of assets and locators created with v2 that cannot be accessed or updated using the new v3 API. For example, v2 exposes an **Asset Files** API that does not have an equivalent feature in the v3 API. This is not a problem for most customers, since it is not a widely used feature, and you can still stream old locators and delete them when they are no longer needed.
-
-After migration, you should avoid making any calls to the v2 API to modify streaming locators or assets.
-
-## Publishing concepts, tutorials and how to guides
-
-### Concepts
-
-- [Dynamic packaging in Media Services v3](encode-dynamic-packaging-concept.md)
-- [Filters](filters-concept.md)
-- [Filter your manifests using Dynamic Packager](filters-dynamic-manifest-concept.md)
-- [Streaming Endpoints (Origin) in Azure Media Services](stream-streaming-endpoint-concept.md)
-- [Stream content with CDN integration](stream-scale-streaming-cdn-concept.md)
-- [Streaming Locators](stream-streaming-locators-concept.md)
-
-### How to guides
-
-- [Manage streaming endpoints with Media Services v3](stream-manage-streaming-endpoints-how-to.md)
-- [Create a streaming locator and build URLs](create-streaming-locator-build-url.md)
-- [Download the results of a job](job-download-results-how-to.md)
-- [Signal descriptive audio tracks](signal-descriptive-audio-howto.md)
-- [Azure Media Player full setup](../azure-media-player/azure-media-player-full-setup.md)
-- [How to use the Video.js player with Azure Media Services](player-how-to-video-js-player.md)
-- [How to use the Shaka player with Azure Media Services](player-shaka-player-how-to.md)
-
-## Samples
-
-You can also [compare the V2 and V3 code in the code samples](migrate-v-2-v-3-migration-samples.md).
media-services Migrate V 2 V 3 Migration Setup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/migrate-v-2-v-3-migration-setup.md
- Title: Media Services v2 to v3 migration setup
-description: This article will assist you with setting up your environment for migrating from Azure Media Services v2 to v3.
---
-tags: ''
-keywords: azure media services, migration, stream, broadcast, live, SDK
- Previously updated: 03/25/2021
-# Step 3 - Set up to migrate to the V3 REST API or client SDK
-
-![migration guide logo](./media/migration-guide/azure-media-services-logo-migration-guide.svg)
-
-<hr color="#5ea0ef" size="10">
-
-![migration steps 2](./media/migration-guide/steps-3.svg)
-
-The following steps describe how to set up your environment to use the Media Services V3 API.
-
-## SDK model
-
-In the V2 API, there were two different client SDKs, one for the management API, which allowed programmatic creation of accounts, and one for resource management.
-
-Previously, developers would work with an Azure service principal client ID and client secret, along with a specific V2 REST API endpoint for their AMS account.
-
-The V3 API is Azure Resource Manager (ARM) based. It uses Azure Active Directory (Azure AD) service principal IDs and keys to connect to the API. Developers will need to create service principals or managed identities to connect to the API. The V3 API uses standard ARM endpoints and follows a model that is similar and consistent across all other Azure services.
-
-Customers previously using the 2015-10-01 version of the ARM management API to manage their V2 accounts should use the 2020-05-01 (or later) version of the ARM management API supported for V3 API access.
-
-## Create a new media services account for testing
-
-Follow the quickstart steps for [setting up your environment](setup-azure-subscription-how-to.md?tabs=portal) using the Azure portal. Select API access and service principal authentication to generate a new Azure AD application ID and secrets for use with this test account.
-
-[Create a media services account](account-create-how-to.md?tabs=portal).
-[Get credentials to access Media Services API](access-api-howto.md?tabs=portal).
-
-## Download client SDK of your choice and set up your environment
-
-- SDKs are available for [.NET](/dotnet/api/overview/azure/mediaservices/management), .NET Core, [Node.js](/javascript/api/overview/azure/arm-mediaservices-readme), [Python](/python/api/overview/azure/mediaservices/management), and [Java](/jav).
-- [Azure CLI](/cli/azure/ams) integration for simple scripting support.
-
-> [!NOTE]
-> A community PHP SDK is no longer available for Azure Media Services on V3. If you're using PHP on V2, you should migrate to the REST API directly in your code.
-
-## Open API specifications
-
-- V3 is based on a unified API surface, which exposes both management and operations functionality built on Azure Resource Manager. Azure Resource Manager templates can be used to create and deploy transforms, streaming endpoints, live events, and more.
-
-- The [OpenAPI Specification](https://github.com/Azure/azure-rest-api-specs/tree/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2020-05-01) (formerly called Swagger) document explains the schema for all service components.
-
-- All client SDKs are derived and generated from the Open API specification published on GitHub. At the time of publication of this article, the latest Open API specifications are maintained publicly in GitHub. The 2020-05-01 version is the [latest stable release](https://github.com/Azure/azure-rest-api-specs/tree/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2020-05-01).
-
-## [REST](#tab/rest)
-
-Use [Postman](./setup-postman-rest-how-to.md) for Media Services v3 REST API calls.
-Read the [REST API reference pages](/rest/api/media/).
-
-You should use the 2020-05-01 (or later) version string in the Postman collection.
-
-## [.NET](#tab/net)
-
-Read the article, [Connect to Media Services v3 API with .NET](configure-connect-dotnet-howto.md) to set up your environment.
-
-If you simply want to install the latest SDK using the NuGet Package Manager, use the following command:
-
-`Install-Package Microsoft.Azure.Management.Media`
-
-Or to install the latest SDK using the .NET CLI use the following command:
-
-`dotnet add package Microsoft.Azure.Management.Media`
-
-Additionally, full .NET samples are available in [Azure-Samples/media-services-v3-dotnet](https://github.com/Azure-Samples/media-services-v3-dotnet) for various scenarios. The projects in this repository show how to implement different Azure Media Services scenarios using the v3 version.
-
-### Get started adjusting your code
-
-Search your code base for instances of `CloudMediaContext` usage to begin the upgrade process to the V3 API.
-
-The following code shows how the v2 API was previously accessed using the v2 .NET SDK. Developers would begin with creating a `CloudMediaContext` and create an instance with an `AzureAdTokenCredentials` object.
-
-```csharp
-
-class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- try
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
-            _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-         }
-         catch (Exception exception)
-         {
-            // Handle or inspect the exception as appropriate.
-            Console.WriteLine(exception.Message);
-         }
-      }
-   }
-```
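For comparison, here's a minimal sketch of the equivalent v3 connection using the `Microsoft.Azure.Management.Media` SDK. The tenant domain, client ID, client secret, and subscription ID values are assumed to come from your own configuration store:

```csharp
using Microsoft.Azure.Management.Media;
using Microsoft.Rest.Azure.Authentication;

// Sign in with the AAD service principal and create the one client object
// that exposes all v3 entities (Assets, Transforms, Jobs, StreamingLocators, ...).
var credentials = await ApplicationTokenProvider.LoginSilentAsync(
    tenantDomain, clientId, clientSecret);

var client = new AzureMediaServicesClient(credentials)
{
    SubscriptionId = subscriptionId
};
```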
-
-## [Java](#tab/java)
-
-Read the article, [Connect to Media Services v3 API with Java](configure-connect-java-howto.md) to set up your environment.
-
-## [Python](#tab/python)
-
-Read the article, [Connect to Azure Media Services v3 API - Python](configure-connect-python-howto.md) to set up your environment.
-
-## [Node.js](#tab/nodejs)
-
-Read the article [Connect to Azure Media Services v3 API - Node.js](configure-connect-nodejs-howto.md) to set up your environment.
-
-## [Ruby](#tab/ruby)
-
-- Get the [Ruby](https://github.com/Azure/azure-sdk-for-ruby/blob/master/README.md) SDK on GitHub.
-
-## [Go](#tab/go)
-
-Download the [Go](https://godoc.org/github.com/Azure/azure-sdk-for-go/services/mediaservices/mgmt/2018-07-01/media) SDK.
--
media-services Job State Events Cli How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/monitoring/job-state-events-cli-how-to.md
- Title: Monitor Azure Media Services events with Event Grid
-description: This article shows how to subscribe to Event Grid in order to monitor Azure Media Services events by using Azure CLI.
- Previously updated: 03/17/2021
-# Create and monitor Media Services events with Event Grid using the Azure CLI
--
-Azure Event Grid is an eventing service for the cloud. This service uses [event subscriptions](../../../event-grid/concepts.md#event-subscriptions) to route event messages to subscribers. Media Services events contain all the information you need to respond to changes in your data. You can identify a Media Services event because the eventType property starts with "Microsoft.Media.". For more information, see [Media Services event schemas](media-services-event-schemas.md).
-
-In this article, you use the Azure CLI to subscribe to events for your Azure Media Services account. Then, you trigger events to view the result. Typically, you send events to an endpoint that processes the event data and takes actions. In this article, you send the events to a web app that collects and displays the messages.
-
-## Prerequisites
-
-- An active Azure subscription. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
-- Install and use the CLI locally. This article requires the Azure CLI version 2.0 or later. Run `az --version` to find the version you have. If you need to install or upgrade, see [Install the Azure CLI](/cli/azure/install-azure-cli).
-
- Currently, not all [Media Services v3 CLI](/cli/azure/ams) commands work in the Azure Cloud Shell. It is recommended to use the CLI locally.
-
-- [Create a Media Services account](../account-create-how-to.md).
-
- Make sure to remember the values that you used for the resource group name and Media Services account name.
-
-## Create a message endpoint
-
-Before subscribing to the events for the Media Services account, let's create the endpoint for the event message. Typically, the endpoint takes actions based on the event data. In this article, you deploy a [pre-built web app](https://github.com/Azure-Samples/azure-event-grid-viewer) that displays the event messages. The deployed solution includes an App Service plan, an App Service web app, and source code from GitHub.
-
-1. Select **Deploy to Azure** to deploy the solution to your subscription. In the Azure portal, provide values for the parameters.
-
- [![Image showing a button labeled "Deploy to Azure".](https://azuredeploy.net/deploybutton.png)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure-Samples%2Fazure-event-grid-viewer%2Fmaster%2Fazuredeploy.json)
-
-1. The deployment may take a few minutes to complete. After the deployment has succeeded, view your web app to make sure it's running. In a web browser, navigate to:
-`https://<your-site-name>.azurewebsites.net`
-
-If you switch to the "Azure Event Grid Viewer" site, you see it has no events yet.
-
-
-## Set the Azure subscription
-
-In the following command, provide the Azure subscription ID that you want to use for the Media Services account. You can see a list of subscriptions that you have access to by navigating to [Subscriptions](https://portal.azure.com/#blade/Microsoft_Azure_Billing/SubscriptionsBlade).
-
-```azurecli
-az account set --subscription mySubscriptionId
-```
-
-## Subscribe to Media Services events
-
-You subscribe to a topic to tell Event Grid which events you want to track. The following example subscribes to the Media Services account you created, and passes the URL from the website you created as the endpoint for event notification.
-
-Replace `<event_subscription_name>` with a unique name for your event subscription. For `<resource_group_name>` and `<ams_account_name>`, use the values you used when creating the Media Services account. For the `<endpoint_URL>`, provide the URL of your web app and add `api/updates` to the home page URL. By specifying the endpoint when subscribing, Event Grid handles the routing of events to that endpoint.
-
-1. Get the resource ID.
-
- ```azurecli
- amsResourceId=$(az ams account show --name <ams_account_name> --resource-group <resource_group_name> --query id --output tsv)
- ```
-
- For example:
-
- ```
- amsResourceId=$(az ams account show --name amsaccount --resource-group amsResourceGroup --query id --output tsv)
- ```
-
-2. Subscribe to the events
-
- ```azurecli
- az eventgrid event-subscription create \
- --source-resource-id $amsResourceId \
- --name <event_subscription_name> \
- --endpoint <endpoint_URL>
- ```
-
- For example:
-
- ```
- az eventgrid event-subscription create --source-resource-id $amsResourceId --name amsTestEventSubscription --endpoint https://amstesteventgrid.azurewebsites.net/api/updates/
- ```
-
- > [!TIP]
-   > You might get a validation handshake warning. Give it a few minutes and the handshake should validate.
-
-Now, let's trigger events to see how Event Grid distributes the message to your endpoint.
-
-## Send an event to your endpoint
-
-You can trigger events for the Media Services account by running an encoding job. You can follow [this quickstart](../stream-files-dotnet-quickstart.md) to encode a file and start sending events.
-
-View your web app again, and notice that a subscription validation event has been sent to it. Event Grid sends the validation event so the endpoint can verify that it wants to receive event data. The endpoint has to set `validationResponse` to `validationCode`. For more information, see [Event Grid security and authentication](../../../event-grid/security-authentication.md). You can view the web app code to see how it validates the subscription.
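For illustration, here's a hedged sketch of how an endpoint completes that handshake with the `Microsoft.Azure.EventGrid` SDK, inside a hypothetical ASP.NET Core action (the deployed viewer app already does this for you):

```csharp
using Microsoft.Azure.EventGrid;
using Microsoft.Azure.EventGrid.Models;

// "requestBodyJson" is assumed to hold the raw JSON POSTed by Event Grid.
var subscriber = new EventGridSubscriber();
var events = subscriber.DeserializeEventGridEvents(requestBodyJson);

foreach (var eventGridEvent in events)
{
    if (eventGridEvent.Data is SubscriptionValidationEventData validationData)
    {
        // Echo the validation code back so Event Grid activates the subscription.
        return Ok(new SubscriptionValidationResponse
        {
            ValidationResponse = validationData.ValidationCode
        });
    }
}
```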
-
-> [!TIP]
-> Select the eye icon to expand the event data. Do not refresh the page if you want to view all the events.
-
-![View subscription event](../media/monitor-events-portal/view-subscription-event.png)
-
-## Next steps
-
-[Upload, encode, and stream](../stream-files-tutorial-with-api.md)
media-services Media Services Diagnostic Logs Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/monitoring/media-services-diagnostic-logs-howto.md
- Title: Monitor diagnostic logs via Azure Monitor
-description: This article demonstrates how to route and view diagnostic logs via Azure Monitor.
- Previously updated: 03/17/2021
-# Monitor Media Services diagnostic logs
--
-[Azure Monitor](../../../azure-monitor/overview.md) enables you to monitor metrics and diagnostic logs that help you understand how your applications are performing. For a detailed description of this feature, and to see why you would want to use Azure Media Services metrics and diagnostic logs, see [Monitor Media Services metrics and diagnostic logs](monitor-media-services.md).
-
-This article shows you how to route data to the storage account and then view the data.
-
-## Prerequisites
-
-- [Create a Media Services account](../account-create-how-to.md).
-- Review [Monitor Media Services](monitor-media-services.md).
-
-## Route data to the storage account using the portal
-
-1. Log in to the Azure portal at https://portal.azure.com.
-1. Navigate to your Media Services account and click **Diagnostic Settings** under **Monitor**. Here you see a list of all resources in your subscription that produce monitoring data through Azure Monitor.
-
- ![Screenshot that highlights Diagnostic settings under the Monitoring section.](../media/media-services-diagnostic-logs/logs01.png)
-
-1. Click **Add diagnostic setting**.
-
- A resource diagnostic setting is a definition of *what* monitoring data should be routed from a particular resource and *where* that monitoring data should go.
-
-1. In the section that appears, give your setting a **name** and check the box for **Archive to a storage account**.
-
- Select the storage account to which you want to send logs and press **OK**.
-1. Check all the boxes under **Log** and **Metric**. Depending on the resource type, you may only have one of these options. These checkboxes control what categories of log and metric data available for that resource type are sent to the destination you've selected, in this case, a storage account.
-
- ![Diagnostic settings section](../media/media-services-diagnostic-logs/logs02.png)
-1. Set the **Retention (days)** slider to 30. This slider sets a number of days to retain the monitoring data in the storage account. Azure Monitor automatically deletes data older than the number of days specified. A retention of zero days stores the data indefinitely.
-1. Click **Save**.
-
-Monitoring data from your resource is now flowing into the storage account.
-
-## Route data to the storage account using the Azure CLI
-
-To enable storage of diagnostic logs in a Storage Account, you would run the following `az monitor diagnostic-settings` Azure CLI command:
-
-```azurecli-interactive
-az monitor diagnostic-settings create --name <diagnostic name> \
- --storage-account <name or ID of storage account> \
- --resource <target resource object ID> \
- --resource-group <storage account resource group> \
- --logs '[
- {
- "category": <category name>,
- "enabled": true,
- "retentionPolicy": {
- "days": <# days to retain>,
- "enabled": true
- }
- }]'
-```
-
-For example:
-
-```azurecli-interactive
-az monitor diagnostic-settings create --name amsv3diagnostic \
- --storage-account storageaccountforams \
- --resource "/subscriptions/00000000-0000-0000-0000-0000000000/resourceGroups/amsResourceGroup/providers/Microsoft.Media/mediaservices/amsaccount" \
- --resource-group "amsResourceGroup" \
- --logs '[{"category": "KeyDeliveryRequests", "enabled": true, "retentionPolicy": {"days": 3, "enabled": true }}]'
-```
-
-## View data in the storage account using the portal
-
-If you have followed the preceding steps, data has begun flowing to your storage account.
-
-You may need to wait up to five minutes before the event appears in the storage account.
-
-1. In the portal, navigate to the **Storage Accounts** section by finding it on the left-hand navigation bar.
-1. Identify the storage account you created in the preceding section and click on it.
-1. Click on **Blobs**, then on the container labeled **insights-logs-keydeliveryrequests**. This is the container that has your logs in it. Monitoring data is broken out into containers by resource ID, then by date and time.
-1. Navigate to the PT1H.json file by clicking into the containers for resource ID, date, and time. Click on the PT1H.json file and click **Download**.
-
- You can now view the JSON event that was stored in the storage account.
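If you'd rather read the archived logs programmatically than through the portal, here's a minimal sketch using the `Azure.Storage.Blobs` SDK; the connection string variable is a placeholder, and the container name comes from the steps above:

```csharp
using Azure.Storage.Blobs;

var container = new BlobContainerClient(storageConnectionString, "insights-logs-keydeliveryrequests");

// Walk the container (blobs are organized by resource ID, then date and time)
// and print each hourly PT1H.json log file; each line is one JSON event.
await foreach (var blobItem in container.GetBlobsAsync())
{
    if (!blobItem.Name.EndsWith("PT1H.json")) continue;

    var download = await container.GetBlobClient(blobItem.Name).DownloadContentAsync();
    Console.WriteLine(download.Value.Content.ToString());
}
```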
-
-### Examples of PT1H.json
-
-#### Clear key delivery log
-
-```json
-{
- "time": "2019-05-21T00:07:33.2820450Z",
- "resourceId": "/SUBSCRIPTIONS/00000000-0000-0000-0000-0000000000000/RESOURCEGROUPS/amsResourceGroup/PROVIDERS/MICROSOFT.MEDIA/MEDIASERVICES/AMSACCOUNT",
- "operationName": "MICROSOFT.MEDIA/MEDIASERVICES/CONTENTKEYS/READ",
- "operationVersion": "1.0",
- "category": "KeyDeliveryRequests",
- "resultType": "Succeeded",
- "resultSignature": "OK",
- "durationMs": 253,
- "identity": {
- "authorization": {
- "issuer": "myIssuer",
- "audience": "myAudience"
- },
- "claims": {
- "urn:microsoft:azure:media
- "nbf": "1558396914",
- "exp": "1558400814",
- "iss": "myIssuer",
- "aud": "myAudience"
- }
- },
- "level": "Informational",
- "location": "westus2",
- "properties": {
- "requestId": "fb5c2b3a-bffa-4434-9c6f-73d689649add",
- "keyType": "Clear",
- "keyId": "e4276e1d-c012-40b1-80d0-ac15808b9277",
- "policyName": "SharedContentKeyPolicyUsedByAllAssets",
- "tokenType": "JWT",
- "statusMessage": "OK"
- }
-}
-```
-
-#### Widevine encrypted key delivery log
-
-```json
-{
- "time": "2019-05-20T23:15:22.7088747Z",
- "resourceId": "/SUBSCRIPTIONS/00000000-0000-0000-0000-0000000000000/RESOURCEGROUPS/amsResourceGroup/PROVIDERS/MICROSOFT.MEDIA/MEDIASERVICES/AMSACCOUNT",
- "operationName": "MICROSOFT.MEDIA/MEDIASERVICES/CONTENTKEYS/READ",
- "operationVersion": "1.0",
- "category": "KeyDeliveryRequests",
- "resultType": "Succeeded",
- "resultSignature": "OK",
- "durationMs": 69,
- "identity": {
- "authorization": {
- "issuer": "myIssuer",
- "audience": "myAudience"
- },
- "claims": {
- "urn:microsoft:azure:media
- "nbf": "1558392430",
- "exp": "1558396330",
- "iss": "myIssuer",
- "aud": "myAudience"
- }
- },
- "level": "Informational",
- "location": "westus2",
- "properties": {
- "requestId": "49613dd2-16aa-4595-a6e0-4e68beae6d37",
- "keyType": "Widevine",
- "keyId": "0092d23a-0a42-4c5f-838e-6d1bbc6346f8",
- "policyName": "DRMContentKeyPolicy",
- "tokenType": "JWT",
- "statusMessage": "OK"
- }
-}
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## See also
-
-* [Azure Monitor Metrics](../../../azure-monitor/data-platform.md)
-* [Azure Monitor Diagnostic logs](../../../azure-monitor/essentials/platform-logs-overview.md)
-* [How to collect and consume log data from your Azure resources](../../../azure-monitor/essentials/platform-logs-overview.md)
-
-## Next steps
-
-[Monitor metrics](media-services-metrics-howto.md)
media-services Media Services Event Schemas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/monitoring/media-services-event-schemas.md
- Title: Azure Event Grid schemas for Media Services events
-description: Learn about the properties that are provided for Media Services events with Azure Event Grid.
- Previously updated: 07/08/2021
-# Azure Event Grid schemas for Media Services events
-This article provides the schemas and properties for Media Services events.
-
-For a list of sample scripts and tutorials, see [Media Services event source](../../../event-grid/event-schema-subscriptions.md).
-
-## Job-related event types
-
-Media Services emits the **Job** related event types described below. There are two categories for the **Job** related events: "Monitoring Job State Changes" and "Monitoring Job Output State Changes".
-
-You can register for all of the events by subscribing to the JobStateChange event. Or, you can subscribe for specific events only (for example, final states like JobErrored, JobFinished, and JobCanceled).
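-
-For example, a subscription that delivers only the final job states to a webhook can be created with the Azure CLI. The following is a sketch; the resource ID and endpoint URL are placeholders:
-
-```azurecli-interactive
-# Subscribe a webhook to only the final job states of a Media Services account.
-az eventgrid event-subscription create --name finalJobStates \
- --source-resource-id "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>" \
- --endpoint "https://<your-endpoint>/api/updates" \
- --included-event-types Microsoft.Media.JobFinished Microsoft.Media.JobErrored Microsoft.Media.JobCanceled
-```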
-
-### Monitoring Job state changes
-
-| Event type | Description |
-| - | -- |
-| Microsoft.Media.JobStateChange| Get an event for all Job State changes. |
-| Microsoft.Media.JobScheduled| Get an event when Job transitions to scheduled state. |
-| Microsoft.Media.JobProcessing| Get an event when Job transitions to processing state. |
-| Microsoft.Media.JobCanceling| Get an event when Job transitions to canceling state. |
-| Microsoft.Media.JobFinished| Get an event when Job transitions to finished state. This is a final state that includes Job outputs.|
-| Microsoft.Media.JobCanceled| Get an event when Job transitions to canceled state. This is a final state that includes Job outputs.|
-| Microsoft.Media.JobErrored| Get an event when Job transitions to error state. This is a final state that includes Job outputs.|
-
-See [Schema examples](#event-schema-examples) that follow.
-
-### Monitoring job output state changes
-
-A job may contain multiple job outputs (if you configured the transform to have multiple job outputs). If you want to track the details of an individual job output, listen for a job output change event.
-
-Each **Job** is at a higher level than its **JobOutput** entries, so job output events fire within a corresponding job.
-
-The error messages in `JobFinished`, `JobCanceled`, and `JobErrored` output the aggregated results for each job output, once all of them are finished. In contrast, the job output events fire as each task finishes. For example, if you have an encoding output followed by a Video Analytics output, you would get two job output events before the final JobFinished event fires with the aggregated data.
-
-| Event type | Description |
-| - | -- |
-| Microsoft.Media.JobOutputStateChange| Get an event for all Job output State changes. |
-| Microsoft.Media.JobOutputScheduled| Get an event when Job output transitions to scheduled state. |
-| Microsoft.Media.JobOutputProcessing| Get an event when Job output transitions to processing state. |
-| Microsoft.Media.JobOutputCanceling| Get an event when Job output transitions to canceling state.|
-| Microsoft.Media.JobOutputFinished| Get an event when Job output transitions to finished state.|
-| Microsoft.Media.JobOutputCanceled| Get an event when Job output transitions to canceled state.|
-| Microsoft.Media.JobOutputErrored| Get an event when Job output transitions to error state.|
-
-See [Schema examples](#event-schema-examples) that follow.
-
-### Monitoring job output progress
-
-| Event type | Description |
-| - | -- |
-| Microsoft.Media.JobOutputProgress| This event reflects the job processing progress, from 0% to 100%. The service attempts to send an event if there has been a 5% or greater increase in the progress value, or if more than 30 seconds have passed since the last event (heartbeat). The progress value is not guaranteed to start at 0%, or to reach 100%, nor is it guaranteed to increase at a constant rate over time. Don't use this event to determine that the processing has been completed; instead, use the state change events.|
-
-See [Schema examples](#event-schema-examples) that follow.
-
-## Live event types
-
-Media Services also emits the **Live** event types described below. There are two categories for the **Live** events: stream-level events and track-level events.
-
-### Stream-level events
-
-Stream-level events are raised per stream or connection. Each event has a `StreamId` parameter that identifies the connection or stream. Each stream or connection has one or more tracks of different types. For example, one connection from an encoder may have one audio track and four video tracks. The stream event types are:
-
-| Event type | Description |
-| - | -- |
-| Microsoft.Media.LiveEventConnectionRejected | Encoder's connection attempt is rejected. |
-| Microsoft.Media.LiveEventEncoderConnected | Encoder establishes connection with live event. |
-| Microsoft.Media.LiveEventEncoderDisconnected | Encoder disconnects. |
-
-See [Schema examples](#event-schema-examples) that follow.
-
-### Track-level events
-
-Track-level events are raised per track.
-
-> [!NOTE]
-> All track-level events are raised after a live encoder is connected.
-
-The track-level event types are:
-
-| Event type | Description |
-| - | -- |
-| Microsoft.Media.LiveEventIncomingDataChunkDropped | Media server drops data chunk because it's too late or has an overlapping timestamp (timestamp of new data chunk is less than the end time of the previous data chunk). |
-| Microsoft.Media.LiveEventIncomingStreamReceived | Media server receives first data chunk for each track in the stream or connection. |
-| Microsoft.Media.LiveEventIncomingStreamsOutOfSync | Media server detects audio and video streams are out of sync. Use as a warning because user experience may not be impacted. |
-| Microsoft.Media.LiveEventIncomingVideoStreamsOutOfSync | Media server detects any of the two video streams coming from external encoder are out of sync. Use as a warning because user experience may not be impacted. |
-| Microsoft.Media.LiveEventIngestHeartbeat | Published every 20 seconds for each track when the live event is running. Provides an ingest health summary.<br/><br/>After the encoder initially connects, the heartbeat event continues to be emitted every 20 seconds, whether the encoder is still connected or not. |
-| Microsoft.Media.LiveEventTrackDiscontinuityDetected | Media server detects discontinuity in the incoming track. |
-
-See [Schema examples](#event-schema-examples) that follow.
-
-## Event schema examples
-
-### JobStateChange
-
-The following example shows the schema of the **JobStateChange** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "transforms/VideoAnalyzerTransform/jobs/<job-id>",
- "eventType": "Microsoft.Media.JobStateChange",
- "eventTime": "2018-04-20T21:26:13.8978772",
- "id": "b9d38923-9210-4c2b-958f-0054467d4dd7",
- "data": {
- "previousState": "Processing",
- "state": "Finished"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `previousState` | string | The state of the job before the event. |
-| `state` | string | The new state of the job being notified in this event. For example, "Scheduled: The job is ready to start" or "Finished: The job is finished".|
-
-The Job state can be one of the following values: *Queued*, *Scheduled*, *Processing*, *Finished*, *Error*, *Canceled*, or *Canceling*.
-
-> [!NOTE]
-> *Queued* will only be present in the **previousState** property, not in the **state** property.
-
-### JobScheduled, JobProcessing, JobCanceling
-
-For each non-final Job state change (such as JobScheduled, JobProcessing, JobCanceling), the example schema looks similar to the following:
-
-```json
-[{
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "transforms/VideoAnalyzerTransform/jobs/<job-id>",
- "eventType": "Microsoft.Media.JobProcessing",
- "eventTime": "2018-10-12T16:12:18.0839935",
- "id": "a0a6efc8-f647-4fc2-be73-861fa25ba2db",
- "data": {
- "previousState": "Scheduled",
- "state": "Processing",
- "correlationData": {
- "testKey1": "testValue1",
- "testKey2": "testValue2"
- }
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
-}]
-```
-
-### JobFinished, JobCanceled, JobErrored
-
-For each final Job state change (such as JobFinished, JobCanceled, JobErrored), the example schema looks similar to the following:
-
-```json
-[{
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "transforms/VideoAnalyzerTransform/jobs/<job-id>",
- "eventType": "Microsoft.Media.JobFinished",
- "eventTime": "2018-10-12T16:25:56.4115495",
- "id": "9e07e83a-dd6e-466b-a62f-27521b216f2a",
- "data": {
- "outputs": [
- {
- "@odata.type": "#Microsoft.Media.JobOutputAsset",
- "assetName": "output-7640689F",
- "error": null,
- "label": "VideoAnalyzerPreset_0",
- "progress": 100,
- "state": "Finished"
- }
- ],
- "previousState": "Processing",
- "state": "Finished",
- "correlationData": {
- "testKey1": "testValue1",
- "testKey2": "testValue2"
- }
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
-}]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `outputs` | Array | Gets the Job outputs.|
-
-### JobOutputStateChange
-
-The following example shows the schema of the **JobOutputStateChange** event:
-
-```json
-[{
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "transforms/VideoAnalyzerTransform/jobs/<job-id>",
- "eventType": "Microsoft.Media.JobOutputStateChange",
- "eventTime": "2018-10-12T16:25:56.0242854",
- "id": "dde85f46-b459-4775-b5c7-befe8e32cf90",
- "data": {
- "previousState": "Processing",
- "output": {
- "@odata.type": "#Microsoft.Media.JobOutputAsset",
- "assetName": "output-7640689F",
- "error": null,
- "label": "VideoAnalyzerPreset_0",
- "progress": 100,
- "state": "Finished"
- },
- "jobCorrelationData": {
- "testKey1": "testValue1",
- "testKey2": "testValue2"
- }
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
-}]
-```
-
-### JobOutputScheduled, JobOutputProcessing, JobOutputFinished, JobOutputCanceling, JobOutputCanceled, JobOutputErrored
-
-For each JobOutput state change, the example schema looks similar to the following:
-
-```json
-[{
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "transforms/VideoAnalyzerTransform/jobs/<job-id>",
- "eventType": "Microsoft.Media.JobOutputProcessing",
- "eventTime": "2018-10-12T16:12:18.0061141",
- "id": "f1fd5338-1b6c-4e31-83c9-cd7c88d2aedb",
- "data": {
- "previousState": "Scheduled",
- "output": {
- "@odata.type": "#Microsoft.Media.JobOutputAsset",
- "assetName": "output-7640689F",
- "error": null,
- "label": "VideoAnalyzerPreset_0",
- "progress": 0,
- "state": "Processing"
- },
- "jobCorrelationData": {
- "testKey1": "testValue1",
- "testKey2": "testValue2"
- }
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
-}]
-```
-### JobOutputProgress
-
-The example schema looks similar to the following:
-
- ```json
-[{
- "topic": "/subscriptions/<subscription-id>/resourceGroups/belohGroup/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "transforms/VideoAnalyzerTransform/jobs/job-5AB6DE32",
- "eventType": "Microsoft.Media.JobOutputProgress",
- "eventTime": "2018-12-10T18:20:12.1514867",
- "id": "00000000-0000-0000-0000-000000000000",
- "data": {
- "jobCorrelationData": {
- "TestKey1": "TestValue1",
- "testKey2": "testValue2"
- },
- "label": "VideoAnalyzerPreset_0",
- "progress": 86
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
-}]
-```
-
-### LiveEventConnectionRejected
-
-The following example shows the schema of the **LiveEventConnectionRejected** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaServices/<account-name>",
- "subject": "/LiveEvents/MyLiveEvent1",
- "eventType": "Microsoft.Media.LiveEventConnectionRejected",
- "eventTime": "2018-01-16T01:57:26.005121Z",
- "id": "b303db59-d5c1-47eb-927a-3650875fded1",
- "data": {
- "streamId":"Mystream1",
- "ingestUrl": "http://abc.ingest.isml",
- "encoderIp": "118.238.251.xxx",
- "encoderPort": 52859,
- "resultCode": "MPE_INGEST_CODEC_NOT_SUPPORTED"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `streamId` | string | Identifier of the stream or connection. The encoder or customer is responsible for adding this ID in the ingest URL. |
-| `ingestUrl` | string | Ingest URL provided by the live event. |
-| `encoderIp` | string | IP address of the encoder. |
-| `encoderPort` | string | Port of the encoder from which this stream is coming. |
-| `resultCode` | string | The reason the connection was rejected. |
-
-You can find the error result codes in [Live Event error codes](../live-event-error-codes-reference.md).
-
-### LiveEventEncoderConnected
-
-The following example shows the schema of the **LiveEventEncoderConnected** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "liveEvent/mle1",
- "eventType": "Microsoft.Media.LiveEventEncoderConnected",
- "eventTime": "2018-08-07T23:08:09.1710643",
- "id": "<id>",
- "data": {
- "ingestUrl": "http://mle1-amsts03mediaacctgndos-ts031.channel.media.azure-test.net:80/ingest.isml",
- "streamId": "15864-stream0",
- "encoderIp": "131.107.147.xxx",
- "encoderPort": "27485"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `streamId` | string | Identifier of the stream or connection. The encoder or customer is responsible for providing this ID in the ingest URL. |
-| `ingestUrl` | string | Ingest URL provided by the live event. |
-| `encoderIp` | string | IP address of the encoder. |
-| `encoderPort` | string | Port of the encoder from which this stream is coming. |
-
-### LiveEventEncoderDisconnected
-
-The following example shows the schema of the **LiveEventEncoderDisconnected** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "liveEvent/mle1",
- "eventType": "Microsoft.Media.LiveEventEncoderDisconnected",
- "eventTime": "2018-08-07T23:08:09.1710872",
- "id": "<id>",
- "data": {
- "ingestUrl": "http://mle1-amsts03mediaacctgndos-ts031.channel.media.azure-test.net:80/ingest.isml",
- "streamId": "15864-stream0",
- "encoderIp": "131.107.147.xxx",
- "encoderPort": "27485",
- "resultCode": "S_OK"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `streamId` | string | Identifier of the stream or connection. The encoder or customer is responsible for adding this ID in the ingest URL. |
-| `ingestUrl` | string | Ingest URL provided by the live event. |
-| `encoderIp` | string | IP address of the encoder. |
-| `encoderPort` | string | Port of the encoder from which this stream is coming. |
-| `resultCode` | string | The reason for the encoder disconnecting. It could be a graceful disconnect or an error. The graceful disconnect result codes are listed in the following table. |
-
-You can find the error result codes in [Live Event error codes](../live-event-error-codes-reference.md).
-
-The graceful disconnect result codes are:
-
-| Result code | Description |
-| -- | -- |
-| S_OK | Encoder disconnected successfully. |
-| MPE_CLIENT_TERMINATED_SESSION | Encoder disconnected (RTMP). |
-| MPE_CLIENT_DISCONNECTED | Encoder disconnected (FMP4). |
-| MPI_REST_API_CHANNEL_RESET | Channel reset command is received. |
-| MPI_REST_API_CHANNEL_STOP | Channel stop command received. |
-| MPI_REST_API_CHANNEL_STOP | Channel undergoing maintenance. |
-| MPI_STREAM_HIT_EOF | EOF stream is sent by the encoder. |
-
-### LiveEventIncomingDataChunkDropped
-
-The following example shows the schema of the **LiveEventIncomingDataChunkDropped** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaServices/<account-name>",
- "subject": "/LiveEvents/MyLiveEvent1",
- "eventType": "Microsoft.Media.LiveEventIncomingDataChunkDropped",
- "eventTime": "2018-01-16T01:57:26.005121Z",
- "id": "03da9c10-fde7-48e1-80d8-49936f2c3e7d",
- "data": {
- "trackType": "Video",
- "trackName": "Video",
- "bitrate": 300000,
- "timestamp": "36656620000",
- "timescale": "10000000",
- "resultCode": "FragmentDrop_OverlapTimestamp"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `trackType` | string | Type of the track (Audio / Video). |
-| `trackName` | string | Name of the track. |
-| `bitrate` | integer | Bitrate of the track. |
-| `timestamp` | string | Timestamp of the data chunk dropped. |
-| `timescale` | string | Timescale of the timestamp. |
-| `resultCode` | string | Reason for the data chunk drop: **FragmentDrop_OverlapTimestamp** or **FragmentDrop_NonIncreasingTimestamp**. |
-
-### LiveEventIncomingStreamReceived
-
-The following example shows the schema of the **LiveEventIncomingStreamReceived** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "liveEvent/mle1",
- "eventType": "Microsoft.Media.LiveEventIncomingStreamReceived",
- "eventTime": "2018-08-07T23:08:10.5069288Z",
- "id": "7f939a08-320c-47e7-8250-43dcfc04ab4d",
- "data": {
- "ingestUrl": "http://mle1-amsts03mediaacctgndos-ts031.channel.media.azure-test.net:80/ingest.isml/Streams(15864-stream0)15864-stream0",
- "trackType": "video",
- "trackName": "video",
- "bitrate": 2962000,
- "encoderIp": "131.107.147.xxx",
- "encoderPort": "27485",
- "timestamp": "15336831655032322",
- "duration": "20000000",
- "timescale": "10000000"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `trackType` | string | Type of the track (Audio / Video). |
-| `trackName` | string | Name of the track (either provided by the encoder or, in the case of RTMP, generated by the server in *TrackType_Bitrate* format). |
-| `bitrate` | integer | Bitrate of the track. |
-| `ingestUrl` | string | Ingest URL provided by the live event. |
-| `encoderIp` | string | IP address of the encoder. |
-| `encoderPort` | string | Port of the encoder from which this stream is coming. |
-| `timestamp` | string | First timestamp of the data chunk received. |
-| `timescale` | string | Timescale in which timestamp is represented. |
-
-### LiveEventIncomingStreamsOutOfSync
-
-The following example shows the schema of the **LiveEventIncomingStreamsOutOfSync** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "liveEvent/mle1",
- "eventType": "Microsoft.Media.LiveEventIncomingStreamsOutOfSync",
- "eventTime": "2018-08-10T02:26:20.6269183Z",
- "id": "b9d38923-9210-4c2b-958f-0054467d4dd7",
- "data": {
- "minLastTimestamp": "319996",
- "typeOfStreamWithMinLastTimestamp": "Audio",
- "maxLastTimestamp": "366000",
- "typeOfStreamWithMaxLastTimestamp": "Video",
- "timescaleOfMinLastTimestamp": "10000000",
- "timescaleOfMaxLastTimestamp": "10000000"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `minLastTimestamp` | string | Minimum of the last timestamps among all the tracks (audio or video). |
-| `typeOfStreamWithMinLastTimestamp` | string | Type of the stream (audio or video) with the minimum last timestamp. |
-| `maxLastTimestamp` | string | Maximum of the last timestamps among all the tracks (audio or video). |
-| `typeOfStreamWithMaxLastTimestamp` | string | Type of the stream (audio or video) with the maximum last timestamp. |
-| `timescaleOfMinLastTimestamp`| string | The timescale in which `minLastTimestamp` is represented.|
-| `timescaleOfMaxLastTimestamp`| string | The timescale in which `maxLastTimestamp` is represented.|
-
-### LiveEventIncomingVideoStreamsOutOfSync
-
-The following example shows the schema of the **LiveEventIncomingVideoStreamsOutOfSync** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaServices/<account-name>",
- "subject": "/LiveEvents/LiveEvent1",
- "eventType": "Microsoft.Media.LiveEventIncomingVideoStreamsOutOfSync",
- "eventTime": "2018-01-16T01:57:26.005121Z",
- "id": "6dd4d862-d442-40a0-b9f3-fc14bcf6d750",
- "data": {
- "firstTimestamp": "2162058216",
- "firstDuration": "2000",
- "secondTimestamp": "2162057216",
- "secondDuration": "2000",
- "timescale": "10000000"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `firstTimestamp` | string | Timestamp received for one of the tracks/quality levels of type video. |
-| `firstDuration` | string | Duration of the data chunk with first timestamp. |
-| `secondTimestamp` | string | Timestamp received for some other track/quality level of the type video. |
-| `secondDuration` | string | Duration of the data chunk with second timestamp. |
-| `timescale` | string | Timescale of timestamps and duration.|
-
-### LiveEventIngestHeartbeat
-
-The following example shows the schema of the **LiveEventIngestHeartbeat** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "liveEvent/mle1",
- "eventType": "Microsoft.Media.LiveEventIngestHeartbeat",
- "eventTime": "2021-05-14T23:50:00.324",
- "id": "7f450938-491f-41e1-b06f-c6cd3965d786",
- "data": {
- "trackType":"video",
- "trackName":"video",
- "bitrate":2500000,
- "incomingBitrate":2462597,
- "lastTimestamp":"106999",
- "timescale":"1000",
- "overlapCount":0,
- "discontinuityCount":0,
- "nonincreasingCount":0,
- "unexpectedBitrate":false,
- "state":"Running",
- "healthy":true,
- "lastFragmentArrivalTime":"2021-05-14T23:50:00.324",
- "ingestDriftValue":"0",
- "transcriptionState":"",
- "transcriptionLanguage":""
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `trackType` | string | Type of the track (Audio / Video). |
-| `trackName` | string | Name of the track (either provided by the encoder or, in the case of RTMP, generated by the server in *TrackType_Bitrate* format). |
-| `bitrate` | integer | Bitrate of the track. |
-| `incomingBitrate` | integer | Calculated bitrate based on data chunks coming from the encoder. |
-| `lastTimestamp` | string | Latest timestamp received for a track in the last 20 seconds. |
-| `timescale` | string | Timescale in which timestamps are expressed. |
-| `overlapCount` | integer | Number of data chunks with overlapping timestamps in the last 20 seconds. |
-| `discontinuityCount` | integer | Number of discontinuities observed in the last 20 seconds. |
-| `nonIncreasingCount` | integer | Number of data chunks with timestamps in the past received in the last 20 seconds. |
-| `unexpectedBitrate` | bool | Whether the expected and actual bitrates differ by more than the allowed limit in the last 20 seconds. It's true if and only if incomingBitrate >= 2 * bitrate, incomingBitrate <= bitrate / 2, or incomingBitrate = 0. |
-| `state` | string | State of the live event. |
-| `healthy` | bool | Indicates whether ingest is healthy, based on the counts and flags. Healthy is true if overlapCount = 0 && discontinuityCount = 0 && nonIncreasingCount = 0 && unexpectedBitrate = false. |
-| `lastFragmentArrivalTime` | string | The last timestamp in UTC that a fragment arrived at the ingest endpoint. Example date format is "2020-11-11 12:12:12:888999". |
-| `ingestDriftValue` | string | Indicates the delay, in seconds per minute, of the incoming audio or video data during the last minute. The value is greater than zero if data is arriving at the live event slower than expected in the last minute; zero if data arrived with no delay; and "n/a" if no audio or video data was received. For example, if a contribution encoder sending in live content slows down due to processing issues or network latency, it may only be able to deliver 58 seconds of audio or video in a one-minute period. This would be reported as two seconds per minute of drift. If the encoder is able to catch up and send all 60 seconds or more of data every minute, this value is reported as 0. If there was a disconnection or discontinuity from the encoder, this value may still display as 0, as it does not account for breaks in the data, only data that is delayed in timestamps.|
-| `transcriptionState` | string | This value is "On" for audio track heartbeats if live transcription is turned on; otherwise it's an empty string. This state only applies to the "audio" track type when live transcription is on. All other tracks have an empty value.|
-| `transcriptionLanguage` | string | The language code (in BCP-47 format) of the transcription language. For example, "de-de" indicates German (Germany). The value is empty for video track heartbeats, or when live transcription is turned off. |
-### LiveEventChannelArchiveHeartbeatEvent
-
-The following example shows the schema of the **LiveEventChannelArchiveHeartbeatEvent** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "liveEvent/mle1",
- "eventType": "Microsoft.Media.LiveEventChannelArchiveHeartbeatEvent",
- "eventTime": "2021-05-14T23:50:00.324",
- "id": "7f450938-491f-41e1-b06f-c6cd3965d786",
- "data": {
- "channelLatencyMs": "10",
- "latencyResultCode": "S_OK"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `channelLatencyMs` | string | The time in milliseconds (ms) the ingested video spends in the live event pipeline before it is published to the HLS/DASH manifest for players to download.|
-| `latencyResultCode` | string | The result code for the channelLatencyMs calculation. `S_OK` indicates that the live event ingest was received without any problems. Other result codes indicate situations that would cause the channelLatencyMs to have an empty value. `MPE_KEY_FRAME_INTERVAL_TOO_LARGE` error code indicates that the ingested video source has a large GOP (key frame distance) that would negatively impact the channel latency. `MPE_INGEST_DISCONTINUITY` error code indicates that discontinuities were detected on the source stream, which can add long-latencies to the channel. |
-
-### LiveEventTrackDiscontinuityDetected
-
-The following example shows the schema of the **LiveEventTrackDiscontinuityDetected** event:
-
-```json
-[
- {
- "topic": "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>",
- "subject": "liveEvent/mle1",
- "eventType": "Microsoft.Media.LiveEventTrackDiscontinuityDetected",
- "eventTime": "2018-08-07T23:18:06.1270405Z",
- "id": "5f4c510d-5be7-4bef-baf0-64b828be9c9b",
- "data": {
- "trackName": "video",
- "previousTimestamp": "15336837615032322",
- "trackType": "video",
- "bitrate": 2962000,
- "newTimestamp": "15336837619774273",
- "discontinuityGap": "575284",
- "timescale": "10000000"
- },
- "dataVersion": "1.0",
- "metadataVersion": "1"
- }
-]
-```
-
-The data object has the following properties:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `trackType` | string | Type of the track (Audio / Video). |
-| `trackName` | string | Name of the track (either provided by the encoder or, in the case of RTMP, generated by the server in *TrackType_Bitrate* format). |
-| `bitrate` | integer | Bitrate of the track. |
-| `previousTimestamp` | string | Timestamp of the previous fragment. |
-| `newTimestamp` | string | Timestamp of the current fragment. |
-| `discontinuityGap` | string | Gap between the above two timestamps. |
-| `timescale` | string | Timescale in which both the timestamp and discontinuity gap are represented. |
-
-### Common event properties
-
-An event has the following top-level data:
-
-| Property | Type | Description |
-| -- | - | -- |
-| `topic` | string | The event grid topic. This property has the resource ID for the Media Services account. |
-| `subject` | string | The resource path for the Media Services channel under the Media Services account. Concatenating the topic and subject gives you the resource ID for the job. |
-| `eventType` | string | One of the registered event types for this event source. For example, "Microsoft.Media.JobStateChange". |
-| `eventTime` | string | The time the event is generated based on the provider's UTC time. |
-| `id` | string | Unique identifier for the event. |
-| `data` | object | Media Services event data. |
-| `dataVersion` | string | The schema version of the data object. The publisher defines the schema version. |
-| `metadataVersion` | string | The schema version of the event metadata. Event Grid defines the schema of the top-level properties. Event Grid provides this value. |
-
-## Next steps
-
-[Register for job state change events](../job-state-events-cli-how-to.md)
-
-## See also
-- [Event Grid .NET SDK that includes Media Services events](https://www.nuget.org/packages/Microsoft.Azure.eventgrid/)
-- [Definitions of Media Services events](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/eventgrid/data-plane/Microsoft.Media/stable/2018-01-01/MediaServices.json)
-- [Live Event error codes](../live-event-error-codes-reference.md)
media-services Media Services Metrics Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/monitoring/media-services-metrics-howto.md
- Title: View metrics with Azure Monitor
-description: This article shows how to monitor metrics with the Azure portal charts and Azure CLI.
- Previously updated: 03/17/2021
-# Monitor Media Services metrics
-[Azure Monitor](../../../azure-monitor/overview.md) enables you to monitor metrics and diagnostic logs that help you understand how your applications are performing. For a detailed description of this feature and to understand why you should use Azure Media Services metrics and diagnostics logs, see [Monitor Media Services metrics and diagnostic logs](monitor-media-services-data-reference.md).
-
-Azure Monitor provides several ways to interact with metrics, including charting them in the portal, accessing them through the REST API, or querying them using Azure CLI. This article shows how to monitor metrics with the Azure portal charts and Azure CLI.
-
-## Prerequisites
-- [Create a Media Services account](../account-create-how-to.md)
-- Review [Monitor Media Services metrics and diagnostic logs](monitor-media-services-data-reference.md)
-## View metrics in Azure portal
-
-1. Sign in to the Azure portal at https://portal.azure.com.
-1. Navigate to your Azure Media Services account and select **Metrics**.
-1. Click the **Scope** box and select the resource you want to monitor.
-
- The **Select a scope** window appears on the right with the list of resources available to you. In this case you see:
-
- * &lt;Media Services account name&gt;
- * &lt;Media Services account name&gt;/&lt;streaming endpoint name&gt;
- * &lt;storage account name&gt;
-
- Filter then select the resource and press **Apply**. For details about supported resources and metrics, see [Monitor Media Services metrics](monitor-media-services-data-reference.md).
-
- > [!NOTE]
- > To switch between resources you want to monitor, click on the **Scope** box again and repeat this step.
-
-1. Optional: give your chart a name (edit the name by pressing the pencil at the top).
-1. Add the metrics that you want to view.
-1. You can pin your chart to your dashboard.
-
-## View metrics with Azure CLI
-
-To get "Egress" metrics with Azure CLI, you would run the following `az monitor metrics` command:
-
-```azurecli-interactive
-az monitor metrics list --resource \
- "/subscriptions/<subscription id>/resourcegroups/<resource group name>/providers/Microsoft.Media/mediaservices/<Media Services account name>/streamingendpoints/<streaming endpoint name>" \
- --metric "Egress"
-```
-
-To get other metrics, replace "Egress" with the name of the metric you're interested in.
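-
-You can also control the time grain and aggregation of the returned values. A sketch, assuming the same streaming endpoint resource as above:
-
-```azurecli-interactive
-# Egress, aggregated as a total per one-minute interval.
-az monitor metrics list --resource \
- "/subscriptions/<subscription id>/resourcegroups/<resource group name>/providers/Microsoft.Media/mediaservices/<Media Services account name>/streamingendpoints/<streaming endpoint name>" \
- --metric "Egress" --interval PT1M --aggregation Total
-```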
-
-## See also
-- [Azure Monitor Metrics](../../../azure-monitor/data-platform.md)
-- [Create, view, and manage metric alerts using Azure Monitor](../../../azure-monitor/alerts/alerts-metric.md)
-## Next steps
-
-[Diagnostic logs](../media-services-diagnostic-logs-howto.md)
media-services Monitor Events Portal How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/monitoring/monitor-events-portal-how-to.md
- Title: Monitor Media Services events with Event Grid portal
-description: This article shows how to subscribe to Event Grid in order to monitor Azure Media Services events.
-tags: ''
-keywords: azure media services, stream, broadcast, live, offline
- Previously updated: 03/17/2021
-# Create and monitor Media Services events with Event Grid using the Azure portal
-Azure Event Grid is an eventing service for the cloud. This service uses [event subscriptions](../../../event-grid/concepts.md#event-subscriptions) to route event messages to subscribers. Media Services events contain all the information you need to respond to changes in your data. You can identify a Media Services event because the eventType property starts with "Microsoft.Media.". For more information, see [Media Services event schemas](media-services-event-schemas.md).
-
-In this article, you use the Azure portal to subscribe to events for your Azure Media Services account. Then, you trigger events to view the result. Typically, you send events to an endpoint that processes the event data and takes actions. In this article, we send events to a web app that collects and displays the messages.
-
-When you're finished, you see that the event data has been sent to the web app.
-
-## Prerequisites
-
-* Have an active Azure subscription.
-* Create a new Azure Media Services account, as described in [this quickstart](../account-create-how-to.md).
-
-## Create a message endpoint
-
-Before subscribing to the events for the Media Services account, let's create the endpoint for the event message. Typically, the endpoint takes actions based on the event data. In this article, you deploy a [pre-built web app](https://github.com/Azure-Samples/azure-event-grid-viewer) that displays the event messages. The deployed solution includes an App Service plan, an App Service web app, and source code from GitHub.
-
-1. Select **Deploy to Azure** to deploy the solution to your subscription. In the Azure portal, provide values for the parameters.
-
- [![Image showing a button labeled "Deploy to Azure".](https://azuredeploy.net/deploybutton.png)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure-Samples%2Fazure-event-grid-viewer%2Fmaster%2Fazuredeploy.json)
-
-1. The deployment may take a few minutes to complete. After the deployment has succeeded, view your web app to make sure it's running. In a web browser, navigate to:
-`https://<your-site-name>.azurewebsites.net`
-
-If you switch to the "Azure Event Grid Viewer" site, you see it has no events yet.
-
-
-## Subscribe to Media Services events
-
-You subscribe to a topic to tell Event Grid which events you want to track, and where to send the events.
-
-1. In the portal, select your Media Services account and select **Events**.
-1. To send events to your viewer app, use a web hook for the endpoint.
-
- ![Select web hook](../media/monitor-events-portal/select-web-hook.png)
-
-1. The event subscription is prefilled with values for your Media Services account.
-1. Select 'Web Hook' for the **Endpoint Type**.
-1. In this article, we leave **Subscribe to all event types** checked. However, you can uncheck it and filter for specific event types.
-1. Click on the **Select an endpoint** link.
-
- For the web hook endpoint, provide the URL of your web app and add `api/updates` to the home page URL.
-
-1. Press **Confirm Selection**.
-1. Press **Create**.
-1. Give your subscription a name.
-
- ![Select logs](../media/monitor-events-portal/create-subscription.png)
-
-1. View your web app again, and notice that a subscription validation event has been sent to it.
-
- Event Grid sends the validation event so the endpoint can verify that it wants to receive event data. The endpoint has to set `validationResponse` to `validationCode`. For more information, see [Event Grid security and authentication](../../../event-grid/security-authentication.md). You can view the web app code to see how it validates the subscription.
-
-Now, let's trigger events to see how Event Grid distributes the message to your endpoint.
-
-## Send an event to your endpoint
-
-You can trigger events for the Media Services account by running an encoding job. You can follow [this quickstart](../stream-files-dotnet-quickstart.md) to encode a file and start sending events. If you subscribed to all events, you will see a screen similar to the following:
-
-> [!TIP]
-> Select the eye icon to expand the event data. Do not refresh the page if you want to view all the events.
-
-![View subscription event](../media/monitor-events-portal/view-subscription-event.png)
-
-## Next steps
-
-[Upload, encode, and stream](../stream-files-tutorial-with-api.md)
media-services Monitor Media Services Data Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/monitoring/monitor-media-services-data-reference.md
- Title: Monitoring Media Services data reference
-description: Important reference material needed when you monitor Media Services
- Previously updated: 04/21/2021
-# Monitoring Media Services data reference
-
-This article covers the data that is useful for monitoring Media Services. For more information about all platform metrics supported in Azure Monitor, review [Supported metrics with Azure Monitor](../../../azure-monitor/essentials/metrics-supported.md).
-
-## Metrics
-
-Metrics are collected at regular intervals whether or not the value changes. They're useful for alerting because they can be sampled frequently, and an alert can be fired quickly with relatively simple logic.
-Media Services supports monitoring metrics for the following resources:
-
-|Metric Type | Resource Provider / Type Namespace<br/> and link to individual metrics |
-|-|--|
-| Media Services general | [General](../../../azure-monitor/essentials/metrics-supported.md#microsoftmediamediaservices) |
-| Live Events | [Microsoft.Media/mediaservices/liveEvents](../../../azure-monitor/essentials/metrics-supported.md#microsoftmediamediaservicesliveevents) |
-| Streaming Endpoints | [Microsoft.Media/mediaservices/streamingEndpoints](../../../azure-monitor/essentials/metrics-supported.md#microsoftmediamediaservicesstreamingendpoints), which are relevant to the [Streaming Endpoints REST API](/rest/api/media/streamingendpoints). |
-You should also review [account quotas and limits](../limits-quotas-constraints-reference.md).
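-
-To see exactly which metrics a given resource exposes, along with their units and supported aggregations, you can query the metric definitions directly. A sketch with placeholder IDs (a streaming endpoint named "default" is assumed):
-
-```azurecli-interactive
-# Enumerate the metric definitions available on a streaming endpoint.
-az monitor metrics list-definitions --resource \
- "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>/streamingendpoints/default"
-```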
-## Metric Dimensions
-
-For more information on what metric dimensions are, see [Multi-dimensional metrics](../../../azure-monitor/essentials/data-platform-metrics.md#multi-dimensional-metrics).
-
-Media Services has the following metric dimensions. They're self-explanatory based on the metrics they support. See the [metrics links](#metrics) above for more information.
-
-- OutputFormat
-- HttpStatusCode
-- ErrorCode
-- TrackName
-## Resource logs
-
-Resource logs provide rich and frequent data about the operation of an Azure resource. For more information, see [How to collect and consume log data from your Azure resources](../../../azure-monitor/essentials/platform-logs-overview.md).
-
-Media Services supports the following resource logs:
-[Microsoft.Media/mediaservices](../../../azure-monitor/essentials/resource-logs-categories.md#microsoftmediamediaservices)
-
-## Schemas
-
-For detailed description of the top-level diagnostic logs schema, see [Supported services, schemas, and categories for Azure Diagnostic Logs](../../../azure-monitor/essentials/resource-logs-schema.md).
-
-### Key delivery
-
-These properties are specific to the key delivery log schema.
-
-|Name|Description|
-|||
-|keyId|The ID of the requested key.|
-|keyType|Could be one of the following values: "Clear" (no encryption), "FairPlay", "PlayReady", or "Widevine".|
-|policyName|The Azure Resource Manager name of the policy.|
-|tokenType|The token type.|
-|statusMessage|The status message.|
-
-### Example
-
-Properties of the key delivery requests schema.
-
-```json
-{
- "time": "2019-01-11T17:59:10.4908614Z",
- "resourceId": "/SUBSCRIPTIONS/00000000-0000-0000-0000-0000000000/RESOURCEGROUPS/SBKEY/PROVIDERS/MICROSOFT.MEDIA/MEDIASERVICES/SBDNSTEST",
- "operationName": "MICROSOFT.MEDIA/MEDIASERVICES/CONTENTKEYS/READ",
- "operationVersion": "1.0",
- "category": "KeyDeliveryRequests",
- "resultType": "Succeeded",
- "resultSignature": "OK",
- "durationMs": 315,
- "identity": {
- "authorization": {
- "issuer": "http://testacs",
- "audience": "urn:test"
- },
- "claims": {
- "urn:microsoft:azure:media
- "iss": "http://testacs",
- "aud": "urn:test",
- "exp": "1547233138"
- }
- },
- "level": "Informational",
- "location": "uswestcentral",
- "properties": {
- "requestId": "b0243468-d8e5-4edf-a48b-d408e1661050",
- "keyType": "Clear",
- "keyId": "3321e646-78d0-4896-84ec-c7b98eddfca5",
- "policyName": "56a70229-82d0-4174-82bc-e9d3b14e5dbf",
- "tokenType": "JWT",
- "statusMessage": "OK"
- }
-}
-```
-
-```json
- {
- "time": "2019-01-11T17:59:33.4676382Z",
- "resourceId": "/SUBSCRIPTIONS/00000000-0000-0000-0000-0000000000/RESOURCEGROUPS/SBKEY/PROVIDERS/MICROSOFT.MEDIA/MEDIASERVICES/SBDNSTEST",
- "operationName": "MICROSOFT.MEDIA/MEDIASERVICES/CONTENTKEYS/READ",
- "operationVersion": "1.0",
- "category": "KeyDeliveryRequests",
- "resultType": "Failed",
- "resultSignature": "Unauthorized",
- "durationMs": 2,
- "level": "Error",
- "location": "uswestcentral",
- "properties": {
- "requestId": "875af030-b77c-416b-b7e1-58f23ebec182",
- "keyType": "Clear",
- "keyId": "3321e646-78d0-4896-84ec-c7b98eddfca5",
- "policyName": "56a70229-82d0-4174-82bc-e9d3b14e5dbf",
- "tokenType": "None",
- "statusMessage": "No token present in authorization header or URL."
- }
-}
-```
-
->[!NOTE]
-> Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-
media-services Monitor Media Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/monitoring/monitor-media-services.md
- Title: Monitoring Media Services
-description: Start here to learn how to monitor Media Services
- Previously updated: 03/17/2021
-# Monitor Media Services
-
-When you have critical applications and business processes relying on Azure resources, you want to monitor those resources for their availability, performance, and operation. This article describes the monitoring data generated by Media Services and how you can use the features of Azure Monitor to analyze and alert on this data.
-
-## Metrics are useful
-
-Here are examples of how monitoring Media Services metrics can help you understand how your apps are performing. Some questions that can be addressed with Media Services metrics are:
-- How do I monitor my Standard Streaming Endpoint to know when I have exceeded the limits?
-- How do I know if I have enough Premium Streaming Endpoint scale units?
-- How can I set an alert to know when to scale up my Streaming Endpoints?
-- How do I set an alert to know when the max egress configured on the account was reached?
-- How can I see the breakdown of requests failing and what is causing the failure?
-- How can I see how many HLS or DASH requests are being pulled from the packager?
-- How do I set an alert to know when I have hit the threshold value of failed requests?
-
-## What is Azure Monitor?
-
-Media Services creates monitoring data using [Azure Monitor](../../../azure-monitor/overview.md), which is a full stack monitoring service in Azure that provides a complete set of features to monitor your Azure resources in addition to resources in other clouds and on-premises.
-
-Start with reading the article [Monitoring Azure resources with Azure Monitor](../../../azure-monitor/essentials/monitor-azure-resource.md), which describes the following concepts:
-- What is Azure Monitor?
-- Costs associated with monitoring
-- Monitoring data collected in Azure
-- Configuring data collection
-- Standard tools in Azure for analyzing and alerting on monitoring data
-## Monitoring data
-
-Media Services collects the same kinds of monitoring data as other Azure resources that are described in [Monitoring data from Azure resources](../../../azure-monitor/essentials/monitor-azure-resource.md#monitoring-data).
-
-All data collected by Azure Monitor fits into one of two fundamental types: metrics and logs. With these two types you can:
-- Visualize and analyze the metrics data using Metrics explorer.
-- Monitor Media Services diagnostic logs and create alerts and notifications for them.
-- With logs you can:
- - Send them to Azure Storage
- - Stream them to Azure Event Hubs
- - Export them to Log Analytics
- - Use third-party services
-
-See the article [Monitoring Media Services data reference](monitor-media-services-data-reference.md) for detailed information on the metrics and logs metrics created by Media Services.
-
-## Collection and routing
-
-*Platform metrics* and the *Activity log* are collected and stored automatically, but can be routed to other locations by using a diagnostic setting.
-
-*Resource Logs* are **not** collected and stored until you create a diagnostic setting and route them to one or more locations.
-
-See the article [Create diagnostic setting to collect platform logs and metrics in Azure](../../../azure-monitor/essentials/diagnostic-settings.md) for the detailed process of creating a diagnostic setting using the Azure portal, CLI, or PowerShell.
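-
-As a CLI sketch of this process, the following routes the key delivery log category to a Log Analytics workspace instead of a storage account; both resource IDs are placeholders:
-
-```azurecli-interactive
-# Route resource logs to a Log Analytics workspace.
-az monitor diagnostic-settings create --name send-to-workspace \
- --resource "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>" \
- --workspace "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>" \
- --logs '[{"category": "KeyDeliveryRequests", "enabled": true}]'
-```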
-
-When you create a diagnostic setting, you specify which categories of logs to collect. The categories for Media Services are listed in [Media Services monitoring data reference](monitor-media-services-data-reference.md).
-
-## Analyzing metrics
-
-You can analyze metrics for Media Services with metrics from other Azure services using metrics explorer by opening **Metrics** from the **Azure Monitor** menu. See [Getting started with Azure Metrics Explorer](../../../azure-monitor/essentials/metrics-getting-started.md) for details on using this tool.
-
-For a list of the metrics collected for Media Services, see [Monitoring Media Services Data Reference](monitor-media-services-data-reference.md).
-
-## Analyzing logs
-
-Data in Azure Monitor Logs is stored in tables where each table has its own set of unique properties.
-
-All resource logs in Azure Monitor have the same fields followed by service-specific fields. The common schema is outlined in [Azure Monitor resource log schema](../../../azure-monitor/essentials/resource-logs-schema.md#top-level-common-schema).
-
-The schema for Media Services resource logs is found in [Monitoring Media Services Data Reference](monitor-media-services-data-reference.md).
-
-The [Activity log](../../../azure-monitor/essentials/activity-log.md) is a platform log in Azure that provides insight into subscription-level events. You can view it independently or route it to Azure Monitor Logs, where you can do much more complex queries using Log Analytics.
-
-For a list of the types of resource logs collected for Media Services, see [Monitoring Media Services data reference](monitor-media-services-data-reference.md).
-
-### Why would I want to use diagnostic logs?
-
-Some things that you can examine with diagnostic logs are:
-- The number of licenses delivered by DRM type.
-- The number of licenses delivered by policy.
-- Errors by DRM or policy type.
-- The number of unauthorized license requests from clients.
-## Alerts
-
-Azure Monitor alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address issues in your system before your customers notice them. You can set alerts on [metrics](../../../azure-monitor/alerts/alerts-metric-overview.md), [logs](../../../azure-monitor/alerts/alerts-unified-log.md), and the [activity log](../../../azure-monitor/alerts/activity-log-alerts.md).
-
-Media Services metrics are collected at regular intervals whether or not the value changes. They're useful for alerting because they can be sampled frequently. An alert can be fired quickly with relatively simple logic.
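-
-For example, a metric alert on streaming endpoint egress might look like the following sketch; the scope, names, and threshold (roughly 100 Mbps expressed in bytes per second) are placeholders:
-
-```azurecli-interactive
-# Fire when average Egress exceeds the threshold over a 5-minute window.
-az monitor metrics alert create --name high-egress \
- --resource-group <rg-name> \
- --scopes "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Media/mediaservices/<account-name>/streamingendpoints/<streaming endpoint name>" \
- --condition "avg Egress > 12500000" \
- --window-size 5m --evaluation-frequency 1m
-```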
-
-## Next steps
-
media-services Reacting To Media Services Events https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/monitoring/reacting-to-media-services-events.md
- Title: Reacting to Azure Media Services events
-description: This article describes how to use Azure Event Grid to subscribe to Media Services events.
- Previously updated: 07/08/2021
-
-# Handling Event Grid events
-Media Services events allow applications to react to different events (for example, the job state change event) using modern serverless architectures. They do so without the need for complicated code or expensive and inefficient polling services. Instead, events are pushed through [Azure Event Grid](https://azure.microsoft.com/services/event-grid/) to event handlers such as [Azure Functions](https://azure.microsoft.com/services/functions/), [Azure Logic Apps](https://azure.microsoft.com/services/logic-apps/), or even to your own Webhook, and you only pay for what you use. For information about pricing, see [Event Grid pricing](https://azure.microsoft.com/pricing/details/event-grid/).
-
-Availability for Media Services events is tied to Event Grid [availability](../../../event-grid/overview.md) and will become available in other regions as Event Grid does.
-
-## Media Services events and schemas
-
-Event Grid uses [event subscriptions](../../../event-grid/concepts.md#event-subscriptions) to route event messages to subscribers. Media Services events contain all the information you need to respond to changes in your data. You can identify a Media Services event because the eventType property starts with "Microsoft.Media.".
-
-For more information, see [Media Services event schemas](../media-services-event-schemas.md).
-
-## Samples and How-to
-
-The Media Services [samples repository for .NET](https://github.com/Azure-Samples/media-services-v3-dotnet) demonstrates how to use the latest Event Grid and Event Hubs client libraries to receive events in your own custom applications.
-
-In addition, the following how-to articles demonstrate the use of Event Grid through the CLI and Azure portal.
-
-* [Monitor events - portal](../monitor-events-portal-how-to.md)
-* [Monitor events - CLI](../job-state-events-cli-how-to.md)
-
-## Practices for consuming events
-
-Applications that handle Media Services events should follow a few recommended practices:
-
-* As multiple subscriptions can be configured to route events to the same event handler, it is important not to assume events are from a particular source. Instead, check the topic of the message to ensure that it comes from the Media Services account you are expecting.
-* Similarly, check that the eventType is one you are prepared to process, and do not assume that all events you receive will be the types you expect.
-* Ignore fields you don't understand. This practice will help keep you resilient to new features that might be added in the future.
-* Use the "subject" prefix and suffix matches to limit events to a particular event.
-
-> [!NOTE]
-> Events are subject to the Event Grid [Service Level Agreement (SLA)](https://azure.microsoft.com/support/legal/sla/event-grid/v1_0/). If you want to get event notifications using APIs, see examples on how to consume events, with [.NET SDK](https://github.com/Azure-Samples/media-services-v3-dotnet) or [Java SDK](https://github.com/Azure-Samples/media-services-v3-java).
-
-## Next steps
-
-* [Monitor events - portal](../monitor-events-portal-how-to.md)
-* [Monitor events - CLI](../job-state-events-cli-how-to.md)
media-services Output Metadata Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/output-metadata-schema.md
- Title: Azure Media Services output metadata schema
-description: This article gives an overview of Azure Media Services v3 output metadata schema.
-Previously updated: 08/31/2020
-# Output metadata
-
-An encoding job is associated with an input asset (or assets) on which you want to perform some encoding tasks. For example, encode an MP4 file to H.264 MP4 adaptive bitrate sets; create a thumbnail; create overlays. Upon completion of a task, an output asset is produced. The output asset contains video, audio, thumbnails, and other files. The output asset also contains a file with metadata about the output asset. The name of the metadata JSON file has the following format: `<source_file_name>_manifest.json` (for example, `BigBuckBunny_manifest.json`). You should scan the output asset for any `*_manifest.json` file and query the file path string within it to find the source file name (without truncation).
-
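-For example, here is a hedged sketch that locates the manifest among an output asset's files (it assumes the `@azure/storage-blob` package and a SAS URL for the output asset's storage container):
-
-```javascript
-const { ContainerClient } = require("@azure/storage-blob");
-
-// Returns the name of the first *_manifest.json blob in the output asset's container.
-async function findManifestName(containerSasUrl) {
-  const container = new ContainerClient(containerSasUrl);
-  for await (const blob of container.listBlobsFlat()) {
-    if (blob.name.endsWith("_manifest.json")) return blob.name;
-  }
-  return undefined;
-}
-```
-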
-Media Services does not preemptively scan input assets to generate metadata. Input metadata is generated only as an artifact when an input asset is processed in a job, and that artifact is written to the output asset. Different tools are used to generate metadata for input assets and output assets, so the input metadata has a slightly different schema than the output metadata.
-
-This article discusses the elements and types of the JSON schema on which the output metadata (&lt;source_file_name&gt;_manifest.json) is based. <!--For information about the file that contains metadata about the input asset, see [Input metadata](input-metadata-schema.md). -->
-
-You can find the complete schema code and JSON example at the end of this article.
-
-## AssetFile
-
-Collection of AssetFile entries for the encoding job.
-
-| Name | Description |
-| | |
-| **Sources** |Collection of input/source media files that were processed in order to produce this AssetFile.<br />Example: `"Sources": [{"Name": "Ignite-short_1280x720_AACAudio_3551.mp4"}]`|
-| **VideoTracks**|Each physical AssetFile can contain zero or more video tracks interleaved into an appropriate container format. <br />See [VideoTracks](#videotracks). |
-| **AudioTracks**|Each physical AssetFile can contain zero or more audio tracks interleaved into an appropriate container format. This is the collection of all those audio tracks.<br /> For more information, see [AudioTracks](#audiotracks). |
-| **Name**<br />Required |The media asset file name. <br /><br />Example: `"Name": "Ignite-short_1280x720_AACAudio_3551.mp4"`|
-| **Size**<br />Required |Size of the asset file in bytes. <br /><br />Example: `"Size": 32414631`|
-| **Duration**<br />Required |Content playback duration, in [ISO 8601](https://www.iso.org/iso-8601-date-and-time-format.html) format. <br /><br />Example: `"Duration": "PT1M10.315S"`|
-
-## VideoTracks
-
-Each physical AssetFile can contain zero or more video tracks interleaved into an appropriate container format. The **VideoTracks** element represents the collection of all those video tracks.
-
-| Name | Description |
-| | |
-| **Id**<br /> Required |Zero-based index of this video track. **Note:** This **Id** is not necessarily the TrackID as used in an MP4 file. <br /><br />Example: `"Id": 1`|
-| **FourCC**<br />Required | Video codec FourCC code that is reported by ffmpeg. <br /><br />Example: `"FourCC": "avc1" | "hev1" | "hvc1"`|
-| **Profile** |H264 profile (only applicable to H264 codec) <br /><br />Example: `"Profile": "High"` |
-| **Level** |H264 level (only applicable to H264 codec). <br /><br />Example: `"Level": "3.2"`|
-| **Width**<br />Required |Encoded video width in pixels. <br /><br />Example: `"Width": "1280"`|
-| **Height**<br />Required |Encoded video height in pixels. <br /><br />Example: `"Height": "720"`|
-| **DisplayAspectRatioNumerator**<br />Required|Video display aspect ratio numerator. <br /><br />Example: `"DisplayAspectRatioNumerator": 16.0`|
-| **DisplayAspectRatioDenominator**<br />Required |Video display aspect ratio denominator. <br /><br />Example: `"DisplayAspectRatioDenominator": 9.0`|
-| **Framerate**<br />Required |Measured video frame rate in .3f format. <br /><br />Example: `"Framerate": 29.970`|
-| **Bitrate**<br />Required |Average video bit rate in bits per second, as calculated from the AssetFile. Counts only the elementary stream payload, and does not include the packaging overhead. <br /><br />Example: `"Bitrate": 3551567`|
-| **TargetBitrate**<br />Required |Target average bitrate for this video track, as requested via the encoding preset, in bits per second. <br /><br />Example: `"TargetBitrate": 3520000` |
-
-## AudioTracks
-
-Each physical AssetFile can contain zero or more audio tracks interleaved into an appropriate container format. The **AudioTracks** element represents the collection of all those audio tracks.
-
-| Name | Description |
-| | |
-| **Id**<br />Required |Zero-based index of this audio track. **Note:** This is not necessarily the TrackID as used in an MP4 file. <br /><br />Example: `"Id": 2`|
-| **Codec** |Audio track codec string. <br /><br />Example: `"Codec": "aac"`|
-| **Language**|Example: `"Language": "eng"`|
-| **Channels**<br />Required|Number of audio channels. <br /><br />Example: `"Channels": 2`|
-| **SamplingRate**<br />Required |Audio sampling rate in samples/sec or Hz. <br /><br />Example: `"SamplingRate": 48000`|
-| **Bitrate**<br />Required |Average audio bit rate in bits per second, as calculated from the AssetFile. Counts only the elementary stream payload, and does not include the packaging overhead. <br /><br />Example: `"Bitrate": 128041`|
-
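-As a quick illustration of the schema above, the following sketch reads a downloaded manifest and prints a per-track summary (the file name is a hypothetical local copy):
-
-```javascript
-const fs = require("fs");
-
-// Hypothetical local copy of the output manifest.
-const manifest = JSON.parse(fs.readFileSync("Ignite-short_manifest.json", "utf8"));
-
-for (const file of manifest.AssetFile) {
-  console.log(`${file.Name}: ${file.Size} bytes, duration ${file.Duration}`);
-  for (const v of file.VideoTracks || []) {
-    console.log(`  video ${v.Width}x${v.Height} @ ${v.Framerate} fps, ${v.Bitrate} bps (${v.FourCC})`);
-  }
-  for (const a of file.AudioTracks || []) {
-    console.log(`  audio ${a.Codec}, ${a.Channels} ch @ ${a.SamplingRate} Hz, ${a.Bitrate} bps`);
-  }
-}
-```
-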
-## JSON schema example
-
-```json
-{
- "AssetFile": [
- {
- "Sources": [
- {
- "Name": "Ignite-short_1280x720_AACAudio_3551.mp4"
- }
- ],
- "VideoTracks": [
- {
- "Id": 1,
- "FourCC": "avc1",
- "Profile": "High",
- "Level": "3.2",
- "Width": "1280",
- "Height": "720",
- "DisplayAspectRatioNumerator": 16.0,
- "DisplayAspectRatioDenominator": 9.0,
- "Framerate": 29.970,
- "Bitrate": 3551567,
- "TargetBitrate": 3520000
- }
- ],
- "AudioTracks": [
- {
- "Id": 2,
- "Codec": "aac",
- "Language": "eng",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128041
- }
- ],
- "Name": "Ignite-short_1280x720_AACAudio_3551.mp4",
- "Size": 32414631,
- "Duration": "PT1M10.315S"
- },
- {
- "Sources": [
- {
- "Name": "Ignite-short_960x540_AACAudio_2216.mp4"
- }
- ],
- "VideoTracks": [
- {
- "Id": 1,
- "FourCC": "avc1",
- "Profile": "High",
- "Level": "3.1",
- "Width": "960",
- "Height": "540",
- "DisplayAspectRatioNumerator": 16.0,
- "DisplayAspectRatioDenominator": 9.0,
- "Framerate": 29.970,
- "Bitrate": 2216326,
- "TargetBitrate": 2210000
- }
- ],
- "AudioTracks": [
- {
- "Id": 2,
- "Codec": "aac",
- "Language": "eng",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128041
- }
- ],
- "Name": "Ignite-short_960x540_AACAudio_2216.mp4",
- "Size": 20680897,
- "Duration": "PT1M10.315S"
- },
- {
- "Sources": [
- {
- "Name": "Ignite-short_640x360_AACAudio_1150.mp4"
- }
- ],
- "VideoTracks": [
- {
- "Id": 1,
- "FourCC": "avc1",
- "Profile": "High",
- "Level": "3.0",
- "Width": "640",
- "Height": "360",
- "DisplayAspectRatioNumerator": 16.0,
- "DisplayAspectRatioDenominator": 9.0,
- "Framerate": 29.970,
- "Bitrate": 1150440,
- "TargetBitrate": 1150000
- }
- ],
- "AudioTracks": [
- {
- "Id": 2,
- "Codec": "aac",
- "Language": "eng",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128041
- }
- ],
- "Name": "Ignite-short_640x360_AACAudio_1150.mp4",
- "Size": 11313920,
- "Duration": "PT1M10.315S"
- },
- {
- "Sources": [
- {
- "Name": "Ignite-short_480x270_AACAudio_722.mp4"
- }
- ],
- "VideoTracks": [
- {
- "Id": 1,
- "FourCC": "avc1",
- "Profile": "High",
- "Level": "2.1",
- "Width": "480",
- "Height": "270",
- "DisplayAspectRatioNumerator": 16.0,
- "DisplayAspectRatioDenominator": 9.0,
- "Framerate": 29.970,
- "Bitrate": 722682,
- "TargetBitrate": 720000
- }
- ],
- "AudioTracks": [
- {
- "Id": 2,
- "Codec": "aac",
- "Language": "eng",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128041
- }
- ],
- "Name": "Ignite-short_480x270_AACAudio_722.mp4",
- "Size": 7554708,
- "Duration": "PT1M10.315S"
- },
- {
- "Sources": [
- {
- "Name": "Ignite-short_320x180_AACAudio_380.mp4"
- }
- ],
- "VideoTracks": [
- {
- "Id": 1,
- "FourCC": "avc1",
- "Profile": "High",
- "Level": "1.3",
- "Width": "320",
- "Height": "180",
- "DisplayAspectRatioNumerator": 16.0,
- "DisplayAspectRatioDenominator": 9.0,
- "Framerate": 29.970,
- "Bitrate": 380655,
- "TargetBitrate": 380000
- }
- ],
- "AudioTracks": [
- {
- "Id": 2,
- "Codec": "aac",
- "Language": "eng",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128041
- }
- ],
- "Name": "Ignite-short_320x180_AACAudio_380.mp4",
- "Size": 4548932,
- "Duration": "PT1M10.315S"
- }
- ]
-}
-```
-
-## Next steps
-
-[Create a job input from an HTTPS URL](job-input-from-http-how-to.md)
media-services Player How To Video Js Player https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/player-how-to-video-js-player.md
- Title: Use the Video.js player with Azure Media Services
-description: This article explains how to use the HTML video object and JavaScript with Azure Media Services
-Previously updated: 08/31/2020
-# How to use the Video.js player with Azure Media Services
-
-## Overview
-
-Video.js is a web video player built for an HTML5 world. It plays adaptive media formats (such as DASH and HLS) in a browser, without using plugins or Flash. Instead, Video.js uses the open web standards MediaSource Extensions and Encrypted Media Extensions. Moreover, it supports video playback on desktops and mobile devices.
-
-Its official documentation can be found at [https://docs.videojs.com/](https://docs.videojs.com/).
-
-## Sample code
-Sample code for this article is available at [Azure-Samples/media-services-3rdparty-player-samples](https://github.com/Azure-Samples/media-services-3rdparty-player-samples).
-
-## Implement the player
-
-1. Create an `index.html` file where you'll host the player. Add the following lines of code (you can replace the versions with newer ones if applicable):
-
- ```html
- <html>
- <head>
- <link href="https://vjs.zencdn.net/7.8.2/video-js.css" rel="stylesheet" />
- </head>
- <body>
- <video id="video" class="video-js vjs-default-skin vjs-16-9" controls data-setup="{}">
- </video>
-
- <script src="https://vjs.zencdn.net/7.8.2/video.js"></script>
- <script src="https://cdn.jsdelivr.net/npm/videojs-contrib-eme@3.7.0/dist/videojs-contrib-eme.min.js"></script>
- <script type="module" src="index.js"></script>
- </body>
- </html>
- ```
-
-2. Add an `index.js` file with the following code:
-
- ```javascript
- var videoJS = videojs("video");
- videoJS.src({
- src: "manifestUrl",
- type: "protocolType",
- });
- ```
-
-3. Replace `manifestUrl` with the HLS or DASH URL from the streaming locator of your asset, which can be found on the streaming locator page in the Azure portal.
- ![streaming locator URLs](media/player-shaka-player-how-to/streaming-urls.png)
-
-4. Replace `protocolType` with the following options:
-
- - "application/x-mpegURL" for HLS protocols
- - "application/dash+xml" for DASH protocols
-
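-For example, for a DASH manifest (the URL is a placeholder in the same format the portal shows):
-
-```javascript
-videoJS.src({
-  src: "https://amsaccount-usw22.streaming.media.azure.net/<locator-id>/asset.ism/manifest(format=mpd-time-csf)",
-  type: "application/dash+xml"
-});
-```
-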
-### Set up captions
-
-Run the `addRemoteTextTrack` method, and replace:
-
-- `subtitleKind` with either `"captions"`, `"subtitles"`, `"descriptions"`, or `"metadata"`
-- `caption` with the .vtt file path (the .vtt file needs to be on the same host to avoid CORS errors)
-- `subtitleLang` with the BCP 47 code for the language, for example, `"eng"` for English or `"es"` for Spanish
-- `subtitleLabel` with your desired caption display name
-
-```javascript
-videojs.players.video.addRemoteTextTrack({
- kind: subtitleKind,
- src: caption,
- srclang: subtitleLang,
- label: subtitleLabel
-});
-```
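-
-For example, with hypothetical values filled in:
-
-```javascript
-videojs.players.video.addRemoteTextTrack({
-  kind: "captions",
-  src: "captions/english.vtt", // hosted on the same origin as the page
-  srclang: "eng",
-  label: "English"
-});
-```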
-
-### Set up token authentication
-
-The token must be set in the Authorization field of the request's header. To avoid problems with CORS, set this token only on requests whose URL contains `'keydeliver'`. The following function does this:
-
-```javascript
-function setupTokenForDecrypt(options) {
-  if (options.uri.includes('keydeliver')) {
-    options.headers = options.headers || {};
-    // getInputToken() is a placeholder for however your app retrieves the JWT.
-    options.headers.Authorization = 'Bearer=' + getInputToken();
-  }
-
-  return options;
-}
-```
-
-Then, assign the function to the `videojs.Hls.xhr.beforeRequest` hook:
-
-```javascript
-videojs.Hls.xhr.beforeRequest = setupTokenForDecrypt;
-```
-
-### Set up AES-128 encryption
-
-Video.js supports AES-128 encryption without any additional configuration.
-
-> [!NOTE]
-> There's currently an [issue](https://github.com/videojs/video.js/issues/6717) with encryption of HLS/DASH CMAF content, which isn't playable.
-
-### Set up DRM protection
-
-In order to support DRM protection, you must add the [videojs-contrib-eme](https://github.com/videojs/videojs-contrib-eme) official extension. A CDN version works as well.
-
-1. In the `index.js` file detailed above, you must initialize the EME extension by adding `videoJS.eme();` *before* adding the source of the video:
-
- ```javascript
- videoJS.eme();
- ```
-
-2. Now you can define the URLs of the DRM services, and the URLs of the corresponding licenses as follows:
-
- ```javascript
- videoJS.src({
- keySystems: {
- "com.microsoft.playready": "YOUR PLAYREADY LICENSE URL",
- "com.widevine.alpha": "YOUR WIDEVINE LICENSE URL",
- "com.apple.fps.1_0": {
- certificateUri: "YOUR FAIRPLAY CERTIFICATE URL",
- licenseUri: "YOUR FAIRPLAY LICENSE URL"
- }
- }
- })
-
- ```
-
-#### Acquiring the license URL
-
-In order to acquire the license URL, you can:
-
-- Consult your DRM provider configuration
-- or, if you are using the sample, consult the `output.json` document generated when you ran the *setup-vod.ps1* PowerShell script for VODs, or the *start-live.ps1* script for live streams. You'll also find the KIDs inside this file.
-
-#### Using tokenized DRM
-
-To support tokenized DRM protection, add an `emeHeaders` entry to the object passed to the player's `src()` method:
-
-```javascript
-videoJS.src({
-  src: ...,
-  emeHeaders: { 'Authorization': "Bearer=" + "YOUR TOKEN" },
-  keySystems: { ... }
-});
-```
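-
-Putting the pieces together, here is a hedged end-to-end sketch (the license URLs and the token are placeholders):
-
-```javascript
-videoJS.eme();
-videoJS.src({
-  src: "manifestUrl",
-  type: "application/dash+xml",
-  emeHeaders: { Authorization: "Bearer=" + "YOUR TOKEN" },
-  keySystems: {
-    "com.microsoft.playready": "YOUR PLAYREADY LICENSE URL",
-    "com.widevine.alpha": "YOUR WIDEVINE LICENSE URL"
-  }
-});
-```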
-
-## Next steps
-
-- [Use the Azure Media Player](../azure-media-player/azure-media-player-overview.md)
-- [Quickstart: Encrypt content](drm-encrypt-content-how-to.md)
media-services Player Media Players Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/player-media-players-concept.md
- Title: Media players for Media Services overview
-description: Which media players can I use with Azure Media Services? Azure Media Player, Shaka, and Video.js so far.
-Previously updated: 3/08/2021
-# Media players for Media Services
-
-You can use several media players with Media Services.
-
-## Azure Media Player
-
-Azure Media Player is a video player for a wide variety of browsers and devices. Azure Media Player uses industry standards, such as HTML5, Media Source Extensions (MSE), and Encrypted Media Extensions (EME), to provide an enriched adaptive streaming experience. When these standards aren't available on a device or in a browser, Azure Media Player uses Flash and Silverlight as fallback technologies. Regardless of the playback technology used, developers have a unified JavaScript interface to access APIs. Content served by Azure Media Services can be played across a wide range of devices and browsers without any extra effort.
-
-See the [Azure Media Player documentation](../azure-media-player/azure-media-player-overview.md).
-
-## Shaka
-
-Shaka Player is an open-source JavaScript library for adaptive media. It plays adaptive media formats (such as DASH and HLS) in a browser, without using plugins or Flash. Instead, the Shaka Player uses the open web standards Media Source Extensions and Encrypted Media Extensions.
-
-See [How to use the Shaka player with Azure Media Services](player-shaka-player-how-to.md).
-
-## Video.js
-
-Video.js is a video player that plays adaptive media formats (such as DASH and HLS) in a browser. Video.js uses the open web standards MediaSource Extensions and Encrypted Media Extensions. It supports video playback on desktops and mobile devices. Its official documentation can be found at https://docs.videojs.com/.
-
-See [How to use the Video.js player with Azure Media Services](player-how-to-video-js-player.md).
-
-## Next steps
-
-- [Azure Media Player Quickstart](../azure-media-player/azure-media-player-quickstart.md)
media-services Player Shaka Player How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/player-shaka-player-how-to.md
- Title: How to use the Shaka player with Azure Media Services
-description: This article explains how to use the Shaka player with Azure Media Services
-Previously updated: 08/31/2020
-# How to use the Shaka player with Azure Media Services
-
-## Overview
-
-Shaka Player is an open-source JavaScript library for adaptive media. It plays adaptive media formats (such as DASH and HLS) in a browser, without using plugins or Flash. Instead, the Shaka Player uses the open web standards Media Source Extensions and Encrypted Media Extensions.
-
-We recommend using [Mux.js](https://github.com/videojs/mux.js/); without it, Shaka Player supports the HLS CMAF format but not HLS TS.
-
-Its official documentation can be found at [Shaka player documentation](https://shaka-player-demo.appspot.com/docs/api/tutorial-welcome.html).
-
-## Sample code
-
-Sample code for this article is available at [Azure-Samples/media-services-3rdparty-player-samples](https://github.com/Azure-Samples/media-services-3rdparty-player-samples).
-
-## Implementing the player
-
-Follow these instructions if you need to implement your own instance of the player:
-
-1. Create an `index.html` file where you'll host the player. Add the following lines of code (you can replace the versions with newer ones if applicable):
-
- ```html
- <html>
- <head>
- <script src="//cdn.jsdelivr.net/npm/shaka-player@3.0.1/dist/shaka-player.compiled.js"></script>
- <script src="//cdn.jsdelivr.net/npm/mux.js@5.6.3/dist/mux.js"></script>
- <script type="module" src="index.js"></script>
- </head>
- <body>
- <video id="video" controls></video>
- </body>
- </html>
- ```
-
-1. Add a JavaScript file with the following code:
-
- ```javascript
- // index.js
- shaka.polyfill.installAll();
-
- var video = document.getElementById('video');
- var player = new shaka.Player(video);
- window.player = player;
-
- var manifestUrl = 'https://amsplayeraccount-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/sample-vod.ism/manifest(format=m3u8-aapl)';
- player.load(manifestUrl);
- ```
-
-1. Replace `manifestUrl` with the HLS or DASH URL from the streaming locator of your asset, which can be found on the streaming locator page in the Azure portal.
- ![streaming locator URLs](media/player-shaka-player-how-to/streaming-urls.png)
-
-1. Run a server (for example, with `npx http-server`), and your player should work.
-
-## Set up captions
-
-### Set up VOD captions
-
-Run the following lines of code, and replace `captionUrl` with your .vtt directory (vtt file needs to be in the same host to avoid CORS error), `lang` with the two letter code for language, and `type` with either `caption` or `subtitle`:
-
-```javascript
-player.configure('streaming.alwaysStreamText', true)
-player.load(manifestUrl).then(function(){
- player.addTextTrack(captionUrl, lang, type, 'text/vtt');
- var tracks = player.getTextTracks();
- player.selectTextTrack(tracks[0]);
-});
-```
-
-### Set up live stream captions
-
-Captions in a live stream are enabled by adding the following line of code:
-
-```javascript
-player.setTextTrackVisibility(true)
-```
-
-## Set up token authentication
-
-Run the following lines of code, and replace `token` with your token string:
-
-```javascript
-player.getNetworkingEngine().registerRequestFilter(function (type, request) {
- if (type === shaka.net.NetworkingEngine.RequestType.LICENSE) {
- request.headers['Authorization'] = 'Bearer ' + token;
- }
-});
-```
-
-## Set up AES-128 encryption
-
-Shaka Player doesn't currently support AES-128 encryption.
-
-You can follow the status of this feature in this GitHub [issue](https://github.com/google/shaka-player/issues/850).
-
-## Set up DRM protection
-
-Shaka Player uses Encrypted Media Extensions (EME), which require a secure context. So, to test any DRM-protected content, you must use HTTPS. If the site uses HTTPS, then the manifest and every segment must also use HTTPS, because of mixed-content requirements.
-
-Shaka Player resolves the URL(s) of its license server(s) in the following order of preference:
-
-1. ClearKey config, used for debugging, should override everything else. (The application can still specify a ClearKey license server.)
-2. Application-configured servers, if any are present, should override anything from the manifest.
-3. Manifest-provided license servers are only used if nothing else is specified.
-
-To specify the license server URL for Widevine or PlayReady, we can use the following code:
-
-```javascript
-player.configure({
- drm: {
- servers: {
- "com.widevine.alpha": "YOUR WIDEVINE LICENSE URL",
- "com.microsoft.playready": "YOUR PLAYREADY LICENSE URL"
- }
- }
-});
-
-```
-
-All FairPlay content requires setting a server certificate. It is set in the Player configuration:
-
-```javascript
-const req = await fetch("YOUR FAIRPLAY CERTIFICATE URL");
-const cert = await req.arrayBuffer();
-player.configure('drm.advanced.com\\.apple\\.fps\\.1_0.serverCertificate', new Uint8Array(cert));
-```
-
-For more information, see [Shaka player DRM protection documentation](https://shaka-player-demo.appspot.com/docs/api/tutorial-drm-config.html).
-
-## Next steps
-
-* [Use the Azure Media Player](../azure-media-player/azure-media-player-overview.md)
-* [Quickstart: Encrypt content](drm-encrypt-content-how-to.md)
media-services Player Use Azure Media Player How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/player-use-azure-media-player-how-to.md
-
Title: Playback with Azure Media Player - Azure
-description: Azure Media Player is a web video player built to play back media content from Microsoft Azure Media Services on a wide variety of browsers and devices.
-Previously updated: 07/17/2019
-# Playback with Azure Media Player
-
-Azure Media Player is a web video player built to play back media content from Microsoft Azure Media Services on a wide variety of browsers and devices. Azure Media Player uses industry standards, such as HTML5, Media Source Extensions (MSE), and Encrypted Media Extensions (EME), to provide an enriched adaptive streaming experience. When these standards are not available on a device or in a browser, Azure Media Player uses Flash and Silverlight as fallback technologies. Regardless of the playback technology used, developers have a unified JavaScript interface to access APIs. This allows content served by Azure Media Services to be played across a wide range of devices and browsers without any extra effort.
-
-Microsoft Azure Media Services can deliver content in the HLS, DASH, and Smooth Streaming formats. Azure Media Player takes these various formats into account and automatically plays the best link based on platform/browser capabilities. Media Services also allows for dynamic encryption of assets with PlayReady encryption or AES-128 envelope encryption. Azure Media Player can decrypt PlayReady- and AES-128-encrypted content when appropriately configured.
-
-> [!NOTE]
-> HTTPS playback is required for Widevine encrypted content.
-
-## Use Azure Media Player demo page
-
-### Start using
-
-You can use the [Azure Media Player demo page](https://aka.ms/azuremediaplayer) to play Azure Media Services samples or your own stream.
-
-To play a new video, paste a different URL and press **Update**.
-
-To configure various playback options (for example, tech, language, or encryption), press **Advanced Options**.
-
-![Azure Media Player](./media/azure-media-player/home-page.png)
-
-### Monitor diagnostics of a video stream
-
-You can use the [Azure Media Player demo page](https://aka.ms/azuremediaplayer) to monitor diagnostics of a video stream.
-
-![Azure Media Player diagnostics](./media/azure-media-player/diagnostics.png)
-
-## Set up Azure Media Player in your HTML
-
-Azure Media Player is easy to set up. It only takes a few moments to get basic playback of media content from your Media Services account. See [Azure Media Player documentation](../azure-media-player/azure-media-player-overview.md) for details on how to set up and configure Azure Media Player.
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-
-* [Azure Media Player documentation](../azure-media-player/azure-media-player-overview.md)
-* [Azure Media Player samples](https://github.com/Azure-Samples/azure-media-player-samples)
media-services Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/release-notes.md
-
Title: Azure Media Services v3 release notes
-description: To stay up to date with the most recent developments, this article provides you with the latest updates on Azure Media Services v3.
-Previously updated: 03/17/2021
-# Azure Media Services v3 release notes
-
-To stay up to date with the most recent developments, this article provides you with information about:
-
-* The latest releases
-* Known issues
-* Bug fixes
-* Deprecated functionality
-
-## December 2021
-
-### Updated JavaScript SDK version 10.0.0
-
-The JavaScript SDK is now updated to support the latest REST API release of 2021-06-01. This new isomorphic JavaScript SDK includes better support for Promises and the ability to authenticate using the @azure/identity library for use with Azure AD applications, managed identity, and more.
-
-To download the latest package, see the [@azure/arm-media-services NPM package](https://www.npmjs.com/package/@azure/arm-mediaservices).
-
-An updated and expanded set of Node.js and TypeScript based samples for the new JavaScript package is available on GitHub:
-[https://github.com/Azure-Samples/media-services-v3-node-tutorials](https://github.com/Azure-Samples/media-services-v3-node-tutorials)
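-
-As a quick illustration, a minimal sketch of authenticating and listing assets with the new package (operation names as generated in `@azure/arm-mediaservices` 10.x; the subscription ID, resource group, and account name are placeholders):
-
-```javascript
-const { DefaultAzureCredential } = require("@azure/identity");
-const { AzureMediaServices } = require("@azure/arm-mediaservices");
-
-async function main() {
-  // DefaultAzureCredential works with Azure AD apps, managed identities, and developer logins.
-  const client = new AzureMediaServices(new DefaultAzureCredential(), "<subscription id>");
-  for await (const asset of client.assets.list("<resource group>", "<account name>")) {
-    console.log(asset.name);
-  }
-}
-
-main().catch(console.error);
-```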
-
-### Hebrew, Persian, and Portugal Portuguese languages available in the Audio/Video Analyzer preset for transcriptions
-
-Hebrew, Persian, and Portuguese (Portugal) (the existing model is Brazilian Portuguese) are now available for use in the [Audio and Video Analyzer preset](./analyze-video-audio-files-concept.md).
-
-The new supported BCP-47 language codes are: he-IL, fa-IR, and pt-PT.
-
-### Sweden Central region is now GA
-
-Media Services is now generally available in the Sweden Central region. There are currently some feature limitations in the region while a few dependency services also arrive there. Check the [regional feature availability chart](./azure-clouds-regions.md) to determine when features will arrive.
-
-### New live event channel archive heartbeat event
-
-A new event that tracks the status and health of the live event archive has been added. See the LiveEventChannelArchiveHeartbeatEvent in the [Event Grid schemas for Media Services](./media-services-event-schemas.md) for more details on this new event.
-
-## September 2021
-
-### New basic pass-through live event SKU
-
-The new basic pass-through live event SKU allows customers to create live events at a [lower price point](https://azure.microsoft.com/pricing/details/media-services/). It is similar to standard pass-through live events, but with lower input bandwidth limits, fewer live outputs allowed, different DVR window length limits, and no access to live transcription. See [live event types comparison](./live-event-types-comparison-reference.md#types-comparison) for more details.
-
-### Improved scale management and monitoring for a Streaming Endpoint in the portal
-
-The streaming endpoint portal page now provides an easy way for you to manage your egress capacity and estimate your audience reach with and without a CDN configured. Adjust the delivery bitrate and expected CDN cache hit ratio to get quick estimations of your audience size and help you determine if you need to scale up to more Premium streaming endpoints.
-
- [ ![Scale and monitor streaming endpoints in the portal](./media/release-notes/streaming-endpoint-monitor-inline.png) ](./media/release-notes/streaming-endpoint-monitor.png#lightbox)
-
-### Streaming Endpoint portal page now shows CPU, egress, and latency metrics
-
-You can now visualize the CPU load, egress bandwidth, and end-to-end latency metrics of your streaming endpoints in the Azure portal. You can also create monitoring alerts based on the CPU, egress, or latency metrics directly in the portal using the power of Azure Monitor.
-
-### User-Assigned Managed Identities support for Media Services accounts
-
-Using user-assigned managed identities, customers can now enable better security of their storage accounts and associated key vaults. Access to the customer storage account and key vaults is limited to the user-assigned managed identity. You have full control over the lifetime of user-assigned managed identities and can easily revoke the Media Services account's access to any specific storage account as needed.
-
-### Media services storage accounts page in the portal now support both UAMI and SAMI
-
-You can now assign and manage user-assigned managed identities (UAMI) or system-assigned managed identities(SAMI) for your storage accounts directly in the Azure portal for Media Services.
-
-### Bring your own key page now also supports both UAMI and SAMI
-
-The key management portal page for Media Services now supports configuration and management of user-assigned managed identities (UAMI) or system-assigned managed identities (SAMI).
-
- [ ![Bring your own keys for account encryption](./media/release-notes/byok-managed-identity.png)](./media/release-notes/byok-managed-identity.png)
-
-### Private Link support for Media Services
-
-You can now restrict public access to your live events, streaming endpoints, and the key delivery service endpoint for content protection and DRM by creating a private endpoint for each of the services. This limits public access to these services. Only traffic originating from your virtual network (VNET), configured in the private endpoint, will be able to reach these endpoints.
-
-### IP allowlist for Key Service
-
-You can now choose to allow certain public IP addresses to have access to the key delivery service for DRM and content protection. Live events and streaming endpoints already support IP allowlist configuration on their respective pages.
-
-You also now have an account level feature flag to allow/block public internet access to your media services account.
-
-## July 2021
-
-### .NET SDK (Microsoft.Azure.Management.Media) 5.0.0 release available in NuGet
-
-The [Microsoft.Azure.Management.Media](https://www.nuget.org/packages/Microsoft.Azure.Management.Media/5.0.0) .NET SDK version 5.0.0 is now released on NuGet. This version is generated to work with the [2021-06-01 stable](https://github.com/Azure/azure-rest-api-specs/tree/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2021-06-01) version of the Open API (Swagger) ARM REST API.
-
-For details on changes from the 4.0.0 release see the [change log](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/mediaservices/Microsoft.Azure.Management.Medi).
-
-#### Changes in the 5.0.0 .NET SDK release
-
-* The Media Services account now supports system and user assigned managed identities.
-* Added **PublicNetworkAccess** option to Media Services accounts. This option can be used with the Private Link feature to only allow access from private networks, blocking all public network access
-* Basic passthrough - A new live event type is added. "Basic Pass-through" live events have similar capabilities as standard pass-through live events with some input and output restrictions, and are offered at a reduced price.
-* **PresetConfigurations** - allow you to customize the output settings, and min and max bitrates used for the [Content Aware Encoding presets](./encode-content-aware-concept.md). This helps you to better estimate and plan for more accurate billing when using Content Aware Encoding through constrained output track numbers and resolutions.
-
-#### Breaking changes in the 5.0.0 .NET SDK release
-
-* **ApiErrorException** has been replaced with **ErrorResponseException** to be consistent with all other Azure SDKs. Exception body has not changed.
-* All calls returning 404 Not Found now raise an **ErrorResponseException** instead of returning null. This change was made to be consistent with other Azure SDKs.
-* The Media service constructor has a new optional **PublicNetworkAccess** parameter after the **KeyDelivery** parameter.
-* Type property in **MediaServiceIdentity** has been changed from ManagedIdentityType enum to string to accommodate multiple comma-separated values. Valid strings are **SystemAssigned** or **UserAssigned**.
-
-## June 2021
-
-### More Live Event ingest heartbeat properties for improved diagnostics
-
-More live event ingest heartbeat properties have been added to the Event Grid message, including the following new fields to assist with diagnosing issues during live ingest. The **ingestDriftValue** is helpful in scenarios where you need to monitor network latency from the source ingest encoder pushing into the live event. If this value drifts too far, it can be an indication that the network latency is too high for a successful live streaming event.
-
-See the [LiveEventIngestHeartbeat schema](./monitoring/media-services-event-schemas.md#liveeventingestheartbeat) for more details.
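-
-For example, a hedged sketch of acting on the drift value in an Event Grid-triggered Azure Function (the threshold and logging are illustrative assumptions):
-
-```javascript
-module.exports = async function (context, eventGridEvent) {
-  if (eventGridEvent.eventType === "Microsoft.Media.LiveEventIngestHeartbeat") {
-    // ingestDriftValue is reported as a string and can be "n/a".
-    const drift = Number(eventGridEvent.data.ingestDriftValue);
-    if (!Number.isNaN(drift) && drift > 5) { // threshold chosen for illustration
-      context.log.warn(`High ingest drift (${drift}) on ${eventGridEvent.subject}`);
-    }
-  }
-};
-```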
-
-### Private links support is now GA
-
-Support for using Media Services with [private links](../../private-link/index.yml) is now GA and available in all Azure regions including Azure Government clouds.
-Azure Private Link enables you to access Azure PaaS Services and Azure hosted customer-owned/partner services over a Private Endpoint in your virtual network.
-Traffic between your virtual network and the service traverses over the Microsoft backbone network, eliminating exposure from the public Internet.
-
-For details on how to use Media Services with private links, see [Create a Media Services and Storage account with a Private Link](./security-private-link-how-to.md)
-
-### New US West 3 region is GA
-
-The US West 3 region is now GA and available for customers to use when creating new Media Services accounts.
-
-### Key delivery supports IP allowlist restrictions
-
-Media Services accounts can now be configured with IP allowlist restrictions on key delivery. The new allowlist setting is available on the Media Services account resource through the SDK and in the portal and CLI.
-This allows operators to restrict delivery of DRM licenses and AES-128 content keys to specific IPv4 ranges.
-
-This feature can also be used to shut off all public internet delivery of DRM licenses or AES-128 keys and restrict delivery to a private network endpoint.
-
-See the article [Restrict access to DRM license and AES key delivery using IP allowlists](./drm-content-protection-key-delivery-ip-allow.md) for details.
-
-### New Samples for Python and Node.js (with TypeScript)
-The **Node.js** samples have been updated to use the latest TypeScript support in the Azure SDK.
-
-|Sample|Description|
-|||
-|[Live streaming](https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/main/Live/)| Basic live streaming example. **Warning:** when using live, make sure to check that all resources are cleaned up and no longer billing in the portal.|
-|[Upload and stream HLS and DASH](https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/main/StreamFilesSample/index.ts)| Basic example for uploading a local file or encoding from a source URL. Sample shows how to use storage SDK to download content, and shows how to stream to a player |
-|[Upload and stream HLS and DASH with PlayReady and Widevine DRM](https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/main/StreamFilesWithDRMSample/index.ts)| Demonstrates how to encode and stream using Widevine and PlayReady DRM |
-|[Upload and use AI to index videos and audio](https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/main/VideoAnalytics/index.ts)| Example of using the Video and Audio Analyzer presets to generate metadata and insights from a video or audio file |
-
-New **Python** sample demonstrating how to use Azure Functions, and Event Grid to trigger Face redaction preset.
-
-|Sample|Description|
-|||
-|[Face Redaction using events and functions](https://github.com/Azure-Samples/media-services-v3-python/tree/main/VideoAnalytics/FaceRedactorEventBased) | This is an example of an event-based approach that triggers an Azure Media Services Face Redactor job on a video as soon as it lands in an Azure Storage account. It leverages Azure Media Services, Azure Functions, Event Grid, and Azure Storage for the solution. For the full description of the solution, see the [README.md](https://github.com/Azure-Samples/media-services-v3-python/blob/main/VideoAnalytics/FaceRedactorEventBased/README.md) |
-
-## May 2021
-
-### Availability Zones default support in Media Services
-
-Media Services now supports [Availability Zones](concept-availability-zones.md), providing fault-isolated locations within the same Azure region. Media Services accounts are now zone redundant by default, and no extra configuration or settings are required. This only applies to regions that have [Availability Zones support](../../availability-zones/az-region.md#azure-regions-with-availability-zones).
-
-## March 2021
-
-### New language support added to the AudioAnalyzer preset
-
-More languages for video transcription and subtitling are available now in the AudioAnalyzer preset (both Basic and Standard modes).
-
-* English (Australia), 'en-AU'
-* French (Canada), 'fr-CA'
-* Arabic (Bahrain) modern standard, 'ar-BH'
-* Arabic (Egypt), 'ar-EG'
-* Arabic (Iraq), 'ar-IQ'
-* Arabic (Israel), 'ar-IL'
-* Arabic (Jordan), 'ar-JO'
-* Arabic (Kuwait), 'ar-KW'
-* Arabic (Lebanon), 'ar-LB'
-* Arabic (Oman), 'ar-OM'
-* Arabic (Qatar), 'ar-QA'
-* Arabic (Saudi Arabia), 'ar-SA'
-* Danish, 'da-DK'
-* Norwegian, 'nb-NO'
-* Swedish, 'sv-SE'
-* Finnish, 'fi-FI'
-* Thai, 'th-TH'
-* Turkish, 'tr-TR'
-
-See the latest available languages in the [Analyzing Video And Audio Files concept article.](analyze-video-audio-files-concept.md)
-
-## February 2021
-
-### HEVC Encoding support in Standard Encoder
-
-The Standard Encoder now supports 8-bit HEVC (H.265) encoding. HEVC content can be delivered and packaged through the Dynamic Packager using the 'hev1' format.
-
-A new .NET custom encoding with HEVC sample is available in the [media-services-v3-dotnet GitHub repository](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_HEVC).
-In addition to custom encoding, the following new built-in HEVC encoding presets are now available:
-
-- H265ContentAwareEncoding
-- H265AdaptiveStreaming
-- H265SingleBitrate720P
-- H265SingleBitrate1080p
-- H265SingleBitrate4K
-
-Customers previously using HEVC in the Premium Encoder in the v2 API should migrate to use the new HEVC encoding support in the Standard Encoder.
-
-### Azure Media Services v2 API and SDKs deprecation announcement
-
-#### Update your Azure Media Services REST API and SDKs to v3 by 29 February 2024
-
-Because version 3 of the Azure Media Services REST API and client SDKs for .NET and Java offers more capabilities than version 2, we're retiring version 2 of the Azure Media Services REST API and client SDKs for .NET and Java.
-
-We encourage you to make the switch sooner to gain the richer benefits of version 3 of Azure Media Services REST API and client SDKs for .NET and Java.
-Version 3 provides:
-
-
-- 24x7 live event support
-- ARM REST APIs, and client SDKs for .NET Core, Node.js, Python, Java, Go, and Ruby
-- Customer-managed keys, trusted storage integration, private link support, and [more](./migrate-v-2-v-3-migration-benefits.md)
-
-As part of the update to v3 API and SDKs, Media Reserve Units (MRUs) are no longer needed for any Media Services account as the system will automatically scale up and down based on load. Refer to the [MRUs migration guidance](./migrate-v-2-v-3-migration-scenario-based-media-reserved-units.md) for more information.
-
-#### Action Required
-
-To minimize disruption to your workloads, review the [migration guide](./migrate-v-2-v-3-migration-introduction.md) to transition your code from the version 2 API and SDKs to version 3 API and SDK before 29 February 2024.
-**After 29 February 2024**, Azure Media Services will no longer accept traffic on the version 2 REST API, the ARM account management API version 2015-10-01, or from the version 2 .NET client SDKs. This includes any third-party open-source client SDKs that may call the version 2 API.
-
-See the official [Azure Updates announcement](https://azure.microsoft.com/updates/update-your-azure-media-services-rest-api-and-sdks-to-v3-by-29-february-2024/).
-
-### Standard Encoder support for v2 API features
-
-In addition to the new added support for HEVC (H.265) encoding, the following features are now available in the 2020-05-01 (or later) version of the encoding API.
-
-- Multiple Input File stitching is now supported using the new **JobInputClip** support.
- - An example is available for .NET showing how to [stitch two assets together](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_StitchTwoAssets).
-- Audio track selection allows customers to select and map the incoming audio tracks and route them to the output for encoding
- - See the [REST API OpenAPI for details](https://github.com/Azure/azure-rest-api-specs/blob/8d15dc681b081cca983e4d67fbf6441841d94ce4/specification/mediaservices/resource-manager/Microsoft.Media/stable/2020-05-01/Encoding.json#L385) on **AudioTrackDescriptor** and track selection
-- Track selection for encoding allows customers to choose tracks from an ABR source file or live archive that has multiple bitrate tracks. It's extremely helpful for generating MP4s from the live event archive files.
- - See [VideoTrackDescriptor](https://github.com/Azure/azure-rest-api-specs/blob/8d15dc681b081cca983e4d67fbf6441841d94ce4/specification/mediaservices/resource-manager/Microsoft.Media/stable/2020-05-01/Encoding.json#L1562)
-- Redaction (blurring) capabilities added to FaceDetector
- - See the [Redact](https://github.com/Azure/azure-rest-api-specs/blob/8d15dc681b081cca983e4d67fbf6441841d94ce4/specification/mediaservices/resource-manager/Microsoft.Media/stable/2020-05-01/Encoding.json#L634) and [Combined](https://github.com/Azure/azure-rest-api-specs/blob/8d15dc681b081cca983e4d67fbf6441841d94ce4/specification/mediaservices/resource-manager/Microsoft.Media/stable/2020-05-01/Encoding.json#L649) modes of the FaceDetector Preset
-
-### New client SDK releases for 2020-05-01 version of the Azure Media Services API
-
-New client SDK versions for all available languages are now available with the above features.
-Please update to the latest client SDKs in your code bases using your package manager.
-
-- [.NET SDK package 3.0.4](https://www.nuget.org/packages/Microsoft.Azure.Management.Media/)
-- [Node.js TypeScript version 8.1.0](https://www.npmjs.com/package/@azure/arm-mediaservices)
-- [Python azure-mgmt-media 3.1.0](https://pypi.org/project/azure-mgmt-media/)
-- [Java SDK 1.0.0-beta.2](https://search.maven.org/artifact/com.azure.resourcemanager/azure-resourcemanager-mediaservices/1.0.0-beta.2/jar)
-
-### New Security features available in the 2020-05-01 version of the Azure Media Services API
-
-- **[Customer Managed Keys](concept-use-customer-managed-keys-byok.md)**: Content Keys and other data stored in accounts created with the "2020-05-01" version API are encrypted with an account key. Customers can provide a key to encrypt the account key.
-- **[Trusted Storage](concept-trusted-storage.md)**: Media Services can be configured to access Azure Storage using a Managed Identity associated with the Media Services account. When storage accounts are accessed using a Managed Identity, customers can configure more restrictive network ACLs on the storage account without blocking Media Services scenarios.
-- **[Managed Identities](concept-managed-identities.md)**: Customers may enable a System Assigned Managed Identity for a Media Services account to provide access to Key Vaults (for Customer-Managed Keys) and storage accounts (for Trusted Storage).
-
-### Updated TypeScript Node.js Samples using isomorphic SDK for JavaScript
-
-The Node.js samples have been updated to use the latest isomorphic SDK. The samples now show use of TypeScript. In addition, a new live streaming sample was added for Node.js/TypeScript.
-
-See the latest samples in the **[media-services-v3-node-tutorials](https://github.com/Azure-Samples/media-services-v3-node-tutorials)** GitHub repo.
-
-### New Live Stand-by mode to support faster startup from warm state
-
-Live Events now support a lower-cost billing mode for "stand-by". This allows customers to pre-allocate Live Events at a lower cost for the creation of "hot pools". Customers can then use the stand-by live events to transition to the Running state faster than starting from cold on creation. This reduces the time to start the channel significantly and allows for fast hot-pool allocation of machines running in a lower price mode.
-See the latest pricing details [here](https://azure.microsoft.com/pricing/details/media-services).
-For more information on the StandBy state and the other states of Live Events see the article - [Live event states and billing.](./live-event-states-billing-concept.md)
-
-## December 2020
-
-### Regional availability
-
-Azure Media Services is now available in the Norway East region in the Azure portal. The v2 REST API is not available in this region.
-
-## October 2020
-
-### Basic Audio Analysis
-
-The Audio Analysis preset now includes a Basic mode pricing tier. The new Basic Audio Analyzer mode provides a low-cost option to extract speech transcription and to format output captions and subtitles. This mode performs speech-to-text transcription and generates a VTT subtitle/caption file. The output of this mode includes an Insights JSON file containing only the keywords, transcription, and timing information. Automatic language detection and speaker diarization are not included in this mode. See the list of [supported languages.](analyze-video-audio-files-concept.md#built-in-presets)
-
-Customers using Indexer v1 and Indexer v2 should migrate to the Basic Audio Analysis preset.
-
-For more information about the Basic Audio Analyzer mode, see [Analyzing Video and Audio files](analyze-video-audio-files-concept.md). To learn to use the Basic Audio Analyzer mode with the REST API, see [How to Create a Basic Audio Transform](transform-create-basic-audio-how-to.md).
-
-### Live Events
-
-Updates to most properties are now allowed when live events are stopped. In addition, users are allowed to specify a prefix for the static hostname for the live event's input and preview URLs. VanityUrl is now called `useStaticHostName` to better reflect the intent of the property.
-
-Live events now have a StandBy state. See [Live Events and Live Outputs in Media Services](./live-event-outputs-concept.md).
-
-A live event supports receiving various input aspect ratios. Stretch mode allows customers to specify the stretching behavior for the output.
-
-Live encoding now adds the capability of outputting fixed key frame interval fragments of between 0.5 and 20 seconds.
-
-### Accounts
-
-> [!WARNING]
-> If you create a Media Services account with the 2020-05-01 API version, it won't work with RESTv2.
-
-## August 2020
-
-### Dynamic Encryption
-
-Support for the legacy PlayReady Protected Interoperable File Format (PIFF 1.1) encryption is now available in the Dynamic Packager. This provides support for legacy Smart TV sets from Samsung and LG that implemented the early drafts of the Common Encryption standard (CENC) published by Microsoft. The PIFF 1.1 format is also known as the encryption format that was previously supported by the Silverlight client library. Today, the only use case scenario for this encryption format is to target the legacy Smart TV market where there remains a non-trivial number of Smart TVs in some regions that only support Smooth Streaming with PIFF 1.1 encryption.
-
-To use the new PIFF 1.1 encryption support, change the encryption value to 'piff' in the URL path of the Streaming Locator. For more information, see the [Content Protection overview.](drm-content-protection-concept.md)
-For example: `https://amsv3account-usw22.streaming.media.azure.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest(encryption=piff)`
-
-> [!NOTE]
-> PIFF 1.1 support is provided as a backwards compatible solution for Smart TV (Samsung, LG) that implemented the early "Silverlight" version of Common Encryption. It is recommended to only use the PIFF format where needed for support of legacy Samsung or LG Smart TVs shipped between 2009-2015 that supported the PIFF 1.1 version of PlayReady encryption.
-
-## July 2020
-
-### Live transcriptions
-
-Live transcription now supports 19 languages and is available in 8 regions.
-
-### Protecting your content with Media Services and Azure AD
-
-We published a tutorial called [End-to-End content protection using Azure AD](./architecture-azure-ad-content-protection.md).
-
-### High availability
-
-We published a High Availability with Media Services and Video on Demand (VOD) [overview](./architecture-high-availability-encoding-concept.md) and [sample](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/master/HighAvailabilityEncodingStreaming).
-
-## June 2020
-
-### Live Video Analytics on IoT Edge preview release
-
-The preview of Live Video Analytics on IoT Edge went public.
-
-Live Video Analytics on IoT Edge is an expansion to the Media Service family. It enables you to analyze live video with AI models of your choice on your own edge devices, and optionally capture and record that video. You can now build apps with real-time video analytics at the edge without worrying about the complexity of building and operating a live video pipeline.
-
-## May 2020
-
-Azure Media Services is now generally available in the following regions: "Germany North", "Germany West Central", "Switzerland North", and "Switzerland West". Customers can deploy Media Services to these regions using the Azure portal.
-
-## April 2020
-
-### Improvements in documentation
-
-Azure Media Player docs were migrated to the [Azure documentation](../azure-media-player/azure-media-player-overview.md).
-
-## January 2020
-
-### Improvements in media processors
-
-- Improved support for interlaced sources in Video Analysis: such content is now de-interlaced correctly before being sent to inference engines.
-- When generating thumbnails with the "Best" mode, the encoder now searches beyond 30 seconds to select a frame that is not monochromatic.
-
-### Azure Government cloud updates
-
-Media Services became generally available in the following Azure Government regions: *USGov Arizona* and *USGov Texas*.
-
-## December 2019
-
-Added CDN support for *Origin-Assist Prefetch* headers for both live and video-on-demand streaming; available for customers who have a direct contract with Akamai CDN. The Origin-Assist CDN-Prefetch feature involves the following HTTP header exchanges between Akamai CDN and the Azure Media Services origin:
-
-|HTTP header|Values|Sender|Receiver|Purpose|
-| - | - | - | - | -- |
-|CDN-Origin-Assist-Prefetch-Enabled | 1 (default) or 0 |CDN|Origin|To indicate CDN is prefetch enabled|
-|CDN-Origin-Assist-Prefetch-Path| Example: <br/>Fragments(video=1400000000,format=mpd-time-cmaf)|Origin|CDN|To provide prefetch path to CDN|
-|CDN-Origin-Assist-Prefetch-Request|1 (prefetch request) or 0 (regular request)|CDN|Origin|To indicate the request from CDN is a prefetch|
-
-To see part of the header exchange in action, you can try the following steps:
-
-1. Use Postman or curl to issue a request to Media Services origin for an audio or video segment or fragment. Make sure to add the header CDN-Origin-Assist-Prefetch-Enabled: 1 in the request.
-2. In the response, you should see the header CDN-Origin-Assist-Prefetch-Path with a relative path as its value.
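-
-For example, a hedged Node.js equivalent of step 1 (Node 18+ for global fetch; the origin URL is a placeholder):
-
-```javascript
-async function probePrefetch(originUrl) {
-  const res = await fetch(originUrl, {
-    headers: { "CDN-Origin-Assist-Prefetch-Enabled": "1" }
-  });
-  // The origin replies with the relative path the CDN should prefetch next.
-  console.log(res.headers.get("CDN-Origin-Assist-Prefetch-Path"));
-}
-
-probePrefetch("https://<origin-hostname>/<locator>/asset.ism/Fragments(video=1400000000,format=mpd-time-cmaf)");
-```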
-
-## November 2019
-
-### Live transcription Preview
-
-Live transcription is now in public preview and available for use in the West US 2 region.
-
-Live transcription is designed to work with live events as an add-on capability. It is supported on both pass-through and Standard or Premium encoding live events. When this feature is enabled, the service uses the [Speech-To-Text](../../cognitive-services/speech-service/speech-to-text.md) feature of Cognitive Services to transcribe the spoken words in the incoming audio into text. This text is then made available for delivery along with video and audio in the MPEG-DASH and HLS protocols. Billing is based on a new add-on meter that adds cost to the live event when it is in the "Running" state. For details on live transcription and billing, see [Live transcription](live-event-live-transcription-how-to.md).
-
-> [!NOTE]
-> Currently, live transcription is only available as a preview feature in the West US 2 region. It supports transcription of spoken words in English (en-us) only at this time.
-
-### Content protection
-
-The *Token Replay Prevention* feature released in limited regions back in September is now available in all regions.
- Media Services customers can now set a limit on the number of times the same token can be used to request a key or a license. For more information, see [Token Replay Prevention](drm-content-protection-concept.md#token-replay-prevention).
-
-### New recommended live encoder partners
-
-Added support for the following new recommended partner encoders for RTMP live streaming:
-
-- [Cambria Live 4.3](https://www.capellasystems.net/products/cambria-live/)
-- [GoPro Hero7/8 and Max action cameras](https://gopro.com/help/articles/block/getting-started-with-live-streaming)
-- [Restream.io](https://restream.io/)
-
-### File Encoding enhancements
-
-- A new Content Aware Encoding preset is now available. It produces a set of GOP-aligned MP4s by using content-aware encoding. Given any input content, the service performs an initial lightweight analysis of the input content. It uses those results to determine the optimal number of layers, appropriate bit rate, and resolution settings for delivery by adaptive streaming. This preset is effective for low-complexity and medium-complexity videos, where the output files are at lower bit rates but at a quality that still delivers a good experience to viewers. The output will contain MP4 files with video and audio interleaved. For more information, see the [open API specs](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2018-07-01/Encoding.json).
-- Improved performance and multi-threading for the resizer in Standard Encoder. Under specific conditions, customers should see a performance boost of between 5-40% in VOD encoding. Low-complexity content encoded into multiple bitrates will see the highest performance increases.
-- Standard encoding now maintains a regular GOP cadence for variable frame rate (VFR) content during VOD encoding when using the time-based GOP setting. This means that customers submitting mixed frame rate content that varies between 15-30 fps, for example, should now see regular GOP distances calculated on output to adaptive bitrate streaming MP4 files. This improves the ability to switch seamlessly between tracks when delivering over HLS or DASH.
-- Improved AV sync for variable frame rate (VFR) source content
-
-### Azure Video Analyzer for Media, Video analytics
-- Keyframes extracted using the VideoAnalyzer preset are now in the original resolution of the video instead of being resized. High-resolution keyframe extraction gives you original quality images and allows you to make use of the image-based artificial intelligence models provided by the Microsoft Computer Vision and Custom Vision services to gain even more insights from your video.
-
-## September 2019
-
-### Media Services v3
-
-#### Live linear encoding of live events
-
-Media Services v3 is announcing the preview of 24 hrs x 365 days of live linear encoding of live events.
-
-### Media Services v2
-
-#### Deprecation of media processors
-
-We are announcing deprecation of *Azure Media Indexer* and *Azure Media Indexer 2 Preview*. For the retirement dates, see the [legacy components](../previous/legacy-components.md) article. Azure Video Analyzer for Media replaces these legacy media processors.
-
-For more information, see [Migrate from Azure Media Indexer and Azure Media Indexer 2 to **Azure Media Services Video Indexer**](../previous/migrate-indexer-v1-v2.md).
-
-## August 2019
-
-### Media Services v3
-
-#### South Africa regional pair is open for Media Services
-
-Media Services is now available in South Africa North and South Africa West regions.
-
-For more information, see [Clouds and regions in which Media Services v3 exists](azure-clouds-regions.md).
-
-### Media Services v2
-
-#### Deprecation of media processors
-
-We are announcing deprecation of the *Windows Azure Media Encoder* (WAME) and *Azure Media Encoder* (AME) media processors, which are being retired. For the retirement dates, see this [legacy components](../previous/legacy-components.md) article.
-
-For details, see [Migrate WAME to Media Encoder Standard](../previous/migrate-windows-azure-media-encoder.md) and [Migrate AME to Media Encoder Standard](../previous/migrate-azure-media-encoder.md).
-
-## July 2019
-
-### Content protection
-
-When streaming content protected with token restriction, end users need to obtain a token that is sent as part of the key delivery request. The *Token Replay Prevention* feature allows Media Services customers to set a limit on how many times the same token can be used to request a key or a license. For more information, see [Token Replay Prevention](drm-content-protection-concept.md#token-replay-prevention).
-
-As of July, the preview feature was only available in US Central and US West Central.
-
-## June 2019
-
-### Video subclipping
-
-You can now trim or subclip a video when encoding it using a [Job](/rest/api/media/jobs).
-
-This functionality works with any [Transform](/rest/api/media/transforms) that is built using either the [BuiltInStandardEncoderPreset](/rest/api/media/transforms/createorupdate#builtinstandardencoderpreset) presets, or the [StandardEncoderPreset](/rest/api/media/transforms/createorupdate#standardencoderpreset) presets.
-
-See examples:
-
-* [Subclip a video with REST](transform-subclip-video-how-to.md)
-
-## May 2019
-
-### Azure Monitor support for Media Services diagnostic logs and metrics
-
-You can now use Azure Monitor to view telemetry data emitted by Media Services.
-
-* Use the Azure Monitor diagnostic logs to monitor requests sent by the Media Services Key Delivery endpoint.
-* Monitor metrics emitted by Media Services [Streaming Endpoints](stream-streaming-endpoint-concept.md).
-
-For details, see [Monitor Media Services metrics and diagnostic logs](monitoring/monitor-media-services-data-reference.md).
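As a rough sketch of how this might be wired up with the Azure CLI, assuming hypothetical resource IDs and that the Key Delivery log category is `KeyDeliveryRequests`:

```azurecli
# Hypothetical IDs; route Key Delivery request logs to a Log Analytics workspace.
az monitor diagnostic-settings create \
  --name media-diagnostics \
  --resource "<media-services-account-resource-id>" \
  --workspace "<log-analytics-workspace-resource-id>" \
  --logs '[{"category": "KeyDeliveryRequests", "enabled": true}]'
```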
-
-### Multi audio tracks support in Dynamic Packaging
-
-When streaming Assets that have multiple audio tracks with multiple codecs and languages, [Dynamic Packaging](encode-dynamic-packaging-concept.md) now supports multi audio tracks for the HLS output (version 4 or above).
-
-### Korea regional pair is open for Media Services
-
-Media Services is now available in Korea Central and Korea South regions.
-
-For more information, see [Clouds and regions in which Media Services v3 exists](azure-clouds-regions.md).
-
-### Performance improvements
-
-Added updates that include Media Services performance improvements.
-
-* The maximum file size supported for processing was updated. See [Quotas and limits](limits-quotas-constraints-reference.md).
-* [Encoding speed improvements](concept-media-reserved-units.md).
-
-## April 2019
-
-### New presets
-
-* [FaceDetectorPreset](/rest/api/media/transforms/createorupdate#facedetectorpreset) was added to the built-in analyzer presets.
-* [ContentAwareEncodingExperimental](/rest/api/medi).
-
-## March 2019
-
-Dynamic Packaging now supports Dolby Atmos. For more information, see [Audio codecs supported by dynamic packaging](encode-dynamic-packaging-concept.md#audio-codecs-supported-by-dynamic-packaging).
-
-You can now specify a list of asset or account filters, which would apply to your Streaming Locator. For more information, see [Associate filters with Streaming Locator](filters-concept.md#associating-filters-with-streaming-locator).
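A minimal sketch of how this might look with the Azure CLI, assuming hypothetical account, asset, and filter names (`--filters` takes a space-separated list of filter names):

```azurecli
# Hypothetical names throughout; create a locator with two filters applied.
az ams streaming-locator create \
  --account-name myamsaccount \
  --resource-group myrg \
  --asset-name myasset \
  --name mylocatorwithfilters \
  --streaming-policy-name Predefined_ClearStreamingOnly \
  --filters myassetfilter myaccountfilter
```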
-
-## February 2019
-
-Media Services v3 is now supported in Azure national clouds. Not all features are available in all clouds yet. For details, see [Clouds and regions in which Azure Media Services v3 exists](azure-clouds-regions.md).
-
-[Microsoft.Media.JobOutputProgress](monitoring/media-services-event-schemas.md#monitoring-job-output-progress) event was added to the Azure Event Grid schemas for Media Services.
-
-## January 2019
-
-### Media Encoder Standard and MPI files
-
-When encoding with Media Encoder Standard to produce MP4 file(s), a new .mpi file is generated and added to the output Asset. This MPI file is intended to improve performance for [dynamic packaging](encode-dynamic-packaging-concept.md) and streaming scenarios.
-
-You should not modify or remove the MPI file, or take any dependency in your service on the existence (or not) of such a file.
-
-## December 2018
-
-Updates from the GA release of the V3 API include:
-
-* The **PresentationTimeRange** properties are no longer 'required' for **Asset Filters** and **Account Filters**.
-* The $top and $skip query options for **Jobs** and **Transforms** have been removed and $orderby was added (see the example after this list). As part of adding the new ordering functionality, it was discovered that the $top and $skip options had accidentally been exposed previously even though they are not implemented.
-* Enumeration extensibility was re-enabled. This feature was enabled in the preview versions of the SDK and got accidentally disabled in the GA version.
-* Two predefined streaming policies have been renamed. **SecureStreaming** is now **MultiDrmCencStreaming**. **SecureStreamingWithFairPlay** is now **Predefined_MultiDrmStreaming**.
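For example, a `$orderby` query against the Jobs collection might look like the following sketch, assuming hypothetical subscription, resource group, account, and transform names, and using `az rest` to handle authentication:

```azurecli
# Hypothetical IDs and names; %20 encodes the space in "created desc".
az rest --method get \
  --url "https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Media/mediaServices/<account>/transforms/<transform>/jobs?api-version=2018-07-01&\$orderby=properties/created%20desc"
```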
-
-## November 2018
-
-The CLI 2.0 module is now available for [Azure Media Services v3 GA](/cli/azure/ams) – v2.0.50.
-
-### New commands
-- [az ams account](/cli/azure/ams/account)
-- [az ams account-filter](/cli/azure/ams/account-filter)
-- [az ams asset](/cli/azure/ams/asset)
-- [az ams asset-filter](/cli/azure/ams/asset-filter)
-- [az ams content-key-policy](/cli/azure/ams/content-key-policy)
-- [az ams job](/cli/azure/ams/job)
-- [az ams live-event](/cli/azure/ams/live-event)
-- [az ams live-output](/cli/azure/ams/live-output)
-- [az ams streaming-endpoint](/cli/azure/ams/streaming-endpoint)
-- [az ams streaming-locator](/cli/azure/ams/streaming-locator)
-- [az ams account mru](/cli/azure/ams/account/mru) - enables you to manage Media Reserved Units. For more information, see [Scale Media Reserved Units](media-reserved-units-how-to.md).
-
-### New features and breaking changes
-
-#### Asset commands
-- ```--storage-account``` and ```--container``` arguments added.
-- Default values for expiry time (Now+23h) and permissions (Read) added in the ```az ams asset get-sas-url``` command (see the sketch after this list).
-
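A sketch of those defaults, assuming hypothetical account and asset names (parameter spellings per current CLI docs); with no `--expiry` or `--permissions` arguments, the command should return a read-only SAS URL valid for roughly 23 hours:

```azurecli
# Hypothetical names; returns a SAS URL with Read permissions and a Now+23h expiry by default.
az ams asset get-sas-url \
  --account-name myamsaccount \
  --resource-group myrg \
  --asset-name myasset
```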
-#### Job commands
-- ```--correlation-data``` and ```--label``` arguments added.
-- ```--output-asset-names``` renamed to ```--output-assets```. It now accepts a space-separated list of assets in 'assetName=label' format. An asset without a label can be sent like this: 'assetName=' (see the example after this list).
-
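A sketch of the new format, with hypothetical transform, job, and asset names; the output asset here is passed without a label:

```azurecli
# Hypothetical names; 'myoutputasset=' sends the output asset with no label.
az ams job start \
  --account-name myamsaccount \
  --resource-group myrg \
  --transform-name mytransform \
  --name myjob \
  --input-asset-name myinputasset \
  --output-assets myoutputasset=
```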
-#### Streaming Locator commands
-- ```az ams streaming locator``` base command replaced with ```az ams streaming-locator```.
-- ```--streaming-locator-id``` and ```--alternative-media-id``` arguments added.
-- ```--content-keys``` argument updated.
-- ```--content-policy-name``` renamed to ```--content-key-policy-name```.
-
-#### Streaming Policy commands
-- ```az ams streaming policy``` base command replaced with ```az ams streaming-policy```.
-- Encryption parameters support added in ```az ams streaming-policy create```.
-
-#### Transform commands
-- ```--preset-names``` argument replaced with ```--preset```. Now you can only set one output/preset at a time (to add more, run ```az ams transform output add```). You can also set a custom StandardEncoderPreset by passing the path to your custom JSON (see the sketch after this list).
-- ```az ams transform output remove``` can be performed by passing the output index to remove.
-- ```--relative-priority```, ```--on-error```, ```--audio-language```, and ```--insights-to-extract``` arguments added in the ```az ams transform create``` and ```az ams transform output add``` commands.
-
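A sketch under these assumptions (hypothetical account and transform names, and a hypothetical local custom preset file):

```azurecli
# Create the transform with one built-in preset, then append a second output
# from a hypothetical custom StandardEncoderPreset JSON file.
az ams transform create \
  --account-name myamsaccount \
  --resource-group myrg \
  --name mytransform \
  --preset AdaptiveStreaming

az ams transform output add \
  --account-name myamsaccount \
  --resource-group myrg \
  --name mytransform \
  --preset ./myCustomStandardEncoderPreset.json
```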
-## October 2018 - GA
-
-This section describes Azure Media Services (AMS) October updates.
-
-### REST v3 GA release
-
-The [REST v3 GA release](https://github.com/Azure/azure-rest-api-specs/tree/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2018-07-01) includes more APIs for Live, Account/Asset level manifest filters, and DRM support.
-
-#### Azure Resource Management
-
-Support for Azure Resource Management enables a unified management and operations API (now everything is in one place).
-
-Starting with this release, you can use Resource Manager templates to create Live Events.
-
-#### Improvement of Asset operations
-
-The following improvements were introduced:
-- Ingest from HTTP(S) URLs or Azure Blob Storage SAS URLs.
-- Specify your own container names for Assets.
-- Easier output support to create custom workflows with Azure Functions.
-
-#### New Transform object
-
-The new **Transform** object simplifies the Encoding model. The new object makes it easy to create and share encoding Resource Manager templates and presets.
-
-#### Azure Active Directory authentication and Azure RBAC
-
-Azure AD Authentication and Azure role-based access control (Azure RBAC) enable you to secure Transforms, LiveEvents, Content Key Policies, or Assets by role or user in Azure AD.
-
-#### Client SDKs
-
-Languages supported in Media Services v3: .NET Core, Java, Node.js, Ruby, TypeScript, Python, Go.
-
-#### Live encoding updates
-
-The following live encoding updates are introduced:
-- New low latency mode for live (10 seconds end-to-end).
-- Improved RTMP support (increased stability and more source encoder support).
-- RTMPS secure ingest.
-
 When you create a Live Event, you now get four ingest URLs. The four ingest URLs are almost identical: they have the same streaming token (AppId), and only the port number differs. Two of the URLs are primary and backup for RTMPS.
-- 24-hour transcoding support.
-- Improved ad-signaling support in RTMP via SCTE35.
-
-#### Improved Event Grid support
-
-You can see the following Event Grid support improvements:
-- Azure Event Grid integration for easier development with Logic Apps and Azure Functions.
-- Subscribe for events on Encoding, Live Channels, and more.
-
-### CMAF support
-
-CMAF and 'cbcs' encryption support for Apple HLS (iOS 11+) and MPEG-DASH players that support CMAF.
-
-### Video Indexer
-
-Video Indexer GA release was announced in August. For new information about currently supported features, see [What is Video Indexer](../../azure-video-analyzer/video-analyzer-for-media-docs/video-indexer-overview.md?bc=%2fazure%2fmedia-services%2fvideo-indexer%2fbreadcrumb%2ftoc.json&toc=%2fazure%2fmedia-services%2fvideo-indexer%2ftoc.json).
-
-### Plans for changes
-
-#### Azure CLI 2.0
-
-The Azure CLI 2.0 module that includes operations on all features (including Live, Content Key Policies, Account/Asset Filters, Streaming Policies) is coming soon.
-
-### Known issues
-
-Only customers that used the preview API for Asset or AccountFilters are impacted by the following issue.
-
-If you created Assets or Account Filters between 09/28 and 10/12 with Media Services v3 CLI or APIs, you need to remove all Asset and AccountFilters and re-create them due to a version conflict.
-
-## May 2018 - Preview
-
-### .NET SDK
-
-The following features are present in the .NET SDK:
-
-* **Transforms** and **Jobs** to encode or analyze media content. For examples, see [Stream files](stream-files-tutorial-with-api.md) and [Analyze](analyze-videos-tutorial.md).
-* **Streaming Locators** for publishing and streaming content to end-user devices
-* **Streaming Policies** and **Content Key Policies** to configure key delivery and content protection (DRM) when delivering content.
-* **Live Events** and **Live Outputs** to configure the ingest and archiving of live streaming content.
-* **Assets** to store and publish media content in Azure Storage.
-* **Streaming Endpoints** to configure and scale dynamic packaging, encryption, and streaming for both live and on-demand media content.
-
-### Known issues
-
-* When submitting a job, you can specify to ingest your source video using HTTPS URLs, SAS URLs, or paths to files located in Azure Blob storage. Currently, Media Services v3 does not support chunked transfer encoding over HTTPS URLs.
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## See also
-
-[Migration guidance for moving from Media Services v2 to v3](migrate-v-2-v-3-migration-introduction.md).
-
-## Next steps
-- [Overview](media-services-overview.md)
-- [Media Services v2 release notes](../previous/media-services-release-notes.md)
media-services Samples Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/samples-overview.md
- Title: Media Services v3 samples
-description: This article contains a list of all the samples available for Media Services v3 organized by method and SDK. Samples include .NET, Node.JS, Python, and Java, also REST with Postman.
----- Previously updated : 01/14/2022---
-# Media Services v3 samples
--
-This article contains a list of all the samples available for Media Services organized by method and SDK. Samples include .NET, Node.js (TypeScript), Python, Java, and also examples using REST with Postman.
-
-## Samples by SDK
-
-You'll find descriptions of, and links to, the samples you may be looking for in each of the tabs.
-
-## [Node.JS (TypeScript)](#tab/node/)
-
-|Sample|Description|
-|||
-|[Create an account from code](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/Account/create-account.ts)|The sample shows how to create a Media Services account and set the primary storage account, in addition to advanced configuration settings including Key Delivery IP allowlist, Managed Identity, storage auth, and bring your own encryption key.|
-|[Create an account with user assigned managed identity code](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/Account/create-account_with_managed_identity.ts)|The sample shows how to create a Media Services account and set the primary storage account, in addition to advanced configuration settings including Key Delivery IP allowlist, user or system assigned Managed Identity, storage auth, and bring your own encryption key.|
-|[Hello World - list assets](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/HelloWorld-ListAssets/list-assets.ts)|Basic example of how to connect and list assets |
-|[Live streaming with Standard Passthrough](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/Live/Standard_Passthrough_Live_Event/index.ts)| Standard passthrough live streaming example. **WARNING**: when using live events, make sure all resources are cleaned up and no longer billing in the portal.|
-|[Live streaming with Standard Passthrough with Event Hubs](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/Live/Standard_Passthrough_Live_Event_with_EventHub/index.ts)| Demonstrates how to use Event Hubs to subscribe to events on the live streaming channel. Events include encoder connections, disconnections, heartbeat, latency, discontinuity, and drift issues. **WARNING**: when using live events, make sure all resources are cleaned up and no longer billing in the portal.|
-|[Live streaming with Basic Passthrough](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/Live/Basic_Passthrough_Live_Event/index.ts)| Shows how to set up the basic passthrough live event if you only need to broadcast a low-cost UGC channel. **WARNING**: when using live events, make sure all resources are cleaned up and no longer billing in the portal.|
-|[Live streaming with 720P Standard encoding](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/Live/720P_Encoding_Live_Event/index.ts)| Use live encoding in the cloud with the 720P HD adaptive bitrate encoding preset. **WARNING**: when using live events, make sure all resources are cleaned up and no longer billing in the portal.|
-|[Live streaming with 1080P encoding](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/Live/720P_Encoding_Live_Event/index.ts)| Use live encoding in the cloud with the 1080P HD adaptive bitrate encoding preset. **WARNING**: when using live events, make sure all resources are cleaned up and no longer billing in the portal.|
-|[Upload and stream HLS and DASH](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/StreamFilesSample/index.ts)| Basic example for uploading a local file or encoding from a source URL. Sample shows how to use storage SDK to download content, and shows how to stream to a player |
-|[Upload and stream HLS and DASH with PlayReady and Widevine DRM](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/StreamFilesWithDRMSample/index.ts)| Demonstrates how to encode and stream using Widevine and PlayReady DRM |
-|[Upload and use AI to index videos and audio](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoAnalytics/index.ts)| Example of using the Video and Audio Analyzer presets to generate metadata and insights from a video or audio file |
-|[Create Transform, use Job preset overrides (v2-to-v3 API migration)](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/CreateTransform_Job_PresetOverride/index.ts)| If you want to submit custom preset jobs to a single queue, this base sample shows how to create a (mostly) empty Transform and then use the preset override property on the Job to submit custom presets to the same transform. This lets you treat the v3 AMS API much like the legacy v2 API job queue, if you desire.|
-|[Basic Encoding with H264](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_H264/index.ts)| Shows how to use the standard encoder to encode a source file into H264 format with AAC audio and PNG thumbnails |
-|[Basic Encoding with H264 with Event Hubs/Event Grid](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_H264%20_with_EventHub/index.ts)| Shows how to use the standard encoder and receive and process Event Grid events from Media Services through an Event Hub. To use this sample, first set up an Event Grid subscription that pushes events into an Event Hub using the Azure portal or CLI.|
-|[Sprite Thumbnail (VTT) in JPG format](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_Sprite_Thumbnail/index.ts)| Shows how to generate a VTT Sprite Thumbnail in JPG format and how to set the columns and number of images. This also shows a speed encoding mode in H264 for a 720P layer. |
-|[Content Aware encoding with H264](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_H264_ContentAware/index.ts)| Example of using the standard encoder with Content Aware encoding to automatically generate the best quality adaptive bitrate streaming set based on an analysis of the source file's contents|
-|[Content Aware encoding Constrained with H264](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_H264_ContentAware_Constrained/index.ts)| Demonstrates how to control the output settings of the Content Aware encoding preset to make the outputs more deterministic to your encoding needs and costs. This still auto-generates the best quality adaptive bitrate streaming set based on an analysis of the source file's contents, but constrains the output to your desired ranges.|
-|[Overlay Image](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_H264_OverlayImage/index.ts)| Shows how to upload an image file and overlay on top of video with output to MP4 container|
-|[Rotate Video](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_H264_Rotate90degrees/index.ts)| Shows how to use the rotation filter to rotate a video by 90 degrees. |
-|[Output to Transport Stream format](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_H264_To_TransportStream/index.ts)| Shows how to use the standard encoder to encode a source file and output to MPEG Transport Stream format using H264 format with AAC audio and PNG thumbnail|
-|[Basic Encoding with HEVC](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_HEVC/index.ts)| Shows how to use the standard encoder to encode a source file into HEVC format with AAC audio and PNG thumbnails |
-|[Content Aware encoding with HEVC](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_HEVC_ContentAware/index.ts)| Example of using the standard encoder with Content Aware encoding to automatically generate the best quality HEVC (H.265) adaptive bitrate streaming set based on an analysis of the source file's contents|
-|[Content Aware encoding Constrained with HEVC](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_HEVC_ContentAware_Constrained/index.ts)| Demonstrates how to control the output settings of the Content Aware encoding preset to make the outputs more deterministic to your encoding needs and costs. This still auto-generates the best quality adaptive bitrate streaming set based on an analysis of the source file's contents, but constrains the output to your desired ranges.|
-|[Bulk encoding from a remote Azure storage account using SAS URLs](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoEncoding/Encoding_Bulk_Remote_Storage_Account_SAS/index.ts)| This sample shows how you can point to a remote Azure Storage account using a SAS URL and submit batches of encoding jobs to your account, monitor progress, and continue. You can modify the file extension types to scan for (for example, .mp4 or .mov) and control the batch size submitted. You can also modify the Transform used in the batch operation. This sample demonstrates the use of SAS URLs as ingest sources to a Job input. Make sure to configure the REMOTESTORAGEACCOUNTSAS environment variable in the .env file for this sample to work.|
-| [Video Analytics](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/VideoAnalytics/index.ts)|This sample illustrates how to create a video and audio analyzer transform, upload a video file to an input asset, submit a job with the transform and download the results for verification.|
-| [Audio Analytics basic with per-job language override](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/AudioAnalytics/index.ts)|This sample illustrates how to create an audio analyzer transform using the basic mode. It also shows how you can override the preset language on a per-job basis to avoid creating a transform for every language. It also shows how to upload a media file to an input asset, submit a job with the transform, and download the results for verification.|
-
-## [.NET](#tab/net/)
-
-| Sample | Description |
-|-|-|
-| [Account/CreateAccount](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Account/CreateAccount)|The sample shows how to create a Media Services account and set the primary storage account, in addition to advanced configuration settings including Key Delivery IP allowlist, Managed Identity, storage auth, and bring your own encryption key.|
-| [VideoEncoding/Encoding_PredefinedPreset](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_PredefinedPreset)|The sample shows how to submit a job using a built-in preset and an HTTP URL input, publish output asset for streaming, and download results for verification.|
-| [VideoEncoding/Encoding_H264_ContentAware](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_H264_ContentAware) | Demonstrates the most basic use of H.264 content-aware encoding without any constraints |
-| [VideoEncoding/Encoding_H264_ContentAware_Constrained](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_H264_ContentAware_Constrained) | Demonstrates how to use the PresetConfigurations class to constrain the output behavior of the preset|
-| [VideoEncoding/Encoding_H264](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_H264)|The sample shows how to submit a job using a custom H.264 encoding preset and an HTTP URL input, publish output asset for streaming, and download results for verification.|
-| [VideoEncoding/Encoding_HEVC_ContentAware](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_HEVC_ContentAware) | Shows basic usage of the HEVC codec with content-aware encoding and no constraints. The PresetConfigurations class is also supported for HEVC and can be added to this sample|
-| [VideoEncoding/Encoding_HEVC](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_HEVC)|The sample shows how to submit a job using a custom HEVC encoding preset and an HTTP URL input, publish output asset for streaming, and download results for verification.|
-| [VideoEncoding/Encoding_StitchTwoAssets](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_StitchTwoAssets)|The sample shows how to submit a job using the JobInputSequence to stitch together two or more assets that may be clipped by start or end time. The resulting encoded file is a single video with all assets stitched together. The sample will also publish output asset for streaming and download results for verification.|
-| [VideoEncoding/Encoding_SpriteThumbnail](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_SpriteThumbnail)|The sample shows how to submit a job using a custom preset with a thumbnail sprite and an HTTP URL input, publish output asset for streaming, and download results for verification.|
-| [Live/LiveEventWithDVR](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Live/LiveEventWithDVR)|This sample first shows how to create a LiveEvent with a full archive up to 25 hours and a filter on the asset with 5-minutes DVR window, then it shows how to use the filter to create a locator for streaming.|
-| [VideoAnalytics/VideoAnalyzer](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoAnalytics/VideoAnalyzer)|This sample illustrates how to create a video analyzer transform, upload a video file to an input asset, submit a job with the transform, and download the results for verification.|
-| [AudioAnalytics/AudioAnalyzer](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/AudioAnalytics/AudioAnalyzer)|This sample illustrates how to create an audio analyzer transform, upload a media file to an input asset, submit a job with the transform, and download the results for verification.|
-| [ContentProtection/BasicAESClearKey](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/ContentProtection/BasicAESClearKey)|This sample demonstrates how to create a transform with built-in AdaptiveStreaming preset, submit a job, create a ContentKeyPolicy using a secret key, associate the ContentKeyPolicy with StreamingLocator, get a token, and print a url for playback in Azure Media Player. When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content with AES-128 and Azure Media Player uses the token to decrypt.|
-| [ContentProtection/BasicWidevine](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/ContentProtection/BasicWidevine)|This sample demonstrates how to create a transform with built-in AdaptiveStreaming preset, submit a job, create a ContentKeyPolicy with Widevine configuration using a secret key, associate the ContentKeyPolicy with StreamingLocator, get a token, and print a url for playback in a Widevine Player. When a user requests Widevine-protected content, the player application requests a license from the Media Services license service. If the player application is authorized, the Media Services license service issues a license to the player. A Widevine license contains the decryption key that can be used by the client player to decrypt and stream the content.|
-| [ContentProtection/BasicPlayReady](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/ContentProtection/BasicPlayReady)|This sample demonstrates how to create a transform with built-in AdaptiveStreaming preset, submit a job, create a ContentKeyPolicy with PlayReady configuration using a secret key, associate the ContentKeyPolicy with StreamingLocator, get a token, and print a url for playback in an Azure Media Player. When a user requests PlayReady-protected content, the player application requests a license from the Media Services license service. If the player application is authorized, the Media Services license service issues a license to the player. A PlayReady license contains the decryption key that can be used by the client player to decrypt and stream the content.|
-| [ContentProtection/OfflinePlayReadyAndWidevine](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/ContentProtection/OfflinePlayReadyAndWidevine)|This sample demonstrates how to dynamically encrypt your content with PlayReady and Widevine DRM and play the content without requesting a license from license service. It shows how to create a transform with built-in AdaptiveStreaming preset, submit a job, create a ContentKeyPolicy with open restriction and PlayReady/Widevine persistent configuration, associate the ContentKeyPolicy with a StreamingLocator and print a url for playback.|
-| [Streaming/AssetFilters](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Streaming/AssetFilters)|This sample demonstrates how to create a transform with built-in AdaptiveStreaming preset, submit a job, create an asset-filter and an account-filter, associate the filters to streaming locators and print urls for playback.|
-| [Streaming/StreamHLSAndDASH](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Streaming/StreamHLSAndDASH)|This sample demonstrates how to create a transform with built-in AdaptiveStreaming preset, submit a job, publish output asset for HLS and DASH streaming.|
-| [HighAvailabilityEncodingStreaming](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/HighAvailabilityEncodingStreaming/) | This sample provides guidance and best practices for a production system using on-demand encoding or analytics. Readers should start with the companion article [High Availability with Media Services and VOD](architecture-high-availability-encoding-concept.md). There is a separate solution file provided for the [HighAvailabilityEncodingStreaming](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/HighAvailabilityEncodingStreaming/README.md) sample. |
-| [Azure Functions for Media Services](https://github.com/xpouyat/media-services-v3-dotnet-core-functions-integration/tree/main/Functions)|This project contains examples of Azure Functions that connect to Azure Media Services v3 for video processing. You can use Visual Studio 2019 or Visual Studio Code to develop and run the functions. An Azure Resource Manager (ARM) template and a GitHub Actions workflow are provided for the deployment of the Function resources and to enable continuous deployment.|
-
-## [Python](#tab/python)
-
-|Sample|Description|
-|||
-|[Basic Encoding with Python](https://github.com/Azure-Samples/media-services-v3-python)| Basic example for uploading a local file or encoding from a source URL. Sample shows how to use storage SDK to download content, and shows how to stream to a player |
-|[Face Redaction using events and functions](https://github.com/Azure-Samples/media-services-v3-python/tree/main/VideoAnalytics/FaceRedactorEventBased)| This is an example of an event-based approach that triggers an Azure Media Services Face Redactor job on a video as soon as it lands on an Azure Storage Account. It uses Azure Media Services, Azure Functions, Event Grid, and Azure Storage for the solution. For the full description of the solution, see the [README.md](https://github.com/Azure-Samples/media-services-v3-python/blob/main/VideoAnalytics/FaceRedactorEventBased/README.md)|
--
-## [Java](#tab/java)
-
-|Sample|Description|
-|||
-|[AudioAnalytics/AudioAnalyzer/](https://github.com/Azure-Samples/media-services-v3-java/tree/master/AudioAnalytics/AudioAnalyzer)|How to analyze audio in a media file. |
-|ContentProtection||
-|[BasicAESClearKey](https://github.com/Azure-Samples/media-services-v3-java/tree/master/ContentProtection/BasicAESClearKey)|How to dynamically encrypt your content with AES-128.|
-|[BasicPlayReady](https://github.com/Azure-Samples/media-services-v3-java/tree/master/ContentProtection/BasicPlayReady)|How to dynamically encrypt your content with PlayReady DRM.|
-|[BasicWidevine](https://github.com/Azure-Samples/media-services-v3-java/tree/master/ContentProtection/BasicWidevine)|How to dynamically encrypt your content with Widevine DRM.|
-|[OfflineFairPlay](https://github.com/Azure-Samples/media-services-v3-java/tree/master/ContentProtection/OfflineFairPlay)|How to dynamically encrypt your content with FairPlay DRM and play the content without requesting a license from license service.|
-|[OfflinePlayReadyAndWidevine](https://github.com/Azure-Samples/media-services-v3-java/tree/master/ContentProtection/OfflinePlayReadyAndWidevine)|How to dynamically encrypt your content with PlayReady and Widevine DRM and play the content without requesting a license from license service.|
-|DynamicPackagingVODContent||
-|[AssetFilters](https://github.com/Azure-Samples/media-services-v3-java/tree/master/DynamicPackagingVODContent/AssetFilters)|How to filter content using asset and account filters.|
-|[StreamHLSAndDASH](https://github.com/Azure-Samples/media-services-v3-java/tree/master/DynamicPackagingVODContent/StreamHLSAndDASH)|How to dynamically package VOD content into HLS/DASH for streaming.|
-|[LiveIngest/LiveEventWithDVR](https://github.com/Azure-Samples/media-services-v3-java/tree/master/LiveIngest/LiveEventWithDVR)|How to create a Live Event with a full archive up to 25 hours and a filter on the asset with 5-minutes DVR window and how to create a locator for streaming and use the filter.|
-|[VideoAnalytics/VideoAnalyzer](https://github.com/Azure-Samples/media-services-v3-java/tree/master/VideoAnalytics/VideoAnalyzer)|How to analyze video in a media file with a video analyzer transform.|
-|VideoEncoding||
-|[EncodingWithMESCustomPreset](https://github.com/Azure-Samples/media-services-v3-java/tree/master/VideoEncoding/EncodingWithMESCustomPreset)|How to create a custom encoding Transform using the StandardEncoderPreset settings.|
-|[EncodingWithMESPredefinedPreset](https://github.com/Azure-Samples/media-services-v3-java/tree/master/VideoEncoding/EncodingWithMESPredefinedPreset)|How to submit a job using a built-in preset and an HTTP URL input, publish output asset for streaming, and download results for verification.|
---
-## REST Postman collection
-
-The [REST Postman](https://github.com/Azure-Samples/media-services-v3-rest-postman) samples include a Postman environment and collection for you to import into the Postman client. The Postman collection samples are recommended for getting familiar with the API structure and how it works with Azure Resource Management (ARM), and the structure of calls from the client SDKs.
-
media-services Security Access Storage Managed Identity Cli Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-access-storage-managed-identity-cli-tutorial.md
- Title: Access storage with a Media Services Managed Identity
-description: If you would like to access a storage account when the storage account is configured to block requests from unknown IP addresses, the Media Services account must be granted access to the Storage account. Follow the steps below to create a Managed Identity for the Media Services account and grant this identity access to storage using the Media Services CLI.
----- Previously updated : 05/17/2021---
-# Tutorial: Access storage with a Media Services Managed Identity
---
-If you would like to access a storage account when the storage account is configured to block requests from unknown IP addresses, the Media Services account must be granted access to the Storage account. Follow the steps below to create a Managed Identity for the Media Services account and grant this identity access to storage using the Media Services CLI.
--
-This tutorial uses the 2020-05-01 Media Services API.
-
-## Sign in to Azure
-
-To use any of the commands in this article, you first have to be signed in to the subscription that you want to use.
-
- [!INCLUDE [Sign in to Azure with the CLI](./includes/task-sign-in-azure-cli.md)]
-
-### Set subscription
-
-Use this command to set the subscription that you want to work with.
--
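A minimal sketch, assuming you're already signed in with `az login` and substituting your own subscription name or ID:

```azurecli
# Make the target subscription the default for subsequent commands.
az account set --subscription "<your-subscription-name-or-id>"
```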
-## Resource names
-
-Before you get started, decide on the names of the resources you'll create. They should be easily identifiable as a set, especially if you are not planning to use them after you are done testing. Naming rules are different for many resource types so it's best to stick with all lower case. For example, "mediatest1rg" for your resource group name and "mediatest1stor" for your storage account name. Use the same names for each step in this article.
-
-You'll see these names referenced in the commands below. The names of resources you'll need are:
-- myRG
-- myStorageAccount
-- myAmsAccount
-- location
-
-> [!NOTE]
-> The hyphens above are only used to separate guidance words. Because of the inconsistency of naming resources in Azure services, don't use hyphens when you name your resources.
-> Also, you don't create the region name. The region name is determined by Azure.
-
-### List Azure regions
-
-If you're not sure of the actual region name to use, use this command to get a listing:
--
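A sketch of one way to list the region names:

```azurecli
# Print the programmatic region names (for example, westus2) in a table.
az account list-locations --query "[].name" --output table
```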
-## Sequence
-
-Each of the steps below is done in a particular order because one or more values from the JSON responses are used in the next step in the sequence.
-
-## Create a Storage account
-
-The Media Services account you'll create must have a storage account associated with it. Create the storage account for the Media Services account first. You'll use the storage account name in place of `myStorageAccount` in subsequent steps.
--
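A sketch using the example names suggested earlier ("mediatest1rg", "mediatest1stor"); the SKU here is an assumption:

```azurecli
# Create the resource group, then the storage account the Media Services account will use.
az group create --name mediatest1rg --location westus2

az storage account create \
  --name mediatest1stor \
  --resource-group mediatest1rg \
  --location westus2 \
  --sku Standard_LRS
```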
-## Create a Media Services account with a Service Principal (Managed Identity)
-
-Now create the Media Services account with a Service Principal, otherwise known as a Managed Identity.
-
-> [!IMPORTANT]
-> It is important that you remember to use the --mi flag in the command. Otherwise you will not be able to find the `principalId` for a later step.
--
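A sketch with a hypothetical account name ("mediatest1ams"); the article calls this the --mi flag, and recent CLI versions spell it `--mi-system-assigned`:

```azurecli
# Create the Media Services account with a system-assigned managed identity.
az ams account create \
  --name mediatest1ams \
  --resource-group mediatest1rg \
  --storage-account mediatest1stor \
  --location westus2 \
  --mi-system-assigned
```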
-## Grant the Media Services Managed Identity access to the Storage account
-
-Grant the Media Services Managed Identity access to the Storage account. There are three commands:
-
-### Get (show) the Managed Identity of the Media Services account
-
-The first command below shows the Managed Identity of the Media Services account which is the `principalId` listed in the JSON returned by the command.
--
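A sketch that pulls just the `principalId` out of the JSON, reusing the hypothetical names from the previous steps:

```azurecli
# The managed identity's principalId is nested under the identity property.
az ams account show \
  --name mediatest1ams \
  --resource-group mediatest1rg \
  --query identity.principalId \
  --output tsv
```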
-### Create the Storage Blob Contributor role assignment
--
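A sketch, assuming the built-in role intended here is "Storage Blob Data Contributor" and reusing the same hypothetical names:

```azurecli
# Capture the identity and storage account IDs, then create the role assignment.
principalId=$(az ams account show -n mediatest1ams -g mediatest1rg --query identity.principalId -o tsv)
storageId=$(az storage account show -n mediatest1stor -g mediatest1rg --query id -o tsv)

az role assignment create \
  --assignee "$principalId" \
  --role "Storage Blob Data Contributor" \
  --scope "$storageId"
```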
-### Create the Reader role assignment
--
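And a matching sketch for the Reader assignment, reusing the variables from the previous step:

```azurecli
# Grant the same managed identity read access to the storage account resource.
az role assignment create \
  --assignee "$principalId" \
  --role "Reader" \
  --scope "$storageId"
```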
-## Use the Managed Identity to access the Storage account
--
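A sketch of switching the account over to managed-identity storage authentication; the command name and values follow current CLI docs:

```azurecli
# Tell Media Services to access the storage account with its managed identity
# instead of storage account keys.
az ams account storage set-authentication \
  --account-name mediatest1ams \
  --resource-group mediatest1rg \
  --storage-auth ManagedIdentity
```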
-## Validation
-
-To verify that the account is configured to use its Managed Identity to access storage, view the account's storage authentication properties:
--
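A sketch of the check, reusing the hypothetical names from earlier:

```azurecli
# Query just the storage authentication property of the account.
az ams account show \
  --name mediatest1ams \
  --resource-group mediatest1rg \
  --query storageAuthentication
```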
-The `storageAuthentication` property should show "ManagedIdentity".
-
-For additional validation, you can check the Azure Storage logs to see which authentication method is used for each request.
-
-## Clean up resources
-
-If you aren't planning to use the resources you created, delete the resource group.
-
media-services Security Azure Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-azure-policy.md
- Title: Azure Policy built-in support in Media Services
-description: This article discusses Azure Policy built-in support for Azure Media Services scenarios.
------ Previously updated : 08/31/2020----
-# Azure Policy for Media Services
--
-Azure Media Services provides several built-in [Azure Policy](../../governance/policy/overview.md) definitions to help enforce organizational standards and compliance at-scale.
-Common use cases for Azure Policy include implementing governance for resource consistency, regulatory compliance, security, cost, and management.
-
-Media Services provides several built-in common use case definitions for Azure Policy to help you get started.
-
-## Built-in Azure Policy definitions for Media Services
-
-Several built-in policy use case definitions are available for use with Media Services to help you get started, and you can also define your own custom policies.
--
-The [list of built-in policy definitions for Media Services](../../governance/policy/samples/built-in-policies.md#media-services) provides the latest definitions, links to the code definitions, and how to access them in the portal.
-
-## Common scenarios that require Azure Policy
-
-* If your enterprise security requires you to ensure that all Media Services accounts are created with Private Links, you can use a policy definition to ensure that accounts are only created with the 2020-05-01 API (or later) to disable access to the legacy REST v2 API and access the Private Link feature.
-* If you want to enforce specific options on the tokens used for Content Key Policies, an Azure Policy definition can be constructed to support the specific requirements.
-* If your security goals require you to restrict a Job input source to only come from your trusted storage accounts, and restrict access to external HTTP(S) inputs through the use of JobInputHttp, an Azure policy can be constructed to limit the input URI pattern.
-
-## Example policy definitions
-
-Azure Media Services maintains and publishes a set of sample Azure Policy definitions on GitHub.
-See the [built-in policy definitions for Media Services](https://github.com/Azure/azure-policy/tree/master/built-in-policies/policyDefinitions/Media%20Services) samples in the azure-policy GitHub repository.
-
-See the following articles for more information:
-- [What is Azure Policy](../../governance/policy/overview.md)
-- [Quickstart: Create a policy in the Portal](../../governance/policy/assign-policy-portal.md)
-- [List of built-in policy definitions for Media Services](../../governance/policy/samples/built-in-policies.md#media-services)
-
-## Next steps
-- [Developing with Media Services v3 APIs](media-services-apis-overview.md)
-- [Role based access control in Media Services](security-rbac-concept.md)
-- [Private link how-to with Media Services](security-private-link-how-to.md)
media-services Security Customer Managed Keys Portal Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-customer-managed-keys-portal-tutorial.md
- Title: Use customer-managed keys or BYOK portal
-description: In this tutorial, use the Azure portal to enable customer-managed keys or bring your own key (BYOK) with an Azure Media Services storage account.
---- Previously updated : 10/18/2020--
-# Tutorial: Use the Azure portal to use customer-managed keys or BYOK with Media Services
-
-With the 2020-05-01 or later version of the API, you can use a customer-managed RSA key with an Azure Media Services account that has a system-managed identity. This tutorial covers the steps in the Azure portal.
-
-The services used are:
-- Azure Storage
-- Azure Key Vault
-- Azure Media Services
-
-In this tutorial, you'll learn to use the Azure portal to:
-
-> [!div class="checklist"]
-> - Create a resource group.
-> - Create a storage account with a system-managed identity.
-> - Create a Media Services account with a system-managed identity.
-> - Create a key vault for storing a customer-managed RSA key.
-
-## Prerequisites
-
-An Azure subscription.
-
-If you don't have an Azure subscription, [create a free trial account](https://azure.microsoft.com/free/).
-
-## System-managed keys
-
-<!-- Create a resource group -->
-
-> [!IMPORTANT]
-> For the following storage account creation steps, you will select the system-managed key choice in Advanced settings.
-
-<!-- Create a media services account -->
--
-<!-- Create a key vault -->
--
-<!-- Enable CMK BYOK on the account -->
-
-> [!IMPORTANT]
-> For the following storage encryption steps, you will select the **customer-managed key choice**.
-
-<!-- Set encryption for storage account -->
-
-## Change the key
-
-Media Services automatically detects when the key is changed. OPTIONAL: To test this process, create another key version for the same key. Media Services should detect that the key has been changed.
-
-## Clean up resources
-
-If you're not going to continue to use the resources that you created and *you don't want to continue to be billed*, delete them.
-
-## Next steps
-
-Go to the next article to learn how to:
-> [!div class="nextstepaction"]
-> [Encode a remote file based on URL and stream the video with REST](stream-files-tutorial-with-rest.md)
media-services Security Customer Managed Keys Rest Postman Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-customer-managed-keys-rest-postman-tutorial.md
- Title: Use customer-managed keys or BYOK REST API
-description: In this tutorial, use customer-managed keys or bring your own key (BYOK) with an Azure Media Services storage account.
---- Previously updated : 10/18/2020--
-# Tutorial: Use customer-managed keys or BYOK with Media Services REST API
-
-With the 2020-05-01 API, you can use a customer-managed RSA key with an Azure Media Services account that has a system-managed identity. This tutorial includes a Postman collection and environment to send REST requests to Azure services. The services used are:
-- Azure Active Directory (Azure AD) application registration for Postman
-- Microsoft Graph API
-- Azure Storage
-- Azure Key Vault
-- Azure Media Services
-
-In this tutorial, you'll learn to use Postman to:
-
-> [!div class="checklist"]
-> - Get tokens for use with Azure services.
-> - Create a resource group and a storage account.
-> - Create a Media Services account with a system-managed identity.
-> - Create a key vault for storing a customer-managed RSA key.
-> - Update the Media Services account to use the RSA key with the storage account.
-> - Use variables in Postman.
-
-If you don't have an Azure subscription, [create a free trial account](https://azure.microsoft.com/free/).
-
-## Prerequisites
-
-1. Register a service principal with the appropriate permissions.
-1. Install [Postman](https://www.postman.com).
-1. Download the Postman collection for this tutorial at [Azure Samples: media-services-customer-managed-keys-byok](https://github.com/Azure-Samples/media-services-customer-managed-keys-byok).
-
-### Register a service principal with the needed permissions
-
-1. [Create a service principal](../../active-directory/develop/howto-create-service-principal-portal.md).
-1. Go to [Option 2: Create a new application secret](../../active-directory/develop/howto-create-service-principal-portal.md#authentication-two-options) to get the service principal secret.
-
- > [!IMPORTANT]
- >Copy and save the secret for later use. You can't access the secret after you leave the secret page in the portal.
-
-1. Assign permissions to the service principal, as shown in the following screenshot:
-
- :::image type="complex" source="./media/tutorial-byok/service-principal-permissions-1.png" alt-text="Screenshot showing the permissions needed for the service principal.":::
- Permissions are listed by service, permission name, type, and then description. Azure Key Vault: user impersonation, delegated, full access to Azure Key Vault. Azure Service Management: user impersonation, delegated, access Azure Service Management as organization user. Azure Storage: user impersonation, delegated, access Azure Storage. Media
- :::image-end:::
-
-### Install Postman
-
-If you haven't already installed Postman for use with Azure, you can get it at [postman.com](https://www.postman.com/).
-
-### Download and import the collection
-
-Download the Postman collection for this tutorial at [Azure Samples: media-services-customer-managed-keys-byok](https://github.com/Azure-Samples/media-services-customer-managed-keys-byok).
-
-## Install the Postman collection and environment
-
-1. Run Postman.
-1. Select **Import**.
-1. Select **Upload files**.
-1. Go to where you saved the collection and environment files.
-1. Select the collection and environment files.
-1. Select **Open**. A warning appears that says the files won't be imported as an API, but as collections. This warning is fine. It's what you want.
-
-The collection now shows in your collections as BYOK. Also, the environment variables appear in your environments.
-
-### Understand the REST API requests in the collection
-
-The collection provides the following REST API requests.
-
-> [!NOTE]
->
->- The requests must be sent in the sequence provided.
->- Most requests have test scripts that dynamically create global variables for the next request in the sequence.
->- You don't need to manually create global variables.
-
-In Postman, you'll see these variables contained within brackets. For example, `{{bearerToken}}`.
-
-1. Get an Azure AD token: The test sets the global variable **bearerToken**.
-2. Get a Microsoft Graph token: The test sets the global variable **graphToken**.
-3. Get service principal details: The test sets the global variable **servicePrincipalObjectId**.
-4. Create a storage account: The test sets the global variable **storageAccountId**.
-5. Create a Media Services account with a system-managed identity: The test sets the global variable **principalId**.
-6. Create a key vault to grant access to the service principal: The test sets the global variable **keyVaultId**.
-7. Get a Key Vault token: The test sets the global variable **keyVaultToken**.
-8. Create the RSA key in the key vault: The test sets the global variable **keyId**.
-9. Update the Media Services account to use the key with the storage account: There's no test script for this request.
-
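For reference, the first request in the sequence is a standard client-credentials token request against the Azure AD v1 endpoint; a rough curl equivalent, with the environment-variable placeholders spelled out, might look like this:

```bash
# Hypothetical placeholder values; mirrors the collection's "Get an Azure AD token" request.
curl -X POST "https://login.microsoftonline.com/<tenantId>/oauth2/token" \
  -d "grant_type=client_credentials" \
  -d "client_id=<servicePrincipalId>" \
  -d "client_secret=<servicePrincipalSecret>" \
  -d "resource=https://management.core.windows.net/"
```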
-## Define environment variables
-
-1. Select the environment's drop-down list to switch to the environment you downloaded.
-1. Establish your environment variables in Postman. They're also used as variables contained within brackets. For example, `{{tenantId}}`.
-
- - **tenantId**: Your tenant ID.
- - **servicePrincipalId**: The ID of the service principal you establish with your favorite method, such as portal or CLI.
- - **servicePrincipalSecret**: The secret created for the service principal.
- - **subscription**: Your subscription ID.
- - **storageName**: The name you want to give to your storage.
- - **accountName**: The Media Services account name you want to use.
- - **keyVaultName**: The key vault name you want to use.
   - **resourceLocation**: The location, such as **CentralUs**, where you want to put your resources. This collection has only been tested with **CentralUs**.
- - **resourceGroup**: The resource group name.
-
- The following variables are standard for working with Azure resources. So, there's no need to change them.
-
- - **armResource**: `https://management.core.windows.net`
- - **graphResource**: `https://graph.windows.net/`
- - **keyVaultResource**: `https://vault.azure.net`
- - **armEndpoint**: `management.azure.com`
- - **graphEndpoint**: `graph.windows.net`
- - **aadEndpoint**: `login.microsoftonline.com`
- - **keyVaultDomainSuffix**: `vault.azure.net`
-
-## Send the requests
-
-After you define your environment variables, you can run the requests one at a time in the previous sequence. Or, you can use Postman's runner to run the collection.
-
-## Change the key
-
-Media Services automatically detects when the key is changed. Create another key version for the same key to test this process. Media Services should detect the key in less than 15 minutes.
-
-## Clean up resources
-
-If you're not going to continue to use the resources that you created and *you don't want to continue to be billed*, delete them.
-
-## Next steps
-
-Go to the next article to learn how to:
-> [!div class="nextstepaction"]
-> [Encode a remote file based on URL and stream the video with REST](stream-files-tutorial-with-rest.md)
media-services Security Encrypt Data Managed Identity Cli Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-encrypt-data-managed-identity-cli-tutorial.md
- Title: Use a Key Vault key to encrypt data into a Media Services account
-description: If you'd like Media Services to encrypt data using a key from your Key Vault, the Media Services account must be granted *access* to the Key Vault. Follow the steps below to create a Managed Identity for the Media Services account and grant this identity access to their Key Vault using the Media Services CLI.
----- Previously updated : 05/14/2021---
-# Tutorial: Use a Key Vault key to encrypt data into a Media Services account
---
-If you'd like Media Services to encrypt data using a key from your Key Vault, the Media Services account must be granted *access* to the Key Vault. Follow the steps below to create a Managed Identity for the Media Services account and grant this identity access to your Key Vault using the Media Services CLI.
----
-This tutorial uses the 2020-05-01 Media Services API.
-
-## Sign in to Azure
-
-To use any of the commands in this article, you first have to be signed in to the subscription that you want to use.
-
- [!INCLUDE [Sign in to Azure with the CLI](./includes/task-sign-in-azure-cli.md)]
-
-### Set subscription
-
-Use this command to set the subscription that you want to work with.
--
-## Resource names
-
-Before you get started, decide on the names of the resources you'll create. They should be easily identifiable as a set, especially if you are not planning to use them after you are done testing. Naming rules are different for many resource types so it's best to stick with all lower case. For example, "mediatest1rg" for your resource group name and "mediatest1stor" for your storage account name. Use the same names for each step in this article.
-
-You'll see these names referenced in the commands below. The names of resources you'll need are:
-- myRG
-- myStorageAccount
-- myAmsAccount
-- myKeyVault
-- myKey
-- location
-
-> [!NOTE]
-> The hyphens above are only used to separate guidance words. Because of the inconsistency of naming resources in Azure services, don't use hyphens when you name your resources.
-> Also, you don't create the region name. The region name is determined by Azure.
-
-### List Azure regions
-
-If you're not sure of the actual region name to use, use this command to get a listing:
--
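-For example:
-
-```azurecli-interactive
-# List all regions available to your subscription in a readable table.
-az account list-locations --output table
-```
-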
-## Sequence
-
-Each of the steps below is done in a particular order because one or more values from the JSON responses are used in the next step in the sequence.
--
-## Create a Storage account
-
-The Media Services account you'll create must have a storage account associated with it. Create the storage account for the Media Services account first. You'll use `your-storage-account-name` for subsequent steps.
--
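-A minimal sketch using the naming guidance above, with `westus2` assumed as the region:
-
-```azurecli-interactive
-# Create the storage account that will be attached to the Media Services account.
-az storage account create \
-  --name mystorageaccount \
-  --resource-group myrg \
-  --location westus2 \
-  --sku Standard_LRS \
-  --kind StorageV2
-```
-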
-## Create a Media Services account with a Service Principal (Managed Identity)
-
-Now create the Media Services account with a Service Principal, otherwise known as a Managed Identity.
-
-> [!IMPORTANT]
-> It is important that you remember to use the --mi flag in the command. Otherwise you will not be able to find the `principalId` for a later step.
--
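-A sketch of the create command; the exact managed-identity flag depends on your CLI version (recent versions use `--mi-system-assigned`, which is assumed here):
-
-```azurecli-interactive
-# Create the Media Services account with a system-assigned managed identity.
-# Note the identity's principalId in the JSON response for later steps.
-az ams account create \
-  --name myamsaccount \
-  --resource-group myrg \
-  --storage-account mystorageaccount \
-  --location westus2 \
-  --mi-system-assigned
-```
-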
-## Create a Key Vault
-
-Create the Key Vault. The Key Vault is used to encrypt media data. You'll use `your-keyvault-name` to create your key and for later steps.
--
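-A sketch, again assuming `westus2`; purge protection is enabled because customer-managed keys generally require it:
-
-```azurecli-interactive
-# Create the Key Vault that will hold the encryption key.
-az keyvault create \
-  --name mykeyvault \
-  --resource-group myrg \
-  --location westus2 \
-  --enable-purge-protection true
-
-# Create the key. Note the key identifier (kid) URL in the response.
-az keyvault key create --vault-name mykeyvault --name mykey --kty RSA
-```
-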
-## Grant the Media Services System Assigned Managed Identity access to the Key Vault
-
-Grant the Media Services Managed Identity access to the Key Vault. There are two commands:
-
-### Get (show) the Managed Identity of the Media Services account
-
-The first command below shows the Managed Identity of the Media Services account which is the `principalId` listed in the JSON returned by the command.
--
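-For example:
-
-```azurecli-interactive
-# Show the account and extract the managed identity's principalId.
-az ams account show \
-  --name myamsaccount \
-  --resource-group myrg \
-  --query identity.principalId \
-  --output tsv
-```
-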
-### Set the Key Vault policy
-
-The second command grants the Principal ID access to the Key Vault. Set `object-id` to the value of `principalId` which you got from the previous step.
--
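-A sketch of the policy command; the key permissions shown (get, unwrap, wrap) are an assumption of the minimum that Media Services needs:
-
-```azurecli-interactive
-# Allow the Media Services managed identity to use keys in the vault.
-az keyvault set-policy \
-  --name mykeyvault \
-  --object-id <principalId-from-previous-step> \
-  --key-permissions get unwrapKey wrapKey
-```
-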
-### Set Media Services to use the key from Key Vault
-
-Set Media Services to use the key you've created. The value of the `key-identifier` property comes from the output when the key was created. This command may fail because of the time it takes to propagate access control changes. If this happens, retry after a few minutes.
--
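-A sketch assuming a CLI version that includes the `az ams account encryption` group, with `<key-identifier>` standing in for the `kid` URL returned when the key was created:
-
-```azurecli-interactive
-# Point the Media Services account at the customer-managed key.
-az ams account encryption set \
-  --account-name myamsaccount \
-  --resource-group myrg \
-  --key-type CustomerKey \
-  --key-identifier <key-identifier>
-```
-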
-## Validation
-
-To verify the account is encrypted using a Customer Managed Key, view the account encryption properties:
--
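-For example:
-
-```azurecli-interactive
-# Show the account encryption settings; expect type CustomerKey.
-az ams account encryption show \
-  --account-name myamsaccount \
-  --resource-group myrg
-```
-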
-The `type` property should show `CustomerKey` and the `currentKeyIdentifier` should be set to the path of a key in the customerΓÇÖs Key Vault.
-
-## Clean up resources
-
-If you aren't planning to use the resources you created, delete the resource group.
-
media-services Security Function App Managed Identity Cli Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-function-app-managed-identity-cli-tutorial.md
- Title: Give an Azure Function app access to a Media Services account
-description: Suppose you want to build an "On Air" sign for your broadcasting studio. You can determine when Media Services Live Events are running using the Media Services API but this may be hard to call from an embedded device. Instead, you could expose an HTTP API for your embedded device using Azure Functions. Azure Functions could then call Media Services to get the state of the Live Event.
- Previously updated: 09/13/2021
-# Tutorial: Give an Azure Function app access to a Media Services account
--
-Suppose you want to let visitors to your website or application know that you are "On Air" in your broadcasting studio. You can determine when Media Services Live Events are running by using the Media Services API, but this may be hard to call from an embedded device. Instead, you could expose an HTTP API for your embedded device using Azure Functions. Azure Functions could then call Media Services to get the state of the Live Event.
--
-This tutorial uses the 2020-05-01 Media Services API.
-
-## Sign in to Azure
-
-To use any of the commands in this article, you first have to be signed in to the subscription that you want to use.
-
- [!INCLUDE [Sign in to Azure with the CLI](./includes/task-sign-in-azure-cli.md)]
-
-### Set subscription
-
-Use this command to set the subscription that you want to work with.
--
-## Prerequisites
-
-> [!IMPORTANT]
-> You are strongly encouraged to work through the [Create a C# function in Azure from the command line quickstart](../../azure-functions/create-first-function-cli-csharp.md) before attempting this tutorial. That is because the setup steps included there are the same steps needed here. It will also give you a chance to work with a simple example on which this tutorial is based.
-
-## Resource names
-
-Before you get started, decide on the names of the resources you'll create. They should be easily identifiable as a set, especially if you aren't planning to use them after you're done testing. Naming rules are different for many resource types so it's best to stick with all lower case. For example, "mediatest1rg" for your resource group name and "mediatest1stor" for your storage account name. Use the same names for each step in this article.
-
-You'll see these names referenced in the commands below. The names of resources you'll need are:
-
-- myRG
-- myStorageAccount
-- myAmsAccount
-- location
-- myFunction: use "OnAir"
-- myLiveEvent: use "live1"
-- ipaddresses: use "0.0.0.0/32"
-
-> [!NOTE]
-> The hyphens above are only used to separate guidance words. Because of the inconsistency of naming resources in Azure services, don't use hyphens when you name your resources.<br/><br/>
-> Anything represented by 00000000-0000-0000-0000000000 is the unique identifier of the resource. This value is usually returned by a JSON response. It is also recommended that you copy and paste the JSON responses in Notepad or other text editor, as those responses will contain values you will need for later CLI commands.<br/><br/>
-> Also, you don't create the region name. The region name is determined by Azure.
-
-### List Azure regions
-
-If you're not sure of the actual region name to use, use this command to get a listing:
--
-## Sequence
-
-Each of the steps below is done in a particular order because one or more values from the JSON responses are used in the next step in the sequence.
-
-## Create a Storage account
-
-The Media Services account you'll create must have a storage account associated with it. Create the storage account for the Media Services account first. You'll use `your-storage-account-name` for subsequent steps.
--
-## Create a Media Services account
-
-Now create the Media Services account. Look for the `id` value in the JSON response; you'll use it as the `scope` for the role assignment in a later step.
-
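-A minimal sketch of the create command, reusing the resource names chosen above:
-
-```azurecli-interactive
-# Create the Media Services account; note the id value in the JSON response,
-# which is used as the scope for the role assignment later.
-az ams account create \
-  --name myamsaccount \
-  --resource-group myrg \
-  --storage-account mystorageaccount \
-  --location westus2
-```
-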
-## Set up the Azure Function
-
-In this section, you'll set up your Azure Function.
-
-### Get the code
-
-Use Azure Functions Core Tools to create your function project and retrieve the code from the HTTP template.
-
-```azurecli-interactive
-func init MediaServicesLiveMonitor --dotnet
-```
-
-### Change directory
-
-Make sure you change your working directory to the project directory. Otherwise you'll get errors.
-
-```azurecli-interactive
-cd .\MediaServicesLiveMonitor\
-```
-
-### Name your function
-
-```azurecli-interactive
-func new --name OnAir --template "HTTP trigger" --authlevel "anonymous"
-```
-
-## Configure the functions project
-
-### Install Media Services and other extensions
-
-Run the dotnet add package command in the Terminal window to install the extension packages that you need in your project. The following command installs the Media Services and Azure Identity packages.
-
-```bash
-dotnet add package Microsoft.Azure.Management.Media
-dotnet add package Azure.Identity
-```
-
-### Edit the OnAir.cs code
-
-Change the `OnAir.cs` file. Change `subscriptionId`, `resourceGroup`, and `mediaServicesAccountName` variables to the ones you decided upon earlier.
-
-```csharp
-using Azure.Core;
-using Azure.Identity;
-using Microsoft.AspNetCore.Http;
-using Microsoft.AspNetCore.Mvc;
-using Microsoft.Azure.Management.Media;
-using Microsoft.Azure.Management.Media.Models;
-using Microsoft.Azure.WebJobs;
-using Microsoft.Azure.WebJobs.Extensions.Http;
-using Microsoft.Extensions.Logging;
-using Microsoft.Rest;
-using System.Threading.Tasks;
-
-namespace MediaServicesLiveMonitor
-{
- public static class OnAir
- {
- [FunctionName("OnAir")]
- public static async Task<IActionResult> Run(
- [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
- ILogger log)
- {
- log.LogInformation("C# HTTP trigger function processed a request.");
-
- string name = req.Query["name"];
-
- if (string.IsNullOrWhiteSpace(name))
- {
- return new BadRequestObjectResult("Missing 'name' URL parameter");
- }
-
- var credential = new ManagedIdentityCredential();
- var accessTokenRequest = await credential.GetTokenAsync(
- new TokenRequestContext(
- scopes: new string[] { "https://management.core.windows.net" + "/.default" }
- )
- );
- ServiceClientCredentials credentials = new TokenCredentials(accessTokenRequest.Token, "Bearer");
-
- var subscriptionId = "00000000-0000-0000-000000000000"; // Update
- var resourceGroup = "<your-resource-group-name>"; // Update
- var mediaServicesAccountName = "<your-media-services-account-name>"; // Update
-
- var mediaServices = new AzureMediaServicesClient(credentials)
- {
- SubscriptionId = subscriptionId
- };
-
- var liveEvent = await mediaServices.LiveEvents.GetAsync(resourceGroup, mediaServicesAccountName, name);
-
- if (liveEvent == null)
- {
- return new NotFoundResult();
- }
-
- return new OkObjectResult(liveEvent.ResourceState == LiveEventResourceState.Running ? "On air" : "Off air");
- }
- }
-}
-```
-
-## Create the Function App
-
-Create the Function App to host the function. Its name matches the function project you created earlier, with "App" appended: `MediaServicesLiveMonitorApp`.
-
-```azurecli-interactive
-
-az functionapp create --resource-group <your-resource-group-name> --consumption-plan-location your-region --runtime dotnet --functions-version 3 --name MediaServicesLiveMonitorApp --storage-account mediatest3store --assign-identity "[system]"
-
-```
-
-Look for `principalId` in the JSON response:
-
-```json
-{
-...
-"identity": {
-//Note the principalId value for the following step
- "principalId": "00000000-0000-0000-000000000000",
- "tenantId": "00000000-0000-0000-000000000000",
- "type": "SystemAssigned",
- "userAssignedIdentities": null
- }
-...
-```
-
-## Grant the function app access to the Media Services account resource
-
-For this request:
-
-- `assignee` is the `principalId` that is in the JSON response from `az functionapp create`
-- `scope` is the `id` that is in the JSON response from `az ams account create`. See the example JSON response above.
-
-```azurecli-interactive
-az role assignment create --assignee 00000000-0000-0000-000000000000 --role "Media Services Account Administrator" --scope "/subscriptions/<the-subscription-id>/resourceGroups/<your-resource-group>/providers/Microsoft.Media/mediaservices/<your-media-services-account-name>"
-```
-
-## Publish the function
-
-```azurecli-interactive
-func azure functionapp publish MediaServicesLiveMonitorApp
-```
-
-## Validation
-
-In a browser, go to the function URL, for example:
-
-`https://mediaserviceslivemonitorapp.azurewebsites.net/api/onair?name=live1`
-
-This should return a 404 (Not Found) error as the Live Event does not exist yet.
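-
-You can also check the status code from a terminal, for example:
-
-```bash
-# Expect HTTP 404 until the Live Event is created.
-curl -i "https://mediaserviceslivemonitorapp.azurewebsites.net/api/onair?name=live1"
-```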
-
-### Create a Live Event
-
-```azurecli-interactive
-az ams live-event create --resource-group test3 --account-name mediatest3 --name live1 --streaming-protocol RTMP
-```
-
-In a browser, go to the function URL, for example:
-
-`https://mediaserviceslivemonitorapp.azurewebsites.net/api/onair?name=live1`
-
-This should now show "Off air".
-
-### Start the live event
-
-If you start the Live Event, the function should return "On air".
-
-```azurecli-interactive
-az ams live-event start --resource-group test3 --account-name mediatest3 --name live1
-```
-
-This function allows access to anyone. Securing access to the Azure Function and wiring up an "On Air" light are out of scope for this document.
-
-## Clean up resources
-
-If you aren't planning to use the resources you created, delete the resource group.
-
media-services Security Pass Authentication Tokens How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-pass-authentication-tokens-how-to.md
- Title: Pass authentication tokens to Media Services v3 | Microsoft Docs
-description: Learn how to send authentication tokens from the client to the Media Services v3 key delivery service
- Previously updated: 03/10/2021
-# Pass tokens to the Azure Media Services v3 key delivery service
--
-Customers often ask how a player can pass tokens to the Azure Media Services key delivery service for verification so the player can obtain the key. Media Services supports the simple web token (SWT) and JSON Web Token (JWT) formats. Token authentication is applied to any type of key, regardless of whether you use common encryption or Advanced Encryption Standard (AES) envelope encryption in the system.
-
- Depending on the player and platform you target, you can pass the token with your player in the following ways:
-
-## Pass a token through the HTTP authorization header
-
-> [!NOTE]
-> The "Bearer" prefix is expected per the OAuth 2.0 specs. To set the video source, choose **AES (JWT Token)** or **AES (SWT Token)**. The token is passed via the Authorization header.
-
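-For example, a sketch of a key request with the token in the Authorization header, where the key delivery URL and token are placeholders:
-
-```bash
-# Request a key from the key delivery service, passing the JWT or SWT as a Bearer token.
-curl -H "Authorization: Bearer <your-token>" "<your-key-delivery-url>"
-```
-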
-## Pass a token via the addition of a URL query parameter with "token=tokenvalue."
-
-> [!NOTE]
-> The "Bearer" prefix isn't expected. Because the token is sent through a URL, you need to armor the token string. Here is a C# sample code that shows how to do it:
-
-```csharp
- string armoredAuthToken = System.Web.HttpUtility.UrlEncode(authToken);
- string uriWithTokenParameter = string.Format("{0}&token={1}", keyDeliveryServiceUri.AbsoluteUri, armoredAuthToken);
- Uri keyDeliveryUrlWithTokenParameter = new Uri(uriWithTokenParameter);
-```
-
-## Pass a token through the CustomData field
-
-This option is used for PlayReady license acquisition only, through the CustomData field of the PlayReady License Acquisition Challenge. In this case, the token must be inside the XML document as described here:
-
-```xml
- <?xml version="1.0"?>
- <CustomData xmlns="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyCustomData/v1">
- <Token></Token>
- </CustomData>
-```
-
-Put your authentication token in the Token element.
media-services Security Private Link Arm How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-private-link-arm-how-to.md
- Title: Create a Media Services and Storage account with a private link using an ARM template
-description: Create a Media Services account and Storage Account with Private Links to a VNet. The Azure Resource Manager (ARM) template also sets up DNS for both the Private Links. Finally, the template creates a VM to allow the user to try out the Private Links.
- Previously updated: 04/15/2021
-# Create a Media Services and Storage account with a Private Link using an ARM template
--
-Create a Media Services account and Storage Account with Private Links to a VNet. The Azure Resource Manager (ARM) template also sets up DNS for both the Private Links. Finally, the template creates a VM to allow the user to try out the Private Links.
-
-## Prerequisites
-
-Read [Quickstart: Create and deploy ARM templates by using the Azure portal](../../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md).
-
-## Limitations
-
-- For Media Services, the template only sets up Private Link for Key Delivery.
-- A network security group isn't created for the VM.
-- Network access control isn't configured for the Storage Account or Key Delivery.
-
-The template creates:
-
-- A Media Services account and a Storage Account (as normal)
-- A VNet with a subnet
-- For both the Media Services account and the Storage Account:
-  - Private Endpoints
-  - Private DNS Zones
-  - Virtual network links (to connect the private DNS zones to the VNet)
-  - Private DNS zone groups (to trigger the automatic creation of DNS records in the private DNS zones)
-- A VM (with associated public IP address and network interface)
-
-## Azure Resource Manager (ARM) template for private link
-
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "vmName": {
- "type": "string"
- },
- "vmAdminUsername": {
- "type": "string"
- },
- "vmAdminPassword": {
- "type": "secureString"
- },
- "vmSize": {
- "type": "string",
- "defaultValue": "Standard_D2_v3"
- },
- "location": {
- "type": "string",
- "defaultValue": "[resourceGroup().location]"
- },
- "storageAccountName": {
- "type": "string"
- },
- "mediaServicesAccountName": {
- "type": "string"
- }
- },
- "functions": [],
- "resources": [
- {
- "type": "Microsoft.Storage/storageAccounts",
- "apiVersion": "2021-01-01",
- "name": "[parameters('storageAccountName')]",
- "location": "[parameters('location')]",
- "sku": {
- "name": "Standard_LRS"
- },
- "kind": "StorageV2"
- },
- {
- "type": "Microsoft.Media/mediaservices",
- "apiVersion": "2020-05-01",
- "name": "[parameters('mediaServicesAccountName')]",
- "location": "[parameters('location')]",
- "properties": {
- "storageAccounts": [
- {
- "type": "Primary",
- "id": "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName'))]"
- }
- ]
- },
- "identity": {
- "type": "SystemAssigned"
- },
- "dependsOn": [
- "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName'))]"
- ]
- },
- {
- "type": "Microsoft.Network/virtualNetworks",
- "apiVersion": "2020-08-01",
- "name": "myVnet",
- "location": "[parameters('location')]",
- "properties": {
- "addressSpace": {
- "addressPrefixes": [
- "10.0.0.0/16"
- ]
- },
- "subnets": [
- {
- "name": "mySubnet",
- "properties": {
- "addressPrefix": "10.0.0.0/24",
- "privateEndpointNetworkPolicies": "Disabled"
- }
- }
- ]
- }
- },
- {
- "type": "Microsoft.Network/privateEndpoints",
- "apiVersion": "2020-08-01",
- "name": "storagePrivateEndpoint",
- "location": "[parameters('location')]",
- "properties": {
- "subnet": {
- "id": "[reference(resourceId('Microsoft.Network/virtualNetworks', 'myVnet')).subnets[0].id]"
- },
- "privateLinkServiceConnections": [
- {
- "name": "storagePrivateEndpointConnection",
- "properties": {
- "privateLinkServiceId": "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName'))]",
- "groupIds": [
- "blob"
- ]
- }
- }
- ]
- },
- "dependsOn": [
- "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName'))]",
- "[resourceId('Microsoft.Network/virtualNetworks', 'myVnet')]"
- ]
- },
- {
- "type": "Microsoft.Network/privateDnsZones",
- "apiVersion": "2020-06-01",
- "name": "privatelink.blob.core.windows.net",
- "location": "global"
- },
- {
- "type": "Microsoft.Network/privateDnsZones/virtualNetworkLinks",
- "apiVersion": "2020-06-01",
- "name": "[format('{0}/storageDnsZoneLink', 'privatelink.blob.core.windows.net')]",
- "location": "global",
- "properties": {
- "registrationEnabled": false,
- "virtualNetwork": {
- "id": "[resourceId('Microsoft.Network/virtualNetworks', 'myVnet')]"
- }
- },
- "dependsOn": [
- "[resourceId('Microsoft.Network/privateDnsZones', 'privatelink.blob.core.windows.net')]",
- "[resourceId('Microsoft.Network/virtualNetworks', 'myVnet')]"
- ]
- },
- {
- "type": "Microsoft.Network/privateEndpoints/privateDnsZoneGroups",
- "apiVersion": "2020-08-01",
- "name": "[format('{0}/storagePrivateDnsZoneGroup', 'storagePrivateEndpoint')]",
- "properties": {
- "privateDnsZoneConfigs": [
- {
- "name": "config1",
- "properties": {
- "privateDnsZoneId": "[resourceId('Microsoft.Network/privateDnsZones', 'privatelink.blob.core.windows.net')]"
- }
- }
- ]
- },
- "dependsOn": [
- "[resourceId('Microsoft.Network/privateDnsZones', 'privatelink.blob.core.windows.net')]",
- "[resourceId('Microsoft.Network/privateEndpoints', 'storagePrivateEndpoint')]"
- ]
- },
- {
- "type": "Microsoft.Network/privateEndpoints",
- "apiVersion": "2020-08-01",
- "name": "mediaServicesPrivateEndpoint",
- "location": "[parameters('location')]",
- "properties": {
- "subnet": {
- "id": "[reference(resourceId('Microsoft.Network/virtualNetworks', 'myVnet')).subnets[0].id]"
- },
- "privateLinkServiceConnections": [
- {
- "name": "mediaServicesPrivateEndpointConnection",
- "properties": {
- "privateLinkServiceId": "[resourceId('Microsoft.Media/mediaservices', parameters('mediaServicesAccountName'))]",
- "groupIds": [
- "keydelivery"
- ]
- }
- }
- ]
- },
- "dependsOn": [
- "[resourceId('Microsoft.Media/mediaservices', parameters('mediaServicesAccountName'))]",
- "[resourceId('Microsoft.Network/virtualNetworks', 'myVnet')]"
- ]
- },
- {
- "type": "Microsoft.Network/privateDnsZones",
- "apiVersion": "2020-06-01",
- "name": "privatelink.media.azure.net",
- "location": "global"
- },
- {
- "type": "Microsoft.Network/privateDnsZones/virtualNetworkLinks",
- "apiVersion": "2020-06-01",
- "name": "[format('{0}/mediaServicesDnsZoneLink', 'privatelink.media.azure.net')]",
- "location": "global",
- "properties": {
- "registrationEnabled": false,
- "virtualNetwork": {
- "id": "[resourceId('Microsoft.Network/virtualNetworks', 'myVnet')]"
- }
- },
- "dependsOn": [
- "[resourceId('Microsoft.Network/privateDnsZones', 'privatelink.media.azure.net')]",
- "[resourceId('Microsoft.Network/virtualNetworks', 'myVnet')]"
- ]
- },
- {
- "type": "Microsoft.Network/privateEndpoints/privateDnsZoneGroups",
- "apiVersion": "2020-08-01",
- "name": "[format('{0}/mediaServicesPrivateDnsZoneGroup', 'mediaServicesPrivateEndpoint')]",
- "properties": {
- "privateDnsZoneConfigs": [
- {
- "name": "config1",
- "properties": {
- "privateDnsZoneId": "[resourceId('Microsoft.Network/privateDnsZones', 'privatelink.media.azure.net')]"
- }
- }
- ]
- },
- "dependsOn": [
- "[resourceId('Microsoft.Network/privateDnsZones', 'privatelink.media.azure.net')]",
- "[resourceId('Microsoft.Network/privateEndpoints', 'mediaServicesPrivateEndpoint')]"
- ]
- },
- {
- "type": "Microsoft.Network/publicIPAddresses",
- "apiVersion": "2020-08-01",
- "name": "publicIp",
- "location": "[parameters('location')]",
- "properties": {
- "publicIPAllocationMethod": "Dynamic",
- "dnsSettings": {
- "domainNameLabel": "[toLower(parameters('vmName'))]"
- }
- }
- },
- {
- "type": "Microsoft.Network/networkInterfaces",
- "apiVersion": "2020-08-01",
- "name": "vmNetworkInterface",
- "location": "[parameters('location')]",
- "properties": {
- "ipConfigurations": [
- {
- "name": "ipConfig1",
- "properties": {
- "privateIPAllocationMethod": "Dynamic",
- "publicIPAddress": {
- "id": "[resourceId('Microsoft.Network/publicIPAddresses', 'publicIp')]"
- },
- "subnet": {
- "id": "[reference(resourceId('Microsoft.Network/virtualNetworks', 'myVnet')).subnets[0].id]"
- }
- }
- }
- ]
- },
- "dependsOn": [
- "[resourceId('Microsoft.Network/publicIPAddresses', 'publicIp')]",
- "[resourceId('Microsoft.Network/virtualNetworks', 'myVnet')]"
- ]
- },
- {
- "type": "Microsoft.Compute/virtualMachines",
- "apiVersion": "2020-12-01",
- "name": "myVM",
- "location": "[parameters('location')]",
- "properties": {
- "hardwareProfile": {
- "vmSize": "[parameters('vmSize')]"
- },
- "osProfile": {
- "computerName": "[parameters('vmName')]",
- "adminUsername": "[parameters('vmAdminUsername')]",
- "adminPassword": "[parameters('vmAdminPassword')]"
- },
- "storageProfile": {
- "imageReference": {
- "publisher": "MicrosoftWindowsServer",
- "offer": "WindowsServer",
- "sku": "2019-Datacenter",
- "version": "latest"
- },
- "osDisk": {
- "name": "osDisk",
- "caching": "ReadWrite",
- "createOption": "FromImage",
- "managedDisk": {
- "storageAccountType": "Standard_LRS"
- },
- "diskSizeGB": 128
- }
- },
- "networkProfile": {
- "networkInterfaces": [
- {
- "id": "[resourceId('Microsoft.Network/networkInterfaces', 'vmNetworkInterface')]"
- }
- ]
- }
- },
- "dependsOn": [
- "[resourceId('Microsoft.Network/networkInterfaces', 'vmNetworkInterface')]"
- ]
- }
- ],
- "metadata": {
- "_generator": {
- "name": "bicep",
- "version": "0.3.126.58533",
- "templateHash": "2006367938138350540"
- }
- }
-}
-```
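-
-To deploy the template, save it to a file and run a resource group deployment; a minimal sketch, assuming the file is named `template.json`:
-
-```azurecli-interactive
-# Deploy the template; the CLI prompts for any required parameters not supplied inline.
-az deployment group create \
-  --resource-group myrg \
-  --template-file template.json \
-  --parameters vmName=myvm vmAdminUsername=azureuser storageAccountName=mystorageaccount mediaServicesAccountName=myamsaccount
-```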
media-services Security Private Link Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-private-link-concept.md
- Title: Overview of using private links with Azure Media Services
-description: This article gives an overview of using private links with Azure Media Services.
- Previously updated: 10/22/2021
-# Overview of using Azure Private Link with Azure Media Services
--
-This article gives an overview of using private links with Azure Media Services.
-
-## When to use Private Link with Media Services
-
-Private Link allows Media Services to be accessed from private networks. When used with the network access controls provided by Media Services, private links can enable Media Services to be used without exposing endpoints to the public internet.
-
-## Azure Private Endpoint and Azure Private Link
-
-An [Azure Private Endpoint](../../private-link/private-endpoint-overview.md) is a network interface that uses a private IP address from your virtual network. This network interface connects you privately and securely to a service via Azure Private Link.
-
-Media Services endpoints may be accessed from a virtual network using private endpoints. Private endpoints may also be accessed from peered virtual networks or other networks connected to the virtual network using ExpressRoute or VPN.
-
-[Azure Private Link](../../private-link/index.yml) allows access to Media Services private endpoints in your virtual network without exposing them to the public Internet. Traffic is routed over the Microsoft backbone network.
-
-## Restricting access
-
-> [!Important]
-> Creating a private endpoint **DOES NOT** implicitly disable internet access to it.
-
-Internet access to the endpoints in the Media Services account can be restricted in one of two ways:
-
-- Restricting access to all resources within the Media Services account.
-- Restricting access separately for each resource by using the IP allowlist.
-
-## Media Services endpoints
-
-| Endpoint | Description | Supports private link | Internet access control |
-| | - | | -- |
-| Streaming Endpoint | The origin server for streaming video; it packages media into HLS and DASH formats | Yes | IP allowlist |
-| Streaming Endpoint with CDN | Stream media to many viewers | No | Managed by CDN |
-| Key Delivery | Provides media content keys and DRM licenses to media viewers | Yes | IP allowlist |
-| Live event | Ingests media content for live streaming | Yes | IP allowlist |
-
-> [!NOTE]
-> Media Services accounts created with API versions prior to 2020-05-01 also have an endpoint for the legacy RESTv2 API endpoint (pending deprecation). This endpoint does not support private links.
-
-## Other Private Link enabled Azure services
-
-| Service | Media Services integration | Private link documentation |
-| - | -- | -- |
-| Azure Storage | Used to store media | [Use private endpoints for Azure Storage](../../storage/common/storage-private-endpoints.md) |
-| Azure Key Vault | Used to store [customer managed keys](security-customer-managed-keys-portal-tutorial.md) | [Configure Azure Key Vault networking settings](../../key-vault/general/how-to-azure-key-vault-network-security.md) |
-| Azure Resource Manager | Provides access to Media Services APIs | [Use REST API to create private link for managing Azure resources](../../azure-resource-manager/management/create-private-link-access-rest.md) |
-| Event Grid | Provides [notifications of Media Services events](./monitoring/job-state-events-cli-how-to.md) | [Configure private endpoints for Azure Event Grid topics or domains](../../event-grid/configure-private-endpoints.md) |
-
-## Private endpoints are created on the Media Services account
-
-Private Endpoints for Key Delivery, Streaming Endpoints, and Live Events are created on the Media Services account instead of being created individually.
-
-A private IP address is created for each Streaming Endpoint or Live Event in the Media Services account when a Media Services private endpoint resource is created. For example, if you have two started Streaming Endpoints, a single private endpoint should be created to connect both Streaming Endpoints to a virtual network. Resources can be connected to multiple virtual networks at the same time.
-
-Internet access to the Media Services account should be restricted, either for all the resources within the account or separately for each resource.
-
-## Private Link pricing
-
-For pricing details, see [Azure Private Link Pricing](https://azure.microsoft.com/pricing/details/private-link).
-
-## Private Link how-tos and FAQs
-
-- [Create a Media Services and Storage account with a Private Link using an Azure Resource Management template](security-private-link-arm-how-to.md)
-- [Create a Private Link for a Streaming Endpoint](security-private-link-streaming-endpoint-how-to.md)
media-services Security Private Link Connect Private Endpoint Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-private-link-connect-private-endpoint-concept.md
- Title: Private Endpoint connections overview
-description: This article is an overview of Private Endpoint connections with Media Services.
- Previously updated: 10/22/2021
-# Private Endpoint connections overview
--
-This article is an overview of Private Endpoint connections with Media Services.
-
-## Clients using VNet
-
-Clients on a VNet using the private endpoint should use the same DNS name to connect to Media Services as clients connecting to the public Media Services endpoints. Media Services relies upon DNS resolution to automatically route the connections from the VNet to the Media Services endpoints over a private link.
-
-> [!IMPORTANT]
-> Use the same DNS names for the Media Services endpoints when using private endpoints as you'd otherwise use. Don't connect to the Media Services endpoints using their privatelink subdomain URLs.
-
-Media Services creates a [private DNS zone](../../dns/private-dns-overview.md) attached to the VNet with the necessary updates for the private endpoints, by default. However, if you're using your own DNS server, you may need to make additional changes to your DNS configuration. The section on DNS changes below describes the updates required for private endpoints.
-
-## DNS changes for private endpoints
-
-When you create a private endpoint, the **DNS CNAME** resource record for each of the Media Services endpoints is updated to an alias in a subdomain with the prefix `privatelink`. By default, we also create a private DNS zone, corresponding to the `privatelink` subdomain, with the DNS A resource records for the private endpoints.
-
-When you resolve a Media Services DNS name from outside the VNet with the private endpoint, it resolves to the public endpoint of the Media Services endpoint. When resolved from the VNet hosting the private endpoint, the Media Services URL resolves to the private endpoint's IP address.
-
-For example, the DNS resource records for a Streaming Endpoint in the Media Services `MediaAccountA`, when resolved from outside the VNet hosting the private endpoint, will be:
-
-| Name | Type | Value |
-| - | - | -- |
-| mediaaccounta-uswe1.streaming.media.azure.net | CNAME | mediaaccounta-uswe1.streaming.privatelink.media.azure.net |
-|mediaaccounta-uswe1.streaming.privatelink.media.azure.net | CNAME | `<Streaming Endpoint public endpoint>` |
-| `<Streaming Endpoint public endpoint>` | CNAME | `<Streaming Endpoint internal endpoint>` |
-| `<Streaming Endpoint internal endpoint>` | A | `<Streaming Endpoint public IP address>` |
-
-You can deny or restrict public internet access to Media Services endpoints using IP allowlists, or by disabling public network access for all resources within the account.
-
-The DNS resource records for the example Streaming Endpoint in 'MediaAccountA', when resolved by a client in the VNet hosting the private endpoint, will be:
-
-| Name | Type | Value |
-| - | - | -- |
-| mediaaccounta-uswe1.streaming.media.azure.net | CNAME | mediaaccounta-uswe1.streaming.privatelink.media.azure.net |
-|mediaaccounta-uswe1.streaming.privatelink.media.azure.net | A | `<private endpoint IP address>`, for example: 10.0.0.9 |
-
-This approach enables access to the Media Services endpoint using the same DNS name for clients within the VNet hosting the private endpoints. It does the same thing for clients outside the VNet.
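-
-To verify which address a client resolves, you can run a lookup from inside and outside the VNet; a private IP (for example, 10.0.0.9) indicates the private endpoint is being used:
-
-```bash
-# From a VM inside the VNet this should return the private endpoint IP;
-# from outside it should return a public IP.
-nslookup mediaaccounta-uswe1.streaming.media.azure.net
-```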
-
-If you're using a custom DNS server on your network, clients must resolve the FQDN for the Media Services endpoint to the private endpoint IP address. Configure your DNS server to delegate your private link subdomain to the private DNS zone for the VNet, or configure the A records for `mediaaccounta-uswe1.streaming.privatelink.media.azure.net` with the private endpoint IP address.
-
-> [!TIP]
-> When using a custom or on-premises DNS server, you should configure your DNS server to resolve the Media Services endpoint name in the privatelink subdomain to the private endpoint IP address. You can do this by delegating the privatelink subdomain to the private DNS zone of the VNet, or configuring the DNS zone on your DNS server and adding the DNS A records.
-
-The recommended DNS zone names for Media Services private endpoints, and the associated private link group IDs, are:
-
-| Media Services Endpoint | Private Link Group ID | DNS Zone Name |
-| -- | | - |
-| Streaming Endpoint | streamingendpoint | privatelink.media.azure.net |
-| Key Delivery | keydelivery | privatelink.media.azure.net |
-| Live Event | liveevent | privatelink.media.azure.net |
-
-For more information about configuring your own DNS server to support private endpoints, refer to the following articles:
--- [Name resolution for resources in Azure virtual networks](../../virtual-network/virtual-networks-name-resolution-for-vms-and-role-instances.md#name-resolution-that-uses-your-own-dns-server)-- [DNS configuration for private endpoints](../../private-link/private-endpoint-overview.md#dns-configuration)-
-## Public network access flag
-
-The `publicNetworkAccess` flag on the Media Services account can be used to allow or block access to Media Services endpoints from the public internet. When `publicNetworkAccess` is disabled, requests to any Media Services endpoint from the public internet are blocked; requests to private endpoints are still allowed.
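-
-A minimal sketch of flipping the flag with the generic resource command, assuming your account's resource ID and an API version that supports `publicNetworkAccess`:
-
-```azurecli-interactive
-# Block public internet access to the account's endpoints; private endpoints keep working.
-az resource update \
-  --ids /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Media/mediaservices/<account-name> \
-  --set properties.publicNetworkAccess=Disabled
-```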
-
-## Service level IP allowlists
-
-When `publicNetworkAccess` is enabled, requests from the public internet are allowed, subject to service level IP allowlists. If `publicNetworkAccess` is disabled, requests from the public internet are blocked, regardless of the IP allowlist settings. IP allowlists only apply to requests from the public internet; requests to private endpoints are not filtered by the IP allowlists.
media-services Security Private Link Streaming Endpoint How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-private-link-streaming-endpoint-how-to.md
- Title: Create a Private Link for a Streaming Endpoint
-description: This article shows you how to use a private link with a Streaming Endpoint. You'll be creating a private endpoint resource which is a link between a virtual network and a streaming endpoint. This deployment creates a network interface IP address inside the virtual network. The private link allows you to connect the network interface in the private network to the streaming endpoint in the Media Services account. You'll also be creating DNS zones which pass the private IP addresses.
- Previously updated: 10/22/2021
-# Create a Private Link for a Streaming Endpoint
--
-This article shows you how to use a private link with a Streaming Endpoint. It's assumed that you already know how to create an [Azure resource group](../../azure-resource-manager/management/manage-resource-groups-portal.md), a [Media Services account](account-create-how-to.md), and an [Azure virtual network](../../virtual-network/quick-create-portal.md).
-
-You'll be creating a private endpoint resource which is a link between a virtual network and a streaming endpoint. This deployment creates a network interface IP address inside the virtual network. The private link allows you to connect the network interface in the private network to the streaming endpoint in the Media Services account. You'll also be creating DNS zones which pass the private IP addresses.
-
-The virtual network created for this walk-through is just to assist with the example. It's assumed that you have an existing virtual network that you'll use for production.
-
-> [!NOTE]
-> As you follow along with the steps, name your resources similarly so that they can be easily understood as having a similar purpose. For example, *privatelink1stor* for your storage account and *privatelink1mi* for your Managed Identity.
-
-## Create a resource group and a Media Services account
-
-1. Create an Azure resource group.
-1. Create a Media Services account. A default Streaming Endpoint is created when you create the account. Creating a Managed Identity is required during the setup process.
-1. Create an Azure virtual network with the default settings.
-
-At this point, there's nothing in your virtual network. Your Media Services account has Internet-facing endpoints, which include an Internet-facing Streaming Endpoint, Key Delivery, and Live Events. The next step will make the Streaming Endpoint private.
-
-## Start the streaming endpoint
-
-1. Navigate to the Media Services account you created.
-1. Select **Streaming endpoints** from the menu. The Streaming endpoints screen will appear.
-1. Select the default Streaming endpoint that you created when you set up the Media Services account. The default Streaming endpoint screen will appear.
-1. Select **Start**. Start options will appear.
-1. Select **None** from the CDN pricing tier dropdown list.
-1. Select **Start**. The Streaming endpoint will start running. The endpoint is still Internet facing.
-
-## Create a private endpoint
-
-1. Navigate back to your Media Services account.
-1. Select **Networking** from the menu.
-1. Select the **private endpoint connections** tab. The private endpoint connection screen will appear.
-1. Select **Add a private endpoint**. The Create a private endpoint screen will appear.
-1. In the **Name** field, give the private endpoint a name such as *privatelinkpe*.
-1. From the **Region** dropdown list, select a region such as *West US 2*.
-1. Select **Next: Resource**. The Resource screen will appear.
-
-## Assign the private endpoint to a resource
-
-1. From the **Connection methods** radio buttons, select the *Connect to an Azure resource in my directory* radio button.
-1. From the **Resource type** dropdown list, select *Microsoft.Media/mediaservices*.
-1. From the **Resource** dropdown list, select the Media Services account you created.
-1. From the **Target sub-resource** dropdown list, select the Streaming endpoint you created.
-
-## Deploy the private endpoint to the virtual network
-
-1. From the **Virtual network** dropdown list, select the virtual network you created.
-1. From the **Subnet** dropdown list, select the subnet you want to work with.
-1. Stay on this screen.
-
-## Create DNS zones to use with the private endpoint
-
-To use the streaming endpoint inside your virtual network, create private DNS zones. You can use the same DNS name and get back the private IP address of the streaming endpoint.
-
-1. On the same screen, for the **media-azure-net** configuration, select the resource group you created from the **Resource group** dropdown list.
-1. For the **privatelink-media-azure-net** configuration, select the same resource group from the **Resource group** dropdown list.
-1. Select **Next: Tags**. If you want to add tags to your resources, do that here.
-1. Select **Next: Review + create**. The Review + create screen will appear.
-1. Review your settings and make sure they're correct.
-1. Select **Create**. The private endpoint deployment screen appears.
-
-While the deployment is in progress, it's also creating an [Azure Resource Manager (ARM) template](../../azure-resource-manager/templates/overview.md). You can use ARM templates to automate deployment. To see the template, select **Template** from the menu.
-
-## Clean up resources
-
-If you aren't planning to use the resources created in this exercise, simply delete the resource group. If you don't delete the resources, you will be charged for them.
media-services Security Rbac Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-rbac-concept.md
- Title: Role-based access control for Media Services accounts
-description: This article discusses Azure role-based access control (Azure RBAC) for Azure Media Services accounts.
- Previously updated: 08/31/2020
-# Azure role-based access control (Azure RBAC) for Media Services accounts
--
-Currently, Azure Media Services does not define any custom roles specific to the service. To get full access to the Media Services account, customers can use the built-in **Owner** or **Contributor** roles. The main difference between these roles is that the **Owner** can control who has access to a resource and the **Contributor** cannot. The built-in **Reader** role can also be used, but the user or application will only have read access to the Media Services APIs.
-
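-For example, a sketch of granting the **Contributor** role on a Media Services account, where the assignee and resource IDs are placeholders:
-
-```azurecli-interactive
-# Grant Contributor on the Media Services account to a user or service principal.
-az role assignment create \
-  --assignee <user-or-principal-object-id> \
-  --role "Contributor" \
-  --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Media/mediaservices/<account-name>
-```
-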
-## Design principles
-
-One of the key design principles of the v3 API is to make the API more secure. v3 APIs do not return secrets or credentials on **Get** or **List** operations. The keys are always null, empty, or sanitized from the response. The user needs to call a separate action method to get secrets or credentials. The **Reader** role cannot call operations like `Asset.ListContainerSas`, `StreamingLocator.ListContentKeys`, or `ContentKeyPolicies.GetPolicyPropertiesWithSecrets`. Having separate actions enables you to set more granular Azure RBAC security permissions in a custom role if desired.
-
-To list the operations that Media Services supports, run:
-
-```csharp
-foreach (Microsoft.Azure.Management.Media.Models.Operation a in client.Operations.List())
-{
- Console.WriteLine($"{a.Name} - {a.Display.Operation} - {a.Display.Description}");
-}
-```
-
-The [built-in role definitions](../../role-based-access-control/built-in-roles.md) article tells you exactly what the role grants.
-
-See the following articles for more information:
-
-- [Classic subscription administrator roles, Azure roles, and Azure AD roles](../../role-based-access-control/rbac-and-directory-admin-roles.md)
-- [What is Azure role-based access control (Azure RBAC)?](../../role-based-access-control/overview.md)
-- [Add or remove Azure role assignments using the REST API](../../role-based-access-control/role-assignments-rest.md)
-- [Media Services resource provider operations](../../role-based-access-control/resource-provider-operations.md#microsoftmedia)
-
-## Next steps
-
-- [Developing with Media Services v3 APIs](media-services-apis-overview.md)
-- [Get content key policy using Media Services .NET](drm-get-content-key-policy-how-to.md)
media-services Security Storage Roll Access Keys How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-storage-roll-access-keys-how-to.md
- Title: Update Media Services v3 after rolling storage access keys | Microsoft Docs
-description: This article gives you guidance on how to update Media Services v3 after rolling storage access keys.
- Previously updated: 03/22/2021
-# Update Media Services v3 after rolling storage access keys
--
-You're asked to select an Azure Storage account when you create a new Azure Media Services (AMS) account. You can add more than one storage account to your Media Services account. This article shows how to rotate storage keys. It also shows how to add storage accounts to a media account.
-
-To complete the actions described in this article, you should be using [Azure Resource Manager APIs](/rest/api/medi).
--
-## Storage access key generation
-
-When a new storage account is created, Azure generates two 512-bit storage access keys that are used to authenticate access to your storage account. To keep your storage connections more secure, periodically regenerate and rotate your storage access keys. Two access keys (primary and secondary) are provided so you can maintain connections to the storage account using one access key while you regenerate the other. This procedure is also called "rolling access keys".
-
-Media Services depends on a storage key provided to it. Specifically, the locators that are used to stream or download your assets depend on the specified storage access key. When an AMS account is created, it takes a dependency on the primary storage access key by default. However, as a user you can update the storage key that AMS has. You must let Media Services know which key to use by following these steps:
-
->[!NOTE]
-> If you have multiple storage accounts, you would perform this procedure with each storage account. The order in which you rotate storage keys is not fixed. You can rotate the secondary key first and then the primary key or vice versa.
->
-> Before executing the steps on a production account, make sure to test them on a pre-production account.
->
-
-## Steps to rotate storage keys
-
- 1. Change the storage account primary key through the PowerShell cmdlet or the [Azure portal](https://portal.azure.com/).
- 2. Call the `Sync-AzMediaServiceStorageKeys` cmdlet with appropriate parameters to force the media account to pick up the storage account keys.
-
- The following example shows how to sync keys to storage accounts.
-
- `Sync-AzMediaServiceStorageKeys -ResourceGroupName $resourceGroupName -AccountName $mediaAccountName -StorageAccountId $storageAccountId`
-
- 3. Wait an hour or so. Verify the streaming scenarios are working.
- 4. Change the storage account secondary key through the PowerShell cmdlet or Azure portal.
- 5. Call `Sync-AzMediaServiceStorageKeys` PowerShell with appropriate parameters to force the media account to pick up new storage account keys.
- 6. Wait an hour or so. Verify the streaming scenarios are working.
-
-### A PowerShell cmdlet example
-
-The following example demonstrates how to build the storage account resource ID and sync the storage keys with the AMS account.
-
-```powershell
-$subscriptionId = "<your-subscription-id>"
-$regionName = "West US"
-$resourceGroupName = "SkyMedia-USWest-App"
-$mediaAccountName = "sky"
-$storageAccountName = "skystorage"
-$storageAccountId = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Storage/storageAccounts/$storageAccountName"
-
-Sync-AzMediaServiceStorageKeys -ResourceGroupName $resourceGroupName -AccountName $mediaAccountName -StorageAccountId $storageAccountId
-```
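-
-To regenerate the storage keys themselves (steps 1 and 4), you can use the portal, PowerShell, or the Azure CLI; a sketch with the CLI:
-
-```azurecli-interactive
-# Regenerate the primary key (use key2 for the secondary key).
-az storage account keys renew \
-  --resource-group <your-resource-group> \
-  --account-name <your-storage-account> \
-  --key key1
-```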
-
-## Steps to add storage accounts to your AMS account
-
-The following article shows how to add storage accounts to your AMS account: [Attach multiple storage accounts to a Media Services account](storage-managing-multiple-storage-accounts-how-to.md).
media-services Security Trusted Storage Rest Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/security-trusted-storage-rest-tutorial.md
- Title: Azure Media Services Trusted Storage
-description: In this tutorial, you'll learn how to enable trusted storage for Azure Media Services, use Managed Identities for trusted storage, and give Azure services access to a storage account when using a firewall or VPN.
- Previously updated: 2/8/2021
-# Tutorial: Media Services trusted storage
-
-In this tutorial, you'll learn:
-
-> [!div class="checklist"]
-> - How to enable trusted storage for Azure Media Services
-> - How to use Managed Identities for trusted storage
-> - How to give Azure services access to a storage account when using network access control such as a firewall or VPN
-
-With the 2020-05-01 API, you can enable trusted storage by associating a Managed Identity with a Media Services account.
-
->[!NOTE]
->Trusted storage is only available in the API, and is not currently enabled in the Azure portal.
-
-Media Services can automatically access your storage account using system authentication. Media Services validates that the Media Services account and the storage account are in the same subscription. It also validates that the user adding the association has access to the storage account with Azure Resource Manager RBAC.
-
-However, if you want to use network access control to secure your storage account and enable trusted storage, [Managed Identities](concept-managed-identities.md) authentication is required. It allows Media Services to access the storage account that has been configured with a firewall or a VNet restriction through trusted storage access.
-
-## Overview
-
-> [!IMPORTANT]
-> Use the 2020-05-01 API for all requests to Media Services.
-
-These are the general steps for creating trusted storage for Media Services:
-
-1. Create a resource group.
-1. Create a storage account.
-1. Poll the storage account until it's ready. When it is, create the Media Services account with a system-assigned managed identity; the response will include the service principal ID.
-1. Find the ID of the *Storage Blob Data Contributor* role.
-1. Call the authorization provider and add a role assignment.
-1. Update the media services account to authenticate to the storage account using Managed Identity.
-1. Delete the resources if you don't want to continue to use them and be charged for them.
-
-
-## Prerequisites
-
-You need an Azure subscription to get started. If you don't have an Azure subscription, [create a free trial account](https://azure.microsoft.com/free/).
-
-### Get your tenant ID and subscription ID
-
-If you don't know how to get your tenant ID and subscription ID, see [How to find your tenant ID](setup-azure-tenant-how-to.md).
-
-### Create a service principal and secret
-
-If you don't know how to create a service principal and secret, see [Get credentials to access Media Services API](access-api-howto.md).
-
-## Use a REST client
-
-This script is intended for use with a REST client, such as the REST Client extension for Visual Studio Code. Adapt it for your development environment.
-
-## Set initial variables
-
-This part of the script is for use in a REST client. You may use variables differently within your development environment.
-
-```rest
-### AAD details
-@tenantId = your tenant ID
-@servicePrincipalId = the service principal ID
-@servicePrincipalSecret = the service principal secret
-
-### AAD resources
-@armResource = https%3A%2F%2Fmanagement.core.windows.net%2F
-@graphResource = https%3A%2F%2Fgraph.windows.net%2F
-@storageResource = https%3A%2F%2Fstorage.azure.com%2F
-
-### Service endpoints
-@armEndpoint = management.azure.com
-@graphEndpoint = graph.windows.net
-@aadEndpoint = login.microsoftonline.com
-
-### ARM details
-@subscription = your subscription id
-@resourceGroup = the resource group you'll be creating
-@storageName = the name of the storage you'll be creating
-@accountName = the name of the account you'll be creating
-@resourceLocation = East US (or the location that works best for your region)
-```
-
-## Get a token for Azure Resource Manager
-
-```rest
-// @name getArmToken
-POST https://{{aadEndpoint}}/{{tenantId}}/oauth2/token
-Accept: application/json
-Content-Type: application/x-www-form-urlencoded
-
-resource={{armResource}}&client_id={{servicePrincipalId}}&client_secret={{servicePrincipalSecret}}&grant_type=client_credentials
-```
-
-## Get a token for the Graph API
-
-This part of the script is for use in a REST client. You may use variables differently within your development environment.
-
-```rest
-// @name getGraphToken
-POST https://{{aadEndpoint}}/{{tenantId}}/oauth2/token
-Accept: application/json
-Content-Type: application/x-www-form-urlencoded
-
-resource={{graphResource}}&client_id={{servicePrincipalId}}&client_secret={{servicePrincipalSecret}}&grant_type=client_credentials
-```
-
-## Get the service principal details
-
-```rest
-// @name getServicePrincipals
-GET https://{{graphEndpoint}}/{{tenantId}}/servicePrincipals?$filter=appId%20eq%20'{{servicePrincipalId}}'&api-version=1.6
-x-ms-client-request-id: cae3e4f7-17a0-476a-a05a-0dab934ba959
-Authorization: Bearer {{getGraphToken.response.body.access_token}}
-```
-
-## Store the service principal ID
-
-```rest
-@servicePrincipalObjectId = {{getServicePrincipals.response.body.value[0].objectId}}
-```
-
-## Create a resource group
-
-```rest
-// @name createResourceGroup
-PUT https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}
- ?api-version=2016-09-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-Content-Type: application/json; charset=utf-8
-
-{
- "location": "{{resourceLocation}}"
-}
-```
-
-## Create storage account
-
-```rest
-// @name createStorageAccount
-PUT https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Storage/storageAccounts/{{storageName}}
- ?api-version=2019-06-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-Content-Type: application/json; charset=utf-8
-
-{
- "sku": {
- "name": "Standard_GRS"
- },
- "kind": "StorageV2",
- "location": "{{resourceLocation}}",
- "properties": {
- }
-}
-```
-
-## Get the storage account status
-
-The storage account will take a while to be ready, so this request polls for its status. Repeat this request until the storage account is ready.
-
-```rest
-// @name getStorageAccountStatus
-GET {{createStorageAccount.response.headers.Location}}
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-```
-
-## Get the storage account details
-
-When the storage account is ready, get the properties of the storage account.
-
-```rest
-// @name getStorageAccount
-GET https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Storage/storageAccounts/{{storageName}}
- ?api-version=2019-06-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-```
-
-## Get a token for Azure Storage
-
-```rest
-// @name getStorageToken
-POST https://{{aadEndpoint}}/{{tenantId}}/oauth2/token
-Accept: application/json
-Content-Type: application/x-www-form-urlencoded
-
-resource={{storageResource}}&client_id={{servicePrincipalId}}&client_secret={{servicePrincipalSecret}}&grant_type=client_credentials
-```
-
-## Create a Media Services account with a system-assigned managed identity
-
-```rest
-// @name createMediaServicesAccount
-PUT https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Media/mediaservices/{{accountName}}?api-version=2020-05-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-Content-Type: application/json; charset=utf-8
-
-{
- "identity": {
- "type": "SystemAssigned"
- },
- "properties": {
- "storageAccounts": [
- {
- "id": "{{getStorageAccountStatus.response.body.id}}"
- }
- ],
- "encryption": {
- "type": "SystemKey"
- }
- },
- "location": "{{resourceLocation}}"
-}
-```
-
-## Get the Storage Blob Data Contributor role definition
-
-```rest
-// @name getStorageBlobDataContributorRoleDefinition
-GET https://management.azure.com/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Storage/storageAccounts/{{storageName}}/providers/Microsoft.Authorization/roleDefinitions?$filter=roleName%20eq%20%27Storage%20Blob%20Data%20Contributor%27&api-version=2015-07-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-```
-
-### Set the storage role assignment
-
-The role assignment gives the service principal for the Media Services account the *Storage Blob Data Contributor* role on the storage account. The assignment may take a while to propagate; it's important to wait for it to complete, or the Media Services account won't be set up correctly.
-
-```rest
-PUT https://management.azure.com/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Storage/storageAccounts/{{storageName}}/providers/Microsoft.Authorization/roleAssignments/{{$guid}}?api-version=2020-04-01-preview
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-Content-Type: application/json; charset=utf-8
-
-{
- "properties": {
- "roleDefinitionId": "/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Storage/storageAccounts/{{storageName}}/providers/Microsoft.Authorization/roleDefinitions/{{getStorageBlobDataContributorRoleDefinition.response.body.value[0].name}}",
- "principalId": "{{createMediaServicesAccount.response.body.identity.principalId}}"
- }
-}
-```
-
-## Give Managed Identity bypass access to the storage account
-
-This action switches storage access over to the Media Services account's managed identity. With the **AzureServices** bypass in place, the Media Services account can reach the storage account through its firewall, because trusted Azure services are allowed through regardless of the IP access rules (ACLs).
-
-Again, wait until the role has been assigned in the storage account, or the Media Services account will be set up incorrectly.
-
-```rest
-// @name setStorageAccountFirewall
-PUT https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Storage/storageAccounts/{{storageName}}
- ?api-version=2019-06-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-Content-Type: application/json; charset=utf-8
-
-{
- "sku": {
- "name": "Standard_GRS"
- },
- "kind": "StorageV2",
- "location": "{{resourceLocation}}",
- "properties": {
- "minimumTlsVersion": "TLS1_2",
- "networkAcls": {
- "bypass": "AzureServices",
- "virtualNetworkRules": [],
- "ipRules": [],
- "defaultAction": "Deny"
- }
- }
-}
-```
-
-## Update the Media Services account to use the Managed Identity
-
-This request may need to be retried a few times as the storage role assignment can take a few minutes to propagate.
-
-```rest
-// @name updateMediaServicesAccountWithManagedStorageAuth
-PUT https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Media/mediaservices/{{accountName}}?api-version=2020-05-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-Content-Type: application/json; charset=utf-8
-
-{
- "identity": {
- "type": "SystemAssigned"
- },
- "properties": {
- "storageAccounts": [
- {
- "id": "{{getStorageAccountStatus.response.body.id}}"
- }
- ],
- "storageAuthentication": "ManagedIdentity",
- "encryption": {
- "type": "SystemKey"
- }
- },
- "location": "{{resourceLocation}}"
-}
-```
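-
-If you're scripting this step, wrap the request in a simple retry loop. The following is a minimal C# sketch, assuming a delegate that sends the `PUT` request above; all names here are illustrative.
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Threading.Tasks;
-
-public static class RetryHelper
-{
-    public static async Task<HttpResponseMessage> RetryUntilSuccessAsync(
-        Func<Task<HttpResponseMessage>> sendRequest,
-        int maxAttempts = 10)
-    {
-        for (int attempt = 1; attempt <= maxAttempts; attempt++)
-        {
-            HttpResponseMessage response = await sendRequest();
-            if (response.IsSuccessStatusCode)
-            {
-                return response;
-            }
-
-            // The storage role assignment can take a few minutes to
-            // propagate, so wait before sending the request again.
-            await Task.Delay(TimeSpan.FromSeconds(30));
-        }
-
-        throw new InvalidOperationException(
-            $"Request did not succeed after {maxAttempts} attempts.");
-    }
-}
-```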
-
-## Test access
-
-Test access by creating an asset in the storage account.
-
-```rest
-// @name createAsset
-PUT https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Media/mediaservices/{{accountName}}/assets/testasset{{index}}withoutmi?api-version=2018-07-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-Content-Type: application/json; charset=utf-8
-
-{
-}
-```
-
-## Delete resources
-
-If you don't want to keep the resources that you created, delete them so that you aren't charged for them.
-
-```rest
-### Clean up the Storage account
-DELETE https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Storage/storageAccounts/{{storageName}}
- ?api-version=2019-06-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-
-### Clean up the Media Services account
-DELETE https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Media/mediaservices/{{accountName}}?api-version=2020-05-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-
-### Verify that the Media Services account is deleted
-GET https://{{armEndpoint}}/subscriptions/{{subscription}}/resourceGroups/{{resourceGroup}}/providers/Microsoft.Media/mediaservices/{{accountName}}?api-version=2020-05-01
-Authorization: Bearer {{getArmToken.response.body.access_token}}
-```
-
-## Next steps
-
-[How to create an Asset](asset-create-asset-how-to.md)
media-services Setup Azure Subscription How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/setup-azure-subscription-how-to.md
- Title: How to find your Azure subscription
-description: Find your Azure subscription so you can set up your environment.
- Previously updated: 08/31/2020
-# Find your Azure subscription
---
-## [Portal](#tab/portal/)
-
-## Use the Azure portal
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Under the Azure services heading, select Subscriptions. (If no subscriptions are listed, you may need to switch Azure AD tenants.) Your Subscription IDs are listed in the second column.
-1. Copy the Subscription ID and paste it into a text document of your choice for use later.
-
-## [CLI](#tab/cli/)
-
-## Use the Azure CLI
-
-<!-- NOTE: The following are in the includes file and are reused in other How To articles. All task based content should be in the includes folder with the "task-" prepended to the file name. -->
-
-### List your Azure subscriptions with CLI
--
-### See also
-
-* [Azure CLI](/cli/azure/ams)
---
-## Next steps
-
-[Stream a file](stream-files-dotnet-quickstart.md)
media-services Setup Azure Tenant How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/setup-azure-tenant-how-to.md
- Title: How to find your Azure tenant ID
-description: Find your Azure tenant so you can set up your environment.
- Previously updated: 02/08/2021
-# How to find your Azure tenant ID
---
-## Find your tenant ID in the portal
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Select the Azure Active Directory icon under Azure services. The Active Directory page will open.
-1. Your tenant ID is located under **Basic Information**.
-1. Copy the tenant ID and paste it into a text document of your choice for use later.
-
-## Next steps
-
-[Stream a file](stream-files-dotnet-quickstart.md)
media-services Setup Postman Rest How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/setup-postman-rest-how-to.md
- Title: Configure Postman for Azure Media Services v3 REST API
-description: This article shows you how to configure Postman so it can be used to call Azure Media Services (AMS) REST APIs.
- Previously updated: 08/31/2020
-# Configure Postman for Media Services v3 REST API calls
--
-This article shows you how to configure **Postman** so it can be used to call Azure Media Services (AMS) REST APIs. Postman is provided here as a learning tool and isn't recommended for production applications. Production applications should use the supported client SDKs, which have built-in Azure Resource Manager retry policies.
--
-The article shows how to import environment and collection files into **Postman**. The collection contains grouped definitions of HTTP requests that call Azure Media Services (AMS) REST APIs. The environment file contains variables that are used by the collection.
-
-Before you start developing, review [Developing with Media Services v3 APIs](media-services-apis-overview.md).
-
-## Prerequisites
-- [Create a Media Services account](./account-create-how-to.md). Make sure to remember the resource group name and the Media Services account name.
-- Get information needed to [access APIs](./access-api-howto.md).
-- Install the [Postman](https://www.getpostman.com/) REST client to execute the REST APIs shown in some of the AMS REST tutorials.
-
- This article uses **Postman**, but any REST tool would be suitable. Alternatives include **Visual Studio Code** with the REST plugin and **Telerik Fiddler**.
-
-> [!IMPORTANT]
-> Review [naming conventions](media-services-apis-overview.md#naming-conventions).
-
-## Download Postman files
-
-Clone a GitHub repository that contains the Postman collection and environment files.
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-rest-postman.git
- ```
-
-## Configure Postman
-
-### Configure the environment
-
-1. Open the **Postman** app.
-2. On the right of the screen, select the **Manage environment** option.
-
- ![Manage env](./media/develop-with-postman/postman-import-env.png)
-3. From the **Manage environment** dialog, select **Import**.
-4. Browse to the `Azure Media Service v3 Environment.postman_environment.json` file that was downloaded when you cloned `https://github.com/Azure-Samples/media-services-v3-rest-postman.git`.
-5. The **Azure Media Service v3 Environment** environment is added.
-
- > [!Note]
- > Update access variables with values you got from the **Access the Media Services API** section above.
-
-6. Double-click the selected file and enter the values that you got by following the steps to access the API.
-7. Close the dialog.
-8. Select the **Azure Media Service v3 Environment** environment from the dropdown.
-
- ![Choose env](./media/develop-with-postman/choose-env.png)
-
-### Configure the collection
-
-1. Select **Import** to import the collection file.
-2. Browse to the `Media Services v3.postman_collection.json` file that was downloaded when you cloned `https://github.com/Azure-Samples/media-services-v3-rest-postman.git`.
-3. Choose the **Media Services v3.postman_collection.json** file.
-
- ![Import a file](./media/develop-with-postman/postman-import-collection.png)
-
-## Get Azure AD Token
-
-Before you start manipulating AMS v3 resources, you need to get and set an Azure AD token for service principal authentication.
-
-1. In the left window of the Postman app, select "Step 1: Get AAD Auth token".
-2. Then, select "Get Azure AD Token for Service Principal Authentication".
-3. Press **Send**.
-
- The following **POST** operation is sent.
-
- ```
- https://login.microsoftonline.com/:tenantId/oauth2/token
- ```
-
-4. The response comes back with the token and sets the "AccessToken" environment variable to the token value. (A scripted equivalent is sketched after these steps.)
-
- ![Get AAD token](./media/develop-with-postman/postman-get-aad-auth-token.png)
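-
-If you'd rather acquire the token outside Postman, the request can be scripted. The following is a minimal C# sketch of the same client-credentials request, assuming a service principal; the class and variable names are illustrative, and the `resource` value shown is the ARM audience commonly used with the v1 `/oauth2/token` endpoint.
-
-```csharp
-using System.Collections.Generic;
-using System.Net.Http;
-using System.Threading.Tasks;
-
-public static class AadTokenClient
-{
-    public static async Task<string> GetTokenResponseAsync(
-        string tenantId, string clientId, string clientSecret)
-    {
-        using var http = new HttpClient();
-
-        // Same form fields that Postman sends for service principal auth.
-        var body = new FormUrlEncodedContent(new Dictionary<string, string>
-        {
-            ["grant_type"] = "client_credentials",
-            ["client_id"] = clientId,
-            ["client_secret"] = clientSecret,
-            ["resource"] = "https://management.core.windows.net/"
-        });
-
-        HttpResponseMessage response = await http.PostAsync(
-            $"https://login.microsoftonline.com/{tenantId}/oauth2/token", body);
-        response.EnsureSuccessStatusCode();
-
-        // The JSON response contains the token in its "access_token" field.
-        return await response.Content.ReadAsStringAsync();
-    }
-}
-```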
-
-## Troubleshooting
-
-* If your application fails with "HTTP 504: Gateway Timeout", make sure that the location variable has not been explicitly set to a value other than the expected location of the Media Services account.
-* If you get an "account not found" error, also check to make sure that the location property in the Body JSON message is set to the location that the Media Services account is in.
-
-## See also
-- [Create filters with Media Services - REST](filters-dynamic-manifest-rest-howto.md)
-- [Azure Resource Manager based REST API](https://github.com/Azure-Samples/media-services-v3-arm-templates)
-## Next steps
-- [Stream files with REST](stream-files-tutorial-with-rest.md)
-- [Tutorial: Encode a remote file based on URL and stream the video - REST](stream-files-tutorial-with-rest.md)
media-services Signal Descriptive Audio Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/signal-descriptive-audio-howto.md
- Title: Signal descriptive audio tracks with Media Services v3
-description: Follow the steps of this tutorial to upload a file, encode the video, add descriptive audio tracks, and stream your content with Media Services v3.
- Previously updated: 03/09/2022
-# Signal descriptive audio tracks
--
-You can add a narration track to your video to help visually impaired users follow the video recording by listening to the narration. In Media Services v3, you signal descriptive audio tracks by annotating the audio track in the manifest file.
-
-This article shows how to encode a video, upload an audio-only MP4 file (AAC codec) containing descriptive audio into the output asset, and edit the .ism file to include the descriptive audio.
-
-## Prerequisites
-- [Create a Media Services account](./account-create-how-to.md).
-- Follow the steps in [Access Azure Media Services API with the Azure CLI](./access-api-howto.md) and save the credentials. You'll need to use them to access the API.
-- Review [Dynamic packaging](encode-dynamic-packaging-concept.md).
-- Review the [Upload, encode, and stream videos](stream-files-tutorial-with-api.md) tutorial.
-## [.NET](#tab/net/)
-
-## Create an input asset and upload a local file into it
-
-The **CreateInputAsset** function creates a new input [Asset](/rest/api/media/assets) and uploads the specified local video file into it. This **Asset** is used as the input to your encoding Job. In Media Services v3, the input to a **Job** can either be an **Asset**, or it can be content that you make available to your Media Services account via HTTPS URLs.
-
-If you want to learn how to encode from an HTTPS URL, see [this article](job-input-from-http-how-to.md).
-
-In Media Services v3, you use Azure Storage APIs to upload files. The following .NET snippet shows how.
-
-The following function performs these actions:
-
-* Creates an **Asset**
-* Gets a writable [SAS URL](../../storage/common/storage-sas-overview.md) to the asset's [container in storage](../../storage/blobs/storage-quickstart-blobs-dotnet.md#upload-a-blob-to-a-container)
-* Uploads the file into the container in storage using the SAS URL
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#CreateInputAsset)]
-
-If you need to pass the name of the created input asset to other methods, make sure to use the `Name` property on the asset object returned from `CreateInputAssetAsync`, for example, `inputAsset.Name`.
-
-## Create an output asset to store the result of the encoding job
-
-The output [Asset](/rest/api/media/assets) stores the result of your encoding job. The following function shows how to create an output asset.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#CreateOutputAsset)]
-
-If you need to pass the name of the created output asset to other methods, make sure to use the `Name` property on the asset object returned from `CreateOutputAssetAsync`, for example, `outputAsset.Name`.
-
-In the case of this article, pass the `outputAsset.Name` value to the `SubmitJobAsync` and `UploadAudioIntoOutputAsset` functions.
-
-## Create a transform and a job that encodes the uploaded file
-
-When encoding or processing content in Media Services, it's a common pattern to set up the encoding settings as a recipe. You would then submit a **Job** to apply that recipe to a video. By submitting new jobs for each new video, you're applying that recipe to all the videos in your library. A recipe in Media Services is called a **Transform**. For more information, see [Transforms and Jobs](./transform-jobs-concept.md). The sample described in this tutorial defines a recipe that encodes the video in order to stream it to a variety of iOS and Android devices.
-
-The following example creates a transform (if one does not exist).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#EnsureTransformExists)]
-
-The following function submits a job.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#SubmitJob)]
-
-## Wait for the job to complete
-
-The job takes some time to complete, and you'll want to be notified when it does. We recommend using Event Grid to wait for the job to complete.
-
-The job usually goes through the following states: **Scheduled**, **Queued**, **Processing**, **Finished** (the final state). If the job has encountered an error, you get the **Error** state. If the job is in the process of being canceled, you get **Canceling** and **Canceled** when it is done.
-
-For more information, see [Handling Event Grid events](monitoring/reacting-to-media-services-events.md).
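-
-If you don't want to set up Event Grid, a simple polling loop also works for development. The following is a minimal sketch using the same v3 .NET SDK client as the rest of this article; the method name and polling interval are illustrative.
-
-```csharp
-private static async Task<Job> WaitForJobToFinishAsync(
-    IAzureMediaServicesClient client,
-    string resourceGroupName,
-    string accountName,
-    string transformName,
-    string jobName)
-{
-    Job job;
-    do
-    {
-        // Re-read the job to pick up its latest state.
-        job = await client.Jobs.GetAsync(
-            resourceGroupName, accountName, transformName, jobName);
-
-        if (job.State != JobState.Finished &&
-            job.State != JobState.Error &&
-            job.State != JobState.Canceled)
-        {
-            // Still Scheduled, Queued, Processing, or Canceling; wait and retry.
-            await Task.Delay(TimeSpan.FromSeconds(20));
-        }
-    } while (job.State != JobState.Finished &&
-             job.State != JobState.Error &&
-             job.State != JobState.Canceled);
-
-    return job;
-}
-```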
-
-## Upload the audio-only MP4 file
-
-Upload the additional audio-only MP4 file (AAC codec) containing descriptive audio into the output asset.
-
-```csharp
-private static async Task UploadAudioIntoOutputAsset(
- IAzureMediaServicesClient client,
- string resourceGroupName,
- string accountName,
- string outputAssetName,
- string fileToUpload)
-{
- // Use the Assets.Get method to get the existing output asset.
- // In Media Services v3, the Get method on entities returns null
- // if the entity doesn't exist (a case-insensitive check on the name).
- Asset asset = await client.Assets.GetAsync(resourceGroupName, accountName, outputAssetName);
-
- if (asset != null)
- {
- // Use Media Services API to get back a response that contains
- // SAS URL for the Asset container into which to upload blobs.
- // That is where you would specify read-write permissions
- // and the expiration time for the SAS URL.
- var response = await client.Assets.ListContainerSasAsync(
- resourceGroupName,
- accountName,
- outputAssetName,
- permissions: AssetContainerPermission.ReadWrite,
- expiryTime: DateTime.UtcNow.AddHours(4).ToUniversalTime());
-
- var sasUri = new Uri(response.AssetContainerSasUrls.First());
-
- // Use Storage API to get a reference to the Asset container
- // that was created by calling Asset's CreateOrUpdate method.
- CloudBlobContainer container = new CloudBlobContainer(sasUri);
- var blob = container.GetBlockBlobReference(Path.GetFileName(fileToUpload));
-
- // Use the Storage API to upload the file into the container in storage.
- await blob.UploadFromFileAsync(fileToUpload);
- }
-}
-```
-
-Here is an example of a call to the `UploadAudioIntoOutputAsset` function:
-
-```csharp
-await UploadAudioIntoOutputAsset(client, config.ResourceGroup, config.AccountName, outputAsset.Name, "audio_description.m4a");
-```
-
-## Edit the .ism file
-
-When your encoding job is done, the output asset will contain the files generated by the encoding job.
-
-1. In the Azure portal, navigate to the storage account associated with your Media Services account.
-1. Find the container with the name of your output asset.
-1. In the container, find the .ism file and click **Edit blob** (in the right window).
-1. Edit the .ism file by adding the information about the uploaded audio-only MP4 file (AAC codec) containing descriptive audio and press **Save** when done.
-
- To signal the descriptive audio tracks, you need to add "accessibility" and "role" parameters to the .ism file. It's your responsibility to set these parameters correctly to signal an audio track as audio description. For example, add `<param name="accessibility" value="description" />` and `<param name="role" value="alternate" />` to the .ism file for a specific audio track, as shown in the following example.
-
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
- <head>
- <meta name="clientManifestRelativePath" content="ignite.ismc" />
- <meta name="formats" content="mp4-v3" />
- </head>
- <body>
- <switch>
- <audio src="ignite_320x180_AACAudio_381.mp4" systemBitrate="128041" systemLanguage="eng">
- <param name="systemBitrate" value="128041" valuetype="data" />
- <param name="trackID" value="2" valuetype="data" />
- <param name="trackName" value="aac_eng_2_128041_2_1" valuetype="data" />
- <param name="systemLanguage" value="eng" valuetype="data" />
- <param name="trackIndex" value="ignite_320x180_AACAudio_381_2.mpi" valuetype="data" />
- </audio>
- <audio src="audio_description.m4a" systemBitrate="194000" systemLanguage="eng">
- <param name="trackName" value="aac_eng_audio_description" />
- <param name="accessibility" value="description" />
- <param name="role" value="alternate" />
- </audio>
- <video src="ignite_1280x720_AACAudio_3549.mp4" systemBitrate="3549855">
- <param name="systemBitrate" value="3549855" valuetype="data" />
- <param name="trackID" value="1" valuetype="data" />
- <param name="trackName" value="video" valuetype="data" />
- <param name="trackIndex" value="ignite_1280x720_AACAudio_3549_1.mpi" valuetype="data" />
- </video>
- <video src="ignite_960x540_AACAudio_2216.mp4" systemBitrate="2216764">
- <param name="systemBitrate" value="2216764" valuetype="data" />
- <param name="trackID" value="1" valuetype="data" />
- <param name="trackName" value="video" valuetype="data" />
- <param name="trackIndex" value="ignite_960x540_AACAudio_2216_1.mpi" valuetype="data" />
- </video>
- <video src="ignite_640x360_AACAudio_1154.mp4" systemBitrate="1154569">
- <param name="systemBitrate" value="1154569" valuetype="data" />
- <param name="trackID" value="1" valuetype="data" />
- <param name="trackName" value="video" valuetype="data" />
- <param name="trackIndex" value="ignite_640x360_AACAudio_1154_1.mpi" valuetype="data" />
- </video>
- <video src="ignite_480x270_AACAudio_721.mp4" systemBitrate="721893">
- <param name="systemBitrate" value="721893" valuetype="data" />
- <param name="trackID" value="1" valuetype="data" />
- <param name="trackName" value="video" valuetype="data" />
- <param name="trackIndex" value="ignite_480x270_AACAudio_721_1.mpi" valuetype="data" />
- </video>
- <video src="ignite_320x180_AACAudio_381.mp4" systemBitrate="381027">
- <param name="systemBitrate" value="381027" valuetype="data" />
- <param name="trackID" value="1" valuetype="data" />
- <param name="trackName" value="video" valuetype="data" />
- <param name="trackIndex" value="ignite_320x180_AACAudio_381_1.mpi" valuetype="data" />
- </video>
- </switch>
- </body>
-</smil>
-```
-
-## Get a streaming locator
-
-After the encoding is complete, the next step is to make the video in the output Asset available to clients for playback. You can accomplish this in two steps: first, create a [Streaming Locator](/rest/api/media/streaminglocators), and second, build the streaming URLs that clients can use.
-
-The process of creating a **Streaming Locator** is called publishing. By default, the **Streaming Locator** is valid immediately after you make the API calls, and lasts until it is deleted, unless you configure the optional start and end times.
-
-When creating a [StreamingLocator](/rest/api/media/streaminglocators), you will need to specify the desired **StreamingPolicyName**. In this example, you will be streaming in-the-clear (or non-encrypted content) so the predefined clear streaming policy (**PredefinedStreamingPolicy.ClearStreamingOnly**) is used.
-
-> [!IMPORTANT]
-> When using a custom [Streaming Policy](/rest/api/media/streamingpolicies), you should design a limited set of such policies for your Media Service account, and re-use them for your StreamingLocators whenever the same encryption options and protocols are needed. Your Media Service account has a quota for the number of Streaming Policy entries. You should not be creating a new Streaming Policy for each Streaming Locator.
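-
-For illustration, the following is a hedged sketch of creating a locator with an optional availability window, using the same v3 .NET SDK as the rest of this article; the variable names are illustrative, and the `StartTime`/`EndTime` properties can be omitted for a locator that's valid immediately and indefinitely.
-
-```csharp
-StreamingLocator locator = await client.StreamingLocators.CreateAsync(
-    resourceGroupName,
-    accountName,
-    locatorName,
-    new StreamingLocator
-    {
-        AssetName = outputAssetName,
-        StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly,
-        // Optional: constrain when the locator is usable. Without these,
-        // the locator is valid immediately and lasts until it's deleted.
-        StartTime = DateTime.UtcNow,
-        EndTime = DateTime.UtcNow.AddDays(30)
-    });
-```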
-
-The following code assumes that you are calling the function with a unique locatorName.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#CreateStreamingLocator)]
-
-While the sample in this topic discusses streaming, you can use the same call to create a Streaming Locator for delivering video via progressive download.
-
-### Get streaming URLs
-
-Now that the [Streaming Locator](/rest/api/media/streaminglocators) has been created, you can get the streaming URLs, as shown in **GetStreamingURLs**. To build a URL, you need to concatenate the [Streaming Endpoint](/rest/api/media/streamingendpoints) host name and the **Streaming Locator** path. In this sample, the *default* **Streaming Endpoint** is used. When you first create a Media Service account, this *default* **Streaming Endpoint** will be in a stopped state, so you need to call **Start**.
-
-> [!NOTE]
-> In this method, you need the locatorName that was used when creating the **Streaming Locator** for the output Asset.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#GetStreamingURLs)]
-
-## Test with Azure Media Player
-
-To test the stream, this article uses Azure Media Player.
-
-> [!NOTE]
-> If a player is hosted on an https site, make sure to update the URL to "https".
-
-1. Open a web browser and navigate to [https://aka.ms/azuremediaplayer/](https://aka.ms/azuremediaplayer/).
-2. In the **URL:** box, paste one of the streaming URL values you got from your application.
-
- You can paste the URL in HLS, Dash, or Smooth format and Azure Media Player will switch to an appropriate streaming protocol for playback on your device automatically.
-3. Press **Update Player**.
-
-Azure Media Player can be used for testing but should not be used in a production environment.
--
media-services Storage Account Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/storage-account-concept.md
-
- Title: Azure storage accounts - Azure Media Services
-description: Learn how to create an Azure storage account to use with Azure Media Services.
- Previously updated: 01/29/2021
-# Azure Storage accounts
--
-To start managing, encrypting, encoding, analyzing, and streaming media content in Azure, you need to create a Media Services account. When creating a Media Services account, you need to supply the name of an Azure Storage account resource. The specified storage account is attached to your Media Services account.
-
-The Media Services account and all associated storage accounts must be in the same Azure subscription. It's strongly recommended to use storage accounts in the same location as the Media Services account to avoid additional latency and data egress costs.
-
-You must have one **Primary** storage account and you can have any number of **Secondary** storage accounts associated with your Media Services account. Media Services supports **General-purpose v2** (GPv2) or **General-purpose v1** (GPv1) accounts. Blob only accounts aren't allowed as **Primary**.
-
-We recommend that you use GPv2, so you can take advantage of the latest features and performance. To learn more about storage accounts, see [Azure Storage account overview](../../storage/common/storage-account-overview.md).
-
-> [!NOTE]
-> Only the hot access tier is supported for use with Azure Media Services, although the other access tiers can be used to reduce storage costs on content that isn't being actively used.
-
-There are different SKUs you can choose for your storage account. If you want to experiment with storage accounts, use `--sku Standard_LRS`. However, when picking a SKU for production, you should consider `--sku Standard_RAGRS`, which provides geographic replication for business continuity.
-
-## Assets in a storage account
-
-In Media Services v3, the Storage APIs are used to upload files into assets. For more information, see [Assets in Azure Media Services v3](assets-concept.md).
-
-> [!Note]
-> Don't attempt to change the contents of blob containers that were generated by the Media Services SDK without using Media Services APIs.
-
-## Storage side encryption
-
-To protect your assets at rest, the assets should be encrypted by the storage side encryption. The following table shows how the storage side encryption works in Media Services v3:
-
-|Encryption option|Description|Media Services v3|
-||||
-|Media Services storage encryption| AES-256 encryption, key managed by Media Services. |Not supported.<sup>1</sup>|
-|[Storage service encryption for data at rest](../../storage/common/storage-service-encryption.md)|Server-side encryption offered by Azure Storage, key managed by Azure or by customer.|Supported.|
-|[Storage client-side encryption](../../storage/common/storage-client-side-encryption.md)|Client-side encryption offered by Azure storage, key managed by customer in Key Vault.|Not supported.|
-
-<sup>1</sup> In Media Services v3, storage encryption (AES-256 encryption) is only supported for backwards compatibility when your assets were created with Media Services v2, which means v3 works with existing storage encrypted assets but won't allow creation of new ones.
-
-## Storage account double encryption
-
-Storage accounts support double encryption but the second layer must explicitly be enabled. See [Azure Storage encryption for data at rest](../../storage/common/storage-service-encryption.md#doubly-encrypt-data-with-infrastructure-encryption).
-
-## Storage account errors
-
-The "Disconnected" state for a Media Services account indicates that the account no longer has access to one or more of the attached storage accounts due to a change in storage access keys. Up-to-date storage access keys are required by Media Services to perform many tasks in the account.
-
-The following are the primary scenarios that would result in a Media Services account not having access to attached storage accounts.
-
-|Issue|Solution|
-|||
-|The Media Services account or attached storage account(s) were migrated to separate subscriptions. |Migrate the storage account(s) or Media Services account so that they're all in the same subscription. |
-|The Media Services account is using an attached storage account in a different subscription, because it was an early Media Services account for which this was supported. All early Media Services accounts were converted to modern Azure Resource Manager based accounts and will have a Disconnected state. |Migrate the storage account or Media Services account so that they're all in the same subscription.|
-
-## Next steps
-
-To learn how to attach a storage account to your Media Services account, see [Create an account](./account-create-how-to.md).
media-services Storage Managing Multiple Storage Accounts How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/storage-managing-multiple-storage-accounts-how-to.md
- Title: Managing Media Services v3 assets across multiple storage accounts | Microsoft Docs
-description: This article gives you guidance on how to manage Media Services v3 assets across multiple storage accounts.
- Previously updated: 03/22/2021
-# Managing Media Services v3 assets across multiple storage accounts
--
-You can attach multiple storage accounts to a single Media Services account. The ability to attach multiple storage accounts to a Media Services account provides the following benefits:
-
-* Load balancing your assets across multiple storage accounts.
-* Scaling Media Services for large amounts of content processing (as currently a single storage account has a max limit of 500 TB).
-
-This article demonstrates how to attach multiple storage accounts to a Media Services account using [Azure Resource Manager APIs](/rest/api/media/operations/azure-media-services-rest-api-reference) and [PowerShell](/powershell/module/az.media). It also shows how to specify different storage accounts when creating assets using the Media Services SDK.
--
-## Considerations
-
-When attaching multiple storage accounts to your Media Services account, the following considerations apply:
-
-* The Media Services account and all associated storage accounts must be in the same Azure subscription. It is recommended to use storage accounts in the same location as the Media Services account.
-* Once a storage account is attached to the specified Media Services account, it cannot be detached.
-* The primary storage account is the one indicated during Media Services account creation. Currently, you can't change the default storage account.
-* If you want to add a Cool Storage account to the AMS account, the storage account must be a Blob type and set to non-primary.
-
-Other considerations:
-
-Media Services uses the value of the **IAssetFile.Name** property when building URLs for the streaming content (for example, http://{WAMSAccount}.origin.mediaservices.windows.net/{GUID}/{IAssetFile.Name}/streamingParameters). For this reason, percent-encoding is not allowed. The value of the Name property cannot have any of the following [percent-encoding-reserved characters](https://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters): !*'();:@&=+$,/?%#[]". Also, there can only be one '.' for the file name extension, as checked in the sketch below.
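-
-A small validation sketch for these rules follows; the class and method names are illustrative, not part of the Media Services SDK.
-
-```csharp
-using System.Linq;
-
-public static class AssetFileNameValidator
-{
-    // Percent-encoding-reserved characters listed above.
-    private const string ReservedCharacters = "!*'();:@&=+$,/?%#[]\"";
-
-    public static bool IsValidAssetFileName(string name)
-    {
-        // Reject reserved characters anywhere in the name.
-        if (name.Any(c => ReservedCharacters.IndexOf(c) >= 0))
-        {
-            return false;
-        }
-
-        // Allow at most one '.', which separates the file name extension.
-        return name.Count(c => c == '.') <= 1;
-    }
-}
-```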
-
-## To attach storage accounts
-
-To attach storage accounts to your AMS account, use [Azure Resource Manager APIs](/rest/api/media/operations/azure-media-services-rest-api-reference) and [PowerShell](/powershell/module/az.media), as shown in the following example:
-
-```azurepowershell
-$regionName = "West US"
-$subscriptionId = " xxxxxxxx-xxxx-xxxx-xxxx- xxxxxxxxxxxx "
-$resourceGroupName = "SkyMedia-USWest-App"
-$mediaAccountName = "sky"
-$storageAccount1Name = "skystorage1"
-$storageAccount2Name = "skystorage2"
-$storageAccount1Id = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Storage/storageAccounts/$storageAccount1Name"
-$storageAccount2Id = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Storage/storageAccounts/$storageAccount2Name"
-$storageAccount1 = New-AzMediaServiceStorageConfig -StorageAccountId $storageAccount1Id -IsPrimary
-$storageAccount2 = New-AzMediaServiceStorageConfig -StorageAccountId $storageAccount2Id
-$storageAccounts = @($storageAccount1, $storageAccount2)
-
-Set-AzMediaService -ResourceGroupName $resourceGroupName -AccountName $mediaAccountName -StorageAccounts $storageAccounts
-```
-
-### Support for Cool Storage
-
-Currently, if you want to add a Cool Storage account to the AMS account, the storage account must be a Blob type and set to non-primary.
-
-## To manage Media Services assets across multiple Storage Accounts
-
-The following code uses the latest Media Services SDK to perform the following tasks:
-
-1. Display all the storage accounts associated with the specified Media Services account.
-2. Retrieve the name of the default storage account.
-3. Create a new asset in the default storage account.
-4. Create an output asset of the encoding job in the specified storage account.
-
-```cs
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System;
-using System.Collections.Generic;
-using System.Configuration;
-using System.IO;
-using System.Linq;
-using System.Text;
-using System.Threading;
-using System.Threading.Tasks;
-
-namespace MultipleStorageAccounts
-{
- class Program
- {
- // Location of the media file that you want to encode.
- private static readonly string _singleInputFilePath =
- Path.GetFullPath(@"../..\supportFiles\multifile\interview2.wmv");
-
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static CloudMediaContext _context;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Display the storage accounts associated with
- // the specified Media Services account:
- foreach (var sa in _context.StorageAccounts)
- Console.WriteLine(sa.Name);
-
- // Retrieve the name of the default storage account.
- var defaultStorageName = _context.StorageAccounts.Where(s => s.IsDefault == true).FirstOrDefault();
- Console.WriteLine("Name: {0}", defaultStorageName.Name);
- Console.WriteLine("IsDefault: {0}", defaultStorageName.IsDefault);
-
- // Retrieve the name of a storage account that is not the default one.
- var notDefaultStorageName = _context.StorageAccounts.Where(s => s.IsDefault == false).FirstOrDefault();
- Console.WriteLine("Name: {0}", notDefaultStorageName.Name);
- Console.WriteLine("IsDefault: {0}", notDefaultStorageName.IsDefault);
-
- // Create the original asset in the default storage account.
- IAsset asset = CreateAssetAndUploadSingleFile(AssetCreationOptions.None,
- defaultStorageName.Name, _singleInputFilePath);
- Console.WriteLine("Created the asset in the {0} storage account", asset.StorageAccountName);
-
- // Create an output asset of the encoding job in the other storage account.
- IAsset outputAsset = CreateEncodingJob(asset, notDefaultStorageName.Name, _singleInputFilePath);
- if (outputAsset != null)
- Console.WriteLine("Created the output asset in the {0} storage account", outputAsset.StorageAccountName);
-
- }
-
- static public IAsset CreateAssetAndUploadSingleFile(AssetCreationOptions assetCreationOptions, string storageName, string singleFilePath)
- {
- var assetName = "UploadSingleFile_" + DateTime.UtcNow.ToString();
-
- // If you are creating an asset in the default storage account, you can omit the StorageName parameter.
- var asset = _context.Assets.Create(assetName, storageName, assetCreationOptions);
-
- var fileName = Path.GetFileName(singleFilePath);
-
- var assetFile = asset.AssetFiles.Create(fileName);
-
- Console.WriteLine("Created assetFile {0}", assetFile.Name);
-
- assetFile.Upload(singleFilePath);
-
- Console.WriteLine("Done uploading {0}", assetFile.Name);
-
- return asset;
- }
-
- static IAsset CreateEncodingJob(IAsset asset, string storageName, string inputMediaFilePath)
- {
- // Declare a new job.
- IJob job = _context.Jobs.Create("My encoding job");
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Create a task with the encoding details, using a string preset.
- ITask task = job.Tasks.AddNew("My encoding task",
- processor,
- "Adaptive Streaming",
- Microsoft.WindowsAzure.MediaServices.Client.TaskOptions.ProtectedConfiguration);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(asset);
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- task.OutputAssets.AddNew("Output asset", storageName,
- AssetCreationOptions.None);
-
- // Use the following event handler to check job progress.
- job.StateChanged += new
- EventHandler<JobStateChangedEventArgs>(StateChanged);
-
- // Launch the job.
- job.Submit();
-
- // Check job execution and wait for job to finish.
- Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
- progressJobTask.Wait();
-
- // Get an updated job reference.
- job = GetJob(job.Id);
-
- // If job state is Error the event handling
- // method for job progress should log errors. Here we check
- // for error state and exit if needed.
- if (job.State == JobState.Error)
- {
- Console.WriteLine("\nExiting method due to job error.");
- return null;
- }
-
- // Get a reference to the output asset from the job.
- IAsset outputAsset = job.OutputMediaAssets[0];
-
- return outputAsset;
- }
-
- private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
- throw new ArgumentException(string.Format("Unknown media processor: {0}", mediaProcessorName));
-
- return processor;
- }
-
- private static void StateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
-
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("********************");
- Console.WriteLine("Job is finished.");
- Console.WriteLine("Please wait while local tasks or downloads complete...");
- Console.WriteLine("********************");
- Console.WriteLine();
- Console.WriteLine();
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
- // Cast sender as a job.
- IJob job = (IJob)sender;
- // Display or log error details as needed.
- Console.WriteLine("An error occurred in {0}", job.Id);
- break;
- default:
- break;
- }
- }
-
- static IJob GetJob(string jobId)
- {
- // Use a Linq select query to get an updated
- // reference by Id.
- var jobInstance =
- from j in _context.Jobs
- where j.Id == jobId
- select j;
- // Return the job reference as an Ijob.
- IJob job = jobInstance.FirstOrDefault();
-
- return job;
- }
- }
-}
-```
media-services Storage Sync Storage Keys How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/storage-sync-storage-keys-how-to.md
- Title: Sync the storage keys of a Media Services storage account
-description: This article shows you how to sync the storage keys of a Media Services storage account.
- Previously updated: 03/08/2022
-# Sync storage keys
--
-This article shows you how to sync the storage keys of a Media Services storage account.
-
-## Methods
-
-You can use the following methods to sync the storage keys of a Media Services storage account.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/mediaservices/sync-storage-keys).
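-
-If you're calling the REST operation directly, the following is a minimal C# sketch, assuming a valid ARM bearer token and the attached storage account's full resource ID; the variable names and the `api-version` value are illustrative.
-
-```csharp
-using System.Net.Http;
-using System.Net.Http.Headers;
-using System.Text;
-using System.Threading.Tasks;
-
-public static class StorageKeySync
-{
-    public static async Task SyncStorageKeysAsync(
-        string armToken, string subscriptionId, string resourceGroup,
-        string accountName, string storageAccountId)
-    {
-        using var http = new HttpClient();
-        http.DefaultRequestHeaders.Authorization =
-            new AuthenticationHeaderValue("Bearer", armToken);
-
-        string url = $"https://management.azure.com/subscriptions/{subscriptionId}" +
-                     $"/resourceGroups/{resourceGroup}/providers/Microsoft.Media" +
-                     $"/mediaservices/{accountName}/syncStorageKeys?api-version=2021-06-01";
-
-        // The request body identifies which attached storage account to sync.
-        var body = new StringContent(
-            $"{{\"id\": \"{storageAccountId}\"}}", Encoding.UTF8, "application/json");
-
-        HttpResponseMessage response = await http.PostAsync(url, body);
-        response.EnsureSuccessStatusCode();
-    }
-}
-```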
--
media-services Stream Files Cli Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-files-cli-quickstart.md
- Title: Stream video files with Azure Media Services CLI
-description: Follow the steps of this tutorial to use Azure CLI to create a new Azure Media Services account, encode a file, and stream it to Azure Media Player.
---
-keywords: azure media services, stream
- Previously updated: 08/31/2020
-#Customer intent: As a developer, I want to create a Media Services account so that I can store, encrypt, encode, manage, and stream media content in Azure.
--
-# Tutorial: Encode a remote file based on URL and stream the video - Azure CLI
--
-This tutorial shows how to easily encode and stream videos on a variety of browsers and devices by using Azure Media Services and the Azure CLI. You can specify input content by using HTTPS or SAS URLs or paths to files in Azure Blob storage.
-
-The example in this article encodes content that you make accessible via an HTTPS URL. Media Services v3 doesn't currently support chunked transfer encoding over HTTPS URLs.
-
-By the end of this tutorial, you'll be able to stream a video.
-
-![Play the video](./media/stream-files-dotnet-quickstart/final-video.png)
--
-## Create a Media Services account
-
-Before you can encrypt, encode, analyze, manage, and stream media content in Azure, you need to create a Media Services account. That account must be associated with one or more storage accounts.
-
-Your Media Services account and all associated storage accounts must be in the same Azure subscription. We recommend that you use storage accounts that are in the same place as the Media Services account to limit latency and data egress costs.
-
-### Create a resource group
-
-```azurecli-interactive
-az group create -n amsResourceGroup -l westus2
-```
-
-### Create an Azure storage account
-
-In this example, we create a General-Purpose v2 Standard LRS account.
-
-If you want to experiment with storage accounts, use `--sku Standard_LRS`. When you're picking a SKU for production, consider using `--sku Standard_RAGRS`, which provides geographic replication for business continuity. For more information, see [storage accounts](/cli/azure/storage/account).
-
-```azurecli-interactive
-az storage account create -n amsstorageaccount --kind StorageV2 --sku Standard_LRS -l westus2 -g amsResourceGroup
-```
-
-### Create an Azure Media Services account
-
-```azurecli-interactive
-az ams account create -n amsaccount -g amsResourceGroup --storage-account amsstorageaccount -l westus2
-```
-
-You get a response like this:
-
-```
-{
- "id": "/subscriptions/<id>/resourceGroups/amsResourceGroup/providers/Microsoft.Media/mediaservices/amsaccount",
- "location": "West US 2",
- "mediaServiceId": "8b569c2e-d648-4fcb-9035-c7fcc3aa7ddf",
- "name": "amsaccount",
- "resourceGroup": "amsResourceGroupTest",
- "storageAccounts": [
- {
- "id": "/subscriptions/<id>/resourceGroups/amsResourceGroup/providers/Microsoft.Storage/storageAccounts/amsstorageaccount",
- "resourceGroup": "amsResourceGroupTest",
- "type": "Primary"
- }
- ],
- "tags": null,
- "type": "Microsoft.Media/mediaservices"
-}
-```
-
-## Start the streaming endpoint
-
-The following Azure CLI command starts the default **Streaming Endpoint**.
-
-```azurecli-interactive
-az ams streaming-endpoint start -n default -a amsaccount -g amsResourceGroup
-```
-
-You get a response like this:
-
-```
-{
- "accessControl": null,
- "availabilitySetName": null,
- "cdnEnabled": true,
- "cdnProfile": "AzureMediaStreamingPlatformCdnProfile-StandardVerizon",
- "cdnProvider": "StandardVerizon",
- "created": "2019-02-06T21:58:03.604954+00:00",
- "crossSiteAccessPolicies": null,
- "customHostNames": [],
- "description": "",
- "freeTrialEndTime": "2019-02-21T22:05:31.277936+00:00",
- "hostName": "amsaccount-usw22.streaming.media.azure.net",
- "id": "/subscriptions/<id>/resourceGroups/amsResourceGroup/providers/Microsoft.Media/mediaservices/amsaccount/streamingendpoints/default",
- "lastModified": "2019-02-06T21:58:03.604954+00:00",
- "location": "West US 2",
- "maxCacheAge": null,
- "name": "default",
- "provisioningState": "Succeeded",
- "resourceGroup": "amsResourceGroup",
- "resourceState": "Running",
- "scaleUnits": 0,
- "tags": {},
- "type": "Microsoft.Media/mediaservices/streamingEndpoints"
-}
-```
-
-If the streaming endpoint is already running, you get this message:
-
-```
-(InvalidOperation) The server cannot execute the operation in its current state.
-```
-
-## Create a transform for adaptive bitrate encoding
-
-Create a **Transform** to configure common tasks for encoding or analyzing videos. In this example, we do adaptive bitrate encoding. We then submit a job under the transform that we created. The job is the request to Media Services to apply the transform to the given video or audio content input.
-
-```azurecli-interactive
-az ams transform create --name testEncodingTransform --preset AdaptiveStreaming --description 'a simple Transform for Adaptive Bitrate Encoding' -g amsResourceGroup -a amsaccount
-```
-
-You get a response like this:
-
-```
-{
- "created": "2019-02-15T00:11:18.506019+00:00",
- "description": "a simple Transform for Adaptive Bitrate Encoding",
- "id": "/subscriptions/<id>/resourceGroups/amsResourceGroup/providers/Microsoft.Media/mediaservices/amsaccount/transforms/testEncodingTransform",
- "lastModified": "2019-02-15T00:11:18.506019+00:00",
- "name": "testEncodingTransform",
- "outputs": [
- {
- "onError": "StopProcessingJob",
- "preset": {
- "odatatype": "#Microsoft.Media.BuiltInStandardEncoderPreset",
- "presetName": "AdaptiveStreaming"
- },
- "relativePriority": "Normal"
- }
- ],
- "resourceGroup": "amsResourceGroup",
- "type": "Microsoft.Media/mediaservices/transforms"
-}
-```
-
-## Create an output asset
-
-Create an output **Asset** to use as the encoding job's output.
-
-```azurecli-interactive
-az ams asset create -n testOutputAssetName -a amsaccount -g amsResourceGroup
-```
-
-You get a response like this:
-
-```
-{
- "alternateId": null,
- "assetId": "96427438-bbce-4a74-ba91-e38179b72f36",
- "container": null,
- "created": "2019-02-14T23:58:19.127000+00:00",
- "description": null,
- "id": "/subscriptions/<id>/resourceGroups/amsResourceGroup/providers/Microsoft.Media/mediaservices/amsaccount/assets/testOutputAssetName",
- "lastModified": "2019-02-14T23:58:19.127000+00:00",
- "name": "testOutputAssetName",
- "resourceGroup": "amsResourceGroup",
- "storageAccountName": "amsstorageaccount",
- "storageEncryptionFormat": "None",
- "type": "Microsoft.Media/mediaservices/assets"
-}
-```
-
-## Start a job by using HTTPS input
-
-When you submit jobs to process videos, you have to tell Media Services where to find the input video. One option is to specify an HTTPS URL as the job input, as shown in this example.
-
-When you run `az ams job start`, you can set a label on the job's output. You can then use the label to identify what the output asset is for.
-- If you assign a value to the label, set '--output-assets' to "assetname=label".
-- If you don't assign a value to the label, set '--output-assets' to "assetname=".
- Notice that we add "=" to the `output-assets`.
-
-```azurecli-interactive
-az ams job start --name testJob001 --transform-name testEncodingTransform --base-uri 'https://nimbuscdn-nimbuspm.streaming.mediaservices.windows.net/2b533311-b215-4409-80af-529c3e853622/' --files 'Ignite-short.mp4' --output-assets testOutputAssetName= -a amsaccount -g amsResourceGroup
-```
-
-You get a response like this:
-
-```
-{
- "correlationData": {},
- "created": "2019-02-15T05:08:26.266104+00:00",
- "description": null,
- "id": "/subscriptions/<id>/resourceGroups/amsResourceGroup/providers/Microsoft.Media/mediaservices/amsaccount/transforms/testEncodingTransform/jobs/testJob001",
- "input": {
- "baseUri": "https://nimbuscdn-nimbuspm.streaming.mediaservices.windows.net/2b533311-b215-4409-80af-529c3e853622/",
- "files": [
- "Ignite-short.mp4"
- ],
- "label": null,
- "odatatype": "#Microsoft.Media.JobInputHttp"
- },
- "lastModified": "2019-02-15T05:08:26.266104+00:00",
- "name": "testJob001",
- "outputs": [
- {
- "assetName": "testOutputAssetName",
- "error": null,
- "label": "",
- "odatatype": "#Microsoft.Media.JobOutputAsset",
- "progress": 0,
- "state": "Queued"
- }
- ],
- "priority": "Normal",
- "resourceGroup": "amsResourceGroup",
- "state": "Queued",
- "type": "Microsoft.Media/mediaservices/transforms/jobs"
-}
-```
-
-### Check status
-
-In five minutes, check the status of the job. It should be "Finished." If it's not finished, check again in a few minutes. When it's finished, go to the next step and create a **Streaming Locator**.
-
-```azurecli-interactive
-az ams job show -a amsaccount -g amsResourceGroup -t testEncodingTransform -n testJob001
-```
-
-## Create a streaming locator and get a path
-
-After the encoding is complete, the next step is to make the video in the output asset available to clients for playback. To do this, first create a Streaming Locator. Then, build streaming URLs that clients can use.
-
-### Create a streaming locator
-
-```azurecli-interactive
-az ams streaming-locator create -n testStreamingLocator --asset-name testOutputAssetName --streaming-policy-name Predefined_ClearStreamingOnly -g amsResourceGroup -a amsaccount
-```
-
-You get a response like this:
-
-```
-{
- "alternativeMediaId": null,
- "assetName": "output-3b6d7b1dffe9419fa104b952f7f6ab76",
- "contentKeys": [],
- "created": "2019-02-15T04:35:46.270750+00:00",
- "defaultContentKeyPolicyName": null,
- "endTime": "9999-12-31T23:59:59.999999+00:00",
- "id": "/subscriptions/<id>/resourceGroups/amsResourceGroup/providers/Microsoft.Media/mediaservices/amsaccount/streamingLocators/testStreamingLocator",
- "name": "testStreamingLocator",
- "resourceGroup": "amsResourceGroup",
- "startTime": null,
- "streamingLocatorId": "e01b2be1-5ea4-42ca-ae5d-7fe704a5962f",
- "streamingPolicyName": "Predefined_ClearStreamingOnly",
- "type": "Microsoft.Media/mediaservices/streamingLocators"
-}
-```
-
-### Get streaming locator paths
-
-```azurecli-interactive
-az ams streaming-locator get-paths -a amsaccount -g amsResourceGroup -n testStreamingLocator
-```
-
-You get a response like this:
-
-```
-{
- "downloadPaths": [],
- "streamingPaths": [
- {
- "encryptionScheme": "NoEncryption",
- "paths": [
- "/e01b2be1-5ea4-42ca-ae5d-7fe704a5962f/ignite.ism/manifest(format=m3u8-aapl)"
- ],
- "streamingProtocol": "Hls"
- },
- {
- "encryptionScheme": "NoEncryption",
- "paths": [
- "/e01b2be1-5ea4-42ca-ae5d-7fe704a5962f/ignite.ism/manifest(format=mpd-time-csf)"
- ],
- "streamingProtocol": "Dash"
- },
- {
- "encryptionScheme": "NoEncryption",
- "paths": [
- "/e01b2be1-5ea4-42ca-ae5d-7fe704a5962f/ignite.ism/manifest"
- ],
- "streamingProtocol": "SmoothStreaming"
- }
- ]
-}
-```
-
-Copy the HTTP live streaming (HLS) path. In this case, it's `/e01b2be1-5ea4-42ca-ae5d-7fe704a5962f/ignite.ism/manifest(format=m3u8-aapl)`.
-
-## Build the URL
-
-### Get the streaming endpoint host name
-
-```azurecli-interactive
-az ams streaming-endpoint list -a amsaccount -g amsResourceGroup -n default
-```
-
-Copy the `hostName` value. In this case, it's `amsaccount-usw22.streaming.media.azure.net`.
-
-### Assemble the URL
-
-"https:// " + &lt;hostName value&gt; + &lt;Hls path value&gt;
-
-Here's an example:
-
-`https://amsaccount-usw22.streaming.media.azure.net/e01b2be1-5ea4-42ca-ae5d-7fe704a5962f/ignite.ism/manifest(format=m3u8-aapl)`
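-
-If you're assembling the URL in code, it's a simple concatenation. The following is a trivial C# sketch using the example values copied above; the variable names are illustrative.
-
-```csharp
-string hostName = "amsaccount-usw22.streaming.media.azure.net";
-string hlsPath = "/e01b2be1-5ea4-42ca-ae5d-7fe704a5962f/ignite.ism/manifest(format=m3u8-aapl)";
-
-// Prepend the scheme and join the host name with the streaming path.
-string streamingUrl = $"https://{hostName}{hlsPath}";
-```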
-
-## Test playback by using Azure Media Player
-
-> [!NOTE]
-> If a player is hosted on an HTTPS site, make sure to start the URL with "https".
-
-1. Open a web browser and go to [https://aka.ms/azuremediaplayer/](https://aka.ms/azuremediaplayer/).
-2. In the **URL** box, paste the URL that you built in the previous section. You can paste the URL in HLS, Dash, or Smooth format. Azure Media Player will automatically use an appropriate streaming protocol for playback on your device.
-3. Select **Update Player**.
-
->[!NOTE]
->Azure Media Player can be used for testing but should not be used in a production environment.
-
-## Clean up resources
-
-If you no longer need any of the resources in your resource group, including the Media Services and storage accounts that you created for this tutorial, delete the resource group.
-
-Run this Azure CLI command:
-
-```azurecli-interactive
-az group delete --name amsResourceGroup
-```
-
-## Next steps
-
-[Media Services overview](media-services-overview.md)
media-services Stream Files Dotnet Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-files-dotnet-quickstart.md
- Title: Stream video files with Azure Media Services - .NET
-description: Follow the steps of this tutorial to use .NET to create a new Azure Media Services account, encode a file, and stream it to Azure Media Player.
---
-keywords: azure media services, stream
- Previously updated: 07/23/2021
-#Customer intent: As a developer, I want to create a Media Services account so that I can store, encrypt, encode, manage, and stream media content in Azure.
--
-# Tutorial: Encode a remote file based on URL and stream the video - .NET
--
-This tutorial shows you how easy it is to encode and start streaming videos on a wide variety of browsers and devices using Azure Media Services. Input content can be specified using HTTPS URLs, SAS URLs, or paths to files located in Azure Blob storage.
-The sample in this topic encodes content that you make accessible via an HTTPS URL. Note that currently, AMS v3 does not support chunked transfer encoding over HTTPS URLs.
-
-By the end of the tutorial, you'll be able to stream a video.
-
-![Play the video](./media/stream-files-dotnet-quickstart/final-video.png)
--
-## Prerequisites
-- Install [Visual Studio Code for Windows/macOS/Linux](https://code.visualstudio.com/) or [Visual Studio 2019 for Windows or Mac](https://visualstudio.microsoft.com/).
-- Install the [.NET 5.0 SDK](https://dotnet.microsoft.com/download).
-- [Create a Media Services account](./account-create-how-to.md). Be sure to copy the **API Access** details in JSON format or store the values needed to connect to the Media Services account in the *.env* file format used in this sample.
-- Follow the steps in [Access the Azure Media Services API with the Azure CLI](./access-api-howto.md). Be sure to *save the credentials*. You'll need to use them to access the API in this sample, or enter them into the *.env* file format.
-## Download and configure the sample
-
-Clone a GitHub repository that contains the streaming .NET sample to your machine using the following command:
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-dotnet-quickstarts.git
- ```
-
-The sample is located in the [EncodeAndStreamFiles](https://github.com/Azure-Samples/media-services-v3-dotnet-quickstarts/tree/master/AMSV3Quickstarts/EncodeAndStreamFiles) folder under AMSV3Quickstarts.
--
-The sample performs the following actions:
-
-1. Creates a **Transform** (first, checks if the specified Transform exists).
-2. Creates an output **Asset** that is used as the encoding **Job**'s output.
-3. Creates the **Job**'s input that is based on an HTTPS URL.
-4. Submits the encoding **Job** using the input and output that was created earlier.
-5. Checks the Job's status.
-6. Creates a **Streaming Locator**.
-7. Builds streaming URLs.
-
-For explanations about what each function in the sample does, examine the code and look at the comments in [this source file](https://github.com/Azure-Samples/media-services-v3-dotnet-quickstarts/blob/master/AMSV3Quickstarts/EncodeAndStreamFiles/Program.cs).
-
-## Run the sample app
-
-When you run the app, URLs that can be used to play back the video using different protocols are displayed.
-
-1. Open AMSV3Quickstarts in VSCode.
-2. Press Ctrl+F5 to run the *EncodeAndStreamFiles* application with .NET. This may take a few minutes.
-3. The app will output three URLs. You will use these URLs to test the stream in the next step.
-
-![Screenshot of the output from the EncodeAndStreamFiles app in Visual Studio showing three streaming URLs for use in the Azure Media Player.](./media/stream-files-tutorial-with-api/output.png)
-
-In the sample's [source code](https://github.com/Azure-Samples/media-services-v3-dotnet-quickstarts/blob/master/AMSV3Quickstarts/EncodeAndStreamFiles/Program.cs), you can see how the URL is built. To build it, you need to concatenate the streaming endpoint's host name and the streaming locator path.
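
For illustration, a minimal sketch of that concatenation follows, assuming an authenticated `AzureMediaServicesClient` named `client` and the resource group, account, and locator names used earlier (the variable names here are placeholders, not the sample's exact code):

```csharp
// Sketch only: 'client', 'resourceGroup', 'accountName', and 'locatorName'
// are assumed to exist from the earlier setup steps.
ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(resourceGroup, accountName, locatorName);
StreamingEndpoint endpoint = await client.StreamingEndpoints.GetAsync(resourceGroup, accountName, "default");

foreach (StreamingPath path in paths.StreamingPaths)
{
    if (path.Paths.Count == 0) continue;

    // One URL per protocol (HLS, DASH, Smooth Streaming).
    var uriBuilder = new UriBuilder
    {
        Scheme = "https",
        Host = endpoint.HostName,
        Path = path.Paths[0]
    };
    Console.WriteLine($"{path.StreamingProtocol}: {uriBuilder}");
}
```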
-
-## Test with Azure Media Player
-
-To test the stream, this article uses Azure Media Player.
-
-> [!NOTE]
-> If a player is hosted on an https site, make sure to update the URL to "https".
-
-1. Open a web browser and navigate to [https://aka.ms/azuremediaplayer/](https://aka.ms/azuremediaplayer/).
-2. In the **URL:** box, paste one of the streaming URL values you got when you ran the application.
-
- You can paste the URL in HLS, Dash, or Smooth format and Azure Media Player will switch to an appropriate streaming protocol for playback on your device automatically.
-3. Press **Update Player**. This should start playing the video file in the repository.
-
-Azure Media Player can be used for testing but should not be used in a production environment.
-
-## Clean up resources
-
-If you no longer need any of the resources in your resource group, including the Media Services and storage accounts you created for this tutorial, delete the resource group.
-
-Execute the following CLI command:
-
-```azurecli
-az group delete --name amsResourceGroup
-```
-
-## Examine the code
-
-For explanations about what each function in the sample does, examine the code and look at the comments in [this source file](https://github.com/Azure-Samples/media-services-v3-dotnet-quickstarts/blob/master/AMSV3Quickstarts/EncodeAndStreamFiles/Program.cs).
-
-The [upload, encode, and stream files](stream-files-tutorial-with-api.md) tutorial gives you a more advanced streaming example with detailed explanations.
-
-### Job error codes
-
-See [Error codes](/rest/api/media/jobs/get#joberrorcode).
-
-## Multithreading
-
-The Azure Media Services v3 SDKs are not thread-safe. When working with a multi-threaded application, you should generate a new AzureMediaServicesClient object per thread.
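
As a hedged illustration (the helper variables `credentials`, `subscriptionId`, `resourceGroup`, and `accountName` are assumptions, not part of the sample), each worker can build its own client:

```csharp
// Sketch: one AzureMediaServicesClient per task instead of a shared instance.
var workers = Enumerable.Range(0, 4).Select(_ => Task.Run(async () =>
{
    var threadClient = new AzureMediaServicesClient(credentials)
    {
        SubscriptionId = subscriptionId
    };

    // Make Media Services calls with the per-thread client.
    IPage<Asset> assets = await threadClient.Assets.ListAsync(resourceGroup, accountName);
}));

await Task.WhenAll(workers);
```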
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Tutorial: upload, encode, and stream files](stream-files-tutorial-with-api.md)
media-services Stream Files Nodejs Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-files-nodejs-quickstart.md
- Title: How to encode and stream video files with Node.js
-description: How to stream video files with Node.js. Follow the steps of this tutorial to create a new Azure Media Services account, encode a file, and stream it to Azure Media Player.
-keywords: azure media services, stream, Node.js
-Previously updated: 02/17/2021
-# How to encode and stream video files with Node.js
--
-This quickstart shows you how easy it is to encode and start streaming videos on a wide variety of browsers and devices using Azure Media Services. An input video file can be specified using HTTPS URLs, SAS URLs, or paths to files located in Azure Blob storage.
-
-By the end of this quickstart you will know:
-- How to encode with Node.js
-- How to stream with Node.js
-- How to upload a file from an HTTPS URL with Node.js
-- How to use an HLS or DASH client player with Node.js
-The sample in this article encodes content that you make accessible via an HTTPS URL. Note that currently, AMS v3 does not support chunked transfer encoding over HTTPS URLs.
-
-![Play the video](./media/stream-files-nodejs-quickstart/final-video.png)
--
-## Prerequisites
-- Install [Node.js](https://nodejs.org/en/download/).
-- [Create a Media Services account](./create-account-howto.md). Make sure to remember the values that you used for the resource group name and the Media Services account name.
-- Follow the steps in [Access Azure Media Services API with the Azure CLI](./access-api-howto.md) and save the credentials. You will need them to access the API.
-- Walk through the [Configure and Connect with Node.js](./configure-connect-nodejs-howto.md) how-to first to understand how to use the Node.js client SDK.
-## Download and configure the sample
-
-Clone a GitHub repository that contains the streaming Node.js sample to your machine using the following command:
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-node-tutorials.git
- ```
-
-The sample is located in the [StreamFilesSample](https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/master/StreamFilesSample) folder.
-
-Open [index.ts](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/master/StreamFilesSample/index.ts) in your downloaded project. Update the *sample.env* file in the root folder with the values and credentials that you got from [accessing APIs](./access-api-howto.md). Then rename the *sample.env* file to *.env* (yes, just the file extension).
-
-The sample performs the following actions:
-
-1. Creates a **Transform** with a [Content Aware Encoding preset](./encode-content-aware-concept.md). It first checks if the specified Transform exists.
-1. Creates an output **Asset** that is used by the encoding **Job** to contain the output
-1. Optionally uploads a local file using the Storage Blob SDK
-1. Creates the **Job** input that is based on an HTTPS URL or uploaded file
-1. Submits the encoding **Job**, using the input and output that was created earlier
-1. Checks the Job's status
-1. Downloads the output of the encoding job to a local folder
-1. Creates a **Streaming Locator** to use in the player
-1. Builds streaming URLs for HLS and DASH
-1. Plays the content back in a player application - Azure Media Player
-
-## Run the sample
-
-1. The application downloads encoded files. Create a folder where you want the output files to go and update the value of the **outputFolder** variable in the [index.ts](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/StreamFilesSample/index.ts#L65) file. It is set to "./Temp" by default.
-1. Open a **command prompt** and browse to the sample's directory.
-1. Make a copy of the sample.env file and rename it to ".env"
-1. Update the contents of the .env file to match your account settings and subscription information for accessing the Media Services account. You can find this information in the API Access menu in the portal for the Media Services account.
-1. Install the packages listed in the *package.json* file.
-
- ```bash
- npm install
- ```
-
-1. Launch Visual Studio Code from the root folder of the samples
- ```bash
- code .
- ```
-
-Open the folder for *StreamFilesSample*, and open the *index.ts* file in the Visual Studio Code editor.
-While in the *index.ts* file, press F5 to launch the debugger.
-
-## Test with Azure Media Player
-
-Use Azure Media Player to test the stream. You can also use any HLS or DASH compliant player, like Shaka player, HLS.js, Dash.js, or others.
-
-You should be able to click on the link generated in the sample and launch the AMP player with the DASH manifest already loaded.
-
-> [!NOTE]
-> If a player is hosted on an https site, make sure to update the URL to "https".
-
-1. Open a web browser and navigate to [https://aka.ms/azuremediaplayer/](https://aka.ms/azuremediaplayer/).
-2. In the **URL:** box, paste one of the streaming URL values you got when you ran the application. You can paste the URL in HLS, Dash, or Smooth format and Azure Media Player will switch to an appropriate streaming protocol for playback on your device automatically.
-3. Press **Update Player**.
-
-Azure Media Player can be used for testing but should not be used in a production environment.
-
-## Clean up resources
-
-If you no longer need any of the resources in your resource group, including the Media Services and storage accounts you created for this tutorial, delete the resource group.
-
-Execute the following CLI command:
-
-```azurecli
-az group delete --name amsResourceGroup
-```
-
-## More developer documentation for Node.js on Azure
-- [Azure for JavaScript & Node.js developers](/azure/developer/javascript/)
-- [Media Services source code in the @azure/azure-sdk-for-js GitHub repo](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/mediaservices/arm-mediaservices)
-- [Azure Package Documentation for Node.js developers](/javascript/api/overview/azure/)
-## See also
-- [Job error codes](/rest/api/media/jobs/get#joberrorcode)
-- [npm install @azure/arm-mediaservices](https://www.npmjs.com/package/@azure/arm-mediaservices)
-- [Azure for JavaScript & Node.js developers](/azure/developer/javascript/)
-- [Media Services source code in the @azure/azure-sdk-for-js repo](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/mediaservices/arm-mediaservices)
-## Next steps
-
-> [Media Services concepts](concepts-overview.md)
media-services Stream Files Tutorial With Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-files-tutorial-with-api.md
- Title: Upload, encode, and stream with Media Services v3 - Azure Media Services
-description: Tutorial showing how to upload a file, encode video, and stream content with Azure Media Services v3.
-Previously updated: 07/23/2021
-# Tutorial: Upload, encode, and stream videos with Media Services v3
--
-> [!NOTE]
-> Even though this tutorial uses [.NET SDK](/dotnet/api/microsoft.azure.management.media.models.liveevent) examples, the general steps are the same for the [REST API](/rest/api/media/) and the other supported SDKs.
-
-Azure Media Services lets you encode your media files into formats that play on a wide variety of browsers and devices. For example, you might want to stream your content in Apple's HLS or MPEG DASH formats. Before streaming, you should encode your high-quality digital media file. For help with encoding, see [Encoding concept](encode-concept.md). This tutorial uploads a local video file and encodes the uploaded file. You can also encode content that you make accessible via an HTTPS URL. For more information, see [Create a job input from an HTTP(s) URL](job-input-from-http-how-to.md).
-
-![Play a video with Azure Media Player](./media/stream-files-tutorial-with-api/final-video.png)
-
-This tutorial shows you how to:
-
-> [!div class="checklist"]
-> * Download the sample app described in the topic.
-> * Examine the code that uploads, encodes, and streams.
-> * Run the app.
-> * Test the streaming URL.
-> * Clean up resources.
--
-## Prerequisites
-- Install [Visual Studio Code for Windows/macOS/Linux](https://code.visualstudio.com/) or [Visual Studio 2019 for Windows or Mac](https://visualstudio.microsoft.com/).
-- Install the [.NET 5.0 SDK](https://dotnet.microsoft.com/download).
-- [Create a Media Services account](./account-create-how-to.md). Be sure to copy the **API Access** details in JSON format or store the values needed to connect to the Media Services account in the *.env* file format used in this sample.
-- Follow the steps in [Access the Azure Media Services API with the Azure CLI](./access-api-howto.md) and save the credentials. You'll need them to access the API in this sample, or enter them into the *.env* file format.
-## Download and configure the sample
-
-Clone a GitHub repository that has the streaming .NET sample to your machine using the following command:
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials.git
- ```
-
-The sample is located in the [UploadEncodeAndStreamFiles](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/tree/main/AMSV3Tutorials/UploadEncodeAndStreamFiles) folder.
--
-## Examine the code that uploads, encodes, and streams
-
-This section examines functions defined in the [Program.cs](https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials/blob/main/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs) file of the *UploadEncodeAndStreamFiles* project.
-
-The sample performs the following actions:
-
-1. Creates a new **Transform** (first, checks if the specified Transform exists).
-2. Creates an output **Asset** that's used as the encoding **Job**'s output.
-3. Creates an input **Asset** and uploads the specified local video file into it. The asset is used as the job's input.
-4. Submits the encoding job using the input and output that was created.
-5. Checks the job's status.
-6. Creates a **Streaming Locator**.
-7. Builds streaming URLs.
-
-### Start using Media Services APIs with the .NET SDK
-
-To start using Media Services APIs with .NET, you need to create an `AzureMediaServicesClient` object. To create the object, you need to supply credentials for the client to connect to Azure by using Azure Active Directory. Another option is to use interactive authentication, which is implemented in `GetCredentialsInteractiveAuthAsync`.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#CreateMediaServicesClientAsync)]
-
-In the code that you cloned at the beginning of the article, the `GetCredentialsAsync` function creates the `ServiceClientCredentials` object based on the credentials supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#GetCredentialsAsync)]
-
-In the case of interactive authentication, the `GetCredentialsInteractiveAuthAsync` function creates the `ServiceClientCredentials` object based on an interactive authentication and the connection parameters supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository. In that case, AADCLIENTID and AADSECRET are not needed in the configuration or environment variables file.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/Common_Utils/Authentication.cs#GetCredentialsInteractiveAuthAsync)]
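
For reference, a minimal sketch of creating the client with service principal credentials looks like the following. The parameter names are illustrative; the sample reads the equivalent values from *appsettings.json* or *.env*.

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Management.Media;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Rest;
using Microsoft.Rest.Azure.Authentication;

public static class MediaClientSketch
{
    // Sketch only: acquire an Azure AD token for a service principal and build the client.
    public static async Task<IAzureMediaServicesClient> CreateAsync(
        string aadTenantId, string aadClientId, string aadSecret, string subscriptionId)
    {
        ServiceClientCredentials credentials = await ApplicationTokenProvider.LoginSilentAsync(
            aadTenantId,
            new ClientCredential(aadClientId, aadSecret),
            ActiveDirectoryServiceSettings.Azure);

        // The client is the entry point for all Media Services v3 operations.
        return new AzureMediaServicesClient(credentials)
        {
            SubscriptionId = subscriptionId
        };
    }
}
```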
-
-### Create an input asset and upload a local file into it
-
-The **CreateInputAsset** function creates a new input [Asset](/rest/api/media/assets) and uploads the specified local video file into it. This **Asset** is used as the input to your encoding **Job**.
-
-In Media Services v3, you use Azure Storage APIs to upload files. The following .NET snippet shows how.
-
-The following function performs these actions:
-
-* Creates an **Asset**.
-* Gets a writable [SAS URL](../../storage/common/storage-sas-overview.md) to the asset's [container in storage](../../storage/blobs/storage-quickstart-blobs-dotnet.md#upload-a-blob-to-a-container).
-
- If you use the asset's [ListContainerSas](/rest/api/media/assets/listcontainersas) function to get SAS URLs, note that it returns multiple SAS URLs because each storage account has two keys. Two keys allow for seamless rotation of storage account keys (for example, change one while using the other, then start using the new key and rotate the other). The first SAS URL represents storage key1 and the second represents storage key2.
-* Uploads the file into the container in storage using the SAS URL.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#CreateInputAsset)]
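
A rough sketch of those three steps, assuming the client from the previous section (the method and variable names are illustrative, not the sample's exact code):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.Azure.Management.Media;
using Microsoft.Azure.Management.Media.Models;

public static async Task<Asset> CreateInputAssetSketchAsync(
    IAzureMediaServicesClient client, string resourceGroup, string accountName,
    string assetName, string fileToUpload)
{
    // 1. Create (or update) the asset.
    Asset asset = await client.Assets.CreateOrUpdateAsync(resourceGroup, accountName, assetName, new Asset());

    // 2. Get a writable SAS URL for the asset's storage container.
    AssetContainerSas sas = await client.Assets.ListContainerSasAsync(
        resourceGroup, accountName, assetName,
        permissions: AssetContainerPermission.ReadWrite,
        expiryTime: DateTime.UtcNow.AddHours(4));

    // Two URLs come back (one per storage account key); the first is used here.
    var container = new BlobContainerClient(new Uri(sas.AssetContainerSasUrls.First()));

    // 3. Upload the local file into the container.
    await container.GetBlobClient(Path.GetFileName(fileToUpload)).UploadAsync(fileToUpload);

    return asset;
}
```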
-
-### Create an output asset to store the result of a job
-
-The output [Asset](/rest/api/media/assets) stores the result of your encoding job. The project defines the **DownloadResults** function that downloads the results from this output asset into the "output" folder, so you can see what you got.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#CreateOutputAsset)]
-
-### Create a Transform and a Job that encodes the uploaded file
-
-When encoding or processing content in Media Services, it's a common pattern to set up the encoding settings as a recipe. You would then submit a **Job** to apply that recipe to a video. By submitting new jobs for each new video, you're applying that recipe to all the videos in your library. A recipe in Media Services is called a **Transform**. For more information, see [Transforms and Jobs](./transform-jobs-concept.md). The sample described in this tutorial defines a recipe that encodes the video in order to stream it to a variety of iOS and Android devices.
-
-#### Transform
-
-When creating a new [Transform](/rest/api/media/transforms), you need to specify what you want it to produce as an output.
-
-You can use a built-in EncoderNamedPreset or use custom presets. For more information, see [How to customize encoder presets](transform-custom-transform-how-to.md).
-
-When creating a [Transform](/rest/api/media/transforms), you should first check if one already exists using the **Get** method, as shown in the code that follows. In Media Services v3, **Get** methods on entities return **null** if the entity doesn't exist (a case-insensitive check on the name).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#EnsureTransformExists)]
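
A condensed sketch of that get-then-create pattern (variable names assumed from earlier steps; the built-in **AdaptiveStreaming** preset is used for illustration):

```csharp
// Get returns null when the transform doesn't exist yet.
Transform transform = await client.Transforms.GetAsync(resourceGroup, accountName, transformName);

if (transform == null)
{
    var outputs = new TransformOutput[]
    {
        // A built-in preset that produces an adaptive bitrate ladder.
        new TransformOutput(new BuiltInStandardEncoderPreset(EncoderNamedPreset.AdaptiveStreaming))
    };

    transform = await client.Transforms.CreateOrUpdateAsync(resourceGroup, accountName, transformName, outputs);
}
```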
-
-#### Job
-
-As mentioned above, the [Transform](/rest/api/media/transforms) object is the recipe and a [Job](/rest/api/media/jobs) is the actual request to Media Services to apply that **Transform** to a given input video or audio content. The **Job** specifies information like the location of the input video, and the location for the output.
-
-In this example, the input video has been uploaded from your local machine. If you want to learn how to encode from an HTTPS URL, see [Create a job input from an HTTP(s) URL](job-input-from-http-how-to.md).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#SubmitJob)]
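
A minimal sketch of submitting such a job (the input asset, output asset, and job names are assumed from earlier steps):

```csharp
Job job = await client.Jobs.CreateAsync(
    resourceGroup, accountName, transformName, jobName,
    new Job
    {
        // The input is the asset the local file was uploaded into.
        Input = new JobInputAsset(assetName: inputAssetName),
        // The output asset receives the encoded results.
        Outputs = new JobOutput[] { new JobOutputAsset(outputAssetName) }
    });
```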
-
-### Wait for the Job to complete
-
-The job takes some time to complete, and when it does, you want to be notified. The code sample below shows how to poll the service for the status of the [Job](/rest/api/media/jobs). Polling isn't a recommended best practice for production apps because of potential latency. Polling can be throttled if overused on an account, so developers should instead use Event Grid.
-
-Event Grid is designed for high availability, consistent performance, and dynamic scale. With Event Grid, your apps can listen for and react to events from virtually all Azure services, as well as custom sources. Simple, HTTP-based reactive event handling helps you build efficient solutions through intelligent filtering and routing of events. See [Route events to a custom web endpoint](monitoring/job-state-events-cli-how-to.md).
-
-The **Job** usually goes through the following states: **Scheduled**, **Queued**, **Processing**, **Finished** (the final state). If the job has encountered an error, you get the **Error** state. If the job is in the process of being canceled, you get **Canceling** and **Canceled** when it's done.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#WaitForJobToFinish)]
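
For illustration only, a simple polling loop over those states could look like this sketch (Event Grid remains the recommended approach for production):

```csharp
Job job;
do
{
    // Poll at a modest interval to reduce the chance of throttling.
    await Task.Delay(TimeSpan.FromSeconds(10));
    job = await client.Jobs.GetAsync(resourceGroup, accountName, transformName, jobName);
    Console.WriteLine($"Job state: {job.State}");
}
while (job.State != JobState.Finished
       && job.State != JobState.Error
       && job.State != JobState.Canceled);
```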
-
-### Job error codes
-
-See [Error codes](/rest/api/media/jobs/get#joberrorcode).
-
-### Get a Streaming Locator
-
-After the encoding is complete, the next step is to make the video in the output Asset available to clients for playback. You can make it available in two steps: first, create a [Streaming Locator](/rest/api/media/streaminglocators), and second, build the streaming URLs that clients can use.
-
-The process of creating a **Streaming Locator** is called publishing. By default, the **Streaming Locator** is valid immediately after you make the API calls, and lasts until it's deleted, unless you configure the optional start and end times.
-
-When creating a [StreamingLocator](/rest/api/media/streaminglocators), you'll need to specify the desired **StreamingPolicyName**. In this example, you'll be streaming in-the-clear (non-encrypted) content, so the predefined clear streaming policy (**PredefinedStreamingPolicy.ClearStreamingOnly**) is used.
-
-> [!IMPORTANT]
-> When using a custom [Streaming Policy](/rest/api/media/streamingpolicies), you should design a limited set of such policies for your Media Service account, and re-use them for your StreamingLocators whenever the same encryption options and protocols are needed. Your Media Service account has a quota for the number of Streaming Policy entries. You shouldn't be creating a new Streaming Policy for each Streaming Locator.
-
-The following code assumes that you're calling the function with a unique locatorName.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#CreateStreamingLocator)]
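
A sketch of that call with the predefined clear streaming policy (names assumed from earlier steps):

```csharp
StreamingLocator locator = await client.StreamingLocators.CreateAsync(
    resourceGroup, accountName, locatorName,
    new StreamingLocator
    {
        AssetName = outputAssetName,
        // Reuse a predefined policy; don't create a new policy per locator.
        StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly
    });
```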
-
-While the sample in this topic discusses streaming, you can use the same call to create a Streaming Locator for delivering video via progressive download.
-
-### Get streaming URLs
-
-Now that the [Streaming Locator](/rest/api/media/streaminglocators) has been created, you can get the streaming URLs, as shown in **GetStreamingURLs**. To build a URL, you need to concatenate the [Streaming Endpoint](/rest/api/media/streamingendpoints) host name and the **Streaming Locator** path. In this sample, the *default* **Streaming Endpoint** is used. When you first create a Media Service account, this *default* **Streaming Endpoint** will be in a stopped state, so you need to call **Start**.
-
-> [!NOTE]
-> In this method, you need the locatorName that was used when creating the **Streaming Locator** for the output Asset.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#GetStreamingURLs)]
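
Checking and starting the *default* streaming endpoint can be sketched like this (names assumed from earlier steps):

```csharp
StreamingEndpoint streamingEndpoint = await client.StreamingEndpoints.GetAsync(resourceGroup, accountName, "default");

if (streamingEndpoint.ResourceState != StreamingEndpointResourceState.Running)
{
    // A stopped endpoint serves no traffic; start it before building URLs.
    await client.StreamingEndpoints.StartAsync(resourceGroup, accountName, "default");
}
```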
-
-### Clean up resources in your Media Services account
-
-Generally, you should clean up everything except objects that you're planning to reuse (typically, you'll reuse Transforms, and you'll persist StreamingLocators, etc.). If you want your account to be clean after experimenting, delete the resources that you don't plan to reuse. For example, the following code deletes the job, created assets and content key policy:
-
-[!code-csharp[Main](../../../media-services-v3-dotnet-tutorials/AMSV3Tutorials/UploadEncodeAndStreamFiles/Program.cs#CleanUp)]
-
-## Run the sample app
-
-1. Press Ctrl+F5 to run the *UploadEncodeAndStreamFiles* app.
-2. Copy one of the streaming URLs from the console.
-
-This example displays URLs that can be used to play back the video using different protocols:
-
-![Example output showing URLs for Media Services streaming video](./media/stream-files-tutorial-with-api/output.png)
-
-## Test the streaming URL
-
-To test the stream, this article uses Azure Media Player.
-
-> [!NOTE]
-> If a player is hosted on an https site, make sure to update the URL to "https".
-
-1. Open a web browser and navigate to [https://aka.ms/azuremediaplayer/](https://aka.ms/azuremediaplayer/).
-2. In the **URL:** box, paste one of the streaming URL values you got when you ran the app.
-3. Select **Update Player**.
-
-Azure Media Player can be used for testing but shouldn't be used in a production environment.
-
-## Clean up resources
-
-If you no longer need any of the resources in your resource group, including the Media Services and storage accounts you created for this tutorial, delete the resource group you created earlier.
-
-Execute the following CLI command:
-
-```azurecli
-az group delete --name amsResourceGroup
-```
-
-## Multithreading
-
-The Azure Media Services v3 SDKs aren't thread-safe. When developing a multi-threaded app, you should generate and use a new AzureMediaServicesClient object per thread.
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Next steps
-
-Now that you know how to upload, encode, and stream your video, see the following article:
-
-> [!div class="nextstepaction"]
-> [Analyze videos](analyze-videos-tutorial.md)
media-services Stream Files Tutorial With Rest https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-files-tutorial-with-rest.md
- Title: Encode a remote file and stream using Media Services
-description: Follow the steps of this tutorial to encode a file based on a URL and stream your content with Azure Media Services using REST.
-Previously updated: 03/17/2021
-# Tutorial: Encode a remote file based on URL and stream the video - REST
--
-Azure Media Services enables you to encode your media files into formats that can be played on a wide variety of browsers and devices. For example, you might want to stream your content in Apple's HLS or MPEG DASH formats. Before streaming, you should encode your high-quality digital media file. For encoding guidance, see [Encoding concept](encode-concept.md).
-
-This tutorial shows you how to encode a file based on a URL and stream the video with Azure Media Services using REST.
---
-![Play the video](./media/stream-files-tutorial-with-api/final-video.png)
-
-This tutorial shows you how to:
-
-> [!div class="checklist"]
-> * Create a Media Services account
-> * Access the Media Services API
-> * Download Postman files
-> * Configure Postman
-> * Send requests using Postman
-> * Test the streaming URL
-> * Clean up resources
--
-## Prerequisites
-- [Create a Media Services account](./account-create-how-to.md).
-
- Make sure to remember the values that you used for the resource group name and the Media Services account name.
-- Install the [Postman](https://www.getpostman.com/) REST client to execute the REST APIs shown in some of the AMS REST tutorials.
-
- This tutorial uses **Postman**, but any REST tool would be suitable. Alternatives include **Visual Studio Code** with the REST plugin or **Telerik Fiddler**.
-
-## Download Postman files
-
-Clone a GitHub repository that contains the Postman collection and environment files.
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-rest-postman.git
- ```
-
-## Access API
-
-For detailed information, see [Get credentials to access Media Services API](access-api-howto.md).
-
-## Configure Postman
-
-### Configure the environment
-
-1. Open the **Postman** app.
-2. On the right of the screen, select the **Manage environment** option.
-
- ![Manage env](./media/develop-with-postman/postman-import-env.png)
-3. From the **Manage environment** dialog, click **Import**.
-4. Browse to the `Azure Media Service v3 Environment.postman_environment.json` file that was downloaded when you cloned `https://github.com/Azure-Samples/media-services-v3-rest-postman.git`.
-5. The **Azure Media Service v3 Environment** environment is added.
-
- > [!Note]
- > Update access variables with values you got from the **Access the Media Services API** section above.
-
-6. Double-click on the selected file and enter values that you got by following the [accessing API](#access-api) steps.
-7. Close the dialog.
-8. Select the **Azure Media Service v3 Environment** environment from the dropdown.
-
- ![Choose env](./media/develop-with-postman/choose-env.png)
-
-### Configure the collection
-
-1. Click **Import** to import the collection file.
-2. Browse to the `Media Services v3.postman_collection.json` file that was downloaded when you cloned `https://github.com/Azure-Samples/media-services-v3-rest-postman.git`.
-3. Choose the **Media Services v3.postman_collection.json** file.
-
- ![Import a file](./media/develop-with-postman/postman-import-collection.png)
-
-## Send requests using Postman
-
-In this section, we send requests that are relevant to encoding and creating URLs so you can stream your file. Specifically, the following requests are sent:
-
-1. Get Azure AD Token for Service Principal Authentication
-2. Start a Streaming Endpoint
-3. Create an output asset
-4. Create a Transform
-5. Create a Job
-6. Create a Streaming Locator
-7. List paths of the Streaming Locator
-
-> [!Note]
-> This tutorial assumes you are creating all resources with unique names.
-
-### Get Azure AD Token
-
-1. In the left window of the Postman app, select "Step 1: Get AAD Auth token".
-2. Then, select "Get Azure AD Token for Service Principal Authentication".
-3. Press **Send**.
-
- The following **POST** operation is sent.
-
- ```
- https://login.microsoftonline.com/:aadTenantDomain/oauth2/token
- ```
-
-4. The response comes back with the token and sets the "AccessToken" environment variable to the token value. To see the code that sets "AccessToken", click on the **Tests** tab.
-
- ![Get AAD token](./media/develop-with-postman/postman-get-aad-auth-token.png)
--
-### Start a Streaming Endpoint
-
-To enable streaming, you first have to start the [Streaming Endpoint](./stream-streaming-endpoint-concept.md) from which you want to stream the video.
-
-> [!NOTE]
-> You are only billed when your Streaming Endpoint is in the running state.
-
-1. In the left window of the Postman app, select "Streaming and Live".
-2. Then, select "Start StreamingEndpoint".
-3. Press **Send**.
-
- * The following **POST** operation is sent:
-
- ```
- https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaservices/:accountName/streamingEndpoints/:streamingEndpointName/start?api-version={{api-version}}
- ```
- * If the request is successful, the `Status: 202 Accepted` is returned.
-
- This status means that the request has been accepted for processing; however, the processing has not been completed. You can query for the operation status based on the value in the `Azure-AsyncOperation` response header.
-
- For example, the following GET operation returns the status of your operation:
-
- `https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/<resourceGroupName>/providers/Microsoft.Media/mediaservices/<accountName>/streamingendpointoperations/1be71957-4edc-4f3c-a29d-5c2777136a2e?api-version=2018-07-01`
-
- The [track asynchronous Azure operations](../../azure-resource-manager/management/async-operations.md) article explains in depth how to track the status of asynchronous Azure operations through values returned in the response.
-
-### Create an output asset
-
-The output [Asset](/rest/api/media/assets) stores the result of your encoding job.
-
-1. In the left window of the Postman app, select "Assets".
-2. Then, select "Create or update an Asset".
-3. Press **Send**.
-
- * The following **PUT** operation is sent:
-
- ```
- https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaServices/:accountName/assets/:assetName?api-version={{api-version}}
- ```
- * The operation has the following body:
-
- ```json
- {
- "properties": {
- "description": "My Asset",
- "alternateId" : "some GUID",
- "storageAccountName": "<replace from environment file>",
- "container": "<supply any valid container name of your choosing>"
- }
- }
- ```
-
-> [!NOTE]
-> Be sure to replace the storage account and container names either with those from the environment file or supply your own.
->
-> As you complete the steps described in the rest of this article, make sure that you supply valid parameters in request bodies.
-
-### Create a transform
-
-When encoding or processing content in Media Services, it is a common pattern to set up the encoding settings as a recipe. You would then submit a **Job** to apply that recipe to a video. By submitting new jobs for each new video, you are applying that recipe to all the videos in your library. A recipe in Media Services is called a **Transform**. For more information, see [Transforms and Jobs](./transform-jobs-concept.md). The sample described in this tutorial defines a recipe that encodes the video in order to stream it to a variety of iOS and Android devices.
-
-When creating a new [Transform](/rest/api/media/transforms), you need to specify what you want it to produce as an output.
-
-You can use a built-in EncoderNamedPreset or use custom presets.
-
-> [!Note]
-> When creating a [Transform](/rest/api/media/transforms), you should first check if one already exists using the **Get** method. This tutorial assumes you are creating the transform with a unique name.
-
-1. In the left window of the Postman app, select "Encoding and Analysis".
-2. Then, select "Create Transform".
-3. Press **Send**.
-
- * The following **PUT** operation is sent.
-
- ```
- https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaServices/:accountName/transforms/:transformName?api-version={{api-version}}
- ```
- * The operation has the following body:
-
- ```json
- {
- "properties": {
- "description": "Standard Transform using an Adaptive Streaming encoding preset from the library of built-in Standard Encoder presets",
- "outputs": [
- {
- "onError": "StopProcessingJob",
- "relativePriority": "Normal",
- "preset": {
- "@odata.type": "#Microsoft.Media.BuiltInStandardEncoderPreset",
- "presetName": "AdaptiveStreaming"
- }
- }
- ]
- }
- }
- ```
-
-### Create a job
-
-A [Job](/rest/api/media/jobs) is the actual request to Media Services to apply the created **Transform** to a given input video or audio content. The **Job** specifies information like the location of the input video, and the location for the output.
-
-In this example, the job's input is based on an HTTPS URL ("https:\//nimbuscdn-nimbuspm.streaming.mediaservices.windows.net/2b533311-b215-4409-80af-529c3e853622/").
-
-1. In the left window of the Postman app, select "Encoding and Analysis".
-2. Then, select "Create or Update Job".
-3. Press **Send**.
-
- * The following **PUT** operation is sent.
-
- ```
- https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaServices/:accountName/transforms/:transformName/jobs/:jobName?api-version={{api-version}}
- ```
- * The operation has the following body:
-
- ```json
- {
- "properties": {
- "input": {
- "@odata.type": "#Microsoft.Media.JobInputHttp",
- "baseUri": "https://nimbuscdn-nimbuspm.streaming.mediaservices.windows.net/2b533311-b215-4409-80af-529c3e853622/",
- "files": [
- "Ignite-short.mp4"
- ]
- },
- "outputs": [
- {
- "@odata.type": "#Microsoft.Media.JobOutputAsset",
- "assetName": "testAsset1"
- }
- ]
- }
- }
- ```
-
-The job takes some time to complete and when it does you want to be notified. To see the progress of the job, we recommend using Event Grid. It is designed for high availability, consistent performance, and dynamic scale. With Event Grid, your apps can listen for and react to events from virtually all Azure services, as well as custom sources. Simple, HTTP-based reactive event handling helps you build efficient solutions through intelligent filtering and routing of events. See [Route events to a custom web endpoint](monitoring/job-state-events-cli-how-to.md).
-
-The **Job** usually goes through the following states: **Scheduled**, **Queued**, **Processing**, **Finished** (the final state). If the job has encountered an error, you get the **Error** state. If the job is in the process of being canceled, you get **Canceling** and **Canceled** when it is done.
-
-#### Job error codes
-
-See [Error codes](/rest/api/media/jobs/get#joberrorcode).
-
-### Create a streaming locator
-
-After the encoding job is complete, the next step is to make the video in the output **Asset** available to clients for playback. You can accomplish this in two steps: first, create a [StreamingLocator](/rest/api/media/streaminglocators), and second, build the streaming URLs that clients can use.
-
-The process of creating a streaming locator is called publishing. By default, the streaming locator is valid immediately after you make the API calls, and lasts until it is deleted, unless you configure the optional start and end times.
-
-When creating a [StreamingLocator](/rest/api/media/streaminglocators), you need to specify the desired **StreamingPolicyName**. In this example, you will be streaming in-the-clear (or non-encrypted) content, so the predefined clear streaming policy "Predefined_ClearStreamingOnly" is used.
-
-> [!IMPORTANT]
-> When using a custom [StreamingPolicy](/rest/api/media/streamingpolicies), you should design a limited set of such policies for your Media Service account, and re-use them for your StreamingLocators whenever the same encryption options and protocols are needed.
-
-Your Media Service account has a quota for the number of **Streaming Policy** entries. You should not be creating a new **Streaming Policy** for each streaming locator.
-
-1. In the left window of the Postman app, select "Streaming Policies and Locators".
-2. Then, select "Create a Streaming Locator (clear)".
-3. Press **Send**.
-
- * The following **PUT** operation is sent.
-
- ```
- https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaServices/:accountName/streamingLocators/:streamingLocatorName?api-version={{api-version}}
- ```
- * The operation has the following body:
-
- ```json
- {
- "properties": {
- "streamingPolicyName": "Predefined_ClearStreamingOnly",
- "assetName": "testAsset1",
- "contentKeys": [],
- "filters": []
- }
- }
- ```
-
-### List paths and build streaming URLs
-
-#### List paths
-
-Now that the [Streaming Locator](/rest/api/media/streaminglocators) has been created, you can get the streaming URLs.
-
-1. In the left window of the Postman app, select "Streaming Policies".
-2. Then, select "List Paths".
-3. Press **Send**.
-
- * The following **POST** operation is sent.
-
- ```
- https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaServices/:accountName/streamingLocators/:streamingLocatorName/listPaths?api-version={{api-version}}
- ```
-
- * The operation has no body.
-
-4. Note one of the paths you want to use for streaming; you will use it in the next section. In this case, the following paths were returned:
-
- ```
- "streamingPaths": [
- {
- "streamingProtocol": "Hls",
- "encryptionScheme": "NoEncryption",
- "paths": [
- "/cdb80234-1d94-42a9-b056-0eefa78e5c63/Ignite-short.ism/manifest(format=m3u8-aapl)"
- ]
- },
- {
- "streamingProtocol": "Dash",
- "encryptionScheme": "NoEncryption",
- "paths": [
- "/cdb80234-1d94-42a9-b056-0eefa78e5c63/Ignite-short.ism/manifest(format=mpd-time-csf)"
- ]
- },
- {
- "streamingProtocol": "SmoothStreaming",
- "encryptionScheme": "NoEncryption",
- "paths": [
- "/cdb80234-1d94-42a9-b056-0eefa78e5c63/Ignite-short.ism/manifest"
- ]
- }
- ]
- ```
-
-#### Build the streaming URLs
-
-In this section, let's build an HLS streaming URL. URLs consist of the following values:
-
-1. The protocol over which data is sent. In this case "https".
-
- > [!NOTE]
- > If a player is hosted on an https site, make sure to update the URL to "https".
-
-2. StreamingEndpoint's hostname. In this case, the name is "amsaccount-usw22.streaming.media.azure.net".
-
- To get the hostname, you can use the following GET operation:
-
- ```
- https://management.azure.com/subscriptions/00000000-0000-0000-0000-0000000000000/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaservices/:accountName/streamingEndpoints/default?api-version={{api-version}}
- ```
- and make sure that you set the `resourceGroupName` and `accountName` parameters to match the environment file.
-
-3. A path that you got in the previous (List paths) section.
-
-As a result, the following HLS URL was built:
-
-```
-https://amsaccount-usw22.streaming.media.azure.net/cdb80234-1d94-42a9-b056-0eefa78e5c63/Ignite-short.ism/manifest(format=m3u8-aapl)
-```
-
-## Test the streaming URL
--
-> [!NOTE]
-> Make sure the **Streaming Endpoint** from which you want to stream is running.
-
-To test the stream, this article uses Azure Media Player.
-
-1. Open a web browser and navigate to [https://aka.ms/azuremediaplayer/](https://aka.ms/azuremediaplayer/).
-2. In the **URL:** box, paste the URL you built.
-3. Press **Update Player**.
-
-Azure Media Player can be used for testing but should not be used in a production environment.
-
-## Clean up resources in your Media Services account
-
-Generally, you should clean up everything except objects that you are planning to reuse (typically, you will reuse **Transforms**, and you will persist **Streaming Locators**, etc.). If you want your account to be clean after experimenting, delete the resources that you do not plan to reuse.
-
-To delete a resource, select the "Delete ..." operation under whichever resource you want to delete.
-
-## Clean up resources
-
-If you no longer need any of the resources in your resource group, including the Media Services and storage accounts you created for this tutorial, delete the resource group you created earlier.
-
-Execute the following CLI command:
-
-```azurecli
-az group delete --name amsResourceGroup
-```
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Next steps
-
-Now that you know how to upload, encode, and stream your video, see the following article:
-
-> [!div class="nextstepaction"]
-> [Analyze videos](analyze-videos-tutorial.md)
media-services Stream Live Streaming Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-live-streaming-concept.md
- Title: Overview of Live streaming
-description: This article gives an overview of live streaming using Azure Media Services v3.
-Previously updated: 03/25/2021
-# Live streaming with Azure Media Services v3
--
-Azure Media Services enables you to deliver live events to your customers on the Azure cloud. To stream your live events with Media Services, you need the following:
-- A camera that is used to capture the live event. For setup ideas, check out [Simple and portable event video gear setup](https://link.medium.com/KNTtiN6IeT).
-
- If you do not have access to a camera, tools such as [Telestream Wirecast](https://www.telestream.net/wirecast/overview.htm) can be used to generate a live feed from a video file.
-- A live video encoder that converts signals from a camera (or another device, like a laptop) into a contribution feed that is sent to Media Services. The contribution feed can include signals related to advertising, such as SCTE-35 markers. For a list of recommended live streaming encoders, see [live streaming encoders](encode-recommended-on-premises-live-encoders.md). Also, check out this blog: [Live streaming production with OBS](https://link.medium.com/ttuwHpaJeT).
-- Components in Media Services, which enable you to ingest, preview, package, record, encrypt, and broadcast the live event to your customers, or to a CDN for further distribution.
-For customers looking to deliver content to large internet audiences, we recommend that you enable CDN on the [streaming endpoint](stream-streaming-endpoint-concept.md).
-
-This article gives an overview and guidance of live streaming with Media Services and links to other relevant articles.
-
-> [!NOTE]
-> You can use the [Azure portal](https://portal.azure.com/) to manage v3 [Live Events](live-event-outputs-concept.md), view v3 [assets](assets-concept.md), and get info about accessing APIs. For all other management tasks (for example, Transforms and Jobs), use the [REST API](/rest/api/media/) or one of the supported SDKs.
-
-## Dynamic packaging and delivery
-
-With Media Services, you can take advantage of [dynamic packaging](encode-dynamic-packaging-concept.md), which allows you to preview and broadcast your live streams in [MPEG DASH, HLS, and Smooth Streaming formats](https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming) from the contribution feed that is being sent to the service. Your viewers can play back the live stream with any HLS, DASH, or Smooth Streaming compatible players. You can use [Azure Media Player](https://amp.azure.net/libs/amp/latest/docs/https://docsupdatetracker.net/index.html) in your web or mobile applications to deliver your stream in any of these protocols.
-
-## Dynamic encryption
-
-Dynamic encryption enables you to dynamically encrypt your live or on-demand content with AES-128 or any of the three major digital rights management (DRM) systems: Microsoft PlayReady, Google Widevine, and Apple FairPlay. Media Services also provides a service for delivering AES keys and DRM (PlayReady, Widevine, and FairPlay) licenses to authorized clients. For more information, see [dynamic encryption](drm-content-protection-concept.md).
-
-> [!NOTE]
-> Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Dynamic filtering
-
-Dynamic filtering is used to control the number of tracks, formats, bitrates, and presentation time windows that are sent out to the players. For more information, see [filters and dynamic manifests](filters-dynamic-manifest-concept.md).
-
-## Live event types
-
-[Live events](/rest/api/media/liveevents) can be set up as either *pass-through* or *live encoding* events, as described in the following sections.
-
-### Pass-through
-
-![Diagram showing how the video and audio feeds from a pass-through Live Event are ingested and processed.](./media/live-streaming/pass-through.svg)
-
-When using the pass-through **Live Event** (basic or standard), you rely on your on-premises live encoder to generate a multiple bitrate video stream and send that as the contribution feed to the Live Event (using RTMP or fragmented-MP4 input protocol). The Live Event then carries through the incoming video streams to the dynamic packager (Streaming Endpoint) without any further transcoding. Such a pass-through Live Event is optimized for long-running live events or 24x365 linear live streaming.
-
-### Live encoding
-
-![live encoding](./media/live-streaming/live-encoding.svg)
-
-When using cloud encoding with Media Services, you would configure your on-premises live encoder to send a single bitrate video as the contribution feed (up to 32Mbps aggregate) to the Live Event (using RTMP or fragmented-MP4 input protocol). The Live Event transcodes the incoming single bitrate stream into [multiple bitrate video streams](https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming) at varying resolutions to improve delivery and makes it available for delivery to playback devices via industry standard protocols like MPEG-DASH, Apple HTTP Live Streaming (HLS), and Microsoft Smooth Streaming.
-
-### Live transcription (preview)
-
-Live transcription is a feature you can use with live events that are either pass-through or live encoding. For more information, see [live transcription](live-event-live-transcription-how-to.md). When this feature is enabled, the service uses the [Speech-To-Text](../../cognitive-services/speech-service/speech-to-text.md) feature of Cognitive Services to transcribe the spoken words in the incoming audio into text. This text is then made available for delivery along with video and audio in MPEG-DASH and HLS protocols.
-
-> [!NOTE]
-> Currently, live transcription is available as a preview feature in West US 2.
--
-## Live streaming workflow
-
-To understand the live streaming workflow in Media Services v3, you have to first review and understand the following concepts:
-- [Streaming endpoints](stream-streaming-endpoint-concept.md)
-- [Live events and live outputs](live-event-outputs-concept.md)
-- [Streaming locators](stream-streaming-locators-concept.md)
-### General steps
-
-1. In your Media Services account, make sure the **streaming endpoint** (origin) is running.
-2. Create a [live event](live-event-outputs-concept.md). <br/>When creating the event, you can specify to autostart it. Alternatively, you can start the event when you are ready to start streaming.<br/> When autostart is set to true, the Live Event will be started right after creation. The billing starts as soon as the Live Event starts running. You must explicitly call Stop on the live event resource to halt further billing. For more information, see [live event states and billing](live-event-states-billing-concept.md).
-3. Get the ingest URL(s) and configure your on-premises encoder to use the URL to send the contribution feed.<br/>See [recommended live encoders](encode-recommended-on-premises-live-encoders.md).
-4. Get the preview URL and use it to verify that the input from the encoder is actually being received.
-5. Create a new **asset** object.
-
- Each live output is associated with an asset, which it uses to record the video into the associated Azure blob storage container.
-6. Create a **live output** and use the asset name that you created so that the stream can be archived into the asset.
-
- Live Outputs start on creation and stop when deleted. When you delete the Live Output, you are not deleting the underlying asset and content in the asset.
-7. Create a **streaming locator** with the [built-in streaming policy types](stream-streaming-policy-concept.md).
-
- To publish the live output, you must create a streaming locator for the associated asset.
-8. List the paths on the **streaming locator** to get back the URLs to use (these are deterministic).
-9. Get the hostname for the **streaming endpoint** (Origin) you wish to stream from.
-10. Combine the URL from step 8 with the hostname in step 9 to get the full URL.
-11. If you wish to stop making your **live event** viewable, you need to stop streaming the event and delete the **streaming locator**.
-12. If you are done streaming events and want to clean up the resources provisioned earlier, use the following procedure.
-
- * Stop pushing the stream from the encoder.
- * Stop the live event. Once the live event is stopped, it will not incur any charges. When you need to start it again, it will have the same ingest URL so you won't need to reconfigure your encoder.
- * You can stop your streaming endpoint, unless you want to continue to provide the archive of your live event as an on-demand stream. If the streaming endpoint is in the stopped state, it will not incur any charges.
-
-The asset that the live output is archiving to automatically becomes an on-demand asset when the live output is deleted. You must delete all live outputs before a live event can be stopped. You can use the optional flag [removeOutputsOnStop](/rest/api/media/liveevents/stop#request-body) to automatically remove live outputs on stop.
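
As a hedged .NET sketch of the core calls in this workflow (all names are illustrative assumptions; see the tutorial linked in the tip below for the full sample):

```csharp
// Step 2: create the live event; autoStart: false defers billing until Start is called.
LiveEvent liveEvent = await client.LiveEvents.CreateAsync(
    resourceGroup, accountName, liveEventName,
    new LiveEvent
    {
        Location = mediaAccountLocation,
        Input = new LiveEventInput(LiveEventInputProtocol.RTMP)
    },
    autoStart: false);

// Steps 5-6: create the asset that the live output records into, then the live output.
Asset archiveAsset = await client.Assets.CreateOrUpdateAsync(resourceGroup, accountName, assetName, new Asset());
LiveOutput liveOutput = await client.LiveOutputs.CreateAsync(
    resourceGroup, accountName, liveEventName, liveOutputName,
    new LiveOutput(assetName: archiveAsset.Name, archiveWindowLength: TimeSpan.FromHours(1)));

// Steps 7-8: publish with a streaming locator, then list its paths.
await client.StreamingLocators.CreateAsync(
    resourceGroup, accountName, locatorName,
    new StreamingLocator { AssetName = archiveAsset.Name, StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly });
ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(resourceGroup, accountName, locatorName);
```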
-
-> [!TIP]
-> See the [live streaming tutorial](stream-live-tutorial-with-api.md), which examines the code that implements the steps described above.
-
-## Other important articles
-- [Recommended live encoders](encode-recommended-on-premises-live-encoders.md)
-- [Using a cloud DVR](live-event-cloud-dvr-time-how-to.md)
-- [Live event types feature comparison](live-event-types-comparison-reference.md)
-- [States and billing](live-event-states-billing-concept.md)
-- [Latency](live-event-latency-reference.md)
-- [Quotas and limits](limits-quotas-constraints-reference.md)
-## Live streaming FAQ
-
-See the [live streaming questions in the FAQ](frequently-asked-questions.yml).
media-services Stream Live Tutorial With Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-live-tutorial-with-api.md
- Title: Stream live with Media Services by using .NET 5.0
-description: Learn how to stream live events by using .NET 5.0.
-Previously updated: 06/13/2019
-# Tutorial: Stream live with Media Services by using .NET 5.0
-
-In Azure Media Services, [live events](/rest/api/media/liveevents) are responsible for processing live streaming content. A live event provides an input endpoint (ingest URL) that you then provide to a live encoder. The live event receives input streams from the live encoder and makes them available for streaming through one or more [streaming endpoints](/rest/api/media/streamingendpoints). Live events also provide a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.
-
-This tutorial shows how to use .NET 5.0 to create a *pass-through* type of a live event. In this tutorial, you will:
-
-> [!div class="checklist"]
-> * Download a sample app.
-> * Examine the code that performs live streaming.
-> * Watch the event with [Azure Media Player](https://amp.azure.net/libs/amp/latest/docs/https://docsupdatetracker.net/index.html) on the [Media Player demo site](https://ampdemo.azureedge.net).
-> * Clean up resources.
--
-> [!NOTE]
-> Even though the tutorial uses [.NET SDK](/dotnet/api/microsoft.azure.management.media.models.liveevent) examples, the general steps are the same for the [REST API](/rest/api/media/) and the other supported SDKs.
-
-## Prerequisites
-
-You need the following items to complete the tutorial:
-- Install [Visual Studio Code for Windows/macOS/Linux](https://code.visualstudio.com/) or [Visual Studio 2019 for Windows or Mac](https://visualstudio.microsoft.com/).
-- Install the [.NET 5.0 SDK](https://dotnet.microsoft.com/download).
-- [Create a Media Services account](./account-create-how-to.md). Be sure to copy the **API Access** details in JSON format or store the values needed to connect to the Media Services account in the *.env* file format used in this sample.
-- Follow the steps in [Access the Azure Media Services API with the Azure CLI](./access-api-howto.md) and save the credentials. You'll need them to access the API in this sample, or enter them into the *.env* file format.
-You need these additional items for live-streaming software:
-- A camera or a device (like a laptop) that's used to broadcast an event.
-- An on-premises software encoder that encodes your camera stream and sends it to the Media Services live-streaming service through the Real-Time Messaging Protocol (RTMP). For more information, see [Recommended on-premises live encoders](encode-recommended-on-premises-live-encoders.md). The stream has to be in RTMP or Smooth Streaming format.
- This sample assumes that you'll use Open Broadcaster Software (OBS) Studio to broadcast RTMP to the ingest endpoint. [Install OBS Studio](https://obsproject.com/download).
-
-> [!TIP]
-> Review [Live streaming with Media Services v3](stream-live-streaming-concept.md) before proceeding.
-
-## Download and configure the sample
-
-Clone the GitHub repository that contains the live-streaming .NET sample to your machine by using the following command:
-
-```bash
-git clone https://github.com/Azure-Samples/media-services-v3-dotnet.git
-```
-
-The live-streaming sample is in the [Live](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Live) folder.
--
-> [!IMPORTANT]
-> This sample uses a unique suffix for each resource. If you cancel the debugging or terminate the app without running it through, you'll end up with multiple live events in your account.
->
-> Be sure to stop the running live events. Otherwise, *you'll be billed*!
-
-## Examine the code that performs live streaming
-
-This section examines functions defined in the [Authentication.cs](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/Common_Utils/Authentication.cs) file (in the Common_Utils folder) and [Program.cs](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/Live/LiveEventWithDVR/Program.cs) file of the *LiveEventWithDVR* project.
-
-The sample creates a unique suffix for each resource so that you don't have name collisions if you run the sample multiple times without cleaning up.
--
-### Start using Media Services APIs with the .NET SDK
-
-Authentication.cs creates an `AzureMediaServicesClient` object using credentials supplied in the local configuration files (appsettings.json or .env).
-
-An `AzureMediaServicesClient` object allows you to start using Media Services APIs with .NET. To create the object, you need to supply credentials for the client to connect to Azure by using Azure Active Directory, which is implemented in `GetCredentialsAsync`. Another option is to use interactive authentication, which is implemented in `GetCredentialsInteractiveAuthAsync`.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Common_Utils/Authentication.cs#CreateMediaServicesClientAsync)]
-
-In the code that you cloned at the beginning of the article, the `GetCredentialsAsync` function creates the `ServiceClientCredentials` object based on the credentials supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Common_Utils/Authentication.cs#GetCredentialsAsync)]
-
-In the case of interactive authentication, the `GetCredentialsInteractiveAuthAsync` function creates the `ServiceClientCredentials` object based on an interactive authentication and the connection parameters supplied in the local configuration file (*appsettings.json*) or through the *.env* environment variables file in the root of the repository. In that case, AADCLIENTID and AADSECRET are not needed in the configuration or environment variables file.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Common_Utils/Authentication.cs#GetCredentialsInteractiveAuthAsync)]
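-
-For orientation, the service-principal path boils down to something like the following minimal sketch, assuming the `Microsoft.Rest.ClientRuntime.Azure.Authentication` package and placeholder tenant, client, secret, and subscription values:
-
-```csharp
-using Microsoft.Azure.Management.Media;
-using Microsoft.Rest;
-using Microsoft.Rest.Azure.Authentication;
-
-// Sign in as a service principal (Azure AD app) and build the Media Services client.
-ServiceClientCredentials credentials = await ApplicationTokenProvider.LoginSilentAsync(
-    "myTenantId", "myClientId", "myClientSecret", ActiveDirectoryServiceSettings.Azure);
-
-var client = new AzureMediaServicesClient(credentials)
-{
-    SubscriptionId = "mySubscriptionId"
-};
-```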
--
-### Create a live event
-
-This section shows how to create a *pass-through* type of live event (`LiveEventEncodingType` set to `None`). For information about the available types, see [Live event types](live-event-outputs-concept.md#live-event-types). In addition to pass-through, you can use a live transcoding event for 720p or 1080p adaptive bitrate cloud encoding.
-
-You might want to specify the following things when you're creating the live event (see the sketch after this list):
-
-* **The ingest protocol for the live event**. Currently, the RTMP, RTMPS, and Smooth Streaming protocols are supported. You can't change the protocol option while the live event or its associated live outputs are running. If you need different protocols, create a separate live event for each streaming protocol.
-* **IP restrictions on the ingest and preview**. You can define the IP addresses that are allowed to ingest a video to this live event. Allowed IP addresses can be specified as one of these choices:
-
- * A single IP address (for example, `10.0.0.1`)
- * An IP range that uses an IP address and a Classless Inter-Domain Routing (CIDR) subnet mask (for example, `10.0.0.1/22`)
- * An IP range that uses an IP address and a dotted decimal subnet mask (for example, `10.0.0.1(255.255.252.0)`)
-
- If no IP addresses are specified and there's no rule definition, then no IP address will be allowed. To allow any IP address, create a rule and set `0.0.0.0/0`. The IP addresses have to be in one of the following formats: IPv4 address with four numbers or a CIDR address range.
-* **Autostart on an event as you create it**. When autostart is set to `true`, the live event will start after creation. That means the billing starts as soon as the live event starts running. You must explicitly call `Stop` on the live event resource to halt further billing. For more information, see [Live event states and billing](live-event-states-billing-concept.md).
-
- Standby modes are available to start the live event in a lower-cost "allocated" state that makes it faster to move to a running state. This is useful for situations like hot pools that need to hand out channels quickly to streamers.
-* **A static host name and a unique GUID**. For an ingest URL to be predictable and easier to maintain in a hardware-based live encoder, set the `useStaticHostname` property to `true`. For detailed information, see [Live event ingest URLs](live-event-outputs-concept.md#live-event-ingest-urls).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Live/LiveEventWithDVR/Program.cs#CreateLiveEvent)]
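-
-To make those options concrete, here's a minimal sketch (under the same SDK models the sample uses) of a pass-through live event with RTMP ingest, an allow-all IP rule on ingest and preview, and autostart disabled. All names, the location, and the GUID are placeholders:
-
-```csharp
-// Allow-all rule; tighten this to your encoder's address range in production.
-var allowAll = new IPRange(name: "AllowAll", address: "0.0.0.0", subnetPrefixLength: 0);
-
-var liveEvent = new LiveEvent(
-    location: "West US 2",
-    description: "Sample pass-through live event",
-    useStaticHostname: true,
-    input: new LiveEventInput(
-        streamingProtocol: LiveEventInputProtocol.RTMP,
-        accessToken: "9eb1f703-3b3a-4c53-bb36-b2a4c0f3a3d2", // placeholder GUID for a predictable ingest URL
-        accessControl: new LiveEventInputAccessControl
-        {
-            Ip = new IPAccessControl(allow: new List<IPRange> { allowAll })
-        }),
-    preview: new LiveEventPreview
-    {
-        AccessControl = new LiveEventPreviewAccessControl(ip: new IPAccessControl(allow: new List<IPRange> { allowAll }))
-    },
-    encoding: new LiveEventEncoding(encodingType: LiveEventEncodingType.None));
-
-// autoStart: false keeps the event (and its billing) stopped until you explicitly start it.
-LiveEvent created = await client.LiveEvents.CreateAsync(
-    "myResourceGroup", "myAccountName", "myLiveEventName", liveEvent, autoStart: false);
-```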
-
-### Get ingest URLs
-
-After the live event is created, you can get the ingest URLs that you'll provide to the live encoder. The encoder uses these URLs to input a live stream.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Live/LiveEventWithDVR/Program.cs#GetIngestURL)]
-
-### Get the preview URL
-
-Use `previewEndpoint` to preview and verify that the input from the encoder is being received.
-
-> [!IMPORTANT]
-> Make sure that the video is flowing to the preview URL before you continue.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Live/LiveEventWithDVR/Program.cs#GetPreviewURLs)]
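-
-For reference, here's a minimal sketch of how the ingest and preview URLs can be read from the live event object, assuming the same SDK models and placeholder names:
-
-```csharp
-// Refresh the live event to pick up the allocated endpoints.
-LiveEvent liveEvent = await client.LiveEvents.GetAsync("myResourceGroup", "myAccountName", "myLiveEventName");
-
-// The first input endpoint is the ingest URL that you paste into the encoder (for example, OBS Studio).
-string ingestUrl = liveEvent.Input.Endpoints[0].Url;
-
-// The preview endpoint lets you verify that the stream is arriving before you publish it.
-string previewUrl = liveEvent.Preview.Endpoints[0].Url;
-```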
-
-### Create and manage live events and live outputs
-
-After you have the stream flowing into the live event, you can begin the streaming event by creating an asset, live output, and streaming locator. This will archive the stream and make it available to viewers through the streaming endpoint.
-
-When you're learning these concepts, it's helpful to think of the asset object as the tape that you would insert into a video tape recorder in the old days. The live output is the tape recorder machine. The live event is just the video signal coming into the back of the machine.
-
-You first create the signal by creating the live event. The signal is not flowing until you start that live event and connect your encoder to the input.
-
-The "tape" can be created at any time. It's just an empty asset that you'll hand to the live output object, the "tape recorder" in this analogy.
-
-The "tape recorder" can also be created at any time. You can create a live output before starting the signal flow, or after. If you need to speed up things, it's sometimes helpful to create the output before you start the signal flow.
-
-To stop the "tape recorder," you call `delete` on `LiveOutput`. This action doesn't delete the *contents* of the "tape" (asset). The asset is always kept with the archived video content until you call `delete` explicitly on the asset itself.
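-
-A minimal sketch of that call, with placeholder names:
-
-```csharp
-// "Eject the tape": stop recording by deleting the live output.
-// The asset and its archived content remain until you delete the asset itself.
-await client.LiveOutputs.DeleteAsync("myResourceGroup", "myAccountName", "myLiveEventName", "myLiveOutputName");
-```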
-
-The next section will walk through the creation of the asset and the live output.
-
-#### Create an asset
-
-Create an asset for the live output to use. In our analogy, this will be the "tape" that we record the live video signal onto. Viewers will be able to see the contents live or on demand from this virtual tape.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Live/LiveEventWithDVR/Program.cs#CreateAsset)]
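-
-A rough sketch of that call, assuming placeholder names:
-
-```csharp
-// An empty asset that the live output will record into (the "tape").
-Asset asset = await client.Assets.CreateOrUpdateAsync(
-    "myResourceGroup", "myAccountName", "myArchiveAssetName", new Asset());
-```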
-
-#### Create a live output
-
-Live outputs start when they're created and stop when they're deleted. When you delete the live output, you're not deleting the underlying asset or content in the asset. Think of it as ejecting the "tape." The asset with the recording will last as long as you like. When it's ejected (meaning, when the live output is deleted), it will be available for on-demand viewing immediately.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Live/LiveEventWithDVR/Program.cs#CreateLiveOutput)]
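-
-As a minimal sketch (placeholder names, same SDK models), a live output that records into that asset with a 1-hour DVR window looks something like this:
-
-```csharp
-// Record into the asset, keeping a 1-hour time-shifting (DVR) window.
-var liveOutput = new LiveOutput(
-    assetName: "myArchiveAssetName",
-    manifestName: "output",                      // manifest name used in the streaming URL paths
-    archiveWindowLength: TimeSpan.FromHours(1)); // can be up to 25 hours
-
-await client.LiveOutputs.CreateAsync(
-    "myResourceGroup", "myAccountName", "myLiveEventName", "myLiveOutputName", liveOutput);
-```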
-
-#### Create a streaming locator
-
-> [!NOTE]
-> When your Media Services account is created, a default streaming endpoint is added to your account in the stopped state. To start streaming your content and take advantage of [dynamic packaging](encode-dynamic-packaging-concept.md) and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the running state.
-
-When you publish the asset by using a streaming locator, the live event (up to the DVR window length) will continue to be viewable until the streaming locator's expiration or deletion, whichever comes first. This is how you make the virtual "tape" recording available for your viewing audience to see live and on demand. The same URL can be used to watch the live event, the DVR window, or the on-demand asset when the recording is complete (when the live output is deleted).
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Live/LiveEventWithDVR/Program.cs#CreateStreamingLocator)]
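-
-A minimal sketch of creating the locator with a clear (unencrypted) streaming policy, assuming placeholder names:
-
-```csharp
-// Publish the asset; viewers can stream until the locator expires or is deleted.
-StreamingLocator locator = await client.StreamingLocators.CreateAsync(
-    "myResourceGroup", "myAccountName", "myStreamingLocatorName",
-    new StreamingLocator(
-        assetName: "myArchiveAssetName",
-        streamingPolicyName: PredefinedStreamingPolicy.ClearStreamingOnly));
-```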
-
-```csharp
-// Get the streaming URL paths published by the streaming locator.
-ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(resourceGroupName, accountName, locatorName);
-
-// Build one playable URL per streaming protocol (HLS, DASH, Smooth Streaming).
-foreach (StreamingPath path in paths.StreamingPaths)
-{
-    UriBuilder uriBuilder = new UriBuilder
-    {
-        Scheme = "https",
-        Host = streamingEndpoint.HostName,
-        Path = path.Paths[0]
-    };
-
-    Console.WriteLine($"{path.StreamingProtocol}: {uriBuilder}");
-}
-```
-
-### Clean up resources in your Media Services account
-
-If you're done streaming events and want to clean up the resources provisioned earlier, use the following procedure:
-
-1. Stop pushing the stream from the encoder.
-1. Stop the live event. After the live event is stopped, it won't incur any charges. When you need to start it again, it will have the same ingest URL so you won't need to reconfigure your encoder.
-1. Stop your streaming endpoint, unless you want to continue to provide the archive of your live event as an on-demand stream. If the streaming endpoint is in a stopped state, it won't incur any charges.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Live/LiveEventWithDVR/Program.cs#CleanupLiveEventAndOutput)]
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/Live/LiveEventWithDVR/Program.cs#CleanupLocatorAssetAndStreamingEndpoint)]
-
-## Watch the event
-
-Press **Ctrl+F5** to run the code. This will output streaming URLs that you can use to watch your live event. Copy the streaming URL that you got when you created the streaming locator. You can use a media player of your choice. [Azure Media Player](https://amp.azure.net/libs/amp/latest/docs/index.html) is available to test your stream at the [Media Player demo site](https://ampdemo.azureedge.net).
-
-A live event automatically converts events to on-demand content when it's stopped. Even after you stop and delete the event, users can stream your archived content as a video on demand for as long as you don't delete the asset. An asset can't be deleted if an event is using it; the event must be deleted first.
-
-## Clean up remaining resources
-
-If you no longer need any of the resources in your resource group, including the Media Services and storage accounts that you created for this tutorial, delete the resource group that you created earlier.
-
-Run the following CLI command:
-
-```azurecli-interactive
-az group delete --name amsResourceGroup
-```
-
-> [!IMPORTANT]
-> Leaving the live event running incurs billing costs. Be aware that if the project or program stops responding or is closed out for any reason, it might leave the live event running in a billing state.
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Next steps
-
-[Stream files](stream-files-tutorial-with-api.md)
-
media-services Stream Live Tutorial With Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-live-tutorial-with-nodejs.md
- Title: Stream live with Media Services by using Node.js and TypeScript
-description: Learn how to stream live events by using Node.js, TypeScript, and OBS Studio.
- Previously updated: 02/13/2021
-# Tutorial: Stream live with Media Services by using Node.js and TypeScript
-
-In Azure Media Services, [live events](/rest/api/media/liveevents) are responsible for processing live streaming content. A live event provides an input endpoint (ingest URL) that you then provide to a live encoder. The live event receives input streams from the live encoder and makes them available for streaming through one or more [streaming endpoints](/rest/api/media/streamingendpoints). Live events also provide a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.
-
-This tutorial shows how to use Node.js and TypeScript to create a *pass-through* type of a live event and broadcast a live stream to it by using [OBS Studio](https://obsproject.com/download).
-
-In this tutorial, you will:
-
-> [!div class="checklist"]
-> * Download sample code.
-> * Examine the code that configures and performs live streaming.
-> * Watch the event with [Azure Media Player](https://amp.azure.net/libs/amp/latest/docs/index.html) on the [Media Player demo site](https://ampdemo.azureedge.net).
-> * Clean up resources.
---
-> [!NOTE]
-> Even though the tutorial uses Node.js examples, the general steps are the same for [REST API](/rest/api/medi#sdks).
-
-## Prerequisites
-
-You need the following items to complete the tutorial:
-- Install [Node.js](https://nodejs.org/en/download/).
-- Install [TypeScript](https://www.typescriptlang.org/).
-- [Create a Media Services account](./create-account-howto.md). Remember the values that you use for the resource group name and Media Services account name.
-- Follow the steps in [Access the Azure Media Services API with the Azure CLI](./access-api-howto.md) and save the credentials. You'll need them to access the API and configure your environment variables file.
-- Walk through the [Configure and connect with Node.js](./configure-connect-nodejs-howto.md) article to understand how to use the Node.js client SDK.
-- Install Visual Studio Code or Visual Studio.
-- [Set up your Visual Studio Code environment](https://code.visualstudio.com/Docs/languages/typescript) to support the TypeScript language.
-You need these additional items for live-streaming software:
-- A camera or a device (like a laptop) that's used to broadcast an event.
-- An on-premises software encoder that encodes your camera stream and sends it to the Media Services live-streaming service through the Real-Time Messaging Protocol (RTMP). For more information, see [Recommended on-premises live encoders](encode-recommended-on-premises-live-encoders.md). The stream has to be in RTMP or Smooth Streaming format.
- This sample assumes that you'll use Open Broadcaster Software (OBS) Studio to broadcast RTMP to the ingest endpoint. [Install OBS Studio](https://obsproject.com/download).
-
- Use the following encoding settings in OBS Studio:
-
- - Encoder: NVIDIA NVENC (if available) or x264
- - Rate control: CBR
- - Bit rate: 2,500 Kbps (or something reasonable for your computer)
- - Keyframe interval: 2 s, or 1 s for low latency
- - Preset: Low-latency Quality or Performance (NVENC) or "veryfast" using x264
- - Profile: high
- - GPU: 0 (Auto)
- - Max B-frames: 2
-
-> [!TIP]
-> Review [Live streaming with Media Services v3](stream-live-streaming-concept.md) before proceeding.
-
-## Download and configure the sample
-
-Clone the GitHub repository that contains the live-streaming Node.js sample to your machine by using the following command:
-
-```bash
-git clone https://github.com/Azure-Samples/media-services-v3-node-tutorials.git
-```
-
-The live-streaming sample is in the [Live](https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/main/Live/Standard_Passthrough_Live_Event) folder.
-
-In the [root folder](https://github.com/Azure-Samples/media-services-v3-node-tutorials/tree/main/) of the repository, copy the file named *sample.env* to a new file called *.env* to store the environment variable settings that you gathered in the article [Access the Azure Media Services API with the Azure CLI](./access-api-howto.md).
-Make sure that the file name includes the dot (.) in front of "env" so it can work with the code sample correctly.
-
-The [.env file](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/sample.env) contains your Azure Active Directory (Azure AD) application key and secret. It also contains the account name and subscription information required to authenticate SDK access to your Media Services account. The *.gitignore* file is already configured to prevent publishing this file into your forked repository. Don't allow these credentials to be leaked, because they're important secrets for your account.
-
-> [!IMPORTANT]
-> This sample uses a unique suffix for each resource. If you stop debugging or terminate the app before it runs to completion, you'll end up with multiple live events in your account.
->
-> Be sure to stop the running live events. Otherwise, *you'll be billed*! Run the program all the way to completion to clean up resources automatically. If the program stops, or you inadvertently stop the debugger and break out of the program execution, you should double check the portal to confirm that you haven't left any live events in the running or standby state that would result in unwanted billing charges.
-
-## Examine the TypeScript code for live streaming
-
-This section examines functions defined in the [index.ts](https://github.com/Azure-Samples/media-services-v3-node-tutorials/blob/main/Live/Standard_Passthrough_Live_Event/index.ts) file of the *Live/Standard_Passthrough_Live_Event* project.
-
-The sample creates a unique suffix for each resource so that you don't have name collisions if you run the sample multiple times without cleaning up.
-
-### Start using the Media Services SDK for Node.js with TypeScript
-
-To start using Media Services APIs with Node.js, you first need to add the [@azure/arm-mediaservices](https://www.npmjs.com/package/@azure/arm-mediaservices) SDK module by using npm:
-
-```bash
-npm install @azure/arm-mediaservices
-```
-
-In the *package.json* file, this is already configured for you. You just need to run `npm install` to load the modules and dependencies:
-
-1. Install the packages listed in the *package.json* file:
-
- ```bash
- npm install
- ```
-
-1. Open Visual Studio Code from the root folder. (This is required to start from the folder where the *.vscode* folder and *tsconfig.json* files are located.)
-
- ```bash
- code .
- ```
-
-Open the folder for *Live/Standard_Passthrough_Live_Event*, and open the *index.ts* file in the Visual Studio Code editor.
-
-While you're in the *index.ts* file, select the F5 key to open the debugger.
-
-### Setting the longRunningOperationUpdateIntervalMs
-
-To speed up the polling of long-running operations from the default of 30 seconds down to a couple of seconds, set *longRunningOperationUpdateIntervalMs* and pass this value to the *updateIntervalInMs* property of the options parameter on createAndWait() operations when using live events. You can see this throughout the sample, which uses a value of 2000 ms (2 seconds). This change reduces the time it takes to poll for the status of a long-running operation on the Azure Resource Manager endpoint, and it shortens the time to complete major operations like creating, starting, and stopping live events, which are all asynchronous calls. We recommend a value of 2 seconds for most time-sensitive scenarios.
--
-[!code-typescript[Main](../../../media-services-v3-node-tutorials/Live/Standard_Passthrough_Live_Event/index.ts#LongRunningOperation)]
--
-### Create a live event
-
-This section shows how to create a standard *pass-through* type of live event (`LiveEventEncodingType` set to `PassthroughStandard`). For information about the available types, see [Live event types](live-event-outputs-concept.md#live-event-types). In addition to basic or standard pass-through, you can use a live encoding event for 720p or 1080p adaptive bitrate cloud encoding.
-Examples of each of these event types are available in the *Live* folder of the sample repository, along with a sample that demonstrates how to listen to Event Grid events through Event Hubs.
-
-You might want to specify the following things when you're creating the live event:
-
-* **The ingest protocol for the live event**. Currently, the RTMP, RTMPS, and Smooth Streaming protocols are supported. You can't change the protocol option while the live event or its associated live outputs are running. If you need different protocols, create a separate live event for each streaming protocol.
-* **IP restrictions on the ingest and preview**. You can define the IP addresses that are allowed to ingest a video to this live event. Allowed IP addresses can be specified as one of these choices:
-
- * A single IP address (for example, `10.0.0.1`)
- * An IP range that uses an IP address and a Classless Inter-Domain Routing (CIDR) subnet mask (for example, `10.0.0.1/22`)
- * An IP range that uses an IP address and a dotted decimal subnet mask (for example, `10.0.0.1(255.255.252.0)`)
-
- If no IP addresses are specified and there's no rule definition, then no IP address will be allowed. To allow any IP address, create a rule and set `0.0.0.0/0`. The IP addresses have to be in one of the following formats: IPv4 address with four numbers or a CIDR address range.
-* **Autostart on an event as you create it**. When autostart is set to `true`, the live event will start after creation. That means the billing starts as soon as the live event starts running. You must explicitly call `Stop` on the live event resource to halt further billing. For more information, see [Live event states and billing](live-event-states-billing-concept.md).
-
- Standby modes are available to start the live event in a lower-cost "allocated" state that makes it faster to move to a running state. This is useful for situations like hot pools that need to hand out channels quickly to streamers.
-* **A static host name and a unique GUID**. For an ingest URL to be predictable and easier to maintain in a hardware-based live encoder, set the `useStaticHostname` property to `true`. For `accessToken`, use a custom, unique GUID. For detailed information, see [Live event ingest URLs](live-event-outputs-concept.md#live-event-ingest-urls).
--
-[!code-typescript[Main](../../../media-services-v3-node-tutorials/Live/Standard_Passthrough_Live_Event/index.ts#CreateLiveEvent)]
--
-### Create an asset to record and archive the live event
-
-In the following block of code, you create an empty asset to use as the "tape" for recording your live event archive.
-
-When you're learning these concepts, it's helpful to think of the asset object as the tape that you would insert into a video tape recorder in the old days. The live output is the tape recorder machine. The live event is just the video signal coming into the back of the machine.
-
-Keep in mind that the asset, or "tape," can be created at any time. You'll hand the empty asset to the live output object, the "tape recorder" in this analogy.
--
-[!code-typescript[Main](../../../media-services-v3-node-tutorials/Live/Standard_Passthrough_Live_Event/index.ts#CreateAsset)]
--
-### Create the live output
-
-In this section, you create a live output that uses the asset name as input to tell where to record the live event to. In addition, you set up the time-shifting (DVR) window to be used in the recording.
-
-The sample code shows how to set up a 1-hour time-shifting window. This window will allow clients to play back anything in the last hour of the event. In addition, only the last 1 hour of the live event will remain in the archive. You can extend this window to be up to 25 hours if needed. Also note that you can control the output manifest naming that the HTTP Live Streaming (HLS) and Dynamic Adaptive Streaming over HTTP (DASH) manifests use in your URL paths when published.
-
-The live output, or "tape recorder" in our analogy, can be created at any time as well. You can create a live output before starting the signal flow, or after. If you need to speed up things, it's often helpful to create the output before you start the signal flow.
-
-Live outputs start when they're created and stop when they're deleted. When you delete the live output, you're not deleting the underlying asset or content in the asset. Think of it as ejecting the "tape." The asset with the recording will last as long as you like. When it's ejected (meaning, when the live output is deleted), it will be available for on-demand viewing immediately.
--
-[!code-typescript[Main](../../../media-services-v3-node-tutorials/Live/Standard_Passthrough_Live_Event/index.ts#CreateLiveOutput)]
---
-### Get ingest URLs
-
-After the live event is created, you can get ingest URLs that you'll provide to the live encoder. The encoder uses these URLs to input a live stream by using the RTMP protocol.
--
-[!code-typescript[Main](../../../media-services-v3-node-tutorials/Live/Standard_Passthrough_Live_Event/index.ts#GetIngestURL)]
--
-### Get the preview URL
-
-Use `previewEndpoint` to preview and verify that the input from the encoder is being received.
-
-> [!IMPORTANT]
-> Make sure that the video is flowing to the preview URL before you continue.
--
-[!code-typescript[Main](../../../media-services-v3-node-tutorials/Live/Standard_Passthrough_Live_Event/index.ts#GetPreviewURL)]
--
-### Create and manage live events and live outputs
-
-After you have the stream flowing into the live event, you can begin the streaming event by publishing a streaming locator for your client players to use. This will make it available to viewers through the streaming endpoint.
-
-You first create the signal by creating the live event. The signal is not flowing until you start that live event and connect your encoder to the input.
-
-To stop the "tape recorder," you call `delete` on `LiveOutput`. This action doesn't delete the *contents* of your archive on the "tape" (asset). It only deletes the "tape recorder" and stops the archiving. The asset is always kept with the archived video content until you call `delete` explicitly on the asset itself. Even after you delete `LiveOutput`, the recorded content of the asset is still available to play back through any published streaming locator URLs.
-
-If you want to remove the ability of a client to play back the archived content, you first need to remove all locators from the asset. You also need to flush the content delivery network (CDN) cache on the URL path, if you're using a CDN for delivery. Otherwise, the content will live in the CDN's cache for the standard time-to-live setting on the CDN (which might be up to 72 hours).
-
-#### Create a streaming locator to publish HLS and DASH manifests
-
-> [!NOTE]
-> When your Media Services account is created, a default streaming endpoint is added to your account in the stopped state. To start streaming your content and take advantage of [dynamic packaging](encode-dynamic-packaging-concept.md) and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the running state.
-
-When you publish the asset by using a streaming locator, the live event (up to the DVR window length) will continue to be viewable until the streaming locator's expiration or deletion, whichever comes first. This is how you make the virtual "tape" recording available for your viewing audience to see live and on demand. The same URL can be used to watch the live event, the DVR window, or the on-demand asset when the recording is complete (when the live output is deleted).
--
-[!code-typescript[Main](../../../media-services-v3-node-tutorials/Live/Standard_Passthrough_Live_Event/index.ts#CreateStreamingLocator)]
--
-#### Build the paths to the HLS and DASH manifests
-
-The method `BuildManifestPaths` in the sample shows how to deterministically create the streaming paths to use for HLS or DASH delivery to various clients and player frameworks.
-
-[!code-typescript[Main](../../../media-services-v3-node-tutorials/Live/Standard_Passthrough_Live_Event/index.ts#BuildManifestPaths)]
--
-## Watch the event
-
-To watch the event, copy the streaming URL that you got when you ran the code to create a streaming locator. You can use a media player of your choice. [Azure Media Player](https://amp.azure.net/libs/amp/latest/docs/index.html) is available to test your stream at the [Media Player demo site](https://ampdemo.azureedge.net).
-
-A live event automatically converts events to on-demand content when it's stopped. Even after you stop and delete the event, users can stream your archived content as a video on demand for as long as you don't delete the asset. An asset can't be deleted if an event is using it; the event must be deleted first.
-
-## Clean up resources in your Media Services account
-
-If you run the application all the way through, it will automatically clean up all of the resources used in the `cleanUpResources` function. Make sure that the application or debugger runs all the way to completion, or you might leak resources and end up with running live events in your account. Double check in the Azure portal to confirm that all resources are cleaned up in your Media Services account.
-
-In the sample code, refer to the `cleanUpResources` method for details.
-
-> [!IMPORTANT]
-> Leaving the live event running incurs billing costs. Be aware that if the project or program stops responding or is closed out for any reason, it might leave the live event running in a billing state.
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## More developer documentation for Node.js on Azure
-- [Azure for JavaScript and Node.js developers](/azure/developer/javascript/)
-- [Media Services source code in the @azure/azure-sdk-for-js GitHub repo](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/mediaservices/arm-mediaservices)
-- [Azure package documentation for Node.js developers](/javascript/api/overview/azure/)
-## Next steps
-
-[Stream files](stream-files-tutorial-with-api.md)
media-services Stream Manage Streaming Endpoints How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-manage-streaming-endpoints-how-to.md
- Title: Manage streaming endpoints
-description: This article demonstrates how to manage streaming endpoints with Azure Media Services v3.
- Previously updated: 03/01/2022
-# Manage streaming endpoints with Media Services v3
--
-When your Media Services account is created, a **default** [Streaming Endpoint](stream-streaming-endpoint-concept.md) is added to your account in the **Stopped** state. To start streaming your content and take advantage of [dynamic packaging](encode-dynamic-packaging-concept.md) and [dynamic encryption](drm-content-protection-concept.md), the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-This article shows you how to execute the [start](/rest/api/media/streamingendpoints/start) command on your streaming endpoint using different technologies.
-
-> [!NOTE]
-> You are only billed when your streaming endpoint is in the running state.
-
-## Prerequisites
-
-Review:
-
-* [Media Services concepts](concepts-overview.md)
-* [Streaming Endpoint concept](stream-streaming-endpoint-concept.md)
-* [Dynamic packaging](encode-dynamic-packaging-concept.md)
-
-## [Portal](#tab/portal/)
-
-## Use the Azure portal
-
-1. Sign in to the [Azure portal](https://portal.azure.com/).
-1. Go to your Azure Media Services account.
-1. In the left pane, select **Streaming Endpoints**.
-1. Select the streaming endpoint you want to start, and then select **Start**.
-
-## [CLI](#tab/CLI/)
-
-## Use the Azure CLI
-
-```cli
-az ams streaming-endpoint start [--account-name]
- [--ids]
- [--name]
- [--no-wait]
- [--resource-group]
- [--subscription]
-```
-
-For more information, see [az ams streaming-endpoint start](/cli/azure/ams/streaming-endpoint#az-ams-streaming-endpoint-start).
-
-## [REST](#tab/rest/)
-
-## Use REST
-
-```rest
-POST https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mediaresources/providers/Microsoft.Media/mediaservices/slitestmedia10/streamingEndpoints/myStreamingEndpoint1/start?api-version=2018-07-01
-```
-
-For more information, see:
-
-* The [start a StreamingEndpoint](/rest/api/media/streamingendpoints/start) reference documentation.
-* Starting a streaming endpoint is an asynchronous operation.
-
- For information about how to monitor long-running operations, see [Long-running operations](media-services-apis-overview.md).
-* This [Postman collection](https://github.com/Azure-Samples/media-services-v3-rest-postman/blob/master/Postman/Media%20Services%20v3.postman_collection.json) contains examples of multiple REST operations, including on how to start a streaming endpoint.
-
-## [.NET](#tab/net/)
-
-## Use .NET
-
-```csharp
-StreamingEndpoint streamingEndpoint = await client.StreamingEndpoints.GetAsync(config.ResourceGroup, config.AccountName, DefaultStreamingEndpointName);
-
-if (streamingEndpoint != null)
-{
-    // Start the default streaming endpoint only if it isn't already running.
-    if (streamingEndpoint.ResourceState != StreamingEndpointResourceState.Running)
-    {
-        await client.StreamingEndpoints.StartAsync(config.ResourceGroup, config.AccountName, DefaultStreamingEndpointName);
-    }
-}
-```
-
-See the complete [.NET code sample](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/Streaming/StreamHLSAndDASH/Program.cs#L112).
--
media-services Stream Scale Streaming Cdn Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-scale-streaming-cdn-concept.md
-
- Title: Stream content with CDN integration
-description: Learn about streaming content with CDN integration, as well as prefetching and Origin-Assist CDN-Prefetch.
- Previously updated: 08/31/2020
-# Stream content with CDN integration
--
-Azure Content Delivery Network (CDN) offers developers a global solution for rapidly delivering high-bandwidth content to users by caching their content at strategically placed physical nodes across the world.
-
-CDN caches content streamed from a Media Services [Streaming Endpoint (origin)](stream-streaming-endpoint-concept.md) per codec, per streaming protocol, per bitrate, per container format, and per encryption/DRM. For each combination of codec-streaming protocol-container format-bitrate-encryption, there will be a separate CDN cache.
-
-The popular content will be served directly from the CDN cache as long as the video fragment is cached. Live content is likely to be cached because you typically have many people watching the exact same thing. On-demand content can be a bit trickier because you could have some content that's popular and some that isn't. If you have millions of video assets where none of them are popular (only one or two viewers a week) but you have thousands of people watching all different videos, the CDN becomes much less effective.
-
-You also need to consider how adaptive streaming works. Each individual video fragment is cached as its own entity. For example, imagine the first time a certain video is watched. If the viewer skips around watching only a few seconds here and there, only the video fragments associated with what the person watched get cached in CDN. With adaptive streaming, you typically have 5 to 7 different bitrates of video. If one person is watching one bitrate and another person is watching a different bitrate, then they're each cached separately in the CDN. Even if two people are watching the same bitrate, they could be streaming over different protocols. Each protocol (HLS, MPEG-DASH, Smooth Streaming) is cached separately. So each bitrate and protocol are cached separately and only those video fragments that have been requested are cached.
-
-Except for the test environment, we recommend that CDN be enabled for both Standard and Premium streaming endpoints. Each type of streaming endpoint has a different supported throughput limit.
-It's difficult to calculate precisely the maximum number of concurrent streams that a streaming endpoint supports, because there are various factors to take into account. These include:
-
-- Maximum bitrates used for streaming
-- Player pre-buffer and switching behavior. Players try to burst segments from an origin and use the load speed to calculate adaptive bitrate switching. If a streaming endpoint gets close to saturation, response times can vary and players start switching to a lower quality. Because this reduces the load on the streaming endpoint, players then scale back up to a higher quality, creating unwanted switching triggers.
-
-Overall, it's safe to estimate the maximum number of concurrent streams by taking the maximum streaming endpoint throughput and dividing it by the maximum bitrate (assuming all players use the highest bitrate). For example, a Standard streaming endpoint is limited to 600 Mbps; with a top bitrate of 3 Mbps, approximately 200 concurrent streams are supported at that bitrate (600 / 3 = 200). Remember to factor in the audio bandwidth requirements as well. Although an audio stream may only be 128 Kbps, the total adds up quickly when you multiply it by the number of concurrent streams.
-
-This topic discusses enabling [CDN integration](#enable-azure-cdn-integration). It also explains prefetching (active caching) and the [Origin-Assist CDN-Prefetch](#origin-assist-cdn-prefetch) concept.
-
-## Considerations
-- The [streaming endpoint](stream-streaming-endpoint-concept.md) `hostname` and the streaming URL remain the same whether or not you enable CDN.
-- If you need the ability to test your content with or without CDN, create another streaming endpoint that isn't CDN enabled.
-## Enable Azure CDN integration
-
-> [!IMPORTANT]
-> You can't enable CDN for trial or student Azure accounts.
->
-> CDN integration is enabled in all the Azure data centers except Federal Government and China regions.
-
-After a streaming endpoint is provisioned with CDN enabled, there's a defined wait time on Media Services before DNS update is done to map the streaming endpoint to CDN endpoint.
-
-If you later want to disable/enable the CDN, your streaming endpoint must be in the **stopped** state. Once the streaming endpoint is started it could take up to four hours for the Azure CDN integration to be enabled and for the changes to be active across all the CDN POPs. However, you can start your streaming endpoint and stream without interruptions from the streaming endpoint. Once the integration is complete, the stream is delivered from the CDN. During the provisioning period, your streaming endpoint will be in the **starting** state and you might observe degraded performance.
-
-When the Standard streaming endpoint is created, it's configured by default with Standard Verizon. You can configure Premium Verizon or Standard Akamai providers using REST APIs.
-
-Azure Media Services integration with Azure CDN is implemented on **Azure CDN from Verizon** for standard streaming endpoints. Premium streaming endpoints can be configured using all **Azure CDN pricing tiers and providers**.
-
-> [!NOTE]
-> For details about Azure CDN, see the [CDN overview](../../cdn/cdn-overview.md).
-
-## Determine if a DNS change was made
-
-You can determine whether a DNS change was made on a streaming endpoint (that is, whether traffic is being directed to the Azure CDN) by using <https://www.digwebinterface.com>. If you see azureedge.net domain names in the results, the traffic is now being pointed to the CDN.
-
-## Origin-Assist CDN-Prefetch
-
-CDN caching is a reactive process. If the CDN can predict which object will be requested next, the CDN can proactively request and cache that object. With this process, you can achieve a cache hit for all (or most) objects, which improves performance.
-
-The concept of prefetching strives to position objects at the "edge of the internet" in anticipation that these will be requested by the player imminently, thereby reducing the time to deliver that object to the player.
-
-To achieve this goal, a streaming endpoint (origin) and CDN need to work hand-in-hand in a couple ways:
-- The Media Services origin needs to have the "intelligence" (Origin-Assist) to inform the CDN of the next object to prefetch.
-- The CDN does the prefetch and caching (the CDN-Prefetch part). The CDN also needs the "intelligence" to inform the origin whether a request is a prefetch or a regular fetch, to handle 404 responses, and to avoid an endless prefetch loop.
-### Benefits
-
-The benefits of the *Origin-Assist CDN-Prefetch* feature include:
-
-- Prefetch improves video playback quality by pre-positioning anticipated video segments at the edge during playback, reducing latency to the viewer and improving video segment download times. This results in faster video start-up time and fewer rebuffering occurrences.
-- The concept is applicable to the general CDN-origin scenario and isn't limited to media.
-- Akamai has added this feature to [Akamai Cloud Embed (ACE)](https://learn.akamai.com/en-us/products/media_delivery/cloud_embed.html).
-> [!NOTE]
-> This feature is not yet applicable to the Akamai CDN integrated with Media Services streaming endpoint. However, it's available for Media Services customers that have a pre-existing Akamai contract and require custom integration between Akamai CDN and the Media Services origin.
-
-### How it works
-
-CDN support for the `Origin-Assist CDN-Prefetch` headers (for both live and video-on-demand streaming) is available to customers who have a direct contract with Akamai CDN. The feature involves the following HTTP header exchanges between Akamai CDN and the Media Services origin:
-
-|HTTP header|Values|Sender|Receiver|Purpose|
-| - | - | - | - | -- |
-|`CDN-Origin-Assist-Prefetch-Enabled` | 1 (default) or 0 |CDN|Origin|To indicate CDN is prefetch enabled.|
-|`CDN-Origin-Assist-Prefetch-Path`| Example: <br/>Fragments(video=1400000000,format=mpd-time-cmaf)|Origin|CDN|To provide prefetch path to CDN.|
-|`CDN-Origin-Assist-Prefetch-Request`|1 (prefetch request) or 0 (regular request)|CDN|Origin|To indicate the request from CDN is a prefetch.|
-
-To see part of the header exchange in action, you can try the following steps:
-
-1. Use Postman or cURL to issue a request to the Media Services origin for an audio or video segment or fragment. Make sure to add the header `CDN-Origin-Assist-Prefetch-Enabled: 1` in the request.
-2. In the response, you should see the header `CDN-Origin-Assist-Prefetch-Path` with a relative path as its value.
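-
-You can also script the same check. Here's a minimal C# sketch using `HttpClient`; the fragment URL is a placeholder for a real segment on your origin:
-
-```csharp
-using System;
-using System.Net.Http;
-
-using var http = new HttpClient();
-
-var request = new HttpRequestMessage(
-    HttpMethod.Get,
-    "https://myaccount-usea.streaming.media.azure.net/myLocatorId/myAsset.ism/Fragments(video=0,format=mpd-time-cmaf)");
-
-// Tell the origin that the caller is prefetch enabled, as a CDN would.
-request.Headers.Add("CDN-Origin-Assist-Prefetch-Enabled", "1");
-
-HttpResponseMessage response = await http.SendAsync(request);
-
-// If supported, the origin returns the relative path of the next object to prefetch.
-if (response.Headers.TryGetValues("CDN-Origin-Assist-Prefetch-Path", out var prefetchPaths))
-{
-    Console.WriteLine($"Next fragment to prefetch: {string.Join(", ", prefetchPaths)}");
-}
-```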
-
-### Supported streaming protocols
-
-The `Origin-Assist CDN-Prefetch` feature supports the following streaming protocols for live and on-demand streaming:
-
-* HLS v3
-* HLS v4
-* HLS CMAF
-* DASH (CSF)
-* DASH (CMAF)
-* Smooth streaming
-
-### FAQs
-
-* What if a prefetch path URL is invalid so that CDN prefetch gets a 404?
-
- CDN will only cache a 404 response for 10 seconds (or other configured value).
-
-* Suppose you have an on-demand video. If CDN-prefetch is enabled, does this feature imply that once a client requests the first video segment, prefetch will start a loop to prefetch all subsequent video segments at the same bitrate?
-
- No, CDN-prefetch is done only after a client-initiated request/response. CDN-prefetch is never triggered by a prefetch, to avoid a prefetch loop.
-
-* Is Origin-Assist CDN-Prefetch feature always on? How can it be turned on/off?
-
- This feature is off by default. Customers need to turn it on via Akamai API.
-
-* For live streaming, what would happen to Origin-Assist if the next segment or fragment isn't yet available?
-
- In this case, the Media Services origin won't provide `CDN-Origin-Assist-Prefetch-Path` header and CDN-prefetch will not occur.
-
-* How does `Origin-Assist CDN-Prefetch` work with dynamic manifest filters?
-
- This feature works independently of manifest filters. When the next fragment is outside a filter window, its URL is still located by looking into the raw client manifest and is then returned in the CDN prefetch response header. So the CDN gets the URL of a fragment that's filtered out of the DASH/HLS/Smooth manifest. However, the player will never make a GET request to the CDN to fetch that fragment, because the fragment isn't included in the DASH/HLS/Smooth manifest held by the player (the player doesn't know the fragment exists).
-
-* Can DASH MPD/HLS playlist/Smooth manifest be prefetched?
-
- No. DASH MPD, HLS master playlist, HLS variant playlist, and Smooth manifest URLs aren't added to the prefetch header.
-
-* Are prefetch URLs relative or absolute?
-
- While Akamai CDN allows both, the Media Services origin only provides relative URLs for prefetch path because there's no apparent benefit in using absolute URLs.
-
-* Does this feature work with DRM-protected contents?
-
- Yes, since this feature works at the HTTP level, it doesn't decode or parse any segment/fragment. It doesn't care whether the content is encrypted or not.
-
-* Does this feature work with Server Side Ad Insertion (SSAI)?
-
- It works for the original/main content (the video content before ad insertion), because SSAI doesn't change the timestamp of the source content from the Media Services origin. Whether this feature works with ad content depends on whether the ad origin supports Origin-Assist. For example, if ad content is also hosted in Azure Media Services (the same or a separate origin), the ad content will also be prefetched.
-
-* Does this feature work with UHD/HEVC contents?
-
- Yes.
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## Next steps
-
-* Make sure to review the [Streaming Endpoint (origin)](stream-streaming-endpoint-concept.md) document.
-* The sample [in this repository](https://github.com/Azure-Samples/media-services-v3-dotnet-quickstarts/blob/master/AMSV3Quickstarts/EncodeAndStreamFiles/Program.cs) shows how to start the default streaming endpoint with .NET.
media-services Stream Streaming Endpoint Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-streaming-endpoint-concept.md
-
- Title: Streaming Endpoints (Origin)
-description: Learn about Streaming Endpoints (Origin), a dynamic packaging and streaming service that delivers content directly to a client player app or to a Content Delivery Network (CDN).
- Previously updated: 01/20/2022
-# Streaming Endpoints (Origin) in Azure Media Services
-
-In Microsoft Azure Media Services, a [Streaming Endpoint](/rest/api/media/streamingendpoints) represents a dynamic (just-in-time) packaging and origin service that can deliver your live and on-demand content directly to a client player app, using one of the common streaming media protocols (HLS or DASH). The **Streaming Endpoint** also provides dynamic (just-in-time) encryption to industry-leading DRMs.
-
-When you create a Media Services account, a **default** streaming endpoint is created for you in a stopped state. You can create more streaming endpoints under the account (see [Quotas and limits](limits-quotas-constraints-reference.md)).
-
-> [!NOTE]
-> To start streaming videos, you need to start the **Streaming Endpoint** from which you want to stream the video.
-> You're only billed when your streaming endpoint is in the running state.
-
-Make sure to also review the article [Dynamic packaging](encode-dynamic-packaging-concept.md).
-
-## Naming convention
-
-The host name format of the streaming URL is `{servicename}-{accountname}-{regionname}.streaming.media.azure.net`, where
-`servicename` = the streaming endpoint name or the live event name.
-
-When using the default streaming endpoint, `servicename` is omitted, so the URL is: `{accountname}-{regionname}.streaming.media.azure.net`.
-
-### Limitations
-
-* The streaming endpoint name has a maximum length of 24 characters.
-* The name should follow this [regex](/dotnet/standard/base-types/regular-expression-language-quick-reference) pattern: `^[a-zA-Z0-9]+(-*[a-zA-Z0-9])*$`.
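-
-For example, a quick client-side check of a candidate name against these rules (a hypothetical helper, not part of the SDK) could look like this:
-
-```csharp
-using System.Text.RegularExpressions;
-
-// Returns true when the name is 1-24 characters and matches the allowed pattern.
-static bool IsValidStreamingEndpointName(string name) =>
-    name.Length >= 1 && name.Length <= 24 && Regex.IsMatch(name, "^[a-zA-Z0-9]+(-*[a-zA-Z0-9])*$");
-```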
-
-## Types
-
-There are two **Streaming Endpoint** types: **Standard** (preview) and **Premium**. The type is defined by the number of scale units (`scaleUnits`) you allocate for the streaming endpoint.
-
-The maximum streaming unit limit is usually 10. Contact [Azure support](https://azure.microsoft.com/support/create-ticket/) to raise the limit for your account.
-
-The following table describes the Premium and Standard streaming endpoint types.
-
-|Type|Scale units|Description|
-|--|--|--|
-|**Standard**|0|The default streaming endpoint is a **Standard** type. You can change it to the Premium type by adjusting the `scaleUnits`.|
-|**Premium**|> 0|**Premium** streaming endpoints are suitable for advanced workloads and providing dedicated and scalable bandwidth capacity. You can move to a **Premium** type by adjusting the `scaleUnits` (streaming units). The `scaleUnits` provides a dedicated egress capacity that you can purchase in increments of 200 Mbps. When using the **Premium** type, each enabled unit provides an additional bandwidth capacity to the app. |
-
-> [!NOTE]
-> For customers looking to deliver content to large internet audiences, we recommend you enable CDN on the streaming endpoint.
-
-## Comparing streaming types
-
-Feature|Standard|Premium
-||
-Throughput |Up to 600 Mbps and can provide a much higher effective throughput when you use CDN.|200 Mbps per streaming unit (SU). Can provide a much higher effective throughput when you use CDN.
-CDN|Azure CDN, third-party CDN, or no CDN.|Azure CDN, third-party CDN, or no CDN.
-Billing is prorated| Daily|Daily
-Dynamic encryption|Yes|Yes
-Dynamic packaging|Yes|Yes
-Scale|Auto scales up to the targeted throughput.|Additional SUs.
-IP filtering/G20/Custom host <sup>1</sup>|Yes|Yes
-Progressive download|Yes|Yes
-Resource type| Shared <sup>2</sup>|Dedicated
-Recommended usage |Recommended for testing and non-essential streaming scenarios.|Professional usage.
-
-<sup>1</sup> Only used directly on the streaming endpoint when the CDN isn't enabled on the endpoint.<br/>
-<sup>2</sup> Standard streaming endpoints use a shared pool of resources.<br/>
-
-### Versions
-
-|Type|StreamingEndpointVersion|ScaleUnits|CDN|Billing|
-|--|-|--|--|--|
-|Classic|1.0|0|NA|Free|
-|Standard Streaming Endpoint (preview)|2.0|0|Yes|Paid|
-|Premium Streaming Units|1.0|> 0|Yes|Paid|
-|Premium Streaming Units|2.0|> 0|Yes|Paid|
-
-> [!NOTE]
-> The SLA is only applicable to the Premium streaming endpoints and not the Standard streaming endpoints. For information on SLA, see [Pricing and SLA](https://azure.microsoft.com/pricing/details/media-services/).
-
-## Migration between types
-
-From | To | Action
-||
-Classic|Standard|Need to opt in
-Classic|Premium| Scale (additional streaming units)
Standard/Premium|Classic|Not available. (If the streaming endpoint version is 1.0, you can change it to classic by setting the `scaleUnits` value to 0.)
-Standard (with/without CDN)|Premium with the same configurations.|Allowed in the **started** state (via Azure portal).
-Premium (with/without CDN)|Standard with the same configurations.|Allowed in the **started** state (via Azure portal).
Standard (with/without CDN)|Premium with different configurations.|Allowed in the **stopped** state (via Azure portal). Not allowed in the **running** state.
Premium (with/without CDN)|Standard with different configurations.|Allowed in the **stopped** state (via Azure portal). Not allowed in the **running** state.
-Version 1.0 with SU >= 1 with CDN|Standard/Premium with no CDN|Allowed in the **stopped** state. Not allowed in the **started** state.
-Version 1.0 with SU >= 1 with CDN|Standard with/without CDN|Allowed in the **stopped** state. Not allowed in the **started** state. Version 1.0 CDN will be deleted and new one created and started.
-Version 1.0 with SU >= 1 with CDN|Premium with/without CDN|Allowed in the **stopped** state. Not allowed in the **started** state. Classic CDN will be deleted and new one created and started.
-
-## Streaming endpoint properties
-
-This section discusses some of the properties of streaming endpoints. For examples of how to create a new streaming endpoint and descriptions of all the properties, see [Streaming endpoint](/rest/api/media/streamingendpoints/create).
-
-* `accessControl` - Configures the following security settings for this streaming endpoint: Akamai Signature Header Authentication keys and IP addresses that are allowed to connect to this endpoint. This property can only be set when `cdnEnabled` is set to false.
-* `cdnEnabled` - Indicates if the Azure CDN integration for this streaming endpoint is enabled (disabled by default). If you set `cdnEnabled` to true, the following configurations get disabled: `customHostNames` and `accessControl`.
-
- Not all data centers support the Azure CDN integration. To check if your data center has the Azure CDN integration available, do the following steps:
-
- - Try to set the `cdnEnabled` to true.
- - Check the returned result for the `HTTP Error Code 412` (PreconditionFailed) message - "Streaming endpoint CdnEnabled property can't be set to true as CDN capability is unavailable in the current region."
-
- If you get this error, the data center doesn't support it. Try another data center.
-* `cdnProfile` - When `cdnEnabled` is set to true, you can also pass `cdnProfile` values. `cdnProfile` is the name of the CDN profile where the CDN endpoint gets created. You can provide an existing `cdnProfile` or use a new one. If the value is `NULL` and `cdnEnabled` is true, the default value "AzureMediaStreamingPlatformCdnProfile" is used. If the provided `cdnProfile` already exists, an endpoint gets created under it. If the profile doesn't exist, a new profile automatically gets created.
-
-* `cdnProvider` - When CDN is enabled, you can also pass `cdnProvider` values. `cdnProvider` controls which provider is used. Presently, three values are supported: "StandardVerizon", "PremiumVerizon", and "StandardAkamai". If no value is provided and `cdnEnabled` is true, the default value "StandardVerizon" is used.
-
-* `crossSiteAccessPolicies` - Specifies cross-site access policies for various clients. For more information, see [Cross-domain policy file specification](https://www.adobe.com/devnet-docs/acrobatetk/tools/AppSec/CrossDomain_PolicyFile_Specification.pdf) and [Making a Service Available Across Domain Boundaries](/previous-versions/azure/azure-services/gg185950(v=azure.100)). The settings only apply to Smooth Streaming.
-
-* `customHostNames` - Configures a streaming endpoint to accept traffic directed to a custom host name. This property is valid for Standard and Premium streaming endpoints and can be set when `cdnEnabled` is false.
- * The ownership of the domain name must be confirmed by Media Services. Media Services verifies the domain name ownership with the help of the `CName` record that contains the Media Services account ID as a component to be added to the domain in use. For example, if you use "sports.contoso.com" as a custom host name for the streaming endpoint, configure a record for `<accountId>.contoso.com` to point to one of Media Services verification host names. The verification host name is composed of `verifydns.<mediaservices-dns-zone>`.
-
- Following are the expected DNS zones to be used in the verify record for different Azure regions.
-
- - North America, Europe, Singapore, Hong Kong SAR, and Japan:
-
- - `media.azure.net`
- - `verifydns.media.azure.net`
-
- - China:
-
- - `mediaservices.chinacloudapi.cn`
- - `verifydns.mediaservices.chinacloudapi.cn`
-
- * For example, a `CName` record that maps "945a4c4e-28ea-45cd-8ccb-a519f6b700ad.contoso.com" to "verifydns.media.azure.net" proves that the Media Services ID "945a4c4e-28ea-45cd-8ccb-a519f6b700ad" has the ownership of the *contoso.com* domain, enabling any name under *contoso.com* to be used as a custom host name for a streaming endpoint under that account. To find the Media Service ID value, go to the [Azure portal](https://portal.azure.com/) and select your Media Service account. The **Account ID** appears on the top right of the page.
-
- * If there's an attempt to set a custom host name without a proper verification of the `CName` record, the DNS response will fail and then be cached for some time. Once a proper record is in place, it might take some time until the cached response gets revalidated. Depending on the DNS provider for the custom domain, it takes anywhere from a few minutes to an hour to revalidate the record.
-
- * In addition to the `CName` that maps `<accountId>.<parent domain>` to `verifydns.<mediaservices-dns-zone>`, you must create another `CName` that maps the custom host name (like `sports.contoso.com`) to your Media Services Streaming Endpoint's host name (like `amstest-usea.streaming.media.azure.net`).
-
- > [!NOTE]
- > Streaming endpoints located in the same data center can't share the same custom host name.
-
- Presently, Media Services does not support TLS with custom domains.
--- `maxCacheAge` - Overrides the default max-age HTTP cache control header set by the streaming endpoint on media fragments and on-demand manifests. The value is set in seconds.--- `resourceState` - Below is the description of the states of your streaming endpoint.-
- * Stopped - The initial state of a streaming endpoint after creation.
- * Starting - Transitioning to the running state.
- * Running - Able to stream content to clients.
- * Scaling - The scale units are being increased or decreased.
- * Stopping - Transitioning to the stopped state.
- * Deleting - Being deleted.
-
-- `scaleUnits` - Provides dedicated egress capacity that you can purchase in increments of 200 Mbps. If you need to move to a **Premium** type, adjust the value of `scaleUnits`. (A hedged .NET sketch of these settings follows this list.)
-
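
These properties map directly onto the `StreamingEndpoint` model in the v3 .NET SDK. Below is a minimal sketch, assuming an authenticated `IAzureMediaServicesClient` named `client` as in the other .NET samples; the endpoint name, region, and values are illustrative, not prescriptive.

```csharp
// A hedged sketch: "client", "resourceGroup", and "accountName" are assumed
// to exist, as in the other .NET samples; all values below are placeholders.
StreamingEndpoint endpoint = await client.StreamingEndpoints.CreateAsync(
    resourceGroup,
    accountName,
    "myendpoint",                        // streaming endpoint name (placeholder)
    new StreamingEndpoint
    {
        Location = "West US 2",          // must match the Media Services account region
        ScaleUnits = 0,                  // 0 = Standard type; Premium adds capacity in 200-Mbps units
        CdnEnabled = true,
        CdnProvider = "StandardVerizon",
        CdnProfile = "AzureMediaStreamingPlatformCdnProfile",
        MaxCacheAge = 3600               // overrides the default max-age header, in seconds
    },
    autoStart: false);
```

With `autoStart: false`, the new endpoint stays in the Stopped state described above until you start it.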
-## Why use multiple streaming endpoints?
-
-A single streaming endpoint can stream both live and on-demand videos, and most customers use only one streaming endpoint. This section explains scenarios in which you might need to use multiple streaming endpoints.
-
-* Each reserved unit allows for 200 Mbps of bandwidth. If you need more than 2,000 Mbps (2 Gbps) of bandwidth, add a second streaming endpoint and load balance across the two to gain additional bandwidth.
-
- CDN is the best way to achieve the scale out for streaming content. However, if you are delivering so much content that the CDN is pulling more than 2 Gbps, you can add additional streaming endpoints (origins). In this case, you would need to hand out content URLs that are balanced across the two streaming endpoints. This approach gives better caching than trying to send requests to each origin randomly (for example, via a traffic manager).
-
- > [!TIP]
- > Usually, when the CDN is pulling more than 2 Gbps, then something might be misconfigured (for example, no origin shielding).
-
-* Load balancing different CDN providers - For example, you could set up the default streaming endpoint to use the Verizon CDN and create a second one to use Akamai. Now, add load balancing between the two endpoints to achieve multi-CDN balancing.
-
- However, customers often load balance across multiple CDN providers using a single origin.
-
-* Streaming mixed content - Live streaming and video on-demand. The access patterns for live and on-demand content are different. Live content tends to get a lot of demand for the same content all at once. Video on-demand content (for example, long-tail archive content) has low usage of the same content. Thus, caching works very well on live content but not as well on long-tail content.
-
- Consider a scenario in which your customers are mainly watching live content but occasionally watch on-demand content that is served from the same streaming endpoint. The low usage of on-demand content would occupy cache space that would be better reserved for the live content. In this scenario, we recommend serving the live content from one streaming endpoint and the long-tail content from another streaming endpoint. This will improve the performance of the live event content.
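
To illustrate handing out balanced URLs, here is a toy sketch that deterministically maps an asset to one of two origins. The helper and host names are hypothetical, not part of Media Services; the point is that a stable mapping keeps each origin's (and the CDN's) cache hot for a given asset.

```csharp
// Hypothetical helper: spread playback URLs across two origins.
static string PickOrigin(string assetName)
{
    string[] origins =
    {
        "https://myaccount-usea.streaming.media.azure.net",   // placeholder host names
        "https://myaccount-usea2.streaming.media.azure.net"
    };

    // Use a stable, process-independent hash so the same asset always
    // maps to the same origin.
    int hash = 0;
    foreach (char c in assetName)
    {
        hash = (hash * 31 + c) & 0x7fffffff;
    }
    return origins[hash % origins.Length];
}
```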
-
-## Scaling streaming with CDN
-
-See the following articles:
-
-- [CDN overview](../../cdn/cdn-overview.md)
-- [Scaling streaming with CDN](stream-scale-streaming-cdn-concept.md)
-
-## Ask questions and get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## See also
-
-[Dynamic packaging](encode-dynamic-packaging-concept.md)
-
-## Next steps
-
-[Manage streaming endpoints](stream-manage-streaming-endpoints-how-to.md)
media-services Stream Streaming Endpoint Error Codes Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-streaming-endpoint-error-codes-reference.md
- Title: Azure Media Services packaging and origin errors
-description: This topic describes errors that you may receive from the Azure Media Services Streaming Endpoint (Origin) service.
- Previously updated : 05/07/2019
-# Streaming Endpoint (Origin) errors
-
-This topic describes errors that you may receive from the Azure Media Services [Streaming Endpoint service](stream-streaming-endpoint-concept.md).
-
-## 400 Bad Request
-
-The request contains invalid information and is rejected with one of these error codes for one of the following reasons:
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_BAD_URL_SYNTAX |0x80890201|A URL syntax or format error. Examples include requests for an invalid type, an invalid fragment, or an invalid track. |
-|MPE_ENC_ENCRYPTION_NOT_SPECIFIED_IN_URL |0x8088024C|The request has no encryption tag in the URL. CMAF requests require an encryption tag in the URL. Other protocols that are configured with more than one encryption type also require the encryption tag for disambiguation. |
-|MPE_STORAGE_BAD_URL_SYNTAX |0x808900E9|The request to storage to fulfill the request failed with a Bad Request error. |
-
-## 403 Forbidden
-
-The request is not allowed for one of the following reasons:
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_STORAGE_AUTHENTICATION_FAILED |0x808900EA|The request to storage to fulfill the request failed with an Authentication failure. This can happen if the storage keys were rotated and the service was unable to sync the storage keys. <br/><br/>Contact Azure support by going to [Help + support](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest) in the Azure portal.|
-|MPE_STORAGE_INSUFFICIENT_ACCOUNT_PERMISSIONS |0x808900EB |Storage Operation error, access failed due to Insufficient Account Permissions. |
-|MPE_STORAGE_ACCOUNT_IS_DISABLED |0x808900EC |The request to storage to fulfill the request failed because the storage account Is Disabled. |
-|MPE_STORAGE_AUTHENTICATION_FAILURE |0x808900F3 |Storage Operation error, access failed due to generic errors. |
-|MPE_OUTPUT_FORMAT_BLOCKED |0x80890207 |The output format is blocked due to the configuration in the StreamingPolicy. |
-|MPE_ENC_ENCRYPTION_REQUIRED |0x8088021E |Encryption is required for the content, Delivery policy is required for the output format. |
-|MPE_ENC_ENCRYPTION_NOT_SET_IN_DELIVERY_POLICY |0x8088024D |Encryption is not set in delivery policy settings. |
-
-## 404 Not Found
-
-The operation is attempting to act on a resource that no longer exists. For example, the resource may have already been deleted.
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_EGRESS_TRACK_NOT_FOUND |0x80890209 |The requested track is not found. |
-|MPE_RESOURCE_NOT_FOUND |0x808901F9 |The requested resource is not found. |
-|MPE_UNAUTHORIZED |0x80890244 |The access is unauthorized. |
-|MPE_EGRESS_TIMESTAMP_NOT_FOUND |0x8089020A |The requested timestamp is not found. |
-|MPE_EGRESS_FILTER_NOT_FOUND |0x8089020C |The requested dynamic manifest filter is not found. |
-|MPE_FRAGMENT_BY_INDEX_NOT_FOUND |0x80890252 |The requested fragment index is beyond the valid range. |
-|MPE_LIVE_MEDIA_ENTRIES_NOT_FOUND |0x80890254 |Live media entries cannot be found to get moov buffer. |
-|MPE_FRAGMENT_TIMESTAMP_NOT_FOUND |0x80890255 |Unable to find the fragment at the requested time for a particular track.<br/><br/>Could be that the fragment isn't in storage. Try a different layer of the presentation that might have a fragment. |
-|MPE_MANIFEST_MEDIA_ENTRY_NOT_FOUND |0x80890256 |Unable to find the media entry for the requested bitrate in the manifest. <br/><br/>Could be that the player asked for a video track of a certain bitrate that wasn't in the manifest.|
-|MPE_METADATA_NOT_FOUND |0x80890257 |Unable to find certain metadata in the manifest or unable to find rebase from storage. |
-|MPE_STORAGE_RESOURCE_NOT_FOUND |0x808900ED |Storage Operation error, resource not found. |
-
-## 409 Conflict
-
-The ID provided for a resource on a `PUT` or `POST` operation has been taken by an existing resource. Use another ID for the resource to resolve this issue.
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_STORAGE_CONFLICT |0x808900EE |Storage Operation error, conflict error. |
-
-## 410 Gone
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_FILTER_FORCE_END_LEFT_EDGE_CROSSED_DVR_WINDOW|0x80890263|For live streaming, when the filter that has forceEndTimestamp set to true, the start or end timestamp is outside of the current DVR window.|
-
-## 412 Precondition Failure
-
-The operation specified an eTag that is different from the version available at the server, that is, an optimistic concurrency error. Retry the request after reading the latest version of the resource and updating the eTag on the request.
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_FRAGMENT_NOT_READY |0x80890200 |The requested fragment is not ready.|
-|MPE_STORAGE_PRECONDITION_FAILED| 0x808900EF|Storage operation error, a precondition failure.|
-
-## 415 Unsupported Media Type
-
-The payload format sent by the client is in an unsupported format.
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_ENC_ALREADY_ENCRYPTED| 0x8088021F| Encryption should not be applied to content that is already encrypted.|
-|MPE_ENC_INVALID_INPUT_ENCRYPTION_FORMAT|0x8088021D |The encryption is invalid for the input format.|
-|MPE_INVALID_ASSET_DELIVERY_POLICY_TYPE|0x8088021C| Delivery policy type is invalid.|
-|MPE_ENC_MULTIPLE_SAME_DELIVERY_TYPE|0x8088024E |The original settings could be shared by multiple output formats.|
-|MPE_FORMAT_NOT_SUPPORTED|0x80890205|The media format or type is unsupported. For example, Media Services does not support more than 64 quality levels. In an FLV video tag, Media Services does not support a video frame with multiple SPS and multiple PPS.|
-|MPE_INPUT_FORMAT_NOT_SUPPORTED|0x80890218| The input format of the requested asset is not supported. Media Services supports Smooth (live), MP4 (VoD), and Progressive download formats.|
-|MPE_OUTPUT_FORMAT_NOT_SUPPORTED|0x8089020D|The output format requested is not supported. Media Services supports Smooth, DASH(CSF, CMAF), HLS (v3, v4, CMAF), and Progressive download formats.|
-|MPE_ENCRYPTION_NOT_SUPPORTED|0x80890208|Encountered unsupported encryption type.|
-|MPE_MEDIA_TYPE_NOT_SUPPORTED|0x8089020E|The media type requested is not supported by the output format. The supported types are video, audio or "SUBT" subtitle.|
-|MPE_MEDIA_ENCODING_NOT_SUPPORTED|0x8089020F|The source asset media was encoded with a media format that is not compatible with the output format.|
-|MPE_VIDEO_ENCODING_NOT_SUPPORTED|0x80890210|The source asset was encoded with a video format that is not compatible with the output format. H.264, AVC, H.265 (HEVC, hev1 or hvc1) are supported.|
-|MPE_AUDIO_ENCODING_NOT_SUPPORTED|0x80890211|The source asset was encoded with an audio format that is not compatible with the output format. Supported audio formats are AAC, E-AC3 (DD+), Dolby DTS.|
-|MPE_SOURCE_PROTECTION_CONVERSION_NOT_SUPPORTED|0x80890212|The source protected asset cannot be converted to the output format.|
-|MPE_OUTPUT_PROTECTION_FORMAT_NOT_SUPPORTED|0x80890213|The protection format is not supported by the output format.|
-|MPE_INPUT_PROTECTION_FORMAT_NOT_SUPPORTED|0x80890219|The protection format is not supported by the input format.|
-|MPE_INVALID_VIDEO_NAL_UNIT|0x80890231|Invalid video NAL unit, for example, only the first NAL in the sample can be an AUD.|
-|MPE_INVALID_NALU_SIZE|0x80890260|Invalid NAL unit size.|
-|MPE_INVALID_NALU_LENGTH_FIELD|0x80890261|Invalid NAL unit length value.|
-|MPE_FILTER_INVALID|0x80890236|Invalid dynamic manifest filters.|
-|MPE_FILTER_VERSION_INVALID|0x80890237|Invalid or unsupported filter versions.|
-|MPE_FILTER_TYPE_INVALID|0x80890238|Invalid filter type.|
-|MPE_FILTER_RANGE_ATTRIBUTE_INVALID|0x80890239|Invalid range is specified by the filter.|
-|MPE_FILTER_TRACK_ATTRIBUTE_INVALID|0x8089023A|Invalid track attribute is specified by the filter.|
-|MPE_FILTER_PRESENTATION_WINDOW_INVALID|0x8089023B|Invalid presentation window length is specified by the filter.|
-|MPE_FILTER_LIVE_BACKOFF_INVALID|0x8089023C|Invalid live back off is specified by the filter.|
-|MPE_FILTER_MULTIPLE_SAME_TYPE_FILTERS|0x8089023D|Only one absTimeInHNS element is supported in legacy filters.|
-|MPE_FILTER_REMOVED_ALL_STREAMS|0x8089023E|No streams remain after applying the filters.|
-|MPE_FILTER_LIVE_BACKOFF_OVER_DVRWINDOW|0x8089023F|The live back off is beyond the DVR window.|
-|MPE_FILTER_LIVE_BACKOFF_OVER_PRESENTATION_WINDOW|0x80890262|The live back off is greater than the presentation window.|
-|MPE_FILTER_COMPOSITION_FILTER_COUNT_OVER_LIMIT|0x80890246|Exceeded ten (10) maximum allowed default filters.|
-|MPE_FILTER_COMPOSITION_MULTIPLE_FIRST_QUALITY_OPERATOR_NOT_ALLOWED|0x80890248|Multiple first video quality operators are not allowed in combined request filters.|
-|MPE_FILTER_FIRST_QUALITY_ATTRIBUTE_INVALID|0x80890249|The number of first quality bitrate attributes must be one (1).|
-|MPE_HLS_SEGMENT_TOO_LARGE|0x80890243|The HLS segment duration must be smaller than one third of the DVR window and the HLS back off.|
-|MPE_KEY_FRAME_INTERVAL_TOO_LARGE|0x808901FE|Fragment durations must be less than or equal to approximately 20 seconds, or the input quality levels are not time aligned.|
-|MPE_DTS_RESERVEDBOX_EXPECTED|0x80890105|DTS-specific error, cannot find the ReservedBox when it should be present in the DTSSpecficBox during DTS box parsing.|
-|MPE_DTS_INVALID_CHANNEL_COUNT|0x80890106|DTS-specific error, no channels found in the DTSSpecficBox during DTS box parsing.|
-|MPE_DTS_SAMPLETYPE_MISMATCH|0x80890107|DTS-specific error, sample type mismatch in the DTSSpecficBox.|
-|MPE_DTS_MULTIASSET_DTSH_MISMATCH|0x80890108|DTS-specific error, multi-asset is set but DTSH sample type mismatch.|
-|MPE_DTS_INVALID_CORESTREAM_SIZE|0x80890109|DTS-specific error, core stream size is invalid.|
-|MPE_DTS_INVALID_SAMPLE_RESOLUTION|0x8089010A|DTS-specific error, sample resolution is invalid.|
-|MPE_DTS_INVALID_SUBSTREAM_INDEX|0x8089010B|DTS-specific error, sub-stream extension index is invalid.|
-|MPE_DTS_INVALID_BLOCK_NUM|0x8089010C|DTS-specific error, sub-stream block number is invalid.|
-|MPE_DTS_INVALID_SAMPLING_FREQUENCE|0x8089010D|DTS-specific error, sampling frequency is invalid.|
-|MPE_DTS_INVALID_REFCLOCKCODE|0x8089010E|DTS-specific error, the reference clock code in sub-stream extension is invalid.|
-|MPE_DTS_INVALID_SPEAKERS_REMAP|0x8089010F|DTS-specific error, the number of speakers remap set is invalid.|
-
-For encryption articles and examples, see:
-
-- [Concept: content protection](drm-content-protection-concept.md)
-- [Concept: Content Key Policies](drm-content-key-policy-concept.md)
-- [Concept: Streaming Policies](stream-streaming-policy-concept.md)
-- [Sample: protect with AES encryption](drm-playready-license-template-concept.md)
-- [Sample: protect with DRM](drm-protect-with-drm-tutorial.md)
-
-For filter guidance, see:
-
-- [Concept: dynamic manifests](filters-dynamic-manifest-concept.md)
-- [Concept: filters](filters-concept.md)
-- [Sample: create filters with REST APIs](filters-dynamic-manifest-rest-howto.md)
-- [Sample: create filters with .NET](filters-dynamic-manifest-dotnet-how-to.md)
-- [Sample: create filters with CLI](filters-dynamic-manifest-cli-how-to.md)
-
-For live articles and samples, see:
-
-- [Concept: live streaming overview](stream-live-streaming-concept.md)
-- [Concept: Live Events and Live Outputs](live-event-outputs-concept.md)
-- [Sample: live streaming tutorial](stream-live-tutorial-with-api.md)
-
-## 416 Range Not Satisfiable
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_STORAGE_INVALID_RANGE|0x808900F1|Storage Operation error, returned http 416 error, invalid range.|
-
-## 500 Internal Server Error
-
-During the processing of the request, Media Services encounters an error that prevents the processing from continuing.
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_STORAGE_SOCKET_TIMEOUT|0x808900F4|Received and translated from Winhttp error code of ERROR_WINHTTP_TIMEOUT (0x00002ee2).|
-|MPE_STORAGE_SOCKET_CONNECTION_ERROR|0x808900F5|Received and translated from Winhttp error code of ERROR_WINHTTP_CONNECTION_ERROR (0x00002efe).|
-|MPE_STORAGE_SOCKET_NAME_NOT_RESOLVED|0x808900F6|Received and translated from Winhttp error code of ERROR_WINHTTP_NAME_NOT_RESOLVED (0x00002ee7).|
-|MPE_STORAGE_INTERNAL_ERROR|0x808900E6|Storage Operation error, general InternalError of one of HTTP 500 errors.|
-|MPE_STORAGE_OPERATION_TIMED_OUT|0x808900E7|Storage Operation error, general OperationTimedOut of one of HTTP 500 errors.|
-|MPE_STORAGE_FAILURE|0x808900F2|Storage Operation error, other HTTP 500 errors than InternalError or OperationTimedOut.|
-
-## 503 Service Unavailable
-
-The server is currently unable to receive requests. This error may be caused by excessive requests to the service. The Media Services throttling mechanism restricts the resource usage of applications that make excessive requests to the service.
-
-> [!NOTE]
-> Check the error message and error code string to get more detailed information about the reason you got the 503 error. This error does not always mean throttling.
->
-
-|Error code|Hexadecimal value |Error description|
-||||
-|MPE_STORAGE_SERVER_BUSY|0x808900E8|Storage Operation error, received HTTP server busy error 503.|
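
To see these status classes in practice, you can probe a streaming URL directly. A hedged sketch using `HttpClient`; the URL is a placeholder, and whether the MPE_* code appears in the response body or a header can vary.

```csharp
// A sketch: request a manifest and report the HTTP status classes above.
using var http = new HttpClient();
HttpResponseMessage response = await http.GetAsync(
    "https://myaccount-usea.streaming.media.azure.net/locator-guid/video.ism/manifest"); // placeholder URL

if (!response.IsSuccessStatusCode)
{
    // 400/403/404/409/410/412/415/416/500/503 map to the sections in this article.
    Console.WriteLine($"Origin returned {(int)response.StatusCode} {response.ReasonPhrase}");
    Console.WriteLine(await response.Content.ReadAsStringAsync());
}
```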
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## See also
-
-- [Encoding error codes](/rest/api/media/jobs/get#joberrorcode)
-- [Azure Media Services concepts](concepts-overview.md)
-- [Quotas and limits](limits-quotas-constraints-reference.md)
-
-## Next steps
-
-[Example: access ErrorCode and Message from ApiException with .NET](configure-connect-dotnet-howto.md#connect-to-the-net-client)
media-services Stream Streaming Locators Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-streaming-locators-concept.md
-
Title: Streaming Locators in Azure Media Services
-description: This article gives an explanation of what Streaming Locators are, and how they are used by Azure Media Services.
- Previously updated : 03/04/2020
-# Streaming Locators
-
-To make videos in the output Asset available to clients for playback, you have to create a [Streaming Locator](/rest/api/medi#get-a-streaming-locator).
-
-The process of creating a **Streaming Locator** is called publishing. By default, the **Streaming Locator** is valid immediately after you make the API calls, and lasts until it is deleted, unless you configure the optional start and end times.
-
-When creating a **Streaming Locator**, you must specify an **Asset** name and a **Streaming Policy** name. For more information, see the following topics:
-
-* [Assets](assets-concept.md)
-* [Streaming Policies](stream-streaming-policy-concept.md)
-* [Content Key Policies](drm-content-key-policy-concept.md)
-
-You can also specify start and end times on your Streaming Locator, which let your users play the content only between those times (for example, between 5/1/2019 and 5/5/2019).
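
A hedged .NET sketch of such a time-bounded locator, in the style of the samples below; `StartTime` and `EndTime` are the locator's UTC validity window.

```csharp
// A sketch: "client", "resourceGroup", "accountName", "locatorName", and
// "assetName" are assumed to exist, as in the other .NET samples.
StreamingLocator locator = await client.StreamingLocators.CreateAsync(
    resourceGroup,
    accountName,
    locatorName,
    new StreamingLocator
    {
        AssetName = assetName,
        StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly,
        StartTime = new DateTime(2019, 5, 1, 0, 0, 0, DateTimeKind.Utc), // playback allowed from here...
        EndTime = new DateTime(2019, 5, 5, 0, 0, 0, DateTimeKind.Utc)    // ...until here
    });
```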
-
-## Considerations
-
-* **Streaming Locators** are not updatable.
-* Properties of **Streaming Locators** that are of the Datetime type are always in UTC format.
-* You should design a limited set of policies for your Media Service account and reuse them for your Streaming Locators whenever the same options are needed. For more information, see [Quotas and limits](limits-quotas-constraints-reference.md).
-
-## Create Streaming Locators
-
-### Not encrypted
-
-If you want to stream your file in-the-clear (non-encrypted), set the predefined clear streaming policy to 'Predefined_ClearStreamingOnly' (in .NET, you can use the PredefinedStreamingPolicy.ClearStreamingOnly enum).
-
-```csharp
-StreamingLocator locator = await client.StreamingLocators.CreateAsync(
- resourceGroup,
- accountName,
- locatorName,
- new StreamingLocator
- {
- AssetName = assetName,
- StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly
- });
-```
-
-### Encrypted
-
-If you need to encrypt your content with CENC encryption, set your policy to 'Predefined_MultiDrmCencStreaming'. Widevine encryption will be applied to the DASH stream and PlayReady to Smooth Streaming. The key will be delivered to a playback client based on the configured DRM licenses.
-
-```csharp
-StreamingLocator locator = await client.StreamingLocators.CreateAsync(
- resourceGroup,
- accountName,
- locatorName,
- new StreamingLocator
- {
- AssetName = assetName,
- StreamingPolicyName = "Predefined_MultiDrmCencStreaming",
- DefaultContentKeyPolicyName = contentPolicyName
- });
-```
-
-If you also want to encrypt your HLS stream with CBCS (FairPlay), use 'Predefined_MultiDrmStreaming'.
-
-> [!NOTE]
-> Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
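
After a locator is created, you typically build playback URLs from its streaming paths. A hedged sketch, assuming an authenticated `client` and a streaming endpoint named "default":

```csharp
ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(
    resourceGroup, accountName, locatorName);

StreamingEndpoint endpoint = await client.StreamingEndpoints.GetAsync(
    resourceGroup, accountName, "default");

// Each StreamingPath carries a protocol (Hls, Dash, SmoothStreaming) and
// relative paths to append to the streaming endpoint's host name.
foreach (StreamingPath path in paths.StreamingPaths)
{
    foreach (string relativePath in path.Paths)
    {
        Console.WriteLine($"{path.StreamingProtocol}: https://{endpoint.HostName}{relativePath}");
    }
}
```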
-
-## Associate filters with Streaming Locators
-
-See [Filters: associate with Streaming Locators](filters-concept.md#associating-filters-with-streaming-locator).
-
-## Filter, order, page Streaming Locator entities
-
-See [Filtering, ordering, paging of Media Services entities](filter-order-page-entities-how-to.md).
-
-## List Streaming Locators by Asset name
-
-To get Streaming Locators based on the associated Asset name, use the following operations:
-
-|Language|API|
-|||
-|REST|[liststreaminglocators](/rest/api/media/assets/liststreaminglocators)|
-|CLI|[az ams asset list-streaming-locators](/cli/azure/ams/asset#az-ams-asset-list-streaming-locators)|
-|.NET|[ListStreamingLocators](/dotnet/api/microsoft.azure.management.media.assetsoperationsextensions.liststreaminglocators#Microsoft_Azure_Management_Media_AssetsOperationsExtensions_ListStreamingLocators_Microsoft_Azure_Management_Media_IAssetsOperations_System_String_System_String_System_String_)|
-|Java|[AssetStreamingLocator](/rest/api/media/assets/liststreaminglocators#assetstreaminglocator)|
-|Node.js|[listStreamingLocators](/javascript/api/@azure/arm-mediaservices/assets#liststreaminglocators-string--string--string--msrest-requestoptionsbase-)|
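
For example, with the .NET client the call looks roughly like this (a sketch, assuming an authenticated `client` and the usual `resourceGroup`, `accountName`, and `assetName` variables):

```csharp
ListStreamingLocatorsResponse locators = await client.Assets.ListStreamingLocatorsAsync(
    resourceGroup, accountName, assetName);

foreach (AssetStreamingLocator locator in locators.StreamingLocators)
{
    Console.WriteLine($"{locator.Name} (policy: {locator.StreamingPolicyName})");
}
```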
-
-## See also
-
-* [Assets](assets-concept.md)
-* [Streaming Policies](stream-streaming-policy-concept.md)
-* [Content Key Policies](drm-content-key-policy-concept.md)
-* [Tutorial: Upload, encode, and stream videos using .NET](stream-files-tutorial-with-api.md)
-
-## Next steps
-
-[How to create a streaming locator and build URLs](create-streaming-locator-build-url.md)
media-services Stream Streaming Policy Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/stream-streaming-policy-concept.md
-
Title: Streaming Policies in Azure Media Services
-description: This article gives an explanation of what Streaming Policies are, and how they are used by Azure Media Services.
- Previously updated : 05/28/2019
-# Streaming Policies
-
-In Azure Media Services v3, [Streaming Policies](/rest/api/medi) define the streaming protocols and encryption options for your Streaming Locators. Media Services v3 provides some predefined Streaming Policies so that you can use them directly for trial or production.
-
-The currently available predefined Streaming Policies:<br/>
-* 'Predefined_DownloadOnly'
-* 'Predefined_ClearStreamingOnly'
-* 'Predefined_DownloadAndClearStreaming'
-* 'Predefined_ClearKey'
-* 'Predefined_MultiDrmCencStreaming'
-* 'Predefined_MultiDrmStreaming'
-
-The following "Decision tree" helps you choose a predefined Streaming Policy for your scenario.
-
-> [!IMPORTANT]
-> * Properties of **Streaming Policies** that are of the Datetime type are always in UTC format.
-> * You should design a limited set of policies for your Media Service account and reuse them for your Streaming Locators whenever the same options are needed. For more information, see [Quotas and limits](limits-quotas-constraints-reference.md).
-
-## Decision tree
-
-Click the image to view it full size.
-
-[![Diagram showing a decision tree that is designed to help you choose a predefined Streaming Policy for your scenario.](./media/streaming-policy/large.png)](./media/streaming-policy/large.png#lightbox)
-
-If you're encrypting your content, you need to create a [Content Key Policy](drm-content-key-policy-concept.md); the **Content Key Policy** is not needed for clear streaming or downloading.
-
-If you have special requirements (for example, if you want to specify different protocols, need to use a custom key delivery service, or need to use a clear audio track), you can [create](/rest/api/media/streamingpolicies/create) a custom Streaming Policy.
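
As a hedged illustration (this is not one of the predefined policies above), a custom clear-streaming policy that enables only DASH and HLS might look like this in .NET; the policy name is a placeholder.

```csharp
// A sketch: "client", "resourceGroup", and "accountName" are assumed to exist.
StreamingPolicy policy = await client.StreamingPolicies.CreateAsync(
    resourceGroup,
    accountName,
    "ClearDashHlsOnly",                          // placeholder policy name
    new StreamingPolicy
    {
        NoEncryption = new NoEncryption
        {
            EnabledProtocols = new EnabledProtocols(
                download: false,
                dash: true,
                hls: true,
                smoothStreaming: false)
        }
    });
```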
-
-## Get a Streaming Policy definition
-
-If you want to see the definition of a Streaming Policy, use [Get](/rest/api/media/streamingpolicies/get) and specify the policy name. For example:
-
-### REST
-
-Request:
-
-```
-GET https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/contoso/providers/Microsoft.Media/mediaServices/contosomedia/streamingPolicies/clearStreamingPolicy?api-version=2018-07-01
-```
-
-Response:
-
-```
-{
- "name": "clearStreamingPolicy",
- "id": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/contoso/providers/Microsoft.Media/mediaservices/contosomedia/streamingPolicies/clearStreamingPolicy",
- "type": "Microsoft.Media/mediaservices/streamingPolicies",
- "properties": {
- "created": "2018-08-08T18:29:30.8501486Z",
- "noEncryption": {
- "enabledProtocols": {
- "download": true,
- "dash": true,
- "hls": true,
- "smoothStreaming": true
- }
- }
- }
-}
-```
-
-## Filtering, ordering, paging
-
-See [Filtering, ordering, paging of Media Services entities](filter-order-page-entities-how-to.md).
-
-## Next steps
-
-* [Stream a file](stream-files-dotnet-quickstart.md)
-* [Use AES-128 dynamic encryption and the key delivery service](drm-playready-license-template-concept.md)
-* [Use DRM dynamic encryption and license delivery service](drm-protect-with-drm-tutorial.md)
media-services Transform Add Custom Transform Output How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-add-custom-transform-output-how-to.md
- Title: Add a custom transform output for a Media Services job
-description: This article shows you how to add a custom transform output for a Media Services job.
- Previously updated : 03/08/2022
-# Add a custom transform output
--
-<!-- NOTE: The following are in the includes folder and are reused in other How To articles. All task based content should be in the includes folder with the task- prefix prepended to the file name. -->
-
-This article shows you how to add a custom transform output for a Media Services job.
-
-## Methods
-
-You can use the following methods to add a custom transform output for a Media Services job.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/transforms).
--
media-services Transform Create Basic Audio How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-create-basic-audio-how-to.md
- Title: Create a basic audio transform
-description: Create a basic audio transform using Media Services API.
- Previously updated : 03/09/2022
-# Create a basic audio transform
--
-This article shows how to create a basic audio transform. The basic mode was added to the `#Microsoft.Media.AudioAnalyzerPreset` with the 2020-05-01 release.
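
For orientation before the steps below, a hedged .NET sketch of such a transform (assuming an authenticated `client` and the usual `resourceGroup` and `accountName` variables; the transform name is a placeholder):

```csharp
Transform transform = await client.Transforms.CreateOrUpdateAsync(
    resourceGroup,
    accountName,
    "BasicAudioAnalyzerTransform",               // placeholder transform name
    new List<TransformOutput>
    {
        new TransformOutput(new AudioAnalyzerPreset
        {
            Mode = AudioAnalysisMode.Basic       // the basic mode added in 2020-05-01
        })
    });
```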
-
-## Prerequisites
-
-Follow the steps in [Create a Media Services account](./account-create-how-to.md) to create the needed Media Services account and resource group to create an asset.
-
-## Methods
-
-## [REST](#tab/rest/)
-
-### Using the REST API
--
-## Next steps
---
media-services Transform Create Copy Video Audio How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-create-copy-video-audio-how-to.md
- Title: Create a CopyVideo CopyAudio transform
-description: Create a CopyVideo CopyAudio transform using Media Services API.
- Previously updated : 03/09/2022
-# Create a CopyVideo CopyAudio transform
--
-This article shows how to create a `CopyVideo/CopyAudio` transform.
-
-This transform allows you to have the input video and audio streams copied from the input asset to the output asset without any changes. This can be valuable with multi-bitrate encoding output where the input video and/or audio would be part of the output. It simply writes the manifest and other files needed to stream the content.
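
For orientation, a hedged .NET sketch of such a transform (assuming an authenticated `client`; the transform name and filename pattern are placeholders):

```csharp
Transform transform = await client.Transforms.CreateOrUpdateAsync(
    resourceGroup,
    accountName,
    "CopyVideoCopyAudioTransform",               // placeholder transform name
    new List<TransformOutput>
    {
        new TransformOutput(new StandardEncoderPreset
        {
            // Pass the input video and audio streams through unchanged.
            Codecs = new List<Codec> { new CopyVideo(), new CopyAudio() },
            Formats = new List<Format> { new Mp4Format("{Basename}{Extension}") }
        })
    });
```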
-
-## Prerequisites
-
-Follow the steps in [Create a Media Services account](./account-create-how-to.md) to create the needed Media Services account and resource group to create an asset.
-
-## Methods
-
-## [REST](#tab/rest/)
-
-### Using the REST API
---
media-services Transform Create Copyallbitratenoninterleaved How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-create-copyallbitratenoninterleaved-how-to.md
- Title: Create a CopyAllBitrateNonInterleaved transform
-description: Create a CopyAllBitrateNonInterleaved transform.
- Previously updated : 03/09/2022
-# Create a CopyAllBitrateNonInterleaved transform
--
-This article shows how to create a `CopyAllBitrateNonInterleaved` transform.
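
For orientation, a hedged .NET sketch that creates the transform from the built-in preset (assuming an authenticated `client`; the transform name is a placeholder):

```csharp
Transform transform = await client.Transforms.CreateOrUpdateAsync(
    resourceGroup,
    accountName,
    "CopyAllBitrateNonInterleavedTransform",     // placeholder transform name
    new List<TransformOutput>
    {
        new TransformOutput(
            new BuiltInStandardEncoderPreset(EncoderNamedPreset.CopyAllBitrateNonInterleaved))
    });
```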
-
-## Prerequisites
-
-Follow the steps in [Create a Media Services account](./account-create-how-to.md) to create the needed Media Services account and resource group to create an asset.
-
-## Methods
-
-## [REST](#tab/rest/)
-
-### Using the REST API
---
media-services Transform Create Custom Transform How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-create-custom-transform-how-to.md
- Title: Create a Media Service custom transform
-description: This article shows you how to create a Media Services custom transform.
- Previously updated : 03/08/2022
-# Create a custom transform
--
-This article shows you how to create a Media Services custom transform.
-
-## Methods
-
-You can use the following methods to create a Media Services custom transform.
-
-## CLI
-
media-services Transform Create Overlay How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-create-overlay-how-to.md
- Title: How to create an image overlay
-description: Learn how to create an image overlay
- Previously updated : 03/09/2022
-# How to create an image overlay
--
-Media Services allows you to overlay an image, audio file, or another video on top of a video. The overlay input must specify exactly one file. You can specify an image file in JPG, PNG, GIF, or BMP format, an audio file (such as a WAV, MP3, WMA, or M4A file), or a video file in a supported file format.
-
-## [.NET](#tab/net/)
-
-## Prerequisites
-
-* Collect the account information that you need to configure the *appsettings.json* file in the sample. If you're not sure how to do that, see [Quickstart: Register an application with the Microsoft identity platform](../../active-directory/develop/quickstart-register-app.md). The following values are expected in the *appsettings.json* file.
-
-```json
- {
- "AadClientId": "",
- "AadEndpoint": "https://login.microsoftonline.com",
- "AadSecret": "",
- "AadTenantId": "",
- "AccountName": "",
- "ArmAadAudience": "https://management.core.windows.net/",
- "ArmEndpoint": "https://management.azure.com/",
- "Location": "",
- "ResourceGroup": "",
- "SubscriptionId": ""
- }
-```
-
-If you aren't already familiar with the creation of Transforms, it is recommended that you complete the following activities:
-
-* Read [Encoding video and audio with Media Services](encode-concept.md)
-* Read [How to encode with a custom transform - .NET](transform-custom-transform-how-to.md). Follow the steps in that article to set up the .NET needed to work with transforms, then return here to try out an overlays preset sample.
-* See the [Transforms reference document](/rest/api/media/transforms).
-
-Once you are familiar with Transforms, download the overlays sample.
-
-## Overlays preset sample
-
-Clone the Media Services .NET sample repository.
-
-```bash
- git clone https://github.com/Azure-Samples/media-services-v3-dotnet.git
-```
-
-Navigate into the solution folder, and launch Visual Studio Code, or Visual Studio 2019.
-
-A number of encoding samples are available in the VideoEncoding folder. Open the project in the [VideoEncoding/Encoding_OverlayImage](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_OverlayImage) solution folder to get started learning how to use overlays.
-
-The sample project contains two media files. A video file, and a logo image to overlay on top of the video.
-* ignite.mp4
-* cloud.png
-
-In this sample, we first create a custom Transform that can overlay the image on top of the video in the CreateCustomTransform method. Using the *[Filters](/rest/api/media/transforms/create-or-update#filters)* property of the *[StandardEncoderPreset](/rest/api/media/transforms/create-or-update#standardencoderpreset)*, we assign a new Filters collection that contains the video overlay settings.
-
-A [VideoOverlay](/rest/api/media/transforms/create-or-update#videooverlay) contains a property called *InputLabel*, which is required to map from the list of job input files submitted to the job and locate the right input source file intended for use as the overlay image or video. When submitting the job, this same label name is used to match up to the setting here in the Transform. In the sample, we use the label name "logo", as seen in the string constant *OverlayLabel*.
-
-The following code snippet shows how the Transform is formatted to use an overlay.
-
-```csharp
-new TransformOutput
- {
- Preset = new StandardEncoderPreset
- {
- Filters = new Filters
- {
- Overlays = new List<Overlay>
- {
- new VideoOverlay
- {
- InputLabel = OverlayLabel, // same as the one used in the JobInput to identify which asset is the overlay image
- Position = new Rectangle( "1200","670") // left, top position of the overlay in absolute pixels, relative to the source video's resolution.
-
- }
- }
- },
- Codecs = new List<Codec>
- {
- new AacAudio
- {
- },
- new H264Video
- {
- KeyFrameInterval = TimeSpan.FromSeconds(2),
- Layers = new List<H264Layer>
- {
- new H264Layer
- {
- Profile = H264VideoProfile.Baseline,
- Bitrate = 1000000, // 1Mbps
- Width = "1280",
- Height = "720"
- },
- new H264Layer // Adding a second layer to see that the image also is scaled and positioned the same way on this layer.
- {
- Profile = H264VideoProfile.Baseline,
- Bitrate = 600000, // 600 kbps
- Width = "480",
- Height = "270"
- }
- }
- }
- },
- Formats = new List<Format>
- {
- new Mp4Format
- {
- FilenamePattern = "{Basename}_{Bitrate}{Extension}",
- }
- }
- }
- }
-```
-
-When submitting the Job to the Transform, you must first create the two input assets.
-
-* Asset 1 - in this sample the first Asset created is the local video file "ignite.mp4". This is the video that we will use as the background of the composite, and overlay a logo image on top of.
-* Asset 2 - in this sample, the second asset (stored in the overlayImageAsset variable) contains the .png file to be used for the logo. This image will be positioned onto the video during encoding.
-
-When the Job is created in the *SubmitJobAsync* method, we first construct a JobInput array using a List<> object. The List will contain the references to the two source assets.
-
-In order to identify which input asset is to be used as the overlay in the filter defined in the Transform above, we again use the "logo" label name to handle the matching. The label name is added to the JobInputAsset for the .png image. This tells the Transform which asset to use when doing the overlay operation. You can reuse this same Transform with different Assets stored in Media Services that contain various logos or graphics that you wish to overlay, and simply change the asset name passed into the Job, while using the same label name "logo" for the Transform to match it to.
-
-``` csharp
- // Add both the Video and the Overlay image assets here as inputs to the job.
- List<JobInput> jobInputs = new List<JobInput>() {
- new JobInputAsset(assetName: inputAssetName),
- new JobInputAsset(assetName: overlayAssetName, label: OverlayLabel)
- };
-```
-
-Run the sample by selecting the project in the Run and Debug window in Visual Studio Code. The sample outputs the progress of the encoding operation and finally downloads the contents into the /Output folder in your project root (or, when using full Visual Studio, possibly your /bin/Output folder).
-
-The sample also publishes the content for streaming and will output the full HLS, DASH and Smooth Streaming manifest file URLs that can be used in any compatible player. You can also easily copy the manifest URL to the [Azure Media Player demo](http://ampdemo.azureedge.net/) and paste the URL that ends with /manifest into the URL box, then click *Update Player*.
-
-## API references
-
-* [VideoOverlay object](/rest/api/media/transforms/create-or-update#videooverlay)
-* [Filters](/rest/api/media/transforms/create-or-update#filters)
-* [StandardEncoderPreset](/rest/api/media/transforms/create-or-update#standardencoderpreset)
---
media-services Transform Create Thumbnail Sprites How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-create-thumbnail-sprites-how-to.md
- Title: Create a thumbnail sprites transform
-description: How do I create thumbnail sprites? You can create a transform for a job that will generate thumbnail sprites for your videos. This article shows you how.
- Previously updated : 2/17/2021
-# Create a thumbnail sprite transform
--
-This article shows you how to create a thumbnail sprite transform with the Media Services 2020-05-01 v3 API.
-
-You can use Media Encoder Standard to generate a thumbnail sprite, which is a JPEG file that contains multiple small resolution thumbnails stitched together into a single (large) image, together with a VTT file. This VTT file specifies the time range in the input video that each thumbnail represents, together with the size and coordinates of that thumbnail within the large JPEG file. Video players use the VTT file and sprite image to show a 'visual' seekbar, providing a viewer with visual feedback when scrubbing back and forward along the video timeline.
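
As a hedged .NET sketch of the idea, assuming the 2020-05-01 `SpriteColumn` property on `JpgImage` and an authenticated `client`; all values are illustrative.

```csharp
Transform transform = await client.Transforms.CreateOrUpdateAsync(
    resourceGroup,
    accountName,
    "ThumbnailSpriteTransform",                  // placeholder transform name
    new List<TransformOutput>
    {
        new TransformOutput(new StandardEncoderPreset
        {
            Codecs = new List<Codec>
            {
                new JpgImage
                {
                    Start = "0%",
                    Step = "5%",                 // one tile every 5% of the timeline
                    Range = "100%",
                    SpriteColumn = 10,           // stitch tiles into a sprite 10 columns wide
                    Layers = new List<JpgLayer>
                    {
                        new JpgLayer { Width = "10%", Height = "10%", Quality = 85 }
                    }
                }
            },
            Formats = new List<Format> { new JpgFormat("sprite-{Basename}{Index}{Extension}") }
        })
    });
```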
-
-Add the code snippets for your preferred development language.
-
-## [REST](#tab/rest/)
--
-## [.NET](#tab/dotnet/)
--
-See also thumbnail sprite creation in a [complete encoding sample](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/VideoEncoding/Encoding_SpriteThumbnail/Program.cs#L261-L287) at Azure Samples.
---
-## Next steps
-
media-services Transform Create Transform How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-create-transform-how-to.md
- Title: Azure CLI Script Example - Create a transform
-description: Transforms describe a simple workflow of tasks for processing your video or audio files (often referred to as a "recipe"). The Azure CLI script in this article shows how to create a transform.
- Previously updated : 11/18/2020
-# Create a transform
--
-The Azure CLI script in this article shows how to create a transform. Transforms describe a simple workflow of tasks for processing your video or audio files (often referred to as a "recipe"). You should always check whether a Transform with the desired name and "recipe" already exists. If it does, you should reuse it.
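
The same check-then-create pattern applies in any client. For example, a hedged .NET sketch (assuming an authenticated `client`; the preset choice is illustrative):

```csharp
// Get returns null when the Transform doesn't exist yet, so check before creating.
Transform transform = await client.Transforms.GetAsync(resourceGroup, accountName, transformName);

if (transform == null)
{
    var outputs = new List<TransformOutput>
    {
        new TransformOutput(new BuiltInStandardEncoderPreset(EncoderNamedPreset.AdaptiveStreaming))
    };

    transform = await client.Transforms.CreateOrUpdateAsync(
        resourceGroup, accountName, transformName, outputs, "A reusable encoding recipe");
}
```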
-
-## Prerequisites
-
-[Create a Media Services account](./account-create-how-to.md).
-
-## Methods
-
-You can use the following methods to create a transform.
-
-## [Portal](#tab/portal/)
--
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/transforms/create-or-update).
---
media-services Transform Crop How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-crop-how-to.md
- Title: How to crop video files with Media Services
-description: Cropping is the process of selecting a rectangular window within the video frame, and encoding just the pixels within that window. This topic shows how to crop video files with Media Services.
- Previously updated : 03/09/2022
-# How to crop video files with Media Services
--
-You can use Media Services to crop an input video. Cropping is the process of selecting a rectangular window within the video frame, and encoding just the pixels within that window. The following diagram helps illustrate the process.
-
-## Pre-processing stage
-
-Cropping is a pre-processing stage, so the *cropping parameters* in the encoding preset apply to the *input* video. Encoding is a subsequent stage, and the width/height settings apply to the *pre-processed* video, and not to the original video. When designing your preset, do the following:
-
-1. Select the crop parameters based on the original input video
-1. Select your encode settings based on the cropped video.
-
-> [!WARNING]
-> If you do not match your encode settings to the cropped video, the output will not be as you expect.
-
-For example, your input video has a resolution of 1920x1080 pixels (16:9 aspect ratio), but has black bars (pillar boxes) at the left and right, so that only a 4:3 window of 1440x1080 pixels contains active video. You can crop the black bars and encode the 1440x1080 area.
-
-## [.NET](#tab/net/)
-
-## Transform code
-
-The following code snippet illustrates how to write a transform in .NET to crop videos. The code assumes that you have a local file to work with.
-
-- Left is the left-most location of the crop.
-- Top is the top-most location of the crop.
-- Width is the final width of the crop.
-- Height is the final height of the crop.
-
-```csharp
-var preset = new StandardEncoderPreset
-{
-    Filters = new Filters
-    {
-        // Crop window applied to the *input* video before encoding.
-        Crop = new Rectangle
-        {
-            Left = "200",
-            Top = "200",
-            Width = "1280",
-            Height = "720"
-        }
-    },
-    Codecs = new List<Codec>
-    {
-        new AacAudio(),
-        new H264Video
-        {
-            Layers = new List<H264Layer>
-            {
-                // Encode settings apply to the cropped (pre-processed) video.
-                new H264Layer
-                {
-                    Bitrate = 1000000,
-                    Width = "1280",
-                    Height = "720"
-                }
-            }
-        }
-    },
-    Formats = new List<Format>
-    {
-        new Mp4Format
-        {
-            FilenamePattern = "{Basename}_{Bitrate}{Extension}"
-        }
-    }
-};
-```
--
media-services Transform Custom Transform How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-custom-transform-how-to.md
- Title: Encode custom transform
-description: This topic shows how to use Azure Media Services v3 to encode a custom transform.
- Previously updated : 03/09/2022
-# How to encode with a custom transform
--
-When encoding with Azure Media Services, you can get started quickly with one of the recommended built-in presets based on industry best practices as demonstrated in the [Streaming files](stream-files-tutorial-with-api.md) tutorial. You can also build a custom preset to target your specific scenario or device requirements.
-
-## Considerations
-
-When creating custom presets, the following considerations apply:
-
-* All values for height and width on AVC content must be a multiple of 4.
-* In Azure Media Services v3, all of the encoding bitrates are in bits per second. This is different from the presets with our v2 APIs, which used kilobits/second as the unit. For example, if the bitrate in v2 was specified as 128 (kilobits/second), in v3 it would be set to 128000 (bits/second).
-
-## Prerequisites
-
-[Create a Media Services account](./account-create-how-to.md)
-
-## [CLI](#tab/cli/)
-
-## Define a custom preset
-
-The following example defines the request body of a new Transform. We define a set of outputs that we want to be generated when this Transform is used.
-
-In this example, we first add an AacAudio layer for the audio encoding and two H264Video layers for the video encoding. In the video layers, we assign labels so that they can be used in the output file names. Next, we want the output to also include thumbnails. In the example below we specify images in PNG format, generated at 50% of the resolution of the input video, and at three timestamps - {25%, 50%, 75%} of the length of the input video. Lastly, we specify the format for the output files - one for video + audio, and another for the thumbnails. Since we have multiple H264Layers, we have to use macros that produce unique names per layer. We can either use a `{Label}` or `{Bitrate}` macro; the example shows the former.
-
-We are going to save this transform in a file. In this example, we name the file `customPreset.json`.
-
-```json
-{
- "@odata.type": "#Microsoft.Media.StandardEncoderPreset",
- "codecs": [
- {
- "@odata.type": "#Microsoft.Media.AacAudio",
- "channels": 2,
- "samplingRate": 48000,
- "bitrate": 128000,
- "profile": "AacLc"
- },
- {
- "@odata.type": "#Microsoft.Media.H264Video",
- "keyFrameInterval": "PT2S",
- "stretchMode": "AutoSize",
- "sceneChangeDetection": false,
- "complexity": "Balanced",
- "layers": [
- {
- "width": "1280",
- "height": "720",
- "label": "HD",
- "bitrate": 3400000,
- "maxBitrate": 3400000,
- "bFrames": 3,
- "slices": 0,
- "adaptiveBFrame": true,
- "profile": "Auto",
- "level": "auto",
- "bufferWindow": "PT5S",
- "referenceFrames": 3,
- "entropyMode": "Cabac"
- },
- {
- "width": "640",
- "height": "360",
- "label": "SD",
- "bitrate": 1000000,
- "maxBitrate": 1000000,
- "bFrames": 3,
- "slices": 0,
- "adaptiveBFrame": true,
- "profile": "Auto",
- "level": "auto",
- "bufferWindow": "PT5S",
- "referenceFrames": 3,
- "entropyMode": "Cabac"
- }
- ]
- },
- {
- "@odata.type": "#Microsoft.Media.PngImage",
- "stretchMode": "AutoSize",
- "start": "25%",
- "step": "25%",
- "range": "80%",
- "layers": [
- {
- "width": "50%",
- "height": "50%"
- }
- ]
- }
- ],
- "formats": [
- {
- "@odata.type": "#Microsoft.Media.Mp4Format",
- "filenamePattern": "Video-{Basename}-{Label}-{Bitrate}{Extension}",
- "outputFiles": []
- },
- {
- "@odata.type": "#Microsoft.Media.PngFormat",
- "filenamePattern": "Thumbnail-{Basename}-{Index}{Extension}"
- }
- ]
-}
-```
-
-## Create a new transform
-
-In this example, we create a **Transform** that is based on the custom preset we defined earlier. When creating a Transform, you should first check if one already exists. If the Transform exists, reuse it. The following `show` command returns the `customTransformName` transform if it exists:
-
-```azurecli-interactive
-az ams transform show -a amsaccount -g amsResourceGroup -n customTransformName
-```
-
-The following Azure CLI command creates the Transform based on the custom preset (defined earlier).
-
-```azurecli-interactive
-az ams transform create -a amsaccount -g amsResourceGroup -n customTransformName --description "Basic Transform using a custom encoding preset" --preset customPreset.json
-```
-
-For Media Services to apply the Transform to the specified video or audio, you need to submit a Job under that Transform. For a complete example that shows how to submit a job under a transform, see [Quickstart: Stream video files - Azure CLI](stream-files-cli-quickstart.md).
-
-## [REST](#tab/rest/)
-
-## Define a custom preset
-
-The following example defines the request body of a new Transform. We define a set of outputs that we want to be generated when this Transform is used.
-
-In this example, we first add an AacAudio layer for the audio encoding and two H264Video layers for the video encoding. In the video layers, we assign labels so that they can be used in the output file names. Next, we want the output to also include thumbnails. In the example below we specify images in PNG format, generated at 50% of the resolution of the input video, and at three timestamps - {25%, 50%, 75%} of the length of the input video. Lastly, we specify the format for the output files - one for video + audio, and another for the thumbnails. Since we have multiple H264Layers, we have to use macros that produce unique names per layer. We can either use a `{Label}` or `{Bitrate}` macro; the example shows the former.
-
-```json
-{
- "properties": {
- "description": "Basic Transform using a custom encoding preset",
- "outputs": [
- {
- "onError": "StopProcessingJob",
- "relativePriority": "Normal",
- "preset": {
- "@odata.type": "#Microsoft.Media.StandardEncoderPreset",
- "codecs": [
- {
- "@odata.type": "#Microsoft.Media.AacAudio",
- "channels": 2,
- "samplingRate": 48000,
- "bitrate": 128000,
- "profile": "AacLc"
- },
- {
- "@odata.type": "#Microsoft.Media.H264Video",
- "keyFrameInterval": "PT2S",
- "stretchMode": "AutoSize",
- "sceneChangeDetection": false,
- "complexity": "Balanced",
- "layers": [
- {
- "width": "1280",
- "height": "720",
- "label": "HD",
- "bitrate": 3400000,
- "maxBitrate": 3400000,
- "bFrames": 3,
- "slices": 0,
- "adaptiveBFrame": true,
- "profile": "Auto",
- "level": "auto",
- "bufferWindow": "PT5S",
- "referenceFrames": 3,
- "entropyMode": "Cabac"
- },
- {
- "width": "640",
- "height": "360",
- "label": "SD",
- "bitrate": 1000000,
- "maxBitrate": 1000000,
- "bFrames": 3,
- "slices": 0,
- "adaptiveBFrame": true,
- "profile": "Auto",
- "level": "auto",
- "bufferWindow": "PT5S",
- "referenceFrames": 3,
- "entropyMode": "Cabac"
- }
- ]
- },
- {
- "@odata.type": "#Microsoft.Media.PngImage",
- "stretchMode": "AutoSize",
- "start": "25%",
- "step": "25%",
- "range": "80%",
- "layers": [
- {
- "width": "50%",
- "height": "50%"
- }
- ]
- }
- ],
- "formats": [
- {
- "@odata.type": "#Microsoft.Media.Mp4Format",
- "filenamePattern": "Video-{Basename}-{Label}-{Bitrate}{Extension}",
- "outputFiles": []
- },
- {
- "@odata.type": "#Microsoft.Media.PngFormat",
- "filenamePattern": "Thumbnail-{Basename}-{Index}{Extension}"
- }
- ]
- }
- }
- ]
- }
-}
-
-```
-
-## Create a new transform
-
-In this example, we create a **Transform** that is based on the custom preset we defined earlier. When creating a Transform, you should first use [Get](/rest/api/media/transforms/get) to check if one already exists. If the Transform exists, reuse it.
-
-In the Postman's collection that you downloaded, select **Transforms and Jobs**->**Create or Update Transform**.
-
-The **PUT** HTTP request method is similar to:
-
-```
-PUT https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaServices/:accountName/transforms/:transformName?api-version={{api-version}}
-```
-
-Select the **Body** tab and replace the body with the json code you [defined earlier](#define-a-custom-preset). For Media Services to apply the Transform to the specified video or audio, you need to submit a Job under that Transform.
-
-Select **Send**.
-
-For Media Services to apply the Transform to the specified video or audio, you need to submit a Job under that Transform. For a complete example that shows how to submit a job under a transform, see [Tutorial: Stream video files - REST](stream-files-tutorial-with-rest.md).
-
-## [.NET](#tab/net/)
-
-## Download the sample
-
-Clone a GitHub repository that contains the full .NET Core sample to your machine using the following command:
-
- ```bash
- git clone https://github.com/Azure-Samples/media-services-v3-dotnet.git
- ```
-
-The custom preset sample is located in the [Encoding with a custom preset using .NET](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/VideoEncoding/Encoding_H264) folder.
-
-## Create a transform with a custom preset
-
-When creating a new [Transform](/rest/api/media/transforms), you need to specify what you want it to produce as an output. The required parameter is a [TransformOutput](/rest/api/media/transforms/createorupdate#transformoutput) object, as shown in the code below. Each **TransformOutput** contains a **Preset**. The **Preset** describes the step-by-step instructions of video and/or audio processing operations that are to be used to generate the desired **TransformOutput**. The following **TransformOutput** creates custom codec and layer output settings.
-
-When creating a [Transform](/rest/api/media/transforms), you should first check if one already exists using the **Get** method, as shown in the code that follows. In Media Services v3, **Get** methods on entities return **null** if the entity doesn't exist (a case-insensitive check on the name).
-
-### Example custom transform
-
-The following example defines a set of outputs that we want to be generated when this Transform is used. We first add an AacAudio layer for the audio encoding and two H264Video layers for the video encoding. In the video layers, we assign labels so that they can be used in the output file names. Next, we want the output to also include thumbnails. In the example below we specify images in PNG format, generated at 50% of the resolution of the input video, and at three timestamps - {25%, 50%, 75%} of the length of the input video. Lastly, we specify the format for the output files - one for video + audio, and another for the thumbnails. Since we have multiple H264Layers, we have to use macros that produce unique names per layer. We can either use a `{Label}` or `{Bitrate}` macro, the example shows the former.
-
-[!code-csharp[Main](../../../media-services-v3-dotnet/VideoEncoding/Encoding_H264/Program.cs#EnsureTransformExists)]
--
media-services Transform Delete Transform How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-delete-transform-how-to.md
- Title: Delete a Media Services transform
-description: This article shows you how to delete a Media Services transform.
- Previously updated : 03/01/2022
-# Delete a transform
--
-This article shows you how to delete a Media Services transform.
-
-## Methods
-
-You can use the following methods to delete a Media Services transform.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/transforms/delete).
--
media-services Transform Generate Thumbnails How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-generate-thumbnails-how-to.md
- Title: Generate thumbnails using Media Encoder Standard
-description: This article shows how to encode an asset and generate thumbnails at the same time using Media Encoder Standard.
- Previously updated : 03/09/2022
-# How to generate thumbnails using Encoder Standard
--
-You can use Media Encoder Standard to generate one or more thumbnails from your input video in [JPEG](https://en.wikipedia.org/wiki/JPEG), [PNG](https://en.wikipedia.org/wiki/Portable_Network_Graphics), or [BMP](https://en.wikipedia.org/wiki/BMP_file_format) image file formats.
--
-## [REST](#tab/rest/)
-
-## Recommended reading and practice
-
-It is recommended that you become familiar with custom transforms by reading [How to encode with a custom transform](transform-custom-transform-how-to.md).
-
-## Thumbnail parameters
-
-You should set the following parameters:
-
-- **start** - The position in the input video from where to start generating thumbnails. The value can be in ISO 8601 format (for example, PT05S to start at 5 seconds), or a frame count (for example, 10 to start at the 10th frame), or a relative value to stream duration (for example, 10% to start at 10% of stream duration). Also supports a macro {Best}, which tells the encoder to select the best thumbnail from the first few seconds of the video and produces only one thumbnail, no matter what the other settings are for Step and Range. The default value is the macro {Best}.
-- **step** - The intervals at which thumbnails are generated. The value can be in ISO 8601 format (for example, PT05S for one image every 5 seconds), or a frame count (for example, 30 for one image every 30 frames), or a relative value to stream duration (for example, 10% for one image every 10% of stream duration). The step value affects the first generated thumbnail, which may not be exactly the one specified at the transform preset start time. This is due to the encoder, which tries to select the best thumbnail between the start time and the step position from the start time as the first output. Because the default value is 10%, if the stream has a long duration the first generated thumbnail might be far away from the one specified at the start time. Select a reasonable value for step if the first thumbnail is expected to be close to the start time, or set the range value to 1 if only one thumbnail is needed at the start time.
-- **range** - The position relative to the transform preset start time in the input video at which to stop generating thumbnails. The value can be in ISO 8601 format (for example, PT5M30S to stop at 5 minutes and 30 seconds from the start time), or a frame count (for example, 300 to stop at the 300th frame from the frame at the start time; if this value is 1, only one thumbnail is produced, at the start time), or a relative value to the stream duration (for example, 50% to stop at half of the stream duration from the start time). The default value is 100%, which means to stop at the end of the stream.
-- **layers** - A collection of output image layers to be produced by the encoder.
-
-## Example of a "single PNG file" preset
-
-The following JSON preset can be used to produce a single output PNG file from the first few seconds of the input video, where the encoder makes a best-effort attempt at finding an "interesting" frame. The output image dimensions have been set to 50%, meaning the image is half the width and height of the input video. Note how the "Format" setting in "Outputs" is required to match the use of "PngLayers" in the "Codecs" section.
-
-```json
-{
- "properties": {
- "description": "Basic Transform using a custom encoding preset for thumbnails",
- "outputs": [
- {
- "onError": "StopProcessingJob",
- "relativePriority": "Normal",
- "preset": {
- "@odata.type": "#Microsoft.Media.StandardEncoderPreset",
- "codecs": [
- {
- "@odata.type": "#Microsoft.Media.PngImage",
- "stretchMode": "AutoSize",
- "start": "{Best}",
- "step": "25%",
- "range": "80%",
- "layers": [
- {
- "width": "50%",
- "height": "50%"
- }
- ]
- }
- ],
- "formats": [
- {
- "@odata.type": "#Microsoft.Media.Mp4Format",
- "filenamePattern": "Video-{Basename}-{Label}-{Bitrate}{Extension}",
- "outputFiles": []
- },
- {
- "@odata.type": "#Microsoft.Media.PngFormat",
- "filenamePattern": "Thumbnail-{Basename}-{Index}{Extension}"
- }
- ]
- }
- }
- ]
- }
-}
-
-```
-
-## Example of a "series of JPEG images" preset
-
-The following JSON preset can be used to produce a set of 10 images at timestamps of 5%, 15%, …, 95% of the input timeline, where the image size is specified to be one quarter that of the input video.
-
-### JSON preset
-
-```json
-{
- "Version": 1.0,
- "Codecs": [
- {
- "JpgLayers": [
- {
- "Quality": 90,
- "Type": "JpgLayer",
- "Width": "25%",
- "Height": "25%"
- }
- ],
- "Start": "5%",
- "Step": "10%",
- "Range": "96%",
- "Type": "JpgImage"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "JpgFormat"
- }
- }
- ]
-}
-```
-
-## Example of a "one image at a specific timestamp" preset
-
-The following JSON preset can be used to produce a single JPEG image at the 30-second mark of the input video. This preset expects the input video to be more than 30 seconds in duration (else the job fails).
-
-### JSON preset
-
-```json
-{
- "Version": 1.0,
- "Codecs": [
- {
- "JpgLayers": [
- {
- "Quality": 90,
- "Type": "JpgLayer",
- "Width": "25%",
- "Height": "25%"
- }
- ],
- "Start": "00:00:30",
- "Step": "1",
- "Range": "1",
- "Type": "JpgImage"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "JpgFormat"
- }
- }
- ]
-}
-```
-
-## Example of a "thumbnails at different resolutions" preset
-
-The following preset can be used to generate thumbnails at different resolutions in one task. In the example, at positions 5%, 15%, …, 95% of the input timeline, the encoder generates two images – one at 100% of the input video resolution and the other at 50%.
-
-Note the use of the {Resolution} macro in FileName; it tells the encoder to include the width and height that you specified in the preset when generating the file names of the output images. This also helps you easily distinguish between the different images.
-
-### JSON preset
-
-```json
-{
-  "Version": 1.0,
-  "Codecs": [
-    {
-      "JpgLayers": [
-        {
-          "Quality": 90,
-          "Type": "JpgLayer",
-          "Width": "100%",
-          "Height": "100%"
-        },
-        {
-          "Quality": 90,
-          "Type": "JpgLayer",
-          "Width": "50%",
-          "Height": "50%"
-        }
-      ],
-      "Start": "5%",
-      "Step": "10%",
-      "Range": "96%",
-      "Type": "JpgImage"
-    }
-  ],
-  "Outputs": [
-    {
-      "FileName": "{Basename}_{Resolution}_{Index}{Extension}",
-      "Format": {
-        "Type": "JpgFormat"
-      }
-    }
-  ]
-}
-```
-
-## Example of generating a thumbnail while encoding
-
-While all of the above examples have discussed how you can submit an encoding task that only produces images, you can also combine video/audio encoding with thumbnail generation. The following JSON preset tells Encoder Standard to generate a thumbnail during encoding.
-
-### JSON preset
-
-For information about schema, see [this](../previous/media-services-mes-schema.md) article.
-
-```json
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": "true",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 4500,
- "MaxBitrate": 4500,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "ReferenceFrames": 3,
- "EntropyMode": "Cabac",
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
-
- }
- ],
- "Type": "H264Video"
- },
- {
- "JpgLayers": [
- {
- "Quality": 90,
- "Type": "JpgLayer",
- "Width": "100%",
- "Height": "100%"
- }
- ],
- "Start": "{Best}",
- "Type": "JpgImage"
- },
- {
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "JpgFormat"
- }
- },
- {
- "FileName": "{Basename}_{Resolution}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
-
-## [.NET](#tab/net/)
-
-## Recommended reading and practice
-
-It is recommended that you become familiar with custom transforms by reading [How to encode with a custom transform](transform-custom-transform-how-to.md).
-
-## Transform code example
-
-The following code example creates a transform that generates only thumbnails. You should set the following parameters:
-
-- **start** - The position in the input video from where to start generating thumbnails. The value can be in ISO 8601 format (for example, PT05S to start at 5 seconds), a frame count (for example, 10 to start at the 10th frame), or a value relative to stream duration (for example, 10% to start at 10% of stream duration). It also supports the {Best} macro, which tells the encoder to select the best thumbnail from the first few seconds of the video; in that case, only one thumbnail is produced, no matter what the Step and Range settings are. The default value is the {Best} macro.
-- **step** - The interval at which thumbnails are generated. The value can be in ISO 8601 format (for example, PT05S for one image every 5 seconds), a frame count (for example, 30 for one image every 30 frames), or a value relative to stream duration (for example, 10% for one image every 10% of stream duration). The step value affects the first generated thumbnail, which may not be exactly the one specified as the transform preset start time, because the encoder tries to select the best thumbnail between the start time and the first step position after it. Because the default value is 10%, the first generated thumbnail of a long stream might be far from the one specified at start time. Select a reasonable step value if the first thumbnail is expected to be close to the start time, or set the range value to 1 if only one thumbnail is needed at start time.
-- **range** - The position, relative to the transform preset start time, at which to stop generating thumbnails. The value can be in ISO 8601 format (for example, PT5M30S to stop at 5 minutes and 30 seconds from start time), a frame count (for example, 300 to stop at the 300th frame from the frame at start time; a value of 1 produces only one thumbnail, at start time), or a value relative to stream duration (for example, 50% to stop at half of the stream duration from start time). The default value is 100%, which means to stop at the end of the stream.
-- **layers** - A collection of output image layers to be produced by the encoder.
-
-```csharp
-
-private static Transform EnsureTransformExists(IAzureMediaServicesClient client, string resourceGroupName, string accountName, string transformName)
-{
- // Does a Transform already exist with the desired name? Assume that an existing Transform with the desired name
- // also uses the same recipe or Preset for processing content.
- Transform transform = client.Transforms.Get(resourceGroupName, accountName, transformName);
-
- if (transform == null)
- {
- // Create a new Transform Outputs array - this defines the set of outputs for the Transform
- TransformOutput[] outputs = new TransformOutput[]
- {
- // Create a new TransformOutput with a custom Standard Encoder Preset
- // This demonstrates how to create custom codec and layer output settings
-
- new TransformOutput(
- new StandardEncoderPreset(
- codecs: new Codec[]
- {
- // Generate a set of PNG thumbnails
- new PngImage(
- start: "25%",
- step: "25%",
- range: "80%",
- layers: new PngLayer[]{
- new PngLayer(
- width: "50%",
- height: "50%"
- )
- }
- )
- },
- // Specify the format for the output files for the thumbnails
- formats: new Format[]
- {
- new PngFormat(
- filenamePattern:"Thumbnail-{Basename}-{Index}{Extension}"
- )
- }
- ),
- onError: OnErrorType.StopProcessingJob,
- relativePriority: Priority.Normal
- )
- };
-
- string description = "A transform that includes thumbnails.";
- // Create the custom Transform with the outputs defined above
- transform = client.Transforms.CreateOrUpdate(resourceGroupName, accountName, transformName, outputs, description);
- }
-
- return transform;
-}
-```
-
media-services Transform Jobs Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-jobs-concept.md
-
 Title: Transforms and Jobs in Media Services
-description: Transforms describe the rules for processing your videos in Azure Media Services.
----- Previously updated : 03/22/2021---
-# Transforms and Jobs in Media Services
-
-This topic gives details about [Transforms](/rest/api/media/transforms) and [Jobs](/rest/api/media/jobs) and explains the relationship between these entities.
-
-## Overview
-
-### Transforms/Jobs workflow
-
-The following diagram shows the Transforms/Jobs workflow:
-
-![Transforms and jobs workflow in Azure Media Services](./media/encoding/transforms-jobs.png)
-
-#### Typical workflow
-
-1. Create a Transform.
-2. Submit Jobs under that Transform.
-3. List Transforms.
-4. Delete a Transform, if you aren't planning to use it in the future.
-
-#### Example
-
-Suppose you wanted to extract the first frame of all your videos as a thumbnail image; the steps you would take are:
-
-1. Define the recipe, or the rule for processing your videos: "use the first frame of the video as the thumbnail".
-2. For each video, you would tell the service:
- 1. Where to find that video.
- 2. Where to write the output thumbnail image.
-
-A **Transform** helps you create the recipe once (Step 1), and submit Jobs using that recipe (Step 2).
-
-> [!NOTE]
-> Properties of **Transform** and **Job** of the Datetime type are always in UTC format.
-
-## Transforms
-
-Use **Transforms** to configure common tasks for encoding or analyzing videos. Each **Transform** describes a recipe or a workflow of tasks for processing your video or audio files. A single Transform can apply more than one rule. For example, a Transform could specify that each video be encoded into an MP4 file at a given bitrate, and that a thumbnail image be generated from the first frame of the video. You would add one TransformOutput entry for each rule that you want to include in your Transform. You use presets to tell the Transform how the input media files should be processed.
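-
-As an illustration, here's a minimal .NET sketch of a Transform with one TransformOutput, assuming an authenticated `IAzureMediaServicesClient` from the v3 .NET SDK (`Microsoft.Azure.Management.Media`); names such as `resourceGroup` and `transformName` are placeholders:
-
-```csharp
-using Microsoft.Azure.Management.Media;
-using Microsoft.Azure.Management.Media.Models;
-
-// One TransformOutput per rule. This rule encodes for adaptive bitrate
-// streaming using a built-in preset; add more TransformOutput entries
-// for additional rules (for example, a thumbnail-generating preset).
-TransformOutput[] outputs =
-{
-    new TransformOutput(new BuiltInStandardEncoderPreset(EncoderNamedPreset.AdaptiveStreaming))
-};
-
-Transform transform = await client.Transforms.CreateOrUpdateAsync(
-    resourceGroup, accountName, transformName, outputs, "Adaptive bitrate encoding");
-```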
-
-### Viewing schema
-
-In Media Services v3, presets are strongly typed entities in the API itself. You can find the "schema" definition for these objects in [Open API Specification (or Swagger)](https://github.com/Azure/azure-rest-api-specs/tree/master/specification/mediaservices/resource-manager/Microsoft.Media/stable/2018-07-01). You can also view the preset definitions (like **StandardEncoderPreset**) in the [REST API](/rest/api/media/transforms/createorupdate#standardencoderpreset), [.NET SDK](/dotnet/api/microsoft.azure.management.media.models.standardencoderpreset), or other Media Services v3 SDK reference documentation.
-
-### Creating Transforms
-
-You can create Transforms using REST, CLI, or any of the published SDKs. The Media Services v3 API is driven by Azure Resource Manager, so you can also use Resource Manager templates to create and deploy Transforms in your Media Services account. Azure role-based access control can be used to lock down access to Transforms.
-
-### Updating Transforms
-
-If you need to update your [Transform](/rest/api/media/transforms), use the **Update** operation. It's intended for making changes to the description, or the priorities of the underlying TransformOutputs. It's recommended that such updates be done when all in-progress jobs have completed. If you intend to rewrite the recipe, you need to create a new Transform.
-
-### Transform object diagram
-
-The following diagram shows the **Transform** object and the objects it references, including the derivation relationships. The gray arrows show a type that the Job references and the green arrows show class derivation relationships.
-
-Select the image to view it full size.
-
-[![Diagram showing the Transform object and the objects it references, including the class derivation relationships between the objects.](./media/api-diagrams/transform-small.png)](./media/api-diagrams/transform-large.png#lightbox)
-
-## Jobs
-
-A **Job** is the actual request to Media Services to apply the **Transform** to a given input video or audio content. Once the Transform has been created, you can submit jobs using Media Services APIs, or any of the published SDKs. The **Job** specifies information like the location of the input video and the location for the output. You can specify the location of your input video using: HTTPS URLs, SAS URLs, or [Assets](/rest/api/media/assets).
-
-### Job input from HTTPS
-
-Use [job input from HTTPS](job-input-from-http-how-to.md) if your content is already accessible via a URL and you don't need to store the source file in Azure (for example, import from S3). This method is also suitable if you have the content in Azure Blob storage but have no need for the file to be in an Asset. Currently, this method only supports a single file for input.
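-
-For illustration, here's a minimal .NET sketch of submitting a job whose input is a single file reachable over HTTPS, assuming an authenticated `IAzureMediaServicesClient` and an existing transform and output asset; the URL and names are placeholders:
-
-```csharp
-using Microsoft.Azure.Management.Media;
-using Microsoft.Azure.Management.Media.Models;
-
-// The input is a publicly reachable (or SAS) URL; no Asset is created for the source.
-var input = new JobInputHttp(files: new[] { "https://example.com/videos/mysample.mp4" });
-
-Job job = await client.Jobs.CreateAsync(
-    resourceGroupName,
-    accountName,
-    transformName,
-    jobName,
-    new Job(
-        input: input,
-        outputs: new JobOutput[] { new JobOutputAsset(outputAssetName) }));
-```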
-
-### Asset as Job input
-
-Use [Asset as job input](job-input-from-local-file-how-to.md) if the input content is already in an Asset or the content is stored in a local file. It's also a good option if you plan to publish the input asset for streaming or download (say you want to publish the MP4 for download but also want to do speech-to-text or face detection). This method supports multi-file assets (for example, MBR streaming sets that were encoded locally).
-
-### Checking Job progress
-
-The progress and state of jobs can be obtained by monitoring events with Event Grid. For more information, see [Monitor events using EventGrid](monitoring/job-state-events-cli-how-to.md).
-
-### Updating Jobs
-
-The Update operation on the [Job](/rest/api/media/jobs) entity can be used to modify the *description* and the *priority* properties after the job has been submitted. A change to the *priority* property is effective only if the job is still in a queued state. If the job has begun processing, or has finished, changing priority has no effect.
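-
-A minimal sketch of raising a queued job's priority, assuming the v3 .NET SDK's `Jobs.GetAsync` and `Jobs.UpdateAsync` operations:
-
-```csharp
-// Fetch the job, change its priority, and push the update.
-// The change takes effect only while the job is still queued.
-Job job = await client.Jobs.GetAsync(resourceGroupName, accountName, transformName, jobName);
-job.Priority = Priority.High;
-await client.Jobs.UpdateAsync(resourceGroupName, accountName, transformName, jobName, job);
-```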
-
-### Job object diagram
-
-The following diagram shows the **Job** object and the objects it references including the derivation relationships.
-
-Select the image to view it full size.
-
-[![Diagram showing the Job object and the objects it references, including the class derivation relationships between the objects.](./media/api-diagrams/job-small.png)](./media/api-diagrams/job-large.png#lightbox)
-
-## Ask questions, give feedback, get updates
-
-Check out the [Azure Media Services community](media-services-community.md) article to see different ways you can ask questions, give feedback, and get updates about Media Services.
-
-## See also
-
-* [Error codes](/rest/api/media/jobs/get#joberrorcode)
-* [Filtering, ordering, paging of Media Services entities](filter-order-page-entities-how-to.md)
-
-## Next steps
-
-- Before you start developing, review [Developing with Media Services v3 APIs](media-services-apis-overview.md) (includes information on accessing APIs, naming conventions, and so on).
-- Check out these tutorials:
-
- - [Tutorial: Encode a remote file based on URL and stream the video](stream-files-tutorial-with-rest.md)
- - [Tutorial: Upload, encode, and stream videos](stream-files-tutorial-with-api.md)
- - [Tutorial: Analyze videos with Media Services v3](analyze-videos-tutorial.md)
media-services Transform Stitch How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-stitch-how-to.md
- Title: How to stitch two or more video files | Microsoft Docs
-description: This article shows how to stitch two or more video files.
----- Previously updated : 03/09/2022---
-# How to stitch two or more video files
--
-## Stitch two or more video files
-
-The following example illustrates how you can generate a preset to stitch two or more video files. The most common scenario is when you want to add a header or a trailer to the main video.
-
-> [!NOTE]
-> Video files edited together should share properties (video resolution, frame rate, audio track count, and so on). You should take care not to mix videos with different frame rates or with different numbers of audio tracks.
-
-## [.NET](#tab/net/)
-
-## Prerequisites
-
-Clone or download the [Media Services .NET samples](https://github.com/Azure-Samples/media-services-v3-dotnet/).
-
-## Example
-
-Find the code that shows how to stitch video files in the [EncodingWithMESCustomStitchTwoAssets folder](https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/VideoEncoding/Encoding_StitchTwoAssets/Program.cs).
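-
-If you want a feel for the approach before opening the sample, the key idea is to submit a job whose input is a sequence of clips. This is a minimal sketch, assuming the v3 .NET SDK's `JobInputSequence` type; the asset names are placeholders:
-
-```csharp
-using Microsoft.Azure.Management.Media.Models;
-
-// Play the header, the main video, and the trailer back to back as one input.
-var stitchedInput = new JobInputSequence(
-    inputs: new JobInputClip[]
-    {
-        new JobInputAsset(assetName: "HeaderAsset"),
-        new JobInputAsset(assetName: "MainVideoAsset"),
-        new JobInputAsset(assetName: "TrailerAsset")
-    });
-
-// Submit stitchedInput as the job's input, as in the other .NET examples.
-```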
media-services Transform Subclip Video How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-subclip-video-how-to.md
- Title: Subclip a video when encoding with Media Services
-description: This topic describes how to subclip a video when encoding with Azure Media Services.
---- Previously updated : 03/09/2022---
-# Subclip a video
-
-You can trim or subclip a video when encoding it using a Media Services Job.
-
-This functionality works with any Transform that is built using either the BuiltInStandardEncoderPreset presets, or the StandardEncoderPreset presets.
-
-## [REST](#tab/rest/)
-
-## Subclip a video when encoding with Media Services - REST
-
-You can trim or subclip a video when encoding it using a [Job](/rest/api/media/jobs). This functionality works with any [Transform](/rest/api/media/transforms) that is built using either the [BuiltInStandardEncoderPreset](/rest/api/media/transforms/createorupdate#builtinstandardencoderpreset) presets, or the [StandardEncoderPreset](/rest/api/media/transforms/createorupdate#standardencoderpreset) presets.
-
-The REST example in this topic creates a job that trims a video as it submits an encoding job.
--
-## Prerequisites
-
-To complete the steps described in this topic, you have to:
-
-- [Create an Azure Media Services account](./account-create-how-to.md).
-- [Configure Postman for Azure Media Services REST API calls](setup-postman-rest-how-to.md).
-
- Make sure to follow the last step in the topic [Get Azure AD Token](setup-postman-rest-how-to.md#get-azure-ad-token).
-- Create a Transform and an output Asset. You can see how to create a Transform and an output Asset in the [Encode a remote file based on URL and stream the video - REST](stream-files-tutorial-with-rest.md) tutorial.
-- Review the [Encoding concept](encode-concept.md) topic.
-
-## Create a subclipping job
-
-1. In the Postman collection that you downloaded, select **Transforms and jobs** -> **Create Job with Sub Clipping**.
-
- The **PUT** request looks like this:
-
- ```
- https://management.azure.com/subscriptions/:subscriptionId/resourceGroups/:resourceGroupName/providers/Microsoft.Media/mediaServices/:accountName/transforms/:transformName/jobs/:jobName?api-version={{api-version}}
- ```
-1. Update the value of the "transformName" environment variable with your transform name.
-1. Select the **Body** tab and update "myOutputAsset" with your output Asset name.
-
- ```json
- {
- "properties": {
- "description": "A Job with transform cb9599fb-03b3-40eb-a2ff-7ea909f53735 and single clip.",
-
- "input": {
- "@odata.type": "#Microsoft.Media.JobInputHttp",
- "baseUri": "https://nimbuscdn-nimbuspm.streaming.mediaservices.windows.net/2b533311-b215-4409-80af-529c3e853622/",
- "files": [
- "Ignite-short.mp4"
- ],
- "start": {
- "@odata.type": "#Microsoft.Media.AbsoluteClipTime",
- "time": "PT10S"
- },
- "end": {
- "@odata.type": "#Microsoft.Media.AbsoluteClipTime",
- "time": "PT40S"
- }
- },
-
- "outputs": [
- {
- "@odata.type": "#Microsoft.Media.JobOutputAsset",
- "assetName": "myOutputAsset"
- }
- ],
- "priority": "Normal"
- }
- }
- ```
-1. Press **Send**.
-
- You see the **Response** with information about the job that was created and submitted, and the job's status.
-
-## [.NET](#tab/net/)
-
-The following C# example creates a job that trims a video in an Asset as it submits an encoding job.
-
-## Prerequisites
-
-To complete the steps described in this topic, you have to:
-
-- [Create an Azure Media Services account](./account-create-how-to.md)
-- Create a Transform and input and output Assets. You can see how to create a Transform and input and output Assets in the [Upload, encode, and stream videos using .NET](stream-files-tutorial-with-api.md) tutorial.
-- Review the [Encoding concept](encode-concept.md) topic.
-
-## Example
-
-```csharp
-/// <summary>
-/// Submits a request to Media Services to apply the specified Transform to a given input video.
-/// </summary>
-/// <param name="client">The Media Services client.</param>
-/// <param name="resourceGroupName">The name of the resource group within the Azure subscription.</param>
-/// <param name="accountName"> The Media Services account name.</param>
-/// <param name="transformName">The name of the transform.</param>
-/// <param name="jobName">The (unique) name of the job.</param>
-/// <param name="inputAssetName">The name of the input asset.</param>
-/// <param name="outputAssetName">The (unique) name of the output asset that will store the result of the encoding job. </param>
-// <SubmitJob>
-private static async Task<Job> JobWithBuiltInStandardEncoderWithSingleClipAsync(
- IAzureMediaServicesClient client,
- string resourceGroupName,
- string accountName,
- string transformName,
- string jobName,
- string inputAssetName,
- string outputAssetName)
-{
- var jobOutputs = new List<JobOutputAsset>
- {
- new JobOutputAsset(assetName: outputAssetName)
- };
-
- var clipStart = new AbsoluteClipTime()
- {
- Time = new TimeSpan(0, 0, 20)
- };
-
- var clipEnd = new AbsoluteClipTime()
- {
- Time = new TimeSpan(0, 0, 30)
- };
-
- var jobInput = new JobInputAsset(assetName: inputAssetName, start: clipStart, end: clipEnd);
-
- Job job = await client.Jobs.CreateAsync(
- resourceGroupName,
- accountName,
- transformName,
- jobName,
- new Job(input: jobInput, outputs: jobOutputs.ToArray(), name: jobName)
- {
- Description = $"A Job with transform {transformName} and single clip.",
- Priority = Priority.Normal,
- });
-
- return job;
-}
-
-```
media-services Transform Update Transform How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/transform-update-transform-how-to.md
- Title: Update a Media Services transform
-description: This article shows you how to update a Media Services transform.
----- Previously updated : 03/08/2022---
-# Update a transform
--
-This article shows you how to update a Media Services transform.
-
-## Methods
-
-You can use the following methods to update a Media Services transform.
-
-## [CLI](#tab/cli/)
--
-## [REST](#tab/rest/)
-
-See the Media Services [REST API](/rest/api/media/transforms/update).
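-
-## [.NET](#tab/net/)
-
-A minimal .NET sketch, assuming an authenticated `IAzureMediaServicesClient` and the v3 SDK's `Transforms.UpdateAsync` operation. An update is intended for changing the description or the relative priorities of the existing outputs:
-
-```csharp
-// Fetch the transform, adjust it, and push the update.
-Transform transform = await client.Transforms.GetAsync(resourceGroupName, accountName, transformName);
-transform.Outputs[0].RelativePriority = Priority.High;
-
-await client.Transforms.UpdateAsync(
-    resourceGroupName, accountName, transformName, transform.Outputs, transform.Description);
-```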
media-services Video On Demand Simple Portal Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/latest/video-on-demand-simple-portal-quickstart.md
- Title: Quickstart Video on Demand with Media Services
-description: This article shows you how to do the basic steps for delivering video on demand (VOD) with Azure Media Services.
------ Previously updated : 03/01/2022---
-# Quickstart Basic Video On Demand (VOD) with Media Services
-
-This article shows you the basic steps for delivering a video on demand (VOD) application with Azure Media Services and a GitHub repository. All the steps happen in your web browser from our documentation, the Azure portal, and GitHub.
-
-## Prerequisites
-
-- [Create a Media Services account](account-create-how-to.md). When you set up the Media Services account, a storage account, a user-managed identity, and a default streaming endpoint are also created.
-- One MP4 video to use for this exercise.
-- Create a GitHub account if you don't have one already, and stay logged in.
-- Create an Azure [Static Web App](../../static-web-apps/get-started-portal.md?tabs=vanilla-javascript).
-
-> [!NOTE]
-> You will be switching between several browser tabs or windows during this process. The following steps assume that your browser is set to open pages in tabs. Keep them all open.
--
-<!-- ## Create a transform -->
--
-Stay on this screen for the next steps.
-
-<!-- ## Create a job -->
-
-Next, you'll create a job, which tells Media Services which transform to run on the files within an asset. The asset you choose will be the input asset. The job will create an output asset to contain the encoded files as well as the manifest.
--
-Once you've viewed what is in the output asset, close the tab. Go back to the asset browser tab.
-
-In order to stream your videos, you need a streaming locator.
-
-<!-- ## Create a streaming locator -->
--
-On this screen, you'll see that the streaming endpoint that was created when you created your account is in the Streaming endpoint dropdown list along with other data about the streaming locator.
-
-In the streaming and download section, you'll see the URLs to use for your streaming application. For the following steps, you'll use the URL that ends with `(format=m3u8-cmaf)`. Keep this browser tab open as you'll be coming back to it in a later step.
-
-## Create a web page with a video player client
-
-Assuming that you created a Static Web App, you'll now change the HTML in the index.html file. If you didn't create a web app with Azure, you can still use this code wherever you plan to host your web app.
-
-1. If you aren't already logged in, sign in to GitHub and navigate to the repository you created for the Static Web App.
-1. Navigate to the *index.html* file. It should be in a directory called `src`.
-1. Select the edit pencil icon to edit the file.
-1. Replace the code in the HTML file with the following code:
-
- ```html
- <html lang="en-US">
- <head>
- <meta charset="utf-8">
- <meta http-equiv="X-UA-Compatible" content="IE=edge">
- <title>Basic Video on Demand Static Web App</title>
- <meta name="description" content="">
- <meta name="viewport" content="width=device-width, initial-scale=1">
-
- <!--*****START OF Azure Media Player Scripts*****-->
- <!--Note: DO NOT USE the "latest" folder in production. Replace "latest" with a version number like "1.0.0"-->
- <!--EX:<script src="//amp.azure.net/libs/amp/1.0.0/azuremediaplayer.min.js"></script>-->
- <!--Azure Media Player versions can be queried from //aka.ms/ampchangelog-->
- <link href="//amp.azure.net/libs/amp/latest/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
- <script src="//amp.azure.net/libs/amp/latest/azuremediaplayer.min.js"></script>
- <!--*****END OF Azure Media Player Scripts*****-->
- </head>
- <body>
- <h1>Clear Streaming Only</h1>
- <video id="azuremediaplayer" class="azuremediaplayer amp-default-skin amp-big-play-centered" controls autoplay width="640" height="400" poster="" data-setup='{}' tabindex="0">
- <source src="put streaming url here" type="application/vnd.ms-sstr+xml" />
- <p class="amp-no-js">To view this video please enable JavaScript, and consider upgrading to a web browser that supports HTML5 video</p>
- </video>
- </body>
- </html>
- ```
-
-1. Return to the Azure portal, Streaming locator browser tab where the streaming URLs are located.
-1. Copy the URL that ends with `(format=m3u8-cmaf)` under HLS.
-1. Return to the index file on GitHub browser tab.
-1. Paste the URL into the `src` value in the source object in the HTML.
-1. Select **Commit changes** to commit the change. It may take a minute for the changes to be live.
-1. Back in the Azure portal, Static web app tab, select the link next to **URL** to open the index page in another tab of your browser. The player should appear on the page.
-1. Select the **video play** button. The video should begin playing. If it isn't playing, check that your streaming endpoint is running.
-
-## Clean up resources
-
-If you don't intend to develop this basic web app further, make sure you delete all the resources you created, or you'll continue to be billed.
media-services Availability Regions V 2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/availability-regions-v-2.md
- Title: Azure Media Services regional availability | Microsoft Docs
-description: This article is an overview of Microsoft Azure Media Services features and service regional availability.
------ Previously updated : 03/10/2021----
-# Media Services regional availability
--
-> [!NOTE]
-> No new features or functionality are being added to Media Services v2. Check out the latest version, [Media Services v3](../latest/media-services-overview.md). Also, see [migration guidance from v2 to v3](../latest/migrate-v-2-v-3-migration-introduction.md)
-
-Microsoft Azure Media Services (AMS) enables you to securely upload, store, encode, and package video or audio content for both on-demand and live streaming delivery to various clients (for example, TV, PC, and mobile devices).
-
-AMS operates in multiple regions around the world, giving you flexibility in choosing where to build your applications. This article is an overview of Microsoft Azure Media Services features and service regional availability.
-
-For more information about the entire Azure global infrastructure, see [Azure geographies](https://azure.microsoft.com/global-infrastructure/geographies/).
-
-## AMS accounts
-
-Use [Azure Products by Region](https://azure.microsoft.com/global-infrastructure/services/?products=media-services&regions=all) to determine whether Media Services is available in a specific region.
-
-## Streaming endpoints
-
-Media Services customers can choose either a **Standard** streaming endpoint or a **Premium** streaming endpoint.
-
-|Name|Status|Region
-||||
-|Standard|GA|All|
-|Premium|GA|All|
-
-## Live encoding
-
-Available in all regions except: Germany, Brazil South, India West, India South, and India Central.
-
-## Encoding media processors
-
-AMS offers two on-demand encoders: **Media Encoder Standard** and **Media Encoder Premium Workflow**. For more information, see [Overview and comparison of Azure on-demand media encoders](media-services-encode-asset.md).
-
-|Media processor name|Status|Regions
-||||
-|Media Encoder Standard|GA|All|
-|Media Encoder Premium Workflow|GA|All except China|
-
-## Analytics media processors
-
-Media Analytics is a collection of speech and vision components that makes it easier for organizations and enterprises to derive actionable insights from their video files. For more information, see [Azure Media Services Analytics Overview](./legacy-components.md).
-
-> [!NOTE]
-> Some analytics media processors will be retired. For the retirement dates, see the [legacy components](legacy-components.md) topic.
-
-|Media processor name|Status|Region
-||||
-|Azure Media Face Detector|Preview|All|
-|Azure Media Indexer|GA|All|
-|Azure Media Motion Detector|Preview|All|
-|Azure Media OCR|Preview|All|
-|Azure Media Redactor|GA|All|
-|Azure Media Video Thumbnails|Preview|All|
-
-## Protection
-
-Microsoft Azure Media Services enables you to secure your media from the time it leaves your computer through storage, processing, and delivery. For more information, see [Protecting AMS content](media-services-content-protection-overview.md).
-
-|Encryption|Status|Regions|
-||||
-|Storage|GA|All|
-|AES-128 keys|GA|All|
-|FairPlay|GA|All|
-|PlayReady|GA|All|
-|Widevine|GA|All except Germany, Federal Government, and China.
-
-> [!NOTE]
-> Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Reserved units (RUs)
-
-The number of provisioned reserved units determines the number of media tasks that can be processed concurrently in a given account.
-
-Available in all regions.
-
-## Reserved unit (RU) type
-
-A Media Services account is associated with a reserved unit type that determines the speed with which your media processing tasks are completed. You can choose between the following reserved unit types: S1, S2, or S3.
-
-|RU type name|Status|Regions
-||||
-|S1|GA|All|
-|S2|GA|All except Brazil South, and India West|
-|S3|GA|All except India West|
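-
-For illustration, here's a minimal v2 .NET sketch of selecting an RU type and count, assuming a `CloudMediaContext` (`_context`) and the v2 SDK's `EncodingReservedUnits` collection; the values shown are placeholders:
-
-```csharp
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Each account exposes one encoding reserved units entity to configure.
-IEncodingReservedUnit encodingUnit = _context.EncodingReservedUnits.FirstOrDefault();
-encodingUnit.ReservedUnitType = ReservedUnitType.Standard; // Basic = S1, Standard = S2, Premium = S3
-encodingUnit.CurrentReservedUnits = 2;                     // run up to 2 media tasks concurrently
-encodingUnit.Update();
-```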
-
-## Next steps
-
-[Migrate to Media Services v3](../latest/media-services-overview.md)
-
-## Provide feedback
-
media-services Generate Thumbnail Sprite https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/generate-thumbnail-sprite.md
- Title: Generate a thumbnail sprite with Azure Media Services | Microsoft Docs
-description: This topic shows how to generate a thumbnail sprite with Azure Media Services.
------ Previously updated : 03/10/2021---
-# Generate a thumbnail sprite
--
-You can use Media Encoder Standard to generate a thumbnail sprite: a JPEG file that contains multiple small-resolution thumbnails stitched together into a single (large) image, together with a VTT file. The VTT file specifies the time range in the input video that each thumbnail represents, together with the size and coordinates of that thumbnail within the large JPEG file. Video players use the VTT file and sprite image to show a 'visual' seekbar, providing a viewer with visual feedback when scrubbing back and forward along the video timeline.
-
-In order to use Media Encoder Standard to generate a thumbnail sprite, the preset:
-
-1. Must use the JPG thumbnail image format
-2. Must specify Start/Step/Range values as either timestamps or % values (and not frame counts)
-
- 1. It is okay to mix timestamps and % values
-
-3. Must set SpriteColumn to an integer greater than or equal to 1
-
- 1. If SpriteColumn is set to M >= 1, the output image is a rectangle with M columns. If the number of thumbnails generated via #2 is not an exact multiple of M, the last row will be incomplete and left with black pixels.
-
-Here is an example:
-
-```json
-{
- "Version": 1.0,
- "Codecs": [
- {
- "Start": "00:00:01",
- "Type": "JpgImage",
- "Step": "5%",
- "Range": "100%",
- "JpgLayers": [
- {
- "Type": "JpgLayer",
- "Width": "10%",
- "Height": "10%",
- "Quality": 90
- }
- ],
- "SpriteColumn": 10
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "JpgFormat"
- }
- }
- ]
-}
-```
-
-## Known Issues
-
-1. It's not possible to generate a sprite image with a single row of images (SpriteColumn = 1 results in an image with a single column).
-2. Chunking of the sprite images into moderately sized JPEG images is not supported yet. Hence, care must be taken to limit the number of thumbnails and their size, so that the resultant stitched Thumbnail Sprite is around 8M pixels or less.
-3. Azure Media Player supports sprites on Microsoft Edge, Chrome, and Firefox browsers. VTT parsing is not supported in IE11.
-
-## Next steps
-
-[Encode content](media-services-encode-asset.md)
media-services Hybrid Design Drm Sybsystem https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/hybrid-design-drm-sybsystem.md
- Title: Hybrid design of DRM subsystem(s) using Azure Media Services | Microsoft Docs
-description: This topic discusses hybrid design of DRM subsystem(s) using Azure Media Services.
------ Previously updated : 03/10/2021---
-# Hybrid design of DRM subsystems
--
-This topic discusses hybrid design of DRM subsystem(s) using Azure Media Services.
-
-## Overview
-
-Azure Media Services provides support for the following three DRM systems:
-
-* PlayReady
-* Widevine (Modular)
-* FairPlay
-
-The DRM support includes DRM encryption (dynamic encryption) and license delivery, with Azure Media Player supporting all three DRM systems as a browser player SDK.
-
-For a detailed treatment of DRM/CENC subsystem design and implementation, please see the document titled [CENC with Multi-DRM and Access Control](media-services-cenc-with-multidrm-access-control.md).
-
-Although we offer complete support for three DRM systems, sometimes customers need to use various parts of their own infrastructure/subsystems in addition to Azure Media Services to build a hybrid DRM subsystem.
-
-Below are some common questions asked by customers:
-
-* "Can I use my own DRM license servers?" (In this case, customers have invested in DRM license server farm with embedded business logic).
-* "Can I use only your DRM license delivery in Azure Media Services without hosting content in AMS?"
-
-## Modularity of the AMS DRM platform
-
-As part of a comprehensive cloud video platform, Azure Media Services DRM is designed with flexibility and modularity in mind. You can use Azure Media Services in any of the combinations described in the table below (an explanation of the notation used in the table follows).
-
-|**Content hosting & origin**|**Content encryption**|**DRM license delivery**|
-||||
-|AMS|AMS|AMS|
-|AMS|AMS|Third-party|
-|AMS|Third-party|AMS|
-|AMS|Third-party|Third-party|
-|Third-party|Third-party|AMS|
-
-### Content hosting & origin
-
-* AMS: video asset is hosted in AMS and streaming is through AMS streaming endpoints (but not necessarily dynamic packaging).
-* Third-party: video is hosted and delivered on a third-party streaming platform outside of AMS.
-
-### Content encryption
-
-* AMS: content encryption is performed dynamically/on-demand by AMS dynamic encryption.
-* Third-party: content encryption is performed outside of AMS using a pre-processing workflow.
-
-### DRM license delivery
-
-* AMS: DRM license is delivered by AMS license delivery service.
-* Third-party: DRM license is delivered by a third-party DRM license server outside of AMS.
-
-## Configure based on your hybrid scenario
-
-### Content key
-
-Through configuration of a content key, you can control the following attributes of both AMS dynamic encryption and AMS license delivery service:
-
-* The content key used for dynamic DRM encryption.
-* DRM license content to be delivered by the license delivery service.
-* The type of **content key authorization policy restriction**: open, IP, or token restriction.
-* If the **token** type of **content key authorization policy restriction** is used, the restriction must be met before a license is issued.
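-
-For illustration, here's a trimmed v2 .NET sketch of attaching an authorization policy to a content key, assuming a `CloudMediaContext` (`_context`), an existing `IContentKey` (`key`), and a PlayReady license template string (`playReadyLicenseTemplate`); see the CENC article linked above for the full version:
-
-```csharp
-using System.Collections.Generic;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
-
-// An open restriction: any caller can obtain a license. Use IP or token
-// restrictions instead to gate license delivery.
-var restrictions = new List<ContentKeyAuthorizationPolicyRestriction>
-{
-    new ContentKeyAuthorizationPolicyRestriction
-    {
-        Name = "Open restriction",
-        KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
-        Requirements = null
-    }
-};
-
-IContentKeyAuthorizationPolicyOption option = _context.ContentKeyAuthorizationPolicyOptions.Create(
-    "PlayReady option",
-    ContentKeyDeliveryType.PlayReadyLicense,
-    restrictions,
-    playReadyLicenseTemplate);
-
-IContentKeyAuthorizationPolicy policy =
-    _context.ContentKeyAuthorizationPolicies.CreateAsync("PlayReady license delivery policy").Result;
-policy.Options.Add(option);
-
-// Associate the policy with the content key used for dynamic encryption.
-key.AuthorizationPolicyId = policy.Id;
-key = key.UpdateAsync().Result;
-```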
-
-### Asset delivery policy
-
-Through configuration of an asset delivery policy, you can control the following attributes used by the AMS dynamic packager and the dynamic encryption feature of an AMS streaming endpoint:
-
-* Streaming protocol and DRM encryption combination, such as DASH under CENC (PlayReady and Widevine), Smooth Streaming under PlayReady, HLS under Widevine or PlayReady.
-* The default/embedded license delivery URLs for each of the involved DRMs.
-* Whether license acquisition URLs (LA_URLs) in DASH MPD or HLS playlist contain query string of key ID (KID) for Widevine and FairPlay, respectively.
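-
-And a matching v2 .NET sketch of the asset delivery policy side, assuming the same context, a content key (`key`), and an `IAsset` (`asset`); the policy name is a placeholder:
-
-```csharp
-using System;
-using System.Collections.Generic;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
-using Microsoft.WindowsAzure.MediaServices.Client.DynamicEncryption;
-
-// Point players at the AMS PlayReady license delivery endpoint; for a hybrid
-// design with a third-party license server, substitute its URL here instead.
-Uri acquisitionUrl = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.PlayReadyLicense);
-
-var configuration = new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
-{
-    { AssetDeliveryPolicyConfigurationKey.PlayReadyLicenseAcquisitionUrl, acquisitionUrl.ToString() }
-};
-
-IAssetDeliveryPolicy policy = _context.AssetDeliveryPolicies.Create(
-    "PlayReady delivery policy",
-    AssetDeliveryPolicyType.DynamicCommonEncryption,
-    AssetDeliveryProtocol.Dash | AssetDeliveryProtocol.SmoothStreaming,
-    configuration);
-
-// Dynamic encryption is applied when the asset is streamed with this policy attached.
-asset.DeliveryPolicies.Add(policy);
-```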
-
-## Scenarios and samples
-
-Based on the explanations in the previous section, the following five hybrid scenarios use respective **Content key**-**Asset delivery policy** configuration combinations (the samples mentioned in the last column follow the table):
-
-|**Content hosting & origin**|**DRM encryption**|**DRM license delivery**|**Configure content key**|**Configure asset delivery policy**|**Sample**|
-|||||||
-|AMS|AMS|AMS|Yes|Yes|Sample 1|
-|AMS|AMS|Third-party|Yes|Yes|Sample 2|
-|AMS|Third-party|AMS|Yes|No|Sample 3|
-|AMS|Third-party|Third-party|No|No|Sample 4|
-|Third-party|Third-party|AMS|Yes|No| |
-
-In the samples, PlayReady protection works for both DASH and Smooth Streaming. The video URLs below are Smooth Streaming URLs. To get the corresponding DASH URLs, just append "(format=mpd-time-csf)". You can use the [Azure Media test player](https://aka.ms/amtest) to test in a browser. It allows you to configure which streaming protocol to use and under which player technology. IE11 and Microsoft Edge on Windows 10 support PlayReady through EME. For more information, see [details about the test tool](./offline-playready-streaming-windows-10.md).
-
-### Sample 1
-
-* Source (base) URL: `https://willzhanmswest.streaming.mediaservices.windows.net/1efbd6bb-1e66-4e53-88c3-f7e5657a9bbd/RussianWaltz.ism/manifest`
-* PlayReady LA_URL (DASH & smooth): `https://willzhanmswest.keydelivery.mediaservices.windows.net/PlayReady/`
-* Widevine LA_URL (DASH): `https://willzhanmswest.keydelivery.mediaservices.windows.net/Widevine/?kid=78de73ae-6d0f-470a-8f13-5c91f7c4`
-* FairPlay LA_URL (HLS): `https://willzhanmswest.keydelivery.mediaservices.windows.net/FairPlay/?kid=ba7e8fb0-ee22-4291-9654-6222ac611bd8`
-
-### Sample 2
-
-* Source (base) URL: `https://willzhanmswest.streaming.mediaservices.windows.net/1a670626-4515-49ee-9e7f-cd50853e41d8/Microsoft_HoloLens_TransformYourWorld_816p23.ism/Manifest`
-* PlayReady LA_URL (DASH & smooth): `http://willzhan12.cloudapp.net/PlayReady/RightsManager.asmx`
-
-### Sample 3
-
-* Source URL: `https://willzhanmswest.streaming.mediaservices.windows.net/8d078cf8-d621-406c-84ca-88e6b9454acc/20150807-bridges-2500.ism/manifest`
-* PlayReady LA_URL (DASH & smooth): `https://willzhanmswest.keydelivery.mediaservices.windows.net/PlayReady/`
-
-### Sample 4
-
-* Source URL: `https://willzhanmswest.streaming.mediaservices.windows.net/7c085a59-ae9a-411e-842c-ef10f96c3f89/20150807-bridges-2500.ism/manifest`
-* PlayReady LA_URL (DASH & smooth): `https://willzhan12.cloudapp.net/playready/rightsmanager.asmx`
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Summary
-
-In summary, Azure Media Services DRM components are flexible; you can use them in a hybrid scenario by properly configuring the content key and asset delivery policy, as described in this topic.
-
-## Next steps
-View Media Services learning paths.
--
-## Provide feedback
media-services Indexer Task Preset https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/indexer-task-preset.md
- Title: Task preset for Azure Media Indexer
-description: This topic gives an overview of task preset for Azure Media Services Media Indexer.
------ Previously updated : 03/10/2021--
-# Task preset for Azure Media Indexer
--
-Azure Media Indexer is a Media Processor that you use to perform the following tasks: make media files and content searchable, generate closed captioning tracks and keywords, and index asset files that are part of your asset.
-
-This topic describes the task preset that you need to pass to your indexing job. For a complete example, see [Indexing media files with Azure Media Indexer](media-services-index-content.md).
-
-## Azure Media Indexer Configuration XML
-
-The following table explains elements and attributes of the configuration XML.
-
-|Name|Required|Description|
-||||
-|Input|true|Asset file(s) that you want to index.<br/>Azure Media Indexer supports the following media file formats: MP4, MOV, WMV, MP3, M4A, WMA, AAC, WAV. <br/><br/>You can specify the file name(s) in the **name** or **list** attribute of the **input** element (as shown below). If you do not specify which asset file to index, the primary file is picked. If no primary asset file is set, the first file in the input asset is indexed.<br/><br/>To explicitly specify the asset file name, do:<br/>```<input name="TestFile.wmv" />```<br/><br/>You can also index multiple asset files at once (up to 10 files). To do this:<br/>- Create a text file (manifest file) and give it an .lst extension.<br/>- Add a list of all the asset file names in your input asset to this manifest file.<br/>- Add (upload) the manifest file to the asset.<br/>- Specify the name of the manifest file in the input's list attribute.<br/>```<input list="input.lst">```<br/><br/>**Note:** If you add more than 10 files to the manifest file, the indexing job will fail with the 2006 error code.|
-|metadata|false|Metadata for the specified asset file(s).<br/>```<metadata key="..." value="..." />```<br/><br/>You can supply values for predefined keys. <br/><br/>Currently, the following keys are supported:<br/><br/>**title** and **description** - used to update the language model to improve speech recognition accuracy.<br/>```<metadata key="title" value="[Title of the media file]" /><metadata key="description" value="[Description of the media file]" />```<br/><br/>**username** and **password** - used for authentication when downloading internet files via http or https.<br/>```<metadata key="username" value="[UserName]" /><metadata key="password" value="[Password]" />```<br/>The username and password values apply to all media URLs in the input manifest.|
-|features<br/><br/>Added in version 1.2. Currently, the only supported feature is speech recognition ("ASR").|false|The Speech Recognition feature has the following settings keys:<br/><br/>Language:<br/>- The natural language to be recognized in the multimedia file.<br/>- English, Spanish<br/><br/>CaptionFormats:<br/>- a semicolon-separated list of the desired output caption formats (if any)<br/>- ttml;webvtt<br/><br/><br/>GenerateKeywords:<br/>- A boolean flag specifying whether or not a keyword XML file is required.<br/>- True; False.|
-
-## Azure Media Indexer configuration XML example
-
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<configuration version="2.0">
- <input>
- <metadata key="title" value="[Title of the media file]" />
- <metadata key="description" value="[Description of the media file]" />
- </input>
- <settings>
- </settings>
-
- <features>
- <feature name="ASR">
- <settings>
- <add key="Language" value="English"/>
- <add key="GenerateKeywords" value ="true" />
- </settings>
- </feature>
- </features>
-
-</configuration>
-```
-
-## Next steps
-
-See [Indexing media files with Azure Media Indexer](media-services-index-content.md).
-
media-services Legacy Components https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/legacy-components.md
- Title: Azure Media Services legacy components | Microsoft Docs
-description: This topic discusses Azure Media Services legacy components.
------ Previously updated : 01/26/2022--
-# Azure Media Services legacy components
--
-Over time, we enhance Media Services components and retire legacy components. This article helps you migrate your application from a legacy component to a current component.
-
-## Retirement plans of legacy components and migration guidance
-
-The *Windows Azure Media Encoder* (WAME) and *Azure Media Encoder* (AME) media processors are deprecated.
-
-* [Migrate from Windows Azure Media Encoder to Media Encoder Standard](migrate-windows-azure-media-encoder.md)
-* [Migrate from Azure Media Encoder to Media Encoder Standard](migrate-azure-media-encoder.md)
-
-The following Media Analytics media processors are either deprecated or soon to be deprecated:
-
-
-| Media processor name | Retirement date | Additional notes |
-| | | |
-| Azure Media Indexer | March 1, 2023 | This media processor will be replaced by the [Media Services v3 AudioAnalyzerPreset Basic mode](../latest/analyze-video-audio-files-concept.md). For more information, see [Migrate from Azure Media Indexer 2 to Azure Video Analyzer for Media](migrate-indexer-v1-v2.md) (formerly Video Indexer). |
-| Azure Media Indexer 2 | January 1, 2020 | This media processor will be replaced by the [Media Services v3 AudioAnalyzerPreset Basic mode](../latest/analyze-video-audio-files-concept.md). For more information, see [Migrate from Azure Media Indexer 2 to Azure Video Analyzer for Media](migrate-indexer-v1-v2.md) (formerly Video Indexer). |
-| Motion Detection | June 1, 2020|No replacement plans at this time. |
-| Video Summarization |June 1, 2020|No replacement plans at this time.|
-| Video Optical Character Recognition | June 1, 2020 |This media processor was replaced by Azure Video Analyzer for Media. Also, consider using [Azure Media Services v3 API](../latest/analyze-video-audio-files-concept.md). <br/>See [Compare Azure Media Services v3 presets and Video Analyzer for Media](../../azure-video-analyzer/video-analyzer-for-media-docs/compare-video-indexer-with-media-services-presets.md). |
-| Face Detector | June 1, 2020 | This media processor was replaced by Azure Video Analyzer for Media. Also, consider using [Azure Media Services v3 API](../latest/analyze-video-audio-files-concept.md). <br/>See [Compare Azure Media Services v3 presets and Video Analyzer for Media](../../azure-video-analyzer/video-analyzer-for-media-docs/compare-video-indexer-with-media-services-presets.md). |
-| Content Moderator | June 1, 2020 |This media processor was replaced by Azure Video Analyzer for Media. Also, consider using [Azure Media Services v3 API](../latest/analyze-video-audio-files-concept.md). <br/>See [Compare Azure Media Services v3 presets and Video Analyzer for Media](../../azure-video-analyzer/video-analyzer-for-media-docs/compare-video-indexer-with-media-services-presets.md). |
-| Media Encoder Premium Workflow | February 29, 2024 | The AMS v2 API no longer supports the Premium Encoder. If you previously used the workflow-based Premium Encoder for HEVC encoding, you should migrate to the [new v3 Standard Encoder](../latest/encode-media-encoder-standard-formats-reference.md) with HEVC encoding support. <br/> If you require advanced workflow features of the Premium Encoder, you're encouraged to start using an Azure advanced encoding partner from [Imagine Communications](https://imaginecommunications.com/), [Telestream](https://telestream.net), or [Bitmovin](https://bitmovin.com). |
-
-## Next step
-
-[Migration guidance for moving from Media Services v2 to v3](../latest/migrate-v-2-v-3-migration-introduction.md)
media-services Media Rest Apis With Postman https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-rest-apis-with-postman.md
- Title: Configure Postman for Azure Media Services REST API calls
-description: This article describes how to configure Postman for Media Services REST API calls.
------ Previously updated : 03/10/2021--
-# Configure Postman for Media Services v2 REST API calls
---
-This tutorial shows you how to configure **Postman** so it can be used to call Azure Media Services (AMS) REST APIs. The tutorial shows how to import environment and collection files into **Postman**. The collection contains grouped definitions of HTTP requests that call Azure Media Services (AMS) REST APIs. The environment file contains variables that are used by the collection.
-
-The environment and collection are used in articles that show how to achieve various tasks with Azure Media Services REST APIs.
-
-## Prerequisites
-
-- Install the [Postman](https://www.getpostman.com/) REST client to execute the REST APIs shown in some of the AMS REST tutorials.
-
- We use **Postman**, but any REST tool would be suitable. Other alternatives are **Visual Studio Code** with the REST plugin or **Telerik Fiddler**.
-
-## Configure the environment
-
-1. Create a .json file that contains the environment variables used in AMS tutorials. Name the file (for example, **AzureMediaServices.postman_environment.json**). Open the file and paste the code that defines the Postman environment from [this code listing](postman-environment.md).
-2. Open **Postman**.
-3. On the right of the screen, select the **Manage environment** option.
-
- ![Screenshot shows the Manage Environment option selected.](./media/media-services-rest-upload-files/postman-create-env.png)
-4. From the **Manage environment** dialog, click **Import**.
-5. Browse and select the **AzureMediaServices.postman_environment.json** file.
-6. The **AzureMedia** environment is added.
-7. Close the dialog.
-8. Select the **AzureMedia** environment.
-
- ![Screenshot shows the AzureMedia environment selected.](./media/media-services-rest-upload-files/postman-choose-env.png)
-
-## Configure the collection
-
-1. Create a .json file that contains the **Postman** collection with all the operations that are needed to upload a file to Media Services. Name the file (for example, **AzureMediaServicesOperations.postman_collection.json**). Open the file and paste the code that defines the **Postman** collection from [this code listing](postman-collection.md).
-2. Click **Import** to import the collection file.
-3. Choose the **AzureMediaServicesOperations.postman_collection.json** file.
-
- ![Screenshot shows the IMPORT dialog box with Choose Files selected.](./media/media-services-rest-upload-files/postman-import-collection.png)
-
-## Next steps
-
-Check out the [upload assets](media-services-rest-upload-files.md) article.
media-services Media Services Account Concept https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-account-concept.md
- Title: Manage Azure Media Services v2 accounts | Microsoft Docs
-description: To start managing, encrypting, encoding, analyzing, and streaming media content in Azure, you need to create a Media Services account. This article explains how to manage Azure Media Services v2 accounts.
------ Previously updated : 03/10/2021---
-# Manage Azure Media Services v2 accounts
--
-To start managing, encrypting, encoding, analyzing, and streaming media content in Azure, you need to create a Media Services account. When creating a Media Services account, you need to supply the name of an Azure Storage account resource. The specified storage account is attached to your Media Services account. The Media Services account and all associated storage accounts must be in the same Azure subscription.
-
-## Moving a Media Services account between subscriptions
-
-If you need to move a Media Services account to a new subscription, you need to first move the entire resource group that contains the Media Services account to the new subscription. You must move all attached resources: Azure Storage accounts, Azure CDN profiles, etc. For more information, see [Move resources to new resource group or subscription](../../azure-resource-manager/management/move-resource-group-and-subscription.md). As with any resources in Azure, resource group moves can take some time to complete.
-
-Media Services v2 does not support a multi-tenancy model. If you need to move a Media Services account to a subscription in a new tenant, create a new Azure Active Directory (Azure AD) application in the new tenant. Then move your account to the subscription in the new tenant. After the tenant move completes, you can begin using an Azure AD application from the new tenant to access the Media Services account using the v2 APIs.
-
-> [!IMPORTANT]
-> You need to reset the [Azure AD authentication](media-services-portal-get-started-with-aad.md) info to access Media Services v2 API.
-
-### Considerations
-
-* Create backups of all data in your account before migrating to a different subscription.
-* You need to stop all the Streaming Endpoints and live streaming resources. Your users will not be able to access your content for the duration of the resource group move.
-
-> [!IMPORTANT]
-> Do not start the Streaming Endpoint until the move completes successfully.
-
-### Troubleshoot
-
-If a Media Services account or an associated Azure Storage account becomes "disconnected" following the resource group move, try rotating the Storage Account keys. If rotating the Storage Account keys does not resolve the "disconnected" status of the Media Services account, file a new support request from the "Support + troubleshooting" menu in the Media Services account.
-
-## Next steps
-
-[Create an account](media-services-portal-create-account.md)
media-services Media Services Advanced Encoding With Mes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-advanced-encoding-with-mes.md
- Title: Perform advanced encoding by customizing MES presets | Microsoft Docs
-description: This topic shows how to perform advanced encoding by customizing Media Encoder Standard task presets.
- Previously updated: 3/10/2021
-# Perform advanced encoding by customizing MES presets
-
-## Overview
-
-This topic shows how to customize Media Encoder Standard presets. The [Encoding with Media Encoder Standard using custom presets](media-services-custom-mes-presets-with-dotnet.md) topic shows how to use .NET to create an encoding task and a job that executes this task. Once you customize a preset, supply the custom preset to the encoding task.
-
-If using an XML preset, make sure to preserve the order of elements, as shown in the XML samples below (for example, KeyFrameInterval should precede SceneChangeDetection).
-
-> [!NOTE]
-> Many of the advanced Media Services v2 features of the Media Encoder Standard are currently not available in v3. For more information, see [the Migration Guide](../latest/migrate-v-2-v-3-migration-introduction.md).
-
-## Support for relative sizes
-
-When generating thumbnails, you do not need to always specify output width and height in pixels. You can specify them in percentages, in the range [1%, …, 100%].
-
-### JSON preset
-
-```json
-"Width": "100%",
-"Height": "100%"
-```
-
-### XML preset
-
-```xml
-<Width>100%</Width>
-<Height>100%</Height>
-```
-
-## Generate thumbnails
-
-This section shows how to customize a preset that generates thumbnails. The preset defined below contains information on how you want to encode your file as well as the information needed to generate thumbnails. You can take any of the MES presets documented in [this](media-services-mes-presets-overview.md) section and add code that generates thumbnails.
-
-> [!NOTE]
-> The **SceneChangeDetection** setting in the following preset can only be set to true if you are encoding to a single bitrate video. If you are encoding to a multi-bitrate video and set **SceneChangeDetection** to true, the encoder returns an error.
->
->
-
-For information about the schema, see [this](media-services-mes-schema.md) topic.
-
-Make sure to review the [Considerations](#considerations) section.
-
-### JSON preset
-
-```json
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": "true",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 4500,
- "MaxBitrate": 4500,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "ReferenceFrames": 3,
- "EntropyMode": "Cabac",
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
-
- }
- ],
- "Type": "H264Video"
- },
- {
- "JpgLayers": [
- {
- "Quality": 90,
- "Type": "JpgLayer",
- "Width": 640,
- "Height": 360
- }
- ],
- "Start": "{Best}",
- "Type": "JpgImage"
- },
- {
- "PngLayers": [
- {
- "Type": "PngLayer",
- "Width": 640,
-          "Height": 360
- }
- ],
- "Start": "00:00:01",
- "Step": "00:00:10",
- "Range": "00:00:58",
- "Type": "PngImage"
- },
- {
- "BmpLayers": [
- {
- "Type": "BmpLayer",
- "Width": 640,
- "Height": 360
- }
- ],
- "Start": "10%",
- "Step": "10%",
- "Range": "90%",
- "Type": "BmpImage"
- },
- {
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "JpgFormat"
- }
- },
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "PngFormat"
- }
- },
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "BmpFormat"
- }
- },
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
-
-### XML preset
-
-```xml
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>4500</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>4500</MaxBitrate>
- </H264Layer>
- </H264Layers>
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- <JpgImage Start="{Best}">
- <JpgLayers>
- <JpgLayer>
- <Width>640</Width>
- <Height>360</Height>
- <Quality>90</Quality>
- </JpgLayer>
- </JpgLayers>
- </JpgImage>
- <BmpImage Start="10%" Step="10%" Range="90%">
- <BmpLayers>
- <BmpLayer>
- <Width>640</Width>
- <Height>360</Height>
- </BmpLayer>
- </BmpLayers>
- </BmpImage>
- <PngImage Start="00:00:01" Step="00:00:10" Range="00:00:58">
- <PngLayers>
- <PngLayer>
- <Width>640</Width>
- <Height>360</Height>
- </PngLayer>
- </PngLayers>
- </PngImage>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- <Output FileName="{Basename}_{Index}{Extension}">
- <JpgFormat />
- </Output>
- <Output FileName="{Basename}_{Index}{Extension}">
- <BmpFormat />
- </Output>
- <Output FileName="{Basename}_{Index}{Extension}">
- <PngFormat />
- </Output>
- </Outputs>
-</Preset>
-```
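-
-To use either preset above with the v2 .NET SDK, pass it as the task configuration, in the same way the overlay and concatenation examples later in this article do. A minimal sketch, assuming the JSON preset above is saved to a local file, `asset` is the uploaded video to encode, and the `GetLatestMediaProcessorByName` helper from the .NET example later in this document is available:
-
-```csharp
-// Declare a job with a single MES task that uses the custom thumbnail preset.
-IJob job = _context.Jobs.Create("Media Encoder Standard thumbnail job");
-IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
-// Load the custom JSON preset shown above from a local file.
-string configuration = File.ReadAllText(@"c:\supportFiles\thumbnail-preset.json");
-
-ITask task = job.Tasks.AddNew("MES thumbnail task", processor, configuration, TaskOptions.None);
-task.InputAssets.Add(asset);   // the source video asset
-task.OutputAssets.AddNew("Output asset", AssetCreationOptions.None);
-
-job.Submit();
-job.GetExecutionProgressTask(CancellationToken.None).Wait();
-```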
-
-### Considerations
-
-The following considerations apply:
-
-* The use of explicit timestamps for Start/Step/Range assumes that the input source is at least 1 minute long.
-* Jpg/Png/BmpImage elements have Start, Step, and Range string attributes. These can be interpreted as:
-
-  * A frame number if they are non-negative integers, for example "Start": "120",
-  * Relative to the source duration if expressed as %-suffixed, for example "Start": "15%", OR
-  * A timestamp if expressed in HH:MM:SS… format, for example "Start": "00:01:00"
-
- You can mix and match notations as you please.
-
-  Additionally, Start also supports the special macro {Best}, which attempts to determine the first "interesting" frame of the content. Step and Range are ignored when Start is set to {Best}.
-
-  The default is Start: {Best}.
-* An output format needs to be explicitly provided for each image format: Jpg/Png/BmpFormat. When present, MES matches JpgImage to JpgFormat and so on. The output format introduces a new image-codec-specific macro, {Index}, which needs to be present (once and only once) for image output formats.
-
-## <a id="trim_video"></a>Trim a video (clipping)
-This section talks about modifying the encoder presets to clip or trim the input video where the input is a so-called mezzanine file or on-demand file. The encoder can also be used to clip or trim an asset that is captured or archived from a live stream; the details are available in [this blog](https://azure.microsoft.com/blog/sub-clipping-and-live-archive-extraction-with-media-encoder-standard/).
-
-To trim your videos, you can take any of the MES presets documented in [this](media-services-mes-presets-overview.md) section and modify the **Sources** element (as shown below). The value of StartTime needs to match the absolute timestamps of the input video. For example, if the first frame of the input video has a timestamp of 12:00:10.000, then StartTime should be 12:00:10.000 or greater. In the example below, we assume that the input video has a starting timestamp of zero. **Sources** should be placed at the beginning of the preset.
-
-### <a id="json"></a>JSON preset
-
-```json
-{
- "Version": 1.0,
- "Sources": [
- {
- "StartTime": "00:00:04",
- "Duration": "00:00:16"
- }
- ],
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "StretchMode": "AutoSize",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 3400,
- "MaxBitrate": 3400,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2250,
- "MaxBitrate": 2250,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1500,
- "MaxBitrate": 1500,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1000,
- "MaxBitrate": 1000,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 650,
- "MaxBitrate": 650,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 320,
- "Height": 180,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
-
-### XML preset
-To trim your videos, you can take any of the MES presets documented [here](media-services-mes-presets-overview.md) and modify the **Sources** element (as shown below).
-
-```xml
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Sources>
- <Source StartTime="PT4S" Duration="PT14S"/>
- </Sources>
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>3400</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>3400</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>2250</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2250</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1500</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1000</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>650</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>650</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>320</Width>
- <Height>180</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- </H264Layers>
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
-## <a id="overlay"></a>Create an overlay
-
-The Media Encoder Standard allows you to overlay an image onto an existing video. Currently, the following formats are supported: png, jpg, gif, and bmp. The preset defined below is a basic example of a video overlay.
-
-In addition to defining a preset file, you also have to let Media Services know which file in the asset is the overlay image and which file is the source video onto which you want to overlay the image. The video file has to be the **primary** file.
-
-If you are using .NET, add the following two functions to the .NET example defined in [this](media-services-custom-mes-presets-with-dotnet.md#encoding_with_dotnet) topic. The **UploadMediaFilesFromFolder** function uploads files from a folder (for example, BigBuckBunny.mp4 and Image001.png) and sets the mp4 file to be the primary file in the asset. The **EncodeWithOverlay** function uses the custom preset file that was passed to it (for example, the preset that follows) to create the encoding task.
-
-```csharp
-static public IAsset UploadMediaFilesFromFolder(string folderPath)
-{
- IAsset asset = _context.Assets.CreateFromFolder(folderPath, AssetCreationOptions.None);
-
- foreach (var af in asset.AssetFiles)
- {
- // The following code assumes
- // you have an input folder with one MP4 and one overlay image file.
- if (af.Name.Contains(".mp4"))
- af.IsPrimary = true;
- else
- af.IsPrimary = false;
-
- af.Update();
- }
-
- return asset;
-}
-
-static public IAsset EncodeWithOverlay(IAsset assetSource, string customPresetFileName)
-{
- // Declare a new job.
- IJob job = _context.Jobs.Create("Media Encoder Standard Job");
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Load the XML (or JSON) from the local file.
- string configuration = File.ReadAllText(customPresetFileName);
-
- // Create a task
- ITask task = job.Tasks.AddNew("Media Encoder Standard encoding task",
- processor,
- configuration,
- TaskOptions.None);
-
- // Specify the input assets to be encoded.
- // This asset contains a source file and an overlay file.
- task.InputAssets.Add(assetSource);
-
- // Add an output asset to contain the results of the job.
- task.OutputAssets.AddNew("Output asset",
- AssetCreationOptions.None);
-
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
- job.Submit();
- job.GetExecutionProgressTask(CancellationToken.None).Wait();
-
- return job.OutputMediaAssets[0];
-}
-```
-
-> [!NOTE]
-> Current limitations:
->
-> The overlay opacity setting is not supported.
->
-> Your source video file and the overlay image file have to be in the same asset, and the video file needs to be set as the primary file in this asset.
->
->
-
-### JSON preset
-
-```json
-{
- "Version": 1.0,
- "Sources": [
- {
- "Streams": [],
- "Filters": {
- "VideoOverlay": {
- "Position": {
- "X": 100,
- "Y": 100,
- "Width": 100,
- "Height": 50
- },
- "AudioGainLevel": 0.0,
- "MediaParams": [
- {
- "OverlayLoopCount": 1
- },
- {
- "IsOverlay": true,
- "OverlayLoopCount": 1
- }
- ],
- "Source": "Image001.png",
- "Clip": {
- "Duration": "00:00:05"
- },
- "FadeInDuration": {
- "Duration": "00:00:01"
- },
- "FadeOutDuration": {
- "StartTime": "00:00:03",
- "Duration": "00:00:04"
- }
- }
- },
- "Pad": true
- }
- ],
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1045,
- "MaxBitrate": 1045,
- "BufferWindow": "00:00:05",
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "Width": "640",
- "Height": "360",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Type": "CopyAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}{Extension}",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
-
-### XML preset
-
-```xml
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Sources>
- <Source>
- <Streams />
- <Filters>
- <VideoOverlay>
- <Source>Image001.png</Source>
- <Clip Duration="PT5S" />
- <FadeInDuration Duration="PT1S" />
- <FadeOutDuration StartTime="PT3S" Duration="PT4S" />
- <Position X="100" Y="100" Width="100" Height="50" />
- <Opacity>0</Opacity>
- <AudioGainLevel>0</AudioGainLevel>
- <MediaParams>
- <MediaParam>
- <IsOverlay>false</IsOverlay>
- <OverlayLoopCount>1</OverlayLoopCount>
- </MediaParam>
- <MediaParam>
- <IsOverlay>true</IsOverlay>
- <OverlayLoopCount>1</OverlayLoopCount>
- </MediaParam>
- </MediaParams>
- </VideoOverlay>
- </Filters>
- <Pad>true</Pad>
- </Source>
- </Sources>
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>1045</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1045</MaxBitrate>
- </H264Layer>
- </H264Layers>
- </H264Video>
- <CopyAudio />
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}{Extension}">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
-## <a id="silent_audio"></a>Insert a silent audio track when input has no audio
-By default, if you send an input to the encoder that contains only video, and no audio, then the output asset contains files that contain only video data. Some players may not be able to handle such output streams. You can use this setting to force the encoder to add a silent audio track to the output in that scenario.
-
-To force the encoder to produce an asset that contains a silent audio track when input has no audio, specify the "InsertSilenceIfNoAudio" value.
-
-You can take any of the MES presets documented in [this](media-services-mes-presets-overview.md) section, and make the following modification:
-
-### JSON preset
-
-```json
-{
- "Channels": 2,
- "SamplingRate": 44100,
- "Bitrate": 96,
- "Type": "AACAudio",
- "Condition": "InsertSilenceIfNoAudio"
-}
-```
-
-### XML preset
-
-```xml
-<AACAudio Condition="InsertSilenceIfNoAudio">
- <Channels>2</Channels>
- <SamplingRate>44100</SamplingRate>
- <Bitrate>96</Bitrate>
-</AACAudio>
-```
-
-## <a id="deinterlacing"></a>Disable auto de-interlacing
-You don't need to do anything if you want interlaced content to be automatically de-interlaced. When auto de-interlacing is on (the default), MES automatically detects interlaced frames and de-interlaces only the frames marked as interlaced.
-
-You can turn auto de-interlacing off, but this option is not recommended.
-
-### JSON preset
-
-```json
-"Sources": [
- {
- "Filters": {
- "Deinterlace": {
- "Mode": "Off"
- }
-    }
-  }
-]
-```
-
-### XML preset
-
-```xml
-<Sources>
- <Source>
- <Filters>
- <Deinterlace>
- <Mode>Off</Mode>
- </Deinterlace>
- </Filters>
- </Source>
-</Sources>
-```
-
-## <a id="audio_only"></a>Audio-only presets
-This section demonstrates two audio-only MES presets: AAC Audio and AAC Good Quality Audio.
-
-### AAC Audio
-
-```json
-{
- "Version": 1.0,
- "Codecs": [
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_AAC_{AudioBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
-
-### AAC Good Quality Audio
-
-```json
-{
- "Version": 1.0,
- "Codecs": [
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 192,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_AAC_{AudioBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
-
-## <a id="concatenate"></a>Concatenate two or more video files
-
-The following example illustrates how you can generate a preset to concatenate two or more video files. The most common scenario is when you want to add a header or a trailer to the main video. The intended use is when the video files being edited together share properties (video resolution, frame rate, audio track count, etc.). You should take care not to mix videos of different frame rates, or with different numbers of audio tracks.
-
->[!NOTE]
->The current design of the concatenation feature expects that the input video clips are consistent in terms of resolution, frame rate, etc.
-
-### Requirements and considerations
-
-* Input videos should only have one audio track.
-* Input videos should all have the same frame rate.
-* You must upload your videos into separate assets and set the videos as the primary file in each asset.
-* You need to know the duration of your videos.
-* The preset examples below assume that all the input videos start with a timestamp of zero. You need to modify the StartTime values if the videos have a different starting timestamp, as is typically the case with live archives.
-* The JSON preset makes explicit references to the AssetID values of the input assets.
-* The sample code assumes that the JSON preset has been saved to a local file, such as "C:\supportFiles\preset.json". It also assumes that two assets have been created by uploading two video files, and that you know the resultant AssetID values.
-* The code snippet and JSON preset show an example of concatenating two video files. You can extend it to more than two videos (see the sketch after this list) by:
-
- 1. Calling task.InputAssets.Add() repeatedly to add more videos, in order.
- 2. Making corresponding edits to the "Sources" element in the JSON, by adding more entries, in the same order.
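-
-A minimal sketch of the change to the .NET code below for a hypothetical third video; the asset ID is a placeholder, and a matching third entry must be added to the "Sources" element of the JSON preset:
-
-```csharp
-// Hypothetical third input asset; add it after asset1 and asset2, in order.
-IAsset asset3 = _context.Assets.Where(a => a.Id == "nb:cid:UUID:<third-asset-id>").FirstOrDefault();
-task.InputAssets.Add(asset3);
-```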
-
-### .NET code
-
-```csharp
-IAsset asset1 = _context.Assets.Where(asset => asset.Id == "nb:cid:UUID:606db602-efd7-4436-97b4-c0b867ba195b").FirstOrDefault();
-IAsset asset2 = _context.Assets.Where(asset => asset.Id == "nb:cid:UUID:a7e2b90f-0565-4a94-87fe-0a9fa07b9c7e").FirstOrDefault();
-
-// Declare a new job.
-IJob job = _context.Jobs.Create("Media Encoder Standard Job for Concatenating Videos");
-// Get a media processor reference, and pass to it the name of the
-// processor to use for the specific task.
-IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
-// Load the XML (or JSON) from the local file.
-string configuration = File.ReadAllText(@"c:\supportFiles\preset.json");
-
-// Create a task
-ITask task = job.Tasks.AddNew("Media Encoder Standard encoding task",
- processor,
- configuration,
- TaskOptions.None);
-
-// Specify the input videos to be concatenated (in order).
-task.InputAssets.Add(asset1);
-task.InputAssets.Add(asset2);
-// Add an output asset to contain the results of the job.
-// This output is specified as AssetCreationOptions.None, which
-// means the output asset is not encrypted.
-task.OutputAssets.AddNew("Output asset",
- AssetCreationOptions.None);
-
-job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
-job.Submit();
-job.GetExecutionProgressTask(CancellationToken.None).Wait();
-```
-
-### JSON preset
-
-Update your custom preset with the IDs of the assets that you want to concatenate, and with the appropriate time segment for each video.
-
-```json
-{
- "Version": 1.0,
- "Sources": [
- {
- "AssetID": "606db602-efd7-4436-97b4-c0b867ba195b",
- "StartTime": "00:00:01",
- "Duration": "00:00:15"
- },
- {
- "AssetID": "a7e2b90f-0565-4a94-87fe-0a9fa07b9c7e",
- "StartTime": "00:00:02",
- "Duration": "00:00:05"
- }
- ],
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Level": "auto",
- "Bitrate": 1800,
- "MaxBitrate": 1800,
- "BufferWindow": "00:00:05",
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "Width": "640",
- "Height": "360",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
-
-## <a id="crop"></a>Crop videos with Media Encoder Standard
-See the [Crop videos with Media Encoder Standard](media-services-crop-video.md) topic.
-
-## <a id="no_video"></a>Insert a video track when input has no video
-
-By default, if you send an input to the encoder that contains only audio, and no video, then the output asset contains files that contain only audio data. Some players, including Azure Media Player (see [this](https://feedback.azure.com/d365community/idea/aaacdea0-0d25-ec11-b6e6-000d3a4f09d0)), may not be able to handle such streams. You can use this setting to force the encoder to add a monochrome video track to the output in that scenario.
-
-> [!NOTE]
-> Forcing the encoder to insert an output video track increases the size of the output Asset, and thereby the cost incurred for the encoding Task. You should run tests to verify that this resultant increase has only a modest impact on your monthly charges.
->
-
-### Inserting video at only the lowest bitrate
-
-Suppose you are using a multiple bitrate encoding preset such as ["H264 Multiple Bitrate 720p"](media-services-mes-preset-h264-multiple-bitrate-720p.md) to encode, for streaming, an entire input catalog that contains a mix of video files and audio-only files. In this scenario, when the input has no video, you may want to force the encoder to insert a monochrome video track at just the lowest bitrate, as opposed to inserting video at every output bitrate. To achieve this, you need to use the **InsertBlackIfNoVideoBottomLayerOnly** flag.
-
-You can take any of the MES presets documented in [this](media-services-mes-presets-overview.md) section, and make the following modification:
-
-#### JSON preset
-
-```json
-{
- "KeyFrameInterval": "00:00:02",
- "StretchMode": "AutoSize",
- "Condition": "InsertBlackIfNoVideoBottomLayerOnly",
- "H264Layers": [
- …
- ]
-}
-```
-
-#### XML preset
-
-When using XML, use Condition="InsertBlackIfNoVideoBottomLayerOnly" as an attribute to the **H264Video** element and Condition="InsertSilenceIfNoAudio" as an attribute to **AACAudio**.
-
-```xml
-. . .
-<Encoding>
- <H264Video Condition="InsertBlackIfNoVideoBottomLayerOnly">
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <StretchMode>AutoSize</StretchMode>
- <H264Layers>
- <H264Layer>
- . . .
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio Condition="InsertSilenceIfNoAudio">
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
-</Encoding>
-. . .
-```
-
-### Inserting video at all output bitrates
-Suppose you are using a multiple bitrate encoding preset such as ["H264 Multiple Bitrate 720p"](media-services-mes-preset-H264-Multiple-Bitrate-720p.md) to encode, for streaming, an entire input catalog that contains a mix of video files and audio-only files. In this scenario, when the input has no video, you may want to force the encoder to insert a monochrome video track at all the output bitrates. This ensures that your output Assets are all homogeneous with respect to the number of video tracks and audio tracks. To achieve this, you need to specify the "InsertBlackIfNoVideo" flag.
-
-You can take any of the MES presets documented in [this](media-services-mes-presets-overview.md) section, and make the following modification:
-
-#### JSON preset
-
-```json
-{
- "KeyFrameInterval": "00:00:02",
- "StretchMode": "AutoSize",
- "Condition": "InsertBlackIfNoVideo",
- "H264Layers": [
- …
- ]
-}
-```
-
-#### XML preset
-
-When using XML, use Condition="InsertBlackIfNoVideo" as an attribute to the **H264Video** element and Condition="InsertSilenceIfNoAudio" as an attribute to **AACAudio**.
-
-```xml
-. . .
-<Encoding>
- <H264Video Condition="InsertBlackIfNoVideo">
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <StretchMode>AutoSize</StretchMode>
- <H264Layers>
- <H264Layer>
- . . .
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio Condition="InsertSilenceIfNoAudio">
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
-</Encoding>
-. . .
-```
-
-## <a id="rotate_video"></a>Rotate a video
-The [Media Encoder Standard](media-services-dotnet-encode-with-media-encoder-standard.md) supports rotation by angles of 0/90/180/270. The default behavior is "Auto", where the encoder tries to detect the rotation metadata in the incoming video file and compensate for it. Add the following **Sources** element to one of the presets defined in [this](media-services-mes-presets-overview.md) section:
-
-### JSON preset
-
-```json
- "Sources": [
- {
- "Streams": [],
- "Filters": {
- "Rotation": "90"
- }
- }
- ],
- "Codecs": [
-
- ...
-```
-
-### XML preset
-
-```xml
-<Sources>
- <Source>
- <Streams />
- <Filters>
- <Rotation>90</Rotation>
- </Filters>
- </Source>
-</Sources>
-```
-
-Also, see [this](media-services-mes-schema.md#PreserveResolutionAfterRotation) topic for more information on how the encoder interprets the Width and Height settings in the preset when rotation compensation is triggered.
-
-You can use the value "0" to indicate to the encoder to ignore rotation metadata, if present, in the input video.
-
-## See Also
-[Media Services Encoding Overview](media-services-encode-asset.md)
media-services Media Services Autogen Bitrate Ladder With Mes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-autogen-bitrate-ladder-with-mes.md
- Title: Use Media Encoder Standard to auto-generate a bitrate ladder - Azure | Microsoft Docs
-description: This topic shows how to use Media Encoder Standard (MES) to auto-generate a bitrate ladder based on the input resolution and bitrate.
- Previously updated: 03/10/2021
-# Use Media Encoder Standard to auto-generate a bitrate ladder
-
-## Overview
-
-This article shows how to use Media Encoder Standard (MES) to auto-generate a bitrate ladder (bitrate-resolution pairs) based on the input resolution and bitrate. The auto-generated preset will never exceed the input resolution and bitrate. For example, if the input is 720p at 3 Mbps, output remains 720p at best, and will start at rates lower than 3 Mbps.
-
-### Encoding for Streaming Only
-
-If your intent is to encode your source video only for streaming, then you should use the "Adaptive Streaming" preset when creating an encoding task. When using the **Adaptive Streaming** preset, the MES encoder intelligently caps the bitrate ladder. However, you will not be able to control the encoding costs, since the service determines how many layers to use and at what resolution. You can see examples of output layers produced by MES as a result of encoding with the **Adaptive Streaming** preset at the end of this article. The output Asset contains MP4 files where audio and video are not interleaved.
-
-### Encoding for Streaming and Progressive Download
-
-If your intent is to encode your source video for streaming as well as to produce MP4 files for progressive download, then you should use the "Content Adaptive Multiple Bitrate MP4" preset when creating an encoding task. When using the **Content Adaptive Multiple Bitrate MP4** preset, the MES encoder applies the same encoding logic as above, but the output asset contains MP4 files where audio and video are interleaved. You can use one of these MP4 files (for example, the highest bitrate version) as a progressive download file.
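-
-The choice between the two presets comes down to the preset name string passed when you create the encoding task. A minimal sketch of the only line that differs from the **Adaptive Streaming** example below, assuming the same `job` and `processor` variables as that example:
-
-```csharp
-// Use the Content Adaptive Multiple Bitrate MP4 preset instead of Adaptive Streaming.
-ITask task = job.Tasks.AddNew("Media Encoder Standard encoding task",
-    processor,
-    "Content Adaptive Multiple Bitrate MP4",
-    TaskOptions.None);
-```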
-
-## <a id="encoding_with_dotnet"></a>Encoding with Media Services .NET SDK
-
-The following code example uses Media Services .NET SDK to perform the following tasks:
-
-- Create an encoding job.
-- Get a reference to the Media Encoder Standard encoder.
-- Add an encoding task to the job and specify to use the **Adaptive Streaming** preset.
-- Create an output asset that contains the encoded asset.
-- Add an event handler to check the job progress.
-- Submit the job.
-
-#### Create and configure a Visual Studio project
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-#### Example
-
-```csharp
-using System;
-using System.Configuration;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System.Threading;
-
-namespace AdaptiveStreamingMESPreset
-{
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Get an uploaded asset.
- var asset = _context.Assets.FirstOrDefault();
-
- // Encode and generate the output using the "Adaptive Streaming" preset.
- EncodeToAdaptiveBitrateMP4Set(asset);
-
- Console.ReadLine();
- }
-
- static public IAsset EncodeToAdaptiveBitrateMP4Set(IAsset asset)
- {
- // Declare a new job.
- IJob job = _context.Jobs.Create("Media Encoder Standard Job");
-
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Create a task
- ITask task = job.Tasks.AddNew("Media Encoder Standard encoding task",
- processor,
- "Adaptive Streaming",
- TaskOptions.None);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(asset);
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- task.OutputAssets.AddNew("Output asset",
- AssetCreationOptions.None);
-
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
- job.Submit();
- job.GetExecutionProgressTask(CancellationToken.None).Wait();
-
- return job.OutputMediaAssets[0];
- }
- private static void JobStateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("Job is finished. Please wait while local tasks or downloads complete...");
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
-
- // Cast sender as a job.
- IJob job = (IJob)sender;
-
- // Display or log error details as needed.
- break;
- default:
- break;
- }
- }
- private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-            throw new ArgumentException(string.Format("Unknown media processor {0}", mediaProcessorName));
-
- return processor;
- }
- }
-}
-```
-
-## <a id="output"></a>Output
-
-This section shows three examples of output layers produced by MES as a result of encoding with the **Adaptive Streaming** preset.
-
-### Example 1
-Source with height "1080" and framerate "29.970" produces 6 video layers:
-
-|Layer|Height|Width|Bitrate(kbps)|
-|---|---|---|---|
-|1|1080|1920|6780|
-|2|720|1280|3520|
-|3|540|960|2210|
-|4|360|640|1150|
-|5|270|480|720|
-|6|180|320|380|
-
-### Example 2
-Source with height "720" and framerate "23.970" produces 5 video layers:
-
-|Layer|Height|Width|Bitrate(kbps)|
-|---|---|---|---|
-|1|720|1280|2940|
-|2|540|960|1850|
-|3|360|640|960|
-|4|270|480|600|
-|5|180|320|320|
-
-### Example 3
-Source with height "360" and framerate "29.970" produces 3 video layers:
-
-|Layer|Height|Width|Bitrate(kbps)|
-|---|---|---|---|
-|1|360|640|700|
-|2|270|480|440|
-|3|180|320|230|
-
-## See Also
-[Media Services Encoding Overview](media-services-encode-asset.md)
-
media-services Media Services Build Smooth Streaming Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-build-smooth-streaming-apps.md
- Title: Smooth Streaming Windows Store App Tutorial | Microsoft Docs
-description: Learn how to use Azure Media Services to create a C# Windows Store application with a XAML MediaElement control to play back Smooth Streaming content.
- Previously updated: 03/10/2021
-# How to Build a Smooth Streaming Windows Store Application
-
-The Smooth Streaming Client SDK for Windows 8 enables developers to build Windows Store applications that can play on-demand and live Smooth Streaming content. In addition to basic playback of Smooth Streaming content, the SDK also provides rich features like Microsoft PlayReady protection, quality level restriction, live DVR, audio stream switching, and listening for status updates (such as quality level changes) and error events. For more information about the supported features, see the [release notes](https://www.iis.net/learn/media/smooth-streaming/smooth-streaming-client-sdk-for-windows-8-release-notes). For more information, see [Player Framework for Windows 8](https://developerpublish.com/player-framework-for-windows-8-preview-6-released/).
-
-This tutorial contains four lessons:
-
-1. Create a Basic Smooth Streaming Store Application
-2. Add a Slider Bar to Control the Media Progress
-3. Select Smooth Streaming Streams
-4. Select Smooth Streaming Tracks
-
-## Prerequisites
-> [!NOTE]
-> Windows Store projects version 8.1 and earlier are not supported in Visual Studio 2017. For more information, see [Visual Studio 2017 Platform Targeting and Compatibility](https://www.visualstudio.com/en-us/productinfo/vs2017-compatibility-vs).
-
-* Windows 8 32-bit or 64-bit.
-* Visual Studio versions 2012 through 2015.
-* [Microsoft Smooth Streaming Client SDK for Windows 8](https://visualstudiogallery.msdn.microsoft.com/04423d13-3b3e-4741-a01c-1ae29e84fea6?SRC=Home).
-
-The completed solution for each lesson can be downloaded from MSDN Developer Code Samples (Code Gallery):
-
-* [Lesson 1](https://code.msdn.microsoft.com/Smooth-Streaming-Client-0bb1471f) - A Simple Windows 8 Smooth Streaming Media Player,
-* [Lesson 2](https://code.msdn.microsoft.com/A-simple-Windows-8-Smooth-ee98f63a) - A Simple Windows 8 Smooth Streaming Media Player with a Slider Bar Control,
-* [Lesson 3](https://code.msdn.microsoft.com/A-Windows-8-Smooth-883c3b44) - A Windows 8 Smooth Streaming Media Player with Stream Selection,
-* [Lesson 4](https://code.msdn.microsoft.com/A-Windows-8-Smooth-aa9e4907) - A Windows 8 Smooth Streaming Media Player with Track Selection.
-
-## Lesson 1: Create a Basic Smooth Streaming Store Application
-
-In this lesson, you will create a Windows Store application with a MediaElement control to play Smooth Streaming content. The running application looks like the following:
-
-![Smooth Streaming Windows Store application example][PlayerApplication]
-
-For more information on developing Windows Store applications, see [Develop Great Apps for Windows 8](https://developer.microsoft.com/en-us/windows/).
-
-This lesson contains the following procedures:
-
-1. Create a Windows Store project
-2. Design the user interface (XAML)
-3. Modify the code behind file
-4. Compile and test the application
-
-### To create a Windows Store project
-
-1. Run Visual Studio; versions 2012 through 2015 are supported.
-1. From the **FILE** menu, click **New**, and then click **Project**.
-1. From the New Project dialog, type or select the following values:
-
- | Name | Value |
-    | --- | --- |
- | Template group |Installed/Templates/Visual C#/Windows Store |
- | Template |Blank App (XAML) |
- | Name |SSPlayer |
- | Location |C:\SSTutorials |
- | Solution Name |SSPlayer |
- | Create directory for solution |(selected) |
-
-1. Click **OK**.
-
-### To add a reference to the Smooth Streaming Client SDK
-
-1. From Solution Explorer, right-click **SSPlayer**, and then click **Add Reference**.
-1. Type or select the following values:
-
- | Name | Value |
-    | --- | --- |
- | Reference group |Windows/Extensions |
- | Reference |Select Microsoft Smooth Streaming Client SDK for Windows 8 and Microsoft Visual C++ Runtime Package |
-
-1. Click **OK**.
-
-After adding the references, you must select the target platform (x64 or x86); the added references will not work with the Any CPU platform configuration. In Solution Explorer, you will see a yellow warning mark on these added references.
-
-### To design the player user interface
-
-1. From Solution Explorer, double click **MainPage.xaml** to open it in the design view.
-2. Locate the **&lt;Grid&gt;** and **&lt;/Grid&gt;** tags in the XAML file, and paste the following code between the two tags:
-
- ```xml
- <Grid.RowDefinitions>
- <RowDefinition Height="20"/> <!-- spacer -->
- <RowDefinition Height="50"/> <!-- media controls -->
- <RowDefinition Height="100*"/> <!-- media element -->
- <RowDefinition Height="80*"/> <!-- media stream and track selection -->
- <RowDefinition Height="50"/> <!-- status bar -->
- </Grid.RowDefinitions>
-
- <StackPanel Name="spMediaControl" Grid.Row="1" Orientation="Horizontal">
- <TextBlock x:Name="tbSource" Text="Source : " FontSize="16" FontWeight="Bold" VerticalAlignment="Center" />
- <TextBox x:Name="txtMediaSource" Text="https://ecn.channel9.msdn.com/o9/content/smf/smoothcontent/elephantsdream/Elephants_Dream_1024-h264-st-aac.ism/manifest" FontSize="10" Width="700" Margin="0,4,0,10" />
- <Button x:Name="btnSetSource" Content="Set Source" Width="111" Height="43" Click="btnSetSource_Click"/>
- <Button x:Name="btnPlay" Content="Play" Width="111" Height="43" Click="btnPlay_Click"/>
- <Button x:Name="btnPause" Content="Pause" Width="111" Height="43" Click="btnPause_Click"/>
- <Button x:Name="btnStop" Content="Stop" Width="111" Height="43" Click="btnStop_Click"/>
- <CheckBox x:Name="chkAutoPlay" Content="Auto Play" Height="55" Width="Auto" IsChecked="{Binding AutoPlay, ElementName=mediaElement, Mode=TwoWay}"/>
- <CheckBox x:Name="chkMute" Content="Mute" Height="55" Width="67" IsChecked="{Binding IsMuted, ElementName=mediaElement, Mode=TwoWay}"/>
- </StackPanel>
-
- <StackPanel Name="spMediaElement" Grid.Row="2" Height="435" Width="1072"
- HorizontalAlignment="Center" VerticalAlignment="Center">
- <MediaElement x:Name="mediaElement" Height="356" Width="924" MinHeight="225"
- HorizontalAlignment="Center" VerticalAlignment="Center"
- AudioCategory="BackgroundCapableMedia" />
- <StackPanel Orientation="Horizontal">
- <Slider x:Name="sliderProgress" Width="924" Height="44"
- HorizontalAlignment="Center" VerticalAlignment="Center"
- PointerPressed="sliderProgress_PointerPressed"/>
- <Slider x:Name="sliderVolume"
- HorizontalAlignment="Right" VerticalAlignment="Center" Orientation="Vertical"
- Height="79" Width="148" Minimum="0" Maximum="1" StepFrequency="0.1"
- Value="{Binding Volume, ElementName=mediaElement, Mode=TwoWay}"
- ToolTipService.ToolTip="{Binding Value, RelativeSource={RelativeSource Mode=Self}}"/>
- </StackPanel>
- </StackPanel>
-
- <StackPanel Name="spStatus" Grid.Row="4" Orientation="Horizontal">
- <TextBlock x:Name="tbStatus" Text="Status : "
- FontSize="16" FontWeight="Bold" VerticalAlignment="Center" HorizontalAlignment="Center" />
- <TextBox x:Name="txtStatus" FontSize="10" Width="700" VerticalAlignment="Center"/>
- </StackPanel>
- ```
-    The MediaElement control is used to play back media. The slider control named sliderProgress will be used in the next lesson to control the media progress.
-3. Press **CTRL+S** to save the file.
-
-The MediaElement control does not support Smooth Streaming content out of the box. To enable Smooth Streaming support, you must register the Smooth Streaming byte-stream handler by file name extension and MIME type. To register, you use the MediaExtensionManager.RegisterByteStreamHandler method of the Windows.Media namespace.
-
-In this XAML file, some event handlers are associated with the controls. You must define those event handlers.
-
-### To modify the code behind file
-
-1. From Solution Explorer, right-click **MainPage.xaml**, and then click **View Code**.
-2. At the top of the file, add the following using statement:
-
- ```csharp
- using Windows.Media;
- ```
-
-3. At the beginning of the **MainPage** class, add the following data member:
-
- ```csharp
- private MediaExtensionManager extensions = new MediaExtensionManager();
- ```
-
-4. At the end of the **MainPage** constructor, add the following two lines:
-
- ```csharp
- extensions.RegisterByteStreamHandler("Microsoft.Media.AdaptiveStreaming.SmoothByteStreamHandler", ".ism", "text/xml");
- extensions.RegisterByteStreamHandler("Microsoft.Media.AdaptiveStreaming.SmoothByteStreamHandler", ".ism", "application/vnd.ms-sstr+xml");
- ```
-
-5. At the end of the **MainPage** class, paste the following code:
- ```csharp
-    #region UI Button Click Events
-    private void btnPlay_Click(object sender, RoutedEventArgs e)
-    {
-        mediaElement.Play();
-        txtStatus.Text = "MediaElement is playing ...";
-    }
-
-    private void btnPause_Click(object sender, RoutedEventArgs e)
-    {
-        mediaElement.Pause();
-        txtStatus.Text = "MediaElement is paused";
-    }
-
-    private void btnSetSource_Click(object sender, RoutedEventArgs e)
-    {
-        sliderProgress.Value = 0;
-        mediaElement.Source = new Uri(txtMediaSource.Text);
-
-        if (chkAutoPlay.IsChecked == true)
-        {
-            txtStatus.Text = "MediaElement is playing ...";
-        }
-        else
-        {
-            txtStatus.Text = "Click the Play button to play the media source.";
-        }
-    }
-
-    private void btnStop_Click(object sender, RoutedEventArgs e)
-    {
-        mediaElement.Stop();
-        txtStatus.Text = "MediaElement is stopped";
-    }
-
-    private void sliderProgress_PointerPressed(object sender, PointerRoutedEventArgs e)
-    {
-        txtStatus.Text = "Seek to position " + sliderProgress.Value;
-        mediaElement.Position = new TimeSpan(0, 0, (int)(sliderProgress.Value));
-    }
-    #endregion
- ```
-    The sliderProgress_PointerPressed event handler is defined here. There is more work to do to get it working, which will be covered in the next lesson of this tutorial.
-6. Press **CTRL+S** to save the file.
-
-The finished code-behind file should look like this:
-
-![Codeview in Visual Studio of Smooth Streaming Windows Store application][CodeViewPic]
-
-### To compile and test the application
-
-1. From the **BUILD** menu, click **Configuration Manager**.
-2. Change **Active solution platform** to match your development platform.
-3. Press **F6** to compile the project.
-4. Press **F5** to run the application.
-5. At the top of the application, you can either use the default Smooth Streaming URL or enter a different one.
-6. Click **Set Source**. Because **Auto Play** is enabled by default, the media should play automatically. You can control the media using the **Play**, **Pause**, and **Stop** buttons. You can control the media volume using the vertical slider. However, the horizontal slider for controlling the media progress is not fully implemented yet.
-
-You have completed Lesson 1. In this lesson, you used a MediaElement control to play back Smooth Streaming content. In the next lesson, you will add a slider to control the progress of the Smooth Streaming content.
-
-## Lesson 2: Add a Slider Bar to Control the Media Progress
-
-In Lesson 1, you created a Windows Store application with a MediaElement XAML control to play back Smooth Streaming media content. It includes some basic media functions: start, stop, and pause. In this lesson, you will add a slider bar control to the application.
-
-In this tutorial, we will use a timer to update the slider position based on the current position of the MediaElement control. The slider start and end time also need to be updated in the case of live content. This can be better handled in the adaptive source update event.
-
-Media sources are objects that generate media data. The source resolver takes a URL or byte stream and creates the appropriate media source for that content. The source resolver is the standard way for applications to create media sources.
-
-This lesson contains the following procedures:
-
-1. Register the Smooth Streaming handler
-2. Add the adaptive source manager level event handlers
-3. Add the adaptive source level event handlers
-4. Add MediaElement event handlers
-5. Add slider bar related code
-6. Compile and test the application
-
-### To register the Smooth Streaming byte-stream handler and pass the property set
-
-1. From Solution Explorer, right click **MainPage.xaml**, and then click **View Code**.
-2. At the beginning of the file, add the following using statement:
-
- ```csharp
- using Microsoft.Media.AdaptiveStreaming;
- ```
-3. At the beginning of the MainPage class, add the following data members:
-
- ```csharp
- private Windows.Foundation.Collections.PropertySet propertySet = new Windows.Foundation.Collections.PropertySet();
- private IAdaptiveSourceManager adaptiveSourceManager;
- ```
-4. Inside the **MainPage** constructor, add the following code after the **this.InitializeComponent();** line and the registration code lines written in the previous lesson:
-
- ```csharp
-    // Gets the default instance of AdaptiveSourceManager, which manages Smooth
-    // Streaming media sources.
-    adaptiveSourceManager = AdaptiveSourceManager.GetDefault();
-    // Sets the property key value to the AdaptiveSourceManager default instance.
-    // The key "{A5CE1DE8-1D00-427B-ACEF-FB9A3C93DE2D}" must be hardcoded.
- propertySet["{A5CE1DE8-1D00-427B-ACEF-FB9A3C93DE2D}"] = adaptiveSourceManager;
- ```
-5. Inside the **MainPage** constructor, modify the two RegisterByteStreamHandler calls to add the fourth parameter:
-
- ```csharp
-    // Registers the Smooth Streaming byte-stream handler for the ".ism" extension and
-    // the "text/xml" and "application/vnd.ms-sstr+xml" MIME types, and passes the property set.
-    // http://*.ism/manifest URI resources will be resolved by the byte-stream handler.
-    extensions.RegisterByteStreamHandler(
-        "Microsoft.Media.AdaptiveStreaming.SmoothByteStreamHandler",
-        ".ism",
-        "text/xml",
-        propertySet);
-    extensions.RegisterByteStreamHandler(
-        "Microsoft.Media.AdaptiveStreaming.SmoothByteStreamHandler",
-        ".ism",
-        "application/vnd.ms-sstr+xml",
-        propertySet);
- ```
-6. Press **CTRL+S** to save the file.
-
-### To add the adaptive source manager level event handler
-
-1. From Solution Explorer, right click **MainPage.xaml**, and then click **View Code**.
-2. Inside the **MainPage** class, add the following data member:
-
- ```csharp
- private AdaptiveSource adaptiveSource = null;
- ```
-3. At the end of the **MainPage** class, add the following event handler:
-
- ```csharp
-    #region Adaptive Source Manager Level Events
-    private void mediaElement_AdaptiveSourceOpened(AdaptiveSource sender, AdaptiveSourceOpenedEventArgs args)
-    {
-        adaptiveSource = args.AdaptiveSource;
-    }
-    #endregion Adaptive Source Manager Level Events
- ```
-4. At the end of the **MainPage** constructor, add the following line to subscribe to the adaptive source open event:
-
- ```csharp
- adaptiveSourceManager.AdaptiveSourceOpenedEvent +=
- new AdaptiveSourceOpenedEventHandler(mediaElement_AdaptiveSourceOpened);
- ```
-5. Press **CTRL+S** to save the file.
-
-### To add adaptive source level event handlers
-
-1. From Solution Explorer, right click **MainPage.xaml**, and then click **View Code**.
-2. Inside the **MainPage** class, add the following data members:
-
- ```csharp
- private AdaptiveSourceStatusUpdatedEventArgs adaptiveSourceStatusUpdate;
- private Manifest manifestObject;
- ```
-3. At the end of the **MainPage** class, add the following event handlers:
-
- ```csharp
-    #region Adaptive Source Level Events
-    private void mediaElement_ManifestReady(AdaptiveSource sender, ManifestReadyEventArgs args)
-    {
-        adaptiveSource = args.AdaptiveSource;
-        manifestObject = args.AdaptiveSource.Manifest;
-    }
-
-    private void mediaElement_AdaptiveSourceStatusUpdated(AdaptiveSource sender, AdaptiveSourceStatusUpdatedEventArgs args)
-    {
-        adaptiveSourceStatusUpdate = args;
-    }
-
-    private void mediaElement_AdaptiveSourceFailed(AdaptiveSource sender, AdaptiveSourceFailedEventArgs args)
-    {
-        txtStatus.Text = "Error: " + args.HttpResponse;
-    }
-    #endregion Adaptive Source Level Events
- ```
-4. At the end of the **mediaElement_AdaptiveSourceOpened** method, add the following code to subscribe to the events:
-
- ```csharp
-    adaptiveSource.ManifestReadyEvent += mediaElement_ManifestReady;
-    adaptiveSource.AdaptiveSourceStatusUpdatedEvent += mediaElement_AdaptiveSourceStatusUpdated;
-    adaptiveSource.AdaptiveSourceFailedEvent += mediaElement_AdaptiveSourceFailed;
- ```
-5. Press **CTRL+S** to save the file.
-
-The same events are available at the Adaptive Source manager level as well, which can be used for handling functionality common to all media elements in the app. Each AdaptiveSource includes its own events, and all AdaptiveSource events are cascaded under AdaptiveSourceManager.
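-
-For example, the same handlers could be attached at the manager level instead. A sketch, under the assumption that the cascaded manager-level events use the same handler signatures as their AdaptiveSource counterparts:
-
-```csharp
-// Manager-level subscriptions (cascaded from every AdaptiveSource).
-adaptiveSourceManager.AdaptiveSourceStatusUpdatedEvent += mediaElement_AdaptiveSourceStatusUpdated;
-adaptiveSourceManager.AdaptiveSourceFailedEvent += mediaElement_AdaptiveSourceFailed;
-```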
-
-### To add Media Element event handlers
-
-1. From Solution Explorer, right click **MainPage.xaml**, and then click **View Code**.
-2. At the end of the **MainPage** class, add the following event handlers:
-
- ```csharp
-    #region Media Element Event Handlers
-    private void MediaOpened(object sender, RoutedEventArgs e)
-    {
-        txtStatus.Text = "MediaElement opened";
-    }
-
-    private void MediaFailed(object sender, ExceptionRoutedEventArgs e)
-    {
-        txtStatus.Text = "MediaElement failed: " + e.ErrorMessage;
-    }
-
-    private void MediaEnded(object sender, RoutedEventArgs e)
-    {
-        txtStatus.Text = "MediaElement ended.";
-    }
-    #endregion Media Element Event Handlers
- ```
-3. At the end of the **MainPage** constructor, add the following code to subscribe to the events:
-
- ```csharp
- mediaElement.MediaOpened += MediaOpened;
- mediaElement.MediaEnded += MediaEnded;
- mediaElement.MediaFailed += MediaFailed;
- ```
-4. Press **CTRL+S** to save the file.
-
-### To add slider bar related code
-
-1. From Solution Explorer, right click **MainPage.xaml**, and then click **View Code**.
-2. At the beginning of the file, add the following using statement:
-
- ```csharp
- using Windows.UI.Core;
- ```
-3. Inside the **MainPage** class, add the following data members:
-
- ```csharp
- public static CoreDispatcher _dispatcher;
- private DispatcherTimer sliderPositionUpdateDispatcher;
- ```
-4. At the end of the **MainPage** constructor, add the following code:
-
- ```csharp
- _dispatcher = Window.Current.Dispatcher;
- PointerEventHandler pointerpressedhandler = new PointerEventHandler(sliderProgress_PointerPressed);
- sliderProgress.AddHandler(Control.PointerPressedEvent, pointerpressedhandler, true);
- ```
-5. At the end of the **MainPage** class, add the following code:
-
-    ```csharp
-    #region sliderMediaPlayer
-    private double SliderFrequency(TimeSpan timevalue)
-    {
-        long absvalue = 0;
-        double stepfrequency = -1;
-
-        if (manifestObject != null)
-        {
-            absvalue = manifestObject.Duration - (long)manifestObject.StartTime;
-        }
-        else
-        {
-            absvalue = mediaElement.NaturalDuration.TimeSpan.Ticks;
-        }
-
-        TimeSpan totalDVRDuration = new TimeSpan(absvalue);
-
-        if (totalDVRDuration.TotalMinutes >= 10 && totalDVRDuration.TotalMinutes < 30)
-        {
-            stepfrequency = 10;
-        }
-        else if (totalDVRDuration.TotalMinutes >= 30 && totalDVRDuration.TotalMinutes < 60)
-        {
-            stepfrequency = 30;
-        }
-        else if (totalDVRDuration.TotalHours >= 1)
-        {
-            stepfrequency = 60;
-        }
-
-        return stepfrequency;
-    }
-
-    void updateSliderPositionOnTicks(object sender, object e)
-    {
-        sliderProgress.Value = mediaElement.Position.TotalSeconds;
-    }
-
-    public void setupTimer()
-    {
-        sliderPositionUpdateDispatcher = new DispatcherTimer();
-        sliderPositionUpdateDispatcher.Interval = new TimeSpan(0, 0, 0, 0, 300);
-        startTimer();
-    }
-
-    public void startTimer()
-    {
-        sliderPositionUpdateDispatcher.Tick += updateSliderPositionOnTicks;
-        sliderPositionUpdateDispatcher.Start();
-    }
-
-    // Slider start and end time must be updated in case of live content
-    public async void setSliderStartTime(long startTime)
-    {
-        await _dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
-        {
-            TimeSpan timespan = new TimeSpan(adaptiveSourceStatusUpdate.StartTime);
-            double absvalue = (int)Math.Round(timespan.TotalSeconds, MidpointRounding.AwayFromZero);
-            sliderProgress.Minimum = absvalue;
-        });
-    }
-
-    // Slider start and end time must be updated in case of live content
-    public async void setSliderEndTime(long endTime)
-    {
-        await _dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
-        {
-            TimeSpan timespan = new TimeSpan(adaptiveSourceStatusUpdate.EndTime);
-            double absvalue = (int)Math.Round(timespan.TotalSeconds, MidpointRounding.AwayFromZero);
-            sliderProgress.Maximum = absvalue;
-        });
-    }
-    #endregion sliderMediaPlayer
-    ```
-
-    > [!NOTE]
-    > CoreDispatcher is used to make changes to the UI thread from a non-UI thread. If the dispatcher thread becomes a bottleneck, you can instead use the dispatcher provided by the UI element you intend to update. For example:
-
-    ```csharp
-    await sliderProgress.Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
-    {
-        TimeSpan timespan = new TimeSpan(adaptiveSourceStatusUpdate.EndTime);
-        double absvalue = (int)Math.Round(timespan.TotalSeconds, MidpointRounding.AwayFromZero);
-        sliderProgress.Maximum = absvalue;
-    });
-    ```
-6. At the end of the **mediaElement_AdaptiveSourceStatusUpdated** method, add the following code:
-
- ```csharp
- setSliderStartTime(args.StartTime);
- setSliderEndTime(args.EndTime);
- ```
-7. At the end of the **MediaOpened** method, add the following code:
-
- ```csharp
- sliderProgress.StepFrequency = SliderFrequency(mediaElement.NaturalDuration.TimeSpan);
- sliderProgress.Width = mediaElement.Width;
- setupTimer();
- ```
-8. Press **CTRL+S** to save the file.
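-
-The code in step 4 registers a `sliderProgress_PointerPressed` handler whose body isn't shown in this lesson. The following is a minimal sketch under the assumption that pressing the slider should seek playback to the slider position; the body is illustrative, not the original sample's code:
-
-```csharp
-// Hedged sketch of the handler registered in step 4: seek to the slider
-// position (in seconds) when the user presses the slider.
-// Requires: using Windows.UI.Xaml.Input;
-private void sliderProgress_PointerPressed(object sender, PointerRoutedEventArgs e)
-{
-    mediaElement.Position = TimeSpan.FromSeconds(sliderProgress.Value);
-}
-```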
-
-### To compile and test the application
-
-1. Press **F6** to compile the project.
-2. Press **F5** to run the application.
-3. At the top of the application, you can either use the default Smooth Streaming URL or enter a different one.
-4. Click **Set Source**.
-5. Test the slider bar.
-
-You have completed lesson 2. In this lesson, you added a slider bar to the application.
-
-## Lesson 3: Select Smooth Streaming Streams
-Smooth Streaming can stream content with multiple language audio tracks that viewers can select. In this lesson, you will enable viewers to select streams. This lesson contains the following procedures:
-
-1. Modify the XAML file
-2. Modify the code behind file
-3. Compile and test the application
-
-### To modify the XAML file
-
-1. From Solution Explorer, right-click **MainPage.xaml**, and then click **View Designer**.
-2. Locate &lt;Grid.RowDefinitions&gt;, and modify the RowDefinitions so they look like the following:
-
- ```xml
- <Grid.RowDefinitions>
- <RowDefinition Height="20"/>
- <RowDefinition Height="50"/>
- <RowDefinition Height="100"/>
- <RowDefinition Height="80"/>
- <RowDefinition Height="50"/>
- </Grid.RowDefinitions>
- ```
-3. Inside the &lt;Grid&gt;&lt;/Grid&gt; tags, add the following code to define a list box control that displays the available streams and lets users select them:
-
- ```xml
- <Grid Name="gridStreamAndBitrateSelection" Grid.Row="3">
- <Grid.RowDefinitions>
- <RowDefinition Height="300"/>
- </Grid.RowDefinitions>
- <Grid.ColumnDefinitions>
- <ColumnDefinition Width="250*"/>
- <ColumnDefinition Width="250*"/>
- </Grid.ColumnDefinitions>
- <StackPanel Name="spStreamSelection" Grid.Row="1" Grid.Column="0">
- <StackPanel Orientation="Horizontal">
- <TextBlock Name="tbAvailableStreams" Text="Available Streams:" FontSize="16" VerticalAlignment="Center"></TextBlock>
- <Button Name="btnChangeStreams" Content="Submit" Click="btnChangeStream_Click"/>
- </StackPanel>
- <ListBox x:Name="lbAvailableStreams" Height="200" Width="200" Background="Transparent" HorizontalAlignment="Left"
- ScrollViewer.VerticalScrollMode="Enabled" ScrollViewer.VerticalScrollBarVisibility="Visible">
- <ListBox.ItemTemplate>
- <DataTemplate>
- <CheckBox Content="{Binding Name}" IsChecked="{Binding isChecked, Mode=TwoWay}" />
- </DataTemplate>
- </ListBox.ItemTemplate>
- </ListBox>
- </StackPanel>
- </Grid>
- ```
-4. Press **CTRL+S** to save the changes.
-
-### To modify the code behind file
-
-1. From Solution Explorer, right-click **MainPage.xaml**, and then click **View Code**.
-2. Inside the SSPlayer namespace, add a new class:
-
- ```csharp
- #region class Stream
-
- public class Stream
- {
- private IManifestStream stream;
- public bool isCheckedValue;
- public string name;
-
- public string Name
- {
- get { return name; }
- set { name = value; }
- }
-
- public IManifestStream ManifestStream
- {
- get { return stream; }
- set { stream = value; }
- }
-
- public bool isChecked
- {
- get { return isCheckedValue; }
- set
- {
-            // Make the video stream always checked.
- if (stream.Type == MediaStreamType.Video)
- {
- isCheckedValue = true;
- }
- else
- {
- isCheckedValue = value;
- }
- }
- }
-
- public Stream(IManifestStream streamIn)
- {
- stream = streamIn;
- name = stream.Name;
- }
- }
- #endregion class Stream
- ```
-3. At the beginning of the MainPage class, add the following variable definitions:
-
- ```csharp
- private List<Stream> availableStreams;
- private List<Stream> availableAudioStreams;
- private List<Stream> availableTextStreams;
- private List<Stream> availableVideoStreams;
- ```
-4. Inside the MainPage class, add the following region:
- ```csharp
- #region stream selection
- ///<summary>
- ///Functionality to select streams from IManifestStream available streams
- /// </summary>
-
- // This function is called from the mediaElement_ManifestReady event handler
- // to retrieve the streams and populate them to the local data members.
- public void getStreams(Manifest manifestObject)
- {
- availableStreams = new List<Stream>();
- availableVideoStreams = new List<Stream>();
- availableAudioStreams = new List<Stream>();
- availableTextStreams = new List<Stream>();
-
- try
- {
- for (int i = 0; i<manifestObject.AvailableStreams.Count; i++)
- {
- Stream newStream = new Stream(manifestObject.AvailableStreams[i]);
- newStream.isChecked = false;
-
- //populate the stream lists based on the types
- availableStreams.Add(newStream);
-
- switch (newStream.ManifestStream.Type)
- {
- case MediaStreamType.Video:
- availableVideoStreams.Add(newStream);
- break;
- case MediaStreamType.Audio:
- availableAudioStreams.Add(newStream);
- break;
- case MediaStreamType.Text:
- availableTextStreams.Add(newStream);
- break;
- }
-
- // Select the default selected streams from the manifest.
- for (int j = 0; j<manifestObject.SelectedStreams.Count; j++)
- {
- string selectedStreamName = manifestObject.SelectedStreams[j].Name;
- if (selectedStreamName.Equals(newStream.Name))
- {
- newStream.isChecked = true;
- break;
- }
- }
- }
- }
- catch (Exception e)
- {
- txtStatus.Text = "Error: " + e.Message;
- }
- }
-
-    // This function sets the list box ItemsSource
- private async void refreshAvailableStreamsListBoxItemSource()
- {
- try
- {
- //update the stream check box list on the UI
- await _dispatcher.RunAsync(CoreDispatcherPriority.Normal, ()
- => { lbAvailableStreams.ItemsSource = availableStreams; });
- }
- catch (Exception e)
- {
- txtStatus.Text = "Error: " + e.Message;
- }
- }
-
- // This function creates a selected streams list
- private void createSelectedStreamsList(List<IManifestStream> selectedStreams)
- {
- bool isOneVideoSelected = false;
- bool isOneAudioSelected = false;
-
- // Only one video stream can be selected
- for (int j = 0; j<availableVideoStreams.Count; j++)
- {
- if (availableVideoStreams[j].isChecked && (!isOneVideoSelected))
- {
- selectedStreams.Add(availableVideoStreams[j].ManifestStream);
- isOneVideoSelected = true;
- }
- }
-
- // Select the first video stream from the list if no video stream is selected
- if (!isOneVideoSelected)
- {
- availableVideoStreams[0].isChecked = true;
- selectedStreams.Add(availableVideoStreams[0].ManifestStream);
- }
-
- // Only one audio stream can be selected
- for (int j = 0; j<availableAudioStreams.Count; j++)
- {
- if (availableAudioStreams[j].isChecked && (!isOneAudioSelected))
- {
- selectedStreams.Add(availableAudioStreams[j].ManifestStream);
- isOneAudioSelected = true;
- txtStatus.Text = "The audio stream is changed to " + availableAudioStreams[j].ManifestStream.Name;
- }
- }
-
-        // Select the first audio stream from the list if no audio stream is selected.
- if (!isOneAudioSelected)
- {
- availableAudioStreams[0].isChecked = true;
- selectedStreams.Add(availableAudioStreams[0].ManifestStream);
- }
-
- // Multiple text streams are supported.
- for (int j = 0; j < availableTextStreams.Count; j++)
- {
- if (availableTextStreams[j].isChecked)
- {
- selectedStreams.Add(availableTextStreams[j].ManifestStream);
- }
- }
- }
-
- // Change streams on a smooth streaming presentation with multiple video streams.
- private async void changeStreams(List<IManifestStream> selectStreams)
- {
- try
- {
- IReadOnlyList<IStreamChangedResult> returnArgs =
- await manifestObject.SelectStreamsAsync(selectStreams);
- }
- catch (Exception e)
- {
- txtStatus.Text = "Error: " + e.Message;
- }
- }
- #endregion stream selection
- ```
-5. Locate the mediaElement_ManifestReady method, and append the following code at the end of the function:
- ```csharp
- getStreams(manifestObject);
- refreshAvailableStreamsListBoxItemSource();
- ```
-    When the MediaElement manifest is ready, this code gets the list of available streams and populates the UI list box with it.
-6. Inside the MainPage class, locate the UI buttons click events region, and then add the following function definition:
- ```csharp
- private void btnChangeStream_Click(object sender, RoutedEventArgs e)
- {
- List<IManifestStream> selectedStreams = new List<IManifestStream>();
-
- // Create a list of the selected streams
- createSelectedStreamsList(selectedStreams);
-
- // Change streams on the presentation
- changeStreams(selectedStreams);
- }
- ```
-
-### To compile and test the application
-
-1. Press **F6** to compile the project.
-2. Press **F5** to run the application.
-3. At the top of the application, you can either use the default Smooth Streaming URL or enter a different one.
-4. Click **Set Source**.
-5. The default audio stream is audio_eng. Try switching between audio_eng and audio_es. Every time you select a new stream, you must click the **Submit** button.
-
-You have completed lesson 3. In this lesson, you added the functionality to choose streams.
-
-## Lesson 4: Select Smooth Streaming tracks
-
-A Smooth Streaming presentation can contain multiple video files encoded with different quality levels (bit rates) and resolutions. In this lesson, you will enable users to select tracks. This lesson contains the following procedures:
-
-1. Modify the XAML file
-2. Modify the code behind file
-3. Compile and test the application
-
-### To modify the XAML file
-
-1. From Solution Explorer, right-click **MainPage.xaml**, and then click **View Designer**.
-2. Locate the &lt;Grid&gt; tag with the name **gridStreamAndBitrateSelection**, append the following code at the end of the tag:
- ```xml
- <StackPanel Name="spBitRateSelection" Grid.Row="1" Grid.Column="1">
- <StackPanel Orientation="Horizontal">
- <TextBlock Name="tbBitRate" Text="Available Bitrates:" FontSize="16" VerticalAlignment="Center"/>
- <Button Name="btnChangeTracks" Content="Submit" Click="btnChangeTrack_Click" />
- </StackPanel>
- <ListBox x:Name="lbAvailableVideoTracks" Height="200" Width="200" Background="Transparent" HorizontalAlignment="Left"
- ScrollViewer.VerticalScrollMode="Enabled" ScrollViewer.VerticalScrollBarVisibility="Visible">
- <ListBox.ItemTemplate>
- <DataTemplate>
- <CheckBox Content="{Binding Bitrate}" IsChecked="{Binding isChecked, Mode=TwoWay}"/>
- </DataTemplate>
- </ListBox.ItemTemplate>
- </ListBox>
- </StackPanel>
- ```
-3. Press **CTRL+S** to save the changes.
-
-### To modify the code behind file
-
-1. From Solution Explorer, right-click **MainPage.xaml**, and then click **View Code**.
-2. Inside the SSPlayer namespace, add a new class:
- ```csharp
- #region class Track
- public class Track
- {
- private IManifestTrack trackInfo;
- public string _bitrate;
- public bool isCheckedValue;
-
- public IManifestTrack TrackInfo
- {
- get { return trackInfo; }
- set { trackInfo = value; }
- }
-
- public string Bitrate
- {
- get { return _bitrate; }
- set { _bitrate = value; }
- }
-
- public bool isChecked
- {
- get { return isCheckedValue; }
- set
- {
- isCheckedValue = value;
- }
- }
-
- public Track(IManifestTrack trackInfoIn)
- {
- trackInfo = trackInfoIn;
- _bitrate = trackInfoIn.Bitrate.ToString();
- }
- }
- #endregion class Track
- ```
-3. At the beginning of the MainPage class, add the following variable definitions:
- ```csharp
- private List<Track> availableTracks;
- ```
-4. Inside the MainPage class, add the following region:
- ```csharp
- #region track selection
- /// <summary>
- /// Functionality to select video streams
- /// </summary>
-
- /// This Function gets the tracks for the selected video stream
- public void getTracks(Manifest manifestObject)
- {
- availableTracks = new List<Track>();
-
- IManifestStream videoStream = getVideoStream();
- IReadOnlyList<IManifestTrack> availableTracksLocal = videoStream.AvailableTracks;
- IReadOnlyList<IManifestTrack> selectedTracksLocal = videoStream.SelectedTracks;
-
- try
- {
- for (int i = 0; i < availableTracksLocal.Count; i++)
- {
- Track thisTrack = new Track(availableTracksLocal[i]);
- thisTrack.isChecked = true;
-
- for (int j = 0; j < selectedTracksLocal.Count; j++)
- {
- string selectedTrackName = selectedTracksLocal[j].Bitrate.ToString();
- if (selectedTrackName.Equals(thisTrack.Bitrate))
- {
- thisTrack.isChecked = true;
- break;
- }
- }
- availableTracks.Add(thisTrack);
- }
- }
- catch (Exception e)
- {
- txtStatus.Text = e.Message;
- }
- }
-
- // This function gets the video stream that is playing
- private IManifestStream getVideoStream()
- {
- IManifestStream videoStream = null;
- for (int i = 0; i < manifestObject.SelectedStreams.Count; i++)
- {
- if (manifestObject.SelectedStreams[i].Type == MediaStreamType.Video)
- {
- videoStream = manifestObject.SelectedStreams[i];
- break;
- }
- }
- return videoStream;
- }
-
-    // This function sets the UI list box control ItemsSource
- private async void refreshAvailableTracksListBoxItemSource()
- {
- try
- {
- // Update the track check box list on the UI
- await _dispatcher.RunAsync(CoreDispatcherPriority.Normal, ()
- => { lbAvailableVideoTracks.ItemsSource = availableTracks; });
- }
- catch (Exception e)
- {
- txtStatus.Text = "Error: " + e.Message;
- }
- }
-
- // This function creates a list of the selected tracks.
- private void createSelectedTracksList(List<IManifestTrack> selectedTracks)
- {
- // Create a list of selected tracks
- for (int j = 0; j < availableTracks.Count; j++)
- {
- if (availableTracks[j].isCheckedValue == true)
- {
- selectedTracks.Add(availableTracks[j].TrackInfo);
- }
- }
- }
-
- // This function selects the tracks based on user selection
- private void changeTracks(List<IManifestTrack> selectedTracks)
- {
- IManifestStream videoStream = getVideoStream();
- try
- {
- videoStream.SelectTracks(selectedTracks);
- }
- catch (Exception ex)
- {
- txtStatus.Text = ex.Message;
- }
- }
- #endregion track selection
- ```
-5. Locate the mediaElement_ManifestReady method, and append the following code at the end of the function:
- ```csharp
- getTracks(manifestObject);
- refreshAvailableTracksListBoxItemSource();
- ```
-6. Inside the MainPage class, locate the UI buttons click events region, and then add the following function definition:
-    ```csharp
-    private void btnChangeTrack_Click(object sender, RoutedEventArgs e)
-    {
-        List<IManifestTrack> selectedTracks = new List<IManifestTrack>();
-
-        // Create a list of the selected tracks
-        createSelectedTracksList(selectedTracks);
-
-        // Change tracks on the presentation
-        changeTracks(selectedTracks);
-    }
-    ```
-
-### To compile and test the application
-
-1. Press **F6** to compile the project.
-2. Press **F5** to run the application.
-3. At the top of the application, you can either use the default Smooth Streaming URL or enter a different one.
-4. Click **Set Source**.
-5. By default, all of the tracks of the video stream are selected. To experiment with bit rate changes, select the lowest available bit rate, and then select the highest available bit rate. You must click **Submit** after each change. You can see the video quality change.
-
-You have completed lesson 4. In this lesson, you added the functionality to choose tracks.
-
-## Other resources
-* [How to build a Smooth Streaming Windows 8 JavaScript application with advanced features](https://blogs.iis.net/cenkd/archive/2012/08/10/how-to-build-a-smooth-streaming-windows-8-javascript-application-with-advanced-features.aspx)
-* [Smooth Streaming Technical Overview](https://www.iis.net/learn/media/on-demand-smooth-streaming/smooth-streaming-technical-overview)
-
media-services Media Services Castlabs Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-castlabs-integration.md
- Title: Using castLabs to deliver Widevine licenses to Azure Media Services | Microsoft Docs
-description: This article describes how you can use Azure Media Services (AMS) to deliver a stream that is dynamically encrypted by AMS with both PlayReady and Widevine DRMs.
-Previously updated: 03/10/2021
-# Using castLabs to deliver Widevine licenses to Azure Media Services
-## Overview
-
-This article describes how you can use Azure Media Services (AMS) to deliver a stream that is dynamically encrypted by AMS with both PlayReady and Widevine DRMs. The PlayReady license comes from the Media Services PlayReady license server, and the Widevine license is delivered by the **castLabs** license server.
-
-To play back streaming content protected by CENC (PlayReady and/or Widevine), you can use [Azure Media Player](https://aka.ms/azuremediaplayer). See the [Azure Media Player documentation](https://amp.azure.net/libs/amp/latest/docs/) for details.
-
-The following diagram demonstrates a high-level Azure Media Services and castLabs integration architecture.
-
-![integration](./media/media-services-castlabs-integration/media-services-castlabs-integration.png)
-
-## Typical system set up
-
-* Media content is stored in AMS.
-* Key IDs of content keys are stored in both castLabs and AMS.
-* castLabs and AMS both have token authentication built in. The following sections discuss authentication tokens.
-* When a client requests to stream the video, the content is dynamically encrypted with **Common Encryption** (CENC) and dynamically packaged by AMS to Smooth Streaming and DASH. AMS also delivers PlayReady M2TS elementary stream encryption for the HLS streaming protocol.
-* The PlayReady license is retrieved from the AMS license server, and the Widevine license is retrieved from the castLabs license server.
-* Media Player automatically decides which license to fetch based on the client platform capability.
-
-## Authentication token generation for getting a license
-
-Both castLabs and AMS support the JWT (JSON Web Token) format for the tokens used to authorize a license.
-
-### JWT token in AMS
-
-The following table describes the JWT claims used in AMS.
-
-| Name | Description |
-| --- | --- |
-| Issuer |Issuer string from the chosen Secure Token Service (STS) |
-| Audience |Audience string from the used STS |
-| Claims |A set of claims |
-| NotBefore |Start of the token's validity |
-| Expires |End of the token's validity |
-| SigningCredentials |The key that is shared among the PlayReady license server, the castLabs license server, and the STS; it can be either a symmetric or an asymmetric key. |
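-
-As a rough illustration (a sketch, not this article's sample code), a matching token restriction could be configured in the Media Services v2 .NET SDK as follows; the issuer, audience, and key values are placeholders:
-
-```csharp
-// Hedged sketch (Media Services v2 .NET SDK): a JWT token restriction whose
-// fields mirror the table above. All values shown are placeholders.
-TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.JWT)
-{
-    Issuer = "https://sts.example.com/",    // issuer string from your STS
-    Audience = "urn:example-player"         // audience string from your STS
-};
-// The key shared among the PlayReady license server, castLabs, and the STS.
-template.PrimaryVerificationKey = new SymmetricVerificationKey();
-```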
-
-### JWT token in castLabs
-
-The following table describes the JWT claims used by castLabs.
-
-| Name | Description |
-| --- | --- |
-| optData |A JSON string containing information about you. |
-| crt |A JSON string containing information about the asset, its license info, and playback rights. |
-| iat |The current datetime, in epoch format. |
-| jti |A unique identifier for this token (every token can be used only once in the castLabs system). |
-
-## Sample solution setup
-
-The [sample solution](https://github.com/AzureMediaServicesSamples/CastlabsIntegration) consists of two projects:
-
-* A console app that can be used to set DRM restrictions on an already ingested asset, for both PlayReady and Widevine.
-* A web application that hands out tokens, which can be seen as a very simplified version of an STS.
-
-To use the console application:
-
-1. Change the app.config to set up the AMS credentials, castLabs credentials, STS configuration, and shared key.
-2. Upload an Asset into AMS.
-3. Get the UUID from the uploaded asset, and change line 32 in the Program.cs file:
-
-    ```csharp
-    var objIAsset = _context.Assets.Where(x => x.Id == "nb:cid:UUID:dac53a5d-1500-80bd-b864-f1e4b62594cf").FirstOrDefault();
-    ```
-4. Use an AssetId for naming the asset in the castLabs system (Line 44 in the Program.cs file).
-
-    You must set an AssetId for **castLabs**; it must be a unique alphanumeric string.
-5. Run the program.
-
-To use the Web Application (STS):
-
-1. Change the web.config to set up the castLabs merchant ID, the STS configuration, and the shared key.
-2. Deploy to Azure Websites.
-3. Navigate to the website.
-
-## Playing back a video
-
-To play back a video encrypted with common encryption (PlayReady and/or Widevine), you can use the [Azure Media Player](https://aka.ms/azuremediaplayer). When you run the console app, it echoes the content key ID and the manifest URL.
-
-1. Open a new tab and launch your STS: `http://[yourStsName].azurewebsites.net/api/token/assetid/[yourCastLabsAssetId]/contentkeyid/[thecontentkeyid]`.
-2. Go to [Azure Media Player](https://aka.ms/azuremediaplayer).
-3. Paste in the streaming URL.
-4. Click the **Advanced Options** checkbox.
-5. In the **Protection** dropdown, select PlayReady and/or Widevine.
-6. Paste the token that you got from your STS in the Token textbox.
-
-    The castLabs license server does not need the "Bearer=" prefix in front of the token, so remove it before submitting the token (see the snippet after these steps).
-7. Update the player.
-8. The video should be playing.
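-
-A minimal sketch of stripping that prefix before submitting the token (the variable names are illustrative):
-
-```csharp
-// Hedged sketch: remove the "Bearer=" prefix, if present, before sending
-// the token to the castLabs license server.
-string castLabsToken = stsToken.StartsWith("Bearer=")
-    ? stsToken.Substring("Bearer=".Length)
-    : stsToken;
-```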
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
media-services Media Services Cenc With Multidrm Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-cenc-with-multidrm-access-control.md
- Title: Design of a content protection system with access control using Azure Media Services | Microsoft Docs
-description: Learn how to design a content protection system with access control by using Azure Media Services.
-Previously updated: 03/10/2021
-# Design of a content protection system with access control using Azure Media Services
-## Overview
-
-Designing and building a digital rights management (DRM) subsystem for an over-the-top (OTT) or online streaming solution is a complex task. Operators/online video providers typically outsource this task to specialized DRM service providers. The goal of this document is to present a reference design and implementation of an end-to-end DRM subsystem in an OTT or online streaming solution.
-
-The targeted readers for this document are engineers who work in DRM subsystems of OTT or online streaming/multiscreen solutions or readers who are interested in DRM subsystems. The assumption is that readers are familiar with at least one of the DRM technologies on the market, such as PlayReady, Widevine, FairPlay, or Adobe Access.
-
-In this discussion of DRM, we also include common encryption (CENC) with multi-DRM. A major trend in online streaming and the OTT industry is to use CENC with multiple native DRMs on various client platforms. This trend is a shift from the previous approach, which used a single DRM and its client SDK for various client platforms. When you use CENC with multi-native DRM, the content is encrypted once per the [Common Encryption (ISO/IEC 23001-7 CENC)](https://www.iso.org/iso/home/store/catalogue_ics/catalogue_detail_ics.htm?csnumber=65271/) specification and can be protected by both PlayReady and Widevine.
-
-The benefits of CENC with multi-DRM are that it:
-
-* Reduces encryption cost because a single encryption process is used to target different platforms with its native DRMs.
-* Reduces the cost of managing encrypted assets because only a single copy of encrypted assets is needed.
-* Eliminates DRM client licensing cost because the native DRM client is usually free on its native platform.
-
-Microsoft is an active promoter of DASH and CENC together with some major industry players. Azure Media Services provides support for DASH and CENC. For recent announcements, see the following blogs:
-
-* [Announcing Google Widevine license delivery services in Azure Media Services](https://azure.microsoft.com/blog/announcing-general-availability-of-google-widevine-license-services/)
-* [Azure Media Services adds Google Widevine packaging for delivering a multi-DRM stream](https://azure.microsoft.com/blog/azure-media-services-adds-google-widevine-packaging-for-delivering-multi-drm-stream/)
-
-### Goals of the article
-
-The goals of this article are to:
-
-* Provide a reference design of a DRM subsystem that uses CENC with multi-DRM.
-* Provide a reference implementation on an Azure/Media Services platform.
-* Discuss some design and implementation topics.
-
-In the article, the term "multi-DRM" covers the following products:
-
-* Microsoft PlayReady
-* Google Widevine
-* Apple FairPlay
-
-The following table summarizes the native platform/native app and browsers supported by each DRM.
-
-| **Client platform** | **Native DRM support** | **Browser/app** | **Streaming formats** |
-| --- | --- | --- | --- |
-| **Smart TVs, operator STBs, OTT STBs** |PlayReady primarily, and/or Widevine, and/or other |Linux, Opera, WebKit, other |Various formats |
-| **Windows 10 devices (Windows PC, Windows tablets, Windows Phone, Xbox)** |PlayReady |Microsoft Edge/IE11/EME<br/><br/><br/>Universal Windows Platform |DASH (for HLS, PlayReady isn't supported)<br/><br/>DASH, Smooth Streaming (for HLS, PlayReady isn't supported) |
-| **Android devices (phone, tablet, TV)** |Widevine |Chrome/EME |DASH, HLS |
-| **iOS (iPhone, iPad), OS X clients and Apple TV** |FairPlay |Safari 8+/EME |HLS |
-
-Considering the current state of deployment for each DRM, a service typically wants to implement two or three DRMs to address all the types of endpoints in the best way.
-
-There is a tradeoff between the complexity of the service logic and the complexity on the client side to reach a certain level of user experience on the various clients.
-
-To make your selection, keep in mind:
-
-* PlayReady is natively implemented in every Windows device, on some Android devices, and available through software SDKs on virtually any platform.
-* Widevine is natively implemented in every Android device, in Chrome, and in some other devices.
-* FairPlay is available only on iOS and Mac OS clients or through iTunes.
-
-There are two options for a typical multi-DRM:
-
-* PlayReady and Widevine
-* PlayReady, Widevine, and FairPlay
-
-## A reference design
-This section presents a reference design that is agnostic to the technologies used to implement it.
-
-A DRM subsystem can contain the following components:
-
-* Key management
-* DRM packaging
-* DRM license delivery
-* Entitlement check
-* Authentication/authorization
-* Player
-* Origin/content delivery network (CDN)
-
-The following diagram illustrates the high-level interaction among the components in a DRM subsystem:
-
-![DRM subsystem with CENC](./media/media-services-cenc-with-multidrm-access-control/media-services-generic-drm-subsystem-with-cenc.png)
-
-The design has three basic layers:
-
-* A back-office layer (black) is not exposed externally.
-* A DMZ layer (dark blue) contains all the endpoints that face the public.
-* A public internet layer (light blue) contains the CDN and players with traffic across the public internet.
-
-There also should be a content management tool to control DRM protection, regardless of whether it's static or dynamic encryption. The inputs for DRM encryption include:
-
-* MBR video content
-* Content key
-* License acquisition URLs
-
-Here's the high-level flow during playback time:
-
-* The user is authenticated.
-* An authorization token is created for the user.
-* DRM protected content (manifest) is downloaded to the player.
-* The player submits a license acquisition request to license servers together with a key ID and an authorization token.
-
-The following table shows common ContentKey-to-asset mappings to consider when designing key management.
-
-| **ContentKey-to-asset mapping** | **Scenario** |
-| --- | --- |
-| 1-to-1 |The simplest case. It provides the finest control. But this arrangement generally results in the highest license delivery cost. At minimum, one license request is required for each protected asset. |
-| 1-to-many |You could use the same content key for multiple assets. For example, for all the assets in a logical group, such as a genre or the subset of a genre (movie genre), you can use a single content key. |
-| Many-to-1 |Multiple content keys are needed for each asset. <br/><br/>For example, if you need to apply dynamic CENC protection with multi-DRM for MPEG-DASH and dynamic AES-128 encryption for HLS, you need two separate content keys. Each content key needs its own ContentKeyType. (For the content key used for dynamic CENC protection, use ContentKeyType.CommonEncryption. For the content key used for dynamic AES-128 encryption, use ContentKeyType.EnvelopeEncryption.)<br/><br/>As another example, in CENC protection of DASH content, in theory, you can use one content key to protect the video stream and another content key to protect the audio stream. |
-| Many-to-many |Combination of the previous two scenarios. One set of content keys is used for each of the multiple assets in the same asset group. |
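-
-For instance, the Many-to-1 case could be set up along these lines in the Media Services v2 .NET SDK (a sketch; `GetRandomBuffer` is a hypothetical helper that returns random key bytes):
-
-```csharp
-// Hedged sketch (Media Services v2 .NET SDK): one asset with two content
-// keys of different types, as in the Many-to-1 scenario above.
-// GetRandomBuffer is a hypothetical helper returning random key bytes.
-IContentKey cencKey = _context.ContentKeys.Create(
-    Guid.NewGuid(), GetRandomBuffer(16), "CENC key",
-    ContentKeyType.CommonEncryption);
-IContentKey envelopeKey = _context.ContentKeys.Create(
-    Guid.NewGuid(), GetRandomBuffer(16), "AES-128 envelope key",
-    ContentKeyType.EnvelopeEncryption);
-asset.ContentKeys.Add(cencKey);
-asset.ContentKeys.Add(envelopeKey);
-```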
-
-Another important factor to consider is the use of persistent and nonpersistent licenses.
-
-Why are these considerations important?
-
-If you use a public cloud for license delivery, persistent and nonpersistent licenses have a direct impact on license delivery cost. The following two different design cases serve to illustrate:
-
-* Monthly subscription: Use a persistent license and 1-to-many content key-to-asset mapping. For example, for all the kids' movies, we use a single content key for encryption. In this case:
-
- Total number of licenses requested for all kids' movies/device = 1
-
-* Monthly subscription: Use a nonpersistent license and 1-to-1 mapping between content key and asset. In this case:
-
- Total number of licenses requested for all kids' movies/device = [number of movies watched] x [number of sessions]
-
-The two different designs result in very different license request patterns. The different patterns result in different license delivery costs if the license delivery service is provided by a public cloud such as Media Services.
-
-## Map design to technology for implementation
-Next, the generic design is mapped to technologies on the Azure/Media Services platform by specifying which technology to use for each building block.
-
-The following table shows the mapping.
-
-| **Building block** | **Technology** |
-| --- | --- |
-| **Player** |[Azure Media Player](https://azure.microsoft.com/services/media-services/media-player/) |
-| **Identity provider (IDP)** |Azure Active Directory (Azure AD) |
-| **Security token service (STS)** |Azure AD |
-| **DRM protection workflow** |Media Services dynamic protection |
-| **DRM license delivery** |* Media Services license delivery (PlayReady, Widevine, FairPlay) <br/>* Axinom license server <br/>* Custom PlayReady license server |
-| **Origin** |Media Services streaming endpoint |
-| **Key management** |Not needed for reference implementation |
-| **Content management** |A C# console application |
-
-In other words, both IDP and STS are used with Azure AD. The [Azure Media Player API](https://amp.azure.net/libs/amp/latest/docs/) is used for the player. Both Media Services and Media Player support DASH and CENC with multi-DRM.
-
-The following diagram shows the overall structure and flow with the previous technology mapping:
-
-![CENC on Media Services](./media/media-services-cenc-with-multidrm-access-control/media-services-cenc-subsystem-on-AMS-platform.png)
-
-To set up dynamic CENC encryption, the content management tool uses the following inputs:
-
-* Open content
-* Content key from key generation/management
-* License acquisition URLs
-* A list of information from Azure AD
-
-Here's the output of the content management tool:
-
-* ContentKeyAuthorizationPolicy contains the specification on how license delivery verifies a JSON Web Token (JWT) and DRM license specifications.
-* AssetDeliveryPolicy contains specifications on streaming format, DRM protection, and license acquisition URLs.
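-
-For illustration, the AssetDeliveryPolicy output could be created along the following lines in the Media Services v2 .NET SDK (a sketch; `playReadyUrl` and `widevineUrl` are placeholder variables, not values from this article):
-
-```csharp
-// Hedged sketch (Media Services v2 .NET SDK): an asset delivery policy for
-// DASH with dynamic common encryption. The URL variables are placeholders.
-var config = new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
-{
-    { AssetDeliveryPolicyConfigurationKey.PlayReadyLicenseAcquisitionUrl, playReadyUrl },
-    { AssetDeliveryPolicyConfigurationKey.WidevineLicenseAcquisitionUrl, widevineUrl }
-};
-IAssetDeliveryPolicy policy = _context.AssetDeliveryPolicies.Create(
-    "CENC delivery policy",
-    AssetDeliveryPolicyType.DynamicCommonEncryption,
-    AssetDeliveryProtocol.Dash,
-    config);
-asset.DeliveryPolicies.Add(policy);
-```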
-
-Here's the flow during runtime:
-
-* Upon user authentication, a JWT is generated.
-* One of the claims contained in the JWT is a groups claim that contains the group object ID EntitledUserGroup. This claim is used to pass the entitlement check.
-* The player downloads the client manifest of CENC-protected content and identifies the following:
- * Key ID.
- * The content is CENC protected.
- * License acquisition URLs.
-* The player makes a license acquisition request based on the browser/DRM supported. In the license acquisition request, the key ID and the JWT are also submitted. The license delivery service verifies the JWT and the claims contained before it issues the needed license.
-
-## Implementation
-### Implementation procedures
-Implementation includes the following steps:
-
-1. Prepare test assets. Encode/package a test video to multi-bitrate fragmented MP4 in Media Services. This asset is *not* DRM protected. DRM protection is done by dynamic protection later.
-
-2. Create a key ID and a content key (optionally from a key seed). In this instance, the key management system isn't needed because only a single key ID and content key are required for a couple of test assets.
-
-3. Use the Media Services API to configure multi-DRM license delivery services for the test asset. If you use custom license servers by your company or your company's vendors instead of license services in Media Services, you can skip this step. You can specify license acquisition URLs in the step when you configure license delivery. The Media Services API is needed to specify some detailed configurations, such as authorization policy restriction and license response templates for different DRM license services. At this time, the Azure portal doesn't provide the needed UI for this configuration. For API-level information and sample code, see [Use PlayReady and/or Widevine dynamic common encryption](media-services-protect-with-playready-widevine.md).
-
-4. Use the Media Services API to configure the asset delivery policy for the test asset. For API-level information and sample code, see [Use PlayReady and/or Widevine dynamic common encryption](media-services-protect-with-playready-widevine.md).
-
-5. Create and configure an Azure AD tenant in Azure.
-
-6. Create a few user accounts and groups in your Azure AD tenant. Create at least an "Entitled User" group, and add a user to this group. Users in this group pass the entitlement check in license acquisition. Users not in this group fail to pass the authentication check and can't acquire a license. Membership in this "Entitled User" group is a required groups claim in the JWT issued by Azure AD. You specify this claim requirement in the step when you configure multi-DRM license delivery services.
-
-7. Create an ASP.NET MVC app to host your video player. This ASP.NET app is protected with user authentication against the Azure AD tenant. Proper claims are included in the access tokens obtained after user authentication. We recommend OpenID Connect API for this step. Install the following NuGet packages:
-
- * Install-Package Microsoft.Azure.ActiveDirectory.GraphClient
- * Install-Package Microsoft.Owin.Security.OpenIdConnect
- * Install-Package Microsoft.Owin.Security.Cookies
- * Install-Package Microsoft.Owin.Host.SystemWeb
- * Install-Package Microsoft.IdentityModel.Clients.ActiveDirectory
-
-8. Create a player by using the [Azure Media Player API](https://amp.azure.net/libs/amp/latest/docs/). Use the [Azure Media Player ProtectionInfo API](https://amp.azure.net/libs/amp/latest/docs/) to specify which DRM technology to use on different DRM platforms.
-
-9. The following table shows the test matrix.
-
- | **DRM** | **Browser** | **Result for entitled user** | **Result for unentitled user** |
-    | --- | --- | --- | --- |
- | **PlayReady** |Microsoft Edge or Internet Explorer 11 on Windows 10 |Succeed |Fail |
- | **Widevine** |Chrome, Firefox, Opera |Succeed |Fail |
- | **FairPlay** |Safari on macOS |Succeed |Fail |
- | **AES-128** |Most modern browsers |Succeed |Fail |
-
-For information on how to set up Azure AD for an ASP.NET MVC player app, see [Integrate an Azure Media Services OWIN MVC-based app with Azure Active Directory and restrict content key delivery based on JWT claims](http://gtrifonov.com/2015/01/24/mvc-owin-azure-media-services-ad-integration/).
-
-For more information, see [JWT token authentication in Azure Media Services and dynamic encryption](http://gtrifonov.com/2015/01/03/jwt-token-authentication-in-azure-media-services-and-dynamic-encryption/).
-
-For information on Azure AD:
-
-* You can find developer information in the [Azure Active Directory developer's guide](../../active-directory/azuread-dev/v1-overview.md).
-* You can find administrator information in [Administer your Azure AD tenant directory](../../active-directory/fundamentals/active-directory-whatis.md).
-
-### Some issues in implementation
-Use the following troubleshooting information for help with implementation issues.
-
-* The issuer URL must end with "/", and the audience must be the player application's client ID:
-
- ```xml
- <add key="ida:audience" value="[Application Client ID GUID]" />
- <add key="ida:issuer" value="https://sts.windows.net/[AAD Tenant ID]/" />
- ```
-
- In the [JWT Decoder](http://jwt.calebb.net/), you see **aud** and **iss**, as shown in the JWT:
-
- ![JWT](./media/media-services-cenc-with-multidrm-access-control/media-services-1st-gotcha.png)
-
-* Add permissions to the application in Azure AD on the **Configure** tab of the application. Permissions are required for each application, both local and deployed versions.
-
- ![Permissions](./media/media-services-cenc-with-multidrm-access-control/media-services-perms-to-other-apps.png)
-
-* Use the correct issuer when you set up dynamic CENC protection.
-
- ```xml
- <add key="ida:issuer" value="https://sts.windows.net/[AAD Tenant ID]/"/>
- ```
-
- The following doesn't work:
-
- ```xml
- <add key="ida:issuer" value="https://willzhanad.onmicrosoft.com/" />
- ```
-
- The GUID is the Azure AD tenant ID. The GUID can be found in the **Endpoints** pop-up menu in the Azure portal.
-
-* Grant group membership claims privileges. Make sure the Azure AD application manifest file contains the following (the default value is null):
-
-    "groupMembershipClaims": "All"
-
-* Set the proper TokenType when you create restriction requirements.
-
- ```csharp
- objTokenRestrictionTemplate.TokenType = TokenType.JWT;
- ```
-
-    Because support for JWT (Azure AD) was added in addition to SWT (ACS), the default TokenType is TokenType.JWT. If you use SWT/ACS, you must set the token type to TokenType.SWT.
-
-## Additional topics for implementation
-This section discusses some additional topics in design and implementation.
-
-### HTTP or HTTPS?
-The ASP.NET MVC player application must support the following:
-
-* User authentication through Azure AD, which is under HTTPS.
-* JWT exchange between the client and Azure AD, which is under HTTPS.
-* DRM license acquisition by the client, which must be under HTTPS if license delivery is provided by Media Services. The PlayReady product suite doesn't mandate HTTPS for license delivery. If your PlayReady license server is outside Media Services, you can use either HTTP or HTTPS.
-
-The ASP.NET player application uses HTTPS as a best practice, so Media Player is on a page under HTTPS. However, HTTP is preferred for streaming, so you need to consider the issue of mixed content.
-
-* The browser doesn't allow mixed content. But plug-ins like Silverlight and the OSMF plug-in for Smooth Streaming and DASH do allow it. Mixed content is a security concern because of the threat of malicious JavaScript injection, which can put customer data at risk. Browsers block this capability by default. The only way to work around it is on the server (origin) side by allowing all domains (regardless of HTTPS or HTTP). This is probably not a good idea either.
-* Avoid mixed content. Both the player application and Media Player should use HTTP or HTTPS. When playing mixed content, the silverlightSS tech requires clearing a mixed-content warning. The flashSS tech handles mixed content without a mixed-content warning.
-* If your streaming endpoint was created before August 2014, it won't support HTTPS. In this case, create and use a new streaming endpoint for HTTPS.
-
-In the reference implementation for DRM-protected contents, both the application and streaming are under HTTPS. For open contents, the player doesn't need authentication or a license, so you can use either HTTP or HTTPS.
-
-### Azure Active Directory signing key rollover
-Signing key rollover is an important point to take into consideration in your implementation. If you ignore it, the finished system eventually stops working completely, within six weeks at the most.
-
-Azure AD uses industry standards to establish trust between itself and applications that use Azure AD. Specifically, Azure AD uses a signing key that consists of a public and private key pair. When Azure AD creates a security token that contains information about the user, it's signed by Azure AD with a private key before it's sent back to the application. To verify that the token is valid and originated from Azure AD, the application must validate the token's signature. The application uses the public key exposed by Azure AD that is contained in the tenant's federation metadata document. This public key, and the signing key from which it derives, is the same one used for all tenants in Azure AD.
-
-For more information on Azure AD key rollover, see [Important information about signing key rollover in Azure AD](../../active-directory/develop/active-directory-signing-key-rollover.md).
-
-Between the [public-private key pair](https://login.microsoftonline.com/common/discovery/keys/):
-
-* The private key is used by Azure AD to generate a JWT.
-* The public key is used by an application such as DRM license delivery services in Media Services to verify the JWT.
-
-For security purposes, Azure AD rotates the certificate periodically (every six weeks). In the case of security breaches, the key rollover can occur any time. Therefore, the license delivery services in Media Services need to update the public key used as Azure AD rotates the key pair. Otherwise, token authentication in Media Services fails and no license is issued.
-
-To set up this service, you set TokenRestrictionTemplate.OpenIdConnectDiscoveryDocument when you configure DRM license delivery services.
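-
-In the v2 .NET SDK, that setting might look like the following sketch (the bracketed values are placeholders, as in the earlier configuration snippets):
-
-```csharp
-// Hedged sketch (Media Services v2 .NET SDK): point license delivery at the
-// Azure AD discovery document so signing key rollover is handled for you.
-TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.JWT)
-{
-    Issuer = "https://sts.windows.net/[AAD Tenant ID]/",
-    Audience = "[Application Client ID GUID]",
-    OpenIdConnectDiscoveryDocument = new OpenIdConnectDiscoveryDocument(
-        "https://login.microsoftonline.com/common/.well-known/openid-configuration")
-};
-```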
-
-Here's the JWT flow:
-
-* Azure AD issues the JWT with the current private key for an authenticated user.
-* When a player sees a CENC with multi-DRM protected content, it first locates the JWT issued by Azure AD.
-* The player sends a license acquisition request with the JWT to license delivery services in Media Services.
-* The license delivery services in Media Services use the current/valid public key from Azure AD to verify the JWT before issuing licenses.
-
-DRM license delivery services always check for the current/valid public key from Azure AD. The public key presented by Azure AD is the key used to verify a JWT issued by Azure AD.
-
-What if the key rollover happens after Azure AD generates a JWT but before the JWT is sent by players to DRM license delivery services in Media Services for verification?
-
-Because a key can be rolled over at any moment, more than one valid public key is always available in the federation metadata document. Media Services license delivery can use any of the keys specified in the document. Because one key might be rolled soon, another might be its replacement, and so forth.
-
-### Where is the access token?
-If you look at how a web app calls an API app under [Application identity with OAuth 2.0 client credentials grant](../../active-directory/azuread-dev/web-api.md), the authentication flow is as follows:
-
-* A user signs in to Azure AD in the web application. For more information, see [Web browser to web application](../../active-directory/azuread-dev/web-app.md).
-* The Azure AD authorization endpoint redirects the user agent back to the client application with an authorization code. The user agent returns the authorization code to the client application's redirect URI.
-* The web application needs to acquire an access token so that it can authenticate to the web API and retrieve the desired resource. It makes a request to the Azure AD token endpoint and provides the credential, client ID, and web API's application ID URI. It presents the authorization code to prove that the user consented.
-* Azure AD authenticates the application and returns a JWT access token that's used to call the web API.
-* Over HTTPS, the web application uses the returned JWT access token to add the JWT string with a "Bearer" designation in the "Authorization" header of the request to the web API. The web API then validates the JWT. If validation is successful, it returns the desired resource.
-
-In this application identity flow, the web API trusts that the web application authenticated the user. For this reason, this pattern is called a trusted subsystem. The [authorization flow diagram](../../active-directory/azuread-dev/v1-protocols-oauth-code.md) describes how authorization-code-grant flow works.
-
-License acquisition with token restriction follows the same trusted subsystem pattern. The license delivery service in Media Services is the web API resource, or the "back-end resource" that a web application needs to access. So where is the access token?
-
-The access token is obtained from Azure AD. After successful user authentication, an authorization code is returned. The authorization code is then used, together with the client ID and the app key, to exchange for the access token. The access token is used to access a "pointer" application that points to or represents the Media Services license delivery service.
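-
-A sketch of that exchange using ADAL (the Microsoft.IdentityModel.Clients.ActiveDirectory package listed earlier) might look like the following; every bracketed value is a placeholder that corresponds to the registration steps below:
-
-```csharp
-// Hedged sketch (ADAL, inside an async method): exchange the authorization
-// code for an access token to the "pointer" resource app. All bracketed
-// values are placeholders.
-AuthenticationContext authContext = new AuthenticationContext(
-    "https://login.microsoftonline.com/[aad_tenant_name].onmicrosoft.com");
-AuthenticationResult result = await authContext.AcquireTokenByAuthorizationCodeAsync(
-    authorizationCode,
-    new Uri("https://[web_app_name].azurewebsites.net/"),
-    new ClientCredential("[client_id]", "[app_key]"),
-    "https://[aad_tenant_name].onmicrosoft.com/[resource_name]");
-string jwtAccessToken = result.AccessToken;  // used to call the license service
-```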
-
-To register and configure the pointer app in Azure AD, take the following steps:
-
-1. In the Azure AD tenant:
-
- * Add an application (resource) with the sign-on URL https://[resource_name].azurewebsites.net/.
- * Add an app ID with the URL https://[aad_tenant_name].onmicrosoft.com/[resource_name].
-
-2. Add a new key for the resource app.
-
-3. Update the app manifest file so that the groupMembershipClaims property has the value "groupMembershipClaims": "All".
-
-4. In the Azure AD app that points to the player web app, in the section **Permissions to other applications**, add the resource app that was added in step 1. Under **Delegated permission**, select **Access [resource_name]**. This option gives the web app permission to create access tokens that access the resource app. Do this for both the local and deployed version of the web app if you develop with Visual Studio and the Azure web app.
-
-The JWT issued by Azure AD is the access token used to access the pointer resource.
-
-### What about live streaming?
-The previous discussion focused on on-demand assets. What about live streaming?
-
-You can use exactly the same design and implementation to protect live streaming in Media Services by treating the asset associated with a program as a VOD asset.
-
-Specifically, to do live streaming in Media Services, you need to create a channel and then create a program under the channel. To create the program, you need to create an asset that contains the live archive for the program. To provide CENC with multi-DRM protection of the live content, apply the same setup/processing to the asset as if it were a VOD asset before you start the program.
-
-### What about license servers outside Media Services?
-Often, customers have invested in a license server farm either in their own data center or in one hosted by DRM service providers. With Media Services Content Protection, you can operate in hybrid mode: content can be hosted and dynamically protected in Media Services, while DRM licenses are delivered by servers outside Media Services. In this case, consider the following changes:
-
-* STS needs to issue tokens that are acceptable and can be verified by the license server farm. For example, the Widevine license servers provided by Axinom require a specific JWT that contains an entitlement message. Therefore, you need to have an STS to issue such a JWT.
-* You no longer need to configure license delivery service (ContentKeyAuthorizationPolicy) in Media Services. You need to provide the license acquisition URLs (for PlayReady, Widevine, and FairPlay) when you configure AssetDeliveryPolicy to set up CENC with multi-DRM.
-
-### What if I want to use a custom STS?
-A customer might choose to use a custom STS to provide JWTs. Reasons include:
-
-* The IDP used by the customer doesn't support STS. In this case, a custom STS might be an option.
-* The customer might need more flexible or tighter control to integrate STS with the customer's subscriber billing system. For example, an MVPD operator might offer multiple OTT subscriber packages, such as premium, basic, and sports. The operator might want to match the claims in a token with a subscriber's package so that only the contents in a specific package are made available. In this case, a custom STS provides the needed flexibility and control.
-
-When you use a custom STS, two changes must be made:
-
-* When you configure license delivery service for an asset, you need to specify the security key used for verification by the custom STS instead of the current key from Azure AD. (More details follow.)
-* When a JWT is generated, a security key is specified instead of the private key of the current X509 certificate in Azure AD.
-
-There are two types of security keys:
-
-* Symmetric key: The same key is used both to generate and to verify a JWT.
-* Asymmetric key: A public-private key pair in an X509 certificate is used; the private key signs/generates the JWT, and the public key verifies it.
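-
-For the symmetric-key case, a custom STS built on the System.IdentityModel.Tokens.Jwt library might issue tokens roughly as follows (a sketch; all values are placeholders):
-
-```csharp
-// Hedged sketch: a custom STS issuing a JWT signed with a symmetric key
-// shared with the license delivery service. All values are placeholders.
-// Uses System.IdentityModel and the System.IdentityModel.Tokens.Jwt package.
-byte[] signingKey = Convert.FromBase64String("[base64_shared_key]");
-var descriptor = new SecurityTokenDescriptor
-{
-    TokenIssuerName = "https://custom-sts.example.com/",
-    AppliesToAddress = "urn:example-player",
-    Lifetime = new Lifetime(DateTime.UtcNow, DateTime.UtcNow.AddHours(1)),
-    SigningCredentials = new SigningCredentials(
-        new InMemorySymmetricSecurityKey(signingKey),
-        "http://www.w3.org/2001/04/xmldsig-more#hmac-sha256",
-        "http://www.w3.org/2001/04/xmlenc#sha256")
-};
-var handler = new JwtSecurityTokenHandler();
-string jwt = handler.WriteToken(handler.CreateToken(descriptor));
-```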
-
-> [!NOTE]
-> If you use .NET Framework/C# as your development platform, the X509 certificate used for an asymmetric security key must have a key length of at least 2048. This is a requirement of the class System.IdentityModel.Tokens.X509AsymmetricSecurityKey in .NET Framework. Otherwise, the following exception is thrown:
->
-> IDX10630: The 'System.IdentityModel.Tokens.X509AsymmetricSecurityKey' for signing cannot be smaller than '2048' bits.
-
-## The completed system and test
-This section walks you through the following scenarios in the completed end-to-end system so that you can have a basic picture of the behavior before you get a sign-in account:
-
-* If you need a non-integrated scenario:
-
- * For video assets hosted in Media Services that are either unprotected or DRM protected but without token authentication (issuing a license to whoever requested it), you can test it without signing in. Switch to HTTP if your video streaming is over HTTP.
-
-* If you need an end-to-end integrated scenario:
-
- * For video assets under dynamic DRM protection in Media Services, with the token authentication and JWT generated by Azure AD, you need to sign in.
-
-For the player web application and its sign-in, see [this website](https://openidconnectweb.azurewebsites.net/).
-
-### User sign-in
-To test the end-to-end integrated DRM system, you need to have an account created or added.
-
-What account?
-
-Although Azure originally allowed access only by Microsoft account users, access is now allowed by users from both systems. All Azure properties now trust Azure AD for authentication, and Azure AD authenticates organizational users. A federation relationship was created where Azure AD trusts the Microsoft account consumer identity system to authenticate consumer users. As a result, Azure AD can authenticate guest Microsoft accounts as well as native Azure AD accounts.
-
-Because Azure AD trusts the Microsoft account domain, you can add any accounts from any of the following domains to the custom Azure AD tenant and use the account to sign in:
-
-| **Domain name** | **Domain** |
-| | |
-| **Custom Azure AD tenant domain** |somename.onmicrosoft.com |
-| **Corporate domain** |microsoft.com |
-| **Microsoft account domain** |outlook.com, live.com, hotmail.com |
-
-You can contact any of the authors to have an account created or added for you.
-
-The following screenshots show different sign-in pages used by different domain accounts:
-
-**Custom Azure AD tenant domain account**: The customized sign-in page of the custom Azure AD tenant domain.
-
-![Screenshot that shows the customized sign-in page of the custom Azure A D tenant domain.](./media/media-services-cenc-with-multidrm-access-control/media-services-ad-tenant-domain1.png)
-
-**Microsoft domain account with smart card**: The sign-in page customized by Microsoft corporate IT with two-factor authentication.
-
-![Screenshot that shows the sign-in page customized by Microsoft corporate I T with two-factor authentication.](./media/media-services-cenc-with-multidrm-access-control/media-services-ad-tenant-domain2.png)
-
-**Microsoft account**: The sign-in page of the Microsoft account for consumers.
-
-![Custom Azure AD tenant domain account](./media/media-services-cenc-with-multidrm-access-control/media-services-ad-tenant-domain3.png)
-
-### Use Encrypted Media Extensions for PlayReady
-On a modern browser with Encrypted Media Extensions (EME) for PlayReady support, such as Internet Explorer 11 on Windows 8.1 or later and Microsoft Edge browser on Windows 10, PlayReady is the underlying DRM for EME.
-
-![Use EME for PlayReady](./media/media-services-cenc-with-multidrm-access-control/media-services-eme-for-playready1.png)
-
-The dark player area is because PlayReady protection prevents you from making a screen capture of protected video.
-
-The following screenshot shows the player plug-ins and Media Source Extensions (MSE)/EME support:
-
-![Player plug-ins for PlayReady](./media/media-services-cenc-with-multidrm-access-control/media-services-eme-for-playready2.png)
-
-EME in Microsoft Edge and Internet Explorer 11 on Windows 10 allows [PlayReady SL3000](https://www.microsoft.com/playready/features/EnhancedContentProtection.aspx/) to be invoked on Windows 10 devices that support it. PlayReady SL3000 unlocks the flow of enhanced premium content (4K, HDR) and new content delivery models (for enhanced content).
-
-Focusing on Windows devices, PlayReady is the only hardware-backed DRM available on Windows devices (PlayReady SL3000). A streaming service can use PlayReady through EME or through a Universal Windows Platform application and offer higher video quality by using PlayReady SL3000 than with another DRM. Typically, content up to 2K flows through Chrome or Firefox, and content up to 4K flows through Microsoft Edge/Internet Explorer 11 or a Universal Windows Platform application on the same device. The amount depends on service settings and implementation.
-
-#### Use EME for Widevine
-On a modern browser with EME/Widevine support, such as Chrome 41+ on Windows 10, Windows 8.1, and Mac OS X Yosemite, or Chrome on Android 4.4.4, Google Widevine is the DRM behind EME.
-
-![Use EME for Widevine](./media/media-services-cenc-with-multidrm-access-control/media-services-eme-for-widevine1.png)
-
-Widevine doesn't prevent you from making a screen capture of protected video.
-
-![Player plug-ins for Widevine](./media/media-services-cenc-with-multidrm-access-control/media-services-eme-for-widevine2.png)
-
-### Unentitled users
-If a user isn't a member of the "Entitled Users" group, the user doesn't pass the entitlement check, and the multi-DRM license service refuses to issue the requested license, as shown in the following screenshot. The detailed description is "License acquire failed," which is by design.
-
-![Unentitled users](./media/media-services-cenc-with-multidrm-access-control/media-services-unentitledusers.png)
-
-### Run a custom security token service
-If you run a custom STS, the JWT is issued by the custom STS by using either a symmetric or an asymmetric key.
-
-The following screenshot shows a scenario that uses a symmetric key (using Chrome):
-
-![Custom STS with a symmetric key](./media/media-services-cenc-with-multidrm-access-control/media-services-running-sts1.png)
-
-The following screenshot shows a scenario that uses an asymmetric key via an X509 certificate (using a Microsoft modern browser):
-
-![Custom STS with an asymmetric key](./media/media-services-cenc-with-multidrm-access-control/media-services-running-sts2.png)
-
-In both of the previous cases, user authentication stays the same. It takes place through Azure AD. The only difference is that JWTs are issued by the custom STS instead of Azure AD. When you configure dynamic CENC protection, the license delivery service restriction specifies the type of JWT, either a symmetric or an asymmetric key.
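-
-For illustration, the following is a minimal sketch of how a custom STS might issue a JWT signed with a symmetric key, using the `System.IdentityModel.Tokens.Jwt` NuGet package. The key, issuer, audience, and claim values are placeholders and must match whatever is configured in the license delivery service's token restriction.
-
-```csharp
-using System;
-using System.IdentityModel.Tokens.Jwt;
-using System.Security.Claims;
-using Microsoft.IdentityModel.Tokens;
-
-class CustomStsSketch
-{
-    static string IssueToken()
-    {
-        // Placeholder key; must match the primary verification key configured
-        // in the token restriction.
-        var signingKey = new SymmetricSecurityKey(
-            Convert.FromBase64String("<base64-encoded-256-bit-key>"));
-        var credentials = new SigningCredentials(
-            signingKey, SecurityAlgorithms.HmacSha256);
-
-        var token = new JwtSecurityToken(
-            issuer: "https://my-custom-sts.example.com",  // placeholder issuer
-            audience: "urn:my-media-service",             // placeholder audience
-            claims: new[] { new Claim("sub", "testuser") },
-            notBefore: DateTime.UtcNow,
-            expires: DateTime.UtcNow.AddHours(1),
-            signingCredentials: credentials);
-
-        return new JwtSecurityTokenHandler().WriteToken(token);
-    }
-}
-```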
-
-## Summary
-
-This document discussed CENC with multi-native DRM and access control via token authentication, its design, and its implementation by using Azure, Media Services, and Media Player.
-
-* A reference design was presented that contains all the necessary components in a DRM/CENC subsystem.
-* A reference implementation was presented on Azure, Media Services, and Media Player.
-* Some topics directly involved in the design and implementation were also discussed.
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Media Services learning paths
-
-## Provide feedback
-
media-services Media Services Check Job Progress https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-check-job-progress.md
- Title: Monitor Job Progress using .NET
-description: Learn how to use event handler code to track job progress and send status updates. The code sample is written in C# and uses the Media Services SDK for .NET.
-Previously updated: 03/10/2021
-# Monitor Job Progress using .NET
-
-When you run jobs, you often require a way to track job progress. You can check the progress by defining a StateChanged event handler (as described in this topic) or using Azure Queue storage to monitor Media Services job notifications (as described in [this](media-services-dotnet-check-job-progress-with-queues.md) topic).
-
-## Define StateChanged event handler to monitor job progress
-
-The following code example defines the StateChanged event handler. This event handler tracks job progress and provides updated status, depending on the state. The code also defines the LogJobStop method. This helper method logs error details.
-
-```csharp
- private static void StateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
-
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("********************");
- Console.WriteLine("Job is finished.");
- Console.WriteLine("Please wait while local tasks or downloads complete...");
- Console.WriteLine("********************");
- Console.WriteLine();
- Console.WriteLine();
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
- // Cast sender as a job.
- IJob job = (IJob)sender;
- // Display or log error details as needed.
- LogJobStop(job.Id);
- break;
- default:
- break;
- }
- }
-
- private static void LogJobStop(string jobId)
- {
- StringBuilder builder = new StringBuilder();
- IJob job = GetJob(jobId);
-
- builder.AppendLine("\nThe job stopped due to cancellation or an error.");
- builder.AppendLine("***************************");
- builder.AppendLine("Job ID: " + job.Id);
- builder.AppendLine("Job Name: " + job.Name);
- builder.AppendLine("Job State: " + job.State.ToString());
- builder.AppendLine("Job started (server UTC time): " + job.StartTime.ToString());
- builder.AppendLine("Media Services account name: " + _accountName);
- builder.AppendLine("Media Services account location: " + _accountLocation);
- // Log job errors if they exist.
- if (job.State == JobState.Error)
- {
- builder.Append("Error Details: \n");
- foreach (ITask task in job.Tasks)
- {
- foreach (ErrorDetail detail in task.ErrorDetails)
- {
- builder.AppendLine(" Task Id: " + task.Id);
- builder.AppendLine(" Error Code: " + detail.Code);
- builder.AppendLine(" Error Message: " + detail.Message + "\n");
- }
- }
- }
- builder.AppendLine("***************************\n");
- // Write the output to a local file and to the console. The template
- // for an error output file is: JobStop-{JobId}.txt
- string outputFile = _outputFilesFolder + @"\JobStop-" + JobIdAsFileName(job.Id) + ".txt";
- WriteToFile(outputFile, builder.ToString());
- Console.Write(builder.ToString());
- }
-
- private static string JobIdAsFileName(string jobID)
- {
- return jobID.Replace(":", "_");
- }
-```
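-
-To use the handler, wire it up before submitting the job. The following fragment is a sketch that assumes an `IJob` named `job` created from an authenticated `CloudMediaContext` (and `using` directives for `System.Threading` and `System.Threading.Tasks`):
-
-```csharp
-// Attach the handler before submitting so no state transitions are missed.
-job.StateChanged += new EventHandler<JobStateChangedEventArgs>(StateChanged);
-job.Submit();
-
-// Block until the job completes; progress is reported through StateChanged.
-Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
-progressJobTask.Wait();
-```
-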
-## Next step
-Review Media Services learning paths.
-
-## Provide feedback
media-services Media Services Cli Create And Configure Aad App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-cli-create-and-configure-aad-app.md
- Title: Use Azure CLI to create an Azure AD app and configure it to access Azure Media Services API | Microsoft Docs
-description: This topic shows how to use the Azure CLI to create an Azure AD app and configure it to access Azure Media Services API.
-Previously updated: 03/10/2021
-# Use Azure CLI to create an Azure AD app and configure it to access Media Services API
-
-This topic shows you how to use the Azure CLI to create an Azure Active Directory (Azure AD) application and service principal to access Azure Media Services resources.
-
-## Prerequisites
-- An Azure account. For details, see [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
-- A Media Services account. For more information, see [Create an Azure Media Services account using the Azure portal](media-services-portal-create-account.md).
-## Use the Azure Cloud Shell
-
-1. Sign in to the [Azure portal](https://portal.azure.com/).
-2. Launch the Cloud Shell from the upper navigation pane of the portal.
-
- ![Cloud Shell](./media/media-services-cli-create-and-configure-aad-app/media-services-cli-create-and-configure-aad-app01.png)
-
-For more information, see [Overview of Azure Cloud Shell](../../cloud-shell/overview.md).
-
-## Create an Azure AD app and configure access to the media account with Azure CLI
-
-```azurecli
-az login
-az ad sp create-for-rbac --name <appName> --role Contributor
-az role assignment create --assignee <user/app id> --role Contributor --scope <subscription/subscription id>
-```
-
-For example:
-
-```azurecli
-az role assignment create --assignee a3e068fa-f739-44e5-ba4d-ad57866e25a1 --role Contributor --scope /subscriptions/0b65e280-7917-4874-9fed-1307f2615ea2/resourceGroups/Default-AzureBatch-SouthCentralUS/providers/microsoft.media/mediaservices/sbbash
-```
-
-In this example, the **scope** is the full resource path for the media services account. However, the **scope** can be at any level.
-
-For example, it could be one of the following levels:
-
-* The **subscription** level.
-* The **resource group** level.
-* The **resource** level (for example, a Media account).
-
-For more information, see [Create an Azure service principal with the Azure CLI](/cli/azure/create-an-azure-service-principal-azure-cli).
-
-Also see [Add or remove Azure role assignments using Azure CLI](../../role-based-access-control/role-assignments-cli.md).
-
-## Next steps
-
-Get started with [uploading files to your account](media-services-portal-upload-files.md).
media-services Media Services Community https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-community.md
- Title: Azure Media Services Community Overview | Microsoft Docs
-description: 'This Azure Media Services (AMS) community page discusses different ways you can get updates about AMS, see new videos and podcasts, ask questions and give feedback. '
-Previously updated: 03/10/2021
-# Azure Media Services Community
-
-This Azure Media Services (AMS) community page discusses different ways you can get updates about AMS. You can also view new videos and podcasts, ask questions and give feedback.
-
-## Videos and Podcasts
-- [Protecting your Media Content with DRM](https://azure.microsoft.com/documentation/videos/azurefridayprotectingyourmediacontentdrm/)
-- [Protecting your Media Content with AES Encryption](https://azure.microsoft.com/documentation/videos/azure-media-services-protecting-your-media-content-with-aes-encryption/)
-- [Azure Media Services Developer Deep Dive](https://azure.microsoft.com/documentation/videos/build-2015-azure-media-services-developer-deep-dive/)
-- [Azure Media Indexer automatically creates transcripts for your media](https://azure.microsoft.com/documentation/videos/azure-media-indexer-autoatically-creates-transcripts-for-your-media-with-adarsh-solanki/)
-## Provide feedback and make suggestions
-
-## Discussion
-
-### Twitter
-
-Use the [@MSFTAzureMedia](https://twitter.com/MSFTAzureMedia) Twitter handle to contact us or follow updates on Twitter.
-You can use the [@AzureSupport](https://twitter.com/azuresupport) Twitter handle to request support on Twitter.
-
-### Online forums
-
-The following forums can be used for asking questions about current products and features.
-
-Currently, MSDN is the Media Services team's primary community forum.
-
-[:::image type="icon" source="./media/media-services-community/msdn.png" border="false":::](/answers/topics/azure-media-services.html)
-
-The team also monitors questions tagged on Stack Overflow with 'azure-media-services'.
-
-[:::image type="icon" source="./media/media-services-community/stack-overflow.png" border="false":::](https://stackoverflow.com/questions/tagged/azure-media-services)
-
-## Need help?
-
-You can open a support ticket by navigating to [New support request](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest).
-
-## Next steps
-
-[Overview](media-services-overview.md)
media-services Media Services Concepts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-concepts.md
- Title: Azure Media Services concepts | Microsoft Docs
-description: This article gives a brief overview of Microsoft Azure Media Services concepts and links to other articles for details.
-Previously updated: 03/10/2021
-# Azure Media Services concepts
-
-This topic gives an overview of the most important Media Services concepts.
-
-## <a name="assets"></a>Assets and Storage
-### Assets
-An [Asset](/rest/api/media/operations/asset) contains digital files (including video, audio, images, thumbnail collections, text tracks, and closed caption files) and the metadata about these files. After the digital files are uploaded into an asset, they can be used in the Media Services encoding and streaming workflows.
-
-An asset is mapped to a blob container in the Azure Storage account and the files in the asset are stored as block blobs in that container. Page blobs are not supported by Azure Media Services.
-
-When deciding what media content to upload and store in an asset, the following considerations apply:
-
-* An asset should contain only a single, unique instance of media content. For example, a single edit of a TV episode, movie, or advertisement.
-* An asset should not contain multiple renditions or edits of an audiovisual file. One example of an improper usage of an Asset would be attempting to store more than one TV episode, advertisement, or multiple camera angles from a single production inside an asset. Storing multiple renditions or edits of an audiovisual file in an asset can result in difficulties submitting encoding jobs, streaming and securing the delivery of the asset later in the workflow.
-
-### Asset file
-An [AssetFile](/rest/api/media/operations/assetfile) represents an actual video or audio file that is stored in a blob container. An asset file is always associated with an asset, and an asset may contain one or many files. The Media Services Encoder task fails if an asset file object is not associated with a digital file in a blob container.
-
-The **AssetFile** instance and the actual media file are two distinct objects. The AssetFile instance contains metadata about the media file, while the media file contains the actual media content.
-
-You should not attempt to change the contents of blob containers that were generated by Media Services without using Media Service APIs.
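-
-As a sketch, creating an asset, adding an asset file, and uploading the media with the Media Services SDK for .NET looks like the following; `context` is assumed to be an authenticated `CloudMediaContext`, and the names and local path are placeholders:
-
-```csharp
-// Create an empty asset, register an AssetFile in it, and upload the media.
-IAsset asset = context.Assets.Create("BigBuckBunny", AssetCreationOptions.None);
-IAssetFile assetFile = asset.AssetFiles.Create("BigBuckBunny.mp4");
-assetFile.Upload(@"C:\media\BigBuckBunny.mp4");  // placeholder local path
-```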
-
-### Asset encryption options
-Depending on the type of content you want to upload, store, and deliver, Media Services provides various encryption options that you can choose from.
-
-**None** – No encryption is used. This is the default value. When using this option, your content is not protected in transit or at rest in storage.
-
-If you plan to deliver an MP4 using progressive download, use this option to upload your content.
-
-**StorageEncrypted** – Use this option to encrypt your clear content locally using AES 256-bit encryption and then upload it to Azure Storage where it is stored encrypted at rest. Assets protected with storage encryption are automatically unencrypted and placed in an encrypted file system prior to encoding, and optionally re-encrypted prior to uploading back as a new output asset. The primary use case for storage encryption is when you want to secure your high quality input media files with strong encryption at rest on disk.
-
-In order to deliver a storage encrypted asset, you must configure the asset's delivery policy so Media Services knows how you want to deliver your content. Before your asset can be streamed, the streaming server removes the storage encryption and streams your content using the specified delivery policy (for example, AES, PlayReady, or no encryption).
-
-**CommonEncryptionProtected** – Use this option if you want to encrypt (or upload already encrypted) content with Common Encryption or PlayReady DRM (for example, Smooth Streaming protected with PlayReady DRM).
-
-**EnvelopeEncryptionProtected** – Use this option if you want to protect (or upload already protected) HTTP Live Streaming (HLS) encrypted with Advanced Encryption Standard (AES). If you are uploading HLS already encrypted with AES, it must have been encrypted by Transform Manager.
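-
-In the .NET SDK, the encryption option is chosen when the asset is created. For example, a brief sketch of creating a storage-encrypted asset (assuming an authenticated `CloudMediaContext` named `context`):
-
-```csharp
-// AssetCreationOptions selects one of the asset encryption options above.
-IAsset encryptedAsset = context.Assets.Create(
-    "SecureMezzanine", AssetCreationOptions.StorageEncrypted);
-```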
-
-### Access policy
-An [AccessPolicy](/rest/api/media/operations/accesspolicy) defines permissions (like read, write, and list) and duration of access to an asset. You would usually pass an AccessPolicy object to a locator that would then be used to access the files contained in an asset.
-
->[!NOTE]
->There is a limit of 1,000,000 policies for different AMS policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). You should use the same policy ID if you are always using the same days / access permissions, for example, policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) topic.
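-
-A minimal sketch of creating an access policy and pairing it with a SAS locator for uploading; the name and duration are illustrative:
-
-```csharp
-// A two-hour write policy, paired with a SAS locator for uploading files.
-IAccessPolicy writePolicy = context.AccessPolicies.Create(
-    "Upload policy", TimeSpan.FromHours(2), AccessPermissions.Write);
-ILocator uploadLocator = context.Locators.CreateLocator(
-    LocatorType.Sas, asset, writePolicy);
-```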
-
-### Blob container
-A blob container provides a grouping of a set of blobs. Blob containers are used in Media Services as a boundary point for access control, and Shared Access Signature (SAS) locators on assets. An Azure Storage account can contain an unlimited number of blob containers. A container can store an unlimited number of blobs.
-
->[!NOTE]
-> You should not attempt to change the contents of blob containers that were generated by Media Services without using Media Service APIs.
->
->
-
-### <a name="locators"></a>Locators
-[Locator](/rest/api/media/operations/locator)s provide an entry point to access the files contained in an asset. An access policy is used to define the permissions and duration that a client has access to a given asset. Locators can have a many-to-one relationship with an access policy, such that different locators can provide different start times and connection types to different clients while all using the same permission and duration settings. However, because of a shared access policy restriction set by Azure Storage services, you cannot have more than five unique locators associated with a given asset at one time.
-
-Media Services supports two types of locators: OnDemandOrigin locators, used to stream media (for example, MPEG DASH, HLS, or Smooth Streaming) or progressively download media, and SAS URL locators, used to upload or download media files to/from Azure Storage.
-
->[!NOTE]
->The list permission (AccessPermissions.List) should not be used when creating an OnDemandOrigin locator.
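-
-As a sketch, an OnDemandOrigin locator for streaming is created with a read-only policy and no List permission, per the note above:
-
-```csharp
-// A 30-day read policy and an OnDemandOrigin locator for streaming.
-IAccessPolicy streamingPolicy = context.AccessPolicies.Create(
-    "Streaming policy", TimeSpan.FromDays(30), AccessPermissions.Read);
-ILocator originLocator = context.Locators.CreateLocator(
-    LocatorType.OnDemandOrigin, asset, streamingPolicy);
-```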
-
-### Storage account
-All access to Azure Storage is done through a storage account. A Media Services account can associate with one or more storage accounts. An account can contain an unlimited number of containers, as long as their total size is under 500 TB per storage account. Media Services provides SDK-level tooling to allow you to manage multiple storage accounts and load balance the distribution of your assets during upload to these accounts based on metrics or random distribution. For more information, see [Working with Azure Storage](/previous-versions/azure/dn767951(v=azure.100)).
-
-## Jobs and tasks
-A [job](/rest/api/media/operations/job) is typically used to process (for example, index or encode) one audio/video presentation. If you are processing multiple videos, create a job for each video to be encoded.
-
-A job contains metadata about the processing to be performed. Each job contains one or more [task](/rest/api/media/operations/task)s that specify an atomic processing task, its input Assets, output Assets, a media processor and its associated settings. Tasks within a job can be chained together, where the output asset of one task is given as the input asset to the next task. In this way one job can contain all of the processing necessary for a media presentation.
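-
-A single-task encoding job might look like the following sketch; the job and task names, the input asset, and the use of the built-in "Adaptive Streaming" preset are illustrative. A chained job would add more tasks and feed one task's output asset into the next task's inputs.
-
-```csharp
-// Create a job with one encoding task and submit it.
-// Assumes using System.Linq; and an authenticated CloudMediaContext.
-IJob job = context.Jobs.Create("Encoding job");
-IMediaProcessor processor = context.MediaProcessors
-    .Where(p => p.Name == "Media Encoder Standard")
-    .ToList()
-    .OrderBy(p => new Version(p.Version))
-    .Last();
-ITask task = job.Tasks.AddNew(
-    "Encode to adaptive bitrate MP4s",
-    processor,
-    "Adaptive Streaming",  // built-in encoding preset
-    TaskOptions.None);
-task.InputAssets.Add(asset);
-task.OutputAssets.AddNew("Encoded output", AssetCreationOptions.None);
-job.Submit();
-```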
-
-## <a id="encoding"></a>Encoding
-Azure Media Services provides multiple options for the encoding of media in the cloud.
-
-When starting out with Media Services, it is important to understand the difference between codecs and file formats.
-Codecs are the software that implements the compression/decompression algorithms, whereas file formats are containers that hold the compressed video.
-
-Media Services provides dynamic packaging which allows you to deliver your adaptive bitrate MP4 or Smooth Streaming encoded content in streaming formats supported by Media Services (MPEG DASH, HLS, Smooth Streaming) without you having to re-package into these streaming formats.
-
-To take advantage of [dynamic packaging](media-services-dynamic-packaging-overview.md), you need to encode your mezzanine (source) file into a set of adaptive bitrate MP4 files or adaptive bitrate Smooth Streaming files and have at least one standard or premium streaming endpoint in started state.
-
-Media Services supports the following on-demand encoder that is described in this article:
-
-* [Media Encoder Standard](media-services-encode-asset.md#media-encoder-standard)
-
-For information about this supported encoder, see [Encoder](media-services-encode-asset.md).
-
-## Live Streaming
-In Azure Media Services, a Channel represents a pipeline for processing live streaming content. A Channel receives live input streams in one of two ways:
-
-* An on-premises live encoder sends multi-bitrate RTMP or Smooth Streaming (Fragmented MP4) to the Channel. You can use the following live encoders that output multi-bitrate Smooth Streaming: MediaExcel, Ateme, Imagine Communications, Envivio, Cisco and Elemental. The following live encoders output RTMP: Adobe Flash Live Encoder, [Telestream Wirecast](media-services-configure-wirecast-live-encoder.md), Teradek, Haivision encoders. The ingested streams pass through Channels without any further transcoding and encoding. When requested, Media Services delivers the stream to customers.
-* A single bitrate stream (in one of the following formats: RTMP, or Smooth Streaming (Fragmented MP4)) is sent to the Channel that is enabled to perform live encoding with Media Services. The Channel then performs live encoding of the incoming single bitrate stream to a multi-bitrate (adaptive) video stream. When requested, Media Services delivers the stream to customers.
-
-### Channel
-In Media Services, [Channel](/rest/api/media/operations/channel)s are responsible for processing live streaming content. A Channel provides an input endpoint (ingest URL) that you then provide to a live transcoder. The channel receives live input streams from the live transcoder and makes them available for streaming through one or more StreamingEndpoints. Channels also provide a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.
-
-You can get the ingest URL and the preview URL when you create the channel. To get these URLs, the channel does not have to be in the started state. When you are ready to start pushing data from a live transcoder into the channel, the channel must be started. Once the live transcoder starts ingesting data, you can preview your stream.
-
-Each Media Services account can contain multiple Channels, multiple Programs, and multiple StreamingEndpoints. Depending on the bandwidth and security needs, StreamingEndpoint services can be dedicated to one or more channels. Any StreamingEndpoint can pull from any Channel.
-
-### Program (event)
-A [Program (event)](/rest/api/media/operations/program) enables you to control the publishing and storage of segments in a live stream. Channels manage Programs (events). The Channel and Program relationship is similar to traditional media where a channel has a constant stream of content and a program is scoped to some timed event on that channel.
-You can specify the number of hours you want to retain the recorded content for the program by setting the **ArchiveWindowLength** property. This value can be set from a minimum of 5 minutes to a maximum of 25 hours.
-
-ArchiveWindowLength also dictates the maximum amount of time clients can seek back in time from the current live position. Programs can run over the specified amount of time, but content that falls behind the window length is continuously discarded. The value of this property also determines how long the client manifests can grow.
-
-Each program (event) is associated with an Asset. To publish the program you must create a locator for the associated asset. Having this locator will enable you to build a streaming URL that you can provide to your clients.
-
-A channel supports up to three concurrently running programs so you can create multiple archives of the same incoming stream. This allows you to publish and archive different parts of an event as needed. For example, suppose your business requirement is to archive 6 hours of a program, but to broadcast only the last 10 minutes. To accomplish this, you need to create two concurrently running programs. One program is set to archive 6 hours of the event but is not published. The other program is set to archive for 10 minutes and is published.
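-
-A hedged sketch of creating and starting a program against an existing channel; the program name, archive window, and asset are placeholders:
-
-```csharp
-// Create a program that archives six hours of the live stream into the asset,
-// then start it so it begins recording and can be published.
-IProgram program = channel.Programs.Create(
-    "SixHourArchive",       // placeholder program name
-    TimeSpan.FromHours(6),  // ArchiveWindowLength
-    asset.Id);
-program.Start();
-```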
-
-For more information, see:
-
-* [Working with Channels that are Enabled to Perform Live Encoding with Azure Media Services](media-services-manage-live-encoder-enabled-channels.md)
-* [Working with Channels that Receive Multi-bitrate Live Stream from On-premises Encoders](media-services-live-streaming-with-onprem-encoders.md)
-* [Quotas and limitations](media-services-quotas-and-limitations.md).
-
-## Protecting content
-### Dynamic encryption
-Azure Media Services enables you to secure your media from the time it leaves your computer through storage, processing, and delivery. Media Services allows you to deliver your content encrypted dynamically with Advanced Encryption Standard (AES) (using 128-bit encryption keys) and common encryption (CENC) using PlayReady and/or Widevine DRM. Media Services also provides a service for delivering AES keys and PlayReady licenses to authorized clients.
-
-Currently, you can encrypt the following streaming formats: HLS, MPEG DASH, and Smooth Streaming. You cannot encrypt progressive downloads.
-
-If you want Media Services to encrypt an asset, you need to associate an encryption key (CommonEncryption or EnvelopeEncryption) with your asset and also configure authorization policies for the key.
-
-If you want to stream a storage encrypted asset, you must configure the asset's delivery policy in order to specify how you want to deliver your asset.
-
-When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content using an envelope encryption (with AES) or common encryption (with PlayReady or Widevine). To decrypt the stream, the player will request the key from the key delivery service. To decide whether or not the user is authorized to get the key, the service evaluates the authorization policies that you specified for the key.
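-
-For example, a sketch of generating an envelope (AES) content key and associating it with an asset:
-
-```csharp
-// Generate a random 128-bit key and link it to the asset for
-// envelope (AES) dynamic encryption.
-byte[] keyBytes = new byte[16];
-using (var rng = System.Security.Cryptography.RandomNumberGenerator.Create())
-{
-    rng.GetBytes(keyBytes);
-}
-IContentKey contentKey = context.ContentKeys.Create(
-    Guid.NewGuid(), keyBytes, "Envelope content key",
-    ContentKeyType.EnvelopeEncryption);
-asset.ContentKeys.Add(contentKey);
-```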
-
-### Token restriction
-The content key authorization policy could have one or more authorization restrictions: open, token restriction, or IP restriction. The token restricted policy must be accompanied by a token issued by a Secure Token Service (STS). Media Services supports tokens in the Simple Web Tokens (SWT) format and JSON Web Token (JWT) format. Media Services does not provide Secure Token Services. You can create a custom STS. The STS must be configured to create a token signed with the specified key and issue claims that you specified in the token restriction configuration. The Media Services key delivery service will return the requested key (or license) to the client if the token is valid and the claims in the token match those configured for the key (or license).
-
-When configuring the token restricted policy, you must specify the primary verification key, issuer and audience parameters. The primary verification key contains the key that the token was signed with, issuer is the secure token service that issues the token. The audience (sometimes called scope) describes the intent of the token or the resource the token authorizes access to. The Media Services key delivery service validates that these values in the token match the values in the template.
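-
-These three values are expressed in a token restriction template, which is then serialized into the restriction's requirements. A sketch using the .NET SDK (the issuer and audience values are placeholders, and their property types vary across SDK versions):
-
-```csharp
-// Build and serialize a JWT token restriction; the serialized string becomes
-// the Requirements of a ContentKeyAuthorizationPolicyRestriction.
-TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.JWT)
-{
-    PrimaryVerificationKey = new SymmetricVerificationKey(), // generates a key
-    Issuer = "https://my-custom-sts.example.com",            // placeholder
-    Audience = "urn:my-media-service"                        // placeholder
-};
-string requirements = TokenRestrictionTemplateSerializer.Serialize(template);
-```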
-
-For more information, see the following articles:
-- [Protect content overview](media-services-content-protection-overview.md)
-- [Protect with AES-128](media-services-playready-license-template-overview.md)
-- [Protect with PlayReady/Widevine](media-services-protect-with-playready-widevine.md)
-## Delivering
-### <a name="dynamic_packaging"></a>Dynamic packaging
-When working with Media Services, it is recommended to encode your mezzanine files into an adaptive bitrate MP4 set and then convert the set to the desired format using the [Dynamic Packaging](media-services-dynamic-packaging-overview.md).
-
-### Streaming endpoint
-A StreamingEndpoint represents a streaming service that can deliver content directly to a client player application, or to a Content Delivery Network (CDN) for further distribution (Azure Media Services now provides Azure CDN integration). The outbound stream from a streaming endpoint service can be a live stream, or a video on-demand asset in your Media Services account. Media Services customers choose either a **Standard** streaming endpoint or one or more **Premium** streaming endpoints, according to their needs.
-
-Standard streaming endpoints are suitable for most streaming workloads. They offer the flexibility to deliver your content to virtually every device through dynamic packaging into HLS, MPEG-DASH, and Smooth Streaming, as well as dynamic encryption for Microsoft PlayReady, Google Widevine, Apple FairPlay, and AES-128. They also scale from very small to very large audiences with thousands of concurrent viewers through Azure CDN integration. If you have an advanced workload, your streaming capacity requirements don't fit the standard streaming endpoint throughput targets, or you want to control the capacity of the StreamingEndpoint service to handle growing bandwidth needs, it is recommended to allocate scale units (also known as premium streaming units).
-
-It is recommended to use dynamic packaging and/or dynamic encryption.
-
->[!NOTE]
->When your AMS account is created a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-For more information, see [this](media-services-portal-manage-streaming-endpoints.md) topic.
-
-By default, you can have up to two streaming endpoints in your Media Services account. To request a higher limit, see [Quotas and limitations](media-services-quotas-and-limitations.md).
-
-You are only billed when your StreamingEndpoint is in the running state.
-
-### Asset delivery policy
-One of the steps in the Media Services content delivery workflow is configuring [delivery policies for assets](/rest/api/media/operations/assetdeliverypolicy) that you want to be streamed. The asset delivery policy tells Media Services how you want your asset to be delivered: into which streaming protocol your asset should be dynamically packaged (for example, MPEG DASH, HLS, Smooth Streaming, or all), whether or not you want to dynamically encrypt your asset, and how (envelope or common encryption).
-
-If you have a storage encrypted asset, before your asset can be streamed, the streaming server removes the storage encryption and streams your content using the specified delivery policy. For example, to deliver your asset encrypted with Advanced Encryption Standard (AES) encryption key, set the policy type to DynamicEnvelopeEncryption. To remove storage encryption and stream the asset in the clear, set the policy type to NoDynamicEncryption.
-
-### Progressive download
-Progressive download allows you to start playing media before the entire file has been downloaded. You can only progressively download an MP4 file.
-
->[!NOTE]
->You must decrypt encrypted assets if you want them to be available for progressive download.
-
-To provide users with progressive download URLs, you first must create an OnDemandOrigin locator. Creating the locator gives you the base path to the asset. You then need to append the name of the MP4 file. For example:
-
-`http://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny_H264_650kbps_AAC_und_ch2_96kbps.mp4`
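-
-In code, composing the URL is a matter of concatenating the locator path and the MP4 file name; a brief sketch (`originLocator` is an OnDemandOrigin locator created for the asset, and the file name is a placeholder):
-
-```csharp
-// Base path from the locator plus the MP4 file name.
-string progressiveDownloadUrl =
-    originLocator.Path + "BigBuckBunny_H264_650kbps_AAC_und_ch2_96kbps.mp4";
-```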
-
-### Streaming URLs
-To provide users with streaming URLs, you first must create an OnDemandOrigin locator. Creating the locator gives you the base path to the asset that contains the content you want to stream. However, to be able to stream this content you need to modify this path further. To construct a full URL to the streaming manifest file, concatenate the locator's Path value and the manifest (filename.ism) file name. Then, append /Manifest and an appropriate format (if needed) to the locator path.
-
-You can also stream your content over a TLS connection. To do this, make sure your streaming URLs start with HTTPS. Currently, AMS doesn't support TLS with custom domains.
-
->[!NOTE]
->You can only stream over TLS if the streaming endpoint from which you deliver your content was created after September 10th, 2014. If your streaming URLs are based on the streaming endpoints created after September 10th, the URL contains "streaming.mediaservices.windows.net" (the new format). Streaming URLs that contain "origin.mediaservices.windows.net" (the old format) do not support TLS. If your URL is in the old format and you want to be able to stream over TLS, create a new streaming endpoint. Use URLs created based on the new streaming endpoint to stream your content over TLS.
-
-The following list describes different streaming formats and gives examples:
-
-* Smooth Streaming
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest
-
-http:\//testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest
-
-* MPEG DASH
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=mpd-time-csf)
-
-http:\//testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf)
-
-* Apple HTTP Live Streaming (HLS) V4
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=m3u8-aapl)
-
-http:\//testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=m3u8-aapl)
-
-* Apple HTTP Live Streaming (HLS) V3
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=m3u8-aapl-v3)
-
-http:\//testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=m3u8-aapl-v3)
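-
-Putting this together in code, the following sketch finds the .ism manifest in the asset and composes the per-protocol URLs (assuming `using System.Linq;` and an OnDemandOrigin locator named `originLocator`):
-
-```csharp
-// Find the manifest file and compose streaming URLs for each protocol.
-string manifestName = asset.AssetFiles.ToList()
-    .First(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase))
-    .Name;
-string smoothUrl = originLocator.Path + manifestName + "/Manifest";
-string dashUrl = smoothUrl + "(format=mpd-time-csf)";
-string hlsV4Url = smoothUrl + "(format=m3u8-aapl)";
-string hlsV3Url = smoothUrl + "(format=m3u8-aapl-v3)";
-```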
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Media Services learning paths
-
-## Provide feedback
media-services Media Services Configure Kb Live Encoder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-configure-kb-live-encoder.md
- Title: Configure the Haivision KB encoder to send a single bitrate live stream to Azure | Microsoft Docs
-description: 'This topic shows how to configure the Haivision KB live encoder to send a single bitrate stream to AMS channels that are enabled for live encoding.'
-Previously updated: 03/10/2021
-# Use the Haivision KB live encoder to send a single bitrate live stream
-
-> [!div class="op_single_selector"]
-> * [Haivision](media-services-configure-kb-live-encoder.md)
-> * [Wirecast](media-services-configure-wirecast-live-encoder.md)
-
-This topic shows how to configure the [Haivision KB live encoder](https://www.haivision.com/products/kb-series/) to send a single bitrate stream to AMS channels that are enabled for live encoding. For more information, see [Working with Channels that are Enabled to Perform Live Encoding with Azure Media Services](media-services-manage-live-encoder-enabled-channels.md).
-
-This tutorial shows how to manage Azure Media Services (AMS) with the Azure Media Services Explorer (AMSE) tool. This tool runs only on Windows PCs. If you are on Mac or Linux, use the Azure portal to create [channels](media-services-portal-creating-live-encoder-enabled-channel.md#create-a-channel) and [programs](media-services-portal-creating-live-encoder-enabled-channel.md).
-
-## Prerequisites
* Access to a Haivision KB encoder, running software version 5.01 or greater.
-* [Create an Azure Media Services account](media-services-portal-create-account.md)
-* Ensure there is a Streaming Endpoint running. For more information, see [Manage Streaming Endpoints in a Media Services Account](media-services-portal-manage-streaming-endpoints.md)
-* Install the latest version of the [AMSE](https://github.com/Azure/Azure-Media-Services-Explorer) tool.
-* Launch the tool and connect to your AMS account.
-
-## Tips
-* Whenever possible, use a hardwired internet connection.
-* A good rule of thumb when determining bandwidth requirements is to double the streaming bitrates. While this is not a mandatory requirement, it helps mitigate the impact of network congestion.
-* When using software-based encoders, close out any unnecessary programs.
-
-## Create a channel
-1. In the AMSE tool, navigate to the **Live** tab, and right-click within the channel area. Select **Create channel…** from the menu.
-![Haivision](./media/media-services-configure-kb-live-encoder/channel.png)
-2. Specify a channel name; the description field is optional. Under Channel Settings, select **Standard** for the Live Encoding option, with the Input Protocol set to **RTMP**. You can leave all other settings as is. Make sure **Start the new channel now** is selected.
-3. Click **Create Channel**.
-![Haivision](./media/media-services-configure-kb-live-encoder/livechannel.png)
-
-> [!NOTE]
-> The channel can take as long as 20 minutes to start.
-
-## Configure the Haivision KB encoder
-In this tutorial, the following output settings are used. The rest of this section describes configuration steps in more detail.
-
-Video:
-- Codec: H.264
-- Profile: High (Level 4.0)
-- Bitrate: 5000 kbps
-- Keyframe: 2 seconds (60 frames)
-- Frame Rate: 30
-Audio:
-- Codec: AAC (LC)
-- Bitrate: 192 kbps
-- Sample Rate: 44.1 kHz
-## Configuration steps
-1. Log in to the Haivision KB user interface.
-2. Click on the **Menu Button** in the channel control center and select **Add Channel**.
- ![Screenshot 2017-08-14 at 9.15.09 AM](./media/media-services-configure-kb-live-encoder/step2.png)
-3. Type the **Channel Name** in the Name field and click next.
- ![Screenshot 2017-08-14 at 9.19.07 AM](./media/media-services-configure-kb-live-encoder/step3.png)
-4. Select the **Channel Input Source** from the **Input Source** drop-down and click next.
- ![Screenshot 2017-08-14 at 9.20.44 AM](./media/media-services-configure-kb-live-encoder/step4.png)
-5. From the **Encoder Template** drop-down choose **H264-720-AAC-192** and click next.
- ![Screenshot 2017-08-14 at 9.23.15 AM](./media/media-services-configure-kb-live-encoder/step5.png)
-6. From the **Select New Output** drop-down choose **RTMP** and click next.
- ![Screenshot 2017-08-14 at 9.27.51 AM](./media/media-services-configure-kb-live-encoder/step6.png)
-7. From the **Channel Output** window, populate the Azure stream information. Paste the **RTMP** link from the initial channel setup in the **Server** area. In the **Output Name** area type in the name of the channel. In the Stream Name Template area, use the template RTMPStreamName_%video_bitrate% to name the stream.
- ![Screenshot 2017-08-14 at 9.33.17 AM](./media/media-services-configure-kb-live-encoder/step7.png)
-8. Click next and then click Done.
-9. Click the **Play Button** to start the encoder channel.
- ![Haivision KB.png](./media/media-services-configure-kb-live-encoder/step9.png)
-
-## Test playback
-Navigate to the AMSE tool, and right-click the channel to be tested. From the menu, hover over **Playback the Preview** and select **with Azure Media Player**.
-
-If the stream appears in the player, then the encoder has been properly configured to connect to AMS.
-
-If an error is received, the channel needs to be reset and encoder settings adjusted. See the [troubleshooting](media-services-troubleshooting-live-streaming.md) article for guidance.
-
-## Create a program
-1. Once channel playback is confirmed, create a program. Under the Live tab in the AMSE tool, right-click within the program area and select Create New Program.
-![Haivision](./media/media-services-configure-kb-live-encoder/program.png)
-2. Name the program and, if needed, adjust the Archive Window Length (which defaults to four hours). You can also specify a storage location or leave it as the default.
-3. Check the Start the Program now box.
-4. Click Create Program.
-5. Once the program is running, confirm playback by right-clicking the program and navigating to Play back the program(s) and then selecting with Azure Media Player.
-6. Once confirmed, right-click the program again and select Copy the Output URL to Clipboard (or retrieve this information from the Program information and settings option from the menu).
-
-The stream is now ready to be embedded in a player, or distributed to an audience for live viewing.
-
-> [!NOTE]
-> Program creation takes less time than channel creation.
media-services Media Services Configure Wirecast Live Encoder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-configure-wirecast-live-encoder.md
- Title: Configure the Telestream Wirecast encoder to send a single bitrate live stream | Microsoft Docs
-description: 'This topic shows how to configure the Wirecast live encoder to send a single bitrate stream to AMS channels that are enabled for live encoding. '
-Previously updated: 03/10/2021
-# Use the Wirecast encoder to send a single bitrate live stream
-
-
-> [!div class="op_single_selector"]
-> * [Wirecast](media-services-configure-wirecast-live-encoder.md)
->
-
-This article shows how to configure the [Telestream Wirecast](https://www.telestream.net/wirecast/overview.htm) live encoder to send a single bitrate stream to AMS channels that are enabled for live encoding. For more information, see [Working with Channels that are Enabled to Perform Live Encoding with Azure Media Services](media-services-manage-live-encoder-enabled-channels.md).
-
-This tutorial shows how to manage Azure Media Services (AMS) with the Azure Media Services Explorer (AMSE) tool. This tool runs only on Windows PCs. If you are on Mac or Linux, use the Azure portal to create [channels](media-services-portal-creating-live-encoder-enabled-channel.md#create-a-channel) and [programs](media-services-portal-creating-live-encoder-enabled-channel.md).
-
-> [!NOTE]
-> Encoders must support TLS 1.2 when using RTMPS protocols. Use Wirecast version 13.0.2 or higher due to the TLS 1.2 requirement.
-
-## Prerequisites
-* [Create an Azure Media Services account](media-services-portal-create-account.md)
-* Ensure there is a Streaming Endpoint running. For more information, see [Manage Streaming Endpoints in a Media Services Account](media-services-portal-manage-streaming-endpoints.md)
-* Install the latest version of the [AMSE](https://github.com/Azure/Azure-Media-Services-Explorer) tool.
-* Launch the tool and connect to your AMS account.
-
-## Tips
-* Whenever possible, use a hardwired internet connection.
-* A good rule of thumb when determining bandwidth requirements is to double the streaming bitrates. While this is not a mandatory requirement, it helps mitigate the impact of network congestion.
-* When using software-based encoders, close out any unnecessary programs.
-
-## Create a channel
-1. In the AMSE tool, navigate to the **Live** tab, and right-click within the channel area. Select **Create channel…** from the menu.
-
- ![Screenshot shows Create channel selected from a menu.](./media/media-services-wirecast-live-encoder/media-services-wirecast1.png)
-
-2. Specify a channel name; the description field is optional. Under Channel Settings, select **Standard** for the Live Encoding option, with the Input Protocol set to **RTMP**. You can leave all other settings as is.
-
-   Make sure **Start the new channel now** is selected.
-
-3. Click **Create Channel**.
-
- ![Screenshot shows the Create a live channel dialog box.](./media/media-services-wirecast-live-encoder/media-services-wirecast2.png)
-
-> [!NOTE]
-> The channel can take as long as 20 minutes to start.
->
->
-
-While the channel is starting, you can [configure the encoder](media-services-configure-wirecast-live-encoder.md#configure_wirecast_rtmp).
-
-> [!IMPORTANT]
-> Billing starts as soon as Channel goes into a ready state. For more information, see [Channel's states](media-services-manage-live-encoder-enabled-channels.md#states).
->
->
-
-## <a id="configure_wirecast_rtmp" />Configure the Telestream Wirecast encoder
-In this tutorial, the following output settings are used. The rest of this section describes configuration steps in more detail.
-
-**Video**:
-
-* Codec: H.264
-* Profile: High (Level 4.0)
-* Bitrate: 5000 kbps
* Keyframe: 2 seconds (60 frames)
-* Frame Rate: 30
-
-**Audio**:
-
-* Codec: AAC (LC)
-* Bitrate: 192 kbps
-* Sample Rate: 44.1 kHz
-
-### Configuration steps
-1. Open the Telestream Wirecast application on the machine being used, and set up for RTMP streaming.
-2. Configure the output by navigating to the **Output** tab and selecting **Output Settings…**.
-
- Make sure the **Output Destination** is set to **RTMP Server**.
-3. Click **OK**.
-4. On the settings page, set the **Destination** field to be **Azure Media Services**.
-
- The Encoding profile is pre-selected to **Azure H.264 720p 16:9 (1280x720)**. To customize these settings, select the gear icon to the right of the drop-down, and then choose **New Preset**.
-
- ![Screenshot shows the Choose a template dialog box with BlobTrigger selected.](./media/media-services-wirecast-live-encoder/media-services-wirecast3.png)
-5. Configure encoder presets.
-
- Name the preset, and check for the following recommended settings:
-
- **Video**
-
- * Encoder: MainConcept H.264
- * Frames per Second: 30
- * Average bit rate: 5000 kbits/sec (Can be adjusted based on network limitations)
- * Profile: Main
- * Key frame every: 60 frames
-
- **Audio**
-
- * Target bit rate: 192 kbits/sec
- * Sample Rate: 44.100 kHz
-
- ![Screenshot shows the Encoder Preset for AzureTest1.](./media/media-services-wirecast-live-encoder/media-services-wirecast4.png)
-6. Press **Save**.
-
- The Encoding field now has the newly created profile available for selection.
-
- Make sure the new profile is selected.
-7. Get the channel's input URL in order to assign it to the Wirecast **RTMP Endpoint**.
-
- Navigate back to the AMSE tool, and check on the channel completion status. Once the State has changed from **Starting** to **Running**, you can get the input URL.
-
-   When the channel is running, right-click the channel name, navigate down to hover over **Copy Input URL to clipboard** and then select **Primary Input URL**.
-
- ![Screenshot shows the Copy Input U R L to clipboard option for Primary Input U R L.](./media/media-services-wirecast-live-encoder/media-services-wirecast6.png)
-8. In the Wirecast **Output Settings** window, paste this information in the **Address** field of the output section, and assign a stream name.
-
- ![Screenshot shows Output Settings.](./media/media-services-wirecast-live-encoder/media-services-wirecast5.png)
-
-9. Select **OK**.
-10. On the main **Wirecast** screen, confirm input sources for video and audio are ready and then hit **Stream** in the top left-hand corner.
-
- ![Screenshot shows the Wirecast Stream button.](./media/media-services-wirecast-live-encoder/media-services-wirecast7.png)
-
-> [!IMPORTANT]
-> Before you click **Stream**, you **must** ensure that the Channel is ready.
-> Also, make sure not to leave the Channel in a ready state without an input contribution feed for longer than 15 minutes.
->
->
-
-## Test playback
-
-Navigate to the AMSE tool, and right-click the channel to be tested. From the menu, hover over **Playback the Preview** and select **with Azure Media Player**.
-
-![Screenshot shows Playback the Preview with Azure Media Player option selected.](./media/media-services-wirecast-live-encoder/media-services-wirecast8.png)
-
-If the stream appears in the player, then the encoder has been properly configured to connect to AMS.
-
-If an error is received, the channel needs to be reset and encoder settings adjusted. See the [troubleshooting](media-services-troubleshooting-live-streaming.md) article for guidance.
-
-## Create a program
-1. Once channel playback is confirmed, create a program. Under the **Live** tab in the AMSE tool, right-click within the program area and select **Create New Program**.
-
- ![Screenshot shows the Create program option selected.](./media/media-services-wirecast-live-encoder/media-services-wirecast9.png)
-2. Name the program and, if needed, adjust the **Archive Window Length** (which defaults to four hours). You can also specify a storage location or leave it as the default.
-3. Check the **Start the Program now** box.
-4. Click **Create Program**.
-
- >[!NOTE]
- >Program creation takes less time than channel creation.
-
-5. Once the program is running, confirm playback by right-clicking the program and navigating to **Playback the program(s)** and then selecting **with Azure Media Player**.
-6. Once confirmed, right-click the program again and select **Copy the Output URL to Clipboard** (or retrieve this information from the **Program information and settings** option from the menu).
-
-The stream is now ready to be embedded in a player, or distributed to an audience for live viewing.
-
-## Troubleshooting
-See the [troubleshooting](media-services-troubleshooting-live-streaming.md) article for guidance.
-
-## Media Services learning paths
-
-## Provide feedback
media-services Media Services Content Protection Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-content-protection-overview.md
- Title: Protect your content with Azure Media Services | Microsoft Docs
-description: This article gives an overview of content protection with Azure Media Services v2.
-Previously updated: 03/10/2021
-# Content protection overview
-
-You can use Azure Media Services to secure your media from the time it leaves your computer through storage, processing, and delivery. With Media Services, you can deliver your live and on-demand content encrypted dynamically with Advanced Encryption Standard (AES-128) or any of the three major digital rights management (DRM) systems: Microsoft PlayReady, Google Widevine, and Apple FairPlay. Media Services also provides a service for delivering AES keys and DRM (PlayReady, Widevine, and FairPlay) licenses to authorized clients.
-
-The following image illustrates the Media Services content protection workflow:
-
-![Protect with PlayReady](./media/media-services-content-protection-overview/media-services-content-protection-with-multi-drm.png)
-
-This article explains concepts and terminology relevant to understanding content protection with Media Services. The article also provides links to articles that discuss how to protect content.
-
-## Dynamic encryption
-
-You can use Media Services to deliver your content encrypted dynamically with AES clear key or DRM encryption by using PlayReady, Widevine, or FairPlay. If content is encrypted with an AES clear key and is sent over HTTPS, it is not in the clear until it reaches the client.
-
-Each encryption method supports the following streaming protocols:
-
-- AES: MPEG-DASH, Smooth Streaming, and HLS
-- PlayReady: MPEG-DASH, Smooth Streaming, and HLS
-- Widevine: MPEG-DASH
-- FairPlay: HLS
-Encryption on progressive downloads is not supported.
-
-To encrypt an asset, you need to associate an encryption content key with your asset and also configure an authorization policy for the key. Content keys can be specified or automatically generated by Media Services.
-
-You also need to configure the asset's delivery policy. If you want to stream a storage-encrypted asset, make sure to specify how you want to deliver it by configuring the asset delivery policy.
-
-When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content by using AES clear key or DRM encryption. To decrypt the stream, the player requests the key from the Media Services key delivery service. To decide whether or not the user is authorized to get the key, the service evaluates the authorization policies that you specified for the key.
-
-## AES-128 clear key vs. DRM
-Customers often wonder whether they should use AES encryption or a DRM system. The primary difference between the two systems is that with AES encryption the content key is transmitted to the client in an unencrypted format ("in the clear"). As a result, the key used to encrypt the content can be viewed in a network trace on the client in plain text. AES-128 clear key encryption is suitable for use cases where the viewer is a trusted party (for example, encrypting corporate videos distributed within a company to be viewed by employees).
-
-PlayReady, Widevine, and FairPlay all provide a higher level of encryption compared to AES-128 clear key encryption. The content key is transmitted in an encrypted format. Additionally, decryption is handled in a secure environment at the operating system level, where it's more difficult for a malicious user to attack. DRM is recommended for use cases where the viewer might not be a trusted party and you require the highest level of security.
-
-## Storage encryption
-You can use storage encryption to encrypt your clear content locally by using AES 256-bit encryption. You then can upload it to Azure Storage, where it's stored encrypted at rest. Assets protected with storage encryption are automatically unencrypted and placed in an encrypted file system prior to encoding. The assets are optionally re-encrypted prior to uploading back as a new output asset. The primary use case for storage encryption is when you want to secure your high-quality input media files with strong encryption at rest on disk.
-
-To deliver a storage-encrypted asset, you must configure the asset's delivery policy so that Media Services knows how you want to deliver your content. Before your asset can be streamed, the streaming server decrypts and streams your content by using the specified delivery policy (for example, AES, common encryption, or no encryption).
-
-## Types of encryption
-PlayReady and Widevine utilize common encryption (AES CTR mode). FairPlay utilizes AES CBC-mode encryption. AES-128 clear key encryption utilizes envelope encryption.
-
-## Licenses and keys delivery service
-Media Services provides a key delivery service for delivering DRM (PlayReady, Widevine, FairPlay) licenses and AES keys to authorized clients. You can use [the Azure portal](media-services-portal-protect-content.md), the REST API, or the Media Services SDK for .NET to configure authorization and authentication policies for your licenses and keys.
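-
-Once a content key has an authorization policy attached, the key delivery endpoint for it can be retrieved with the .NET SDK; a brief sketch:
-
-```csharp
-// Get the key delivery URL a player uses to request the AES envelope key.
-Uri keyDeliveryUrl = contentKey.GetKeyDeliveryUrl(
-    ContentKeyDeliveryType.BaselineHttp);
-```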
-
-## Control content access
-You can control who has access to your content by configuring the content key authorization policy. The content key authorization policy supports either open or token restriction.
-
-### Open authorization
-With an open authorization policy, the content key is sent to any client (no restriction).
-
-### Token authorization
-With a token-restricted authorization policy, the content key is sent only to a client that presents a valid JSON Web Token (JWT) or simple web token (SWT) in the key/license request. This token must be issued by a security token service (STS). You can use Azure Active Directory as an STS or deploy a custom STS. The STS must be configured to create a token signed with the specified key and issue claims that you specified in the token restriction configuration. The Media Services key delivery service returns the requested key/license to the client if the token is valid and the claims in the token match those configured for the key/license.
-
-When you configure the token restricted policy, you must specify the primary verification key, issuer, and audience parameters. The primary verification key contains the key that the token was signed with. The issuer is the secure token service that issues the token. The audience, sometimes called scope, describes the intent of the token or the resource the token authorizes access to. The Media Services key delivery service validates that these values in the token match the values in the template.
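-
-For example, with the Media Services v2 .NET SDK, the token restriction requirements string can be generated from a **TokenRestrictionTemplate**. The following is a minimal sketch; the issuer and audience values are illustrative placeholders:
-
-```csharp
-using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
-
-// Build a token restriction template: the key the token must be signed with,
-// the expected issuer and audience, and any required claims.
-TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.JWT);
-template.PrimaryVerificationKey = new SymmetricVerificationKey();
-template.Issuer = "http://testissuer.com";   // illustrative STS issuer
-template.Audience = "urn:test";              // illustrative audience
-template.RequiredClaims.Add(TokenClaim.ContentKeyIdentifierClaim);
-
-// Serialize to the string used as the policy restriction's Requirements.
-string requirements = TokenRestrictionTemplateSerializer.Serialize(template);
-```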
-
-### Token replay prevention
-
-The *Token Replay Prevention* feature allows Media Services customers to set a limit on how many times the same token can be used to request a key or a license. The customer can add a claim of type `urn:microsoft:azure:mediaservices:maxuses` to the token, where the value is the number of times the token can be used to acquire a key or license.
-
-#### Considerations
-
-* Customers must have control over token generation. The claim needs to be placed in the token itself.
-* When using this feature, requests with tokens whose expiry time is more than one hour away from the time the request is received are rejected with an unauthorized response.
-* Tokens are uniquely identified by their signature. Any change to the payload (for example, update to the expiry time or the claim) changes the signature of the token and it will count as a new token that Key Delivery hasn't come across before.
-* Playback fails if the token has exceeded the `maxuses` value set by the customer.
-* This feature can be used for all existing protected content (only the token issued needs to be changed).
-* This feature works with both JWT and SWT.
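-
-For illustration, a token carrying the maxuses claim might be minted as follows. This is a sketch using the System.IdentityModel.Tokens.Jwt package (an assumption; any JWT library works), with placeholder key, issuer, and audience values:
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.IdentityModel.Tokens.Jwt;
-using System.Security.Claims;
-using Microsoft.IdentityModel.Tokens;
-
-static string CreateTestToken(byte[] signingKey)
-{
-    var credentials = new SigningCredentials(
-        new SymmetricSecurityKey(signingKey),
-        SecurityAlgorithms.HmacSha256);
-
-    var claims = new List<Claim>
-    {
-        // Allow this token to be used for at most two key/license requests.
-        new Claim("urn:microsoft:azure:mediaservices:maxuses", "2")
-    };
-
-    var token = new JwtSecurityToken(
-        issuer: "http://testissuer.com",
-        audience: "urn:test",
-        claims: claims,
-        notBefore: DateTime.UtcNow,
-        expires: DateTime.UtcNow.AddMinutes(30), // keep expiry within one hour
-        signingCredentials: credentials);
-
-    return new JwtSecurityTokenHandler().WriteToken(token);
-}
-```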
-
-## Streaming URLs
-If your asset was encrypted with more than one DRM, use an encryption tag in the streaming URL: (format='m3u8-aapl', encryption='xxx').
-
-The following considerations apply:
-
-* No more than one encryption type can be specified.
-* Encryption type doesn't have to be specified in the URL if only one encryption was applied to the asset.
-* Encryption type is case insensitive.
-* The following encryption types can be specified:
-
- * **cenc**: For PlayReady or Widevine (common encryption)
- * **cbcs-aapl**: For FairPlay (AES CBC encryption)
- * **cbc**: For AES envelope encryption
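-
-For example, a hypothetical HLS URL for an asset protected with FairPlay (in addition to another DRM) would carry the tag like this (the endpoint and file names are illustrative): `http://testendpoint-testaccount.streaming.mediaservices.windows.net/{locator ID}/BigBuckBunny.ism/Manifest(format=m3u8-aapl,encryption=cbcs-aapl)`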
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-The following articles describe next steps to help you get started with content protection:
-
-* [Protect with storage encryption](media-services-rest-storage-encryption.md)
-* [Protect with AES encryption](media-services-playready-license-template-overview.md)
-* [Protect with PlayReady and/or Widevine](media-services-protect-with-playready-widevine.md)
-* [Protect with FairPlay](media-services-protect-hls-with-FairPlay.md)
-
-## Related links
-
-* [JWT token authentication](http://www.gtrifonov.com/2015/01/03/jwt-token-authentication-in-azure-media-services-and-dynamic-encryption/)
-* [Integrate Azure Media Services OWIN MVC-based app with Azure Active Directory and restrict content key delivery based on JWT claims](http://www.gtrifonov.com/2015/01/24/mvc-owin-azure-media-services-ad-integration/)
-
-[content-protection]: ./media/media-services-content-protection-overview/media-services-content-protection.png
media-services Media Services Copying Existing Blob https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-copying-existing-blob.md
- Title: Copying blobs from a storage account into an Azure Media Services asset | Microsoft Docs
-description: This topic shows how to copy an existing blob into a Media Services Asset. The example uses Azure Media Services .NET SDK Extensions.
-Previously updated: 03/10/2021
-# Copying existing blobs into a Media Services Asset
---
-This article shows how to copy blobs from a storage account into a new Azure Media Services (AMS) asset using [Azure Media Services .NET SDK Extensions](https://github.com/Azure/azure-sdk-for-media-services-extensions/).
-
-You should not attempt to change the contents of blob containers that were generated by Media Services without using the Media Services APIs.
-
-The extension methods work with:
-- Regular assets.
-- Live archive assets (FragBlob format).
-- Source and destination assets belonging to different Media Services accounts (even across different data centers). However, there may be charges incurred by doing so. For more information about pricing, see [Data Transfers](https://azure.microsoft.com/pricing/#header-11).
-
-The article shows two code samples:
-
-1. Copy blobs from an asset in one AMS account into a new asset in another AMS account.
-2. Copy blobs from some storage account into a new asset in an AMS account.
-
-## Copy blobs between two AMS accounts
-
-### Prerequisites
-
-Two Media Services accounts. See the article [How to Create a Media Services Account](media-services-portal-create-account.md).
-
-### Download sample
-You can follow the steps in this article or you can download a sample that contains the code described in this article from [here](https://azure.microsoft.com/documentation/samples/media-services-dotnet-copy-blob-into-asset/).
-
-### Set up your project
-
-1. Set up your development environment as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-2. Add the appSettings section to the .config file and update the values based on your Media Services accounts, the destination storage account, and the source asset ID.
-
-```xml
-<appSettings>
- <add key="AMSSourceAADTenantDomain" value="tenant"/>
- <add key="AMSSourceRESTAPIEndpoint" value="endpoint"/>
-
- <add key="SourceAMSClientId" value="clientID"/>
- <add key="SourceAMSClientSecret" value="clientSecret"/>
-
- <add key="SourceAssetID" value="nb:cid:UUID:<id>"/>
-
- <add key="AMSDestAADTenantDomain" value="tenant"/>
- <add key="AMSDestRESTAPIEndpoint" value="endpoint"/>
-
- <add key="DestAMSClientId" value="clientID"/>
- <add key="DestAMSClientSecret" value="clientSecret"/>
-
- <add key="DestStorageAccountName" value="name"/>
- <add key="DestStorageAccountKey" value="key"/>
-
-</appSettings>
-```
-
-### Copy blobs from an asset in one AMS account into an asset in another AMS account
-
-The following code uses the **IAsset.Copy** extension method to copy all files from the source asset into the destination asset in a single call.
-
-```csharp
-using System;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System.Linq;
-using System.Configuration;
-using Microsoft.WindowsAzure.Storage.Auth;
-
-namespace CopyExistingBlobsIntoAsset
-{
- class Program
- {
- static string _sourceAADTenantDomain =
- ConfigurationManager.AppSettings["AMSSourceAADTenantDomain"];
- static string _sourceRESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSSourceRESTAPIEndpoint"];
- static string _sourceClientId =
- ConfigurationManager.AppSettings["SourceAMSClientId"];
- static string _sourceClientSecret =
- ConfigurationManager.AppSettings["SourceAMSClientSecret"];
-
- static string _destAADTenantDomain =
- ConfigurationManager.AppSettings["AMSDestAADTenantDomain"];
- static string _destRESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSDestRESTAPIEndpoint"];
- static string _destClientId =
- ConfigurationManager.AppSettings["DestAMSClientId"];
- static string _destClientSecret =
- ConfigurationManager.AppSettings["DestAMSClientSecret"];
-
- static string _destStorageAccountName =
- ConfigurationManager.AppSettings["DestStorageAccountName"];
- static string _destStorageAccountKey =
- ConfigurationManager.AppSettings["DestStorageAccountKey"];
- static string _sourceAssetID =
- ConfigurationManager.AppSettings["SourceAssetID"];
-
- private static CloudMediaContext _sourceContext = null;
- private static CloudMediaContext _destContext = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials1 = new AzureAdTokenCredentials(_sourceAADTenantDomain,
- new AzureAdClientSymmetricKey(_sourceClientId, _sourceClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- AzureAdTokenCredentials tokenCredentials2 = new AzureAdTokenCredentials(_destAADTenantDomain,
- new AzureAdClientSymmetricKey(_destClientId, _destClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider1 = new AzureAdTokenProvider(tokenCredentials1);
- var tokenProvider2 = new AzureAdTokenProvider(tokenCredentials2);
-
- // Create the context for your source Media Services account.
- _sourceContext = new CloudMediaContext(new Uri(_sourceRESTAPIEndpoint), tokenProvider1);
-
- // Create the context for your destination Media Services account.
- _destContext = new CloudMediaContext(new Uri(_destRESTAPIEndpoint), tokenProvider2);
-
- // Get the credentials of the default Storage account bound to your destination Media Services account.
- StorageCredentials destinationStorageCredentials =
- new StorageCredentials(_destStorageAccountName, _destStorageAccountKey);
-
- // Get a reference to the source asset in the source context.
- IAsset sourceAsset = _sourceContext.Assets.Where(asset => asset.Id == _sourceAssetID).First();
-
- // Create an empty destination asset in the destination context.
- IAsset destinationAsset = _destContext.Assets.Create(sourceAsset.Name, AssetCreationOptions.None);
-
- // Copy the files in the source asset instance into the destination asset instance.
- // There is an additional overload with async support.
- sourceAsset.Copy(destinationAsset, destinationStorageCredentials);
-
- Console.WriteLine("Done");
- }
- }
-}
-```
-
-## Copy blobs from a storage account into an AMS account
-
-### Prerequisites
-- One Storage account from which you want to copy blobs.
-- One AMS account into which you want to copy blobs.
-
-### Set up your project
-
-1. Set up your development environment as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-2. Add the appSettings section to the .config file and update the values based on your source storage and destination AMS accounts.
-
-```xml
-<appSettings>
- <add key="SourceStorageAccountName" value="name" />
- <add key="SourceStorageAccountKey" value="key" />
- <add key="NameOfBlobContainerYouWantToCopy" value="BlobContainerName"/>
-
- <add key="AMSAADTenantDomain" value="tenant"/>
- <add key="AMSRESTAPIEndpoint" value="endpoint"/>
- <add key="AMSClientId" value="clientID"/>
- <add key="AMSClientSecret" value="clientSecret"/>
- <add key="AMSStorageAccountName" value="name"/>
- <add key="AMSStorageAccountKey" value="key"/>
-</appSettings>
-```
-
-### Copy blobs from some storage account into a new asset in an AMS account
-
-The following code copies blobs from a storage account into a Media Services asset.
-
->[!NOTE]
->There is a limit of 1,000,000 policies for the various AMS policy types (for example, Locator policies or ContentKeyAuthorizationPolicy). You should use the same policy ID whenever you use the same duration and access permissions, for example, for policies for locators that are intended to remain in place for a long time (non-upload policies); a sketch of reusing a policy follows the sample below. For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) article.
-
-```csharp
-using System;
-using System.Configuration;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Microsoft.WindowsAzure.Storage.Auth;
-using Microsoft.WindowsAzure.Storage;
-using Microsoft.WindowsAzure.Storage.Blob;
-
-namespace CopyExistingBlobsIntoAsset
-{
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _sourceStorageAccountName =
- ConfigurationManager.AppSettings["SourceStorageAccountName"];
- private static readonly string _sourceStorageAccountKey =
- ConfigurationManager.AppSettings["SourceStorageAccountKey"];
- private static readonly string _NameOfBlobContainerYouWantToCopy =
- ConfigurationManager.AppSettings["NameOfBlobContainerYouWantToCopy"];
-
- private static readonly string _AMSAADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _AMSRESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
- private static readonly string _AMSStorageAccountName =
- ConfigurationManager.AppSettings["AMSStorageAccountName"];
- private static readonly string _AMSStorageAccountKey =
- ConfigurationManager.AppSettings["AMSStorageAccountKey"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
- private static CloudStorageAccount _sourceStorageAccount = null;
- private static CloudStorageAccount _destinationStorageAccount = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials = new AzureAdTokenCredentials(_AMSAADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- // Create the context for your source Media Services account.
- _context = new CloudMediaContext(new Uri(_AMSRESTAPIEndpoint), tokenProvider);
-
- _sourceStorageAccount =
- new CloudStorageAccount(new StorageCredentials(_sourceStorageAccountName,
- _sourceStorageAccountKey), true);
-
- _destinationStorageAccount =
- new CloudStorageAccount(new StorageCredentials(_AMSStorageAccountName,
- _AMSStorageAccountKey), true);
-
- CloudBlobClient sourceCloudBlobClient =
- _sourceStorageAccount.CreateCloudBlobClient();
- CloudBlobContainer sourceContainer =
- sourceCloudBlobClient.GetContainerReference(_NameOfBlobContainerYouWantToCopy);
-
- CreateAssetFromExistingBlobs(sourceContainer);
-
- Console.WriteLine("Done");
- }
-
- static private IAsset CreateAssetFromExistingBlobs(CloudBlobContainer sourceBlobContainer)
- {
- CloudBlobClient destBlobStorage = _destinationStorageAccount.CreateCloudBlobClient();
-
- // Create a new asset.
- IAsset asset = _context.Assets.Create("NewAsset_" + Guid.NewGuid(), AssetCreationOptions.None);
-
- IAccessPolicy writePolicy = _context.AccessPolicies.Create("writePolicy",
- TimeSpan.FromHours(24), AccessPermissions.Write);
-
- ILocator destinationLocator =
- _context.Locators.CreateLocator(LocatorType.Sas, asset, writePolicy);
-
- // Get the asset container URI and Blob copy from mediaContainer to assetContainer.
- CloudBlobContainer destAssetContainer =
- destBlobStorage.GetContainerReference((new Uri(destinationLocator.Path)).Segments[1]);
-
- if (destAssetContainer.CreateIfNotExists())
- {
- destAssetContainer.SetPermissions(new BlobContainerPermissions
- {
- PublicAccess = BlobContainerPublicAccessType.Blob
- });
- }
-
- var blobList = sourceBlobContainer.ListBlobs();
-
- foreach (CloudBlockBlob sourceBlob in blobList)
- {
- var assetFile = asset.AssetFiles.Create((sourceBlob as ICloudBlob).Name);
-
- ICloudBlob destinationBlob = destAssetContainer.GetBlockBlobReference(assetFile.Name);
-
- CopyBlob(sourceBlob, destAssetContainer);
-
- sourceBlob.FetchAttributes();
- assetFile.ContentFileSize = (sourceBlob as ICloudBlob).Properties.Length;
- assetFile.Update();
- Console.WriteLine("File {0} is of {1} size", assetFile.Name, assetFile.ContentFileSize);
- }
-
- asset.Update();
-
- destinationLocator.Delete();
- writePolicy.Delete();
-
- // Set the primary asset file.
- // If, for example, we copied a set of Smooth Streaming files,
- // set the .ism file to be the primary file.
- // If we, for example, copied an .mp4, then the mp4 would be the primary file.
- var ismAssetFile = asset.AssetFiles.ToList().
- Where(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase)).FirstOrDefault();
-
- // The following code assigns the first .ism file as the primary file in the asset.
- // An asset should have one .ism file.
- if (ismAssetFile != null)
- {
- ismAssetFile.IsPrimary = true;
- ismAssetFile.Update();
- }
-
- return asset;
- }
-
- /// <summary>
- /// Copies the specified blob into the specified container.
- /// </summary>
- /// <param name="sourceBlob">The source container.</param>
- /// <param name="destinationContainer">The destination container.</param>
- static private void CopyBlob(ICloudBlob sourceBlob, CloudBlobContainer destinationContainer)
- {
- var signature = sourceBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
- {
- Permissions = SharedAccessBlobPermissions.Read,
- SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24)
- });
-
- var destinationBlob = destinationContainer.GetBlockBlobReference(sourceBlob.Name);
-
- if (destinationBlob.Exists())
- {
- Console.WriteLine(string.Format("Destination blob '{0}' already exists. Skipping.", destinationBlob.Uri));
- }
- else
- {
-
- // Display the size of the source blob.
- Console.WriteLine(sourceBlob.Properties.Length);
-
- Console.WriteLine(string.Format("Copy blob '{0}' to '{1}'", sourceBlob.Uri, destinationBlob.Uri));
- destinationBlob.StartCopy(new Uri(sourceBlob.Uri.AbsoluteUri + signature));
-
- while (true)
- {
- // The StartCopyFromBlob is an async operation,
- // so we want to check if the copy operation is completed before proceeding.
- // To do that, we call FetchAttributes on the blob and check the CopyStatus.
- destinationBlob.FetchAttributes();
- if (destinationBlob.CopyState.Status != CopyStatus.Pending)
- {
- break;
- }
- //It's still not completed. So wait for some time.
- System.Threading.Thread.Sleep(1000);
- }
-
- // Display the size of the destination blob.
- Console.WriteLine(destinationBlob.Properties.Length);
-
- }
- }
- }
-}
-```
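-
-Regarding the policy-limit note above: one way to avoid creating a new access policy per locator is to reuse an existing policy with the same duration and permissions. A minimal sketch under that assumption (the policy name "writePolicy" is illustrative):
-
-```csharp
-// Reuse an existing 24-hour write policy if one exists; otherwise create it.
-// This keeps the total number of access policies in the account low.
-IAccessPolicy writePolicy =
-    _context.AccessPolicies.Where(p => p.Name == "writePolicy").FirstOrDefault()
-    ?? _context.AccessPolicies.Create("writePolicy",
-        TimeSpan.FromHours(24), AccessPermissions.Write);
-```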
-
-## Next steps
-
-You can now encode your uploaded assets. For more information, see [Encode assets](media-services-portal-encode.md).
-
-You can also use Azure Functions to trigger an encoding job based on a file arriving in the configured container. For more information, see [this sample](https://azure.microsoft.com/resources/samples/media-services-dotnet-functions-integration/ ).
-
media-services Media Services Crop Video https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-crop-video.md
- Title: How to crop videos with Media Encoder Standard - Azure | Microsoft Docs
-description: Cropping is the process of selecting a rectangular window within the video frame, and encoding just the pixels within that window. This article demonstrates how to crop videos with Media Encoder Standard.
-Previously updated: 03/10/2021
-# Crop videos with Media Encoder Standard
--
-You can use Media Encoder Standard (MES) to crop your input video. Cropping is the process of selecting a rectangular window within the video frame, and encoding just the pixels within that window. The following diagram helps illustrate the process.
-
-![Crop a video](./media/media-services-crop-video/media-services-crop-video01.png)
-
-Suppose you have as input a video that has a resolution of 1920x1080 pixels (16:9 aspect ratio), but has black bars (pillar boxes) at the left and right, so that only a 4:3 window of 1440x1080 pixels contains active video. You can use MES to crop or edit out the black bars, and encode the 1440x1080 region.
-
-Cropping in MES is a pre-processing stage, so the cropping parameters in the encoding preset apply to the original input video. Encoding is a subsequent stage, and the width/height settings apply to the *pre-processed* video, and not to the original video. When designing your preset you need to do the following: (a) select the crop parameters based on the original input video, and (b) select your encode settings based on the cropped video. If you do not match your encode settings to the cropped video, the output will not be as you expect.
-
-The [Encoding with Media Services .NET SDK](media-services-custom-mes-presets-with-dotnet.md#encoding_with_dotnet) section shows how to create an encoding job with MES and how to specify a custom preset for the encoding task.
-
-## Creating a custom preset
-In the example shown in the diagram:
-
-1. The original input is 1920x1080.
-2. It needs to be cropped to an output of 1440x1080, which is centered in the input frame.
-3. This means an X offset of (1920 - 1440)/2 = 240, and a Y offset of zero.
-4. The Width and Height of the Crop rectangle are 1440 and 1080, respectively.
-5. In the encode stage, the goal is to produce three layers, at resolutions of 1440x1080, 960x720, and 480x360, respectively.
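-
-The centering arithmetic in steps 3 and 4 generalizes as this trivial sketch:
-
-```csharp
-// Center a cropWidth x cropHeight window inside the input frame.
-int inputWidth = 1920, inputHeight = 1080;
-int cropWidth = 1440, cropHeight = 1080;
-
-int x = (inputWidth - cropWidth) / 2;   // 240
-int y = (inputHeight - cropHeight) / 2; // 0
-```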
-
-### JSON preset
-
-```json
-{
- "Version": 1.0,
- "Sources": [
- {
- "Streams": [],
- "Filters": {
- "Crop": {
- "X": 240,
- "Y": 0,
- "Width": 1440,
- "Height": 1080
- }
- },
- "Pad": true
- }
- ],
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 3400,
- "MaxBitrate": 3400,
- "BufferWindow": "00:00:05",
- "Width": 1440,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2250,
- "MaxBitrate": 2250,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1250,
- "MaxBitrate": 1250,
- "BufferWindow": "00:00:05",
- "Width": 480,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
-
-## Restrictions on cropping
-The cropping feature is meant to be used manually. You need to load your input video into a suitable editing tool that lets you select frames of interest and position the cursor to determine the offsets for the cropping rectangle, and then determine the encoding preset that is tuned for that particular video. This feature is not meant to enable scenarios such as automatic detection and removal of black letterbox/pillarbox borders in your input video.
-
-The following constraints apply to the cropping feature. If they are not met, the encoding task can fail or produce unexpected output.
-
-1. The coordinates and size of the Crop rectangle have to fit within the input video.
-2. As mentioned above, the Width and Height in the encode settings have to correspond to the cropped video.
-3. Cropping applies to videos captured in landscape mode (that is, it is not applicable to videos recorded with a smartphone held vertically or in portrait mode).
-4. The feature works best with progressive video captured with square pixels.
-
-## Next step
-See Azure Media Services learning paths to help you learn about great features offered by AMS.
-
media-services Media Services Custom Mes Presets With Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-custom-mes-presets-with-dotnet.md
- Title: Customizing Media Encoder Standard presets | Microsoft Docs
-description: This topic shows how to perform advanced encoding by customizing Media Encoder Standard task presets. The topic shows how to use Media Services .NET SDK to create an encoding task and job. It also shows how to supply custom presets to the encoding job.
-Previously updated: 03/10/2021
-# Customizing Media Encoder Standard presets
--
-## Overview
-
-This article shows how to perform advanced encoding with Media Encoder Standard (MES) using a custom preset. The article uses .NET to create an encoding task and a job that executes this task.
-
-This article shows you how to customize a preset by taking the [H264 Multiple Bitrate 720p](media-services-mes-preset-H264-Multiple-Bitrate-720p.md) preset and reducing the number of layers. The [Customizing Media Encoder Standard presets](media-services-advanced-encoding-with-mes.md) article demonstrates custom presets that can be used to perform advanced encoding tasks.
-
-> [!NOTE]
-> The custom presets described in this article cannot be used in [Media Services V3](../latest/index.yml) transforms or the CLI commands. See the [migration guidance from v2 to v3](../latest/migrate-v-2-v-3-migration-introduction.md) for more details.
-
-## <a id="customizing_presets"></a> Customizing a MES preset
-
-### Original preset
-
-Save the JSON defined in the [H264 Multiple Bitrate 720p](media-services-mes-preset-H264-Multiple-Bitrate-720p.md) article to a file with a .json extension. For example, **CustomPreset_JSON.json**.
-
-### Customized preset
-
-Open the **CustomPreset_JSON.json** file and remove the first three layers from **H264Layers**, so that your file looks like this:
-
-```json
- {
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1000,
- "MaxBitrate": 1000,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 650,
- "MaxBitrate": 650,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 320,
- "Height": 180,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
- }
-```
-
-## <a id="encoding_with_dotnet"></a>Encoding with Media Services .NET SDK
-
-The following code example uses Media Services .NET SDK to perform the following tasks:
-- Create an encoding job.
-- Get a reference to the Media Encoder Standard encoder.
-- Load the custom JSON preset that you created in the previous section.
-
- ```csharp
- // Load the JSON from the local file.
- string configuration = File.ReadAllText(fileName);
- ```
-- Add an encoding task to the job.
-- Specify the input asset to be encoded.
-- Create an output asset that contains the encoded asset.
-- Add an event handler to check the job progress.
-- Submit the job.
-
-#### Create and configure a Visual Studio project
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-#### Example
-
-```csharp
-using System;
-using System.Configuration;
-using System.IO;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System.Threading;
-
-namespace CustomizeMESPresests
-{
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- private static readonly string _mediaFiles =
- Path.GetFullPath(@"../..\Media");
-
- private static readonly string _singleMP4File =
- Path.Combine(_mediaFiles, @"BigBuckBunny.mp4");
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Get an uploaded asset.
- var asset = _context.Assets.FirstOrDefault();
-
- // Encode and generate the output using custom presets.
- EncodeToAdaptiveBitrateMP4Set(asset);
-
- Console.ReadLine();
- }
-
- static public IAsset EncodeToAdaptiveBitrateMP4Set(IAsset asset)
- {
- // Declare a new job.
- IJob job = _context.Jobs.Create("Media Encoder Standard Job");
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Load the XML (or JSON) from the local file.
- string configuration = File.ReadAllText("CustomPreset_JSON.json");
-
- // Create a task
- ITask task = job.Tasks.AddNew("Media Encoder Standard encoding task",
- processor,
- configuration,
- TaskOptions.None);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(asset);
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- task.OutputAssets.AddNew("Output asset",
- AssetCreationOptions.None);
-
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
- job.Submit();
- job.GetExecutionProgressTask(CancellationToken.None).Wait();
-
- return job.OutputMediaAssets[0];
- }
-
- private static void JobStateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("Job is finished. Please wait while local tasks or downloads complete...");
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
-
- // Cast sender as a job.
- IJob job = (IJob)sender;
-
- // Display or log error details as needed.
- break;
- default:
- break;
- }
- }
-
- private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-                throw new ArgumentException(string.Format("Unknown media processor: {0}", mediaProcessorName));
-
- return processor;
- }
-
- }
-}
-```
-
-## See also
-- [How to encode with a custom transform by using CLI](../latest/transform-custom-preset-cli-how-to.md)
-- [Encoding with Media Services v3](../latest/encode-concept.md)
-
media-services Media Services Deliver Asset Download https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-deliver-asset-download.md
- Title: Download Media Services assets to your computer - Azure | Microsoft Docs
-description: Learn how to download assets to your computer. Code samples are written in C# and use the Media Services SDK for .NET.
-Previously updated: 03/10/2021
-# How to: Deliver an asset by download
--
-This article discusses options for delivering media assets uploaded to Media Services. You can deliver Media Services content in numerous application scenarios. After encoding, download the generated media assets, or access them by using a streaming locator. For improved performance and scalability, you can also deliver content by using a Content Delivery Network (CDN).
-
-This example shows how to download media assets from Media Services to your local computer. The code queries the jobs associated with the Media Services account by job ID and accesses its **OutputMediaAssets** collection (the set of one or more output media assets that result from running a job). This example shows how to download output media assets from a job, but you can apply the same approach to download other assets.
-
->[!NOTE]
->There is a limit of 1,000,000 policies for the various AMS policy types (for example, Locator policies or ContentKeyAuthorizationPolicy). Use the same policy ID whenever you use the same duration and access permissions, for example, for policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) article.
-
-```csharp
- // Download the output asset of the specified job to a local folder.
- static IAsset DownloadAssetToLocal( string jobId, string outputFolder)
- {
- // This method illustrates how to download a single asset.
- // However, you can iterate through the OutputAssets
- // collection, and download all assets if there are many.
-
- // Get a reference to the job.
- IJob job = GetJob(jobId);
-
- // Get a reference to the first output asset. If there were multiple
- // output media assets you could iterate and handle each one.
- IAsset outputAsset = job.OutputMediaAssets[0];
-
- // Create a SAS locator to download the asset
- IAccessPolicy accessPolicy = _context.AccessPolicies.Create("File Download Policy", TimeSpan.FromDays(30), AccessPermissions.Read);
- ILocator locator = _context.Locators.CreateLocator(LocatorType.Sas, outputAsset, accessPolicy);
-
- BlobTransferClient blobTransfer = new BlobTransferClient
- {
- NumberOfConcurrentTransfers = 20,
- ParallelTransferThreadCount = 20
- };
-
- var downloadTasks = new List<Task>();
- foreach (IAssetFile outputFile in outputAsset.AssetFiles)
- {
- // Use the following event handler to check download progress.
- outputFile.DownloadProgressChanged += DownloadProgress;
-
- string localDownloadPath = Path.Combine(outputFolder, outputFile.Name);
-
- Console.WriteLine("File download path: " + localDownloadPath);
-
- downloadTasks.Add(outputFile.DownloadAsync(Path.GetFullPath(localDownloadPath), blobTransfer, locator, CancellationToken.None));
-
- outputFile.DownloadProgressChanged -= DownloadProgress;
- }
-
- Task.WaitAll(downloadTasks.ToArray());
-
- return outputAsset;
- }
-
- static void DownloadProgress(object sender, DownloadProgressChangedEventArgs e)
- {
- Console.WriteLine(string.Format("{0} % download progress. ", e.Progress));
- }
-```
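-
-The snippet above calls a **GetJob** helper that it doesn't define. A minimal sketch of such a helper, assuming the usual `_context` field, might look like this:
-
-```csharp
-static IJob GetJob(string jobId)
-{
-    // Use a LINQ select query to get a job by its ID.
-    var jobInstance =
-        from j in _context.Jobs
-        where j.Id == jobId
-        select j;
-
-    IJob job = jobInstance.FirstOrDefault();
-
-    if (job == null)
-        Console.WriteLine("Job does not exist.");
-
-    return job;
-}
-```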
-
-## See Also
-[Deliver streaming content](media-services-deliver-streaming-content.md)
-
media-services Media Services Deliver Content Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-deliver-content-overview.md
- Title: Delivering content to customers
-description: This topic gives an overview of what is involved in delivering your content with Azure Media Services.
-Previously updated: 03/10/2021
-# Deliver content to customers
--
-When you're delivering your streaming or video-on-demand content to customers, your goal is to deliver high-quality video to various devices under different network conditions.
-
-To achieve this goal, you can:
-
-* Encode your stream to a multi-bitrate (adaptive bitrate) video stream. This will take care of quality and network conditions.
-* Use Microsoft Azure Media Services [dynamic packaging](media-services-dynamic-packaging-overview.md) to dynamically re-package your stream into different protocols. This will take care of streaming on different devices. Media Services supports delivery of the following adaptive bitrate streaming technologies: <br/>
- * **HTTP Live Streaming** (HLS) - add the "(format=m3u8-aapl)" path to the "/Manifest" portion of the URL to tell the streaming origin server to return HLS content for consumption on **Apple iOS** native devices (for details, see [locators](#locators) and [URLs](#URLs)),
- * **MPEG-DASH** - add the "(format=mpd-time-csf)" path to the "/Manifest" portion of the URL to tell the streaming origin server to return MPEG-DASH (for details, see [locators](#locators) and [URLs](#URLs)),
- * **Smooth Streaming**.
-
->[!NOTE]
->When your AMS account is created, a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-This article gives an overview of important content delivery concepts.
-
-To check known issues, see [Known issues](media-services-deliver-content-overview.md#known-issues).
-
-## Dynamic packaging
-With the dynamic packaging that Media Services provides, you can deliver your adaptive bitrate MP4 or Smooth Streaming encoded content in the streaming formats supported by Media Services (MPEG-DASH, HLS, Smooth Streaming) without having to re-package into these streaming formats. We recommend delivering your content with dynamic packaging.
-
-To take advantage of dynamic packaging, you need to encode your mezzanine (source) file into a set of adaptive-bitrate MP4 files or adaptive bitrate Smooth Streaming files.
-
-With dynamic packaging, you store and pay for the files in a single storage format. Media Services builds and serves the appropriate response based on your requests.
-
-Dynamic packaging is available for standard and premium streaming endpoints.
-
-For more information, see [Dynamic packaging](media-services-dynamic-packaging-overview.md).
-
-## Filters and dynamic manifests
-You can define filters for your assets with Media Services. These filters are server-side rules that help your customers do things like play a specific section of a video or specify a subset of audio and video renditions that your customer's device can handle (instead of all the renditions that are associated with the asset). This filtering is achieved through *dynamic manifests* that are created when your customer requests to stream a video based on one or more specified filters.
-
-For more information, see [Filters and dynamic manifests](media-services-dynamic-manifest-overview.md).
-
-## <a name="locators"></a>Locators
-To provide your user with a URL that can be used to stream or download your content, you first need to publish your asset by creating a locator. A locator provides an entry point to access the files contained in an asset. Media Services supports two types of locators:
-
-* OnDemandOrigin locators. These are used to stream media (for example, MPEG-DASH, HLS, or Smooth Streaming) or progressively download files.
-* Shared access signature (SAS) URL locators. These are used to download media files to your local computer.
-
-An *access policy* is used to define the permissions (such as read, write, and list) and duration for which a client has access for a particular asset. Note that the list permission (AccessPermissions.List) should not be used in creating an OnDemandOrigin locator.
-
-Locators have expiration dates. The Azure portal sets an expiration date 100 years in the future for locators.
-
-> [!NOTE]
-> If you used the Azure portal to create locators before March 2015, those locators were set to expire after two years.
->
->
-
-To update an expiration date on a locator, use [REST](/rest/api/media/operations/locator#update_a_locator) or [.NET](/dotnet/api/microsoft.windowsazure.mediaservices.client.ilocator) APIs. Note that when you update the expiration date of a SAS locator, the URL changes.
-
-Locators are not designed to manage per-user access control. You can give different access rights to individual users by using Digital Rights Management (DRM) solutions. For more information, see [Securing Media](/previous-versions/azure/dn282272(v=azure.100)).
-
-When you create a locator, there may be a 30-second delay due to required storage and propagation processes in Azure Storage.
-
-## Adaptive streaming
-Adaptive bitrate technologies allow video player applications to determine network conditions and select from several bitrates. When network communication degrades, the client can select a lower bitrate so playback can continue with lower video quality. As network conditions improve, the client can switch to a higher bitrate with improved video quality. Azure Media Services supports the following adaptive bitrate technologies: HTTP Live Streaming (HLS), Smooth Streaming, and MPEG-DASH.
-
-To provide users with streaming URLs, you first must create an OnDemandOrigin locator. Creating the locator gives you the base path to the asset that contains the content you want to stream. However, to be able to stream this content, you need to modify this path further. To construct a full URL to the streaming manifest file, you must concatenate the locator's path value and the manifest (filename.ism) file name. Then append **/Manifest** and an appropriate format (if needed) to the locator path.
-
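-As a sketch of these steps with the v2 .NET SDK (assuming a `_context` service context and an encoded `asset` are already available):
-
-```csharp
-// Create a 30-day read-only access policy and an OnDemandOrigin locator.
-IAccessPolicy policy = _context.AccessPolicies.Create(
-    "Streaming policy", TimeSpan.FromDays(30), AccessPermissions.Read);
-
-ILocator originLocator = _context.Locators.CreateLocator(
-    LocatorType.OnDemandOrigin, asset, policy, DateTime.UtcNow.AddMinutes(-5));
-
-// Find the .ism manifest file and build the Smooth Streaming URL;
-// append a format suffix for HLS or MPEG-DASH as needed.
-var manifestFile = asset.AssetFiles.ToList()
-    .First(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase));
-
-string smoothStreamingUrl = originLocator.Path + manifestFile.Name + "/Manifest";
-string dashUrl = smoothStreamingUrl + "(format=mpd-time-csf)";
-```
-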
-> [!NOTE]
-> You can also stream your content over a TLS connection. To do this, make sure your streaming URLs start with HTTPS. Note that, currently, AMS doesn't support TLS with custom domains.
->
-
-You can only stream over TLS if the streaming endpoint from which you deliver your content was created after September 10th, 2014. If your streaming URLs are based on the streaming endpoints created after September 10th, 2014, the URL contains "streaming.mediaservices.windows.net." Streaming URLs that contain "origin.mediaservices.windows.net" (the old format) do not support TLS. If your URL is in the old format and you want to be able to stream over TLS, create a new streaming endpoint. Use URLs based on the new streaming endpoint to stream your content over TLS.
-
-## <a name="URLs"></a>Streaming URL formats
-
-### MPEG-DASH format
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=mpd-time-csf)
-
-http:\//testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf)
-
-### Apple HTTP Live Streaming (HLS) V4 format
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=m3u8-aapl)
-
-http:\//testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=m3u8-aapl)
-
-### Apple HTTP Live Streaming (HLS) V3 format
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=m3u8-aapl-v3)
-
-http:\//testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=m3u8-aapl-v3)
-
-### Apple HTTP Live Streaming (HLS) format with audio-only filter
-By default, audio-only tracks are included in the HLS manifest. This is required for Apple Store certification for cellular networks. In this case, if a client doesn't have sufficient bandwidth or is connected over a 2G connection, playback switches to audio-only. This helps to keep content streaming without buffering, but there is no video. In some scenarios, player buffering might be preferred over audio-only. If you want to remove the audio-only track, add **audio-only=false** to the URL.
-
-http:\//testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=m3u8-aapl-v3,audio-only=false)
-
-For more information, see [Dynamic Manifest Composition support and HLS output additional features](https://azure.microsoft.com/blog/azure-media-services-release-dynamic-manifest-composition-remove-hls-audio-only-track-and-hls-i-frame-track-support/).
-
-### Smooth Streaming format
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest
-
-Example:
-
-http:\//testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest
-
-### <a id="fmp4_v20"></a>Smooth Streaming 2.0 manifest (legacy manifest)
-By default, the Smooth Streaming manifest format contains the repeat tag (r-tag). However, some players do not support the r-tag. Clients with such players can use a format that disables the r-tag:
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=fmp4-v20)
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=fmp4-v20)`
-
-## Progressive download
-With progressive download, you can start playing media before the entire file has been downloaded. You cannot progressively download .ism* (ismv, isma, ismt, or ismc) files.
-
-To progressively download content, use the OnDemandOrigin type of locator. The following example shows the URL that is based on the OnDemandOrigin type of locator:
-
-`http://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny_H264_650kbps_AAC_und_ch2_96kbps.mp4`
-
-You must decrypt any storage-encrypted assets that you want to stream from the origin service for progressive download.
-
-## Download
-To download your content to a client device, you must create a SAS Locator. The SAS locator gives you access to the Azure Storage container where your file is located. To build the download URL, you have to embed the file name between the host and SAS signature.
-
-The following example shows the URL that is based on the SAS locator:
-
-`https://test001.blob.core.windows.net/asset-ca7a4c3f-9eb5-4fd8-a898-459cb17761bd/BigBuckBunny.mp4?sv=2012-02-12&se=2014-05-03T01%3A23%3A50Z&sr=c&si=7c093e7c-7dab-45b4-beb4-2bfdff764bb5&sig=msEHP90c6JHXEOtTyIWqD7xio91GtVg0UIzjdpFscHk%3D`
-
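-A minimal sketch of creating such a SAS locator and embedding the file name (the 30-day duration and file name are illustrative):
-
-```csharp
-// Create a SAS locator with a 30-day read policy for downloading.
-IAccessPolicy downloadPolicy = _context.AccessPolicies.Create(
-    "Download policy", TimeSpan.FromDays(30), AccessPermissions.Read);
-
-ILocator sasLocator = _context.Locators.CreateLocator(
-    LocatorType.Sas, asset, downloadPolicy);
-
-// Embed the file name between the container path and the SAS query string.
-var uriBuilder = new UriBuilder(sasLocator.Path);
-uriBuilder.Path += "/BigBuckBunny.mp4";
-
-string downloadUrl = uriBuilder.Uri.AbsoluteUri;
-```
-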
-The following considerations apply:
-
-* You must decrypt any storage-encrypted assets that you want to stream from the origin service for progressive download.
-* A download that has not finished within 12 hours will fail.
-
-## Streaming endpoints
-
-A streaming endpoint represents a streaming service that can deliver content directly to a client player application or to a content delivery network (CDN) for further distribution. The outbound stream from a streaming endpoint service can be a live stream or a video-on-demand asset in your Media Services account. There are two types of streaming endpoints, **standard** and **premium**. For more information, see [Streaming endpoints overview](media-services-streaming-endpoints-overview.md).
-
->[!NOTE]
->When your AMS account is created, a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-## Known issues
-### Changes to Smooth Streaming manifest version
-Before the July 2016 service release, when assets produced by Media Encoder Standard, Media Encoder Premium Workflow, or the earlier Azure Media Encoder were streamed by using dynamic packaging, the Smooth Streaming manifest returned would conform to version 2.0. In version 2.0, the fragment durations do not use the so-called repeat ('r') tags. For example:
-
-```xml
-<?xml version="1.0" encoding="UTF-8"?>
-<SmoothStreamingMedia MajorVersion="2" MinorVersion="0" Duration="8000" TimeScale="1000">
- <StreamIndex Chunks="4" Type="video" Url="QualityLevels({bitrate})/Fragments(video={start time})" QualityLevels="3" Subtype="" Name="video" TimeScale="1000">
- <QualityLevel Index="0" Bitrate="1000000" FourCC="AVC1" MaxWidth="640" MaxHeight="360" CodecPrivateData="00000001674D4029965201405FF2E02A100000030010000003032E0A000F42400040167F18E3050007A12000200B3F8C70ED0B16890000000168EB7352" />
- <c t="0" d="2000" n="0" />
- <c d="2000" />
- <c d="2000" />
- <c d="2000" />
- </StreamIndex>
-</SmoothStreamingMedia>
-```
-
-In the July 2016 service release, the generated Smooth Streaming manifest conforms to version 2.2, with fragment durations using repeat tags. For example:
-
-```xml
-<?xml version="1.0" encoding="UTF-8"?>
-<SmoothStreamingMedia MajorVersion="2" MinorVersion="2" Duration="8000" TimeScale="1000">
- <StreamIndex Chunks="4" Type="video" Url="QualityLevels({bitrate})/Fragments(video={start time})" QualityLevels="3" Subtype="" Name="video" TimeScale="1000">
- <QualityLevel Index="0" Bitrate="1000000" FourCC="AVC1" MaxWidth="640" MaxHeight="360" CodecPrivateData="00000001674D4029965201405FF2E02A100000030010000003032E0A000F42400040167F18E3050007A12000200B3F8C70ED0B16890000000168EB7352" />
- <c t="0" d="2000" r="4" />
- </StreamIndex>
-</SmoothStreamingMedia>
-```
-
-Some of the legacy Smooth Streaming clients may not support the repeat tags and will fail to load the manifest. To mitigate this issue, you can use the legacy manifest format parameter **(format=fmp4-v20)** or update your client to the latest version, which supports repeat tags. For more information, see [Smooth Streaming 2.0](media-services-deliver-content-overview.md#fmp4_v20).
-
-
-## Related topics
-[Update Media Services locators after rolling storage keys](media-services-roll-storage-access-keys.md)
media-services Media Services Deliver Keys And Licenses https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-deliver-keys-and-licenses.md
- Title: Use Azure Media Services to deliver DRM licenses or AES keys | Microsoft Docs
-description: This article describes how you can use Azure Media Services to deliver PlayReady and/or Widevine licenses and AES keys but do the rest (encode, encrypt, stream) by using your on-premises servers.
-Previously updated: 03/10/2021
-# Use Media Services to deliver DRM licenses or AES keys
---
-Azure Media Services enables you to ingest, encode, add content protection, and stream your content. For more information, see [Use PlayReady and/or Widevine dynamic common encryption](media-services-protect-with-playready-widevine.md). Some customers want to use Media Services only to deliver licenses and/or keys and encode, encrypt, and stream by using their on-premises servers. This article describes how you can use Media Services to deliver PlayReady and/or Widevine licenses but do the rest with your on-premises servers.
-
-To complete this tutorial, you need an Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-
-## Overview
-Media Services provides a service for delivering PlayReady and Widevine digital rights management (DRM) licenses and AES-128 keys. Media Services also provides APIs that let you configure the rights and restrictions that you want for the DRM runtime to enforce when a user plays back the DRM-protected content. When a user requests the protected content, the player application requests a license from the Media Services license service. If the license is authorized, the Media Services license service issues the license to the player. The PlayReady and Widevine licenses contain the decryption key that can be used by the client player to decrypt and stream the content.
-
-Media Services supports multiple ways of authorizing users who make license or key requests. You configure the content key's authorization policy. The policy can have one or more restrictions. The options are open or token restriction. The token-restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the simple web token (SWT) format and the JSON Web Token (JWT) format.
-
-The following diagram shows the main steps you need to take to use Media Services to deliver PlayReady and/or Widevine licenses but do the rest with your on-premises servers:
-
-![Protect with PlayReady](./media/media-services-deliver-keys-and-licenses/media-services-diagram1.png)
-
-## Create and configure a Visual Studio project
-
-1. Set up your development environment, and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-2. Add the following elements to **appSettings** defined in your app.config file:
-
- ```xml
- <add key="Issuer" value="http://testissuer.com"/>
- <add key="Audience" value="urn:test"/>
- ```
-
-## .NET code example
-The following code example shows how to create a common content key and get PlayReady or Widevine license acquisition URLs. To configure your on-premises server, you need a content key, the key ID, and the license acquisition URL. After you configure your on-premises server, you can stream from your own streaming server. Because the encrypted stream points to a Media Services license server, your player requests a license from Media Services. If you choose token authentication, the Media Services license server validates the token you sent through HTTPS. If the token is valid, the license server delivers the license back to your player.
-
-The following code example only shows how to create a common content key and get PlayReady or Widevine license acquisition URLs. If you want to deliver AES-128 keys, you need to create an envelope content key and get a key acquisition URL. For more information, see [Use AES-128 dynamic encryption and key delivery service](media-services-playready-license-template-overview.md).
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.Configuration;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
-using Microsoft.WindowsAzure.MediaServices.Client.Widevine;
-using Newtonsoft.Json;
-
-namespace DeliverDRMLicenses
-{
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static readonly Uri _sampleIssuer =
- new Uri(ConfigurationManager.AppSettings["Issuer"]);
- private static readonly Uri _sampleAudience =
- new Uri(ConfigurationManager.AppSettings["Audience"]);
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- bool tokenRestriction = true;
- string tokenTemplateString = null;
-
- IContentKey key = CreateCommonTypeContentKey();
-
- // Print out the key ID and Key in base64 string format
- Console.WriteLine("Created key {0} with key value {1} ",
- key.Id, System.Convert.ToBase64String(key.GetClearKeyValue()));
-
- Console.WriteLine("PlayReady License Key delivery URL: {0}",
- key.GetKeyDeliveryUrl(ContentKeyDeliveryType.PlayReadyLicense));
-
- Console.WriteLine("Widevine License Key delivery URL: {0}",
- key.GetKeyDeliveryUrl(ContentKeyDeliveryType.Widevine));
-
- if (tokenRestriction)
- tokenTemplateString = AddTokenRestrictedAuthorizationPolicy(key);
- else
- AddOpenAuthorizationPolicy(key);
-
- Console.WriteLine("Added authorization policy: {0}",
- key.AuthorizationPolicyId);
- Console.WriteLine();
- Console.ReadLine();
- }
-
- static public void AddOpenAuthorizationPolicy(IContentKey contentKey)
- {
-
- // Create ContentKeyAuthorizationPolicy with Open restrictions
- // and create authorization policy
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions =
- new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Open",
- KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
- Requirements = null
- }
- };
-
- // Configure PlayReady and Widevine license templates.
- string PlayReadyLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
-
- string WidevineLicenseTemplate = ConfigureWidevineLicenseTemplate();
-
- IContentKeyAuthorizationPolicyOption PlayReadyPolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("",
- ContentKeyDeliveryType.PlayReadyLicense,
- restrictions, PlayReadyLicenseTemplate);
-
- IContentKeyAuthorizationPolicyOption WidevinePolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("",
- ContentKeyDeliveryType.Widevine,
- restrictions, WidevineLicenseTemplate);
-
- IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Deliver Common Content Key with no restrictions").
- Result;
-
- contentKeyAuthorizationPolicy.Options.Add(PlayReadyPolicy);
- contentKeyAuthorizationPolicy.Options.Add(WidevinePolicy);
- // Associate the content key authorization policy with the content key.
- contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
- contentKey = contentKey.UpdateAsync().Result;
- }
-
- public static string AddTokenRestrictedAuthorizationPolicy(IContentKey contentKey)
- {
- string tokenTemplateString = GenerateTokenRequirements();
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions =
- new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Token Authorization Policy",
- KeyRestrictionType = (int)ContentKeyRestrictionType.TokenRestricted,
- Requirements = tokenTemplateString,
- }
- };
-
- // Configure PlayReady and Widevine license templates.
- string PlayReadyLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
-
- string WidevineLicenseTemplate = ConfigureWidevineLicenseTemplate();
-
- IContentKeyAuthorizationPolicyOption PlayReadyPolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("Token option",
- ContentKeyDeliveryType.PlayReadyLicense,
- restrictions, PlayReadyLicenseTemplate);
-
- IContentKeyAuthorizationPolicyOption WidevinePolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("Token option",
- ContentKeyDeliveryType.Widevine,
- restrictions, WidevineLicenseTemplate);
-
- IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Deliver Common Content Key with token restrictions").
- Result;
-
- contentKeyAuthorizationPolicy.Options.Add(PlayReadyPolicy);
- contentKeyAuthorizationPolicy.Options.Add(WidevinePolicy);
-
- // Associate the content key authorization policy with the content key
- contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
- contentKey = contentKey.UpdateAsync().Result;
-
- return tokenTemplateString;
- }
-
- static private string GenerateTokenRequirements()
- {
- TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.SWT);
-
- template.PrimaryVerificationKey = new SymmetricVerificationKey();
- template.AlternateVerificationKeys.Add(new SymmetricVerificationKey());
- template.Audience = _sampleAudience.ToString();
- template.Issuer = _sampleIssuer.ToString();
- template.RequiredClaims.Add(TokenClaim.ContentKeyIdentifierClaim);
-
- return TokenRestrictionTemplateSerializer.Serialize(template);
- }
-
- static private string ConfigurePlayReadyLicenseTemplate()
- {
- // The following code configures PlayReady License Template using .NET classes
- // and returns the XML string.
-
- //The PlayReadyLicenseResponseTemplate class represents the template
- //for the response sent back to the end user.
- //It contains a field for a custom data string between the license server
- //and the application (may be useful for custom app logic)
- //as well as a list of one or more license templates.
-
- PlayReadyLicenseResponseTemplate responseTemplate =
- new PlayReadyLicenseResponseTemplate();
-
- // The PlayReadyLicenseTemplate class represents a license template
- // for creating PlayReady licenses
- // to be returned to the end users.
- // It contains the data on the content key in the license
- // and any rights or restrictions to be
- // enforced by the PlayReady DRM runtime when using the content key.
- PlayReadyLicenseTemplate licenseTemplate = new PlayReadyLicenseTemplate();
-
- // Configure whether the license is persistent
- // (saved in persistent storage on the client)
- // or non-persistent (only held in memory while the player is using the license).
- licenseTemplate.LicenseType = PlayReadyLicenseType.Nonpersistent;
-
- // AllowTestDevices controls whether test devices can use the license or not.
- // If true, the MinimumSecurityLevel property of the license
- // is set to 150. If false (the default),
- // the MinimumSecurityLevel property of the license is set to 2000.
- licenseTemplate.AllowTestDevices = true;
-
- // You can also configure the Play Right in the PlayReady license by using the PlayReadyPlayRight class.
- // It grants the user the ability to play back the content subject to the zero or more restrictions
- // configured in the license and on the PlayRight itself (for playback specific policy).
- // Much of the policy on the PlayRight has to do with output restrictions
- // which control the types of outputs that the content can be played over and
- // any restrictions that must be put in place when using a given output.
- // For example, if the DigitalVideoOnlyContentRestriction is enabled,
-        // then the DRM runtime will only allow the video to be displayed over digital outputs
-        // (analog video outputs won't be allowed to pass the content).
-
- // IMPORTANT: These types of restrictions can be very powerful
- // but can also affect the consumer experience.
-        // If the output protections are configured too restrictively,
- // the content might be unplayable on some clients.
- // For more information, see the PlayReady Compliance Rules document.
-
- // For example:
- //licenseTemplate.PlayRight.AgcAndColorStripeRestriction = new AgcAndColorStripeRestriction(1);
-
- responseTemplate.LicenseTemplates.Add(licenseTemplate);
-
- return MediaServicesLicenseTemplateSerializer.Serialize(responseTemplate);
- }
-
- private static string ConfigureWidevineLicenseTemplate()
- {
- var template = new WidevineMessage
- {
- allowed_track_types = AllowedTrackTypes.SD_HD,
- content_key_specs = new[]
- {
- new ContentKeySpecs
- {
- required_output_protection =
- new RequiredOutputProtection { hdcp = Hdcp.HDCP_NONE},
- security_level = 1,
- track_type = "SD"
- }
- },
- policy_overrides = new
- {
- can_play = true,
- can_persist = true,
- can_renew = false
- }
- };
-
- string configuration = JsonConvert.SerializeObject(template);
- return configuration;
- }
-
- static public IContentKey CreateCommonTypeContentKey()
- {
-        // Create a common encryption content key.
- Guid keyId = Guid.NewGuid();
- byte[] contentKey = GetRandomBuffer(16);
-
- IContentKey key = _context.ContentKeys.Create(
- keyId,
- contentKey,
- "ContentKey",
- ContentKeyType.CommonEncryption);
-
- return key;
- }
-
- static private byte[] GetRandomBuffer(int length)
- {
- var returnValue = new byte[length];
-
- using (var rng =
- new System.Security.Cryptography.RNGCryptoServiceProvider())
- {
- rng.GetBytes(returnValue);
- }
-
- return returnValue;
- }
- }
-}
-```
-
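-If you enable token restriction, you can mint a short-lived test token from the same template to exercise the license acquisition URLs. The following is a hedged sketch, not part of the original sample; it assumes the `key` and `tokenTemplateString` values produced above and uses the SDK's test-token helper:
-
-```csharp
-// Sketch: generate a test token from the serialized restriction template.
-TokenRestrictionTemplate tokenTemplate =
-    TokenRestrictionTemplateSerializer.Deserialize(tokenTemplateString);
-
-// The content key identifier claim needs the raw Guid from the key ID.
-Guid rawKeyId = EncryptionUtils.GetKeyIdAsGuid(key.Id);
-
-string testToken = TokenRestrictionTemplateSerializer.GenerateTestToken(
-    tokenTemplate, null, rawKeyId, DateTime.UtcNow.AddDays(1));
-
-Console.WriteLine("Test token: Bearer={0}", testToken);
-```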
-## Additional notes
-
-* Widevine is a service provided by Google, Inc., and is subject to the terms of service and privacy policy of Google, Inc.
-
-## See also
-* [Use PlayReady and/or Widevine dynamic common encryption](media-services-protect-with-playready-widevine.md)
-* [Use AES-128 dynamic encryption and the key delivery service](media-services-playready-license-template-overview.md)
media-services Media Services Deliver Streaming Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-deliver-streaming-content.md
- Title: Publish Azure Media Services content using .NET | Microsoft Docs
-description: Learn how to create a locator that is used to build a streaming URL. Code samples are written in C# and use the Media Services SDK for .NET.
- Previously updated: 03/10/2021
-# Publish Media Services content using .NET
-
-> [!div class="op_single_selector"]
-> * [REST](media-services-rest-deliver-streaming-content.md)
-> * [.NET](media-services-deliver-streaming-content.md)
-> * [Portal](media-services-portal-publish.md)
->
->
-
-## Overview
-You can stream an adaptive bitrate MP4 set by creating an OnDemand streaming locator and building a streaming URL. The [encoding an asset](media-services-encode-asset.md) topic shows how to encode into an adaptive bitrate MP4 set.
-
-> [!NOTE]
-> If your content is encrypted, configure asset delivery policy (as described in [this](media-services-dotnet-configure-asset-delivery-policy.md) topic) before creating a locator.
->
->
-
-You can also use an OnDemand streaming locator to build URLs that point to MP4 files that can be progressively downloaded.
-
-This topic shows how to create an OnDemand streaming locator to publish your asset and build Smooth, MPEG DASH, and HLS streaming URLs. It also shows how to build progressive download URLs.
-
-## Create an OnDemand streaming locator
-To create the OnDemand streaming locator and get URLs, you need to do the following things:
-
-1. If the content is encrypted, define an access policy.
-2. Create an OnDemand streaming locator.
-3. If you plan to stream, get the streaming manifest file (.ism) in the asset.
-
- If you plan to progressively download, get the names of MP4 files in the asset.
-4. Build URLs to the manifest file or MP4 files.
-
->[!NOTE]
->There is a limit of 1,000,000 policies for different AMS policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). Use the same policy ID if you are always using the same days / access permissions. For example, policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) topic.
-
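-To stay under that limit, you can look up and reuse an existing policy instead of creating a new one per locator. A minimal sketch, assuming the `_context` from the samples in this article (the policy name is illustrative):
-
-```csharp
-// Sketch: reuse one long-lived read policy across locators.
-IAccessPolicy streamingPolicy =
-    _context.AccessPolicies.Where(p => p.Name == "Streaming policy").FirstOrDefault() ??
-    _context.AccessPolicies.Create("Streaming policy",
-                                   TimeSpan.FromDays(30),
-                                   AccessPermissions.Read);
-```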
-### Use Media Services .NET SDK
-Build Streaming URLs
-
-```csharp
- private static void BuildStreamingURLs(IAsset asset)
- {
-
- // Create a 30-day readonly access policy.
- // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.
- IAccessPolicy policy = _context.AccessPolicies.Create("Streaming policy",
- TimeSpan.FromDays(30),
- AccessPermissions.Read);
-
- // Create a locator to the streaming content on an origin.
- ILocator originLocator = _context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset,
- policy,
- DateTime.UtcNow.AddMinutes(-5));
-
- // Display some useful values based on the locator.
- Console.WriteLine("Streaming asset base path on origin: ");
- Console.WriteLine(originLocator.Path);
- Console.WriteLine();
-
- // Get a reference to the streaming manifest file from the
- // collection of files in the asset.
- var manifestFile = asset.AssetFiles.ToList().Where(f => f.Name.ToLower().
- EndsWith(".ism")).
- FirstOrDefault();
-
- // Create a full URL to the manifest file. Use this for playback
- // in streaming media clients.
- string urlForClientStreaming = originLocator.Path + manifestFile.Name + "/manifest";
- Console.WriteLine("URL to manifest for client streaming using Smooth Streaming protocol: ");
- Console.WriteLine(urlForClientStreaming);
- Console.WriteLine("URL to manifest for client streaming using HLS protocol: ");
- Console.WriteLine(urlForClientStreaming + "(format=m3u8-aapl)");
- Console.WriteLine("URL to manifest for client streaming using MPEG DASH protocol: ");
- Console.WriteLine(urlForClientStreaming + "(format=mpd-time-csf)");
- Console.WriteLine();
- }
-```
-
-The outputs:
-
-- URL to manifest for client streaming using Smooth Streaming protocol:\
- `http://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny.ism/manifest`
-- URL to manifest for client streaming using HLS protocol:\
- `http://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny.ism/manifest(format=m3u8-aapl)`
-- URL to manifest for client streaming using MPEG DASH protocol:\
- `http://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny.ism/manifest(format=mpd-time-csf)`
-
-> [!NOTE]
-> You can also stream your content over a TLS connection. To do so, make sure your streaming URLs start with HTTPS. Currently, AMS doesn't support TLS with custom domains.
->
->
-
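-As a small sketch (not part of the original sample), you could rewrite a generated URL to use HTTPS before handing it to players; `urlForClientStreaming` is the variable from the example above:
-
-```csharp
-// Sketch: switch the scheme of a streaming URL to HTTPS.
-var secureUrl = new UriBuilder(urlForClientStreaming)
-{
-    Scheme = Uri.UriSchemeHttps,
-    Port = -1   // use the default port for the scheme
-};
-Console.WriteLine(secureUrl.Uri);
-```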
-Build progressive download URLs
-
-```csharp
- private static void BuildProgressiveDownloadURLs(IAsset asset)
- {
- // Create a 30-day readonly access policy.
- IAccessPolicy policy = _context.AccessPolicies.Create("Streaming policy",
- TimeSpan.FromDays(30),
- AccessPermissions.Read);
-
- // Create an OnDemandOrigin locator to the asset.
- ILocator originLocator = _context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset,
- policy,
- DateTime.UtcNow.AddMinutes(-5));
-
- // Display some useful values based on the locator.
- Console.WriteLine("Streaming asset base path on origin: ");
- Console.WriteLine(originLocator.Path);
- Console.WriteLine();
-
- // Get MP4 files.
- IEnumerable<IAssetFile> mp4AssetFiles = asset
- .AssetFiles
- .ToList()
- .Where(af => af.Name.EndsWith(".mp4", StringComparison.OrdinalIgnoreCase));
-
- // Create a full URL to the MP4 files. Use this to progressively download files.
- foreach (var pd in mp4AssetFiles)
- Console.WriteLine(originLocator.Path + pd.Name);
- }
-```
-The outputs:
-
-- `http://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny_H264_650kbps_AAC_und_ch2_96kbps.mp4`
-- `http://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny_H264_400kbps_AAC_und_ch2_96kbps.mp4`
-- `http://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny_H264_3400kbps_AAC_und_ch2_96kbps.mp4`
-- `http://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny_H264_2250kbps_AAC_und_ch2_96kbps.mp4`
- . . .
-
-### Use Media Services .NET SDK Extensions
-The following code calls .NET SDK extension methods that create a locator and generate the Smooth Streaming, HLS, and MPEG-DASH URLs for adaptive streaming.
-
-```csharp
-    // Create a locator.
- _context.Locators.Create(
- LocatorType.OnDemandOrigin,
- inputAsset,
- AccessPermissions.Read,
- TimeSpan.FromDays(30));
-
- // Get the streaming URLs.
- Uri smoothStreamingUri = inputAsset.GetSmoothStreamingUri();
- Uri hlsUri = inputAsset.GetHlsUri();
- Uri mpegDashUri = inputAsset.GetMpegDashUri();
-
- Console.WriteLine(smoothStreamingUri);
- Console.WriteLine(hlsUri);
- Console.WriteLine(mpegDashUri);
-```
-
-## Next steps
-* [Download assets](media-services-deliver-asset-download.md)
-* [Configure asset delivery policy](media-services-dotnet-configure-asset-delivery-policy.md)
-
media-services Media Services Dotnet Check Job Progress With Queues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-check-job-progress-with-queues.md
- Title: Use Azure Queue storage to monitor Media Services job notifications with .NET | Microsoft Docs
-description: Learn how to use Azure Queue storage to monitor Media Services job notifications. The code sample is written in C# and uses the Media Services SDK for .NET.
- Previously updated: 03/10/2021
-# Use Azure Queue storage to monitor Media Services job notifications with .NET
-
-When you run encoding jobs, you often require a way to track job progress. You can configure Media Services to deliver notifications to [Azure Queue storage](../../storage/queues/storage-dotnet-how-to-use-queues.md). You can monitor job progress by getting notifications from the Queue storage.
-
-Messages delivered to Queue storage can be accessed from anywhere in the world. The Queue storage messaging architecture is reliable and highly scalable. Polling Queue storage for messages is recommended over using other methods.
-
-One common scenario for listening to Media Services notifications is if you are developing a content management system that needs to perform some additional task after an encoding job completes (for example, to trigger the next step in a workflow, or to publish content).
-
-This article shows how to get notification messages from Queue storage.
-
-## Considerations
-Consider the following when developing Media Services applications that use Queue storage:
-
-* Queue storage does not provide a guarantee of first-in-first-out (FIFO) ordered delivery. For more information, see [Azure Queues and Azure Service Bus Queues Compared and Contrasted](/previous-versions/azure/hh767287(v=azure.100)).
-* Queue storage is not a push service. You have to poll the queue.
-* You can have any number of queues. For more information, see [Queue Service REST API](/rest/api/storageservices/queue-service-rest-api).
-* Queue storage has some limitations and specifics to be aware of. These are described in [Azure Queues and Azure Service Bus Queues Compared and Contrasted](../../service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted.md).
-
-## .NET code example
-
-The code example in this section does the following:
-
-1. Defines the **EncodingJobMessage** class that maps to the notification message format. The code deserializes messages received from the queue into objects of the **EncodingJobMessage** type.
-2. Loads the Media Services and Storage account information from the app.config file. The code example uses this information to create the **CloudMediaContext** and **CloudQueue** objects.
-3. Creates the queue that receives notification messages about the encoding job.
-4. Creates the notification end point that is mapped to the queue.
-5. Attaches the notification end point to the job and submits the encoding job. You can have multiple notification end points attached to a job.
-6. Passes **NotificationJobState.FinalStatesOnly** to the **AddNew** method. (In this example, we are only interested in final states of the job processing.)
-
- ```csharp
- job.JobNotificationSubscriptions.AddNew(NotificationJobState.FinalStatesOnly, _notificationEndPoint);
- ```
-
-7. If you pass **NotificationJobState.All**, you get all of the following state change notifications: queued, scheduled, processing, and finished. However, as noted earlier, Queue storage does not guarantee ordered delivery. To order messages, use the **TimeStamp** property (defined on the **EncodingJobMessage** type in the example below). Duplicate messages are possible; to check for duplicates, use the **ETag** property (also defined on the **EncodingJobMessage** type). It is also possible that some state change notifications get skipped. A sketch of this ordering and de-duplication follows this list.
-8. Waits for the job to get to the finished state by checking the queue every 10 seconds. Deletes messages after they have been processed.
-9. Deletes the queue and the notification end point.
-
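-As a hedged sketch of that ordering and de-duplication (it assumes a hypothetical `receivedMessages` list of deserialized **EncodingJobMessage** objects; it is not part of the original sample):
-
-```csharp
-// Sketch: drop duplicate notifications by ETag, then order by TimeStamp.
-List<EncodingJobMessage> distinctOrdered = receivedMessages
-    .GroupBy(m => m.ETag)                       // duplicates share an ETag
-    .Select(g => g.First())
-    .OrderBy(m => DateTime.Parse(m.TimeStamp))  // TimeStamp is a date-time string
-    .ToList();
-```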
-> [!NOTE]
-> The recommended way to monitor a job's state is by listening to notification messages, as shown in the following example.
->
-> Alternatively, you could check on a job's state by using the **IJob.State** property. A notification message about a job's completion may arrive before the state on **IJob** is set to **Finished**; the **IJob.State** property reflects the accurate state with a slight delay.
->
->
-
-### Create and configure a Visual Studio project
-
-1. Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-2. Create a new folder (folder can be anywhere on your local drive) and copy a .mp4 file that you want to encode and stream or progressively download. In this example, the "C:\Media" path is used.
-3. Add a reference to the **System.Runtime.Serialization** library.
-
-### Code
-
-```csharp
-using System;
-using System.Linq;
-using System.Configuration;
-using System.IO;
-using System.Threading;
-using System.Collections.Generic;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Azure.Storage.Queues;
-using Azure.Storage.Queues.Models;
-using System.Runtime.Serialization.Json;
-
-namespace JobNotification
-{
- public class EncodingJobMessage
- {
- // MessageVersion is used for version control.
- public String MessageVersion { get; set; }
-
- // Type of the event. Valid values are
- // JobStateChange and NotificationEndpointRegistration.
- public String EventType { get; set; }
-
- // ETag is used to help the customer detect if
- // the message is a duplicate of another message previously sent.
- public String ETag { get; set; }
-
- // Time of occurrence of the event.
- public String TimeStamp { get; set; }
-
- // Collection of values specific to the event.
-
- // For the JobStateChange event the values are:
- // JobId - Id of the Job that triggered the notification.
- // NewState- The new state of the Job. Valid values are:
- // Scheduled, Processing, Canceling, Cancelled, Error, Finished
- // OldState- The old state of the Job. Valid values are:
- // Scheduled, Processing, Canceling, Cancelled, Error, Finished
-
- // For the NotificationEndpointRegistration event the values are:
- // NotificationEndpointId- Id of the NotificationEndpoint
- // that triggered the notification.
- // State- The state of the Endpoint.
- // Valid values are: Registered and Unregistered.
-
- public IDictionary<string, object> Properties { get; set; }
- }
-
- class Program
- {
-
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static readonly string _StorageConnectionString =
- ConfigurationManager.AppSettings["StorageConnectionString"];
-
- private static CloudMediaContext _context = null;
- private static QueueClient _queue = null;
- private static INotificationEndPoint _notificationEndPoint = null;
-
- private static readonly string _singleInputMp4Path =
- Path.GetFullPath(@"C:\Media\BigBuckBunny.mp4");
-
- static void Main(string[] args)
- {
-            // Azure queue names must be lowercase.
-            string queueName = "amsnotificationqueue";
-
- // Create the context.
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Create the queue that will be receiving the notification messages.
- _queue = CreateQueue(_StorageConnectionString, queueName);
-
- // Create the notification point that is mapped to the queue.
-            _notificationEndPoint =
-                _context.NotificationEndPoints.Create(
-                    Guid.NewGuid().ToString(), NotificationEndPointType.AzureQueue, queueName);
-
- if (_notificationEndPoint != null)
- {
- IJob job = SubmitEncodingJobWithNotificationEndPoint(_singleInputMp4Path);
- WaitForJobToReachedFinishedState(job.Id);
- }
-
- // Clean up.
- _queue.Delete();
- _notificationEndPoint.Delete();
- }
-
- static public QueueClient CreateQueue(string storageAccountConnectionString, string queueName)
- {
- // Create the queue client
- QueueClient queue = new QueueClient(storageAccountConnectionString, queueName);
-
-            // Create the queue if it doesn't already exist.
-            queue.CreateIfNotExists();
-
- return queue;
- }
-
- public static IJob SubmitEncodingJobWithNotificationEndPoint(string inputMediaFilePath)
- {
- // Declare a new job.
- IJob job = _context.Jobs.Create("My MP4 to Smooth Streaming encoding job");
-
- //Create an encrypted asset and upload the mp4.
- IAsset asset = CreateAssetAndUploadSingleFile(AssetCreationOptions.StorageEncrypted,
- inputMediaFilePath);
-
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Create a task with the conversion details, using a configuration file.
- ITask task = job.Tasks.AddNew("My encoding Task",
- processor,
- "Adaptive Streaming",
- Microsoft.WindowsAzure.MediaServices.Client.TaskOptions.None);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- task.OutputAssets.AddNew("Output asset",
- AssetCreationOptions.None);
-
- // Add a notification point to the job. You can add multiple notification points.
- job.JobNotificationSubscriptions.AddNew(NotificationJobState.FinalStatesOnly,
- _notificationEndPoint);
-
- job.Submit();
-
- return job;
- }
-
- public static void WaitForJobToReachedFinishedState(string jobId)
- {
- int expectedState = (int)JobState.Finished;
- int timeOutInSeconds = 600;
-
- bool jobReachedExpectedState = false;
- DateTime startTime = DateTime.Now;
- int jobState = -1;
-
- while (!jobReachedExpectedState)
- {
- // Specify how often you want to get messages from the queue.
- Thread.Sleep(TimeSpan.FromSeconds(10));
-
- foreach (QueueMessage message in _queue.ReceiveMessages(maxMessages: 10).Value)
- {
- DataContractJsonSerializerSettings settings = new DataContractJsonSerializerSettings();
- settings.UseSimpleDictionaryFormat = true;
- DataContractJsonSerializer ser = new DataContractJsonSerializer(typeof(EncodingJobMessage), settings);
-                    EncodingJobMessage encodingJobMsg = (EncodingJobMessage)ser.ReadObject(message.Body.ToStream());
-
- Console.WriteLine();
-
- // Display the message information.
- Console.WriteLine("EventType: {0}", encodingJobMsg.EventType);
- Console.WriteLine("MessageVersion: {0}", encodingJobMsg.MessageVersion);
- Console.WriteLine("ETag: {0}", encodingJobMsg.ETag);
- Console.WriteLine("TimeStamp: {0}", encodingJobMsg.TimeStamp);
- foreach (var property in encodingJobMsg.Properties)
- {
- Console.WriteLine(" {0}: {1}", property.Key, property.Value);
- }
-
- // We are only interested in messages
- // where EventType is "JobStateChange".
- if (encodingJobMsg.EventType == "JobStateChange")
- {
- string JobId = (String)encodingJobMsg.Properties.Where(j => j.Key == "JobId").FirstOrDefault().Value;
- if (JobId == jobId)
- {
- string oldJobStateStr = (String)encodingJobMsg.Properties.
- Where(j => j.Key == "OldState").FirstOrDefault().Value;
- string newJobStateStr = (String)encodingJobMsg.Properties.
- Where(j => j.Key == "NewState").FirstOrDefault().Value;
-
- JobState oldJobState = (JobState)Enum.Parse(typeof(JobState), oldJobStateStr);
- JobState newJobState = (JobState)Enum.Parse(typeof(JobState), newJobStateStr);
-
- if (newJobState == (JobState)expectedState)
- {
- Console.WriteLine("job with Id: {0} reached expected state: {1}",
- jobId, newJobState);
- jobReachedExpectedState = true;
- break;
- }
- }
- }
- // Delete the message after we've read it.
- _queue.DeleteMessage(message.MessageId, message.PopReceipt);
- }
-
- // Wait until timeout
- TimeSpan timeDiff = DateTime.Now - startTime;
- bool timedOut = (timeDiff.TotalSeconds > timeOutInSeconds);
- if (timedOut)
- {
- Console.WriteLine(@"Timeout for checking job notification messages,
- latest found state ='{0}', wait time = {1} secs",
- jobState,
- timeDiff.TotalSeconds);
-
- throw new TimeoutException();
- }
- }
- }
-
- static private IAsset CreateAssetAndUploadSingleFile(AssetCreationOptions assetCreationOptions, string singleFilePath)
- {
- var asset = _context.Assets.Create("UploadSingleFile_" + DateTime.UtcNow.ToString(),
- assetCreationOptions);
-
- var fileName = Path.GetFileName(singleFilePath);
-
- var assetFile = asset.AssetFiles.Create(fileName);
-
- Console.WriteLine("Created assetFile {0}", assetFile.Name);
- Console.WriteLine("Upload {0}", assetFile.Name);
-
- assetFile.Upload(singleFilePath);
- Console.WriteLine("Done uploading of {0}", assetFile.Name);
-
- return asset;
- }
-
- static private IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-                throw new ArgumentException(string.Format("Unknown media processor {0}", mediaProcessorName));
-
- return processor;
- }
- }
-}
-```
-
-The preceding example produces output similar to the following; your values will vary.
-
-```output
-Created assetFile BigBuckBunny.mp4
-Upload BigBuckBunny.mp4
-Done uploading of BigBuckBunny.mp4
-
-EventType: NotificationEndPointRegistration
-MessageVersion: 1.0
-ETag: e0238957a9b25bdf3351a88e57978d6a81a84527fad03bc23861dbe28ab293f6
-TimeStamp: 2013-05-14T20:22:37
- NotificationEndPointId: nb:nepid:UUID:d6af9412-2488-45b2-ba1f-6e0ade6dbc27
- State: Registered
- Name: dde957b2-006e-41f2-9869-a978870ac620
- Created: 2013-05-14T20:22:35
-
-EventType: JobStateChange
-MessageVersion: 1.0
-ETag: 4e381f37c2d844bde06ace650310284d6928b1e50101d82d1b56220cfcb6076c
-TimeStamp: 2013-05-14T20:24:40
- JobId: nb:jid:UUID:526291de-f166-be47-b62a-11ffe6d4be54
- JobName: My MP4 to Smooth Streaming encoding job
- NewState: Finished
- OldState: Processing
- AccountName: westeuropewamsaccount
-job with Id: nb:jid:UUID:526291de-f166-be47-b62a-11ffe6d4be54 reached expected
-State: Finished
-```
-
media-services Media Services Dotnet Check Job Progress With Webhooks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-check-job-progress-with-webhooks.md
- Title: Use Azure Webhooks to monitor Media Services job notifications with .NET | Microsoft Docs
-description: Learn how to use Azure Webhooks to monitor Media Services job notifications. The code sample is written in C# and uses the Media Services SDK for .NET.
- Previously updated: 03/10/2021
-# Use Azure Webhooks to monitor Media Services job notifications with .NET
-
-When you run jobs, you often require a way to track job progress. You can monitor Media Services job notifications by using Azure Webhooks or [Azure Queue storage](media-services-dotnet-check-job-progress-with-queues.md). This article shows how to work with webhooks.
-
-This article shows how to:
-
-* Define an Azure Function that is customized to respond to webhooks.
-
-  In this case, the webhook is triggered by Media Services when your encoding job changes status. The function listens for the webhook callback from Media Services notifications and publishes the output asset once the job finishes.
-
- >[!TIP]
- >Before continuing, make sure you understand how [Azure Functions HTTP and webhook bindings](../../azure-functions/functions-bindings-http-webhook.md) work.
- >
-
-* Add a webhook to your encoding task and specify the webhook URL and secret key that this webhook responds to. You will find an example that adds a webhook to your encoding task at the end of the article.
-
-You can find definitions of various Media Services .NET Azure Functions (including the one shown in this article) [here](https://github.com/Azure-Samples/media-services-dotnet-functions-integration).
-
-## Prerequisites
-
-The following are required to complete the tutorial:
-
-* An Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-* A Media Services account. To create a Media Services account, see [How to Create a Media Services Account](media-services-portal-create-account.md).
-* Understanding of [how to use Azure Functions](../../azure-functions/functions-overview.md). Also, review [Azure Functions HTTP and webhook bindings](../../azure-functions/functions-bindings-http-webhook.md).
-
-## Create a function app
-
-1. Go to the [Azure portal](https://portal.azure.com) and sign-in with your Azure account.
-2. Create a function app as described [here](../../azure-functions/functions-create-function-app-portal.md).
-
-## Configure function app settings
-
-When developing Media Services functions, it is handy to add environment variables that will be used throughout your functions. To configure app settings, click the Configure App Settings link.
-
-The [application settings](media-services-dotnet-how-to-use-azure-functions.md#configure-function-app-settings) section defines parameters that are used in the webhook defined in this article. Also add the following parameters to the app settings.
-
-|Name|Definition|Example|
-||||
-|SigningKey |A signing key.| j0txf1f8msjytzvpe40nxbpxdcxtqcgxy0nt|
-|WebHookEndpoint | A webhook endpoint address. Once your webhook function is created, you can copy the URL from the **Get function URL** link. | https:\//juliakofuncapp.azurewebsites.net/api/Notification_Webhook_Function?code=iN2phdrTnCxmvaKExFWOTulfnm4C71mMLIy8tzLr7Zvf6Z22HHIK5g==.|
-
-## Create a function
-
-Once your function app is deployed, you can find it among **App Services** Azure Functions.
-
-1. Select your function app and click **New Function**.
-2. Select **C#** code and **API & Webhooks** scenario.
-3. Select **Generic Webhook - C#**.
-4. Name your webhook and press **Create**.
-
-### Files
-
-Your Azure Function is associated with code files and other files that are described in this section. By default, a function is associated with **function.json** and **run.csx** (C#) files. You need to add a **project.json** file. The rest of this section shows the definitions for these files.
-
-![files](./media/media-services-azure-functions/media-services-azure-functions003.png)
-
-#### function.json
-
-The function.json file defines the function bindings and other configuration settings. The runtime uses this file to determine the events to monitor and how to pass data into and return data from function execution.
-
-```json
-{
- "bindings": [
- {
- "type": "httpTrigger",
- "direction": "in",
- "webHookType": "genericJson",
- "name": "req"
- },
- {
- "type": "http",
- "direction": "out",
- "name": "res"
- }
- ],
- "disabled": false
-}
-```
-
-#### project.json
-
-The project.json file contains dependencies.
-
-```json
-{
- "frameworks": {
- "net46":{
- "dependencies": {
- "windowsazure.mediaservices": "4.0.0.4",
- "windowsazure.mediaservices.extensions": "4.0.0.4",
- "Microsoft.IdentityModel.Clients.ActiveDirectory": "3.13.1",
- "Microsoft.IdentityModel.Protocol.Extensions": "1.0.2.206221351"
- }
- }
- }
-}
-```
-
-#### run.csx
-
-The code in this section shows an implementation of an Azure Function that is a webhook. In this sample, the function listens for the webhook callback from Media Services notifications and publishes the output asset once the job finishes.
-
-The webhook expects a signing key (credential) to match the one you pass when you configure the notification endpoint. The signing key is the 64-byte Base64 encoded value that is used to protect and secure your WebHooks callbacks from Azure Media Services.
-
-In the webhook definition code that follows, the **VerifyWebHookRequestSignature** method does the verification of the notification message. The purpose of this validation is to ensure that the message was sent by Azure Media Services and hasn't been tampered with. The signature is optional for Azure Functions as it has the **Code** value as a query parameter over Transport Layer Security (TLS).
-
->[!NOTE]
->There is a limit of 1,000,000 policies for different AMS policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). You should use the same policy ID if you are always using the same days / access permissions, for example, policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) topic.
-
-```csharp
-///////////////////////////////////////////////////
-#r "Newtonsoft.Json"
-
-using System;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System.Collections.Generic;
-using System.Linq;
-using System.Text;
-using System.Threading;
-using System.Threading.Tasks;
-using System.IO;
-using System.Globalization;
-using Newtonsoft.Json;
-using Microsoft.Azure;
-using System.Net;
-using System.Security.Cryptography;
-using Microsoft.Azure.WebJobs;
-using Microsoft.IdentityModel.Clients.ActiveDirectory;
-
-internal const string SignatureHeaderKey = "sha256";
-internal const string SignatureHeaderValueTemplate = SignatureHeaderKey + "={0}";
-static string _webHookEndpoint = Environment.GetEnvironmentVariable("WebHookEndpoint");
-static string _signingKey = Environment.GetEnvironmentVariable("SigningKey");
-
-static readonly string _AADTenantDomain = Environment.GetEnvironmentVariable("AMSAADTenantDomain");
-static readonly string _RESTAPIEndpoint = Environment.GetEnvironmentVariable("AMSRESTAPIEndpoint");
-
-static readonly string _AMSClientId = Environment.GetEnvironmentVariable("AMSClientId");
-static readonly string _AMSClientSecret = Environment.GetEnvironmentVariable("AMSClientSecret");
-
-static CloudMediaContext _context = null;
-
-public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
-{
- log.Info($"C# HTTP trigger function processed a request. RequestUri={req.RequestUri}");
-
- Task<byte[]> taskForRequestBody = req.Content.ReadAsByteArrayAsync();
- byte[] requestBody = await taskForRequestBody;
-
- string jsonContent = await req.Content.ReadAsStringAsync();
- log.Info($"Request Body = {jsonContent}");
-
- IEnumerable<string> values = null;
- if (req.Headers.TryGetValues("ms-signature", out values))
- {
- byte[] signingKey = Convert.FromBase64String(_signingKey);
- string signatureFromHeader = values.FirstOrDefault();
-
- if (VerifyWebHookRequestSignature(requestBody, signatureFromHeader, signingKey))
- {
- string requestMessageContents = Encoding.UTF8.GetString(requestBody);
-
- NotificationMessage msg = JsonConvert.DeserializeObject<NotificationMessage>(requestMessageContents);
-
- if (VerifyHeaders(req, msg, log))
- {
- string newJobStateStr = (string)msg.Properties.Where(j => j.Key == "NewState").FirstOrDefault().Value;
- if (newJobStateStr == "Finished")
- {
- AzureAdTokenCredentials tokenCredentials = new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- AzureAdTokenProvider tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- if(_context!=null)
- {
- string urlForClientStreaming = PublishAndBuildStreamingURLs(msg.Properties["JobId"]);
- log.Info($"URL to the manifest for client streaming using HLS protocol: {urlForClientStreaming}");
- }
- }
-
- return req.CreateResponse(HttpStatusCode.OK, string.Empty);
- }
- else
- {
- log.Info($"VerifyHeaders failed.");
- return req.CreateResponse(HttpStatusCode.BadRequest, "VerifyHeaders failed.");
- }
- }
- else
- {
- log.Info($"VerifyWebHookRequestSignature failed.");
- return req.CreateResponse(HttpStatusCode.BadRequest, "VerifyWebHookRequestSignature failed.");
- }
- }
-
- return req.CreateResponse(HttpStatusCode.BadRequest, "Generic Error.");
-}
-
-private static string PublishAndBuildStreamingURLs(String jobID)
-{
- IJob job = _context.Jobs.Where(j => j.Id == jobID).FirstOrDefault();
- IAsset asset = job.OutputMediaAssets.FirstOrDefault();
-
- // Create a 30-day readonly access policy.
- // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.
- IAccessPolicy policy = _context.AccessPolicies.Create("Streaming policy",
- TimeSpan.FromDays(30),
- AccessPermissions.Read);
-
- // Create a locator to the streaming content on an origin.
- ILocator originLocator = _context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset,
- policy,
- DateTime.UtcNow.AddMinutes(-5));
-
- // Get a reference to the streaming manifest file from the
- // collection of files in the asset.
- var manifestFile = asset.AssetFiles.ToList().Where(f => f.Name.ToLower().
- EndsWith(".ism")).
- FirstOrDefault();
-
- // Create a full URL to the manifest file. Use this for playback
- // in streaming media clients.
- string urlForClientStreaming = originLocator.Path + manifestFile.Name + "/manifest" + "(format=m3u8-aapl)";
- return urlForClientStreaming;
-
-}
-
-private static bool VerifyWebHookRequestSignature(byte[] data, string actualValue, byte[] verificationKey)
-{
- using (var hasher = new HMACSHA256(verificationKey))
- {
- byte[] sha256 = hasher.ComputeHash(data);
- string expectedValue = string.Format(CultureInfo.InvariantCulture, SignatureHeaderValueTemplate, ToHex(sha256));
-
- return (0 == String.Compare(actualValue, expectedValue, System.StringComparison.Ordinal));
- }
-}
-
-private static bool VerifyHeaders(HttpRequestMessage req, NotificationMessage msg, TraceWriter log)
-{
- bool headersVerified = false;
-
- try
- {
- IEnumerable<string> values = null;
- if (req.Headers.TryGetValues("ms-mediaservices-accountid", out values))
- {
- string accountIdHeader = values.FirstOrDefault();
- string accountIdFromMessage = msg.Properties["AccountId"];
-
- if (0 == string.Compare(accountIdHeader, accountIdFromMessage, StringComparison.OrdinalIgnoreCase))
- {
- headersVerified = true;
- }
- else
- {
- log.Info($"accountIdHeader={accountIdHeader} does not match accountIdFromMessage={accountIdFromMessage}");
- }
- }
- else
- {
- log.Info($"Header ms-mediaservices-accountid not found.");
- }
- }
- catch (Exception e)
- {
- log.Info($"VerifyHeaders hit exception {e}");
- headersVerified = false;
- }
-
- return headersVerified;
-}
-
-private static readonly char[] HexLookup = new char[] { '0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'A', 'B', 'C', 'D', 'E', 'F' };
-
-/// <summary>
-/// Converts a <see cref="T:byte[]"/> to a hex-encoded string.
-/// </summary>
-private static string ToHex(byte[] data)
-{
- if (data == null)
- {
- return string.Empty;
- }
-
- char[] content = new char[data.Length * 2];
- int output = 0;
- byte d;
-
- for (int input = 0; input < data.Length; input++)
- {
- d = data[input];
- content[output++] = HexLookup[d / 0x10];
- content[output++] = HexLookup[d % 0x10];
- }
-
- return new string(content);
-}
-
-internal enum NotificationEventType
-{
- None = 0,
- JobStateChange = 1,
- NotificationEndPointRegistration = 2,
- NotificationEndPointUnregistration = 3,
- TaskStateChange = 4,
- TaskProgress = 5
-}
-
-internal sealed class NotificationMessage
-{
- public string MessageVersion { get; set; }
- public string ETag { get; set; }
- public NotificationEventType EventType { get; set; }
- public DateTime TimeStamp { get; set; }
- public IDictionary<string, string> Properties { get; set; }
-}
-```
-
-Save and run your function.
-
-### Function output
-
-Once the webhook is triggered, the example above produces output similar to the following; your values will vary.
-
-```output
-C# HTTP trigger function processed a request. RequestUri=https://juliako001-functions.azurewebsites.net/api/otification_Webhook_Function?code=9376d69kygoy49oft81nel8frty5cme8hb9xsjslxjhalwhfrqd79awz8ic4ieku74dvkdfgvi
-Request Body =
-{
- "MessageVersion": "1.1",
- "ETag": "b8977308f48858a8f224708bc963e1a09ff917ce730316b4e7ae9137f78f3b20",
- "EventType": 4,
- "TimeStamp": "2017-02-16T03:59:53.3041122Z",
- "Properties": {
- "JobId": "nb:jid:UUID:badd996c-8d7c-4ae0-9bc1-bd7f1902dbdd",
- "TaskId": "nb:tid:UUID:80e26fb9-ee04-4739-abd8-2555dc24639f",
- "NewState": "Finished",
- "OldState": "Processing",
- "AccountName": "mediapkeewmg5c3peq",
- "AccountId": "301912b0-659e-47e0-9bc4-6973f2be3424",
- "NotificationEndPointId": "nb:nepid:UUID:cb5d707b-4db8-45fe-a558-19f8d3306093"
- }
-}
-
-URL to the manifest for client streaming using HLS protocol: http://mediapkeewmg5c3peq.streaming.mediaservices.windows.net/0ac98077-2b58-4db7-a8da-789a13ac6167/BigBuckBunny.ism/manifest(format=m3u8-aapl)
-```
-
-## Add a webhook to your encoding task
-
-This section shows the code that adds a webhook notification to a Task. You can also add a job-level notification, which would be more useful for a job with chained tasks.
-
-1. Create a new C# Console Application in Visual Studio. Enter the Name, Location, and Solution name, and then click OK.
-2. Use [NuGet](https://www.nuget.org/packages/windowsazure.mediaservices) to install Azure Media Services.
-3. Update the App.config file with the appropriate values:
-
- * Azure Media Services connection information,
- * webhook URL that expects to get the notifications,
-    * the signing key that matches the key that your webhook expects. The signing key is the 64-byte Base64-encoded value that is used to protect and secure your webhook callbacks from Azure Media Services. (A sketch for generating such a key follows the settings below.)
-
- ```xml
- <appSettings>
- <add key="AMSAADTenantDomain" value="domain" />
- <add key="AMSRESTAPIEndpoint" value="endpoint" />
-
- <add key="AMSClientId" value="clinet id" />
- <add key="AMSClientSecret" value="client secret" />
-
- <add key="WebhookURL" value="https://yourapp.azurewebsites.net/api/functionname?code=ApiKey" />
- <add key="WebhookSigningKey" value="j0txf1f8msjytzvpe40nxbpxdcxtqcgxy0nt" />
- </appSettings>
- ```
-
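-    As a rough sketch (not from the original article), you can generate a suitable signing key and paste the Base64 output into **WebhookSigningKey**:
-
-    ```csharp
-    // Sketch: generate a random 64-byte key and Base64-encode it.
-    byte[] signingKeyBytes = new byte[64];
-    using (var rng = new System.Security.Cryptography.RNGCryptoServiceProvider())
-    {
-        rng.GetBytes(signingKeyBytes);
-    }
-    Console.WriteLine(Convert.ToBase64String(signingKeyBytes));
-    ```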
-4. Update your Program.cs file with the following code:
-
- ```csharp
- using System;
- using System.Configuration;
- using System.Linq;
- using Microsoft.WindowsAzure.MediaServices.Client;
-
- namespace NotificationWebHook
- {
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AMSAADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _AMSRESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
-
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static readonly string _webHookEndpoint =
- ConfigurationManager.AppSettings["WebhookURL"];
- private static readonly string _signingKey =
- ConfigurationManager.AppSettings["WebhookSigningKey"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials = new AzureAdTokenCredentials(_AMSAADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- AzureAdTokenProvider tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_AMSRESTAPIEndpoint), tokenProvider);
-
- byte[] keyBytes = Convert.FromBase64String(_signingKey);
-
- IAsset newAsset = _context.Assets.FirstOrDefault();
-
- // Check for existing Notification Endpoint with the name "FunctionWebHook"
-
- var existingEndpoint = _context.NotificationEndPoints.Where(e => e.Name == "FunctionWebHook").FirstOrDefault();
- INotificationEndPoint endpoint = null;
-
- if (existingEndpoint != null)
- {
- Console.WriteLine("webhook endpoint already exists");
- endpoint = (INotificationEndPoint)existingEndpoint;
- }
- else
- {
- endpoint = _context.NotificationEndPoints.Create("FunctionWebHook",
- NotificationEndPointType.WebHook, _webHookEndpoint, keyBytes);
- Console.WriteLine("Notification Endpoint Created with Key : {0}", keyBytes.ToString());
- }
-
- // Declare a new encoding job with the Standard encoder
- IJob job = _context.Jobs.Create("MES Job");
-
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- ITask task = job.Tasks.AddNew("My encoding task",
- processor,
- "Adaptive Streaming",
- TaskOptions.None);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(newAsset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- task.OutputAssets.AddNew(newAsset.Name, AssetCreationOptions.None);
-
- // Add the WebHook notification to this Task and request all notification state changes.
- // Note that you can also add a job level notification
- // which would be more useful for a job with chained tasks.
- if (endpoint != null)
- {
- task.TaskNotificationSubscriptions.AddNew(NotificationJobState.All, endpoint, true);
- Console.WriteLine("Created Notification Subscription for endpoint: {0}", _webHookEndpoint);
- }
- else
- {
- Console.WriteLine("No Notification Endpoint is being used");
- }
-
- job.Submit();
-
- Console.WriteLine("Expect WebHook to be triggered for the Job ID: {0}", job.Id);
- Console.WriteLine("Expect WebHook to be triggered for the Task ID: {0}", task.Id);
-
- Console.WriteLine("Job Submitted");
-
- }
- private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-                throw new ArgumentException(string.Format("Unknown media processor {0}", mediaProcessorName));
-
- return processor;
- }
- }
- }
- ```
-
media-services Media Services Dotnet Configure Asset Delivery Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-configure-asset-delivery-policy.md
- Title: Configure asset delivery policies with .NET SDK | Microsoft Docs
-description: This topic shows how to configure different asset delivery policies with Azure Media Services .NET SDK.
- Previously updated: 03/10/2021
-# Configure asset delivery policies with .NET SDK
---
-## Overview
-If you plan to deliver encrypted assets, one of the steps in the Media Services content delivery workflow is configuring delivery policies for assets. The asset delivery policy tells Media Services how you want your asset to be delivered: which streaming protocol your asset should be dynamically packaged into (for example, MPEG DASH, HLS, Smooth Streaming, or all of them), and whether or not you want to dynamically encrypt your asset and how (envelope or common encryption).
-
-This article discusses why and how to create and configure asset delivery policies.
-
->[!NOTE]
->When your AMS account is created, a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
->
->Also, to be able to use dynamic packaging and dynamic encryption your asset must contain a set of adaptive bitrate MP4s or adaptive bitrate Smooth Streaming files.
-
-You could apply different policies to the same asset. For example, you could apply PlayReady encryption to Smooth Streaming and AES Envelope encryption to MPEG DASH and HLS. Any protocols that are not defined in a delivery policy (for example, you add a single policy that only specifies HLS as the protocol) will be blocked from streaming. The exception is if you have no asset delivery policy defined at all. Then, all protocols will be allowed in the clear.
-
-If you want to deliver a storage-encrypted asset, you must configure the asset's delivery policy. Before your asset can be streamed, the streaming server removes the storage encryption and streams your content using the specified delivery policy. For example, to deliver your asset encrypted with an Advanced Encryption Standard (AES) envelope encryption key, set the policy type to **DynamicEnvelopeEncryption**. To remove storage encryption and stream the asset in the clear, set the policy type to **NoDynamicEncryption**. Examples that show how to configure these policy types follow.
-
-Depending on how you configure the asset delivery policy, you can dynamically package, encrypt, and stream the following streaming protocols: Smooth Streaming, HLS, and MPEG DASH.
-
-The following list shows the formats that you use to stream Smooth, HLS, and DASH.
-
-Smooth Streaming:
-
-`{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest`
-
-HLS:
-
-`{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=m3u8-aapl)`
-
-MPEG DASH:
-
-`{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=mpd-time-csf)`
-
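-As a small illustration (the base URL is a placeholder, not a real endpoint), the three protocols differ only in the trailing format token:
-
-```csharp
-// Sketch: the same manifest URL, requested in each packaging format.
-string manifestUrl = "http://account.streaming.mediaservices.windows.net/locatorId/filename.ism/Manifest";
-
-string smoothUrl = manifestUrl;                            // Smooth Streaming
-string hlsUrl    = manifestUrl + "(format=m3u8-aapl)";     // HLS
-string dashUrl   = manifestUrl + "(format=mpd-time-csf)";  // MPEG DASH
-```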
-## Considerations
-* Before deleting the AssetDeliveryPolicy, you should delete all of the streaming locators associated with the asset. You can later create new streaming locators, if desired, with a new AssetDeliveryPolicy.
-* A streaming locator cannot be created on a storage-encrypted asset when no asset delivery policy is set. If the asset isn't storage encrypted, the system will let you create a locator and stream the asset in the clear without an asset delivery policy.
-* You can have multiple asset delivery policies associated with a single asset, but you can only specify one way to handle a given AssetDeliveryProtocol. If you try to link two delivery policies that both specify the AssetDeliveryProtocol.SmoothStreaming protocol, the result is an error, because the system does not know which one you want it to apply when a client makes a Smooth Streaming request.
-* If you have an asset with an existing streaming locator, you cannot link a new policy to the asset (you can either unlink an existing policy from the asset, or update a delivery policy associated with the asset). You first have to remove the streaming locator, adjust the policies, and then re-create the streaming locator. You can use the same locatorId when you re-create the streaming locator, but you should ensure that won't cause issues for clients, since content can be cached by the origin or a downstream CDN. (A sketch of this cleanup sequence follows this list.)
-
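-A hedged sketch of that cleanup sequence, assuming the `asset` variable used elsewhere in this article (illustrative, not a definitive implementation):
-
-```csharp
-// Sketch: remove locators first, then unlink delivery policies,
-// before attaching a new AssetDeliveryPolicy.
-foreach (ILocator locator in asset.Locators.ToList())
-{
-    locator.Delete();
-}
-
-foreach (IAssetDeliveryPolicy policy in asset.DeliveryPolicies.ToList())
-{
-    asset.DeliveryPolicies.Remove(policy);
-}
-```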
-## Clear asset delivery policy
-
-The following **ConfigureClearAssetDeliveryPolicy** method specifies to not apply dynamic encryption and to deliver the stream in any of the following protocols: MPEG DASH, HLS, and Smooth Streaming protocols. You might want to apply this policy to your storage encrypted assets.
-
-For information on what values you can specify when creating an AssetDeliveryPolicy, see the [Types used when defining AssetDeliveryPolicy](#types) section.
-
-```csharp
- static public void ConfigureClearAssetDeliveryPolicy(IAsset asset)
- {
- IAssetDeliveryPolicy policy =
- _context.AssetDeliveryPolicies.Create("Clear Policy",
- AssetDeliveryPolicyType.NoDynamicEncryption,
- AssetDeliveryProtocol.HLS | AssetDeliveryProtocol.SmoothStreaming | AssetDeliveryProtocol.Dash, null);
-
- asset.DeliveryPolicies.Add(policy);
- }
-```
-## DynamicCommonEncryption asset delivery policy
-
-The following **CreateAssetDeliveryPolicy** method creates the **AssetDeliveryPolicy** that is configured to apply dynamic common encryption (**DynamicCommonEncryption**) to a smooth streaming protocol (other protocols will be blocked from streaming). The method takes two parameters: **Asset** (the asset to which you want to apply the delivery policy) and **IContentKey** (the content key of the **CommonEncryption** type, for more information, see: [Creating a content key](media-services-dotnet-create-contentkey.md#common_contentkey)).
-
-For information on what values you can specify when creating an AssetDeliveryPolicy, see the [Types used when defining AssetDeliveryPolicy](#types) section.
-
-```csharp
- static public void CreateAssetDeliveryPolicy(IAsset asset, IContentKey key)
- {
- Uri acquisitionUrl = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.PlayReadyLicense);
-
- Dictionary<AssetDeliveryPolicyConfigurationKey, string> assetDeliveryPolicyConfiguration =
- new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
- {
- {AssetDeliveryPolicyConfigurationKey.PlayReadyLicenseAcquisitionUrl, acquisitionUrl.ToString()},
- };
-
- var assetDeliveryPolicy = _context.AssetDeliveryPolicies.Create(
- "AssetDeliveryPolicy",
- AssetDeliveryPolicyType.DynamicCommonEncryption,
- AssetDeliveryProtocol.SmoothStreaming,
- assetDeliveryPolicyConfiguration);
-
- // Add AssetDelivery Policy to the asset
- asset.DeliveryPolicies.Add(assetDeliveryPolicy);
-
- Console.WriteLine();
- Console.WriteLine("Adding Asset Delivery Policy: " +
- assetDeliveryPolicy.AssetDeliveryPolicyType);
- }
-```
-
-Azure Media Services also enables you to add Widevine encryption. The following example demonstrates both PlayReady and Widevine being added to the asset delivery policy.
-
-```csharp
- static public void CreateAssetDeliveryPolicy(IAsset asset, IContentKey key)
- {
- // Get the PlayReady license service URL.
- Uri acquisitionUrl = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.PlayReadyLicense);
-
- // GetKeyDeliveryUrl for Widevine attaches the KID to the URL.
- // For example: https://amsaccount1.keydelivery.mediaservices.windows.net/Widevine/?KID=268a6dcb-18c8-4648-8c95-f46429e4927c.
- // The WidevineBaseLicenseAcquisitionUrl (used below) also tells Dynamic Encryption
- // to append /?KID=<keyId> to the end of the url when creating the manifest.
- // As a result, the Widevine license acquisition URL would have KID appended twice,
- // so we need to remove the KID that is in the URL returned by GetKeyDeliveryUrl.
-
- Uri widevineUrl = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.Widevine);
- UriBuilder uriBuilder = new UriBuilder(widevineUrl);
- uriBuilder.Query = String.Empty;
- widevineUrl = uriBuilder.Uri;
-
- Dictionary<AssetDeliveryPolicyConfigurationKey, string> assetDeliveryPolicyConfiguration =
- new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
- {
- {AssetDeliveryPolicyConfigurationKey.PlayReadyLicenseAcquisitionUrl, acquisitionUrl.ToString()},
- {AssetDeliveryPolicyConfigurationKey.WidevineLicenseAcquisitionUrl, widevineUrl.ToString()}
-
- };
-
- var assetDeliveryPolicy = _context.AssetDeliveryPolicies.Create(
- "AssetDeliveryPolicy",
- AssetDeliveryPolicyType.DynamicCommonEncryption,
- AssetDeliveryProtocol.Dash,
- assetDeliveryPolicyConfiguration);
-
- // Add AssetDelivery Policy to the asset
- asset.DeliveryPolicies.Add(assetDeliveryPolicy);
-
- }
-```
-> [!NOTE]
-> When you encrypt with Widevine, you can only deliver the content by using DASH. Make sure to specify DASH in the asset delivery protocol.
->
->
-
-## DynamicEnvelopeEncryption asset delivery policy
-The following **CreateAssetDeliveryPolicy** method creates an **AssetDeliveryPolicy** that is configured to apply dynamic envelope encryption (**DynamicEnvelopeEncryption**) to the Smooth Streaming, HLS, and DASH protocols (any protocol you don't specify is blocked from streaming). The method takes two parameters: **Asset** (the asset to which you want to apply the delivery policy) and **IContentKey** (a content key of the **EnvelopeEncryption** type; for more information, see [Creating a content key](media-services-dotnet-create-contentkey.md#envelope_contentkey)).
-
-For information on what values you can specify when creating an AssetDeliveryPolicy, see the [Types used when defining AssetDeliveryPolicy](#types) section.
-
-```csharp
- private static void CreateAssetDeliveryPolicy(IAsset asset, IContentKey key)
- {
-
- // Get the Key Delivery Base Url by removing the Query parameter. The Dynamic Encryption service will
- // automatically add the correct key identifier to the url when it generates the Envelope encrypted content
- // manifest. Omitting the IV will also cause the Dynamic Encryption service to generate a deterministic
- // IV for the content automatically. By using the EnvelopeBaseKeyAcquisitionUrl and omitting the IV, this
- // allows the AssetDelivery policy to be reused by more than one asset.
- //
- Uri keyAcquisitionUri = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.BaselineHttp);
- UriBuilder uriBuilder = new UriBuilder(keyAcquisitionUri);
- uriBuilder.Query = String.Empty;
- keyAcquisitionUri = uriBuilder.Uri;
-
- // The following policy configuration specifies:
- // key url that will have KID=<Guid> appended to the envelope and
- // the Initialization Vector (IV) to use for the envelope encryption.
- Dictionary<AssetDeliveryPolicyConfigurationKey, string> assetDeliveryPolicyConfiguration =
- new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
- {
- {AssetDeliveryPolicyConfigurationKey.EnvelopeBaseKeyAcquisitionUrl, keyAcquisitionUri.ToString()},
- };
-
- IAssetDeliveryPolicy assetDeliveryPolicy =
- _context.AssetDeliveryPolicies.Create(
- "AssetDeliveryPolicy",
- AssetDeliveryPolicyType.DynamicEnvelopeEncryption,
- AssetDeliveryProtocol.SmoothStreaming | AssetDeliveryProtocol.HLS | AssetDeliveryProtocol.Dash,
- assetDeliveryPolicyConfiguration);
-
- // Add AssetDelivery Policy to the asset
- asset.DeliveryPolicies.Add(assetDeliveryPolicy);
-
- Console.WriteLine();
- Console.WriteLine("Adding Asset Delivery Policy: " + assetDeliveryPolicy.AssetDeliveryPolicyType);
- }
-```
-
-## <a id="types"></a>Types used when defining AssetDeliveryPolicy
-
-### <a id="AssetDeliveryProtocol"></a>AssetDeliveryProtocol
-
-The following enum describes values you can set for the asset delivery protocol.
-
-```csharp
- [Flags]
- public enum AssetDeliveryProtocol
- {
- /// <summary>
- /// No protocols.
- /// </summary>
- None = 0x0,
-
- /// <summary>
- /// Smooth streaming protocol.
- /// </summary>
- SmoothStreaming = 0x1,
-
- /// <summary>
- /// MPEG Dynamic Adaptive Streaming over HTTP (DASH)
- /// </summary>
- Dash = 0x2,
-
- /// <summary>
- /// Apple HTTP Live Streaming protocol.
- /// </summary>
- HLS = 0x4,
-
- /// <summary>
- /// Progressive download.
- /// </summary>
- ProgressiveDownload = 0x10,
-
- /// <summary>
- /// Include all protocols.
- /// </summary>
- All = 0xFFFF
- }
-```
-### <a id="AssetDeliveryPolicyType"></a>AssetDeliveryPolicyType
-
-The following enum describes values you can set for the asset delivery policy type.
-```csharp
- public enum AssetDeliveryPolicyType
- {
- /// <summary>
- /// Delivery Policy Type not set. An invalid value.
- /// </summary>
- None,
-
- /// <summary>
- /// The Asset should not be delivered via this AssetDeliveryProtocol.
- /// </summary>
- Blocked,
-
- /// <summary>
- /// Do not apply dynamic encryption to the asset.
- /// </summary>
- NoDynamicEncryption,
-
- /// <summary>
- /// Apply Dynamic Envelope encryption.
- /// </summary>
- DynamicEnvelopeEncryption,
-
- /// <summary>
- /// Apply Dynamic Common encryption.
- /// </summary>
- DynamicCommonEncryption
- }
-```
-### <a id="ContentKeyDeliveryType"></a>ContentKeyDeliveryType
-
-The following enum describes values you can use to configure the delivery method of the content key to the client.
-```csharp
- public enum ContentKeyDeliveryType
- {
- /// <summary>
- /// None.
- /// </summary>
- None = 0,
-
- /// <summary>
- /// Use PlayReady License acquisition protocol.
- /// </summary>
- PlayReadyLicense = 1,
-
- /// <summary>
- /// Use MPEG Baseline HTTP key protocol.
- /// </summary>
- BaselineHttp = 2,
-
- /// <summary>
- /// Use Widevine License acquisition protocol.
- /// </summary>
- Widevine = 3
- }
-```
-### <a id="AssetDeliveryPolicyConfigurationKey"></a>AssetDeliveryPolicyConfigurationKey
-
-The following enum describes values you can set to configure keys used to get specific configuration for an asset delivery policy.
-```csharp
- public enum AssetDeliveryPolicyConfigurationKey
- {
- /// <summary>
- /// No policies.
- /// </summary>
- None,
-
- /// <summary>
- /// Exact Envelope key URL.
- /// </summary>
- EnvelopeKeyAcquisitionUrl,
-
- /// <summary>
- /// Base key url that will have KID=<Guid> appended for Envelope.
- /// </summary>
- EnvelopeBaseKeyAcquisitionUrl,
-
- /// <summary>
- /// The initialization vector to use for envelope encryption in Base64 format.
- /// </summary>
- EnvelopeEncryptionIVAsBase64,
-
- /// <summary>
- /// The PlayReady License Acquisition Url to use for common encryption.
- /// </summary>
- PlayReadyLicenseAcquisitionUrl,
-
- /// <summary>
- /// The PlayReady Custom Attributes to add to the PlayReady Content Header
- /// </summary>
- PlayReadyCustomAttributes,
-
- /// <summary>
- /// The initialization vector to use for envelope encryption.
- /// </summary>
- EnvelopeEncryptionIV,
-
- /// <summary>
- /// Widevine DRM acquisition url
- /// </summary>
- WidevineLicenseAcquisitionUrl
- }
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
media-services Media Services Dotnet Configure Content Key Auth Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-configure-content-key-auth-policy.md
- Title: Configure a content key authorization policy by using the Media Services .NET SDK | Microsoft Docs
-description: Learn how to configure an authorization policy for a content key by using the Media Services .NET SDK.
-Previously updated: 03/10/2021
-# Configure a content key authorization policy by using the Media Services .NET SDK
-
-## Overview
- You can use Azure Media Services to deliver MPEG-DASH, Smooth Streaming, and HTTP Live Streaming (HLS) streams protected with the Advanced Encryption Standard (AES) by using 128-bit encryption keys or [PlayReady digital rights management (DRM)](https://www.microsoft.com/playready/overview/). With Media Services, you also can deliver DASH streams encrypted with Widevine DRM. Both PlayReady and Widevine are encrypted per the common encryption (ISO/IEC 23001-7 CENC) specification.
-
-Media Services also provides a key/license delivery service from which clients can obtain AES keys or PlayReady/Widevine licenses to play the encrypted content.
-
-If you want Media Services to encrypt an asset, you need to associate an encryption key (CommonEncryption or EnvelopeEncryption) with the asset. For more information, see [Create ContentKeys with .NET](media-services-dotnet-create-contentkey.md). You also need to configure authorization policies for the key (as described in this article).
-
-When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content by using AES or DRM encryption. To decrypt the stream, the player requests the key from the key delivery service. To determine whether the user is authorized to get the key, the service evaluates the authorization policies that you specified for the key.
-
-Media Services supports multiple ways of authenticating users who make key requests. The content key authorization policy can have one or more authorization restrictions. The options are open or token restriction. The token-restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the simple web token ([SWT](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_2)) format and the JSON Web Token ([JWT](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_3)) format.
-
-Media Services doesn't provide an STS. You can create a custom STS or use Azure Access Control Service to issue tokens. The STS must be configured to create a token signed with the specified key and to issue the claims that you specified in the token restriction configuration (as described in this article). If the token is valid and the claims in the token match those configured for the content key, the Media Services key delivery service returns the encryption key to the client.
-
-For more information, see the following articles:
-
-- [JWT token authentication](http://www.gtrifonov.com/2015/01/03/jwt-token-authentication-in-azure-media-services-and-dynamic-encryption/)
-- [Integrate Azure Media Services OWIN MVC-based app with Azure Active Directory and restrict content key delivery based on JWT claims](http://www.gtrifonov.com/2015/01/24/mvc-owin-azure-media-services-ad-integration/)
-
-### Some considerations apply
-* When your Media Services account is created, a default streaming endpoint is added to your account in the "Stopped" state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint must be in the "Running" state (a minimal sketch for checking and starting it follows this list).
-* Your asset must contain a set of adaptive bitrate MP4s or adaptive bitrate Smooth Streaming files. For more information, see [Encode an asset](media-services-encode-asset.md).
-* Upload and encode your assets by using the AssetCreationOptions.StorageEncrypted option.
-* If you plan to have multiple content keys that require the same policy configuration, we recommend that you create a single authorization policy and reuse it with multiple content keys.
-* The key delivery service caches ContentKeyAuthorizationPolicy and its related objects (policy options and restrictions) for 15 minutes. You can create a ContentKeyAuthorizationPolicy with a token restriction, test it, and then update the policy to an open restriction, but it takes roughly 15 minutes before the policy switches to the open version.
-* If you add or update your asset's delivery policy, you must delete any existing locator and create a new locator.
-* Currently, you can't encrypt progressive downloads.
-* A Media Services streaming endpoint sets the value of the CORS 'Access-Control-Allow-Origin' header in the preflight response to the wildcard '\*'. This value works well with most players, including Azure Media Player, Roku, and JWPlayer. However, some players that use dashjs don't work: with the credentials mode set to "include", XMLHttpRequest in dashjs doesn't allow the wildcard "\*" as the value of 'Access-Control-Allow-Origin'. As a workaround to this limitation in dashjs, if you host your client from a single domain, Media Services can specify that domain in the preflight response header. For assistance, open a support ticket through the Azure portal.
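-
-As a minimal sketch of the first consideration, you can check the default streaming endpoint's state and start it if necessary (assuming "_context" is your CloudMediaContext, as in the samples below):
-
-```csharp
-IStreamingEndpoint streamingEndpoint = _context.StreamingEndpoints.FirstOrDefault();
-
-if (streamingEndpoint != null && streamingEndpoint.State == StreamingEndpointState.Stopped)
-{
-    // Starting a streaming endpoint is a long-running operation.
-    streamingEndpoint.Start();
-}
-```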
-
-## AES-128 dynamic encryption
-### Open restriction
-Open restriction means the system delivers the key to anyone who makes a key request. This restriction might be useful for testing purposes.
-
-The following example creates an open authorization policy and adds it to the content key:
-```csharp
- static public void AddOpenAuthorizationPolicy(IContentKey contentKey)
- {
- // Create ContentKeyAuthorizationPolicy with Open restrictions
- // and create authorization policy
- IContentKeyAuthorizationPolicy policy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Open Authorization Policy").Result;
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions =
- new List<ContentKeyAuthorizationPolicyRestriction>();
-
- ContentKeyAuthorizationPolicyRestriction restriction =
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "HLS Open Authorization Policy",
- KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
- Requirements = null // no requirements needed for HLS
- };
-
- restrictions.Add(restriction);
-
- IContentKeyAuthorizationPolicyOption policyOption =
- _context.ContentKeyAuthorizationPolicyOptions.Create(
- "policy",
- ContentKeyDeliveryType.BaselineHttp,
- restrictions,
- "");
-
- policy.Options.Add(policyOption);
-
- // Add ContentKeyAuthorizationPolicy to ContentKey
- contentKey.AuthorizationPolicyId = policy.Id;
- IContentKey updatedKey = contentKey.UpdateAsync().Result;
- Console.WriteLine("Adding Key to Asset: Key ID is " + updatedKey.Id);
- }
-```
-
-### Token restriction
-This section describes how to create a content key authorization policy and associate it with the content key. The authorization policy describes what authorization requirements must be met to determine if the user is authorized to receive the key. For example, does the verification key list contain the key that the token was signed with?
-
-To configure the token restriction option, you need to use XML to describe the token's authorization requirements. The token restriction configuration XML must conform to the following XML schema:
-
-#### Token restriction schema
-
-```xml
- <?xml version="1.0" encoding="utf-8"?>
- <xs:schema xmlns:tns="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/TokenRestrictionTemplate/v1" elementFormDefault="qualified" targetNamespace="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/TokenRestrictionTemplate/v1" xmlns:xs="http://www.w3.org/2001/XMLSchema">
- <xs:complexType name="TokenClaim">
- <xs:sequence>
- <xs:element name="ClaimType" nillable="true" type="xs:string" />
- <xs:element minOccurs="0" name="ClaimValue" nillable="true" type="xs:string" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="TokenClaim" nillable="true" type="tns:TokenClaim" />
- <xs:complexType name="TokenRestrictionTemplate">
- <xs:sequence>
- <xs:element minOccurs="0" name="AlternateVerificationKeys" nillable="true" type="tns:ArrayOfTokenVerificationKey" />
- <xs:element name="Audience" nillable="true" type="xs:anyURI" />
- <xs:element name="Issuer" nillable="true" type="xs:anyURI" />
- <xs:element name="PrimaryVerificationKey" nillable="true" type="tns:TokenVerificationKey" />
- <xs:element minOccurs="0" name="RequiredClaims" nillable="true" type="tns:ArrayOfTokenClaim" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="TokenRestrictionTemplate" nillable="true" type="tns:TokenRestrictionTemplate" />
- <xs:complexType name="ArrayOfTokenVerificationKey">
- <xs:sequence>
- <xs:element minOccurs="0" maxOccurs="unbounded" name="TokenVerificationKey" nillable="true" type="tns:TokenVerificationKey" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ArrayOfTokenVerificationKey" nillable="true" type="tns:ArrayOfTokenVerificationKey" />
- <xs:complexType name="TokenVerificationKey">
- <xs:sequence />
- </xs:complexType>
- <xs:element name="TokenVerificationKey" nillable="true" type="tns:TokenVerificationKey" />
- <xs:complexType name="ArrayOfTokenClaim">
- <xs:sequence>
- <xs:element minOccurs="0" maxOccurs="unbounded" name="TokenClaim" nillable="true" type="tns:TokenClaim" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ArrayOfTokenClaim" nillable="true" type="tns:ArrayOfTokenClaim" />
- <xs:complexType name="SymmetricVerificationKey">
- <xs:complexContent mixed="false">
- <xs:extension base="tns:TokenVerificationKey">
- <xs:sequence>
- <xs:element name="KeyValue" nillable="true" type="xs:base64Binary" />
- </xs:sequence>
- </xs:extension>
- </xs:complexContent>
- </xs:complexType>
- <xs:element name="SymmetricVerificationKey" nillable="true" type="tns:SymmetricVerificationKey" />
- </xs:schema>
-```
-When you configure the token-restricted policy, you must specify the primary verification key, issuer, and audience parameters. The primary verification key contains the key that the token was signed with. The issuer is the STS that issues the token. The audience (sometimes called scope) describes the intent of the token or the resource the token authorizes access to. The Media Services key delivery service validates that these values in the token match the values in the template.
-
-When you use the Media Services SDK for .NET, you can use the TokenRestrictionTemplate class to generate the token restriction template.
-The following example creates an authorization policy with a token restriction. In this example, the client must present a token that is signed with the specified key (VerificationKey), issued by the specified token issuer, and contains the required claims.
-```csharp
- public static string AddTokenRestrictedAuthorizationPolicy(IContentKey contentKey)
- {
- string tokenTemplateString = GenerateTokenRequirements();
-
- IContentKeyAuthorizationPolicy policy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("HLS token restricted authorization policy").Result;
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions =
- new List<ContentKeyAuthorizationPolicyRestriction>();
-
- ContentKeyAuthorizationPolicyRestriction restriction =
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Token Authorization Policy",
- KeyRestrictionType = (int)ContentKeyRestrictionType.TokenRestricted,
- Requirements = tokenTemplateString
- };
-
- restrictions.Add(restriction);
-
- //You could have multiple options
- IContentKeyAuthorizationPolicyOption policyOption =
- _context.ContentKeyAuthorizationPolicyOptions.Create(
- "Token option for HLS",
- ContentKeyDeliveryType.BaselineHttp,
- restrictions,
- null // no key delivery data is needed for HLS
- );
-
- policy.Options.Add(policyOption);
-
- // Add ContentKeyAuthorizationPolicy to ContentKey
- contentKey.AuthorizationPolicyId = policy.Id;
- IContentKey updatedKey = contentKey.UpdateAsync().Result;
- Console.WriteLine("Adding Key to Asset: Key ID is " + updatedKey.Id);
-
- return tokenTemplateString;
- }
-
- static private string GenerateTokenRequirements()
- {
- TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.SWT);
-
- template.PrimaryVerificationKey = new SymmetricVerificationKey();
- template.AlternateVerificationKeys.Add(new SymmetricVerificationKey());
- template.Audience = _sampleAudience.ToString();
- template.Issuer = _sampleIssuer.ToString();
-
- template.RequiredClaims.Add(TokenClaim.ContentKeyIdentifierClaim);
-
- return TokenRestrictionTemplateSerializer.Serialize(template);
- }
-```
-#### Test token
-To get a test token based on the token restriction that was used for the key authorization policy, do the following:
-```csharp
- // Deserializes a string containing an Xml representation of a TokenRestrictionTemplate
- // back into a TokenRestrictionTemplate class instance.
- TokenRestrictionTemplate tokenTemplate =
- TokenRestrictionTemplateSerializer.Deserialize(tokenTemplateString);
-
- // Generate a test token based on the data in the given TokenRestrictionTemplate.
- // Note, you need to pass the key id Guid because we specified
- // TokenClaim.ContentKeyIdentifierClaim during the creation of TokenRestrictionTemplate.
- Guid rawkey = EncryptionUtils.GetKeyIdAsGuid(key.Id);
-
- //The GenerateTestToken method returns the token without the word "Bearer" in front,
- //so you have to add it in front of the token string.
- string testToken = TokenRestrictionTemplateSerializer.GenerateTestToken(tokenTemplate, null, rawkey);
- Console.WriteLine("The authorization token is:\nBearer {0}", testToken);
- Console.WriteLine();
-```
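-
-To verify the policy end to end, you can then request the key directly from the key delivery service by using the test token. The following is a minimal sketch; it assumes "keyDeliveryUrl" came from key.GetKeyDeliveryUrl(ContentKeyDeliveryType.BaselineHttp) and "testToken" is the value generated above.
-
-```csharp
-HttpWebRequest request = (HttpWebRequest)WebRequest.Create(keyDeliveryUrl);
-request.Method = "POST";
-request.ContentType = "application/octet-stream";
-request.ContentLength = 0;
-// The key delivery service expects the token in the Authorization header, prefixed with "Bearer".
-request.Headers.Add("Authorization", "Bearer " + testToken);
-
-using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
-{
-    Console.WriteLine("Key delivery returned: {0}", response.StatusCode);
-}
-```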
-
-## PlayReady dynamic encryption
-You can use Media Services to configure the rights and restrictions that you want the PlayReady DRM runtime to enforce when a user tries to play back protected content.
-
-When you protect your content with PlayReady, one of the things you need to specify in your authorization policy is an XML string that defines the [PlayReady license template](media-services-playready-license-template-overview.md). In the Media Services SDK for .NET, the PlayReadyLicenseResponseTemplate and PlayReadyLicenseTemplate classes help you define the PlayReady license template.
-
-To learn how to encrypt your content with PlayReady and Widevine, see [Use PlayReady and/or Widevine dynamic common encryption](media-services-protect-with-playready-widevine.md).
-
-### Open restriction
-Open restriction means the system delivers the key to anyone who makes a key request. This restriction might be useful for testing purposes.
-
-The following example creates an open authorization policy and adds it to the content key:
-
-```csharp
- static public void AddOpenAuthorizationPolicy(IContentKey contentKey)
- {
-
- // Create ContentKeyAuthorizationPolicy with Open restrictions
- // and create authorization policy
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions = new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Open",
- KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
- Requirements = null
- }
- };
-
- // Configure PlayReady license template.
- string newLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
-
- IContentKeyAuthorizationPolicyOption policyOption =
- _context.ContentKeyAuthorizationPolicyOptions.Create("",
- ContentKeyDeliveryType.PlayReadyLicense,
- restrictions, newLicenseTemplate);
-
- IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Deliver Common Content Key with no restrictions").
- Result;
-
- contentKeyAuthorizationPolicy.Options.Add(policyOption);
-
- // Associate the content key authorization policy with the content key.
- contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
- contentKey = contentKey.UpdateAsync().Result;
- }
-```
-
-### Token restriction
-To configure the token restriction option, you need to use XML to describe the token's authorization requirements. The token restriction configuration XML must conform to the XML schema shown in the "Token restriction schema" section.
-
-```csharp
- public static string AddTokenRestrictedAuthorizationPolicy(IContentKey contentKey)
- {
- string tokenTemplateString = GenerateTokenRequirements();
-
- IContentKeyAuthorizationPolicy policy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("HLS token restricted authorization policy").Result;
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions = new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Token Authorization Policy",
- KeyRestrictionType = (int)ContentKeyRestrictionType.TokenRestricted,
- Requirements = tokenTemplateString,
- }
- };
-
- // Configure PlayReady license template.
- string newLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
-
- IContentKeyAuthorizationPolicyOption policyOption =
- _context.ContentKeyAuthorizationPolicyOptions.Create("Token option",
- ContentKeyDeliveryType.PlayReadyLicense,
- restrictions, newLicenseTemplate);
-
-
- // Add the policy option to the policy.
- policy.Options.Add(policyOption);
-
- // Associate the content key authorization policy with the content key.
- contentKey.AuthorizationPolicyId = policy.Id;
- contentKey = contentKey.UpdateAsync().Result;
-
- return tokenTemplateString;
- }
-
- static private string GenerateTokenRequirements()
- {
-
- TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.SWT);
-
- template.PrimaryVerificationKey = new SymmetricVerificationKey();
- template.AlternateVerificationKeys.Add(new SymmetricVerificationKey());
- template.Audience = _sampleAudience.ToString();
- template.Issuer = _sampleIssuer.ToString();
-
- template.RequiredClaims.Add(TokenClaim.ContentKeyIdentifierClaim);
-
- return TokenRestrictionTemplateSerializer.Serialize(template);
- }
-
- static private string ConfigurePlayReadyLicenseTemplate()
- {
- // The following code configures PlayReady License Template using .NET classes
- // and returns the XML string.
-
- //The PlayReadyLicenseResponseTemplate class represents the template for the response sent back to the end user.
- //It contains a field for a custom data string between the license server and the application
- //(may be useful for custom app logic) as well as a list of one or more license templates.
- PlayReadyLicenseResponseTemplate responseTemplate = new PlayReadyLicenseResponseTemplate();
-
- // The PlayReadyLicenseTemplate class represents a license template for creating PlayReady licenses
- // to be returned to the end users.
- //It contains the data on the content key in the license and any rights or restrictions to be
- //enforced by the PlayReady DRM runtime when using the content key.
- PlayReadyLicenseTemplate licenseTemplate = new PlayReadyLicenseTemplate();
- //Configure whether the license is persistent (saved in persistent storage on the client)
- //or non-persistent (only held in memory while the player is using the license).
- licenseTemplate.LicenseType = PlayReadyLicenseType.Nonpersistent;
-
- // AllowTestDevices controls whether test devices can use the license or not.
- // If true, the MinimumSecurityLevel property of the license
- // is set to 150. If false (the default), the MinimumSecurityLevel property of the license is set to 2000.
- licenseTemplate.AllowTestDevices = true;
-
- // You can also configure the Play Right in the PlayReady license by using the PlayReadyPlayRight class.
- // It grants the user the ability to play back the content subject to the zero or more restrictions
- // configured in the license and on the PlayRight itself (for playback specific policy).
- // Much of the policy on the PlayRight has to do with output restrictions
- // which control the types of outputs that the content can be played over and
- // any restrictions that must be put in place when using a given output.
- // For example, if the DigitalVideoOnlyContentRestriction is enabled,
- //then the DRM runtime will only allow the video to be displayed over digital outputs
- //(analog video outputs won't be allowed to pass the content).
-
- //IMPORTANT: These types of restrictions can be very powerful but can also affect the consumer experience.
- // If the output protections are configured too restrictively,
- // the content might be unplayable on some clients. For more information, see the PlayReady Compliance Rules document.
-
- // For example:
- //licenseTemplate.PlayRight.AgcAndColorStripeRestriction = new AgcAndColorStripeRestriction(1);
-
- responseTemplate.LicenseTemplates.Add(licenseTemplate);
-
- return MediaServicesLicenseTemplateSerializer.Serialize(responseTemplate);
- }
-```
-
-To get a test token based on the token restriction that was used for the key authorization policy, see the "[Test token](#test-token)" section.
-
-## <a id="types"></a>Types used when you define ContentKeyAuthorizationPolicy
-### <a id="ContentKeyRestrictionType"></a>ContentKeyRestrictionType
-
-```csharp
- public enum ContentKeyRestrictionType
- {
- Open = 0,
- TokenRestricted = 1,
- IPRestricted = 2,
- }
-```
-
-### <a id="ContentKeyDeliveryType"></a>ContentKeyDeliveryType
-
-```csharp
- public enum ContentKeyDeliveryType
- {
- None = 0,
- PlayReadyLicense = 1,
- BaselineHttp = 2,
- Widevine = 3
- }
-```
-
-### <a id="TokenType"></a>TokenType
-
-```csharp
- public enum TokenType
- {
- Undefined = 0,
- SWT = 1,
- JWT = 2,
- }
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-Now that you have configured the content key's authorization policy, see [Configure an asset delivery policy](media-services-dotnet-configure-asset-delivery-policy.md).
media-services Media Services Dotnet Create Contentkey https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-create-contentkey.md
- Title: Create ContentKeys with .NET
-description: This article demonstrates how to create content keys by using .NET. These keys provide secure access to assets.
-Previously updated: 03/10/2021
-# Create ContentKeys with .NET
-
-
-> [!div class="op_single_selector"]
-> * [REST](media-services-rest-create-contentkey.md)
-> * [.NET](media-services-dotnet-create-contentkey.md)
->
->
-
-Media Services enables you to create and deliver encrypted assets. A **ContentKey** provides secure access to your **Asset**s.
-
-When you create a new asset (for example, before you [upload files](media-services-dotnet-upload-files.md)), you can specify the following encryption options: **StorageEncrypted**, **CommonEncryptionProtected**, or **EnvelopeEncryptionProtected**.
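-
-For example, the following one-line sketch creates a storage-encrypted asset (assuming "_context" is your CloudMediaContext):
-
-```csharp
-// Files uploaded to this asset are encrypted at rest; the SDK creates and
-// links the StorageEncryption ContentKey automatically.
-IAsset asset = _context.Assets.Create("MyEncryptedAsset", AssetCreationOptions.StorageEncrypted);
-```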
-
-When you deliver assets to your clients, you can [configure for assets to be dynamically encrypted](media-services-dotnet-configure-asset-delivery-policy.md) with one of the following two encryptions: **DynamicEnvelopeEncryption** or **DynamicCommonEncryption**.
-
-Encrypted assets have to be associated with **ContentKey**s. This article describes how to create a content key.
-
-> [!NOTE]
-> When you create a new **StorageEncrypted** asset by using the Media Services .NET SDK, the **ContentKey** is automatically created and linked with the asset.
->
->
-
-## ContentKeyType
-One of the values that you must set when creating a content key is the content key type. Choose from one of the following values.
-
-```csharp
- public enum ContentKeyType
- {
- /// <summary>
- /// Specifies a content key for common encryption.
- /// </summary>
- /// <remarks>This is the default value.</remarks>
- CommonEncryption = 0,
-
- /// <summary>
- /// Specifies a content key for storage encryption.
- /// </summary>
- StorageEncryption = 1,
-
- /// <summary>
- /// Specifies a content key for configuration encryption.
- /// </summary>
- ConfigurationEncryption = 2,
-
- /// <summary>
- /// Specifies a content key for Envelope encryption. Only used internally.
- /// </summary>
- EnvelopeEncryption = 4
- }
-```
-
-## <a id="envelope_contentkey"></a>Create envelope type ContentKey
-The following code snippet creates a content key of the envelope encryption type. It then associates the key with the specified asset.
-
-```csharp
- static public IContentKey CreateEnvelopeTypeContentKey(IAsset asset)
- {
- // Create envelope encryption content key
- Guid keyId = Guid.NewGuid();
- byte[] contentKey = GetRandomBuffer(16);
-
- IContentKey key = _context.ContentKeys.Create(
- keyId,
- contentKey,
- "ContentKey",
- ContentKeyType.EnvelopeEncryption);
-
- asset.ContentKeys.Add(key);
-
- return key;
- }
-
- static private byte[] GetRandomBuffer(int size)
- {
- byte[] randomBytes = new byte[size];
- using (RNGCryptoServiceProvider rng = new RNGCryptoServiceProvider())
- {
- rng.GetBytes(randomBytes);
- }
-
- return randomBytes;
- }
-
-// Call the method as follows (assuming "encryptedAsset" is the asset to protect):
-
- IContentKey key = CreateEnvelopeTypeContentKey(encryptedAsset);
-```
-
-## <a id="common_contentkey"></a>Create common type ContentKey
-The following code snippet creates a content key of the common encryption type. It then associates the key with the specified asset.
-
-```csharp
- static public IContentKey CreateCommonTypeContentKey(IAsset asset)
- {
- // Create common encryption content key
- Guid keyId = Guid.NewGuid();
- byte[] contentKey = GetRandomBuffer(16);
-
- IContentKey key = _context.ContentKeys.Create(
- keyId,
- contentKey,
- "ContentKey",
- ContentKeyType.CommonEncryption);
-
- // Associate the key with the asset.
- asset.ContentKeys.Add(key);
-
- return key;
- }
-
- static private byte[] GetRandomBuffer(int length)
- {
- var returnValue = new byte[length];
-
- using (var rng =
- new System.Security.Cryptography.RNGCryptoServiceProvider())
- {
- rng.GetBytes(returnValue);
- }
-
- return returnValue;
- }
-// Call the method as follows (assuming "encryptedAsset" is the asset to protect):
-
- IContentKey key = CreateCommonTypeContentKey(encryptedAsset);
-```
-
media-services Media Services Dotnet Creating Live Encoder Enabled Channel https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-creating-live-encoder-enabled-channel.md
- Title: How to perform live streaming using Azure Media Services to create multi-bitrate streams with .NET | Microsoft Docs
-description: This tutorial walks you through the steps of creating a Channel that receives a single-bitrate live stream and encodes it to multi-bitrate stream using .NET SDK.
-Previously updated: 03/10/2021
-# How to perform live streaming using Azure Media Services to create multi-bitrate streams with .NET
-
-> [!div class="op_single_selector"]
-> * [Portal](media-services-portal-creating-live-encoder-enabled-channel.md)
-> * [.NET](media-services-dotnet-creating-live-encoder-enabled-channel.md)
-> * [REST API](/rest/api/media/operations/channel)
->
-> [!NOTE]
-> To complete this tutorial, you need an Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/?WT.mc_id=A261C142F).
->
->
-
-## Overview
-This tutorial walks you through the steps of creating a **Channel** that receives a single-bitrate live stream and encodes it to multi-bitrate stream.
-
-For more conceptual information related to Channels that are enabled for live encoding, see [Live streaming using Azure Media Services to create multi-bitrate streams](media-services-manage-live-encoder-enabled-channels.md).
-
-## Common Live Streaming Scenario
-The following steps describe tasks involved in creating common live streaming applications.
-
-> [!NOTE]
-> Currently, the max recommended duration of a live event is 8 hours. Please contact amshelp@microsoft.com if you need to run a Channel for longer periods of time.
-
-1. Connect a video camera to a computer. Launch and configure an on-premises live encoder that can output a single bitrate stream in one of the following protocols: RTMP or Smooth Streaming. For more information, see [Azure Media Services RTMP Support and Live Encoders](https://go.microsoft.com/fwlink/?LinkId=532824).
-
- This step could also be performed after you create your Channel.
-
-2. Create and start a Channel.
-3. Retrieve the Channel ingest URL.
-
- The ingest URL is used by the live encoder to send the stream to the Channel.
-
-4. Retrieve the Channel preview URL.
-
- Use this URL to verify that your channel is properly receiving the live stream.
-
-5. Create an asset.
-6. If you want the asset to be dynamically encrypted during playback, do the following:
-7. Create a content key.
-8. Configure the content key's authorization policy.
-9. Configure asset delivery policy (used by dynamic packaging and dynamic encryption).
-10. Create a program and configure it to use the asset that you created.
-11. Publish the asset associated with the program by creating an OnDemand locator.
-
- >[!NOTE]
- >When your AMS account is created a **default** streaming endpoint is added to your account in the **Stopped** state. The streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-12. Start the program when you are ready to start streaming and archiving.
-13. Optionally, the live encoder can be signaled to start an advertisement. The advertisement is inserted in the output stream.
-14. Stop the program whenever you want to stop streaming and archiving the event.
-15. Delete the Program (and optionally delete the asset).
-
-## What you'll learn
-This article shows you how to execute different operations on channels and programs by using the Media Services .NET SDK. Because many of these operations are long-running, the .NET APIs that manage long-running operations are used.
-
-The article shows how to do the following:
-
-1. Create and start a channel. Long-running APIs are used.
-2. Get the channel's ingest (input) endpoint. Provide this endpoint to the encoder that sends the single-bitrate live stream.
-3. Get the preview endpoint. This endpoint is used to preview your stream.
-4. Create an asset that is used to store your content. The asset delivery policies should be configured as well, as shown in this example.
-5. Create a program and configure it to use the asset that was created earlier. Start the program. Long-running APIs are used.
-6. Create a locator for the asset, so the content gets published and can be streamed to your clients.
-7. Show and hide slates. Start and stop advertisements. Long-running APIs are used.
-8. Clean up your channel and all the associated resources.
-
-## Prerequisites
-The following are required to complete the tutorial.
-
-* An Azure account. If you don't have an account, you can create a free trial account in just a couple of minutes. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/?WT.mc_id=A261C142F). You get credits that can be used to try out paid Azure services. Even after the credits are used up, you can keep the account and use free Azure services and features, such as the Web Apps feature in Azure App Service.
-* A Media Services account. To create a Media Services account, see [Create Account](media-services-portal-create-account.md).
-* Visual Studio 2010 SP1 (Professional, Premium, Ultimate, or Express) or later versions.
-* You must use Media Services .NET SDK version 3.2.0.0 or newer.
-* A webcam and an encoder that can send a single bitrate live stream.
-
-## Considerations
-* Currently, the max recommended duration of a live event is 8 hours. Please contact amshelp@microsoft.com if you need to run a Channel for longer periods of time.
-* There is a limit of 1,000,000 policies for different AMS policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). Use the same policy ID if you always use the same days / access permissions, for example, for policies for locators that are intended to remain in place for a long time (non-upload policies); a minimal sketch follows this list. For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) article.
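-
-The following minimal sketch shows one way to reuse a single read access policy across locators instead of creating a new policy per locator. The policy name "Standard 30-day read" is hypothetical.
-
-```csharp
-IAccessPolicy readPolicy =
-    _context.AccessPolicies.Where(p => p.Name == "Standard 30-day read").FirstOrDefault()
-    ?? _context.AccessPolicies.Create("Standard 30-day read", TimeSpan.FromDays(30), AccessPermissions.Read);
-
-ILocator locator = _context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset, readPolicy);
-```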
-
-## Download sample
-
-You can download the sample that is described in this article from [here](https://azure.microsoft.com/documentation/samples/media-services-dotnet-encode-live-stream-with-ams-clear/).
-
-## Set up for development with Media Services SDK for .NET
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
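-
-As a minimal sketch, the appSettings section of the app.config file could look like the following. The key names match the ConfigurationManager reads in the sample below; the values are placeholders.
-
-```xml
-<configuration>
-  <appSettings>
-    <add key="AMSAADTenantDomain" value="tenant.onmicrosoft.com" />
-    <add key="AMSRESTAPIEndpoint" value="https://accountname.restv2.region.media.azure.net/api/" />
-    <add key="AMSClientId" value="00000000-0000-0000-0000-000000000000" />
-    <add key="AMSClientSecret" value="client-secret-value" />
-  </appSettings>
-</configuration>
-```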
-
-## Code example
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.Configuration;
-using System.IO;
-using System.Linq;
-using System.Net;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Microsoft.WindowsAzure.MediaServices.Client.DynamicEncryption;
-
-namespace EncodeLiveStreamWithAmsClear
-{
- class Program
- {
- private const string ChannelName = "channel001";
- private const string AssetName = "asset001";
- private const string ProgramName = "program001";
-
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- IChannel channel = CreateAndStartChannel();
-
- // The channel's input endpoint:
- string ingestUrl = channel.Input.Endpoints.FirstOrDefault().Url.ToString();
-
- Console.WriteLine("Intest URL: {0}", ingestUrl);
-
- // Use the previewEndpoint to preview and verify
- // that the input from the encoder is actually reaching the Channel.
- string previewEndpoint = channel.Preview.Endpoints.FirstOrDefault().Url.ToString();
-
- Console.WriteLine("Preview URL: {0}", previewEndpoint);
-
- // When Live Encoding is enabled, you can now get a preview of the live feed as it reaches the Channel.
- // This can be a valuable tool to check whether your live feed is actually reaching the Channel.
- // The thumbnail is exposed via the same end-point as the Channel Preview URL.
- string thumbnailUri = new UriBuilder
- {
- Scheme = Uri.UriSchemeHttps,
- Host = channel.Preview.Endpoints.FirstOrDefault().Url.Host,
- Path = "thumbnails/input.jpg"
- }.Uri.ToString();
-
- Console.WriteLine("Thumbain URL: {0}", thumbnailUri);
-
- // Once you previewed your stream and verified that it is flowing into your Channel,
- // you can create an event by creating an Asset, Program, and Streaming Locator.
- IAsset asset = CreateAndConfigureAsset();
-
- IProgram program = CreateAndStartProgram(channel, asset);
-
- ILocator locator = CreateLocatorForAsset(program.Asset, program.ArchiveWindowLength);
-
- // You can use slates and ads only if the channel type is Standard.
- StartStopAdsSlates(channel);
-
- // Once you are done streaming, clean up your resources.
- Cleanup(channel);
- }
-
- public static IChannel CreateAndStartChannel()
- {
- var channelInput = CreateChannelInput();
- var channelPreview = CreateChannelPreview();
- var channelEncoding = CreateChannelEncoding();
-
- ChannelCreationOptions options = new ChannelCreationOptions
- {
- EncodingType = ChannelEncodingType.Standard,
- Name = ChannelName,
- Input = channelInput,
- Preview = channelPreview,
- Encoding = channelEncoding
- };
-
- Log("Creating channel");
- IOperation channelCreateOperation = _context.Channels.SendCreateOperation(options);
- string channelId = TrackOperation(channelCreateOperation, "Channel create");
-
- IChannel channel = _context.Channels.FirstOrDefault(c => c.Id == channelId);
-
- Log("Starting channel");
- var channelStartOperation = channel.SendStartOperation();
- TrackOperation(channelStartOperation, "Channel start");
-
- return channel;
- }
-
- /// <summary>
- /// Create channel input, used in channel creation options.
- /// </summary>
- /// <returns></returns>
- private static ChannelInput CreateChannelInput()
- {
- // When creating a Channel, you can specify allowed IP addresses in one of the following formats:
- // IpV4 address with 4 numbers
- // CIDR address range
-
- return new ChannelInput
- {
- StreamingProtocol = StreamingProtocol.FragmentedMP4,
- AccessControl = new ChannelAccessControl
- {
- IPAllowList = new List<IPRange>
- {
- new IPRange
- {
- Name = "TestChannelInput001",
- Address = IPAddress.Parse("0.0.0.0"),
- SubnetPrefixLength = 0
- }
- }
- }
- };
- }
-
- /// <summary>
- /// Create channel preview, used in channel creation options.
- /// </summary>
- /// <returns></returns>
- private static ChannelPreview CreateChannelPreview()
- {
- // When creating a Channel, you can specify allowed IP addresses in one of the following formats:
- // IpV4 address with 4 numbers
- // CIDR address range
-
- return new ChannelPreview
- {
- AccessControl = new ChannelAccessControl
- {
- IPAllowList = new List<IPRange>
- {
- new IPRange
- {
- Name = "TestChannelPreview001",
- Address = IPAddress.Parse("0.0.0.0"),
- SubnetPrefixLength = 0
- }
- }
- }
- };
- }
-
- /// <summary>
- /// Create channel encoding, used in channel creation options.
- /// </summary>
- /// <returns></returns>
- private static ChannelEncoding CreateChannelEncoding()
- {
- return new ChannelEncoding
- {
- SystemPreset = "Default720p",
- IgnoreCea708ClosedCaptions = false,
- AdMarkerSource = AdMarkerSource.Api
- };
- }
-
- /// <summary>
- /// Create an asset and configure asset delivery policies.
- /// </summary>
- /// <returns></returns>
- public static IAsset CreateAndConfigureAsset()
- {
- IAsset asset = _context.Assets.Create(AssetName, AssetCreationOptions.None);
-
- IAssetDeliveryPolicy policy =
- _context.AssetDeliveryPolicies.Create("Clear Policy",
- AssetDeliveryPolicyType.NoDynamicEncryption,
- AssetDeliveryProtocol.HLS | AssetDeliveryProtocol.SmoothStreaming | AssetDeliveryProtocol.Dash, null);
-
- asset.DeliveryPolicies.Add(policy);
-
- return asset;
- }
-
- /// <summary>
- /// Create a Program on the Channel. You can have multiple Programs that overlap or are sequential;
- /// however each Program must have a unique name within your Media Services account.
- /// </summary>
- /// <param name="channel"></param>
- /// <param name="asset"></param>
- /// <returns></returns>
- public static IProgram CreateAndStartProgram(IChannel channel, IAsset asset)
- {
- IProgram program = channel.Programs.Create(ProgramName, TimeSpan.FromHours(3), asset.Id);
- Log("Program created", program.Id);
-
- Log("Starting program");
- var programStartOperation = program.SendStartOperation();
- TrackOperation(programStartOperation, "Program start");
-
- return program;
- }
-
- /// <summary>
- /// Create locators in order to be able to publish and stream the video.
- /// </summary>
- /// <param name="asset"></param>
- /// <param name="ArchiveWindowLength"></param>
- /// <returns></returns>
- public static ILocator CreateLocatorForAsset(IAsset asset, TimeSpan ArchiveWindowLength)
- {
- // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.
- var locator = _context.Locators.CreateLocator
- (
- LocatorType.OnDemandOrigin,
- asset,
- _context.AccessPolicies.Create
- (
- "Live Stream Policy",
- ArchiveWindowLength,
- AccessPermissions.Read
- )
- );
-
- return locator;
- }
-
- /// <summary>
- /// Perform operations on slates.
- /// </summary>
- /// <param name="channel"></param>
- public static void StartStopAdsSlates(IChannel channel)
- {
- int cueId = new Random().Next(int.MaxValue);
- var path = Path.GetFullPath(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"..\\..\\SlateJPG\\DefaultAzurePortalSlate.jpg"));
-
- Log("Creating asset");
- var slateAsset = _context.Assets.Create("Slate test asset " + DateTime.Now.ToString("yyyy-MM-dd HH-mm"), AssetCreationOptions.None);
- Log("Slate asset created", slateAsset.Id);
-
- Log("Uploading file");
- var assetFile = slateAsset.AssetFiles.Create("DefaultAzurePortalSlate.jpg");
- assetFile.Upload(path);
- assetFile.IsPrimary = true;
- assetFile.Update();
-
- Log("Showing slate");
- var showSlateOperation = channel.SendShowSlateOperation(TimeSpan.FromMinutes(1), slateAsset.Id);
- TrackOperation(showSlateOperation, "Show slate");
-
- Log("Hiding slate");
- var hideSlateOperation = channel.SendHideSlateOperation();
- TrackOperation(hideSlateOperation, "Hide slate");
-
- Log("Starting ad");
- var startAdOperation = channel.SendStartAdvertisementOperation(TimeSpan.FromMinutes(1), cueId, false);
- TrackOperation(startAdOperation, "Start ad");
-
- Log("Ending ad");
- var endAdOperation = channel.SendEndAdvertisementOperation(cueId);
- TrackOperation(endAdOperation, "End ad");
-
- Log("Deleting slate asset");
- slateAsset.Delete();
- }
-
- /// <summary>
- /// Clean up resources associated with the channel.
- /// </summary>
- /// <param name="channel"></param>
- public static void Cleanup(IChannel channel)
- {
- IAsset asset;
- if (channel != null)
- {
- foreach (var program in channel.Programs)
- {
- asset = _context.Assets.FirstOrDefault(se => se.Id == program.AssetId);
-
- Log("Stopping program");
- var programStopOperation = program.SendStopOperation();
- TrackOperation(programStopOperation, "Program stop");
-
- program.Delete();
-
- if (asset != null)
- {
- Log("Deleting locators");
- foreach (var l in asset.Locators)
- l.Delete();
-
- Log("Deleting asset");
- asset.Delete();
- }
- }
-
- Log("Stopping channel");
- var channelStopOperation = channel.SendStopOperation();
- TrackOperation(channelStopOperation, "Channel stop");
-
- Log("Deleting channel");
- var channelDeleteOperation = channel.SendDeleteOperation();
- TrackOperation(channelDeleteOperation, "Channel delete");
- }
- }
-
- /// <summary>
- /// Track long running operations.
- /// </summary>
- /// <param name="operation"></param>
- /// <param name="description"></param>
- /// <returns></returns>
- public static string TrackOperation(IOperation operation, string description)
- {
- string entityId = null;
- bool isCompleted = false;
-
- Log("starting to track ", null, operation.Id);
- while (isCompleted == false)
- {
- operation = _context.Operations.GetOperation(operation.Id);
- isCompleted = IsCompleted(operation, out entityId);
- System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30));
- }
- // If we got here, the operation succeeded.
- Log(description + " in completed", operation.TargetEntityId, operation.Id);
-
- return entityId;
- }
-
- /// <summary>
- /// Checks if the operation has been completed.
- /// If the operation succeeded, the created entity Id is returned in the out parameter.
- /// </summary>
- /// <param name="operationId">The operation Id.</param>
- /// <param name="channel">
- /// If the operation succeeded,
- /// the entity Id associated with the successful operation is returned in the out parameter.</param>
- /// <returns>Returns false if the operation is still in progress; otherwise, true.</returns>
- private static bool IsCompleted(IOperation operation, out string entityId)
- {
- bool completed = false;
-
- entityId = null;
-
- switch (operation.State)
- {
- case OperationState.Failed:
- // Handle the failure.
- // For example, throw an exception.
- // Use the following information in the exception: operationId, operation.ErrorMessage.
- Log("operation failed", operation.TargetEntityId, operation.Id);
- break;
- case OperationState.Succeeded:
- completed = true;
- entityId = operation.TargetEntityId;
- break;
- case OperationState.InProgress:
- completed = false;
- Log("operation in progress", operation.TargetEntityId, operation.Id);
- break;
- }
- return completed;
- }
-
- private static void Log(string action, string entityId = null, string operationId = null)
- {
- Console.WriteLine(
- "{0,-21}{1,-51}{2,-51}{3,-51}",
- DateTime.Now.ToString("yyyy'-'MM'-'dd HH':'mm':'ss"),
- action,
- entityId ?? string.Empty,
- operationId ?? string.Empty);
- }
- }
-}
-```
-
-## Next step
-Review Media Services learning paths.
media-services Media Services Dotnet Dynamic Manifest https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-dynamic-manifest.md
- Title: Creating Filters with Azure Media Services .NET SDK
-description: This topic describes how to create filters so your client can use them to stream specific sections of a stream. Media Services .NET SDK creates dynamic manifests to achieve this selective streaming.
-Previously updated: 03/10/2021
-# Creating Filters with Media Services .NET SDK
-
-> [!div class="op_single_selector"]
-> * [.NET](media-services-dotnet-dynamic-manifest.md)
-> * [REST](media-services-rest-dynamic-manifest.md)
->
->
-
-Starting with the 2.17 release, Media Services enables you to define filters for your assets. These filters are server-side rules that allow your customers to do things like play back only a section of a video (instead of the whole video), or specify only the subset of audio and video renditions that your customer's device can handle (instead of all the renditions associated with the asset). This filtering of your assets is achieved through **Dynamic Manifest**s, which are created upon your customer's request to stream a video based on the specified filter(s).
-
-For more detailed information related to filters and Dynamic Manifest, see [Dynamic manifests overview](media-services-dynamic-manifest-overview.md).
-
-This article shows how to use Media Services .NET SDK to create, update, and delete filters.
-
-Note that if you update a filter, it can take up to two minutes for the streaming endpoint to refresh the rules. If the content was served using this filter (and cached in proxies and CDN caches), updating the filter can result in player failures. Always clear the cache after updating the filter. If this option isn't possible, consider using a different filter.
-
-## Types used to create filters
-The following types are used when creating filters:
-
-* **IStreamingFilter**. This type is based on the following REST API [Filter](/rest/api/media/operations/filter)
-* **IStreamingAssetFilter**. This type is based on the following REST API [AssetFilter](/rest/api/media/operations/assetfilter)
-* **PresentationTimeRange**. This type is based on the following REST API [PresentationTimeRange](/rest/api/media/operations/presentationtimerange)
-* **FilterTrackSelectStatement** and **IFilterTrackPropertyCondition**. These types are based on the following REST APIs [FilterTrackSelect and FilterTrackPropertyCondition](/rest/api/media/operations/filtertrackselect)
-
-## Create/Update/Read/Delete global filters
-The following code shows how to use .NET to create, update, read, and delete global filters.
-
-```csharp
- string filterName = "GlobalFilter_" + Guid.NewGuid().ToString();
-
- List<FilterTrackSelectStatement> filterTrackSelectStatements = new List<FilterTrackSelectStatement>();
-
- FilterTrackSelectStatement filterTrackSelectStatement = new FilterTrackSelectStatement();
- filterTrackSelectStatement.PropertyConditions = new List<IFilterTrackPropertyCondition>();
- filterTrackSelectStatement.PropertyConditions.Add(new FilterTrackNameCondition("Track Name", FilterTrackCompareOperator.NotEqual));
- filterTrackSelectStatement.PropertyConditions.Add(new FilterTrackBitrateRangeCondition(new FilterTrackBitrateRange(0, 1), FilterTrackCompareOperator.NotEqual));
- filterTrackSelectStatement.PropertyConditions.Add(new FilterTrackTypeCondition(FilterTrackType.Audio, FilterTrackCompareOperator.NotEqual));
- filterTrackSelectStatements.Add(filterTrackSelectStatement);
-
- // Create
- IStreamingFilter filter = _context.Filters.Create(filterName, new PresentationTimeRange(), filterTrackSelectStatements);
-
- // Update
- filter.PresentationTimeRange = new PresentationTimeRange(timescale: 500);
- filter.Update();
-
- // Read
- var filterUpdated = _context.Filters.FirstOrDefault();
- Console.WriteLine(filterUpdated.Name);
-
- // Delete
- filter.Delete();
-```
-
-## Create/Update/Read/Delete asset filters
-The following code shows how to use .NET to create, update, read, and delete asset filters.
-
-```csharp
- string assetName = "AssetFilter_" + Guid.NewGuid().ToString();
- var asset = _context.Assets.Create(assetName, AssetCreationOptions.None);
-
- string filterName = "AssetFilter_" + Guid.NewGuid().ToString();
-
- // Create
- IStreamingAssetFilter filter = asset.AssetFilters.Create(filterName,
- new PresentationTimeRange(),
- new List<FilterTrackSelectStatement>());
-
- // Update
- filter.PresentationTimeRange =
- new PresentationTimeRange(start: 6000000000, end: 72000000000);
-
- filter.Update();
-
- // Read
- asset = _context.Assets.Where(c => c.Id == asset.Id).FirstOrDefault();
- var filterUpdated = asset.AssetFilters.FirstOrDefault();
- Console.WriteLine(filterUpdated.Name);
-
- // Delete
- filterUpdated.Delete();
-
-```
-
-## Build streaming URLs that use filters
-For information on how to publish and deliver your assets, see [Delivering Content to Customers Overview](media-services-deliver-content-overview.md).
-
-The following examples show how to add filters to your streaming URLs.
-
-**MPEG DASH**
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf, filter=MyFilter)`
-
-**Apple HTTP Live Streaming (HLS) V4**
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=m3u8-aapl, filter=MyFilter)`
-
-**Apple HTTP Live Streaming (HLS) V3**
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=m3u8-aapl-v3, filter=MyFilter)`
-
-**Smooth Streaming**
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(filter=MyFilter)`
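-
-If you compose these URLs in code, the filter is simply an extra argument inside the `Manifest(...)` clause. The following is a minimal sketch, not part of the original sample, of a helper that appends a filter (and optionally a format) to a manifest URL; `baseManifestUrl` and the filter name are placeholders:
-
-```csharp
-// Sketch: append a filter, and optionally a format, to a streaming manifest URL.
-// For example, AddFilterToManifestUrl(url, "MyFilter", "mpd-time-csf") returns
-// ".../Manifest(format=mpd-time-csf, filter=MyFilter)".
-static string AddFilterToManifestUrl(string baseManifestUrl, string filterName, string format = null)
-{
-    string arguments = string.IsNullOrEmpty(format)
-        ? "filter=" + filterName
-        : "format=" + format + ", filter=" + filterName;
-
-    return baseManifestUrl + "(" + arguments + ")";
-}
-```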
-
-## See Also
-[Dynamic manifests overview](media-services-dynamic-manifest-overview.md)
media-services Media Services Dotnet Encode With Media Encoder Standard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-encode-with-media-encoder-standard.md
- Title: Encode an asset with Media Encoder Standard using .NET | Microsoft Docs
-description: This article shows how to use .NET to encode an asset using Media Encoder Standard.
- Previously updated: 03/10/2021
-# Encode an asset with Media Encoder Standard using .NET
-
-Encoding jobs are one of the most common processing operations in Media Services. You create encoding jobs to convert media files from one encoding to another. When you encode, you can use the Media Services built-in Media Encoder. You can also use an encoder provided by a Media Services partner; third-party encoders are available through the Azure Marketplace.
-
-This article shows how to use .NET to encode your assets with Media Encoder Standard (MES). Media Encoder Standard is configured using one of the encoder presets described [here](./media-services-mes-presets-overview.md).
-
-It is recommended to always encode your source files into an adaptive bitrate MP4 set and then convert the set to the desired format using [dynamic packaging](media-services-dynamic-packaging-overview.md).
-
-If your output asset is storage encrypted, you must configure asset delivery policy. For more information, see [Configuring asset delivery policy](media-services-dotnet-configure-asset-delivery-policy.md).
-
-> [!NOTE]
-> MES produces an output file with a name that contains the first 32 characters of the input file name. The name is based on what is specified in the preset file. For example, "FileName": "{Basename}_{Index}{Extension}". {Basename} is replaced by the first 32 characters of the input file name.
->
->
-
-### MES Formats
-[Formats and codecs](media-services-media-encoder-standard-formats.md)
-
-### MES Presets
-As noted earlier, Media Encoder Standard is configured using one of the encoder presets described [here](./media-services-mes-presets-overview.md).
-
-### Input and output metadata
-When you encode an input asset (or assets) using MES, you get an output asset at the successful completion of that encode task. The output asset contains video, audio, thumbnails, manifest, etc. based on the encoding preset you use.
-
-The output asset also contains a file with metadata about the input asset. The name of the metadata XML file has the following format: <asset_id>_metadata.xml (for example, 41114ad3-eb5e-4c57-8d92-5354e2b7d4a4_metadata.xml), where <asset_id> is the AssetId value of the input asset. The schema of this input metadata XML is described [here](media-services-input-metadata-schema.md).
-
-The output asset also contains a file with metadata about the output asset. The name of the metadata XML file has the following format: <source_file_name>_manifest.xml (for example, BigBuckBunny_manifest.xml). The schema of this output metadata XML is described [here](media-services-output-metadata-schema.md).
-
-If you want to examine either of the two metadata files, you can create a SAS locator and download the file to your local computer using the Media Services .NET SDK Extensions; a sketch of this follows.
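-
-A minimal sketch of that approach, assuming the authenticated `_context` object used by the sample below and the **IAssetFile.Download** method from the SDK, might look like this:
-
-```csharp
-// Sketch: download the <asset_id>_metadata.xml file from an output asset.
-static void DownloadMetadataFile(IAsset outputAsset, string localFolder)
-{
-    // A SAS locator grants read access to the asset's files in storage.
-    _context.Locators.Create(LocatorType.Sas, outputAsset,
-        AccessPermissions.Read, TimeSpan.FromDays(1));
-
-    var metadataFile = outputAsset.AssetFiles.ToList()
-        .FirstOrDefault(f => f.Name.EndsWith("_metadata.xml", StringComparison.OrdinalIgnoreCase));
-
-    if (metadataFile != null)
-    {
-        metadataFile.Download(Path.Combine(localFolder, metadataFile.Name));
-    }
-}
-```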
-
-## Download sample
-You can get and run a sample that shows how to encode with MES from [here](https://azure.microsoft.com/documentation/samples/media-services-dotnet-on-demand-encoding-with-media-encoder-standard/).
-
-## .NET sample code
-
-The following code example uses Media Services .NET SDK to perform the following tasks:
-
-* Create an encoding job.
-* Get a reference to the Media Encoder Standard encoder.
-* Specify to use the [Adaptive Streaming](media-services-autogen-bitrate-ladder-with-mes.md) preset.
-* Add a single encoding task to the job.
-* Specify the input asset to be encoded.
-* Create an output asset that contains the encoded asset.
-* Add an event handler to check the job progress.
-* Submit the job.
-
-#### Create and configure a Visual Studio project
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-#### Example
-
-```csharp
-using System;
-using System.Linq;
-using System.Configuration;
-using System.IO;
-using System.Threading;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-namespace MediaEncoderStandardSample
-{
- class Program
- {
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- private static readonly string _supportFiles =
- Path.GetFullPath(@"../..\Media");
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Get an uploaded asset.
- var asset = _context.Assets.FirstOrDefault();
-
- // Encode and generate the output using the "Adaptive Streaming" preset.
- EncodeToAdaptiveBitrateMP4Set(asset);
-
- Console.ReadLine();
- }
-
- static public IAsset EncodeToAdaptiveBitrateMP4Set(IAsset asset)
- {
- // Declare a new job.
- IJob job = _context.Jobs.Create("Media Encoder Standard Job");
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Create a task with the encoding details, using a string preset.
- // In this case "Adaptive Streaming" preset is used.
- ITask task = job.Tasks.AddNew("My encoding task",
- processor,
- "Adaptive Streaming",
- TaskOptions.None);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(asset);
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- task.OutputAssets.AddNew("Output asset",
- AssetCreationOptions.None);
-
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
- job.Submit();
- job.GetExecutionProgressTask(CancellationToken.None).Wait();
-
- return job.OutputMediaAssets[0];
- }
-
- private static void JobStateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("Job is finished. Please wait while local tasks or downloads complete...");
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
-
- // Cast sender as a job.
- IJob job = (IJob)sender;
-
- // Display or log error details as needed.
- break;
- default:
- break;
- }
- }
-
- private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-                throw new ArgumentException(string.Format("Unknown media processor {0}", mediaProcessorName));
-
- return processor;
- }
- }
-}
-```
-
-## Advanced Encoding Features to explore
-* [How to generate thumbnails](media-services-dotnet-generate-thumbnail-with-mes.md)
-* [Generating thumbnails during encoding](media-services-dotnet-generate-thumbnail-with-mes.md#example-of-generating-a-thumbnail-while-encoding)
-* [Crop videos during encoding](media-services-crop-video.md)
-* [Customizing encoding presets](media-services-custom-mes-presets-with-dotnet.md)
-* [Overlay or watermark a video with an image](media-services-advanced-encoding-with-mes.md#overlay)
-
-
-## Next steps
-[How to generate thumbnail using Media Encoder Standard with .NET](media-services-dotnet-generate-thumbnail-with-mes.md)
-[Media Services Encoding Overview](media-services-encode-asset.md)
media-services Media Services Dotnet Encoding Units https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-encoding-units.md
- Title: Scale media processing by adding encoding units - Azure | Microsoft Docs
-description: This article demonstrates how to add encoding units with Azure Media Services .NET.
- Previously updated: 08/24/2021
-# How to scale encoding with .NET SDK
-
-> [!div class="op_single_selector"]
-> * [Portal](media-services-portal-scale-media-processing.md)
-> * [.NET](media-services-dotnet-encoding-units.md)
-> * [REST](/rest/api/media/operations/encodingreservedunittype)
-> * [Java](https://github.com/rnrneverdies/azure-sdk-for-media-services-java-samples)
-> * [PHP](https://github.com/Azure/azure-sdk-for-php/tree/master/examples/MediaServices)
-
-## Overview
-> [!IMPORTANT]
-> Media Reserved Units are no longer needed and are no longer supported by Azure Media Services. Make sure to review the [Overview](media-services-scale-media-processing-overview.md) for more information about scaling media processing.
-
media-services Media Services Dotnet Generate Thumbnail With Mes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-generate-thumbnail-with-mes.md
- Title: How to generate thumbnails using Media Encoder Standard with .NET
-description: This topic shows how to use .NET to encode an asset and generate thumbnails at the same time using Media Encoder Standard.
- Previously updated: 03/10/2021
-# How to generate thumbnails using Media Encoder Standard with .NET
-
-You can use Media Encoder Standard to generate one or more thumbnails from your input video in the [JPEG](https://en.wikipedia.org/wiki/JPEG), [PNG](https://en.wikipedia.org/wiki/Portable_Network_Graphics), or [BMP](https://en.wikipedia.org/wiki/BMP_file_format) image file formats. You can submit Tasks that produce only images, or you can combine thumbnail generation with encoding. This article provides a few sample XML and JSON thumbnail presets for such scenarios. At the end of the article, there is [sample code](#code_sample) that shows how to use the Media Services .NET SDK to accomplish the encoding task.
-
-For more details on the elements that are used in sample presets, you should review [Media Encoder Standard schema](media-services-mes-schema.md).
-
-Make sure to review the [Considerations](media-services-dotnet-generate-thumbnail-with-mes.md#considerations) section.
-
-## Example of a "single PNG file" preset
-
-The following JSON and XML presets can be used to produce a single output PNG file from the first few seconds of the input video, where the encoder makes a best-effort attempt at finding an "interesting" frame. Note that the output image dimensions have been set to 100%, meaning they match the dimensions of the input video. Note also how the "Format" setting in "Outputs" is required to match the use of "PngLayers" in the "Codecs" section.
-
-### JSON preset
-
-```json
- {
- "Version": 1.0,
- "Codecs": [
- {
- "PngLayers": [
- {
- "Type": "PngLayer",
- "Width": "100%",
- "Height": "100%"
- }
- ],
- "Start": "{Best}",
- "Type": "PngImage"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "PngFormat"
- }
- }
- ]
- }
-```
-
-### XML preset
-
-```xml
- <?xml version="1.0" encoding="utf-16"?>
- <Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <PngImage Start="{Best}">
- <PngLayers>
- <PngLayer>
- <Width>100%</Width>
- <Height>100%</Height>
- </PngLayer>
- </PngLayers>
- </PngImage>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Index}{Extension}">
- <PngFormat />
- </Output>
- </Outputs>
- </Preset>
-```
-
-## Example of a "series of JPEG images" preset
-
-The following JSON and XML presets can be used to produce a set of 10 images at timestamps of 5%, 15%, …, 95% of the input timeline, where the image size is specified to be one quarter that of the input video.
-
-### JSON preset
-
-```json
- {
- "Version": 1.0,
- "Codecs": [
- {
- "JpgLayers": [
- {
- "Quality": 90,
- "Type": "JpgLayer",
- "Width": "25%",
- "Height": "25%"
- }
- ],
- "Start": "5%",
- "Step": "10%",
- "Range": "96%",
- "Type": "JpgImage"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "JpgFormat"
- }
- }
- ]
- }
-```
-
-### XML preset
-
-```xml
- <?xml version="1.0" encoding="utf-16"?>
- <Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <JpgImage Start="5%" Step="10%" Range="96%">
- <JpgLayers>
- <JpgLayer>
- <Width>25%</Width>
- <Height>25%</Height>
- <Quality>90</Quality>
- </JpgLayer>
- </JpgLayers>
- </JpgImage>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Index}{Extension}">
- <JpgFormat />
- </Output>
- </Outputs>
- </Preset>
-```
-
-## Example of a "one image at a specific timestamp" preset
-
-The following JSON and XML presets can be used to produce a single JPEG image at the 30-second mark of the input video. These presets expect the input video to be more than 30 seconds in duration (otherwise the job fails).
-
-### JSON preset
-
-```json
- {
- "Version": 1.0,
- "Codecs": [
- {
- "JpgLayers": [
- {
- "Quality": 90,
- "Type": "JpgLayer",
- "Width": "25%",
- "Height": "25%"
- }
- ],
- "Start": "00:00:30",
- "Step": "1",
- "Range": "1",
- "Type": "JpgImage"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "JpgFormat"
- }
- }
- ]
- }
-```
-
-### XML preset
-```xml
- <?xml version="1.0" encoding="utf-16"?>
- <Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <JpgImage Start="00:00:30" Step="00:00:01" Range="00:00:01">
- <JpgLayers>
- <JpgLayer>
- <Width>25%</Width>
- <Height>25%</Height>
- <Quality>90</Quality>
- </JpgLayer>
- </JpgLayers>
- </JpgImage>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Index}{Extension}">
- <JpgFormat />
- </Output>
- </Outputs>
- </Preset>
-```
-
-## Example of a "thumbnails at different resolutions" preset
-
-The following preset can be used to generate thumbnails at different resolutions in one task. In the example, at positions 5%, 15%, …, 95% of the input timeline, the encoder generates two images – one at 100% of the input video resolution and the other at 50%.
-
-Note the use of the {Resolution} macro in the FileName; it tells the encoder to use the width and height that you specified in the Encoding section of the preset when generating the file names of the output images. This also helps you distinguish between the different images easily.
-
-### JSON preset
-
-```json
- {
- "Version": 1.0,
- "Codecs": [
- {
- "JpgLayers": [
- {
- "Quality": 90,
- "Type": "JpgLayer",
- "Width": "100%",
- "Height": "100%"
- },
- {
- "Quality": 90,
- "Type": "JpgLayer",
- "Width": "50%",
- "Height": "50%"
- }
-
- ],
- "Start": "5%",
- "Step": "10%",
- "Range": "96%",
- "Type": "JpgImage"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Resolution}_{Index}{Extension}",
- "Format": {
- "Type": "JpgFormat"
- }
- }
- ]
- }
-```
-
-### XML preset
-```xml
- <?xml version="1.0" encoding="utf-8"?>
- <Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
-        <JpgImage Start="5%" Step="10%" Range="96%">
- <JpgLayers>
- <JpgLayer>
- <Width>100%</Width>
- <Height>100%</Height>
- <Quality>90</Quality>
- </JpgLayer>
- <JpgLayer>
- <Width>50%</Width>
- <Height>50%</Height>
- <Quality>90</Quality>
- </JpgLayer>
- </JpgLayers>
- </JpgImage>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Resolution}_{Index}{Extension}">
- <JpgFormat/>
- </Output>
- </Outputs>
- </Preset>
-```
-
-## Example of generating a thumbnail while encoding
-
-While all of the above examples have discussed how you can submit an encoding task that only produces images, you can also combine video/audio encoding with thumbnail generation. The following JSON and XML presets tell **Media Encoder Standard** to generate a thumbnail during encoding.
-
-### <a id="json"></a>JSON preset
-For information about schema, see [this](./media-services-mes-schema.md) article.
-
-```json
- {
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": "true",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 4500,
- "MaxBitrate": 4500,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "ReferenceFrames": 3,
- "EntropyMode": "Cabac",
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
-
- }
- ],
- "Type": "H264Video"
- },
- {
- "JpgLayers": [
- {
- "Quality": 90,
- "Type": "JpgLayer",
- "Width": "100%",
- "Height": "100%"
- }
- ],
- "Start": "{Best}",
- "Type": "JpgImage"
- },
- {
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Index}{Extension}",
- "Format": {
- "Type": "JpgFormat"
- }
- },
- {
- "FileName": "{Basename}_{Resolution}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
- }
-```
-
-### <a id="xml"></a>XML preset
-For information about schema, see [this](./media-services-mes-schema.md) article.
-
-```xml
- <?xml version="1.0" encoding="utf-16"?>
- <Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>4500</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>4500</MaxBitrate>
- </H264Layer>
- </H264Layers>
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- <JpgImage Start="{Best}">
- <JpgLayers>
- <JpgLayer>
- <Width>100%</Width>
-                <Height>100%</Height>
- <Quality>90</Quality>
- </JpgLayer>
- </JpgLayers>
- </JpgImage>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Resolution}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- <Output FileName="{Basename}_{Index}{Extension}">
- <JpgFormat />
- </Output>
- </Outputs>
- </Preset>
-```
-
-## <a id="code_sample"></a>Encode video and generate thumbnail with .NET
-
-The following code example uses Media Services .NET SDK to perform the following tasks:
-
-* Create an encoding job.
-* Get a reference to the Media Encoder Standard encoder.
-* Load the preset [XML](media-services-dotnet-generate-thumbnail-with-mes.md#xml) or [JSON](media-services-dotnet-generate-thumbnail-with-mes.md#json) that contains the encoding settings as well as the information needed to generate thumbnails. You can save this [XML](media-services-dotnet-generate-thumbnail-with-mes.md#xml) or [JSON](media-services-dotnet-generate-thumbnail-with-mes.md#json) in a file and use the following code to load the file.
-
- ```csharp
- // Load the XML (or JSON) from the local file.
- string configuration = File.ReadAllText(fileName);
- ```
-
-* Add a single encoding task to the job.
-* Specify the input asset to be encoded.
-* Create an output asset that contains the encoded asset.
-* Add an event handler to check the job progress.
-* Submit the job.
-
-See the [Media Services development with .NET](media-services-dotnet-how-to-use.md) article for directions on how to set up your dev environment.
-
-```csharp
-using System;
-using System.Configuration;
-using System.IO;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System.Threading;
-
-namespace EncodeAndGenerateThumbnails
-{
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static CloudMediaContext _context = null;
-
- private static readonly string _mediaFiles =
- Path.GetFullPath(@"../..\Media");
-
- private static readonly string _singleMP4File =
- Path.Combine(_mediaFiles, @"BigBuckBunny.mp4");
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Get an uploaded asset.
- var asset = _context.Assets.FirstOrDefault();
-
- // Encode and generate the thumbnails.
- EncodeToAdaptiveBitrateMP4Set(asset);
-
- Console.ReadLine();
- }
-
- static public IAsset EncodeToAdaptiveBitrateMP4Set(IAsset asset)
- {
- // Declare a new job.
- IJob job = _context.Jobs.Create("Media Encoder Standard Thumbnail Job");
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Load the XML (or JSON) from the local file.
- string configuration = File.ReadAllText("ThumbnailPreset_JSON.json");
-
- // Create a task
- ITask task = job.Tasks.AddNew("Media Encoder Standard Thumbnail task",
- processor,
- configuration,
- TaskOptions.None);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(asset);
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- task.OutputAssets.AddNew("Output asset",
- AssetCreationOptions.None);
-
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
- job.Submit();
- job.GetExecutionProgressTask(CancellationToken.None).Wait();
-
- return job.OutputMediaAssets[0];
- }
-
- private static void JobStateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("Job is finished. Please wait while local tasks or downloads complete...");
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
-
- // Cast sender as a job.
- IJob job = (IJob)sender;
-
- // Display or log error details as needed.
- break;
- default:
- break;
- }
- }
-
- private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-                throw new ArgumentException(string.Format("Unknown media processor {0}", mediaProcessorName));
-
- return processor;
- }
- }
-}
-```
-
-## Considerations
-The following considerations apply:
-
-* The use of explicit timestamps for Start/Step/Range assumes that the input source is at least 1 minute long.
-* Jpg/Png/BmpImage elements have Start, Step, and Range string attributes, which can be interpreted as:
-
-  * A frame number if they are non-negative integers, for example "Start": "120"
-  * Relative to source duration if expressed with a % suffix, for example "Start": "15%"
-  * A timestamp if expressed in HH:MM:SS format, for example "Start": "00:01:00"
-
-  You can mix and match these notations as you please.
-
-  Additionally, Start supports the special macro {Best}, which attempts to determine the first "interesting" frame of the content. Step and Range are ignored when Start is set to {Best}, and the default is Start: {Best}.
-* The output format needs to be explicitly provided for each image format: Jpg/Png/BmpFormat. When present, MES matches JpgImage to JpgFormat and so on. The output format introduces a new image-codec-specific macro, {Index}, which needs to be present (once and only once) for image output formats.
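-
-To make the notations concrete, here is an illustrative sketch (values are hypothetical) of the same Start/Step/Range trio expressed each way, as JSON fragments held in C# strings that could be spliced into the "JpgImage" codec shown earlier:
-
-```csharp
-// Illustrative only: the three interchangeable Start/Step/Range notations.
-const string byFrameNumber = @"""Start"": ""120"", ""Step"": ""30"", ""Range"": ""300""";                  // frame numbers
-const string byPercentage  = @"""Start"": ""5%"", ""Step"": ""10%"", ""Range"": ""96%""";                  // relative to duration
-const string byTimestamp   = @"""Start"": ""00:01:00"", ""Step"": ""00:00:10"", ""Range"": ""00:02:00"""; // timestamps
-const string bestFrame     = @"""Start"": ""{Best}""";  // Step and Range are ignored with {Best}
-```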
-
-## Next steps
-
-You can check the [job progress](media-services-check-job-progress.md) while the encoding job is pending.
-
-
-## See Also
-[Media Services Encoding Overview](media-services-encode-asset.md)
media-services Media Services Dotnet Get Started With Aad https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-get-started-with-aad.md
- Title: Use Azure AD authentication to access Azure Media Services API with .NET | Microsoft Docs
-description: This topic shows how to use Azure Active Directory (Azure AD) authentication to access Azure Media Services (AMS) API with .NET.
- Previously updated: 03/10/2021
-# Use Azure AD authentication to access Azure Media Services API with .NET
-
-Starting with windowsazure.mediaservices 4.0.0.4, Azure Media Services supports authentication based on Azure Active Directory (Azure AD). This topic shows you how to use Azure AD authentication to access the Azure Media Services API with Microsoft .NET.
-
-## Prerequisites
-- An Azure account. For details, see [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
-- A Media Services account. For more information, see [Create an Azure Media Services account using the Azure portal](media-services-portal-create-account.md).
-- The latest [NuGet](https://www.nuget.org/packages/windowsazure.mediaservices) package.
-- Familiarity with the topic [Accessing Azure Media Services API with Azure AD authentication overview](media-services-use-aad-auth-to-access-ams-api.md).
-
-When you're using Azure AD authentication with Azure Media Services, you can authenticate in one of two ways:
-- **User authentication** authenticates a person who is using the app to interact with Azure Media Services resources. The interactive application should first prompt the user for credentials. An example is a management console app that's used by authorized users to monitor encoding jobs or live streaming.
-- **Service principal authentication** authenticates a service. Applications that commonly use this authentication method are apps that run daemon services, middle-tier services, or scheduled jobs, such as web apps, function apps, logic apps, APIs, or microservices.
-
->[!IMPORTANT]
->Azure Media Services currently supports the Azure Access Control Service authentication model. However, Access Control authorization will be deprecated on June 22, 2018. We recommend that you migrate to the Azure Active Directory authentication model as soon as possible.
-
-## Get an Azure AD access token
-
-To connect to the Azure Media Services API with Azure AD authentication, the client app needs to request an Azure AD access token. When you use the Media Services .NET client SDK, many of the details about how to acquire an Azure AD access token are wrapped and simplified for you in the [AzureAdTokenProvider](https://github.com/Azure/azure-sdk-for-media-services/blob/dev/src/net/Client/Common/Common.Authentication/AzureAdTokenProvider.cs) and [AzureAdTokenCredentials](https://github.com/Azure/azure-sdk-for-media-services/blob/dev/src/net/Client/Common/Common.Authentication/AzureAdTokenCredentials.cs) classes.
-
-For example, you don't need to provide the Azure AD authority, Media Services resource URI, or native Azure AD application details. These are well-known values that are already configured by the Azure AD access token provider class.
-
-If you are not using the Azure Media Services .NET SDK, we recommend that you use the [Azure AD Authentication Library](../../active-directory/azuread-dev/active-directory-authentication-libraries.md). To get values for the parameters that you need to use with the Azure AD Authentication Library, see [Use the Azure portal to access Azure AD authentication settings](media-services-portal-get-started-with-aad.md).
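-
-As a hedged sketch of that route (the tenant domain, client ID, and secret are placeholders; `https://rest.media.azure.net` is the Media Services resource identifier), acquiring a token with ADAL and a service principal might look like this:
-
-```csharp
-// Sketch only: acquire an Azure AD access token for Media Services with ADAL
-// (Microsoft.IdentityModel.Clients.ActiveDirectory), outside the .NET client SDK.
-// Call from within an async method.
-var authContext = new AuthenticationContext(
-    "https://login.microsoftonline.com/{YOUR Azure AD TENANT DOMAIN HERE}");
-
-AuthenticationResult result = await authContext.AcquireTokenAsync(
-    "https://rest.media.azure.net",                               // Media Services resource
-    new ClientCredential("{YOUR CLIENT ID HERE}", "{YOUR CLIENT SECRET}"));
-
-string accessToken = result.AccessToken;   // send as "Authorization: Bearer <token>"
-```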
-
-You also have the option of replacing the default implementation of the **AzureAdTokenProvider** with your own implementation.
-
-## Install and configure Azure Media Services .NET SDK
-
->[!NOTE]
->To use Azure AD authentication with the Media Services .NET SDK, you need to have the latest [NuGet](https://www.nuget.org/packages/windowsazure.mediaservices) package. Also, add a reference to the **Microsoft.IdentityModel.Clients.ActiveDirectory** assembly. If you are using an existing app, include the **Microsoft.WindowsAzure.MediaServices.Client.Common.Authentication.dll** assembly.
-
-1. Create a new C# console application in Visual Studio.
-2. Use the [windowsazure.mediaservices](https://www.nuget.org/packages/windowsazure.mediaservices) NuGet package to install **Azure Media Services .NET SDK**.
-
- To add references by using NuGet, take the following steps: in **Solution Explorer**, right-click the project name, and then select **Manage NuGet packages**. Then, search for **windowsazure.mediaservices** and select **Install**.
-
- -or-
-
- Run the following command in **Package Manager Console** in Visual Studio.
-
- ```console
- Install-Package windowsazure.mediaservices -Version 4.0.0.4
- ```
-
-3. Add the following **using** directive to your source code.
-
- ```csharp
- using Microsoft.WindowsAzure.MediaServices.Client;
- ```
-
-## Use user authentication
-
-To connect to the Azure Media Service API with the user authentication option, the client app needs to request an Azure AD token by using the following parameters:
-- Azure AD tenant endpoint. The tenant information can be retrieved from the Azure portal. Hover over the signed-in user in the upper-right corner.
-- Media Services resource URI.
-- Media Services (native) application client ID.
-- Media Services (native) application redirect URI.
-
-The values for these parameters can be found in **AzureEnvironments.AzureCloudEnvironment**. The **AzureEnvironments.AzureCloudEnvironment** constant is a helper in the .NET SDK that supplies the right environment settings for the public Azure data centers.
-
-It contains pre-defined environment settings for accessing Media Services in the public data centers only. For sovereign or government cloud regions, you can use **AzureChinaCloudEnvironment**, **AzureUsGovernmentEnvironment**, or **AzureGermanCloudEnvironment** respectively.
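-
-For example, a user-authentication credential that targets the Azure US Government cloud differs from the public-cloud example that follows only in the environment constant (the tenant domain is a placeholder):
-
-```csharp
-// Sketch: the same user authentication flow against a sovereign cloud.
-var usGovTokenCredentials = new AzureAdTokenCredentials(
-    "{YOUR Azure AD TENANT DOMAIN HERE}",
-    AzureEnvironments.AzureUsGovernmentEnvironment);
-
-var usGovTokenProvider = new AzureAdTokenProvider(usGovTokenCredentials);
-```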
-
-The following code example creates a token:
-
-```csharp
-var tokenCredentials = new AzureAdTokenCredentials("microsoft.onmicrosoft.com", AzureEnvironments.AzureCloudEnvironment);
-var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-```
-
-To start programming against Media Services, you need to create a **CloudMediaContext** instance that represents the server context. The **CloudMediaContext** includes references to important collections including jobs, assets, files, access policies, and locators.
-
-You also need to pass the **resource URI for Media REST Services** to the **CloudMediaContext** constructor. To get the resource URI for Media REST Services, sign in to the Azure portal, select your Azure Media Services account, select **API access**, and then select **Connect to Azure Media Services with user authentication**.
-
-The following code example creates a **CloudMediaContext** instance:
-
-```csharp
-CloudMediaContext context = new CloudMediaContext(new Uri("YOUR REST API ENDPOINT HERE"), tokenProvider);
-```
-
-The following example shows how to create the Azure AD token and the context:
-
-```csharp
-namespace AzureADAuthSample
-{
- class Program
- {
- static void Main(string[] args)
- {
- // Specify your Azure AD tenant domain, for example "microsoft.onmicrosoft.com".
- var tokenCredentials = new AzureAdTokenCredentials("{YOUR Azure AD TENANT DOMAIN HERE}", AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- // Specify your REST API endpoint, for example "https://accountname.restv2.westcentralus.media.azure.net/API".
- CloudMediaContext context = new CloudMediaContext(new Uri("YOUR REST API ENDPOINT HERE"), tokenProvider);
-
- var assets = context.Assets;
- foreach (var a in assets)
- {
- Console.WriteLine(a.Name);
- }
- }
- }
-}
-```
-
->[!NOTE]
->If you get an exception that says "The remote server returned an error: (401) Unauthorized," see the [Access control](media-services-use-aad-auth-to-access-ams-api.md#access-control) section of Accessing Azure Media Services API with Azure AD authentication overview.
-
-## Use service principal authentication
-
-To connect to the Azure Media Services API with the service principal option, your middle-tier app (web API or web application) needs to request an Azure AD token with the following parameters:
-- Azure AD tenant endpoint. The tenant information can be retrieved from the Azure portal. Hover over the signed-in user in the upper-right corner.
-- Media Services resource URI.
-- Azure AD application values: the **Client ID** and **Client secret**.
-
-The values for the **Client ID** and **Client secret** parameters can be found in the Azure portal. For more information, see [Getting started with Azure AD authentication using the Azure portal](media-services-portal-get-started-with-aad.md).
-
-The following code example creates a token by using the **AzureAdTokenCredentials** constructor that takes **AzureAdClientSymmetricKey** as a parameter:
-
-```csharp
-var tokenCredentials = new AzureAdTokenCredentials("{YOUR Azure AD TENANT DOMAIN HERE}",
- new AzureAdClientSymmetricKey("{YOUR CLIENT ID HERE}", "{YOUR CLIENT SECRET}"),
- AzureEnvironments.AzureCloudEnvironment);
-
-var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-```
-
-You can also specify the **AzureAdTokenCredentials** constructor that takes **AzureAdClientCertificate** as a parameter.
-
-For instructions about how to create and configure a certificate in a form that can be used by Azure AD, see [Authenticating to Azure AD in daemon apps with certificates - manual configuration steps](https://github.com/azure-samples/active-directory-dotnetcore-daemon-v2).
-
-```csharp
-var tokenCredentials = new AzureAdTokenCredentials("{YOUR Azure AD TENANT DOMAIN HERE}",
- new AzureAdClientCertificate("{YOUR CLIENT ID HERE}", "{YOUR CLIENT CERTIFICATE THUMBPRINT}"),
- AzureEnvironments.AzureCloudEnvironment);
-```
-
-To start programming against Media Services, you need to create a **CloudMediaContext** instance that represents the server context. You also need to pass the **resource URI for Media REST Services** to the **CloudMediaContext** constructor. You can get the **resource URI for Media REST Services** value from the Azure portal as well.
-
-The following code example creates a **CloudMediaContext** instance:
-
-```csharp
-CloudMediaContext context = new CloudMediaContext(new Uri("YOUR REST API ENDPOINT HERE"), tokenProvider);
-```
-
-The following example shows how to create the Azure AD token and the context:
-
-```csharp
-namespace AzureADAuthSample
-{
- class Program
- {
- static void Main(string[] args)
- {
- var tokenCredentials = new AzureAdTokenCredentials("{YOUR Azure AD TENANT DOMAIN HERE}",
- new AzureAdClientSymmetricKey("{YOUR CLIENT ID HERE}", "{YOUR CLIENT SECRET}"),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- // Specify your REST API endpoint, for example "https://accountname.restv2.westcentralus.media.azure.net/API".
- CloudMediaContext context = new CloudMediaContext(new Uri("YOUR REST API ENDPOINT HERE"), tokenProvider);
-
- var assets = context.Assets;
- foreach (var a in assets)
- {
- Console.WriteLine(a.Name);
- }
-
- Console.ReadLine();
- }
- }
-}
-```
-
-## Next steps
-
-Get started with [uploading files to your account](media-services-dotnet-upload-files.md).
media-services Media Services Dotnet Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-get-started.md
- Title: Get started with delivering content on demand using .NET | Microsoft Docs
-description: This tutorial walks you through the steps of implementing an on demand content delivery application with Azure Media Services using .NET.
- Previously updated: 03/10/2021
-# Get started with delivering content on demand using .NET SDK
-
-This tutorial walks you through the steps of implementing a basic Video-on-Demand (VoD) content delivery application with Azure Media Services (AMS), using the Azure Media Services .NET SDK.
-
-## Prerequisites
-
-The following are required to complete the tutorial:
-
-* An Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-* A Media Services account. To create a Media Services account, see [How to Create a Media Services Account](media-services-portal-create-account.md).
-* .NET Framework 4.0 or later.
-* Visual Studio.
-
-This tutorial includes the following tasks:
-
-1. Start the streaming endpoint (using the Azure portal).
-2. Create and configure a Visual Studio project.
-3. Connect to the Media Services account.
-4. Upload a video file.
-5. Encode the source file into a set of adaptive bitrate MP4 files.
-6. Publish the asset and get streaming and progressive download URLs.
-7. Play your content.
-
-## Overview
-This tutorial walks you through the steps of implementing a Video-on-Demand (VoD) content delivery application using Azure Media Services (AMS) SDK for .NET.
-
-The tutorial introduces the basic Media Services workflow and the most common programming objects and tasks required for Media Services development. At the completion of the tutorial, you will be able to stream or progressively download a sample media file that you uploaded, encoded, and downloaded.
-
-### AMS model
-
-The following image shows some of the most commonly used objects when developing VoD applications against the Media Services OData model.
-
-Click the image to view it full size.
-
-[![Diagram showing some of the most commonly used objects in the Azure Media Services object data model for developing Video on Demand applications.](./media/media-services-dotnet-get-started/media-services-overview-object-model-small.png)](./media/media-services-dotnet-get-started/media-services-overview-object-model.png#lightbox)
-
-You can view the whole model [here](https://m.eet.com/media/1170326/ms-part1.pdf).
-
-## Start streaming endpoints using the Azure portal
-
-When working with Azure Media Services one of the most common scenarios is delivering video via adaptive bitrate streaming. Media Services provides dynamic packaging, which allows you to deliver your adaptive bitrate MP4 encoded content in streaming formats supported by Media Services (MPEG DASH, HLS, Smooth Streaming) just-in-time, without you having to store pre-packaged versions of each of these streaming formats.
-
->[!NOTE]
->When your AMS account is created a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-To start the streaming endpoint, do the following:
-
-1. Log in at the [Azure portal](https://portal.azure.com/).
-2. In the Settings window, click Streaming endpoints.
-3. Click the default streaming endpoint.
-
- The DEFAULT STREAMING ENDPOINT DETAILS window appears.
-
-4. Click the Start icon.
-5. Click the Save button to save your changes.
-
-## Create and configure a Visual Studio project
-
-1. Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-2. Create a new folder (folder can be anywhere on your local drive) and copy an .mp4 file that you want to encode and stream or progressively download. In this example, the "C:\VideoFiles" path is used.
-
-## Connect to the Media Services account
-
-When using Media Services with .NET, you use the **CloudMediaContext** class for most Media Services programming tasks: connecting to the Media Services account, and creating, updating, accessing, and deleting objects such as assets, asset files, jobs, access policies, and locators.
-
-Overwrite the default Program class with the following code. The code demonstrates how to read the connection values from the App.config file and how to create the **CloudMediaContext** object in order to connect to Media Services. For more information, see [connecting to the Media Services API](media-services-use-aad-auth-to-access-ams-api.md).
-
-Make sure to update the file name and path to where you have your media file.
-
-The **Main** function calls methods that will be defined further in this section.
-
-> [!NOTE]
-> You will be getting compilation errors until you add definitions for all the functions that are defined later in this article.
-
-```csharp
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- try
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Add calls to methods defined in this section.
- // Make sure to update the file name and path to where you have your media file.
- IAsset inputAsset =
- UploadFile(@"C:\VideoFiles\BigBuckBunny.mp4", AssetCreationOptions.None);
-
- IAsset encodedAsset =
- EncodeToAdaptiveBitrateMP4s(inputAsset, AssetCreationOptions.None);
-
- PublishAssetGetURLs(encodedAsset);
- }
- catch (Exception exception)
- {
- // Parse the XML error message in the Media Services response and create a new
- // exception with its content.
- exception = MediaServicesExceptionParser.Parse(exception);
-
- Console.Error.WriteLine(exception.Message);
- }
- finally
- {
- Console.ReadLine();
- }
- }
-```
-
-## Create a new asset and upload a video file
-
-In Media Services, you upload (or ingest) your digital files into an asset. The **Asset** entity can contain video, audio, images, thumbnail collections, text tracks, and closed caption files (and the metadata about these files). Once the files are uploaded, your content is stored securely in the cloud for further processing and streaming. The files in the asset are called **Asset Files**.
-
-The **UploadFile** method defined below calls **CreateFromFile** (defined in .NET SDK Extensions). **CreateFromFile** creates a new asset into which the specified source file is uploaded.
-
-The **CreateFromFile** method takes **AssetCreationOptions**, which lets you specify one of the following asset creation options:
-
-* **None** - No encryption is used. This is the default value. Note that when using this option, your content is not protected in transit or at rest in storage.
- If you plan to deliver an MP4 using progressive download, use this option.
-* **StorageEncrypted** - Use this option to encrypt your clear content locally using Advanced Encryption Standard (AES)-256 bit encryption, which then uploads it to Azure Storage where it is stored encrypted at rest. Assets protected with Storage Encryption are automatically unencrypted and placed in an encrypted file system prior to encoding, and optionally re-encrypted prior to uploading back as a new output asset. The primary use case for Storage Encryption is when you want to secure your high-quality input media files with strong encryption at rest on disk.
-* **CommonEncryptionProtected** - Use this option if you are uploading content that has already been encrypted and protected with Common Encryption or PlayReady DRM (for example, Smooth Streaming protected with PlayReady DRM).
-* **EnvelopeEncryptionProtected** - Use this option if you are uploading HLS encrypted with AES. Note that the files must have been encoded and encrypted by Transform Manager.
-
-The **CreateFromFile** method also lets you specify a callback in order to report the upload progress of the file.
-
-In the following example, we specify **None** for the asset options.
-
-Add the following method to the Program class.
-
-```csharp
- static public IAsset UploadFile(string fileName, AssetCreationOptions options)
- {
- IAsset inputAsset = _context.Assets.CreateFromFile(
- fileName,
- options,
- (af, p) =>
- {
- Console.WriteLine("Uploading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);
- });
-
- Console.WriteLine("Asset {0} created.", inputAsset.Id);
-
- return inputAsset;
- }
-```
-
-## Encode the source file into a set of adaptive bitrate MP4 files
-After ingesting assets into Media Services, media can be encoded, transmuxed, watermarked, and so on, before it is delivered to clients. These activities are scheduled and run against multiple background role instances to ensure high performance and availability. These activities are called Jobs, and each Job is composed of atomic Tasks that do the actual work on the Asset file.
-
-As was mentioned earlier, when working with Azure Media Services, one of the most common scenarios is delivering adaptive bitrate streaming to your clients. Media Services can dynamically package a set of adaptive bitrate MP4 files into one of the following formats: HTTP Live Streaming (HLS), Smooth Streaming, and MPEG DASH.
-
-To take advantage of dynamic packaging, you need to encode or transcode your mezzanine (source) file into a set of adaptive bitrate MP4 files or adaptive bitrate Smooth Streaming files.
-
-The following code shows how to submit an encoding job. The job contains one task that specifies to transcode the mezzanine file into a set of adaptive bitrate MP4s using **Media Encoder Standard**. The code submits the job and waits until it is completed.
-
-Once the job is completed, you would be able to stream your asset or progressively download MP4 files that were created as a result of transcoding.
-
-Add the following method to the Program class.
-
-```csharp
- static public IAsset EncodeToAdaptiveBitrateMP4s(IAsset asset, AssetCreationOptions options)
- {
-
- // Prepare a job with a single task to transcode the specified asset
- // into a multi-bitrate asset.
-
- IJob job = _context.Jobs.CreateWithSingleTask(
- "Media Encoder Standard",
- "Adaptive Streaming",
- asset,
- "Adaptive Bitrate MP4",
- options);
-
- Console.WriteLine("Submitting transcoding job...");
-
- // Submit the job and wait until it is completed.
- job.Submit();
-
- job = job.StartExecutionProgressTask(
- j =>
- {
- Console.WriteLine("Job state: {0}", j.State);
- Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
- },
- CancellationToken.None).Result;
-
- Console.WriteLine("Transcoding job finished.");
-
- IAsset outputAsset = job.OutputMediaAssets[0];
-
- return outputAsset;
- }
-```
-
-## Publish the asset and get URLs for streaming and progressive download
-
-To stream or download an asset, you first need to "publish" it by creating a locator. Locators provide access to files contained in the asset. Media Services supports two types of locators: OnDemandOrigin locators, used to stream media (for example, MPEG DASH, HLS, or Smooth Streaming), and Shared Access Signature (SAS) locators, used to download media files.
-
-### Some details about URL formats
-
-After you create the locators, you can build the URLs that would be used to stream or download your files. The sample in this tutorial outputs URLs that you can paste in appropriate browsers. This section just gives short examples of what different formats look like.
-
-#### A streaming URL for MPEG DASH has the following format:
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest**(format=mpd-time-csf)**
-
-#### A streaming URL for HLS has the following format:
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest**(format=m3u8-aapl)**
-
-#### A streaming URL for Smooth Streaming has the following format:
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest
-
-#### A SAS URL used to download files has the following format:
-
-{blob container name}/{asset name}/{file name}/{SAS signature}
-
-Media Services .NET SDK extensions provide convenient helper methods that return formatted URLs for the published asset.
-
-The following code uses .NET SDK Extensions to create locators and to get streaming and progressive download URLs. The code also shows how to download files to a local folder.
-
-Add the following method to the Program class.
-
-```csharp
- static public void PublishAssetGetURLs(IAsset asset)
- {
- // Publish the output asset by creating an Origin locator for adaptive streaming,
- // and a SAS locator for progressive download.
-
- _context.Locators.Create(
- LocatorType.OnDemandOrigin,
- asset,
- AccessPermissions.Read,
- TimeSpan.FromDays(30));
-
- _context.Locators.Create(
- LocatorType.Sas,
- asset,
- AccessPermissions.Read,
- TimeSpan.FromDays(30));
-
- IEnumerable<IAssetFile> mp4AssetFiles = asset
- .AssetFiles
- .ToList()
- .Where(af => af.Name.EndsWith(".mp4", StringComparison.OrdinalIgnoreCase));
-
- // Get the Smooth Streaming, HLS and MPEG-DASH URLs for adaptive streaming,
- // and the Progressive Download URL.
- Uri smoothStreamingUri = asset.GetSmoothStreamingUri();
- Uri hlsUri = asset.GetHlsUri();
- Uri mpegDashUri = asset.GetMpegDashUri();
-
-        // Get the URLs for progressive download for each MP4 file that was generated as a result
- // of encoding.
- List<Uri> mp4ProgressiveDownloadUris = mp4AssetFiles.Select(af => af.GetSasUri()).ToList();
-
- // Display the streaming URLs.
- Console.WriteLine("Use the following URLs for adaptive streaming: ");
- Console.WriteLine(smoothStreamingUri);
- Console.WriteLine(hlsUri);
- Console.WriteLine(mpegDashUri);
- Console.WriteLine();
-
- // Display the URLs for progressive download.
- Console.WriteLine("Use the following URLs for progressive download.");
- mp4ProgressiveDownloadUris.ForEach(uri => Console.WriteLine(uri + "\n"));
- Console.WriteLine();
-
- // Download the output asset to a local folder.
- string outputFolder = "job-output";
- if (!Directory.Exists(outputFolder))
- {
- Directory.CreateDirectory(outputFolder);
- }
-
- Console.WriteLine();
- Console.WriteLine("Downloading output asset files to a local folder...");
- asset.DownloadToFolder(
- outputFolder,
- (af, p) =>
- {
- Console.WriteLine("Downloading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);
- });
-
- Console.WriteLine("Output asset files available at '{0}'.", Path.GetFullPath(outputFolder));
- }
-```
-
-## Test by playing your content
-
-Once you run the program defined in the previous section, URLs similar to the following are displayed in the console window.
-
-Adaptive streaming URLs:
-
-Smooth Streaming
-
-`http://amstestaccount001.streaming.mediaservices.windows.net/ebf733c4-3e2e-4a68-b67b-cc5159d1d7f2/BigBuckBunny.ism/manifest`
-
-HLS
-
-`http://amstestaccount001.streaming.mediaservices.windows.net/ebf733c4-3e2e-4a68-b67b-cc5159d1d7f2/BigBuckBunny.ism/manifest(format=m3u8-aapl)`
-
-MPEG DASH
-
-`http://amstestaccount001.streaming.mediaservices.windows.net/ebf733c4-3e2e-4a68-b67b-cc5159d1d7f2/BigBuckBunny.ism/manifest(format=mpd-time-csf)`
-
-Progressive download URLs (audio and video):
-
-`https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_650kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-`https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_400kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-`https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_3400kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-`https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_2250kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-`https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_1500kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-`https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_1000kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-`https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-`https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_AAC_und_ch2_56kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-To stream your video, paste your URL in the URL textbox in the [Azure Media Services Player](https://aka.ms/azuremediaplayer).
-
-To test progressive download, paste a URL into a browser (for example, Internet Explorer, Chrome, or Safari).
-
-For more information, see the following topics:
-
-- [Playing your content with existing players](media-services-playback-content-with-existing-players.md)
-- [Embedding an MPEG-DASH Adaptive Streaming Video in an HTML5 Application with DASH.js](media-services-embed-mpeg-dash-in-html5.md)
-
-## Download sample
-The following code sample contains the code that you created in this tutorial: [sample](https://azure.microsoft.com/documentation/samples/media-services-dotnet-on-demand-encoding-with-media-encoder-standard/).
-
-## Next Steps
-
-## Provide feedback
-
media-services Media Services Dotnet How To Use Azure Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-how-to-use-azure-functions.md
- Title: Develop Azure Functions with Media Services
-description: This topic shows how to start developing Azure Functions with Media Services using the Azure portal.
------ Previously updated : 03/10/2021---
-# Develop Azure Functions with Media Services
-
-This article shows you how to get started with creating Azure Functions that use Media Services. The Azure Function defined in this article monitors a storage account container named **input** for new MP4 files. Once a file is dropped into the storage container, the blob trigger executes the function. For an overview of Azure Functions, see [Overview](../../azure-functions/functions-overview.md) and other topics in the **Azure Functions** section.
-
-If you want to explore and deploy existing Azure Functions that use Azure Media Services, check out [Media Services Azure Functions](https://github.com/Azure-Samples/media-services-dotnet-functions-integration). This repository contains examples that use Media Services to show workflows related to ingesting content directly from blob storage, encoding, and writing content back to blob storage. It also includes examples of how to monitor job notifications via WebHooks and Azure Queues. You can also develop your Functions based on the examples in the [Media Services Azure Functions](https://github.com/Azure-Samples/media-services-dotnet-functions-integration) repository. To deploy the functions, press the **Deploy to Azure** button.
-
-## Prerequisites
-
-- Before you can create your first function, you need to have an active Azure account. If you don't already have an Azure account, [free accounts are available](https://azure.microsoft.com/free/).
-- If you are going to create Azure Functions that perform actions on your Azure Media Services (AMS) account or listen to events sent by Media Services, you should create an AMS account, as described [here](media-services-portal-create-account.md).
-
-## Create a function app
-
-1. Go to the [Azure portal](https://portal.azure.com) and sign in with your Azure account.
-2. Create a function app as described [here](../../azure-functions/functions-create-function-app-portal.md).
-
->[!NOTE]
-> A storage account that you specify in the **StorageConnection** environment variable (see the next step) should be in the same region as your app.
-
-## Configure function app settings
-
-When you develop Media Services functions, it is handy to add environment variables that are used throughout your functions. To configure app settings, click the **Configure App Settings** link. For more information, see [How to configure Azure Function app settings](../../azure-functions/functions-how-to-use-azure-function-app-settings.md).
-
-The function, defined in this article, assumes you have the following environment variables in your app settings:
-
-**AMSAADTenantDomain**: Azure AD tenant endpoint. For more information about connecting to the AMS API, see [this](media-services-use-aad-auth-to-access-ams-api.md) article.
-
-**AMSRESTAPIEndpoint**: URI that represents the REST API endpoint.
-
-**AMSClientId**: Azure AD application client ID.
-
-**AMSClientSecret**: Azure AD application client secret.
-
-**StorageConnection**: the connection string of the storage account that is associated with the Media Services account. This value is used in the **function.json** and **run.csx** files (described below).
-
-## Create a function
-
-Once your function app is deployed, you can find it in the Azure portal under **App Services**.
-
-1. Select your function app and click **New Function**.
-2. Choose the **C#** language and **Data Processing** scenario.
-3. Choose the **BlobTrigger** template. This function is triggered whenever a blob is uploaded into the **input** container. The container name (**input**) is specified in **Path** in the next step.
-
- ![Screenshot shows the Choose a template dialog box with BlobTrigger selected.](./media/media-services-azure-functions/media-services-azure-functions004.png)
-
-4. Once you select **BlobTrigger**, some more controls appear on the page.
-
- ![Screenshot shows the Name your function dialog box.](./media/media-services-azure-functions/media-services-azure-functions005.png)
-
-5. Click **Create**.
-
-## Files
-
-Your Azure Function is associated with code files and other files that are described in this section. When you use the Azure portal to create a function, **function.json** and **run.csx** are created for you. You need to add or upload a **project.json** file. The rest of this section gives a brief explanation of each file and shows their definitions.
-
-![Screenshot shows the json files in your project.](./media/media-services-azure-functions/media-services-azure-functions003.png)
-
-### function.json
-
-The function.json file defines the function bindings and other configuration settings. The runtime uses this file to determine the events to monitor and how to pass data into and return data from function execution. For more information, see [Azure Functions HTTP and webhook bindings](../../azure-functions/functions-reference.md#function-code).
-
->[!NOTE]
->Set the **disabled** property to **true** to prevent the function from being executed.
-
-Replace the contents of the existing function.json file with the following code:
-
-```json
-{
-  "bindings": [
-    {
-      "name": "myBlob",
-      "type": "blobTrigger",
-      "direction": "in",
-      "path": "input/{filename}.mp4",
-      "connection": "StorageConnection"
-    }
-  ],
-  "disabled": false
-}
-```
-
-### project.json
-
-The project.json file contains dependencies. Here is an example of a **project.json** file that includes the required .NET Azure Media Services packages from NuGet. Note that version numbers change as the packages are updated, so you should confirm the most recent versions.
-
-Add the following definition to project.json.
-
-```json
-{
- "frameworks": {
- "net46":{
- "dependencies": {
- "windowsazure.mediaservices": "4.0.0.4",
- "windowsazure.mediaservices.extensions": "4.0.0.4",
- "Microsoft.IdentityModel.Clients.ActiveDirectory": "3.13.1",
- "Microsoft.IdentityModel.Protocol.Extensions": "1.0.2.206221351"
- }
- }
- }
-}
-
-```
-
-### run.csx
-
-This is the C# code for your function. The function defined below monitors a storage account container named **input** (that is what was specified in the path) for new MP4 files. Once a file is dropped into the storage container, the blob trigger executes the function.
-
-The example defined in this section demonstrates
-
-1. how to ingest an asset into a Media Services account (by copying a blob into an AMS asset) and
-2. how to submit an encoding job that uses Media Encoder Standard's "Adaptive Streaming" preset.
-
-In a real-life scenario, you most likely want to track job progress and then publish your encoded asset. For more information, see [Use Azure WebHooks to monitor Media Services job notifications](media-services-dotnet-check-job-progress-with-webhooks.md). For more examples, see [Media Services Azure Functions](https://github.com/Azure-Samples/media-services-dotnet-functions-integration).
-
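-The following is a minimal sketch of such in-process monitoring, shown only for illustration (the helper name WaitForJobToFinish is not part of the SDK, and blocking like this is usually a poor fit for a function; prefer the WebHook approach linked above):
-
-```csharp
-// Minimal sketch: wait in-process for a submitted job to finish and log the outcome.
-// In production, prefer WebHook notifications over blocking the function like this.
-static void WaitForJobToFinish(IJob job, TraceWriter log)
-{
-    // Log every state transition reported by the service.
-    job.StateChanged += (sender, e) => log.Info($"Job state changed to: {e.CurrentState}");
-
-    // GetExecutionProgressTask polls the service until the job completes.
-    job.GetExecutionProgressTask(CancellationToken.None).Wait();
-
-    if (job.State == JobState.Error)
-    {
-        log.Info("Encoding job failed.");
-    }
-}
-```
-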
-Replace the contents of the existing run.csx file with the following code. Once you are done defining your function, click **Save and Run**.
-
-```csharp
-#r "Microsoft.WindowsAzure.Storage"
-#r "Newtonsoft.Json"
-#r "System.Web"
-
-using System;
-using System.Net;
-using System.Net.Http;
-using Newtonsoft.Json;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System.Collections.Generic;
-using System.Linq;
-using System.Text;
-using System.Threading;
-using System.Threading.Tasks;
-using System.IO;
-using System.Web;
-using Microsoft.Azure;
-using Microsoft.WindowsAzure.Storage;
-using Microsoft.WindowsAzure.Storage.Blob;
-using Microsoft.WindowsAzure.Storage.Auth;
-using Microsoft.Azure.WebJobs;
-using Microsoft.IdentityModel.Clients.ActiveDirectory;
-
-// Read values from the function app settings (environment variables).
-
-static readonly string _AADTenantDomain = Environment.GetEnvironmentVariable("AMSAADTenantDomain");
-static readonly string _RESTAPIEndpoint = Environment.GetEnvironmentVariable("AMSRESTAPIEndpoint");
-
-static readonly string _mediaservicesClientId = Environment.GetEnvironmentVariable("AMSClientId");
-static readonly string _mediaservicesClientSecret = Environment.GetEnvironmentVariable("AMSClientSecret");
-
-static readonly string _connectionString = Environment.GetEnvironmentVariable("StorageConnection");
-
-private static CloudMediaContext _context = null;
-private static CloudStorageAccount _destinationStorageAccount = null;
-
-public static void Run(CloudBlockBlob myBlob, string fileName, TraceWriter log)
-{
-    // NOTE: the {filename} token in the path setting in function.json binds to the
-    // fileName parameter in the Run method signature above. You can use it to make
-    // decisions about the type of file that was dropped into the input container.
-
-    // There's no need to implement a retry strategy in this function. By default, the SDK calls a function up to 5 times for a
-    // given blob. If the fifth try fails, the SDK adds a message to a queue named webjobs-blobtrigger-poison.
-
- log.Info($"C# Blob trigger function processed: {fileName}.mp4");
- log.Info($"Media Services REST endpoint : {_RESTAPIEndpoint}");
-
- try
- {
- AzureAdTokenCredentials tokenCredentials = new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_mediaservicesClientId, _mediaservicesClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- AzureAdTokenProvider tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- IAsset newAsset = CreateAssetFromBlob(myBlob, fileName, log).GetAwaiter().GetResult();
-
- // Step 2: Create an Encoding Job
-
- // Declare a new encoding job with the Standard encoder
- IJob job = _context.Jobs.Create("Azure Function - MES Job");
-
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Create a task with the encoding details, using a custom preset
- ITask task = job.Tasks.AddNew("Encode with Adaptive Streaming",
- processor,
- "Adaptive Streaming",
- TaskOptions.None);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(newAsset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- task.OutputAssets.AddNew(fileName, AssetCreationOptions.None);
-
- job.Submit();
- log.Info("Job Submitted");
-
- }
- catch (Exception ex)
- {
-        log.Error($"ERROR: {ex.Message}");
-        log.Info($"StackTrace : {ex.StackTrace}");
-        throw;
- }
-}
-
-private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
-{
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-        throw new ArgumentException(string.Format("Unknown media processor: {0}", mediaProcessorName));
-
- return processor;
-}
-
-public static async Task<IAsset> CreateAssetFromBlob(CloudBlockBlob blob, string assetName, TraceWriter log){
- IAsset newAsset = null;
-
- try{
- Task<IAsset> copyAssetTask = CreateAssetFromBlobAsync(blob, assetName, log);
- newAsset = await copyAssetTask;
- log.Info($"Asset Copied : {newAsset.Id}");
- }
- catch(Exception ex){
- log.Info("Copy Failed");
- log.Info($"ERROR : {ex.Message}");
-        throw;
- }
-
- return newAsset;
-}
-
-/// <summary>
-/// Creates a new asset and copies blobs from the specified storage account.
-/// </summary>
-/// <param name="blob">The specified blob.</param>
-/// <returns>The new asset.</returns>
-public static async Task<IAsset> CreateAssetFromBlobAsync(CloudBlockBlob blob, string assetName, TraceWriter log)
-{
- //Get a reference to the storage account that is associated with the Media Services account.
- _destinationStorageAccount = CloudStorageAccount.Parse(_connectionString);
-
- // Create a new asset.
- var asset = _context.Assets.Create(blob.Name, AssetCreationOptions.None);
- log.Info($"Created new asset {asset.Name}");
-
- IAccessPolicy writePolicy = _context.AccessPolicies.Create("writePolicy",
- TimeSpan.FromHours(4), AccessPermissions.Write);
- ILocator destinationLocator = _context.Locators.CreateLocator(LocatorType.Sas, asset, writePolicy);
- CloudBlobClient destBlobStorage = _destinationStorageAccount.CreateCloudBlobClient();
-
- // Get the destination asset container reference
- string destinationContainerName = (new Uri(destinationLocator.Path)).Segments[1];
- CloudBlobContainer assetContainer = destBlobStorage.GetContainerReference(destinationContainerName);
-
- try{
- assetContainer.CreateIfNotExists();
- }
- catch (Exception ex)
- {
- log.Error ("ERROR:" + ex.Message);
- }
-
-    log.Info("Destination container ready.");
-
- // Get hold of the destination blob
- CloudBlockBlob destinationBlob = assetContainer.GetBlockBlobReference(blob.Name);
-
- // Copy Blob
- try
- {
- using (var stream = await blob.OpenReadAsync())
- {
- await destinationBlob.UploadFromStreamAsync(stream);
- }
-
- log.Info("Copy Complete.");
-
- var assetFile = asset.AssetFiles.Create(blob.Name);
- assetFile.ContentFileSize = blob.Properties.Length;
- assetFile.IsPrimary = true;
- assetFile.Update();
- asset.Update();
- }
- catch (Exception ex)
- {
- log.Error(ex.Message);
- log.Info (ex.StackTrace);
- log.Info ("Copy Failed.");
- throw;
- }
-
- destinationLocator.Delete();
- writePolicy.Delete();
-
- return asset;
-}
-```
-
-## Test your function
-
-To test your function, you need to upload an MP4 file into the **input** container of the storage account that you specified in the connection string.
-
-1. Select the storage account that you specified in the **StorageConnection** environment variable.
-2. Click **Blobs**.
-3. Click **+ Container**. Name the container **input**.
-4. Press **Upload** and browse to a .mp4 file that you want to upload.
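-
-Alternatively, you can upload the test file programmatically. The following console snippet is a minimal sketch that uses the same storage SDK as the function; the blob name and local file path are placeholders:
-
-```csharp
-using System;
-using System.IO;
-using Microsoft.WindowsAzure.Storage;
-using Microsoft.WindowsAzure.Storage.Blob;
-
-class UploadTestFile
-{
-    static void Main()
-    {
-        // Connection string of the storage account that the function watches.
-        // Reading it from an environment variable is an assumption for illustration.
-        CloudStorageAccount account = CloudStorageAccount.Parse(
-            Environment.GetEnvironmentVariable("StorageConnection"));
-
-        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("input");
-        container.CreateIfNotExists();
-
-        // Upload a local MP4; the blob trigger should fire shortly afterwards.
-        CloudBlockBlob blob = container.GetBlockBlobReference("BigBuckBunny.mp4");
-        using (FileStream stream = File.OpenRead(@"C:\temp\BigBuckBunny.mp4"))
-        {
-            blob.UploadFromStream(stream);
-        }
-    }
-}
-```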
-
->[!NOTE]
-> When you're using a blob trigger on a Consumption plan, there can be up to a 10-minute delay in processing new blobs after a function app has gone idle. After the function app is running, blobs are processed immediately. For more information, see [Blob storage triggers and bindings](../../azure-functions/functions-bindings-storage-blob.md).
-
-## Next steps
-
-At this point, you are ready to start developing a Media Services application.
-
-For more details and complete samples and solutions that use Azure Functions and Logic Apps with Azure Media Services to create custom content creation workflows, see the [Media Services .NET Functions Integration Sample on GitHub](https://github.com/Azure-Samples/media-services-dotnet-functions-integration).
-
-Also, see [Use Azure WebHooks to monitor Media Services job notifications with .NET](media-services-dotnet-check-job-progress-with-webhooks.md).
-
-## Provide feedback
media-services Media Services Dotnet How To Use https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-how-to-use.md
- Title: How to Set Up Computer for Media Services Development with .NET
-description: Learn about the prerequisites for Media Services using the Media Services SDK for .NET. Also learn how to create a Visual Studio app.
------ Previously updated : 03/10/2021---
-# Media Services development with .NET
-
-This article discusses how to start developing Media Services applications using .NET.
-
-The **Azure Media Services .NET SDK** library enables you to program against Media Services using .NET. To make it even easier to develop with .NET, the **Azure Media Services .NET SDK Extensions** library is provided. This library contains a set of extension methods and helper functions that simplify your .NET code. Both libraries are available through **NuGet** and **GitHub**.
-
-## Prerequisites
-* A Media Services account in a new or existing Azure subscription. See the article [How to Create a Media Services Account](media-services-portal-create-account.md).
-* Operating Systems: Windows 10, Windows 7, Windows 2008 R2, or Windows 8.
-* .NET Framework 4.5 or later.
-* Visual Studio.
-
-## Create and configure a Visual Studio project
-This section shows you how to create a project in Visual Studio and set it up for Media Services development. In this case, the project is a C# Windows console application, but the same setup steps shown here apply to other types of projects you can create for Media Services applications (for example, a Windows Forms application or an ASP.NET Web application).
-
-This section shows how to use **NuGet** to add Media Services .NET SDK extensions and other dependent libraries.
-
-Alternatively, you can get the latest Media Services .NET SDK bits from GitHub ([github.com/Azure/azure-sdk-for-media-services](https://github.com/Azure/azure-sdk-for-media-services) or [github.com/Azure/azure-sdk-for-media-services-extensions](https://github.com/Azure/azure-sdk-for-media-services-extensions)), build the solution, and add the references to the client project. All the necessary dependencies get downloaded and extracted automatically.
-
-1. Create a new C# Console Application in Visual Studio. Enter the **Name**, **Location**, and **Solution name**, and then click OK.
-2. Build the solution.
-3. Use **NuGet** to install and add **Azure Media Services .NET SDK Extensions** (**windowsazure.mediaservices.extensions**). Installing this package also installs the **Media Services .NET SDK** and adds all other required dependencies.
-
-    Ensure that you have the newest version of NuGet installed. For more information and installation instructions, see [NuGet](https://www.nuget.org/).
-
- 1. In Solution Explorer, right-click the name of the project and choose **Manage NuGet Packages**.
-
- 2. The Manage NuGet Packages dialog box appears.
-
- 3. In the Online gallery, search for Azure MediaServices Extensions, choose **Azure Media Services .NET SDK Extensions** (**windowsazure.mediaservices.extensions**), and then click the **Install** button.
-
- 4. The project is modified and references to the Media Services .NET SDK Extensions, Media Services .NET SDK, and other dependent assemblies are added.
-4. To promote a cleaner development environment, consider enabling NuGet Package Restore. For more information, see [NuGet Package Restore](https://docs.nuget.org/consume/package-restore).
-5. Add a reference to **System.Configuration** assembly. This assembly contains the System.Configuration.**ConfigurationManager** class that is used to access configuration files (for example, App.config).
-
- 1. To add references using the Manage References dialog, right-click the project name in the Solution Explorer. Then, click **Add**, then click **Reference...**.
-
- 2. The Manage References dialog appears.
- 3. Under .NET framework assemblies, find and select the System.Configuration assembly and press **OK**.
-6. Open the App.config file and add an **appSettings** section to the file. Set the values that are needed to connect to the Media Services API. For more information, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
- Set the values that are needed to connect using the **Service principal** authentication method.
-
- ```xml
- <configuration>
- ...
- <appSettings>
- <add key="AMSAADTenantDomain" value="tenant"/>
- <add key="AMSRESTAPIEndpoint" value="endpoint"/>
- <add key="AMSClientId" value="id"/>
- <add key="AMSClientSecret" value="secret"/>
- </appSettings>
- </configuration>
- ```
-
-7. Overwrite the existing **using** statements at the beginning of the Program.cs file with the following code:
-
- ```csharp
- using System;
- using System.Configuration;
- using System.IO;
- using Microsoft.WindowsAzure.MediaServices.Client;
- using System.Threading;
- using System.Collections.Generic;
- using System.Linq;
- ```
-
- At this point, you are ready to start developing a Media Services application.
-
-## Example
-
-Here is a small example that connects to the AMS API and lists all available Media Processors.
-
-```csharp
-class Program
-{
- // Read values from the App.config file.
-
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static CloudMediaContext _context = null;
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // List all available Media Processors
- foreach (var mp in _context.MediaProcessors)
- {
- Console.WriteLine(mp.Name);
- }
-
-    }
-}
-```
-
-## Next steps
-
-Now [you can connect to the AMS API](media-services-use-aad-auth-to-access-ams-api.md) and start [developing](media-services-dotnet-get-started.md).
-
-## Media Services learning paths
-
-## Provide feedback
media-services Media Services Dotnet Live Encode With Onpremises Encoders https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-live-encode-with-onpremises-encoders.md
- Title: How to perform live streaming with on-premises encoders using .NET | Microsoft Docs
-description: This topic shows how to use .NET to perform live encoding with on-premises encoders.
------ Previously updated : 03/10/2021---
-# How to perform live streaming with on-premises encoders using .NET
-
-> [!div class="op_single_selector"]
-> * [Portal](media-services-portal-live-passthrough-get-started.md)
-> * [.NET](media-services-dotnet-live-encode-with-onpremises-encoders.md)
-> * [REST](/rest/api/media/operations/channel)
->
->
-
-This tutorial walks you through the steps of using the Azure Media Services .NET SDK to create a **Channel** that is configured for pass-through delivery.
-
-## Prerequisites
-The following are required to complete the tutorial:
-
-* An Azure account.
-* A Media Services account. To create a Media Services account, see [How to Create a Media Services Account](media-services-portal-create-account.md).
-* Make sure the streaming endpoint from which you want to stream content is in the **Running** state.
-* Set up your dev environment. For more information, see [Set up your environment](media-services-set-up-computer.md).
-* A webcam and an on-premises live encoder, for example, the [Telestream Wirecast encoder](media-services-configure-wirecast-live-encoder.md).
-
-We recommend that you review the following articles:
-
-* [Azure Media Services RTMP Support and Live Encoders](https://azure.microsoft.com/blog/2014/09/18/azure-media-services-rtmp-support-and-live-encoders/)
-* [Live streaming with on-premises encoders that create multi-bitrate streams](media-services-live-streaming-with-onprem-encoders.md)
-
-## Create and configure a Visual Studio project
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-## Example
-
-The following code example demonstrates how to achieve the following tasks:
-
-* Connect to Media Services
-* Create a channel
-* Update the channel
-* Retrieve the channel's input endpoint. The input endpoint should be provided to the on-premises live encoder. The live encoder converts signals from the camera to streams that are sent to the channel's input (ingest) endpoint.
-* Retrieve the channel's preview endpoint
-* Create and start a program
-* Create a locator needed to access the program
-* Create and start a StreamingEndpoint
-* Update the streaming endpoint
-* Shut down resources
-
->[!NOTE]
->There is a limit of 1,000,000 policies for the different AMS policy types (for example, Locator policies or ContentKeyAuthorizationPolicy). Use the same policy ID if you always use the same days and access permissions; for example, policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) article.
-
-For information on how to configure a live encoder, see [Azure Media Services RTMP Support and Live Encoders](https://azure.microsoft.com/blog/2014/09/18/azure-media-services-rtmp-support-and-live-encoders/).
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.Configuration;
-using System.Linq;
-using System.Net;
-using System.Security.Cryptography;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-namespace AMSLiveTest
-{
- class Program
- {
- private const string StreamingEndpointName = "streamingendpoint001";
- private const string ChannelName = "channel001";
- private const string AssetName = "asset001";
- private const string ProgramName = "program001";
-
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- IChannel channel = CreateAndStartChannel();
-
- // Set the Live Encoder to point to the channel's input endpoint:
- string ingestUrl = channel.Input.Endpoints.FirstOrDefault().Url.ToString();
-
- // Use the previewEndpoint to preview and verify
- // that the input from the encoder is actually reaching the Channel.
- string previewEndpoint = channel.Preview.Endpoints.FirstOrDefault().Url.ToString();
-
- IProgram program = CreateAndStartProgram(channel);
- ILocator locator = CreateLocatorForAsset(program.Asset, program.ArchiveWindowLength);
- IStreamingEndpoint streamingEndpoint = CreateAndStartStreamingEndpoint(false);
-
- // Once you are done streaming, clean up your resources.
- Cleanup(streamingEndpoint, channel);
- }
-
- public static IChannel CreateAndStartChannel()
- {
-            // If you want to change the Smooth fragments to HLS segment ratio, set the Output property of ChannelCreationOptions.
-
- IChannel channel = _context.Channels.Create(
- new ChannelCreationOptions
- {
- Name = ChannelName,
- Input = CreateChannelInput(),
- Preview = CreateChannelPreview()
- });
-
-            // Starting and stopping channels can take some time to execute. To determine the state of operations after calling Start or Stop, query IChannel.State.
-
- channel.Start();
-
- return channel;
- }
-
- private static ChannelInput CreateChannelInput()
- {
- // When creating a Channel, you can specify allowed IP addresses in one of the following formats:
- // IpV4 address with 4 numbers
- // CIDR address range
-
- return new ChannelInput
- {
- StreamingProtocol = StreamingProtocol.RTMP,
- AccessControl = new ChannelAccessControl
- {
- IPAllowList = new List<IPRange>
- {
- new IPRange
- {
- Name = "TestChannelInput001",
-                            // Setting 0.0.0.0 for Address and 0 for SubnetPrefixLength
-                            // will allow access from all IP addresses.
- Address = IPAddress.Parse("0.0.0.0"),
- SubnetPrefixLength = 0
- }
- }
- }
- };
- }
-
- private static ChannelPreview CreateChannelPreview()
- {
- // When creating a Channel, you can specify allowed IP addresses in one of the following formats:
- // IpV4 address with 4 numbers
- // CIDR address range
-
- return new ChannelPreview
- {
- AccessControl = new ChannelAccessControl
- {
- IPAllowList = new List<IPRange>
- {
- new IPRange
- {
- Name = "TestChannelPreview001",
-                            // Setting 0.0.0.0 for Address and 0 for SubnetPrefixLength
-                            // will allow access from all IP addresses.
- Address = IPAddress.Parse("0.0.0.0"),
- SubnetPrefixLength = 0
- }
- }
- }
- };
- }
-
- public static void UpdateCrossSiteAccessPoliciesForChannel(IChannel channel)
- {
- var clientPolicy =
- @"<?xml version=""1.0"" encoding=""utf-8""?>
- <access-policy>
- <cross-domain-access>
- <policy>
- <allow-from http-request-headers=""*"" http-methods=""*"">
- <domain uri=""*""/>
- </allow-from>
- <grant-to>
- <resource path=""/"" include-subpaths=""true""/>
- </grant-to>
- </policy>
- </cross-domain-access>
- </access-policy>";
-
- var xdomainPolicy =
- @"<?xml version=""1.0"" ?>
- <cross-domain-policy>
- <allow-access-from domain=""*"" />
- </cross-domain-policy>";
-
- channel.CrossSiteAccessPolicies.ClientAccessPolicy = clientPolicy;
- channel.CrossSiteAccessPolicies.CrossDomainPolicy = xdomainPolicy;
-
- channel.Update();
- }
-
- public static IProgram CreateAndStartProgram(IChannel channel)
- {
- IAsset asset = _context.Assets.Create(AssetName, AssetCreationOptions.None);
-
- // Create a Program on the Channel. You can have multiple Programs that overlap or are sequential;
- // however each Program must have a unique name within your Media Services account.
- IProgram program = channel.Programs.Create(ProgramName, TimeSpan.FromHours(3), asset.Id);
- program.Start();
-
- return program;
- }
-
- public static ILocator CreateLocatorForAsset(IAsset asset, TimeSpan ArchiveWindowLength)
- {
- // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.
-
- var locator = _context.Locators.CreateLocator
- (
- LocatorType.OnDemandOrigin,
- asset,
- _context.AccessPolicies.Create
- (
- "Live Stream Policy",
- ArchiveWindowLength,
- AccessPermissions.Read
- )
- );
-
- return locator;
- }
-
- public static IStreamingEndpoint CreateAndStartStreamingEndpoint(bool createNew)
- {
- IStreamingEndpoint streamingEndpoint = null;
- if (createNew)
- {
- var options = new StreamingEndpointCreationOptions
- {
- Name = StreamingEndpointName,
- ScaleUnits = 1,
- AccessControl = GetAccessControl(),
- CacheControl = GetCacheControl()
- };
-
- streamingEndpoint = _context.StreamingEndpoints.Create(options);
- }
- else
- {
- streamingEndpoint = _context.StreamingEndpoints.FirstOrDefault();
- }
-
- if (streamingEndpoint.State == StreamingEndpointState.Stopped)
- streamingEndpoint.Start();
-
- return streamingEndpoint;
- }
-
- private static StreamingEndpointAccessControl GetAccessControl()
- {
- return new StreamingEndpointAccessControl
- {
- IPAllowList = new List<IPRange>
- {
- new IPRange
- {
- Name = "Allow all",
- Address = IPAddress.Parse("0.0.0.0"),
- SubnetPrefixLength = 0
- }
- },
-
- AkamaiSignatureHeaderAuthenticationKeyList = new List<AkamaiSignatureHeaderAuthenticationKey>
- {
- new AkamaiSignatureHeaderAuthenticationKey
- {
- Identifier = "My key",
- Expiration = DateTime.UtcNow + TimeSpan.FromDays(365),
- Base64Key = Convert.ToBase64String(GenerateRandomBytes(16))
- }
- }
- };
- }
-
- private static byte[] GenerateRandomBytes(int length)
- {
- var bytes = new byte[length];
- using (var rng = new RNGCryptoServiceProvider())
- {
- rng.GetBytes(bytes);
- }
-
- return bytes;
- }
-
- private static StreamingEndpointCacheControl GetCacheControl()
- {
- return new StreamingEndpointCacheControl
- {
- MaxAge = TimeSpan.FromSeconds(1000)
- };
- }
-
- public static void UpdateCrossSiteAccessPoliciesForStreamingEndpoint(IStreamingEndpoint streamingEndpoint)
- {
- var clientPolicy =
- @"<?xml version=""1.0"" encoding=""utf-8""?>
- <access-policy>
- <cross-domain-access>
- <policy>
- <allow-from http-request-headers=""*"" http-methods=""*"">
- <domain uri=""*""/>
- </allow-from>
- <grant-to>
- <resource path=""/"" include-subpaths=""true""/>
- </grant-to>
- </policy>
- </cross-domain-access>
- </access-policy>";
-
- var xdomainPolicy =
- @"<?xml version=""1.0"" ?>
- <cross-domain-policy>
- <allow-access-from domain=""*"" />
- </cross-domain-policy>";
-
- streamingEndpoint.CrossSiteAccessPolicies.ClientAccessPolicy = clientPolicy;
- streamingEndpoint.CrossSiteAccessPolicies.CrossDomainPolicy = xdomainPolicy;
-
- streamingEndpoint.Update();
- }
-
- public static void Cleanup(IStreamingEndpoint streamingEndpoint,
- IChannel channel)
- {
- if (streamingEndpoint != null)
- {
- streamingEndpoint.Stop();
- if (streamingEndpoint.Name != "default")
- streamingEndpoint.Delete();
- }
-
- IAsset asset;
- if (channel != null)
- {
-
- foreach (var program in channel.Programs)
- {
- asset = _context.Assets.Where(se => se.Id == program.AssetId)
- .FirstOrDefault();
-
- program.Stop();
- program.Delete();
-
- if (asset != null)
- {
- foreach (var l in asset.Locators)
- l.Delete();
-
- asset.Delete();
- }
- }
-
- channel.Stop();
- channel.Delete();
- }
- }
- }
-}
-```
-
-## Next Step
-Review the Media Services learning paths.
-
-## Provide feedback
media-services Media Services Dotnet Long Operations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-long-operations.md
- Title: Polling Long-Running Operations | Microsoft Docs
-description: Azure Media Services offers APIs that send requests to Media Services to start operations (for example, create, start, stop, or delete a channel); these operations are long-running. This topic shows how to poll long-running operations.
--
-writer: juliako
---- Previously updated : 03/10/2021----
-# Polling Long-Running Operations
-
-## Overview
-
-Microsoft Azure Media Services offers APIs that send requests to Media Services to start operations (for example: create, start, stop, or delete a channel). These operations are long-running.
-
-The Media Services .NET SDK provides APIs that send the request and wait for the operation to complete (internally, the APIs poll for operation progress at set intervals). For example, when you call channel.Start(), the method returns after the channel is started. You can also use the asynchronous version: await channel.StartAsync() (for information about the Task-based Asynchronous Pattern, see [TAP](./media-services-mes-schema.md)). APIs that send an operation request and then poll for the status until the operation completes are called "polling methods". These methods (especially the Async versions) are recommended for rich client applications and/or stateful services.
-
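-For example, a polling method looks like this in use (a minimal sketch; it assumes an existing IChannel and the usual Media Services SDK and System.Threading.Tasks usings):
-
-```csharp
-// Minimal sketch of the "polling method" style described above: the call
-// returns only after the channel has actually started.
-static async Task StartChannelAsync(IChannel channel)
-{
-    // Blocking (polling) version: channel.Start();
-
-    // Task-based asynchronous version of the same polling method.
-    await channel.StartAsync();
-}
-```
-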
-There are scenarios where an application cannot wait for a long-running HTTP request and wants to poll for the operation progress manually. A typical example would be a browser interacting with a stateless web service: when the browser requests to create a channel, the web service initiates a long-running operation and returns the operation ID to the browser. The browser could then ask the web service to get the operation status based on the ID. The Media Services .NET SDK provides APIs that are useful for this scenario. These APIs are called "non-polling methods".
-The "non-polling methods" have the following naming pattern: Send*OperationName*Operation (for example, SendCreateOperation). Send*OperationName*Operation methods return an **IOperation** object; the returned object contains information that can be used to track the operation. The Send*OperationName*OperationAsync methods return **Task\<IOperation>**.
-
-Currently, the following classes support non-polling methods: **Channel**, **StreamingEndpoint**, and **Program**.
-
-To poll for the operation status, use the **GetOperation** method on the **OperationBaseCollection** class. Use the following intervals to check the operation status: for **Channel** and **StreamingEndpoint** operations, use 30 seconds; for **Program** operations, use 10 seconds.
-
-## Create and configure a Visual Studio project
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-## Example
-
-The following example defines a class called **ChannelOperations**. This class definition could be a starting point for your web service class definition. For simplicity, the following examples use the non-async versions of methods.
-
-The example also shows how the client might use this class.
-
-### ChannelOperations class definition
-
-```csharp
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System;
-using System.Collections.Generic;
-using System.Configuration;
-using System.Net;
-
-/// <summary>
-/// The ChannelOperations class only implements
-/// the Channel's creation operation.
-/// </summary>
-public class ChannelOperations
-{
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- public ChannelOperations()
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
- }
-
- /// <summary>
- /// Initiates the creation of a new channel.
- /// </summary>
- /// <param name="channelName">Name to be given to the new channel</param>
- /// <returns>
- /// Operation Id for the long running operation being executed by Media Services.
- /// Use this operation Id to poll for the channel creation status.
- /// </returns>
- public string StartChannelCreation(string channelName)
- {
- var operation = _context.Channels.SendCreateOperation(
- new ChannelCreationOptions
- {
- Name = channelName,
- Input = CreateChannelInput(),
- Preview = CreateChannelPreview(),
- Output = CreateChannelOutput()
- });
-
- return operation.Id;
- }
-
-    /// <summary>
-    /// Checks if the operation has been completed.
-    /// </summary>
-    /// <param name="operationId">The operation Id.</param>
-    /// <param name="channelId">
-    /// If the operation succeeded,
-    /// the created channel Id is returned in the out parameter.</param>
-    /// <returns>Returns false if the operation is still in progress; otherwise, true.</returns>
- public bool IsCompleted(string operationId, out string channelId)
- {
- IOperation operation = _context.Operations.GetOperation(operationId);
- bool completed = false;
-
- channelId = null;
-
- switch (operation.State)
- {
-            case OperationState.Failed:
-                // The operation completed but failed, so stop polling.
-                // Handle the failure here; for example, throw an exception
-                // using operationId and operation.ErrorMessage.
-                completed = true;
-                break;
- case OperationState.Succeeded:
- completed = true;
- channelId = operation.TargetEntityId;
- break;
- case OperationState.InProgress:
- completed = false;
- break;
- }
- return completed;
- }
-
- private static ChannelInput CreateChannelInput()
- {
- return new ChannelInput
- {
- StreamingProtocol = StreamingProtocol.RTMP,
- AccessControl = new ChannelAccessControl
- {
- IPAllowList = new List<IPRange>
- {
- new IPRange
- {
- Name = "TestChannelInput001",
- Address = IPAddress.Parse("0.0.0.0"),
- SubnetPrefixLength = 0
- }
- }
- }
- };
- }
-
- private static ChannelPreview CreateChannelPreview()
- {
- return new ChannelPreview
- {
- AccessControl = new ChannelAccessControl
- {
- IPAllowList = new List<IPRange>
- {
- new IPRange
- {
- Name = "TestChannelPreview001",
- Address = IPAddress.Parse("0.0.0.0"),
- SubnetPrefixLength = 0
- }
- }
- }
- };
- }
-
- private static ChannelOutput CreateChannelOutput()
- {
- return new ChannelOutput
- {
- Hls = new ChannelOutputHls { FragmentsPerSegment = 1 }
- };
- }
-}
-```
-
-### The client code
-
-```csharp
-ChannelOperations channelOperations = new ChannelOperations();
-string opId = channelOperations.StartChannelCreation("MyChannel001");
-
-string channelId = null;
-bool isCompleted = false;
-
-while (isCompleted == false)
-{
- System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30));
- isCompleted = channelOperations.IsCompleted(opId, out channelId);
-}
-
-// If we got here, we should have the newly created channel id.
-Console.WriteLine(channelId);
-```
-
-## Media Services learning paths
-
-## Provide feedback
media-services Media Services Dotnet Manage Entities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-manage-entities.md
- Title: Managing Assets and Related Entities with Media Services .NET SDK
-description: Learn how to manage assets and related entities with the Media Services SDK for .NET.
------ Previously updated : 03/10/2021---
-# Managing Assets and Related Entities with Media Services .NET SDK
-
-> [!div class="op_single_selector"]
-> * [.NET](media-services-dotnet-manage-entities.md)
-> * [REST](media-services-rest-manage-entities.md)
->
->
-
-This topic shows how to manage Azure Media Services entities with .NET.
-
-Starting April 1, 2017, any Job record in your account older than 90 days will be automatically deleted, along with its associated Task records, even if the total number of records is below the maximum quota. For example, on April 1, 2017, any Job record in your account older than December 31, 2016, will be automatically deleted. If you need to archive the job/task information, you can use the code described in this topic.
-
-## Prerequisites
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-## Get an Asset Reference
-A frequent task is to get a reference to an existing asset in Media Services. The following code example shows how to get an asset reference from the Assets collection on the server context object, based on an asset Id, by using a LINQ query to return an existing IAsset object.
-
-```csharp
- static IAsset GetAsset(string assetId)
- {
- // Use a LINQ Select query to get an asset.
- var assetInstance =
- from a in _context.Assets
- where a.Id == assetId
- select a;
- // Reference the asset as an IAsset.
- IAsset asset = assetInstance.FirstOrDefault();
-
- return asset;
- }
-```
-
-## List All Assets
-As the number of assets you have in storage grows, it is helpful to list your assets. The following code example shows how to iterate through the Assets collection on the server context object. With each asset, the code example also writes some of its property values to the console. For example, each asset can contain many media files. The code example writes out all files associated with each asset.
-
-```csharp
- static void ListAssets()
- {
- string waitMessage = "Building the list. This may take a few "
- + "seconds to a few minutes depending on how many assets "
- + "you have."
- + Environment.NewLine + Environment.NewLine
- + "Please wait..."
- + Environment.NewLine;
- Console.Write(waitMessage);
-
- // Create a Stringbuilder to store the list that we build.
- StringBuilder builder = new StringBuilder();
-
- foreach (IAsset asset in _context.Assets)
- {
- // Display the collection of assets.
- builder.AppendLine("");
- builder.AppendLine("******ASSET******");
- builder.AppendLine("Asset ID: " + asset.Id);
- builder.AppendLine("Name: " + asset.Name);
- builder.AppendLine("==============");
- builder.AppendLine("******ASSET FILES******");
-
- // Display the files associated with each asset.
- foreach (IAssetFile fileItem in asset.AssetFiles)
- {
- builder.AppendLine("Name: " + fileItem.Name);
- builder.AppendLine("Size: " + fileItem.ContentFileSize);
- builder.AppendLine("==============");
- }
- }
-
- // Display output in console.
- Console.Write(builder.ToString());
- }
-```
-
-## Get a Job Reference
-
-When you work with processing tasks in Media Services code, you often need to get a reference to an existing job based on an Id. The following code example shows how to get a reference to an IJob object from the Jobs collection.
-
-For example, you may need a job reference when you start a long-running encoding job and check the job status on a thread. When the method returns from the thread, you need to retrieve a refreshed reference to the job.
-
-```csharp
- static IJob GetJob(string jobId)
- {
- // Use a Linq select query to get an updated
- // reference by Id.
- var jobInstance =
- from j in _context.Jobs
- where j.Id == jobId
- select j;
- // Return the job reference as an Ijob.
- IJob job = jobInstance.FirstOrDefault();
-
- return job;
- }
-```
-
-## List Jobs and Assets
-An important related task is to list assets with their associated job in Media Services. The following code example shows you how to list each IJob object, and then for each job, it displays properties about the job, all related tasks, all input assets, and all output assets. The code in this example can be useful for numerous other tasks. For example, if you want to list the output assets from one or more encoding jobs that you ran previously, this code shows how to access the output assets. When you have a reference to an output asset, you can then deliver the content to other users or applications by downloading it, or providing URLs.
-
-For more information on options for delivering assets, see [Deliver Assets with the Media Services SDK for .NET](media-services-deliver-streaming-content.md).
-
-```csharp
- // List all jobs on the server, and for each job, also list
- // all tasks, all input assets, all output assets.
-
- static void ListJobsAndAssets()
- {
- string waitMessage = "Building the list. This may take a few "
- + "seconds to a few minutes depending on how many assets "
- + "you have."
- + Environment.NewLine + Environment.NewLine
- + "Please wait..."
- + Environment.NewLine;
- Console.Write(waitMessage);
-
- // Create a Stringbuilder to store the list that we build.
- StringBuilder builder = new StringBuilder();
-
- foreach (IJob job in _context.Jobs)
- {
- // Display the collection of jobs on the server.
- builder.AppendLine("");
- builder.AppendLine("******JOB*******");
- builder.AppendLine("Job ID: " + job.Id);
- builder.AppendLine("Name: " + job.Name);
- builder.AppendLine("State: " + job.State);
-                builder.AppendLine("Priority: " + job.Priority);
- builder.AppendLine("==============");
-
- // For each job, display the associated tasks (a job
- // has one or more tasks).
- builder.AppendLine("******TASKS*******");
- foreach (ITask task in job.Tasks)
- {
- builder.AppendLine("Task Id: " + task.Id);
- builder.AppendLine("Name: " + task.Name);
- builder.AppendLine("Progress: " + task.Progress);
- builder.AppendLine("Configuration: " + task.Configuration);
- if (task.ErrorDetails != null)
- {
- builder.AppendLine("Error: " + task.ErrorDetails);
- }
- builder.AppendLine("==============");
- }
-
- // For each job, display the list of input media assets.
- builder.AppendLine("******JOB INPUT MEDIA ASSETS*******");
- foreach (IAsset inputAsset in job.InputMediaAssets)
- {
-
- if (inputAsset != null)
- {
- builder.AppendLine("Input Asset Id: " + inputAsset.Id);
- builder.AppendLine("Name: " + inputAsset.Name);
- builder.AppendLine("==============");
- }
- }
-
- // For each job, display the list of output media assets.
- builder.AppendLine("******JOB OUTPUT MEDIA ASSETS*******");
- foreach (IAsset theAsset in job.OutputMediaAssets)
- {
- if (theAsset != null)
- {
- builder.AppendLine("Output Asset Id: " + theAsset.Id);
- builder.AppendLine("Name: " + theAsset.Name);
- builder.AppendLine("==============");
- }
- }
-
- }
-
- // Display output in console.
- Console.Write(builder.ToString());
- }
-```
-
-## List all Access Policies
-In Media Services, you can define an access policy on an asset or its files. An access policy defines the permissions for a file or an asset (what type of access, and the duration). In your Media Services code, you typically define an access policy by creating an IAccessPolicy object and then associating it with an existing asset. Then you create an ILocator object, which lets you provide direct access to assets in Media Services. The Visual Studio project that accompanies this documentation series contains several code examples that show how to create and assign access policies and locators to assets.
-
-The following code example shows how to list all access policies on the server, and shows the type of permissions associated with each. Another useful way to view access policies is to list all ILocator objects on the server, and then for each locator, you can list its associated access policy by using its AccessPolicy property.
-
-```csharp
- static void ListAllPolicies()
- {
- foreach (IAccessPolicy policy in _context.AccessPolicies)
- {
- Console.WriteLine("");
- Console.WriteLine("Name: " + policy.Name);
- Console.WriteLine("ID: " + policy.Id);
- Console.WriteLine("Permissions: " + policy.Permissions);
- Console.WriteLine("==============");
-
- }
- }
-```
-
-## Limit Access Policies
-
->[!NOTE]
-> There is a limit of 1,000,000 policies for the different AMS policy types (for example, Locator policies or ContentKeyAuthorizationPolicy). Use the same policy ID if you always use the same days and access permissions; for example, policies for locators that are intended to remain in place for a long time (non-upload policies).
-
-For example, you can create a generic set of policies with the following code that would only run one time in your application. You can log IDs to a log file for later use:
-
-```csharp
- double year = 365.25;
- double week = 7;
- IAccessPolicy policyYear = _context.AccessPolicies.Create("One Year", TimeSpan.FromDays(year), AccessPermissions.Read);
- IAccessPolicy policy100Year = _context.AccessPolicies.Create("Hundred Years", TimeSpan.FromDays(year * 100), AccessPermissions.Read);
- IAccessPolicy policyWeek = _context.AccessPolicies.Create("One Week", TimeSpan.FromDays(week), AccessPermissions.Read);
-
- Console.WriteLine("One year policy ID is: " + policyYear.Id);
- Console.WriteLine("100 year policy ID is: " + policy100Year.Id);
- Console.WriteLine("One week policy ID is: " + policyWeek.Id);
-```
-
-Then, you can use the existing IDs in your code like this:
-
-```csharp
- const string policy1YearId = "nb:pid:UUID:2a4f0104-51a9-4078-ae26-c730f88d35cf";
-
- // Get the standard policy for 1 year read only
- var tempPolicyId = from b in _context.AccessPolicies
- where b.Id == policy1YearId
- select b;
- IAccessPolicy policy1Year = tempPolicyId.FirstOrDefault();
-
- // Get the existing asset
- var tempAsset = from a in _context.Assets
- where a.Id == assetID
- select a;
- IAsset asset = tempAsset.SingleOrDefault();
-
- ILocator originLocator = _context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset,
- policy1Year,
- DateTime.UtcNow.AddMinutes(-5));
- Console.WriteLine("The locator base path is " + originLocator.BaseUri.ToString());
-```
-
-## List All Locators
-A locator is a URL that provides a direct path to access an asset, along with permissions to the asset as defined by the locator's associated access policy. Each asset can have a collection of ILocator objects associated with it on its Locators property. The server context also has a Locators collection that contains all locators.
-
-The following code example lists all locators on the server. For each locator, it shows the Id for the related asset and access policy. It also displays the type of permissions, the expiration date, and the full path to the asset.
-
-Note that a locator path to an asset is only a base URL to the asset. To create a direct path to individual files that a user or application could browse to, your code must add the specific file path to the locator path (a short sketch follows the code example below). For more information on how to do this, see the topic [Deliver Assets with the Media Services SDK for .NET](media-services-deliver-streaming-content.md).
-
-```csharp
- static void ListAllLocators()
- {
- foreach (ILocator locator in _context.Locators)
- {
- Console.WriteLine("***********");
- Console.WriteLine("Locator Id: " + locator.Id);
- Console.WriteLine("Locator asset Id: " + locator.AssetId);
- Console.WriteLine("Locator access policy Id: " + locator.AccessPolicyId);
- Console.WriteLine("Access policy permissions: " + locator.AccessPolicy.Permissions);
- Console.WriteLine("Locator expiration: " + locator.ExpirationDateTime);
- // The locator path is the base or parent path (with included permissions) to access
- // the media content of an asset. To create a full URL to a specific media file, take
- // the locator path and then append a file name and info as needed.
- Console.WriteLine("Locator base path: " + locator.Path);
- Console.WriteLine("");
- }
- }
-```
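-
-As noted above, a full URL is formed by appending a file name to the locator base path. The following is a minimal sketch, assuming `asset` and `originLocator` were created as shown earlier in this article (and the usual System.Linq using):
-
-```csharp
-    // Sketch only: append the .ism manifest file name plus "/manifest" to the
-    // on-demand origin locator's base path to form a Smooth Streaming URL.
-    IAssetFile manifestFile = asset.AssetFiles.ToList()
-        .First(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase));
-    string smoothStreamingUrl = originLocator.Path + manifestFile.Name + "/manifest";
-    Console.WriteLine("Streaming URL: " + smoothStreamingUrl);
-```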
-
-## Enumerating through large collections of entities
-When querying entities, there is a limit of 1,000 entities returned at one time, because public REST v2 limits query results to 1,000. Use Skip and Take when enumerating through large collections of entities.
-
-The following function loops through all the jobs in the provided Media Services account. Media Services returns at most 1,000 jobs per query. The function uses Skip and Take to make sure that all jobs are enumerated (in case you have more than 1,000 jobs in your account).
-
-```csharp
- static void ProcessJobs()
- {
- try
- {
-
- int skipSize = 0;
- int batchSize = 1000;
- int currentBatch = 0;
-
- while (true)
- {
- // Loop through all Jobs (1000 at a time) in the Media Services account
- IQueryable _jobsCollectionQuery = _context.Jobs.Skip(skipSize).Take(batchSize);
- foreach (IJob job in _jobsCollectionQuery)
- {
- currentBatch++;
- Console.WriteLine("Processing Job Id:" + job.Id);
- }
-
- if (currentBatch == batchSize)
- {
- skipSize += batchSize;
- currentBatch = 0;
- }
- else
- {
- break;
- }
- }
- }
- catch (Exception ex)
- {
- Console.WriteLine(ex.Message);
- }
- }
-```
-
-## Delete an Asset
-The following example deletes an asset.
-
-```csharp
- static void DeleteAsset( IAsset asset)
- {
- // delete the asset
- asset.Delete();
-
- // Verify asset deletion
- if (GetAsset(asset.Id) == null)
- Console.WriteLine("Deleted the Asset");
-
- }
-```
-
-## Delete a Job
-To delete a job, you must check the state of the job as indicated in the State property. Jobs that are finished or canceled can be deleted, while jobs that are in certain other states, such as queued, scheduled, or processing, must be canceled first, and then they can be deleted.
-
-The following code example shows a method for deleting a job by checking the job state and then deleting when the state is finished or canceled. This code depends on getting a reference to a job, as described earlier in this article (a minimal sketch of that helper follows the example below).
-
-```csharp
- static void DeleteJob(string jobId)
- {
- bool jobDeleted = false;
-
- while (!jobDeleted)
- {
- // Get an updated job reference.
- IJob job = GetJob(jobId);
-
- // Check and handle various possible job states. You can
- // only delete a job whose state is Finished, Error, or Canceled.
- // You can cancel jobs that are Queued, Scheduled, or Processing,
- // and then delete after they are canceled.
- switch (job.State)
- {
- case JobState.Finished:
- case JobState.Canceled:
- case JobState.Error:
- // Job errors should already be logged by polling or event
- // handling methods such as CheckJobProgress or StateChanged.
- // You can also call job.DeleteAsync to do async deletes.
- job.Delete();
- Console.WriteLine("Job has been deleted.");
- jobDeleted = true;
- break;
- case JobState.Canceling:
- Console.WriteLine("Job is cancelling and will be deleted "
- + "when finished.");
- Console.WriteLine("Wait while job finishes canceling...");
- Thread.Sleep(5000);
- break;
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- job.Cancel();
- Console.WriteLine("Job is scheduled or processing and will "
- + "be deleted.");
- break;
- default:
- break;
- }
-
- }
- }
-```
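-
-The DeleteJob method above calls a GetJob helper that appears earlier in this article. For convenience, a minimal sketch of such a helper looks like this:
-
-```csharp
-    // Sketch of the GetJob helper referenced above: look up a job by its Id,
-    // returning null if no job with that Id exists.
-    static IJob GetJob(string jobId)
-    {
-        var jobInstance =
-            from j in _context.Jobs
-            where j.Id == jobId
-            select j;
-
-        return jobInstance.FirstOrDefault();
-    }
-```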
-
-## Delete an Access Policy
-The following code example shows how to get a reference to an access policy based on a policy Id, and then to delete the policy.
-
-```csharp
- static void DeleteAccessPolicy(string existingPolicyId)
- {
-        // To delete a specific access policy, get a reference to the policy
-        // based on the policy Id passed to the method.
-        var policyInstance =
-             from p in _context.AccessPolicies
-             where p.Id == existingPolicyId
-             select p;
-        IAccessPolicy policy = policyInstance.FirstOrDefault();
-
-        // FirstOrDefault returns null if no policy matches the specified Id.
-        if (policy != null)
-        {
-            policy.Delete();
-        }
-
- }
-```
-
media-services Media Services Dotnet Manage Streaming Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-manage-streaming-endpoints.md
- Title: Manage streaming endpoints with .NET SDK | Microsoft Docs
-description: This article demonstrates how to manage streaming endpoints with the .NET SDK.
-writer: juliako
-Previously updated: 03/10/2021
-# Manage streaming endpoints with .NET SDK
-
->[!NOTE]
->Make sure to review the [overview](media-services-streaming-endpoints-overview.md) article. Also, review [StreamingEndpoint](/rest/api/media/operations/streamingendpoint).
-
-The code in this article shows how to do the following tasks using the Azure Media Services .NET SDK:
-- Examine the default streaming endpoint.
-- Create/add a new streaming endpoint.
-
- You might want to have multiple streaming endpoints if you plan to have different CDNs or a CDN and direct access.
-
- > [!NOTE]
- > You are only billed when your Streaming Endpoint is in running state.
-
-- Update the streaming endpoint.
-
- Make sure to call the Update() function.
-- Delete the streaming endpoint.
-
- >[!NOTE]
- >The default streaming endpoint cannot be deleted.
-
-For information about how to scale the streaming endpoint, see [this](media-services-portal-scale-streaming-endpoints.md) article.
-
-## Create and configure a Visual Studio project
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
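-
-For reference, the sample in the next section reads the following **appSettings** keys; the values shown here are placeholders for your own tenant domain, REST API endpoint, and Azure AD application credentials:
-
-```xml
-<appSettings>
-  <!-- Placeholder values; substitute your own account details. -->
-  <add key="AMSAADTenantDomain" value="tenant.onmicrosoft.com" />
-  <add key="AMSRESTAPIEndpoint" value="https://accountname.restv2.region.media.azure.net/api/" />
-  <add key="AMSClientId" value="client_id" />
-  <add key="AMSClientSecret" value="client_secret" />
-</appSettings>
-```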
-
-## Add code that manages streaming endpoints
-
-Replace the code in Program.cs with the following code:
-
-```csharp
-using System;
-using System.Configuration;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Microsoft.WindowsAzure.MediaServices.Client.Live;
-
-namespace AMSStreamingEndpoint
-{
- class Program
- {
- // Read values from the App.config file.
-
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- var defaultStreamingEndpoint = _context.StreamingEndpoints.Where(s => s.Name.Contains("default")).FirstOrDefault();
- ExamineStreamingEndpoint(defaultStreamingEndpoint);
-
- IStreamingEndpoint newStreamingEndpoint = AddStreamingEndpoint();
- UpdateStreamingEndpoint(newStreamingEndpoint);
- DeleteStreamingEndpoint(newStreamingEndpoint);
- }
-
- static public void ExamineStreamingEndpoint(IStreamingEndpoint streamingEndpoint)
- {
- Console.WriteLine(streamingEndpoint.Name);
- Console.WriteLine(streamingEndpoint.StreamingEndpointVersion);
- Console.WriteLine(streamingEndpoint.FreeTrialEndTime);
- Console.WriteLine(streamingEndpoint.ScaleUnits);
- Console.WriteLine(streamingEndpoint.CdnProvider);
- Console.WriteLine(streamingEndpoint.CdnProfile);
- Console.WriteLine(streamingEndpoint.CdnEnabled);
- }
-
- static public IStreamingEndpoint AddStreamingEndpoint()
- {
- var name = "StreamingEndpoint" + DateTime.UtcNow.ToString("hhmmss");
- var option = new StreamingEndpointCreationOptions(name, 1)
- {
- StreamingEndpointVersion = new Version("2.0"),
- CdnEnabled = true,
- CdnProfile = "CdnProfile",
- CdnProvider = CdnProviderType.PremiumVerizon
- };
-
- var streamingEndpoint = _context.StreamingEndpoints.Create(option);
-
- return streamingEndpoint;
- }
-
- static public void UpdateStreamingEndpoint(IStreamingEndpoint streamingEndpoint)
- {
- if (streamingEndpoint.StreamingEndpointVersion == "1.0")
- streamingEndpoint.StreamingEndpointVersion = "2.0";
-
- streamingEndpoint.CdnEnabled = false;
- streamingEndpoint.Update();
- }
-
- static public void DeleteStreamingEndpoint(IStreamingEndpoint streamingEndpoint)
- {
- streamingEndpoint.Delete();
- }
- }
-}
-```
-
-## Next steps
-Review Media Services learning paths.
media-services Media Services Dotnet Telemetry https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-telemetry.md
- Title: Configuring Azure Media Services telemetry with .NET | Microsoft Docs
-description: This article shows you how to use Azure Media Services telemetry with the .NET SDK.
-Previously updated: 03/10/2021
-# Configuring Azure Media Services telemetry with .NET
-
-This article describes general steps that you might take when configuring the Azure Media Services (AMS) telemetry using .NET SDK.
-
->[!NOTE]
->For a detailed explanation of what AMS telemetry is and how to consume it, see the [overview](media-services-telemetry-overview.md) article.
-
-You can consume telemetry data in one of the following ways:
-- Read data directly from Azure Table Storage (for example, using the Storage SDK). For a description of the telemetry storage tables, see the **Consuming telemetry information** section in [this](/previous-versions/azure/mt742089(v=azure.100)) article.
-Or
-- Use the support in the Media Services .NET SDK for reading storage data. This article shows how to enable telemetry for the specified AMS account and how to query the metrics using the Azure Media Services .NET SDK.
-## Configuring telemetry for a Media Services account
-
-The following steps are needed to enable telemetry:
-- Get the credentials of the storage account attached to the Media Services account.
-- Create a Notification Endpoint with **EndPointType** set to **AzureTable** and endPointAddress pointing to the storage table.
-
-```csharp
- INotificationEndPoint notificationEndPoint =
- _context.NotificationEndPoints.Create("monitoring",
- NotificationEndPointType.AzureTable,
- "https://" + _mediaServicesStorageAccountName + ".table.core.windows.net/");
-```
-- Create a monitoring configuration setting for the services you want to monitor. No more than one monitoring configuration setting is allowed.
-
-```csharp
- IMonitoringConfiguration monitoringConfiguration = _context.MonitoringConfigurations.Create(notificationEndPoint.Id,
- new List<ComponentMonitoringSetting>()
- {
- new ComponentMonitoringSetting(MonitoringComponent.Channel, MonitoringLevel.Normal),
- new ComponentMonitoringSetting(MonitoringComponent.StreamingEndpoint, MonitoringLevel.Normal)
- });
-```
-
-## Consuming telemetry information
-
-For information about consuming telemetry information, see [this](media-services-telemetry-overview.md) article.
-
-## Create and configure a Visual Studio project
-
-1. Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-2. Add the following element to **appSettings** defined in your app.config file:
-
- ```xml
- <add key="StorageAccountName" value="storage_name" />
- ```
-
-## Example
-
-The following example shows how to enable telemetry for the specified AMS account and how to query the metrics using the Azure Media Services .NET SDK.
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.Configuration;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-namespace AMSMetrics
-{
- class Program
- {
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static readonly string _mediaServicesStorageAccountName =
- ConfigurationManager.AppSettings["StorageAccountName"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- private static IStreamingEndpoint _streamingEndpoint = null;
- private static IChannel _channel = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- _streamingEndpoint = _context.StreamingEndpoints.FirstOrDefault();
- _channel = _context.Channels.FirstOrDefault();
-
- var monitoringConfigurations = _context.MonitoringConfigurations;
- IMonitoringConfiguration monitoringConfiguration = null;
-
-            // No more than one monitoring configuration setting is allowed.
- if (monitoringConfigurations.ToArray().Length != 0)
- {
- monitoringConfiguration = _context.MonitoringConfigurations.FirstOrDefault();
- }
- else
- {
- INotificationEndPoint notificationEndPoint =
- _context.NotificationEndPoints.Create("monitoring",
- NotificationEndPointType.AzureTable, GetTableEndPoint());
-
- monitoringConfiguration = _context.MonitoringConfigurations.Create(notificationEndPoint.Id,
- new List<ComponentMonitoringSetting>()
- {
- new ComponentMonitoringSetting(MonitoringComponent.Channel, MonitoringLevel.Normal),
- new ComponentMonitoringSetting(MonitoringComponent.StreamingEndpoint, MonitoringLevel.Normal)
-
- });
- }
-
- //Print metrics for a Streaming Endpoint.
- PrintStreamingEndpointMetrics();
-
- Console.ReadLine();
- }
-
- private static string GetTableEndPoint()
- {
- return "https://" + _mediaServicesStorageAccountName + ".table.core.windows.net/";
- }
-
- private static void PrintStreamingEndpointMetrics()
- {
- Console.WriteLine(string.Format("Telemetry for streaming endpoint '{0}'", _streamingEndpoint.Name));
-
- DateTime timerangeEnd = DateTime.UtcNow;
- DateTime timerangeStart = DateTime.UtcNow.AddHours(-5);
-
- // Get some streaming endpoint metrics.
- var telemetry = _streamingEndpoint.GetTelemetry();
-
- var res = telemetry.GetStreamingEndpointRequestLogs(timerangeStart, timerangeEnd);
-
- Console.Title = "Streaming endpoint metrics:";
-
- foreach (var log in res)
- {
- Console.WriteLine("AccountId: {0}", log.AccountId);
- Console.WriteLine("BytesSent: {0}", log.BytesSent);
- Console.WriteLine("EndToEndLatency: {0}", log.EndToEndLatency);
- Console.WriteLine("HostName: {0}", log.HostName);
- Console.WriteLine("ObservedTime: {0}", log.ObservedTime);
- Console.WriteLine("PartitionKey: {0}", log.PartitionKey);
- Console.WriteLine("RequestCount: {0}", log.RequestCount);
- Console.WriteLine("ResultCode: {0}", log.ResultCode);
- Console.WriteLine("RowKey: {0}", log.RowKey);
- Console.WriteLine("ServerLatency: {0}", log.ServerLatency);
- Console.WriteLine("StatusCode: {0}", log.StatusCode);
- Console.WriteLine("StreamingEndpointId: {0}", log.StreamingEndpointId);
- Console.WriteLine();
- }
-
- Console.WriteLine();
- }
-
- private static void PrintChannelMetrics()
- {
- if (_channel == null)
- {
- Console.WriteLine("There are no channels in this AMS account");
- return;
- }
-
- Console.WriteLine(string.Format("Telemetry for channel '{0}'", _channel.Name));
-
- DateTime timerangeEnd = DateTime.UtcNow;
- DateTime timerangeStart = DateTime.UtcNow.AddHours(-5);
-
- // Get some channel metrics.
- var telemetry = _channel.GetTelemetry();
-
- var channelMetrics = telemetry.GetChannelHeartbeats(timerangeStart, timerangeEnd);
-
- // Print the channel metrics.
- Console.WriteLine("Channel metrics:");
-
- foreach (var channelHeartbeat in channelMetrics.OrderBy(x => x.ObservedTime))
- {
- Console.WriteLine(
- " Observed time: {0}, Last timestamp: {1}, Incoming bitrate: {2}",
- channelHeartbeat.ObservedTime,
- channelHeartbeat.LastTimestamp,
- channelHeartbeat.IncomingBitrate);
- }
-
- Console.WriteLine();
- }
- }
-}
-```
-
-
media-services Media Services Dotnet Upload Files https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dotnet-upload-files.md
- Title: Upload files into a Media Services account using .NET | Microsoft Docs
-description: Learn how to get media content into Media Services by creating and uploading assets using .NET.
-Previously updated: 03/10/2021
-# Upload files into a Media Services account using .NET
-
-In Media Services, you upload (or ingest) your digital files into an asset. The **Asset** entity can contain video, audio, images, thumbnail collections, text tracks, and closed-caption files (and the metadata about these files). Once the files are uploaded, your content is stored securely in the cloud for further processing and streaming.
-
-The files in the asset are called **Asset Files**. The **AssetFile** instance and the actual media file are two distinct objects. The AssetFile instance contains metadata about the media file, while the media file contains the actual media content.
-
-## Considerations
-
-The following considerations apply:
-
-* Media Services uses the value of the IAssetFile.Name property when building URLs for the streaming content (for example, http://{AMSAccount}.origin.mediaservices.windows.net/{GUID}/{IAssetFile.Name}/streamingParameters). For this reason, percent-encoding is not allowed. The value of the **Name** property cannot have any of the following [percent-encoding-reserved characters](https://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters): !*'();:@&=+$,/?%#[]". Also, there can be only one '.' for the file name extension. (A simple validation sketch follows this list.)
-* The length of the name should not be greater than 260 characters.
-* There is a limit to the maximum file size supported for processing in Media Services. See [this](media-services-quotas-and-limitations.md) article for details about the file size limitation.
-* There is a limit of 1,000,000 policies for different AMS policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). You should use the same policy ID if you are always using the same days / access permissions, for example, policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) article.
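-
-The following is a minimal, hypothetical helper (not part of the SDK) that checks a proposed file name against the naming rules above:
-
-```csharp
-using System;
-using System.Linq;
-
-static class AssetFileNameValidator
-{
-    // Reserved characters listed above; none may appear in IAssetFile.Name.
-    private const string Reserved = "!*'();:@&=+$,/?%#[]\"";
-
-    public static bool IsValidAssetFileName(string name)
-    {
-        if (string.IsNullOrEmpty(name) || name.Length > 260) return false;
-        if (name.Count(c => c == '.') > 1) return false;  // at most one '.', for the extension
-        return !name.Any(c => Reserved.Contains(c));
-    }
-}
-```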
-
-When you create assets, you can specify the following encryption options:
-
-* **None** - No encryption is used. This is the default value. When using this option, your content is not protected in transit or at rest in storage.
- If you plan to deliver an MP4 using progressive download, use this option.
-* **CommonEncryption** - Use this option if you are uploading content that has already been encrypted and protected with Common Encryption or PlayReady DRM (for example, Smooth Streaming protected with PlayReady DRM).
-* **EnvelopeEncrypted** - Use this option if you are uploading HLS content encrypted with AES. Note that the files must have been encoded and encrypted by Transform Manager.
-* **StorageEncrypted** - Encrypts your clear content locally using AES-256 bit encryption and then uploads it to Azure Storage where it is stored encrypted at rest. Assets protected with Storage Encryption are automatically unencrypted and placed in an encrypted file system prior to encoding, and optionally re-encrypted prior to uploading back as a new output asset. The primary use case for Storage Encryption is when you want to secure your high-quality input media files with strong encryption at rest on disk.
-
- Media Services provides on-disk storage encryption for your assets, not over-the-wire like Digital Rights Manager (DRM).
-
- If your asset is storage encrypted, you must configure asset delivery policy. For more information, see [Configuring asset delivery policy](media-services-dotnet-configure-asset-delivery-policy.md).
-
-If you specify for your asset to be encrypted with a **CommonEncrypted** option, or an **EnvelopeEncrypted** option, you need to associate your asset with a **ContentKey**. For more information, see [How to create a ContentKey](media-services-dotnet-create-contentkey.md).
-
-If you specify for your asset to be encrypted with a **StorageEncrypted** option, the Media Services SDK for .NET creates a **StorageEncrypted** **ContentKey** for your asset.
-
-This article shows how to use Media Services .NET SDK as well as Media Services .NET SDK extensions to upload files into a Media Services asset.
-
-## Upload a single file with Media Services .NET SDK
-
-The following code uses .NET to upload a single file. The AccessPolicy and Locator are created and destroyed by the Upload function.
-
-```csharp
- static public IAsset CreateAssetAndUploadSingleFile(AssetCreationOptions assetCreationOptions, string singleFilePath)
- {
- if (!File.Exists(singleFilePath))
- {
- Console.WriteLine("File does not exist.");
- return null;
- }
-
- var assetName = Path.GetFileNameWithoutExtension(singleFilePath);
- IAsset inputAsset = _context.Assets.Create(assetName, assetCreationOptions);
-
- var assetFile = inputAsset.AssetFiles.Create(Path.GetFileName(singleFilePath));
-
- Console.WriteLine("Upload {0}", assetFile.Name);
-
- assetFile.Upload(singleFilePath);
- Console.WriteLine("Done uploading {0}", assetFile.Name);
-
- return inputAsset;
- }
-```
-
-## Upload multiple files with Media Services .NET SDK
-The following code shows how to create an asset and upload multiple files.
-
-The code does the following:
-
-* Creates an empty asset using the CreateEmptyAsset method defined in the previous step.
-* Creates an **AccessPolicy** instance that defines the permissions and duration of access to the asset.
-* Creates a **Locator** instance that provides access to the asset.
-* Creates a **BlobTransferClient** instance. This type represents a client that operates on the Azure blobs. In this example, the client monitors the upload progress.
-* Enumerates through files in the specified directory and creates an **AssetFile** instance for each file.
-* Uploads the files into Media Services using the **UploadAsync** method.
-
-> [!NOTE]
-> Use the UploadAsync method to ensure that the calls are not blocking and the files are uploaded in parallel.
->
->
-
-```csharp
- static public IAsset CreateAssetAndUploadMultipleFiles(AssetCreationOptions assetCreationOptions, string folderPath)
- {
- var assetName = "UploadMultipleFiles_" + DateTime.UtcNow.ToString();
-
- IAsset asset = _context.Assets.Create(assetName, assetCreationOptions);
-
- var accessPolicy = _context.AccessPolicies.Create(assetName, TimeSpan.FromDays(30),
- AccessPermissions.Write | AccessPermissions.List);
-
- var locator = _context.Locators.CreateLocator(LocatorType.Sas, asset, accessPolicy);
-
- var blobTransferClient = new BlobTransferClient();
- blobTransferClient.NumberOfConcurrentTransfers = 20;
- blobTransferClient.ParallelTransferThreadCount = 20;
-
- blobTransferClient.TransferProgressChanged += blobTransferClient_TransferProgressChanged;
-
- var filePaths = Directory.EnumerateFiles(folderPath);
-
- Console.WriteLine("There are {0} files in {1}", filePaths.Count(), folderPath);
-
- if (!filePaths.Any())
- {
- throw new FileNotFoundException(String.Format("No files in directory, check folderPath: {0}", folderPath));
- }
-
- var uploadTasks = new List<Task>();
- foreach (var filePath in filePaths)
- {
- var assetFile = asset.AssetFiles.Create(Path.GetFileName(filePath));
- Console.WriteLine("Created assetFile {0}", assetFile.Name);
-
- // It is recommended to validate AssetFiles before upload.
- Console.WriteLine("Start uploading of {0}", assetFile.Name);
- uploadTasks.Add(assetFile.UploadAsync(filePath, blobTransferClient, locator, CancellationToken.None));
- }
-
- Task.WaitAll(uploadTasks.ToArray());
- Console.WriteLine("Done uploading the files");
-
- blobTransferClient.TransferProgressChanged -= blobTransferClient_TransferProgressChanged;
-
- locator.Delete();
- accessPolicy.Delete();
-
- return asset;
- }
-
- static void blobTransferClient_TransferProgressChanged(object sender, BlobTransferProgressChangedEventArgs e)
- {
- if (e.ProgressPercentage > 4) // Avoid startup jitter, as the upload tasks are added.
- {
- Console.WriteLine("{0}% upload competed for {1}.", e.ProgressPercentage, e.LocalFile);
- }
- }
-```
-
-When uploading a large number of assets, consider the following (a short sketch follows this list):
-
-* Create a new **CloudMediaContext** object per thread. The **CloudMediaContext** class is not thread-safe.
-* Increase NumberOfConcurrentTransfers from the default value of 2 to a higher value like 5. Setting this property affects all instances of **CloudMediaContext**.
-* Keep ParallelTransferThreadCount at the default value of 10.
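-
-A minimal sketch of these recommendations follows. CreateContext() is a hypothetical helper that returns a new CloudMediaContext built from your credentials, as in the samples above:
-
-```csharp
-    // Sketch only: one CloudMediaContext per thread, with a higher
-    // concurrent transfer count.
-    Parallel.ForEach(Directory.EnumerateFiles(folderPath), filePath =>
-    {
-        CloudMediaContext context = CreateContext();  // hypothetical helper
-        context.NumberOfConcurrentTransfers = 5;      // raised from the default of 2
-        context.ParallelTransferThreadCount = 10;     // keep the default value
-
-        IAsset asset = context.Assets.CreateFromFile(filePath, AssetCreationOptions.None);
-        Console.WriteLine("Created asset {0}", asset.Id);
-    });
-```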
-
-## <a id="ingest_in_bulk"></a>Ingesting Assets in Bulk using Media Services .NET SDK
-Uploading large asset files can be a bottleneck during asset creation. Ingesting assets in bulk, or "bulk ingesting," involves decoupling asset creation from the upload process. To use a bulk ingesting approach, create a manifest (IngestManifest) that describes the asset and its associated files. Then use the upload method of your choice to upload the associated files to the manifest's blob container. Microsoft Azure Media Services watches the blob container associated with the manifest. Once a file is uploaded to the blob container, Microsoft Azure Media Services completes the asset creation based on the configuration of the asset in the manifest (IngestManifestAsset).
-
-To create a new IngestManifest, call the Create method exposed by the IngestManifests collection on the CloudMediaContext. This method creates a new IngestManifest with the manifest name you provide.
-
-```csharp
- IIngestManifest manifest = context.IngestManifests.Create(name);
-```
-
-Create the assets that are associated with the bulk IngestManifest. Configure the desired encryption options on the asset for bulk ingesting.
-
-```csharp
- // Create the assets that will be associated with this bulk ingest manifest
- IAsset destAsset1 = _context.Assets.Create(name + "_asset_1", AssetCreationOptions.None);
- IAsset destAsset2 = _context.Assets.Create(name + "_asset_2", AssetCreationOptions.None);
-```
-
-An IngestManifestAsset associates an Asset with a bulk IngestManifest for bulk ingesting. It also associates the AssetFiles that make up each Asset.
-To create an IngestManifestAsset, use the Create method on the server context.
-
-The following example demonstrates adding two new IngestManifestAssets that associate the two assets previously created to the bulk ingest manifest. Each IngestManifestAsset also associates a set of files that are uploaded for each asset during bulk ingesting.
-
-```csharp
- string filename1 = _singleInputMp4Path;
- string filename2 = _primaryFilePath;
- string filename3 = _singleInputFilePath;
-
- IIngestManifestAsset bulkAsset1 = manifest.IngestManifestAssets.Create(destAsset1, new[] { filename1 });
- IIngestManifestAsset bulkAsset2 = manifest.IngestManifestAssets.Create(destAsset2, new[] { filename2, filename3 });
-```
-
-You can use any high-speed client application capable of uploading the asset files to the blob storage container URI provided by the **IIngestManifest.BlobStorageUriForUpload** property of the IngestManifest.
-
-The following code shows how to use .NET SDK to upload the assets files.
-
-```csharp
-    // The first argument is the full container URI returned by
-    // IIngestManifest.BlobStorageUriForUpload, so the container reference is
-    // built directly from that URI (GetContainerReference expects a container
-    // name, not a URI).
-    static void UploadBlobFile(string containerUri, string filename)
-    {
-        Task copytask = new Task(() =>
-        {
-            var storageaccount = new CloudStorageAccount(new StorageCredentials(_storageAccountName, _storageAccountKey), true);
-            CloudBlobContainer blobContainer = new CloudBlobContainer(new Uri(containerUri), storageaccount.Credentials);
-
- string[] splitfilename = filename.Split('\\');
- var blob = blobContainer.GetBlockBlobReference(splitfilename[splitfilename.Length - 1]);
-
- using (var stream = System.IO.File.OpenRead(filename))
- blob.UploadFromStream(stream);
-
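-            // consoleWriteLock is a static object field, for example:
-            //   private static readonly object consoleWriteLock = new object();
-            // It synchronizes console output across concurrent upload tasks.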
- lock (consoleWriteLock)
- {
- Console.WriteLine("Upload for {0} completed.", filename);
- }
- });
-
- copytask.Start();
- }
-```
-
-The code for uploading the asset files for the sample used in this article is shown in the following code example:
-
-```csharp
- UploadBlobFile(manifest.BlobStorageUriForUpload, filename1);
- UploadBlobFile(manifest.BlobStorageUriForUpload, filename2);
- UploadBlobFile(manifest.BlobStorageUriForUpload, filename3);
-```
-
-You can determine the progress of the bulk ingesting for all assets associated with an **IngestManifest** by polling the Statistics property of the **IngestManifest**. In order to update progress information, you must use a new **CloudMediaContext** each time you poll the Statistics property.
-
-The following example demonstrates polling an IngestManifest by its **Id**.
-
-```csharp
- static void MonitorBulkManifest(string manifestID)
- {
- bool bContinue = true;
- while (bContinue)
- {
- CloudMediaContext context = GetContext();
- IIngestManifest manifest = context.IngestManifests.Where(m => m.Id == manifestID).FirstOrDefault();
-
- if (manifest != null)
- {
- lock(consoleWriteLock)
- {
- Console.WriteLine("\nWaiting on all file uploads.");
- Console.WriteLine("PendingFilesCount : {0}", manifest.Statistics.PendingFilesCount);
- Console.WriteLine("FinishedFilesCount : {0}", manifest.Statistics.FinishedFilesCount);
- Console.WriteLine("{0}% complete.\n", (float)manifest.Statistics.FinishedFilesCount / (float)(manifest.Statistics.FinishedFilesCount + manifest.Statistics.PendingFilesCount) * 100);
-
- if (manifest.Statistics.PendingFilesCount == 0)
- {
- Console.WriteLine("Completed\n");
- bContinue = false;
- }
- }
-
- if (manifest.Statistics.FinishedFilesCount < manifest.Statistics.PendingFilesCount)
- Thread.Sleep(60000);
- }
- else // Manifest is null
- bContinue = false;
- }
- }
-```
-
-## Upload files using .NET SDK Extensions
-The following example shows how to upload a single file using .NET SDK Extensions. In this case the **CreateFromFile** method is used, but the asynchronous version is also available (**CreateFromFileAsync**). The **CreateFromFile** method lets you specify the file name, encryption option, and a callback in order to report the upload progress of the file.
-
-```csharp
- static public IAsset UploadFile(string fileName, AssetCreationOptions options)
- {
- IAsset inputAsset = _context.Assets.CreateFromFile(
- fileName,
- options,
- (af, p) =>
- {
- Console.WriteLine("Uploading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);
- });
-
- Console.WriteLine("Asset {0} created.", inputAsset.Id);
-
- return inputAsset;
- }
-```
-
-The following example calls the UploadFile function and specifies storage encryption as the asset creation option.
-
-```csharp
- var asset = UploadFile(@"C:\VideoFiles\BigBuckBunny.mp4", AssetCreationOptions.StorageEncrypted);
-```
-
-## Next steps
-
-You can now encode your uploaded assets. For more information, see [Encode assets](media-services-portal-encode.md).
-
-You can also use Azure Functions to trigger an encoding job based on a file arriving in the configured container. For more information, see [this sample](https://azure.microsoft.com/resources/samples/media-services-dotnet-functions-integration/ ).
-
-Now that you have uploaded an asset to Media Services, go to the [How to Get a Media Processor][How to Get a Media Processor] article.
-
-[How to Get a Media Processor]: media-services-get-media-processor.md
media-services Media Services Dynamic Manifest Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dynamic-manifest-overview.md
- Title: Filters and dynamic manifests | Microsoft Docs
-description: This topic describes how to create filters so your client can use them to stream specific sections of a stream. Media Services creates dynamic manifests to achieve this selective streaming.
-Previously updated: 03/10/2021
-# Filters and dynamic manifests
-
-> [!div class="op_single_selector" title1="Select the version of Media Services that you are using:"]
-> * [Version 2](media-services-dynamic-manifest-overview.md)
-> * [Version 3](../latest/filters-dynamic-manifest-concept.md)
-
-Starting with the 2.17 release, Media Services enables you to define filters for your assets. These filters are server-side rules that let your customers do things like play back only a section of a video (instead of the whole video), or receive only the subset of audio and video renditions that their device can handle (instead of all the renditions associated with the asset). This filtering of your assets is achieved through **Dynamic Manifests** that are created when your customer requests to stream a video based on the specified filter(s).
-
-This topic discusses common scenarios in which using filters would be beneficial to your customers and links to topics that demonstrate how to create filters programmatically.
-
-## Overview
-When delivering your content to customers (streaming live events or video-on-demand), your goal is to deliver a high-quality video to various devices under different network conditions. To achieve this goal, do the following:
-
-* encode your stream to a multi-bitrate ([adaptive bitrate](https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming)) video stream (this takes care of quality and network conditions), and
-* use Media Services [Dynamic Packaging](media-services-dynamic-packaging-overview.md) to dynamically re-package your stream into different protocols (this will take care of streaming on different devices). Media Services supports delivery of the following adaptive bitrate streaming technologies: HTTP Live Streaming (HLS), Smooth Streaming, and MPEG DASH.
-
-### Manifest files
-When you encode an asset for adaptive bitrate streaming, a **manifest** (playlist) file is created (the file is text-based or XML-based). The **manifest** file includes streaming metadata such as track type (audio, video, or text), track name, start and end time, bitrate (qualities), track languages, presentation window (sliding window of fixed duration), and video codec (FourCC). It also instructs the player to retrieve the next fragment by providing information about the next playable video fragments available and their location. Fragments (or segments) are the actual "chunks" of the video content.
-
-Here is an example of a manifest file:
-
-```xml
-<?xml version="1.0" encoding="UTF-8"?>
-<SmoothStreamingMedia MajorVersion="2" MinorVersion="0" Duration="330187755" TimeScale="10000000">
-
-<StreamIndex Chunks="17" Type="video" Url="QualityLevels({bitrate})/Fragments(video={start time})" QualityLevels="8">
-<QualityLevel Index="0" Bitrate="5860941" FourCC="H264" MaxWidth="1920" MaxHeight="1080" CodecPrivateData="0000000167640028AC2CA501E0089F97015202020280000003008000001931300016E360000E4E1FF8C7076850A4580000000168E9093525" />
-<QualityLevel Index="1" Bitrate="4602724" FourCC="H264" MaxWidth="1920" MaxHeight="1080" CodecPrivateData="0000000167640028AC2CA501E0089F97015202020280000003008000001931100011EDC00002CD29FF8C7076850A45800000000168E9093525" />
-<QualityLevel Index="2" Bitrate="3319311" FourCC="H264" MaxWidth="1280" MaxHeight="720" CodecPrivateData="000000016764001FAC2CA5014016EC054808080A00000300020000030064C0800067C28000103667F8C7076850A4580000000168E9093525" />
-<QualityLevel Index="3" Bitrate="2195119" FourCC="H264" MaxWidth="960" MaxHeight="540" CodecPrivateData="000000016764001FAC2CA503C045FBC054808080A000000300200000064C1000044AA0000ABA9FE31C1DA14291600000000168E9093525" />
-<QualityLevel Index="4" Bitrate="1469881" FourCC="H264" MaxWidth="960" MaxHeight="540" CodecPrivateData="000000016764001FAC2CA503C045FBC054808080A000000300200000064C04000B71A0000E4E1FF8C7076850A4580000000168E9093525" />
-<QualityLevel Index="5" Bitrate="978815" FourCC="H264" MaxWidth="640" MaxHeight="360" CodecPrivateData="000000016764001EAC2CA50280BFE5C0548303032000000300200000064C08001E8480004C4B7F8C7076850A45800000000168E9093525" />
-<QualityLevel Index="6" Bitrate="638374" FourCC="H264" MaxWidth="640" MaxHeight="360" CodecPrivateData="000000016764001EAC2CA50280BFE5C0548303032000000300200000064C080013D60000C65DFE31C1DA1429160000000168E9093525" />
-<QualityLevel Index="7" Bitrate="388851" FourCC="H264" MaxWidth="320" MaxHeight="180" CodecPrivateData="000000016764000DAC2CA505067E7C054830303200000300020000030064C040030D40003D093F8C7076850A45800000000168E9093525" />
-
-<c t="0" d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="20000000" /><c d="9600000"/>
-</StreamIndex>
-
-<StreamIndex Chunks="17" Type="audio" Url="QualityLevels({bitrate})/Fragments(AAC_und_ch2_128kbps={start time})" QualityLevels="1" Name="AAC_und_ch2_128kbps">
-<QualityLevel AudioTag="255" Index="0" BitsPerSample="16" Bitrate="125658" FourCC="AACL" CodecPrivateData="1210" Channels="2" PacketSize="4" SamplingRate="44100" />
-
-<c t="0" d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="6965987" /></StreamIndex>
-
-<StreamIndex Chunks="17" Type="audio" Url="QualityLevels({bitrate})/Fragments(AAC_und_ch2_56kbps={start time})" QualityLevels="1" Name="AAC_und_ch2_56kbps">
-<QualityLevel AudioTag="255" Index="0" BitsPerSample="16" Bitrate="53655" FourCC="AACL" CodecPrivateData="1210" Channels="2" PacketSize="4" SamplingRate="44100" />
-
-<c t="0" d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201361" /><c d="20201360" /><c d="20201361" /><c d="20201360" /><c d="6965987" /></StreamIndex>
-
-</SmoothStreamingMedia>
-```
-
-### Dynamic manifests
-There are [scenarios](media-services-dynamic-manifest-overview.md#scenarios) when your client needs more flexibility than what's described in the default asset's manifest file. For example:
-
-* Device specific: deliver only the specified renditions and/or specified language tracks that are supported by the device that is used to play back the content ("rendition filtering").
-* Reduce the manifest to show a sub-clip of a live event ("sub-clip filtering").
-* Trim the start of a video ("trimming a video").
-* Adjust Presentation Window (DVR) in order to provide a limited length of the DVR window in the player ("adjusting presentation window").
-
-To achieve this flexibility, Media Services offers **Dynamic Manifests** based on pre-defined [filters](media-services-dynamic-manifest-overview.md#filters). Once you define the filters, your clients could use them to stream a specific rendition or sub-clips of your video. They would specify filter(s) in the streaming URL. Filters could be applied to adaptive bitrate streaming protocols supported by [Dynamic Packaging](media-services-dynamic-packaging-overview.md): HLS, MPEG-DASH, and Smooth Streaming. For example:
-
-MPEG DASH URL with filter
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf,filter=MyLocalFilter)`
-
-Smooth Streaming URL with filter
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(filter=MyLocalFilter)`
-
-For more information about how to deliver your content and build streaming URLs, see [Delivering content overview](media-services-deliver-content-overview.md).
-
-> [!NOTE]
-> Note that Dynamic Manifests do not change the asset and the default manifest for that asset. Your client can choose to request a stream with or without filters.
->
->
-
-### <a id="filters"></a>Filters
-There are two types of asset filters:
-
-* Global filters (can be applied to any asset in the Azure Media Services account, have a lifetime of the account) and
-* Local filters (can only be applied to an asset with which the filter was associated upon creation, have a lifetime of the asset).
-
-Global and local filter types have exactly the same properties. The main difference between the two is which scenarios each type of filter is better suited for. Global filters are generally suitable for device profiles (rendition filtering), while local filters can be used to trim a specific asset.
-
-## <a id="scenarios"></a>Common scenarios
-As was mentioned before, when delivering your content to customers (streaming live events or video-on-demand), your goal is to deliver a high-quality video to various devices under different network conditions. In addition, you might have other requirements that involve filtering your assets and using **Dynamic Manifests**. The following sections give a short overview of different filtering scenarios.
-
-* Specify only a subset of audio and video renditions that certain devices can handle (instead of all the renditions that are associated with the asset).
-* Playing back only a section of a video (instead of playing the whole video).
-* Adjust DVR presentation window.
-
-## Rendition filtering
-You may choose to encode your asset to multiple encoding profiles (H.264 Baseline, H.264 High, AACL, AACH, Dolby Digital Plus) and multiple quality bitrates. However, not all client devices support all of your asset's profiles and bitrates. For example, older Android devices only support H.264 Baseline+AACL. Sending higher bitrates to a device that cannot benefit from them wastes bandwidth and device computation: such a device must decode all the given information, only to scale it down for display.
-
-With Dynamic Manifest, you can create device profiles such as mobile, console, and HD/SD, and include the tracks and qualities that you want to be part of each profile.
-
-![Rendition filtering example][renditions2]
-
-In the following example, an encoder was used to encode a mezzanine asset into seven ISO MP4 video renditions (from 180p to 1080p). The encoded asset can be dynamically packaged into any of the following streaming protocols: HLS, Smooth, and MPEG DASH. At the top of the diagram, the HLS manifest for the asset with no filters is shown (it contains all seven renditions). In the bottom left, the HLS manifest to which a filter named "ott" was applied is shown. The "ott" filter specifies to remove all bitrates below 1 Mbps, which resulted in the bottom two quality levels being stripped off in the response. In the bottom right, the HLS manifest to which a filter named "mobile" was applied is shown. The "mobile" filter specifies to remove renditions where the resolution is larger than 720p, which resulted in the two 1080p renditions being stripped off.
-
-![Rendition filtering][renditions1]
-
-## Removing language tracks
-Your assets might include multiple audio languages such as English, Spanish, French, etc. Usually, the player SDK manages the default audio track selection and the available audio tracks per user selection. Developing such player SDKs is challenging because it requires different implementations across device-specific player frameworks. Also, on some platforms, player APIs are limited and do not include an audio selection feature, so users cannot select or change the default audio track. With asset filters, you can control the behavior by creating filters that include only the desired audio languages.
-
-![Language tracks filtering][language_filter]
-
-## Trimming start of an asset
-In most live streaming events, operators run some tests before the actual event. For example, they might include a slate like this before the start of the event: "Program will begin momentarily". If the program is being archived, the test and slate data are also archived and included in the presentation. However, this information should not be shown to the clients. With Dynamic Manifest, you can create a start time filter and remove the unwanted data from the manifest.
-
-![Trimming start][trim_filter]
-
-## Creating subclips (views) from a live archive
-Many live events are long running, and a live archive might include multiple events. After the live event ends, broadcasters may want to break up the live archive into logical program start and stop sequences, and then publish these virtual programs separately without post-processing the live archive or creating separate assets (which would not get the benefit of the fragments already cached in the CDNs). Examples of such virtual programs are the quarters of a football or basketball game, innings in baseball, or individual events of any sports program.
-
-With Dynamic Manifest, you can create filters using start/end times and create virtual views over the top of your live archive.
-
-![Subclip filter][subclip_filter]
-
-Filtered Asset:
-
-![Skiing][skiing]
-
-## Adjusting Presentation Window (DVR)
-Currently, Azure Media Services offers a circular archive where the duration can be configured from 5 minutes to 25 hours. Manifest filtering can be used to create a rolling DVR window over the top of the archive, without deleting media. There are many scenarios where broadcasters want to provide a limited DVR window that moves with the live edge while keeping a bigger archiving window. A broadcaster may want to use the data that is outside of the DVR window to highlight clips, or may want to provide different DVR windows for different devices. For example, most mobile devices don't handle large DVR windows (you can have a 2-minute DVR window for mobile devices and one hour for desktop clients).
-
-![DVR window][dvr_filter]
-
-## Adjusting LiveBackoff (live position)
-Manifest filtering can be used to remove several seconds from the live edge of a live program. Filtering allows broadcasters to watch the presentation on the preview publication point and create advertisement insertion points before the viewers receive the stream (backed off by 30 seconds). Broadcasters can then push these advertisements to their client frameworks in time for them to receive and process the information before the advertisement opportunity.
-
-In addition to the advertisement support, the LiveBackoff setting can be used to adjust the viewer's position so that when clients drift and hit the live edge, they can still get fragments from the server instead of getting an HTTP 404 or 412 error.
-
-![livebackoff_filter][livebackoff_filter]
-
-## Combining multiple rules in a single filter
-You can combine multiple filtering rules in a single filter. As an example, you can define a "range rule" to remove slates from a live archive and also filter out available bitrates. When applying multiple filtering rules, the end result is the intersection of all rules.
-
-![multiple-rules][multiple-rules]
-
-## Create filters programmatically
-The following article discusses Media Services entities that are related to filters. The article also shows how to programmatically create filters.
-
-[Create filters with REST APIs](media-services-rest-dynamic-manifest.md).
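-
-For orientation, the following is a rough sketch of what creating a rendition filter can look like with the v2 .NET SDK. Treat the exact type names and overloads as assumptions, and refer to the article above for the authoritative API:
-
-```csharp
-    // Sketch only: a global filter that keeps video tracks at or above 1 Mbps.
-    var videoAbove1Mbps = new FilterTrackSelectStatement
-    {
-        PropertyConditions = new List<IFilterTrackPropertyCondition>
-        {
-            new FilterTrackTypeCondition(FilterTrackType.Video),
-            new FilterTrackBitrateRangeCondition(new FilterTrackBitrateRange(1048576, int.MaxValue))
-        }
-    };
-
-    IStreamingFilter filter = _context.Filters.Create(
-        "ott",                 // the filter name used in streaming URLs
-        null,                  // no presentation time range (rendition filtering only)
-        new List<FilterTrackSelectStatement> { videoAbove1Mbps });
-```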
-
-## Combining multiple filters (filter composition)
-You can also combine multiple filters in a single URL.
-
-The following scenario demonstrates why you might want to combine filters:
-
-1. You need to filter your video qualities for mobile devices such as Android or iPad (in order to limit video qualities). To remove the unwanted qualities, you would create a global filter suitable for the device profiles. As mentioned earlier in this article, global filters can be used for all your assets under the same Media Services account without any further association.
-2. You also want to trim the start and end time of an asset. To achieve this, you would create a local filter and set the start/end time.
-3. You want to combine both of these filters (without combination, you need to add quality filtering to the trimming filter which makes filter usage more difficult).
-
-To combine filters, append the filter names to the manifest/playlist URL, delimited by semicolons. Let's assume you have a filter named *MyMobileDevice* that filters qualities, and another named *MyStartTime* that sets a specific start time. You can combine them like this:
-
-`http://teststreaming.streaming.mediaservices.windows.net/3d56a4d-b71d-489b-854f-1d67c0596966/64ff1f89-b430-43f8-87dd-56c87b7bd9e2.ism/Manifest(filter=MyMobileDevice;MyStartTime)`
-
-You can combine up to three filters.
-
-For more information, see [this](https://azure.microsoft.com/blog/azure-media-services-release-dynamic-manifest-composition-remove-hls-audio-only-track-and-hls-i-frame-track-support/) blog.
-
-## Known issues and limitations
-* Dynamic manifest operates at GOP boundaries (key frames); hence, trimming has GOP accuracy.
-* You can use the same filter name for local and global filters. Local filters have higher precedence and will override global filters.
-* If you update a filter, it can take up to two minutes for the streaming endpoint to refresh the rules. If the content was served using some filters (and cached in proxies and CDN caches), updating these filters can result in player failures. We recommend that you clear the cache after updating the filter. If this option is not possible, consider using a different filter.
-
-
-## See Also
-[Delivering Content to Customers Overview](media-services-deliver-content-overview.md)
-
-[renditions1]: ./media/media-services-dynamic-manifest-overview/media-services-rendition-filter.png
-[renditions2]: ./media/media-services-dynamic-manifest-overview/media-services-rendition-filter2.png
-
-
-[multiple-rules]:./media/media-services-dynamic-manifest-overview/media-services-multiple-rules-filters.png
-
-[subclip_filter]: ./media/media-services-dynamic-manifest-overview/media-services-subclips-filter.png
-[trim_filter]: ./media/media-services-dynamic-manifest-overview/media-services-trim-filter.png
-[livebackoff_filter]: ./media/media-services-dynamic-manifest-overview/media-services-livebackoff-filter.png
-[language_filter]: ./media/media-services-dynamic-manifest-overview/media-services-language-filter.png
-[dvr_filter]: ./media/media-services-dynamic-manifest-overview/media-services-dvr-filter.png
-[skiing]: ./media/media-services-dynamic-manifest-overview/media-services-skiing.png
media-services Media Services Dynamic Packaging Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-dynamic-packaging-overview.md
- Title: Azure Media Services dynamic packaging overview | Microsoft Docs
-description: This article gives an overview of Microsoft Azure Media Services dynamic packaging.
-Previously updated: 03/10/2021
-# Dynamic packaging
-
-> [!div class="op_single_selector" title1="Select the version of Media Services that you are using:"]
-> * [Version 3](../latest/encode-dynamic-packaging-concept.md)
-> * [Version 2](media-services-dynamic-packaging-overview.md)
-
-Microsoft Azure Media Services can be used to deliver many media source file formats, media streaming formats, and content protection formats to a variety of client technologies (for example, iOS, Xbox, Silverlight, Windows 8). These clients understand different protocols; for example, iOS requires an HTTP Live Streaming (HLS) V4 format, while Silverlight and Xbox require Smooth Streaming. If you have a set of adaptive bitrate (multi-bitrate) MP4 (ISO Base Media 14496-12) files or a set of adaptive bitrate Smooth Streaming files that you want to serve to clients that understand MPEG DASH, HLS, or Smooth Streaming, you should take advantage of Media Services dynamic packaging.
-
-With dynamic packaging, all you need is to create an asset that contains a set of adaptive bitrate MP4 files or adaptive bitrate Smooth Streaming files. Then, based on the format specified in the manifest or fragment request, the On-Demand Streaming server ensures that you receive the stream in the protocol you have chosen. As a result, you only need to store and pay for the files in a single storage format, and Media Services builds and serves the appropriate response based on requests from a client.
-
-The following diagram shows the traditional encoding and static packaging workflow.
-
-![Static Encoding](./media/media-services-dynamic-packaging-overview/media-services-static-packaging.png)
-
-The following diagram shows the dynamic packaging workflow.
-
-![Dynamic Encoding](./media/media-services-dynamic-packaging-overview/media-services-dynamic-packaging.png)
-
-## Common scenario
-
-1. Upload an input file (called a mezzanine file). For example, H.264, MP4, or WMV (for the list of supported formats, see [Formats Supported by the Media Encoder Standard](media-services-media-encoder-standard-formats.md)).
-2. Encode your mezzanine file to H.264 MP4 adaptive bitrate sets.
-3. Publish the asset that contains the adaptive bitrate MP4 set by creating the On-Demand Locator.
-4. Build the streaming URLs to access and stream your content (examples follow this list).
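-
-For illustration (the host name and locator GUID below are placeholders), the same published asset can be requested in each protocol by varying only the format parameter:
-
-Smooth Streaming: `http://testendpoint-testaccount.streaming.mediaservices.windows.net/{locatorGUID}/BigBuckBunny.ism/Manifest`
-
-MPEG DASH: `http://testendpoint-testaccount.streaming.mediaservices.windows.net/{locatorGUID}/BigBuckBunny.ism/Manifest(format=mpd-time-csf)`
-
-HLS (V4): `http://testendpoint-testaccount.streaming.mediaservices.windows.net/{locatorGUID}/BigBuckBunny.ism/Manifest(format=m3u8-aapl)`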
-
-## Preparing assets for dynamic streaming
-
-To prepare your asset for dynamic streaming, you have the following options:
-- [Upload a master file](media-services-dotnet-upload-files.md).
-- [Use the Media Encoder Standard encoder to produce H.264 MP4 adaptive bitrate sets](media-services-dotnet-encode-with-media-encoder-standard.md).
-- [Stream your content](media-services-deliver-content-overview.md).
-
-## Audio codecs supported by dynamic packaging
-
-Dynamic Packaging supports MP4 files, which contain audio encoded with [AAC](https://en.wikipedia.org/wiki/Advanced_Audio_Coding) (AAC-LC, HE-AAC v1, HE-AAC v2), [Dolby Digital Plus](https://en.wikipedia.org/wiki/Dolby_Digital_Plus) (Enhanced AC-3 or E-AC3), Dolby Atmos, or [DTS](https://en.wikipedia.org/wiki/DTS_%28sound_system%29) (DTS Express, DTS LBR, DTS HD, DTS HD Lossless). Streaming of Dolby Atmos content is supported for streaming protocols such as MPEG-DASH with either Common Streaming Format (CSF) or Common Media Application Format (CMAF) fragmented MP4, and HTTP Live Streaming (HLS) with CMAF.
-
-> [!NOTE]
-> Dynamic Packaging does not support files that contain [Dolby Digital](https://en.wikipedia.org/wiki/Dolby_Digital) (AC3) audio (it is a legacy codec).
-
-
media-services Media Services Embed Mpeg Dash In Html5 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-embed-mpeg-dash-in-html5.md
- Title: Embedding a MPEG-DASH Adaptive Streaming Video in an HTML5 Application with DASH.js | Microsoft Docs
-description: This topic demonstrates how to embed an MPEG-DASH Adaptive Streaming Video in an HTML5 Application with DASH.js.
-Previously updated: 03/10/2021
-# Embedding an MPEG-DASH Adaptive Streaming Video in an HTML5 Application with DASH.js
--
-## Overview
-MPEG-DASH is an ISO standard for the adaptive streaming of video content, which offers significant benefits for developers wanting to deliver high-quality, adaptive video streaming output. With MPEG-DASH, the video stream adjusts automatically to a lower definition when the network becomes congested. This reduces the likelihood of the viewer seeing a "paused" video while the player downloads the next few seconds to play (that is, it reduces the likelihood of buffering). As network congestion reduces, the video player will in turn return to a higher-quality stream. This ability to adapt the bandwidth required also results in a faster start time for video. That means that the first few seconds can be played in a fast-to-download lower quality segment and then step up to a higher quality once sufficient content has been buffered.
-
-Dash.js is an open-source MPEG-DASH video player written in JavaScript. Its goal is to provide a robust, cross-platform player that can be freely reused in applications that require video playback. It provides MPEG-DASH playback in any browser that supports the W3C Media Source Extensions (MSE); today that is Chrome, Microsoft Edge, and IE11 (other browsers have indicated their intent to support MSE). For more information about dash.js, see the GitHub dash.js repository.
-
-## Creating a browser-based streaming video player
-To create a simple web page that displays a video player with the expected controls such as play, pause, and rewind, you need to:
-
-1. Create an HTML page
-2. Add the video tag
-3. Add the dash.js player
-4. Initialize the player
-5. Add some CSS style
-6. View the results in a browser that implements MSE
-
-Initializing the player can be completed in just a handful of lines of JavaScript code. Using dash.js, it really is that simple to embed MPEG-DASH video in your browser-based applications.
-
-## Creating the HTML Page
-The first step is to create a standard HTML page containing the **video** element. Save this file as basicPlayer.html, as the following example illustrates:
-
-```html
- <!DOCTYPE html>
- <html>
- <head><title>Adaptive Streaming in HTML5</title></head>
- <body>
- <h1>Adaptive Streaming with HTML5</h1>
- <video id="videoplayer" controls></video>
- </body>
- </html>
-```
-
-## Adding the DASH.js Player
-To add the dash.js reference implementation to the application, you need to grab the dash.all.js file from the latest version of the dash.js project. This should be saved in the JavaScript folder of your application. This file is a convenience file that pulls together all the necessary dash.js code into a single file. If you have a look around the dash.js repository, you'll find the individual files, test code, and much more, but if all you want to do is use dash.js, then the dash.all.js file is what you need.
-
-To add the dash.js player to your applications, add a script tag to the head section of basicPlayer.html:
-
-```html
- <!-- DASH-AVC/265 reference implementation -->
-    <script src="js/dash.all.js"></script>
-```
-
-Next, create a function to initialize the player when the page loads. Add the following script after the line in which you load dash.all.js:
-
-```html
- <script>
- // setup the video element and attach it to the Dash player
- function setupVideo() {
- var url = "http://wams.edgesuite.net/media/MPTExpressionData02/BigBuckBunny_1080p24_IYUV_2ch.ism/manifest(format=mpd-time-csf)";
- var context = new Dash.di.DashContext();
- var player = new MediaPlayer(context);
- player.startup();
- player.attachView(document.querySelector("#videoplayer"));
- player.attachSource(url);
- }
- </script>
-```
-
-This function first creates a DashContext. This is used to configure the application for a specific runtime environment. From a technical point of view, it defines the classes that the dependency injection framework should use when constructing the application. In most cases, you use Dash.di.DashContext.
-
-Next, instantiate the primary class of the dash.js framework, MediaPlayer. This class contains the core methods needed such as play and pause, manages the relationship with the video element and also manages the interpretation of the Media Presentation Description (MPD) file, which describes the video to be played.
-
-The startup() function of the MediaPlayer class is called to ensure that the player is ready to play video. Among other things, the function ensures that all the necessary classes (as defined by the context) have been loaded. Once the player is ready, you can attach the video element to it using the attachView() function, which enables the MediaPlayer to inject the video stream into the element and control playback as necessary.
-
-Pass the URL of the MPD file to the MediaPlayer so that it knows about the video it is expected to play. The setupVideo() function just created will need to be executed once the page has fully loaded. Do this by using the onload event of the body element. Change your `<body>` element to:
-
-```html
- <body onload="setupVideo()">
-```
-
-Finally, set the size of the video element using CSS. In an adaptive streaming environment, this is especially important because the size of the video being played may change as playback adapts to changing network conditions. In this simple demo, force the video element to be 80% of the available browser window by adding the following CSS to the head section of the page:
-
-```html
- <style>
- video {
- width: 80%;
- height: 80%;
- }
- </style>
-```
-
-## Playing a Video
-To play a video, point your browser at the basicPlayer.html file and click play on the video player displayed.
-
-
-## See Also
-
-[GitHub dash.js repository](https://github.com/Dash-Industry-Forum/dash.js)
-
media-services Media Services Encode Asset https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-encode-asset.md
- Title: Overview of Azure on-demand media encoders | Microsoft Docs
-description: Azure Media Services provides multiple options for the encoding of media in the cloud. This article gives an overview of Azure on-demand media encoders.
-Previously updated: 03/10/2021
-# Overview of Azure on-demand media encoders
---
-Azure Media Services provides multiple options for the encoding of media in the cloud.
-
-When starting out with Media Services, it is important to understand the difference between codecs and file formats.
-Codecs are the software that implements the compression/decompression algorithms whereas file formats are containers that hold the compressed video.
-
-Media Services provides dynamic packaging, which allows you to deliver your adaptive bitrate MP4 or Smooth Streaming encoded content in the streaming formats supported by Media Services (MPEG-DASH, HLS, Smooth Streaming) without having to repackage into these streaming formats.
-
-When your Media Services account is created, a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state. Billing for streaming endpoints occurs whenever the endpoint is in the **Running** state.
-
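-For example, the following is a minimal sketch (v2 .NET SDK, assuming `_context` is an authenticated `CloudMediaContext`) that starts the default streaming endpoint if it is stopped:
-
-```csharp
-// Find the default streaming endpoint and start it if it's not running.
-IStreamingEndpoint endpoint = _context.StreamingEndpoints
-    .ToList()
-    .FirstOrDefault(e => e.Name == "default");
-
-if (endpoint != null && endpoint.State == StreamingEndpointState.Stopped)
-{
-    endpoint.Start(); // billing applies while the endpoint is Running
-}
-```
-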
-Media Services supports the following on-demand encoder:
-
-* [Media Encoder Standard](media-services-encode-asset.md#media-encoder-standard)
-
-This article gives a brief overview of on-demand media encoders and links to articles with more detailed information.
-
-By default, each Media Services account can have one active encoding task at a time. You can reserve encoding units that allow you to have multiple encoding tasks running concurrently, one for each encoding reserved unit you purchase. For information, see [Scaling encoding units](media-services-scale-media-processing-overview.md).
-
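-For example, a minimal sketch (v2 .NET SDK, assuming `_context` is an authenticated `CloudMediaContext`) that reserves two encoding units so that two tasks can run concurrently:
-
-```csharp
-// Scale the account's encoding reserved units.
-IEncodingReservedUnit reservedUnit = _context.EncodingReservedUnits.FirstOrDefault();
-reservedUnit.CurrentReservedUnits = 2; // allow two concurrent encoding tasks
-reservedUnit.Update();
-```
-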
-## Media Encoder Standard
-
-### How to use
-[How to encode with Media Encoder Standard](media-services-dotnet-encode-with-media-encoder-standard.md)
-
-### Formats
-[Formats and codecs](media-services-media-encoder-standard-formats.md)
-
-### Presets
-Media Encoder Standard is configured using one of the encoder presets described [here](./media-services-mes-presets-overview.md).
-
-### Input and output metadata
-The encoder's input metadata is described [here](media-services-input-metadata-schema.md).
-
-The encoder's output metadata is described [here](media-services-output-metadata-schema.md).
-
-### Generate thumbnails
-For information, see [How to generate thumbnails using Media Encoder Standard](media-services-advanced-encoding-with-mes.md).
-
-### Trim videos (clipping)
-For information, see [How to trim videos using Media Encoder Standard](media-services-advanced-encoding-with-mes.md#trim_video).
-
-### Create overlays
-For information, see [How to create overlays using Media Encoder Standard](media-services-advanced-encoding-with-mes.md#overlay).
-
-### See also
-[The Media Services blog](https://azure.microsoft.com/blog/2015/07/16/announcing-the-general-availability-of-media-encoder-standard/)
-
-### Known issues
-If your input video does not contain closed captioning, the output Asset will still contain an empty TTML file.
-
-
-## Related articles
-* [Perform advanced encoding tasks by customizing Media Encoder Standard presets](media-services-custom-mes-presets-with-dotnet.md)
-* [Quotas and Limitations](media-services-quotas-and-limitations.md)
-
-<!--Reference links in article-->
-[1]: https://azure.microsoft.com/pricing/details/media-services/
media-services Media Services Encode With Premium Workflow https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-encode-with-premium-workflow.md
- Title: Advanced encoding with Media Encoder Premium Workflow | Microsoft Docs
-description: Learn how to encode with Media Encoder Premium Workflow. Code samples are written in C# and use the Media Services SDK for .NET.
-Previously updated: 03/10/2021
-# Advanced encoding with Media Encoder Premium Workflow
--
-> [!NOTE]
-> The Media Encoder Premium Workflow media processor discussed in this article is not available in China.
->
->
-
-## Overview
-Microsoft Azure Media Services is introducing the **Media Encoder Premium Workflow** media processor. This processor offers advanced encoding capabilities for your premium on-demand workflows.
-
-The following topics outline details related to **Media Encoder Premium Workflow**:
-
-* [Formats Supported by the Media Encoder Premium Workflow](./media-services-encode-asset.md) – Discusses the file formats and codecs supported by **Media Encoder Premium Workflow**.
-* [Overview and comparison of Azure on-demand media encoders](media-services-encode-asset.md) compares the encoding capabilities of **Media Encoder Premium Workflow** and **Media Encoder Standard**.
-
-This article demonstrates how to encode with **Media Encoder Premium Workflow** using .NET.
-
-Encoding tasks for the **Media Encoder Premium Workflow** require a separate configuration file, called a Workflow file. These files have a .workflow extension and are created using the [Workflow Designer](media-services-workflow-designer.md) tool.
-
-You can also get the default workflow files [here](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/media-services/previous/media-services-encode-with-premium-workflow.md). The folder also contains the description of these files.
-
-The workflow files need to be uploaded to your Media Services account as an Asset, and this Asset should be passed in to the encoding Task.
-
-## Create and configure a Visual Studio project
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-## Encoding example
-
-The following example demonstrates how to encode with **Media Encoder Premium Workflow**.
-
-The following steps are performed:
-
-1. Create an asset and upload a workflow file.
-2. Create an asset and upload a source media file.
-3. Get the "Media Encoder Premium Workflow" media processor.
-4. Create a job and a task.
-
-    In most cases, the configuration string for the task is empty (as in the following example). There are some advanced scenarios (that require you to set runtime properties dynamically) in which case you would provide an XML string to the encoding task. Examples of such scenarios are: creating an overlay, parallel or sequential media stitching, and subtitling.
-5. Add two input assets to the task.
-
-    1. 1st – the workflow asset.
-    2. 2nd – the video asset.
-
-    >[!NOTE]
-    >The workflow asset must be added to the task before the media asset.
-    >The configuration string for this task should be empty.
-
-6. Submit the encoding job.
-
-```csharp
-using System;
-using System.Linq;
-using System.Configuration;
-using System.IO;
-using System.Threading;
-using System.Threading.Tasks;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-namespace MediaEncoderPremiumWorkflowSample
-{
- class Program
- {
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- private static readonly string _supportFiles =
- Path.GetFullPath(@"../..\Media");
-
- private static readonly string _workflowFilePath =
- Path.GetFullPath(_supportFiles + @"\H264 Progressive Download MP4.workflow");
-
- private static readonly string _singleMP4InputFilePath =
- Path.GetFullPath(_supportFiles + @"\BigBuckBunny.mp4");
-
- static void Main(string[] args)
- {
-
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- var workflowAsset = CreateAssetAndUploadSingleFile(_workflowFilePath);
- var videoAsset = CreateAssetAndUploadSingleFile(_singleMP4InputFilePath);
- IAsset outputAsset = CreateEncodingJob(workflowAsset, videoAsset);
-
- }
-
- static public IAsset CreateAssetAndUploadSingleFile(string singleFilePath)
- {
- var assetName = "UploadSingleFile_" + DateTime.UtcNow.ToString();
- var asset = _context.Assets.Create(assetName, AssetCreationOptions.None);
-
- var fileName = Path.GetFileName(singleFilePath);
-
- var assetFile = asset.AssetFiles.Create(fileName);
-
- Console.WriteLine("Created assetFile {0}", assetFile.Name);
-
- Console.WriteLine("Upload {0}", assetFile.Name);
-
- assetFile.Upload(singleFilePath);
- Console.WriteLine("Done uploading {0}", assetFile.Name);
-
- return asset;
- }
-
- static public IAsset CreateEncodingJob(IAsset workflow, IAsset video)
- {
- // Declare a new job.
- IJob job = _context.Jobs.Create("Premium Workflow encoding job");
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Premium Workflow");
-
- // Create a task with the encoding details, using a string preset.
- ITask task = job.Tasks.AddNew("Premium Workflow encoding task",
- processor,
- "",
- TaskOptions.None);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(workflow);
-            task.InputAssets.Add(video); // add the video asset second; the workflow asset must come first
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- task.OutputAssets.AddNew("Output asset",
- AssetCreationOptions.None);
-
- // Use the following event handler to check job progress.
- job.StateChanged += new
- EventHandler<JobStateChangedEventArgs>(StateChanged);
-
- // Launch the job.
- job.Submit();
-
- // Check job execution and wait for job to finish.
- Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
- progressJobTask.Wait();
-
- // If job state is Error the event handling
- // method for job progress should log errors. Here we check
- // for error state and exit if needed.
- if (job.State == JobState.Error)
- {
- throw new Exception("\nExiting method due to job error.");
- }
-
- return job.OutputMediaAssets[0];
- }
-
- static private void StateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
-
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("Job is finished.");
- Console.WriteLine();
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
- // Cast sender as a job.
- IJob job = (IJob)sender;
- // Display or log error details as needed.
- // LogJobStop(job.Id);
- break;
- default:
- break;
- }
- }
- private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-                throw new ArgumentException(string.Format("Unknown media processor {0}", mediaProcessorName));
-
- return processor;
- }
- }
-}
-```
-
-## Need help?
-
-You can open a support ticket by navigating to [New support request](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest)
-
media-services Media Services Encoding Error Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-encoding-error-codes.md
- Title: Azure Media Services encoding error codes | Microsoft Docs
-description: This topic lists error codes that could be returned in case an error was encountered during the encoding task execution.
-Previously updated: 03/10/2021
-# Encoding error codes
--
-The following table lists error codes that could be returned in case an error was encountered during the encoding task execution. To get error details in your .NET code, use the [ErrorDetails](/previous-versions/azure/jj126075(v=azure.100)) class. To get error details in your REST code, use the [ErrorDetail](/rest/api/media/operations/errordetail) REST API.
-
-| ErrorDetail.Code | Possible causes for error |
-| | |
-| Unknown |Unknown error while executing the task |
-| ErrorDownloadingInputAssetMalformedContent |Category of errors that covers errors in downloading input asset such as bad file names, zero length files, incorrect formats and so on. |
-| ErrorDownloadingInputAssetServiceFailure |Category of errors that covers problems on the service side - for example network or storage errors while downloading. |
-| ErrorParsingConfiguration |Category of errors where the task configuration (MediaTask.PrivateData) is not valid, for example, the configuration is not a valid system preset or it contains invalid XML. |
-| ErrorExecutingTaskMalformedContent |Category of errors during the execution of the task where issues inside the input media files cause failure. |
-| ErrorExecutingTaskUnsupportedFormat |Category of errors where the media processor cannot process the files provided - media format not supported, or does not match the Configuration. For example, trying to produce an audio-only output from an asset that has only video |
-| ErrorProcessingTask |Category of other errors that the media processor encounters during the processing of the task that are unrelated to content. |
-| ErrorUploadingOutputAsset |Category of errors when uploading the output asset |
-| ErrorCancelingTask |Category of errors to cover failures when attempting to cancel the Task |
-| TransientError |Category of errors to cover transient issues (for example, temporary networking issues with Azure Storage) |
-
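-For example, a minimal sketch (v2 .NET SDK, assuming `job` is an `IJob` whose state is **Error**) that logs the error code and message for each failed task:
-
-```csharp
-// Inspect the error details reported for each task in the job.
-foreach (ITask task in job.Tasks)
-{
-    foreach (ErrorDetail detail in task.ErrorDetails)
-    {
-        Console.WriteLine("Task '{0}' failed: {1} - {2}",
-            task.Name, detail.Code, detail.Message);
-    }
-}
-```
-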
-To get help from the **Media Services** team, open a [support ticket](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade).
-
-
-## Related articles
-* [Perform advanced encoding tasks by customizing Media Encoder Standard presets](media-services-custom-mes-presets-with-dotnet.md)
-* [Quotas and Limitations](media-services-quotas-and-limitations.md)
-
-<!--Reference links in article-->
-[1]: https://azure.microsoft.com/pricing/details/media-services/
media-services Media Services Error Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-error-codes.md
- Title: Azure Media Services error codes | Microsoft Docs
-description: You may receive HTTP error codes from the service for reasons ranging from authentication tokens expiring to actions that are not supported in Media Services. This article gives an overview of Azure Media Services v2 API error codes.
-Previously updated: 03/10/2021
-# Azure Media Services error codes
--
-When using Microsoft Azure Media Services, you may receive HTTP error codes from the service for reasons ranging from authentication tokens expiring to actions that are not supported in Media Services. The following is a list of **HTTP error codes** that may be returned by Media Services and the possible causes for them.
-
-## 400 Bad Request
-The request contains invalid information and is rejected due to one of the following reasons:
-
-* An unsupported API version is specified. For the most current version, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
-* The API version of Media Services is not specified. For information on how to specify the API version, see [Media Services Operations REST API Reference](/rest/api/media/operations/azure-media-services-rest-api-reference).
-
- > [!NOTE]
- > If you are using the .NET or Java SDKs to connect to Media Services, the API version is specified for you whenever you try and perform some action against Media Services.
- >
- >
-* An undefined property has been specified. The property name is in the error message. Only those properties that are members of a given entity can be specified. See [Azure Media Services REST API Reference](/rest/api/media/operations/azure-media-services-rest-api-reference) for a list of entities and their properties.
-* An invalid property value has been specified. The property name is in the error message. See the previous link for valid property types and their values.
-* A property value is missing and is required.
-* Part of the URL specified contains a bad value.
-* An attempt was made to update a WriteOnce property.
-* An attempt was made to create a Job that has an input Asset with a primary AssetFile that was not specified or could not be determined.
-* An attempt was made to update a SAS Locator. SAS locators can only be created or deleted. Streaming locators can be updated. For more information, see [Locators](/rest/api/media/operations/locator).
-* An unsupported operation or query was submitted.
-
-## 401 Unauthorized
-The request could not be authenticated (before it can be authorized) due to one of the following reasons:
-
-* Missing authentication header.
-* Bad authentication header value.
- * The token has expired.
- * The token contains an invalid signature.
-
-## 403 Forbidden
-The request is not allowed due to one of the following reasons:
-
-* The Media Services account cannot be found or has been deleted.
-* The Media Services account is disabled and the request type is not HTTP GET. Service operations will return a 403 response as well.
-* The authentication token does not contain the user's credential information: AccountName and/or SubscriptionId. You can find this information in the Media Services UI extension for your Media Services account in the Azure Management Portal.
-* The resource cannot be accessed.
-
- * An attempt was made to use a MediaProcessor that is not available for your Media Services account.
- * An attempt was made to update a JobTemplate defined by Media Services.
- * An attempt was made to overwrite some other Media Services account's Locator.
- * An attempt was made to overwrite some other Media Services account's ContentKey.
-* The resource could not be created due to a service quota that was reached for the Media Services account. For more information on the service quotas, see [Quotas and Limitations](media-services-quotas-and-limitations.md).
-
-## 404 Not Found
-The request is not allowed on a resource due to one of the following reasons:
-
-* An attempt was made to update an entity that does not exist.
-* An attempt was made to delete an entity that does not exist.
-* An attempt was made to create an entity that links to an entity that does not exist.
-* An attempt was made to GET an entity that does not exist.
-* An attempt was made to specify a storage account that is not associated with the Media Services account.
-
-## 409 Conflict
-The request is not allowed due to one of the following reasons:
-
-* More than one AssetFile has the specified name within the Asset.
-* An attempt was made to create a second primary AssetFile within the Asset.
-* An attempt was made to create a ContentKey with the specified Id already used.
-* An attempt was made to create a Locator with the specified Id already used.
-* More than one IngestManifestFile has the specified name within the IngestManifest.
-* An attempt was made to link a second storage encryption ContentKey to the storage-encrypted Asset.
-* An attempt was made to link the same ContentKey to the Asset.
-* An attempt was made to create a locator to an Asset whose storage container is missing or is no longer associated with the Asset.
-* An attempt was made to create a locator to an Asset which already has 5 locators in use. (Azure Storage enforces the limit of five shared access policies on one storage container.)
-* The storage account of an Asset being linked to an IngestManifestAsset is not the same as the storage account used by the parent IngestManifest.
-
-## 500 Internal Server Error
-During the processing of the request, Media Services encounters some error that prevents the processing from continuing. This could be due to one of the following reasons:
-
-* Creating an Asset or Job fails because the Media Services account's service quota information is temporarily unavailable.
-* Creating an Asset or IngestManifest blob storage container fails because the account's storage account information is temporarily unavailable.
-* Other unexpected error.
-
-## 503 Service Unavailable
-The server is currently unable to receive requests. This error may be caused by excessive requests to the service. The Media Services throttling mechanism restricts resource usage for applications that make excessive requests to the service.
-
-> [!NOTE]
-> Check the error message and error code string to get more detailed information about the reason you got the 503 error. This error does not always mean throttling.
->
->
-
-Possible status descriptions are:
-
-* "Server is busy. Previous runs of this type of request took more than {0} seconds."
-* "Server is busy. More than {0} requests per second can be throttled."
-* "Server is busy. More than {0} requests within {1} seconds can be throttled."
-
-To handle this error, we recommend using exponential back-off retry logic. That means using progressively longer waits between retries for consecutive error responses. For more information, see [Transient Fault Handling Application Block](/previous-versions/msp-n-p/hh680905(v=pandp.50)).
-
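-The following is a minimal sketch of such retry logic, assuming a hypothetical `SendRequestAsync` helper that issues the REST call (requires the `System.Net.Http` and `System.Threading.Tasks` namespaces):
-
-```csharp
-static async Task<HttpResponseMessage> SendWithRetryAsync()
-{
-    int maxRetries = 5;
-    TimeSpan delay = TimeSpan.FromSeconds(2);
-
-    for (int attempt = 0; ; attempt++)
-    {
-        HttpResponseMessage response = await SendRequestAsync(); // hypothetical helper
-        if ((int)response.StatusCode != 503 || attempt == maxRetries)
-        {
-            return response;
-        }
-        response.Dispose();
-        await Task.Delay(delay);
-        delay = TimeSpan.FromTicks(delay.Ticks * 2); // progressively longer waits
-    }
-}
-```
-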
-> [!NOTE]
-> If you are using [Azure Media Services SDK for .Net](https://github.com/Azure/azure-sdk-for-media-services/tree/master), the retry logic for the 503 error has been implemented by the SDK.
->
->
-
-## See Also
-[Media Services Management Error Codes](/rest/api/media/)
-
media-services Media Services Face Redaction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-face-redaction.md
- Title: Redact faces with Azure Media Analytics | Microsoft Docs
-description: Azure Media Redactor is an Azure Media Analytics media processor that offers scalable face redaction in the cloud. This article demonstrates how to redact faces with Azure media analytics.
-Previously updated: 03/10/2021
-# Redact faces with Azure Media Analytics
--
-## Overview
-
-**Azure Media Redactor** is an [Azure Media Analytics](./legacy-components.md) media processor (MP) that offers scalable face redaction in the cloud. Face redaction enables you to modify your video in order to blur faces of selected individuals. You may want to use the face redaction service in public safety and news media scenarios. A few minutes of footage that contains multiple faces can take hours to redact manually, but with this service the face redaction process will require just a few simple steps.
-
-This article gives details about **Azure Media Redactor** and shows how to use it with Media Services SDK for .NET.
-
-## Face redaction modes
-
-Facial redaction works by detecting faces in every frame of video and tracking the face object both forwards and backwards in time, so that the same individual can be blurred from other angles as well. The automated redaction process is complex and does not always produce 100% of the desired output; for this reason, Media Analytics provides you with a couple of ways to modify the final output.
-
-In addition to a fully automatic mode, there is a two-pass workflow, which allows the selection/de-selection of found faces via a list of IDs. Also, to make arbitrary per frame adjustments the MP uses a metadata file in JSON format. This workflow is split into **Analyze** and **Redact** modes. You can combine the two modes in a single pass that runs both tasks in one job; this mode is called **Combined**.
-
- > [!NOTE]
 > The Face Detector media processor has been deprecated as of June 2020; see [Azure Media Services legacy components](./legacy-components.md). Consider using the Azure Media Services v3 API.
-
-### Combined mode
-
-This produces a redacted mp4 automatically without any manual input.
-
-| Stage | File Name | Notes |
-| | | |
-| Input asset |foo.bar |Video in WMV, MOV, or MP4 format |
-| Input config |Job configuration preset |{'version':'1.0', 'options': {'mode':'combined'}} |
-| Output asset |foo_redacted.mp4 |Video with blurring applied |
-
-### Analyze mode
-
-The **analyze** pass of the two-pass workflow takes a video input and produces a JSON file of face locations, and jpg images of each detected face.
-
-| Stage | File Name | Notes |
-| | | |
-| Input asset |foo.bar |Video in WMV, MOV, or MP4 format |
-| Input config |Job configuration preset |{'version':'1.0', 'options': {'mode':'analyze'}} |
-| Output asset |foo_annotations.json |Annotation data of face locations in JSON format. This can be edited by the user to modify the blurring bounding boxes. See sample below. |
-| Output asset |foo_thumb%06d.jpg [foo_thumb000001.jpg, foo_thumb000002.jpg] |A cropped jpg of each detected face, where the number indicates the labelId of the face |
-
-#### Output example
-
-```json
-{
- "version": 1,
- "timescale": 24000,
- "offset": 0,
- "framerate": 23.976,
- "width": 1280,
- "height": 720,
- "fragments": [
- {
- "start": 0,
- "duration": 48048,
- "interval": 1001,
- "events": [
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [],
- [
- {
- "index": 13,
- "id": 1138,
- "x": 0.29537,
- "y": -0.18987,
- "width": 0.36239,
- "height": 0.80335
- },
- {
- "index": 13,
- "id": 2028,
- "x": 0.60427,
- "y": 0.16098,
- "width": 0.26958,
- "height": 0.57943
- }
- ],
-
- ... truncated
-```
-
-### Redact mode
-
-The second pass of the workflow takes a larger number of inputs that must be combined into a single asset.
-
-This includes a list of IDs to blur, the original video, and the annotations JSON. This mode uses the annotations to apply blurring on the input video.
-
-The output from the Analyze pass does not include the original video. The video needs to be uploaded into the input asset for the Redact mode task and selected as the primary file.
-
-| Stage | File Name | Notes |
-| | | |
-| Input asset |foo.bar |Video in WMV, MOV, or MP4 format. Same video as in step 1. |
-| Input asset |foo_annotations.json |Annotations metadata file from phase one, with optional modifications. |
-| Input asset |foo_IDList.txt (Optional) |Optional newline-separated list of face IDs to redact. If left blank, all faces are blurred. |
-| Input config |Job configuration preset |{'version':'1.0', 'options': {'mode':'redact'}} |
-| Output asset |foo_redacted.mp4 |Video with blurring applied based on annotations |
-
-#### Example output
-
-This is an example of an IDList file, which selects the face IDs to redact, one ID per line.
-
-Example foo_IDList.txt:
-
-```
-1
-2
-3
-```
-
-## Blur types
-
-In the **Combined** or **Redact** mode, there are five different blur modes you can choose from via the JSON input configuration: **Low**, **Med**, **High**, **Box**, and **Black**. By default, **Med** is used.
-
-You can find samples of the blur types below.
-
-### Example JSON
-
-```json
-{
- 'version':'1.0',
- 'options': {
- 'Mode': 'Combined',
- 'BlurType': 'High'
- }
-}
-```
-
-#### Low
-
-![Low](./media/media-services-face-redaction/blur1.png)
-
-#### Med
-
-![Med](./media/media-services-face-redaction/blur2.png)
-
-#### High
-
-![High](./media/media-services-face-redaction/blur3.png)
-
-#### Box
-
-![Box](./media/media-services-face-redaction/blur4.png)
-
-#### Black
-
-![Black](./media/media-services-face-redaction/blur5.png)
-
-## Elements of the output JSON file
-
-The Redaction MP provides high precision face location detection and tracking that can detect up to 64 human faces in a video frame. Frontal faces provide the best results, while side faces and small faces (less than or equal to 24x24 pixels) are challenging.
--
-## .NET sample code
-
-The following program shows how to:
-
-1. Create an asset and upload a media file into the asset.
-2. Create a job with a face redaction task based on a configuration file that contains the following json preset:
-
- ```json
- {
- 'version':'1.0',
- 'options': {
- 'mode':'combined'
- }
- }
- ```
-
-3. Download the output JSON files.
-
-### Create and configure a Visual Studio project
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-#### Example
-
-```csharp
-using System;
-using System.Configuration;
-using System.IO;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System.Threading;
-using System.Threading.Tasks;
-
-namespace FaceRedaction
-{
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Run the FaceRedaction job.
- var asset = RunFaceRedactionJob(@"C:\supportFiles\FaceRedaction\SomeFootage.mp4",
- @"C:\supportFiles\FaceRedaction\config.json");
-
- // Download the job output asset.
- DownloadAsset(asset, @"C:\supportFiles\FaceRedaction\Output");
- }
-
- static IAsset RunFaceRedactionJob(string inputMediaFilePath, string configurationFile)
- {
- // Create an asset and upload the input media file to storage.
- IAsset asset = CreateAssetAndUploadSingleFile(inputMediaFilePath,
- "My Face Redaction Input Asset",
- AssetCreationOptions.None);
-
- // Declare a new job.
- IJob job = _context.Jobs.Create("My Face Redaction Job");
-
- // Get a reference to Azure Media Redactor.
- string MediaProcessorName = "Azure Media Redactor";
-
- var processor = GetLatestMediaProcessorByName(MediaProcessorName);
-
- // Read configuration from the specified file.
- string configuration = File.ReadAllText(configurationFile);
-
- // Create a task with the encoding details, using a string preset.
- ITask task = job.Tasks.AddNew("My Face Redaction Task",
- processor,
- configuration,
- TaskOptions.None);
-
- // Specify the input asset.
- task.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- task.OutputAssets.AddNew("My Face Redaction Output Asset", AssetCreationOptions.None);
-
- // Use the following event handler to check job progress.
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(StateChanged);
-
- // Launch the job.
- job.Submit();
-
- // Check job execution and wait for job to finish.
- Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
-
- progressJobTask.Wait();
-
- // If job state is Error, the event handling
- // method for job progress should log errors. Here we check
- // for error state and exit if needed.
- if (job.State == JobState.Error)
- {
- ErrorDetail error = job.Tasks.First().ErrorDetails.First();
- Console.WriteLine(string.Format("Error: {0}. {1}",
- error.Code,
- error.Message));
- return null;
- }
-
- return job.OutputMediaAssets[0];
- }
-
- static IAsset CreateAssetAndUploadSingleFile(string filePath, string assetName, AssetCreationOptions options)
- {
- IAsset asset = _context.Assets.Create(assetName, options);
-
- var assetFile = asset.AssetFiles.Create(Path.GetFileName(filePath));
- assetFile.Upload(filePath);
-
- return asset;
- }
-
- static void DownloadAsset(IAsset asset, string outputDirectory)
- {
- foreach (IAssetFile file in asset.AssetFiles)
- {
- file.Download(Path.Combine(outputDirectory, file.Name));
- }
- }
-
- static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors
- .Where(p => p.Name == mediaProcessorName)
- .ToList()
- .OrderBy(p => new Version(p.Version))
- .LastOrDefault();
-
- if (processor == null)
-                throw new ArgumentException(string.Format("Unknown media processor {0}",
-                    mediaProcessorName));
-
- return processor;
- }
-
- static private void StateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
-
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("Job is finished.");
- Console.WriteLine();
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
- // Cast sender as a job.
- IJob job = (IJob)sender;
- // Display or log error details as needed.
- // LogJobStop(job.Id);
- break;
- default:
- break;
- }
- }
- }
-}
-```
-
-## Related links
-
-[Azure Media Services Analytics Overview](./legacy-components.md)
-
-[Azure Media Analytics demos](http://amslabs.azurewebsites.net/demos/Analytics.html)
media-services Media Services Fmp4 Live Ingest Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-fmp4-live-ingest-overview.md
- Title: Azure Media Services fragmented MP4 live ingest specification | Microsoft Docs
-description: This specification describes the protocol and format for fragmented MP4-based live streaming ingestion for Azure Media Services. This document also discusses best practices for building highly redundant and robust live ingest mechanisms.
-Previously updated: 03/18/2019
-# Azure Media Services fragmented MP4 live ingest specification
-
-This specification describes the protocol and format for fragmented MP4-based live streaming ingestion for Azure Media Services. Media Services provides a live streaming service that customers can use to stream live events and broadcast content in real time by using Azure as the cloud platform. This document also discusses best practices for building highly redundant and robust live ingest mechanisms.
-
-## 1. Conformance notation
-The key words "MUST," "MUST NOT," "REQUIRED," "SHALL," "SHALL NOT," "SHOULD," "SHOULD NOT," "RECOMMENDED," "MAY," and "OPTIONAL" in this document are to be interpreted as they are described in RFC 2119.
-
-## 2. Service diagram
-The following diagram shows the high-level architecture of the live streaming service in Media Services:
-
-1. A live encoder pushes live feeds to channels that are created and provisioned via the Azure Media Services SDK.
-1. Channels, programs, and streaming endpoints in Media Services handle all the live streaming functionalities, including ingest, formatting, cloud DVR, security, scalability, and redundancy.
-1. Optionally, customers can choose to deploy an Azure Content Delivery Network layer between the streaming endpoint and the client endpoints.
-1. Client endpoints stream from the streaming endpoint by using HTTP Adaptive Streaming protocols. Examples include Microsoft Smooth Streaming, Dynamic Adaptive Streaming over HTTP (DASH, or MPEG-DASH), and Apple HTTP Live Streaming (HLS).
-
-![ingest flow][image1]
-
-## 3. Bitstream format – ISO 14496-12 fragmented MP4
-The wire format for live streaming ingest discussed in this document is based on [ISO-14496-12]. For a detailed explanation of fragmented MP4 format and extensions both for video-on-demand files and live streaming ingestion, see [[MS-SSTR]](/openspecs/windows_protocols/ms-sstr/8383f27f-7efe-4c60-832a-387274457251).
-
-### Live ingest format definitions
-The following list describes special format definitions that apply to live ingest into Azure Media Services:
-
-1. The **ftyp**, **Live Server Manifest Box**, and **moov** boxes MUST be sent with each request (HTTP POST). These boxes MUST be sent at the beginning of the stream and any time the encoder must reconnect to resume stream ingest. For more information, see Section 6 in [1].
-1. Section 3.3.2 in [1] defines an optional box called **StreamManifestBox** for live ingest. Due to the routing logic of the Azure load balancer, using this box is deprecated. The box SHOULD NOT be present when ingesting into Media Services. If this box is present, Media Services silently ignores it.
-1. The **TrackFragmentExtendedHeaderBox** box defined in 3.2.3.2 in [1] MUST be present for each fragment.
-1. Version 2 of the **TrackFragmentExtendedHeaderBox** box SHOULD be used to generate media segments that have identical URLs in multiple datacenters. The fragment index field is REQUIRED for cross-datacenter failover of index-based streaming formats such as Apple HLS and index-based MPEG-DASH. To enable cross-datacenter failover, the fragment index MUST be synced across multiple encoders and be increased by 1 for each successive media fragment, even across encoder restarts or failures.
-1. Section 3.3.6 in [1] defines a box called **MovieFragmentRandomAccessBox** (**mfra**) that MAY be sent at the end of live ingestion to indicate end-of-stream (EOS) to the channel. Due to the ingest logic of Media Services, using EOS is deprecated, and the **mfra** box for live ingestion SHOULD NOT be sent. If sent, Media Services silently ignores it. To reset the state of the ingest point, we recommend that you use [Channel Reset](/rest/api/media/operations/channel#reset_channels). We also recommend that you use [Program Stop](/rest/api/media/operations/program#stop_programs) to end a presentation and stream.
-1. The MP4 fragment duration SHOULD be constant, to reduce the size of the client manifests. A constant MP4 fragment duration also improves client download heuristics through the use of repeat tags. The duration MAY fluctuate to compensate for non-integer frame rates.
-1. The MP4 fragment duration SHOULD be between approximately 2 and 6 seconds.
-1. MP4 fragment timestamps and indexes (**TrackFragmentExtendedHeaderBox** `fragment_absolute_time` and `fragment_index`) SHOULD arrive in increasing order. Although Media Services is resilient to duplicate fragments, it has limited ability to reorder fragments according to the media timeline.
-
-## 4. Protocol format – HTTP
-ISO fragmented MP4-based live ingest for Media Services uses a standard long-running HTTP POST request to transmit encoded media data that is packaged in fragmented MP4 format to the service. Each HTTP POST sends a complete fragmented MP4 bitstream ("stream"), starting from the beginning with header boxes (**ftyp**, **Live Server Manifest Box**, and **moov** boxes), and continuing with a sequence of fragments (**moof** and **mdat** boxes). For URL syntax for the HTTP POST request, see section 9.2 in [1]. An example of the POST URL is:
-
-`http://customer.channel.mediaservices.windows.net/ingest.isml/streams(720p)`
-
-### Requirements
-Here are the detailed requirements:
-
-1. The encoder SHOULD start the broadcast by sending an HTTP POST request with an empty "body" (zero content length) by using the same ingestion URL. This can help the encoder quickly detect whether the live ingestion endpoint is valid, and if there are any authentication or other conditions required. Per HTTP protocol, the server can't send back an HTTP response until the entire request, including the POST body, is received. Given the long-running nature of a live event, without this step, the encoder might not be able to detect any error until it finishes sending all the data.
-1. The encoder MUST handle any errors or authentication challenges because of (1). If (1) succeeds with a 200 response, continue.
-1. The encoder MUST start a new HTTP POST request with the fragmented MP4 stream. The payload MUST start with the header boxes, followed by fragments. Note that the **ftyp**, **Live Server Manifest Box**, and **moov** boxes (in this order) MUST be sent with each request, even if the encoder must reconnect because the previous request was terminated prior to the end of the stream.
-1. The encoder MUST use chunked transfer encoding for uploading, because it's impossible to predict the entire content length of the live event (see the sketch after this list).
-1. When the event is over, after sending the last fragment, the encoder MUST gracefully end the chunked transfer encoding message sequence (most HTTP client stacks handle it automatically). The encoder MUST wait for the service to return the final response code, and then terminate the connection.
-1. The encoder MUST NOT use the `Events()` noun as described in 9.2 in [1] for live ingestion into Media Services.
-1. If the HTTP POST request terminates or times out with a TCP error prior to the end of the stream, the encoder MUST issue a new POST request by using a new connection, and follow the preceding requirements. Additionally, the encoder MUST resend the previous two MP4 fragments for each track in the stream, and resume without introducing a discontinuity in the media timeline. Resending the last two MP4 fragments for each track ensures that there is no data loss. In other words, if a stream contains both an audio and a video track, and the current POST request fails, the encoder must reconnect and resend the last two fragments for the audio track, which were previously successfully sent, and the last two fragments for the video track, which were previously successfully sent, to ensure that there is no data loss. The encoder MUST maintain a "forward" buffer of media fragments, which it resends when it reconnects.
-
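-The following is a minimal C# sketch of requirements 1, 3, and 4, assuming `ingestUrl` is the stream's ingest URL and `WriteHeaderBoxes`/`WriteFragments` are hypothetical helpers that emit the fragmented MP4 boxes:
-
-```csharp
-// (1) Probe the ingest endpoint with a zero-length POST body.
-var probe = (HttpWebRequest)WebRequest.Create(ingestUrl);
-probe.Method = "POST";
-probe.ContentLength = 0;
-using (var probeResponse = (HttpWebResponse)probe.GetResponse())
-{
-    if (probeResponse.StatusCode != HttpStatusCode.OK)
-        throw new InvalidOperationException("Ingest endpoint rejected the probe.");
-}
-
-// (3), (4) Long-running POST of the fMP4 stream with chunked transfer encoding.
-var post = (HttpWebRequest)WebRequest.Create(ingestUrl);
-post.Method = "POST";
-post.SendChunked = true;                // entire content length is unknown up front
-post.AllowWriteStreamBuffering = false; // stream fragments as they are produced
-using (Stream body = post.GetRequestStream())
-{
-    WriteHeaderBoxes(body);  // ftyp + Live Server Manifest Box + moov, in this order
-    WriteFragments(body);    // moof/mdat pairs until the event ends
-}
-```
-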
-## 5. Timescale
-[[MS-SSTR]](/openspecs/windows_protocols/ms-sstr/8383f27f-7efe-4c60-832a-387274457251) describes the usage of timescale for **SmoothStreamingMedia** (Section 2.2.2.1), **StreamElement** (Section 2.2.2.3), **StreamFragmentElement** (Section 2.2.2.6), and **LiveSMIL** (Section 2.2.7.3.1). If the timescale value is not present, the default value used is 10,000,000 (10 MHz). Although the Smooth Streaming format specification doesn't block usage of other timescale values, most encoder implementations use this default value (10 MHz) to generate Smooth Streaming ingest data. Due to the [Azure Media Dynamic Packaging](media-services-dynamic-packaging-overview.md) feature, we recommend that you use a 90-kHz timescale for video streams and a 44.1-kHz or 48-kHz timescale for audio streams. For example, with a 90-kHz timescale, a 6-second video fragment spans 6 × 90,000 = 540,000 timescale ticks. If different timescale values are used for different streams, the stream-level timescale MUST be sent. For more information, see [[MS-SSTR]](/openspecs/windows_protocols/ms-sstr/8383f27f-7efe-4c60-832a-387274457251).
-
-## 6. Definition of "stream"
-Stream is the basic unit of operation in live ingestion for composing live presentations, handling streaming failover, and redundancy scenarios. Stream is defined as one unique, fragmented MP4 bitstream that might contain a single track or multiple tracks. A full live presentation might contain one or more streams, depending on the configuration of the live encoders. The following examples illustrate various options of using streams to compose a full live presentation.
-
-**Example:**
-
-A customer wants to create a live streaming presentation that includes the following audio/video bitrates:
-
-Video – 3000 kbps, 1500 kbps, 750 kbps
-
-Audio – 128 kbps
-
-### Option 1: All tracks in one stream
-In this option, a single encoder generates all audio/video tracks, and then bundles them into one fragmented MP4 bitstream. The fragmented MP4 bitstream is then sent via a single HTTP POST connection. In this example, there is only one stream for this live presentation.
-
-![Streams-one track][image2]
-
-### Option 2: Each track in a separate stream
-In this option, the encoder puts one track into each fragment MP4 bitstream, and then posts all of the streams over separate HTTP connections. This can be done with one encoder or with multiple encoders. The live ingestion sees this live presentation as composed of four streams.
-
-![Streams-separate tracks][image3]
-
-### Option 3: Bundle audio track with the lowest bitrate video track into one stream
-In this option, the customer chooses to bundle the audio track with the lowest-bitrate video track in one fragment MP4 bitstream, and leave the other two video tracks as separate streams.
-
-![Streams-audio and video tracks][image4]
-
-### Summary
-This is not an exhaustive list of all possible ingestion options for this example. As a matter of fact, any grouping of tracks into streams is supported by live ingestion. Customers and encoder vendors can choose their own implementations based on engineering complexity, encoder capacity, and redundancy and failover considerations. However, in most cases, there is only one audio track for the entire live presentation, so it's important to ensure the health of the ingest stream that contains the audio track. This consideration often results in putting the audio track in its own stream (as in Option 2) or bundling it with the lowest-bitrate video track (as in Option 3). Also, for better redundancy and fault tolerance, sending the same audio track in two different streams (Option 2 with redundant audio tracks) or bundling the audio track with at least two of the lowest-bitrate video tracks (Option 3 with audio bundled in at least two video streams) is highly recommended for live ingest into Media Services.
-
-## 7. Service failover
-Given the nature of live streaming, good failover support is critical for ensuring the availability of the service. Media Services is designed to handle various types of failures, including network errors, server errors, and storage issues. When used in conjunction with proper failover logic from the live encoder side, customers can achieve a highly reliable live streaming service from the cloud.
-
-In this section, we discuss service failover scenarios. In this case, the failure happens somewhere within the service, and it manifests itself as a network error. Here are some recommendations for the encoder implementation for handling service failover:
-
-1. Use a 10-second timeout for establishing the TCP connection. If an attempt to establish the connection takes longer than 10 seconds, abort the operation and try again.
-1. Use a short timeout for sending the HTTP request message chunks. If the target MP4 fragment duration is N seconds, use a send timeout between N and 2N seconds; for example, if the MP4 fragment duration is 6 seconds, use a timeout of 6 to 12 seconds. If a timeout occurs, reset the connection, open a new connection, and resume stream ingest on the new connection.
-1. Maintain a rolling buffer that has the last two fragments for each track that were successfully and completely sent to the service (see the sketch after this list). If the HTTP POST request for a stream is terminated or times out prior to the end of the stream, open a new connection and begin another HTTP POST request, resend the stream headers, resend the last two fragments for each track, and resume the stream without introducing a discontinuity in the media timeline. This reduces the chance of data loss.
-1. We recommend that the encoder does NOT limit the number of retries to establish a connection or resume streaming after a TCP error occurs.
-1. After a TCP error:
-
- a. The current connection MUST be closed, and a new connection MUST be created for a new HTTP POST request.
-
- b. The new HTTP POST URL MUST be the same as the initial POST URL.
-
- c. The new HTTP POST MUST include stream headers (**ftyp**, **Live Server Manifest Box**, and **moov** boxes) that are identical to the stream headers in the initial POST.
-
- d. The last two fragments sent for each track must be resent, and streaming must resume without introducing a discontinuity in the media timeline. The MP4 fragment timestamps must increase continuously, even across HTTP POST requests.
-1. The encoder SHOULD terminate the HTTP POST request if data is not being sent at a rate commensurate with the MP4 fragment duration. An HTTP POST request that does not send data can prevent Media Services from quickly disconnecting from the encoder in the event of a service update. For this reason, the HTTP POST for sparse (ad signal) tracks SHOULD be short-lived, terminating as soon as the sparse fragment is sent.
-
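-The following is a minimal C# sketch of the rolling buffer in recommendation (3), using raw byte arrays for fragment payloads; `WriteHeaderBoxes` is a hypothetical helper that re-emits the stream headers:
-
-```csharp
-// Keep the last two completely sent fragments for each track (track ID -> queue).
-static readonly Dictionary<string, Queue<byte[]>> _buffers =
-    new Dictionary<string, Queue<byte[]>>();
-
-static void OnFragmentSent(string trackId, byte[] fragment)
-{
-    if (!_buffers.TryGetValue(trackId, out Queue<byte[]> queue))
-    {
-        _buffers[trackId] = queue = new Queue<byte[]>();
-    }
-    queue.Enqueue(fragment);
-    while (queue.Count > 2)
-    {
-        queue.Dequeue(); // retain only the two most recently sent fragments
-    }
-}
-
-static void OnReconnect(Stream newPostBody)
-{
-    WriteHeaderBoxes(newPostBody); // re-send ftyp + Live Server Manifest Box + moov
-    foreach (Queue<byte[]> queue in _buffers.Values)
-    {
-        foreach (byte[] fragment in queue)
-        {
-            newPostBody.Write(fragment, 0, fragment.Length); // resend before new data
-        }
-    }
-}
-```
-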
-## 8. Encoder failover
-Encoder failover is the second type of failover scenario that needs to be addressed for end-to-end live streaming delivery. In this scenario, the error condition occurs on the encoder side.
-
-![encoder failover][image5]
-
-The following expectations apply from the live ingestion endpoint when encoder failover happens:
-
-1. A new encoder instance SHOULD be created to continue streaming, as illustrated in the diagram (Stream for 3000k video, with dashed line).
-1. The new encoder MUST use the same URL for HTTP POST requests as the failed instance.
-1. The new encoderΓÇÖs POST request MUST include the same fragmented MP4 header boxes as the failed instance.
-1. The new encoder MUST be properly synced with all other running encoders for the same live presentation to generate synced audio/video samples with aligned fragment boundaries.
-1. The new stream MUST be semantically equivalent with the previous stream, and interchangeable at the header and fragment levels.
-1. The new encoder SHOULD try to minimize data loss. The `fragment_absolute_time` and `fragment_index` of media fragments SHOULD increase from the point where the encoder last stopped. The `fragment_absolute_time` and `fragment_index` SHOULD increase in a continuous manner, but it is permissible to introduce a discontinuity, if necessary. Media Services ignores fragments that it has already received and processed, so it's better to err on the side of resending fragments than to introduce discontinuities in the media timeline.
-
-## 9. Encoder redundancy
-For certain critical live events that demand even higher availability and quality of experience, we recommend that you use active-active redundant encoders to achieve seamless failover with no data loss.
-
-![encoder redundancy][image6]
-
-As illustrated in this diagram, two groups of encoders push two copies of each stream simultaneously into the live service. This setup is supported because Media Services can filter out duplicate fragments based on stream ID and fragment timestamp. The resulting live stream and archive are a single copy of all the streams, the best possible aggregation from the two sources. For example, in a hypothetical extreme case, as long as there is one encoder (it doesn't have to be the same one) running at any given point in time for each stream, the resulting live stream from the service is continuous without data loss.
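-
-Conceptually, this duplicate filtering can be pictured as keying each incoming fragment on its stream ID and fragment timestamp, as in the following sketch. This is an illustration of the behavior only, not actual service code.
-
-```csharp
-using System.Collections.Generic;
-
-// Illustrative sketch: fragments already received from the other encoder
-// group are ignored by keying on (stream ID, fragment timestamp).
-class FragmentDeduplicator
-{
-    private readonly HashSet<(string StreamId, ulong Timestamp)> _seen =
-        new HashSet<(string StreamId, ulong Timestamp)>();
-
-    // Returns true the first time a fragment is seen; false for a duplicate.
-    public bool Accept(string streamId, ulong fragmentTimestamp) =>
-        _seen.Add((streamId, fragmentTimestamp));
-}
-```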
-
-The requirements for this scenario are almost the same as the requirements in the "Encoder failover" case, with the exception that the second set of encoders are running at the same time as the primary encoders.
-
-## 10. Service redundancy
-For highly redundant global distribution, sometimes you must have cross-region backup to handle regional disasters. Expanding on the "Encoder redundancy" topology, customers can choose to have a redundant service deployment in a different region that's connected with the second set of encoders. Customers also can work with a Content Delivery Network provider to deploy a Global Traffic Manager in front of the two service deployments to seamlessly route client traffic. The requirements for the encoders are the same as the "Encoder redundancy" case. The only exception is that the second set of encoders needs to be pointed to a different live ingest endpoint. The following diagram shows this setup:
-
-![service redundancy][image7]
-
-## 11. Special types of ingestion formats
-This section discusses special types of live ingestion formats that are designed to handle specific scenarios.
-
-### Sparse track
-When delivering a live streaming presentation with a rich client experience, often it's necessary to transmit time-synced events or signals in-band with the main media data. An example of this is dynamic live ad insertion. This type of event signaling is different from regular audio/video streaming because of its sparse nature. In other words, the signaling data usually does not happen continuously, and the interval can be hard to predict. The concept of sparse track was designed to ingest and broadcast in-band signaling data.
-
-The following steps are a recommended implementation for ingesting sparse track:
-
-1. Create a separate fragmented MP4 bitstream that contains only sparse tracks, without audio/video tracks.
-1. In the **Live Server Manifest Box** as defined in Section 6 in [1], use the *parentTrackName* parameter to specify the name of the parent track. For more information, see section 4.2.1.2.1.2 in [1].
-1. In the **Live Server Manifest Box**, **manifestOutput** MUST be set to **true**.
-1. Given the sparse nature of the signaling event, we recommend the following:
-
- a. At the beginning of the live event, the encoder sends the initial header boxes to the service, which allows the service to register the sparse track in the client manifest.
-
- b. The encoder SHOULD terminate the HTTP POST request when data is not being sent. A long-running HTTP POST that does not send data can prevent Media Services from quickly disconnecting from the encoder in the event of a service update or server reboot. In these cases, the media server is temporarily blocked in a receive operation on the socket.
-
- c. During the time when signaling data is not available, the encoder SHOULD close the HTTP POST request. While the POST request is active, the encoder SHOULD send data.
-
- d. When sending sparse fragments, the encoder can set an explicit content-length header, if itΓÇÖs available.
-
- e. When sending sparse fragments with a new connection, the encoder SHOULD start sending from the header boxes, followed by the new fragments. This is for cases in which failover happens in-between, and the new sparse connection is being established to a new server that has not seen the sparse track before.
-
-   f. The sparse track fragment becomes available to the client when the corresponding parent track fragment that has an equal or larger timestamp value is made available to the client. For example, if the sparse fragment has a timestamp of t=1000, it is expected that after the client sees "video" (assuming the parent track name is "video") fragment timestamp 1000 or beyond, it can download the sparse fragment t=1000. Note that the actual signal could be used for a different position in the presentation timeline for its designated purpose. In this example, it's possible that the sparse fragment of t=1000 has an XML payload, which is for inserting an ad in a position that's a few seconds later.
-
- g. The payload of sparse track fragments can be in different formats (such as XML, text, or binary), depending on the scenario.
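-
-Taken together, recommendations (b) through (e) amount to a short-lived POST per sparse fragment: open a connection, send the header boxes followed by the fragment with an explicit Content-Length, then terminate. The following is a minimal sketch under those assumptions; the ingest URL and byte buffers are placeholders, not a prescribed API.
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Threading.Tasks;
-
-class SparseTrackSenderSketch
-{
-    static readonly HttpClient Client = new HttpClient();
-
-    // Sends one sparse fragment on a fresh, short-lived POST. The header boxes
-    // are resent first in case failover routed the connection to a server that
-    // has not seen this sparse track before.
-    static async Task SendSparseFragmentAsync(byte[] headerBoxes, byte[] sparseFragment)
-    {
-        byte[] payload = new byte[headerBoxes.Length + sparseFragment.Length];
-        Buffer.BlockCopy(headerBoxes, 0, payload, 0, headerBoxes.Length);
-        Buffer.BlockCopy(sparseFragment, 0, payload, headerBoxes.Length, sparseFragment.Length);
-
-        // ByteArrayContent sends an explicit Content-Length header.
-        var content = new ByteArrayContent(payload);
-
-        // Hypothetical publishing point URL for the sparse stream.
-        var response = await Client.PostAsync(
-            "http://example.channel.media.azure.net/ingest.isml/Streams(scte35)", content);
-        response.EnsureSuccessStatusCode();
-        // The request ends here, so the connection is not held open while idle.
-    }
-}
-```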
-
-### Redundant audio track
-In a typical HTTP adaptive streaming scenario (for example, Smooth Streaming or DASH), there's often only one audio track in the entire presentation. Unlike video tracks, which have multiple quality levels for the client to choose from in error conditions, the audio track can be a single point of failure if ingestion of the stream that contains the audio track is broken.
-
-To solve this problem, Media Services supports live ingestion of redundant audio tracks. The idea is that the same audio track can be sent multiple times in different streams. Although the service only registers the audio track once in the client manifest, it can use redundant audio tracks as backups for retrieving audio fragments if the primary audio track has issues. To ingest redundant audio tracks, the encoder needs to:
-
-1. Create the same audio track in multiple fragment MP4 bitstreams. The redundant audio tracks MUST be semantically equivalent, with the same fragment timestamps, and be interchangeable at the header and fragment levels.
-1. Ensure that the "audio" entry in the Live Server Manifest (Section 6 in [1]) is the same for all redundant audio tracks.
-
-The following implementation is recommended for redundant audio tracks:
-
-1. Send each unique audio track in a stream by itself. Also, send a redundant stream for each of these audio track streams, where the second stream differs from the first only by the identifier in the HTTP POST URL: {protocol}://{server address}/{publishing point path}/Streams({identifier}).
-1. Use separate streams to send the two lowest video bitrates. Each of these streams SHOULD also contain a copy of each unique audio track. For example, when multiple languages are supported, these streams SHOULD contain audio tracks for each language.
-1. Use separate server (encoder) instances to encode and send the redundant streams mentioned in (1) and (2).
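-
-For example (hypothetical publishing point and identifiers), the primary and redundant copies of the same audio stream would be posted to URLs that differ only in the identifier:
-
-```csharp
-// Hypothetical publishing point; only {identifier} differs between the two
-// copies of the same audio stream.
-string publishingPoint = "http://example.channel.media.azure.net/ingest.isml";
-string primaryAudioStreamUrl = publishingPoint + "/Streams(audio-en)";
-string redundantAudioStreamUrl = publishingPoint + "/Streams(audio-en-backup)";
-```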
-
-[image1]: ./media/media-services-fmp4-live-ingest-overview/media-services-image1.png
-[image2]: ./media/media-services-fmp4-live-ingest-overview/media-services-image2.png
-[image3]: ./media/media-services-fmp4-live-ingest-overview/media-services-image3.png
-[image4]: ./media/media-services-fmp4-live-ingest-overview/media-services-image4.png
-[image5]: ./media/media-services-fmp4-live-ingest-overview/media-services-image5.png
-[image6]: ./media/media-services-fmp4-live-ingest-overview/media-services-image6.png
-[image7]: ./media/media-services-fmp4-live-ingest-overview/media-services-image7.png
media-services Media Services Generate Fmp4 Chunks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-generate-fmp4-chunks.md
- Title: Create an Azure Media Services encoding task that generates fMP4 chunks | Microsoft Docs
-description: This topic shows how to create an encoding task that generates fMP4 chunks. When this task is used with the Media Encoder Standard or Media Encoder Premium Workflow encoder, the output asset will contain fMP4 chunks instead of ISO MP4 files.
-Previously updated: 03/10/2021
-# Create an encoding task that generates fMP4 chunks
-
-## Overview
-
-This article shows how to create an encoding task that generates fragmented MP4 (fMP4) chunks instead of ISO MP4 files. To generate fMP4 chunks, use the **Media Encoder Standard** or **Media Encoder Premium Workflow** encoder to create an encoding task, and specify the **AssetFormatOption.AdaptiveStreaming** option, as shown in this code snippet:
-
-```csharp
- task.OutputAssets.AddNew(@"Output Asset containing fMP4 chunks",
- options: AssetCreationOptions.None,
- formatOption: AssetFormatOption.AdaptiveStreaming);
-```
-
-## <a id="encoding_with_dotnet"></a>Encoding with Media Services .NET SDK
-
-The following code example uses the Media Services .NET SDK to perform the following tasks:
-
-- Create an encoding job.
-- Get a reference to the **Media Encoder Standard** encoder.
-- Add an encoding task to the job and specify the **Adaptive Streaming** preset.
-- Create an output asset that will contain fMP4 chunks and an .ism file.
-- Add an event handler to check the job progress.
-- Submit the job.
-
-#### Create and configure a Visual Studio project
-
-Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-#### Example
-
-```csharp
-using System;
-using System.Configuration;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System.Threading;
-
-namespace AdaptiveStreaming
-{
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Get an uploaded asset.
- var asset = _context.Assets.FirstOrDefault();
-
- // Encode and generate the output using the "Adaptive Streaming" preset.
- EncodeToAdaptiveBitrateMP4Set(asset);
-
- Console.ReadLine();
- }
- static public IAsset EncodeToAdaptiveBitrateMP4Set(IAsset asset)
- {
- // Declare a new job.
- IJob job = _context.Jobs.Create("Media Encoder Standard Job");
-
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Create a task
- ITask task = job.Tasks.AddNew("Media Encoder Standard encoding task",
- processor,
- "Adaptive Streaming",
- TaskOptions.None);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
-
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- // It is also specified to use AssetFormatOption.AdaptiveStreaming,
- // which means the output asset will contain fMP4 chunks.
-
- task.OutputAssets.AddNew(@"Output Asset containing fMP4 chunks",
- options: AssetCreationOptions.None,
- formatOption: AssetFormatOption.AdaptiveStreaming);
-
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
- job.Submit();
- job.GetExecutionProgressTask(CancellationToken.None).Wait();
-
- return job.OutputMediaAssets[0];
- }
- private static void JobStateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("Job is finished. Please wait while local tasks or downloads complete...");
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
-
- // Cast sender as a job.
- IJob job = (IJob)sender;
-
- // Display or log error details as needed.
- break;
- default:
- break;
- }
- }
- private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-                throw new ArgumentException(string.Format("Unknown media processor {0}", mediaProcessorName));
-
- return processor;
- }
- }
-}
-```
-
-## See Also
-[Media Services Encoding Overview](media-services-encode-asset.md)
-
media-services Media Services Get Media Processor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-get-media-processor.md
- Title: How to Create a media processor using the Azure Media Services SDK for .NET| Microsoft Docs
-description: Learn how to create a media processor component to encode, convert format, encrypt, or decrypt media content for Azure Media Services. Code samples are written in C# and use the Media Services SDK for .NET.
-Previously updated: 03/10/2021
-# How to: Get a Media Processor instance
-
-> [!div class="op_single_selector"]
-> * [.NET](media-services-get-media-processor.md)
-> * [REST](media-services-rest-get-media-processor.md)
-
-## Overview
-
-In Media Services, a media processor is a component that handles a specific processing task, such as encoding, format conversion, encrypting, or decrypting media content. You typically create a media processor when you create a task to encode, encrypt, or convert the format of media content.
-
-## Azure media processors
-
-The following topic provides lists of media processors:
-
-* [Encoding media processors](scenarios-and-availability.md)
-* [Analytics media processors](scenarios-and-availability.md)
-
-## Get Media Processor
-
-The following method shows how to get a media processor instance. The code example assumes the use of a module-level variable named **_context** to reference the server context as described in the section [How to: Connect to Media Services Programmatically](media-services-use-aad-auth-to-access-ams-api.md).
-
-```csharp
-private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
-{
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-        throw new ArgumentException(string.Format("Unknown media processor {0}", mediaProcessorName));
-
- return processor;
-}
-```
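-
-For example, assuming **_context** has been initialized as described above, you can fetch the latest version of Media Encoder Standard like this (illustrative usage only):
-
-```csharp
-// Illustrative usage: fetch the newest version of Media Encoder Standard.
-IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-Console.WriteLine("Using {0}, version {1}", processor.Name, processor.Version);
-```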
-
-## Next Steps
-
-Now that you know how to get a media processor instance, go to the [How to Encode an Asset](media-services-dotnet-encode-with-media-encoder-standard.md) topic, which shows you how to use Media Encoder Standard to encode an asset.
media-services Media Services Implement Failover https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-implement-failover.md
- Title: Implement failover streaming with Azure Media Services | Microsoft Docs
-description: This article shows how to implement a failover streaming scenario with Azure Media Services.
-Previously updated: 03/10/2021
-# Implement failover streaming with Media Services v2
-
-This walkthrough demonstrates how to copy content (blobs) from one asset into another in order to handle redundancy for on-demand streaming. This scenario is useful if you want to set up Azure Content Delivery Network to fail over between two datacenters, in case of an outage in one datacenter. This walkthrough uses the Azure Media Services SDK, the Azure Media Services REST API, and the Azure Storage SDK to demonstrate the following tasks:
-
-1. Set up a Media Services account in "Data Center A."
-2. Upload a mezzanine file into a source asset.
-3. Encode the asset into multi-bit rate MP4 files.
-4. Create a read-only shared access signature locator. This is for the source asset to have read access to the container in the storage account that is associated with the source asset.
-5. Get the container name of the source asset from the read-only shared access signature locator created in the previous step. This is necessary for copying blobs between storage accounts (explained later in this topic).
-6. Create an origin locator for the asset that was created by the encoding task.
-
-Then, to handle the failover:
-
-1. Set up a Media Services account in "Data Center B."
-2. Create a target empty asset in the target Media Services account.
-3. Create a write shared access signature locator. This is for the target empty asset to have write access to the container in the target storage account that is associated with the target asset.
-4. Use the Azure Storage SDK to copy blobs (asset files) between the source storage account in "Data Center A" and the target storage account in "Data Center B." These storage accounts are associated with the assets of interest.
-5. Associate blobs (asset files) that were copied to the target blob container with the target asset.
-6. Create an origin locator for the asset in "Data Center B", and specify the locator ID that was generated for the asset in "Data Center A."
-
-This gives you streaming URLs whose relative paths are the same (only the base URLs are different).
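-
-For example (hypothetical host names and locator ID), the relative path, which includes the locator ID, is identical in both datacenters; only the base URL differs:
-
-```csharp
-// Hypothetical playback URLs; the locator ID and file name are placeholders.
-string urlDatacenterA =
-    "http://amsacct1.streaming.mediaservices.windows.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest";
-string urlDatacenterB =
-    "http://amsacct2.streaming.mediaservices.windows.net/00000000-0000-0000-0000-000000000000/ignite.ism/manifest";
-```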
-
-Then, to handle any outages, you can create a Content Delivery Network on top of these origin locators.
-
-The following considerations apply:
-
-* The current version of Media Services SDK does not support programmatically generating IAssetFile information that would associate an asset with asset files. Instead, use the CreateFileInfos Media Services REST API to do this.
-* Storage encrypted assets (AssetCreationOptions.StorageEncrypted) are not supported for replication (because the encryption key is different in both Media Services accounts).
-* If you want to take advantage of dynamic packaging, make sure the streaming endpoint from which you want to stream your content is in the **Running** state.
-
-## Prerequisites
-
-* Two Media Services accounts in a new or existing Azure subscription. See [How to Create a Media Services Account](media-services-portal-create-account.md).
-* Operating system: Windows 7, Windows 2008 R2, or Windows 8.
-* .NET Framework 4.5 or .NET Framework 4.
-* Visual Studio 2010 SP1 or later version (Professional, Premium, Ultimate, or Express).
-
-## Set up your project
-
-In this section, you create and set up a C# Console Application project.
-
-1. Use Visual Studio to create a new solution that contains the C# Console Application project. Enter **HandleRedundancyForOnDemandStreaming** for the name, and then click **OK**.
-2. Create the **SupportFiles** folder on the same level as the **HandleRedundancyForOnDemandStreaming.csproj** project file. Under the **SupportFiles** folder, create the **OutputFiles** and **MP4Files** folders. Copy an .mp4 file into the **MP4Files** folder. (In this example, the **ignite.mp4** file is used.)
-3. Use **NuGet** to add references to DLLs related to Media Services. In **Visual Studio Main Menu**, select **TOOLS** > **NuGet Package Manager** > **Package Manager Console**. In the console window, type **Install-Package windowsazure.mediaservices**, and press Enter.
-4. Add other references that are required for this project: System.Runtime.Serialization, and System.Web.
-5. Replace the **using** statements that were added to the **Program.cs** file by default with the following ones:
-
-```csharp
-using System;
-using System.Globalization;
-using System.IO;
-using System.Net;
-using System.Text;
-using System.Threading;
-using System.Threading.Tasks;
-using System.Xml;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Microsoft.WindowsAzure.Storage;
-using Microsoft.WindowsAzure.Storage.Blob;
-using Microsoft.WindowsAzure.Storage.Auth;
-using System.Runtime.Serialization.Json;
-```
-
-## Add code that handles redundancy for on-demand streaming
-
-In this section, you create the ability to handle redundancy.
-
-1. Add the following class-level fields to the Program class.
-
- ```csharp
- private static readonly string storageNameTarget = "amsstorageacct2";
- private static readonly string storageKeyTarget = "00000000000000000000000000000000000000000000000000000000000000000000000000000000000000==";
- private static readonly string storageNameSource = "amsstorageacct1";
- private static readonly string storageKeySource = "00000000000000000000000000000000000000000000000000000000000000000000000000000000000000==";
-
- private static readonly string sourceApiServer = "https://amsacct1.restv2.westus2-2.media.azure.net/api/";
- private static readonly string targetApiServer = "https://amsacct2.restv2.eastus.media.azure.net/api/";
- private static string tokenSPSource = null;
- private static string tokenSPTarget = null;
-
- private static CloudMediaContext contextSource = null;
- private static CloudMediaContext contextTarget = null;
-
- // Base support files path. Update this field to point to the base path
- // for the local support files folder that you create.
-    private static readonly string SupportFiles = Path.GetFullPath(@"..\..\SupportFiles");
-
- // Paths to support files (within the above base path).
- private static readonly string SingleInputMp4Path = Path.GetFullPath(SupportFiles + @"\MP4Files\ignite.mp4");
- private static readonly string OutputFilesFolder = Path.GetFullPath(SupportFiles + @"\OutputFiles");
- ```
-2. Replace the default Main method definition with the following one. Method definitions that are called from Main are defined below.
-
- ```csharp
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentialsForAcct1 =
- new AzureAdTokenCredentials("microsoft.onmicrosoft.com",
- new AzureAdClientSymmetricKey("00000000-0000-0000-0000-000000000000", "00000000000000000000000000000000"),
- AzureEnvironments.AzureCloudEnvironment);
-
- AzureAdTokenCredentials tokenCredentialsForAcct2 =
- new AzureAdTokenCredentials("microsoft.onmicrosoft.com",
- new AzureAdClientSymmetricKey("00000000-0000-0000-0000-000000000000", "00000000000000000000000000000000"),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProviderSource = new AzureAdTokenProvider(tokenCredentialsForAcct1);
- var tokenProviderTarget = new AzureAdTokenProvider(tokenCredentialsForAcct2);
- try
- {
- tokenSPSource = tokenProviderSource.GetAccessToken().Item1.ToString();
- tokenSPTarget = tokenProviderTarget.GetAccessToken().Item1.ToString();
-
- contextSource = new CloudMediaContext(new Uri(sourceApiServer), tokenProviderSource);
- contextTarget = new CloudMediaContext(new Uri(targetApiServer), tokenProviderTarget);
-
- IAsset assetSingleFile = CreateAssetAndUploadSingleFile(contextSource,
- AssetCreationOptions.None,
- SingleInputMp4Path);
-
- IJob job = CreateEncodingJob(contextSource, assetSingleFile);
-
- if (job.State != JobState.Error)
- {
- IAsset sourceOutputAsset = job.OutputMediaAssets[0];
- // Get the locator for Smooth Streaming
- var sourceOriginLocator = GetStreamingOriginLocator(contextSource, sourceOutputAsset);
-
- Console.WriteLine("Locator Id: {0}", sourceOriginLocator.Id);
-
- // 1.Create a read-only SAS locator for the source asset to have read access to the container in the source Storage account (associated with the source Media Services account)
- var readSasLocator = GetSasReadLocator(contextSource, sourceOutputAsset);
-
- // 2.Get the container name of the source asset from the read-only SAS locator created in the previous step
- string containerName = (new Uri(readSasLocator.Path)).Segments[1];
-
- // 3.Create a target empty asset in the target Media Services account
- var targetAsset = CreateTargetEmptyAsset(contextTarget, containerName);
-
- // 4.Create a write SAS locator for the target empty asset to have write access to the container in the target Storage account (associated with the target Media Services account)
- ILocator writeSasLocator = CreateSasWriteLocator(contextTarget, targetAsset);
-
- // Get asset container name.
- string targetContainerName = (new Uri(writeSasLocator.Path)).Segments[1];
- Console.WriteLine(targetContainerName);
-
- // 5.Copy the blobs in the source container (source asset) to the target container (target empty asset)
- CopyBlobsFromDifferentStorage(containerName, targetContainerName, storageNameSource, storageKeySource, storageNameTarget, storageKeyTarget);
-
-                    // 6.Use the CreateFileInfos Media Services REST API to automatically generate all the IAssetFile entries for the target asset.
- // This API call is not supported in the current Media Services SDK for .NET.
- CreateFileInfos(targetApiServer, tokenSPTarget, targetAsset.Id);
- // Check if the AssetFiles are now associated with the asset.
- Console.WriteLine("Asset files associated with the {0} asset:", targetAsset.Name);
- foreach (var af in targetAsset.AssetFiles)
- {
- Console.WriteLine(af.Name);
- }
-
- // 7.Copy the Origin locator of the source asset to the target asset by using the same Id
- var replicatedLocatorPath = CreateTargetOriginLocatorWithRest(contextSource, contextTarget, sourceOriginLocator.Id, targetAsset.Id);
-
- // Create a full URL to the manifest file. Use this for playback
- // in streaming media clients.
- string originalUrlForClientStreaming = sourceOriginLocator.Path + GetPrimaryFile(sourceOutputAsset).Name + "/manifest";
-
- Console.WriteLine("Original Locator Path: {0}\n", originalUrlForClientStreaming);
-
- string replicatedUrlForClientStreaming = replicatedLocatorPath + GetPrimaryFile(sourceOutputAsset).Name + "/manifest";
-
- Console.WriteLine("Replicated Locator Path: {0}", replicatedUrlForClientStreaming);
-
- readSasLocator.Delete();
- writeSasLocator.Delete();
- }
-
- }
- catch (Exception e)
- {
- Console.WriteLine("Exception:" + e.ToString());
- }
- }
- ```
-3. The following method definitions are called from Main. See comments for more details about each method.
-
- >[!NOTE]
- >There is a limit of 1,000,000 policies for different Media Services policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). You should use the same policy ID if you are always using the same days and access permissions. For example, use the same ID for policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this topic](media-services-dotnet-manage-entities.md#limit-access-policies).
-
- ```csharp
- public static IAsset CreateAssetAndUploadSingleFile(CloudMediaContext context,
- AssetCreationOptions assetCreationOptions,
- string singleFilePath)
- {
- var assetName = "UploadSingleFile_" + DateTime.UtcNow.ToString();
-
- var asset = context.Assets.Create(assetName, assetCreationOptions);
-
- Console.WriteLine("Asset name: " + asset.Name);
-
- var fileName = Path.GetFileName(singleFilePath);
-
- var assetFile = asset.AssetFiles.Create(fileName);
-
- Console.WriteLine("Created assetFile {0}", assetFile.Name);
-
- Console.WriteLine("Upload {0}", assetFile.Name);
-
- assetFile.Upload(singleFilePath);
- Console.WriteLine("Done uploading of {0}", assetFile.Name);
-
- return asset;
- }
-
- public static IJob CreateEncodingJob(CloudMediaContext context, IAsset asset)
- {
- // Declare a new job.
- IJob job = context.Jobs.Create("My encoding job");
-
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName(context,
- "Media Encoder Standard");
-
- // Create a task with the encoding details, using a string preset.
- // In this case "Adaptive Streaming" preset is used.
- ITask task = job.Tasks.AddNew("My encoding task",
- processor,
- "Adaptive Streaming",
- TaskOptions.ProtectedConfiguration);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is in the clear (unencrypted).
- var outputAssetName = "OutputAsset_" + Guid.NewGuid();
- task.OutputAssets.AddNew(outputAssetName,
- AssetCreationOptions.None);
-
- // Use the following event handler to check job progress.
- job.StateChanged += new
- EventHandler<JobStateChangedEventArgs>(StateChanged);
-
- // Launch the job.
- job.Submit();
-
- // Optionally log job details. This displays basic job details
- // to the console and saves them to a JobDetails-{JobId}.txt file
- // in your output folder.
- LogJobDetails(context, job.Id);
-
- // Check job execution and wait for job to finish.
- Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
- progressJobTask.Wait();
-
- // Get an updated job reference.
- job = context.Jobs.Where(j => j.Id == job.Id).FirstOrDefault();
-        // Since the output asset contains a set of Smooth Streaming files,
-        // set the .ism file to be the primary file.
- if (job.State != JobState.Error)
- SetPrimaryFile(job.OutputMediaAssets[0]);
-
- return job;
- }
-
- public static ILocator GetStreamingOriginLocator(CloudMediaContext context, IAsset assetToStream)
- {
- // Get a reference to the streaming manifest file from the
- // collection of files in the asset.
- IAssetFile manifestFile = GetPrimaryFile(assetToStream);
-
- // Create a 30-day readonly access policy.
- // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.
-
- IAccessPolicy policy = context.AccessPolicies.Create("Streaming policy",
- TimeSpan.FromDays(30),
- AccessPermissions.Read);
-
- // Create a locator to the streaming content on an origin.
- ILocator originLocator = context.Locators.CreateLocator(LocatorType.OnDemandOrigin,
- assetToStream,
- policy,
- DateTime.UtcNow.AddMinutes(-5));
-
- // Return the locator.
- return originLocator;
- }
-
- public static ILocator GetSasReadLocator(CloudMediaContext context, IAsset asset)
- {
- IAccessPolicy accessPolicy = context.AccessPolicies.Create("File Download Policy",
- TimeSpan.FromDays(30), AccessPermissions.Read);
-
- ILocator sasLocator = context.Locators.CreateLocator(LocatorType.Sas,
- asset, accessPolicy);
-
- return sasLocator;
- }
-
- public static ILocator CreateSasWriteLocator(CloudMediaContext context, IAsset asset)
- {
-
- IAccessPolicy writePolicy = context.AccessPolicies.Create("Write Policy",
- TimeSpan.FromDays(30), AccessPermissions.Write);
-
- ILocator sasLocator = context.Locators.CreateLocator(LocatorType.Sas,
- asset, writePolicy);
-
- return sasLocator;
- }
-
- public static IAsset CreateTargetEmptyAsset(CloudMediaContext context, string containerName)
- {
- // Create a new asset.
- IAsset assetToBeProcessed = context.Assets.Create(containerName,
- AssetCreationOptions.None);
-
- return assetToBeProcessed;
- }
-
- public static string CreateTargetOriginLocatorWithRest(CloudMediaContext contextSource, CloudMediaContext contextTarget, string locatorIdToReplicate, string targetAssetId)
- {
- string locatorNewPath = "";
-
- if (!string.IsNullOrEmpty(tokenSPTarget))
- {
- var asset = contextTarget.Assets.Where(a => a.Id == targetAssetId).FirstOrDefault();
-
- // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.
- var accessPolicy = contextTarget.AccessPolicies.Create("RestTest", TimeSpan.FromDays(100),
- AccessPermissions.Read);
- if (asset != null)
- {
- string redirectedServiceUri = null;
-
- var xmlResponse = CreateLocator(targetApiServer, out redirectedServiceUri, tokenSPTarget,
- asset.Id, accessPolicy.Id,
- (int)LocatorType.OnDemandOrigin,
- DateTime.UtcNow.AddMinutes(-10), locatorIdToReplicate);
-
- Console.WriteLine("Redirected to: " + redirectedServiceUri);
- if (xmlResponse != null)
- {
- Console.WriteLine(String.Format("Locator Id: {0}",
- xmlResponse.GetElementsByTagName("Id")[0].InnerText));
- Console.WriteLine(String.Format("Locator Path: {0}",
- xmlResponse.GetElementsByTagName("Path")[0].InnerText));
-
- locatorNewPath = xmlResponse.GetElementsByTagName("Path")[0].InnerText;
- }
- }
- }
-
- return locatorNewPath;
- }
-
- public static void SetPrimaryFile(IAsset asset)
- {
- var ismAssetFiles = asset.AssetFiles.ToList().
- Where(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase));
-
- if (ismAssetFiles.Count() != 1)
-            throw new ArgumentException("The asset should have only one .ism file");
-
- ismAssetFiles.First().IsPrimary = true;
- ismAssetFiles.First().Update();
- }
-
- public static IAssetFile GetPrimaryFile(IAsset asset)
- {
- // Cast the reference to a true IAssetFile type.
- IAssetFile theManifest = asset.AssetFiles.ToList().
- Where(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase)).
- FirstOrDefault();
-
- return theManifest;
- }
-
- public static void CopyBlobsFromDifferentStorage(string sourceContainerName, string targetContainerName,
- string srcAccountName, string srcAccountKey,
- string destAccountName, string destAccountKey)
- {
- var srcAccount = new CloudStorageAccount(new StorageCredentials(srcAccountName, srcAccountKey), true);
- var destAccount = new CloudStorageAccount(new StorageCredentials(destAccountName, destAccountKey), true);
-
- var cloudBlobClient = srcAccount.CreateCloudBlobClient();
- var targetBlobClient = destAccount.CreateCloudBlobClient();
-
- var sourceContainer = cloudBlobClient.GetContainerReference(sourceContainerName);
- var targetContainer = targetBlobClient.GetContainerReference(targetContainerName);
- targetContainer.CreateIfNotExists();
-
- string blobToken = sourceContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy()
- {
- // Specify the expiration time for the signature.
- SharedAccessExpiryTime = DateTime.Now.AddDays(1),
- // Specify the permissions granted by the signature.
- Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Read
- });
-
- foreach (var sourceBlob in sourceContainer.ListBlobs())
- {
- string fileName = (sourceBlob as ICloudBlob).Name;
- var sourceCloudBlob = sourceContainer.GetBlockBlobReference(fileName);
- sourceCloudBlob.FetchAttributes();
-
- if (sourceCloudBlob.Properties.Length > 0)
- {
- // In Azure Media Services, the files are stored as block blobs.
- // Page blobs are not supported by Azure Media Services.
- var destinationBlob = targetContainer.GetBlockBlobReference(fileName);
- destinationBlob.StartCopy(new Uri(sourceBlob.Uri.AbsoluteUri + blobToken));
-
- while (true)
- {
-                    // StartCopy is an async operation,
- // so we want to check if the copy operation is completed before proceeding.
- // To do that, we call FetchAttributes on the blob and check the CopyStatus.
- destinationBlob.FetchAttributes();
- if (destinationBlob.CopyState.Status != CopyStatus.Pending)
- {
- break;
- }
- //It's still not completed. So wait for some time.
- System.Threading.Thread.Sleep(1000);
- }
- }
-
- Console.WriteLine(fileName);
- }
-
- Console.WriteLine("Done copying.");
- }
-
- private static IMediaProcessor GetLatestMediaProcessorByName(CloudMediaContext context, string mediaProcessorName)
- {
-
- var processor = context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-            throw new ArgumentException(string.Format("Unknown media processor {0}", mediaProcessorName));
-
- return processor;
- }
-
- // This method is a handler for events that track job progress.
- private static void StateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
-
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("********************");
- Console.WriteLine("Job is finished.");
- Console.WriteLine("Please wait while local tasks or downloads complete...");
- Console.WriteLine("********************");
- Console.WriteLine();
- Console.WriteLine();
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
- // Cast sender as a job.
- IJob job = (IJob)sender;
- // Display or log error details as needed.
- LogJobStop(null, job.Id);
- break;
- default:
- break;
- }
- }
-
- private static void LogJobStop(CloudMediaContext context, string jobId)
- {
- StringBuilder builder = new StringBuilder();
- IJob job = context.Jobs.Where(j => j.Id == jobId).FirstOrDefault();
-
- builder.AppendLine("\nThe job stopped due to cancellation or an error.");
- builder.AppendLine("***************************");
- builder.AppendLine("Job ID: " + job.Id);
- builder.AppendLine("Job Name: " + job.Name);
- builder.AppendLine("Job State: " + job.State.ToString());
- builder.AppendLine("Job started (server UTC time): " + job.StartTime.ToString());
- // Log job errors if they exist.
- if (job.State == JobState.Error)
- {
- builder.Append("Error Details: \n");
- foreach (ITask task in job.Tasks)
- {
- foreach (ErrorDetail detail in task.ErrorDetails)
- {
- builder.AppendLine(" Task Id: " + task.Id);
- builder.AppendLine(" Error Code: " + detail.Code);
- builder.AppendLine(" Error Message: " + detail.Message + "\n");
- }
- }
- }
- builder.AppendLine("***************************\n");
- // Write the output to a local file and to the console. The template
- // for an error output file is: JobStop-{JobId}.txt
- string outputFile = OutputFilesFolder + @"\JobStop-" + JobIdAsFileName(job.Id) + ".txt";
- WriteToFile(outputFile, builder.ToString());
- Console.Write(builder.ToString());
- }
-
- private static void LogJobDetails(CloudMediaContext context, string jobId)
- {
- StringBuilder builder = new StringBuilder();
- IJob job = context.Jobs.Where(j => j.Id == jobId).FirstOrDefault();
-
- builder.AppendLine("\nJob ID: " + job.Id);
- builder.AppendLine("Job Name: " + job.Name);
- builder.AppendLine("Job submitted (client UTC time): " + DateTime.UtcNow.ToString());
-
- // Write the output to a local file and to the console. The template
- // for an error output file is: JobDetails-{JobId}.txt
- string outputFile = OutputFilesFolder + @"\JobDetails-" + JobIdAsFileName(job.Id) + ".txt";
- WriteToFile(outputFile, builder.ToString());
- Console.Write(builder.ToString());
- }
-
- // Replace ":" with "_" in Job id values so they can
- // be used as log file names.
- private static string JobIdAsFileName(string jobID)
- {
- return jobID.Replace(":", "_");
- }
-
- // Write method output to the output files folder.
- private static void WriteToFile(string outFilePath, string fileContent)
- {
- StreamWriter sr = File.CreateText(outFilePath);
- sr.WriteLine(fileContent);
- sr.Close();
- }
-
- //////////////////////////////////////////////////////
- /// The following methods use REST calls.
- //////////////////////////////////////////////////////
- public static XmlDocument CreateLocator(string mediaServicesApiServerUri,
- out string redirectedMediaServicesApiServerUri,
- string bearerToken, string assetId,
- string accessPolicyId, int locatorType,
- DateTime startTime, string locatorIdToReplicate = null,
- bool autoRedirect = true)
- {
- if (string.IsNullOrEmpty(mediaServicesApiServerUri))
- {
- mediaServicesApiServerUri = "https://media.windows.net/api/";
- }
- if (!mediaServicesApiServerUri.EndsWith("/"))
- mediaServicesApiServerUri = mediaServicesApiServerUri + "/";
-
- if (string.IsNullOrEmpty(bearerToken)) throw new ArgumentNullException("BearerToken");
- if (string.IsNullOrEmpty(assetId)) throw new ArgumentNullException("assetId");
- if (string.IsNullOrEmpty(accessPolicyId)) throw new ArgumentNullException("accessPolicyId");
-
- redirectedMediaServicesApiServerUri = null;
- XmlDocument xmlResponse = null;
-
- StringBuilder sb = new StringBuilder();
- sb.Append("{ \"AssetId\" : \"" + assetId + "\"");
- sb.Append(", \"AccessPolicyId\" : \"" + accessPolicyId + "\"");
- sb.Append(", \"Type\" : \"" + locatorType + "\"");
- if (startTime != DateTime.MinValue)
- sb.Append(", \"StartTime\" : \"" + startTime.ToString("G", CultureInfo.CreateSpecificCulture("en-us")) + "\"");
- if (!string.IsNullOrEmpty(locatorIdToReplicate))
- sb.Append(", \"Id\" : \"" + locatorIdToReplicate + "\"");
- sb.Append("}");
-
- string requestbody = sb.ToString();
-
- try
- {
- var request = GenerateRequest("POST", mediaServicesApiServerUri, "Locators",
- null, bearerToken, requestbody);
- var response = (HttpWebResponse)request.GetResponse();
-
- switch (response.StatusCode)
- {
- case HttpStatusCode.MovedPermanently:
- //Recurse once with the mediaServicesApiServerUri redirect Location:
- if (autoRedirect)
- {
- redirectedMediaServicesApiServerUri = response.Headers["Location"];
- string secondRedirection = null;
- xmlResponse = CreateLocator(redirectedMediaServicesApiServerUri,
- out secondRedirection, bearerToken,
- assetId, accessPolicyId, locatorType,
- startTime, locatorIdToReplicate, false);
- }
- else
- {
- Console.WriteLine("Redirection to {0} failed.",
- mediaServicesApiServerUri);
- return null;
- }
- break;
- case HttpStatusCode.Created:
- using (Stream responseStream = response.GetResponseStream())
- {
- using (StreamReader stream = new StreamReader(responseStream))
- {
- string responseString = stream.ReadToEnd();
- var reader = JsonReaderWriterFactory.
- CreateJsonReader(Encoding.UTF8.GetBytes(responseString),
- new XmlDictionaryReaderQuotas());
-
- xmlResponse = new XmlDocument();
- reader.Read();
- xmlResponse.LoadXml(reader.ReadInnerXml());
- }
- }
- break;
-
- default:
- Console.WriteLine(response.StatusDescription);
- break;
- }
- }
- catch (WebException ex)
- {
- Console.WriteLine(ex.Message);
- }
-
- return xmlResponse;
- }
-
- public static void CreateFileInfos(string mediaServicesApiServerUri,
- string bearerToken,
- string assetId
- )
- {
- if (!mediaServicesApiServerUri.EndsWith("/"))
- mediaServicesApiServerUri = mediaServicesApiServerUri + "/";
-
- if (String.IsNullOrEmpty(bearerToken)) throw new ArgumentNullException("bearerToken");
- if (String.IsNullOrEmpty(assetId)) throw new ArgumentNullException("assetId");
-
- try
- {
- var request = GenerateRequest("GET", mediaServicesApiServerUri, "CreateFileInfos",
- String.Format(CultureInfo.InvariantCulture, "assetid='{0}'", assetId), bearerToken, null);
-
- using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
- {
- if (response.StatusCode == HttpStatusCode.MovedPermanently)
- {
- string redirectedMediaServicesApiUrl = response.Headers["Location"];
-
- CreateFileInfos(redirectedMediaServicesApiUrl, bearerToken, assetId);
- }
- else if ((response.StatusCode != HttpStatusCode.OK) &&
- (response.StatusCode != HttpStatusCode.Accepted) &&
- (response.StatusCode != HttpStatusCode.Created) &&
- (response.StatusCode != HttpStatusCode.NoContent))
- {
- throw new Exception("Invalid response received ");
- }
- }
- }
- catch (WebException ex)
- {
- Console.WriteLine(ex.Message);
- }
- }
-
- private static HttpWebRequest GenerateRequest(string verb,
- string mediaServicesApiServerUri,
- string resourcePath, string query,
- string bearerToken, string requestbody)
- {
- var uriBuilder = new UriBuilder(mediaServicesApiServerUri);
- uriBuilder.Path += resourcePath;
- if (query != null)
- {
- uriBuilder.Query = query;
- }
-
- HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(uriBuilder.Uri);
- request.AllowAutoRedirect = false; //We manage our own redirects.
- request.Method = verb;
-
- if (resourcePath == "$metadata")
- request.MediaType = "application/xml";
- else
- {
- request.ContentType = "application/json;odata=verbose";
- request.Accept = "application/json;odata=verbose";
- }
-
- request.Headers.Add("DataServiceVersion", "3.0");
- request.Headers.Add("MaxDataServiceVersion", "3.0");
- request.Headers.Add("x-ms-version", "2.19");
- request.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + bearerToken);
-
- if (requestbody != null)
- {
- var requestBytes = Encoding.ASCII.GetBytes(requestbody);
- request.ContentLength = requestBytes.Length;
-
- var requestStream = request.GetRequestStream();
- requestStream.Write(requestBytes, 0, requestBytes.Length);
- requestStream.Close();
- }
- else
- {
- request.ContentLength = 0;
- }
- return request;
- }
- ```
-
-## Content protection
-
-The example in this topic shows clear streaming. If you want to do protected streaming, there are a few other things you need to set up: use the same **AssetDeliveryPolicy**, use the same **ContentKeyAuthorizationPolicy** or external key server URL, and duplicate the content keys with the same identifier.
-
-For more information about content protection, see [Use AES-128 dynamic encryption and the key delivery service](media-services-playready-license-template-overview.md).
-
-## See also
-
-[Use Azure Webhooks to monitor Media Services job notifications](media-services-dotnet-check-job-progress-with-webhooks.md)
-
-## Next steps
-
-You can now use a traffic manager to route requests between the two datacenters, and thus fail over in case of any outages.
-
media-services Media Services Index Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-index-content.md
- Title: Indexing Media Files with Azure Media Indexer
-description: Azure Media Indexer enables you to make content of your media files searchable and to generate a full-text transcript for closed captioning and keywords. This topic shows how to use Media Indexer.
-Previously updated: 03/10/2021
-# Indexing Media Files with Azure Media Indexer
-
-> [!IMPORTANT]
-> It is recommended that customers migrate from Indexer v1 and Indexer v2 to using the [Media Services v3 AudioAnalyzerPreset Basic mode](../latest/analyze-video-audio-files-concept.md). The [Azure Media Indexer](media-services-index-content.md) media processor and [Azure Media Indexer 2 Preview](./legacy-components.md) media processors are being retired. For the retirement dates, see this [legacy components](legacy-components.md) topic.
-
-Azure Media Indexer enables you to make content of your media files searchable and to generate a full-text transcript for closed captioning and keywords. You can process one media file or multiple media files in a batch.
-
-When indexing content, make sure to use media files that have clear speech (without background music, noise, effects, or microphone hiss). Some examples of appropriate content are: recorded meetings, lectures, or presentations. The following content might not be suitable for indexing: movies, TV shows, anything with mixed audio and sound effects, poorly recorded content with background noise (hiss).
-
-An indexing job can generate the following outputs:
-
-* Closed caption files in the following formats: **TTML**, and **WebVTT**.
-
- Closed caption files include a tag called Recognizability, which scores an indexing job based on how recognizable the speech in the source video is. You can use the value of Recognizability to screen output files for usability. A low score would mean poor indexing results due to audio quality.
-* Keyword file (XML).
-
-This article shows how to create indexing jobs to **Index an asset** and **Index multiple files**.
-
-## Using configuration and manifest files for indexing tasks
-You can specify more details for your indexing tasks by using a task configuration. For example, you can specify which metadata to use for your media file. This metadata is used by the language engine to expand its vocabulary, and greatly improves the speech recognition accuracy. You are also able to specify your desired output files.
-
-You can also process multiple media files at once by using a manifest file.
-
-For more information, see [Task Preset for Azure Media Indexer](./legacy-components.md).
-
-## Index an asset
-The following method uploads a media file as an asset and creates a job to index the asset.
-
-If no configuration file is specified, the media file is indexed with all default settings.
-
-```csharp
- static bool RunIndexingJob(string inputMediaFilePath, string outputFolder, string configurationFile = "")
- {
- // Create an asset and upload the input media file to storage.
- IAsset asset = CreateAssetAndUploadSingleFile(inputMediaFilePath,
- "My Indexing Input Asset",
- AssetCreationOptions.None);
-
- // Declare a new job.
- IJob job = _context.Jobs.Create("My Indexing Job");
-
- // Get a reference to the Azure Media Indexer.
- string MediaProcessorName = "Azure Media Indexer";
- IMediaProcessor processor = GetLatestMediaProcessorByName(MediaProcessorName);
-
- // Read configuration from file if specified.
- string configuration = string.IsNullOrEmpty(configurationFile) ? "" : File.ReadAllText(configurationFile);
-
- // Create a task with the encoding details, using a string preset.
- ITask task = job.Tasks.AddNew("My Indexing Task",
- processor,
- configuration,
- TaskOptions.None);
-
- // Specify the input asset to be indexed.
- task.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- task.OutputAssets.AddNew("My Indexing Output Asset", AssetCreationOptions.None);
-
- // Use the following event handler to check job progress.
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(StateChanged);
-
- // Launch the job.
- job.Submit();
-
- // Check job execution and wait for job to finish.
- Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
- progressJobTask.Wait();
-
- // If job state is Error, the event handling
- // method for job progress should log errors. Here we check
- // for error state and exit if needed.
- if (job.State == JobState.Error)
- {
- Console.WriteLine("Exiting method due to job error.");
- return false;
- }
-
- // Download the job outputs.
- DownloadAsset(task.OutputAssets.First(), outputFolder);
-
- return true;
- }
-
- static IAsset CreateAssetAndUploadSingleFile(string filePath, string assetName, AssetCreationOptions options)
- {
- IAsset asset = _context.Assets.Create(assetName, options);
-
- var assetFile = asset.AssetFiles.Create(Path.GetFileName(filePath));
- assetFile.Upload(filePath);
-
- return asset;
- }
-
- static void DownloadAsset(IAsset asset, string outputDirectory)
- {
- foreach (IAssetFile file in asset.AssetFiles)
- {
- file.Download(Path.Combine(outputDirectory, file.Name));
- }
- }
-
- static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors
- .Where(p => p.Name == mediaProcessorName)
- .ToList()
- .OrderBy(p => new Version(p.Version))
- .LastOrDefault();
-
- if (processor == null)
-            throw new ArgumentException(string.Format("Unknown media processor {0}",
- mediaProcessorName));
-
- return processor;
- }
-```
-
-### <a id="output_files"></a>Output files
-By default, an indexing job generates the following output files. The files are stored in the first output asset.
-
-When there is more than one input media file, Indexer generates a manifest file for the job outputs, named 'JobResult.txt'. For each input media file, the resulting TTML, WebVTT, and keyword files are sequentially numbered and named using the "Alias."
-
-| File name | Description |
-| | |
-| **InputFileName.ttml**<br/>**InputFileName.vtt** |Closed Caption (CC) files in TTML and WebVTT formats.<br/><br/>They can be used to make audio and video files accessible to people with hearing disabilities.<br/><br/>Closed Caption files include a tag called <b>Recognizability</b> which scores an indexing job based on how recognizable the speech in the source video is. You can use the value of <b>Recognizability</b> to screen output files for usability. A low score would mean poor indexing results due to audio quality. |
-| **InputFileName.kw.xml<br/>InputFileName.info** |Keyword and info files. <br/><br/>Keyword file is an XML file that contains keywords extracted from the speech content, with frequency and offset information. <br/><br/>Info file is a plain-text file that contains granular information about each term recognized. The first line is special and contains the Recognizability score. Each subsequent line is a tab-separated list of the following data: start time, end time, word/phrase, confidence. The times are given in seconds and the confidence is given as a number from 0-1. <br/><br/>Example line: "1.20 1.45 word 0.67" <br/><br/>These files can be used for a number of purposes, such as performing speech analytics, exposing content to search engines such as Bing, Google, or Microsoft SharePoint to make the media files more discoverable, or even delivering more relevant ads. |
-| **JobResult.txt** |Output manifest, present only when indexing multiple files, containing the following information:<br/><br/><table border="1"><tr><th>InputFile</th><th>Alias</th><th>MediaLength</th><th>Error</th></tr><tr><td>a.mp4</td><td>Media_1</td><td>300</td><td>0</td></tr><tr><td>b.mp4</td><td>Media_2</td><td>0</td><td>3000</td></tr><tr><td>c.mp4</td><td>Media_3</td><td>600</td><td>0</td></tr></table><br/> |
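-
-The .info file format described above is straightforward to consume. The following is a minimal sketch, assuming the layout described in the table (the file path is a placeholder): it reads the Recognizability score from the first line, then the tab-separated term lines.
-
-```csharp
-using System;
-using System.Globalization;
-using System.IO;
-
-class InfoFileReaderSketch
-{
-    static void Main()
-    {
-        // Placeholder path to an indexing output file.
-        string[] lines = File.ReadAllLines(@"OutputFiles\InputFileName.info");
-
-        // The first line is special and contains the Recognizability score.
-        Console.WriteLine("Recognizability: " + lines[0].Trim());
-
-        // Each subsequent line: start time, end time, word/phrase, confidence.
-        for (int i = 1; i < lines.Length; i++)
-        {
-            string[] parts = lines[i].Split('\t');
-            if (parts.Length < 4) continue;
-
-            double start = double.Parse(parts[0], CultureInfo.InvariantCulture);
-            double end = double.Parse(parts[1], CultureInfo.InvariantCulture);
-            double confidence = double.Parse(parts[3], CultureInfo.InvariantCulture);
-            Console.WriteLine($"{parts[2]}: {start:F2}s-{end:F2}s (confidence {confidence:F2})");
-        }
-    }
-}
-```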
-
-If not all input media files are indexed successfully, the indexing job fails with error code 4000. For more information, see [Error codes](#error_codes).
-
-## Index multiple files
-The following method uploads multiple media files as an asset, and creates a job to index all these files in a batch.
-
-A manifest file with the ".lst" extension is created and uploaded to the asset. The manifest file contains the list of all the asset files. For more information, see [Task Preset for Azure Media Indexer](./legacy-components.md).
-
-```csharp
- static bool RunBatchIndexingJob(string[] inputMediaFiles, string outputFolder)
- {
- // Create an asset and upload to storage.
- IAsset asset = CreateAssetAndUploadMultipleFiles(inputMediaFiles,
- "My Indexing Input Asset - Batch Mode",
- AssetCreationOptions.None);
-
- // Create a manifest file that contains all the asset file names and upload to storage.
- string manifestFile = "input.lst";
- File.WriteAllLines(manifestFile, asset.AssetFiles.Select(f => f.Name).ToArray());
- var assetFile = asset.AssetFiles.Create(Path.GetFileName(manifestFile));
- assetFile.Upload(manifestFile);
-
- // Declare a new job.
- IJob job = _context.Jobs.Create("My Indexing Job - Batch Mode");
-
- // Get a reference to the Azure Media Indexer.
- string MediaProcessorName = "Azure Media Indexer";
- IMediaProcessor processor = GetLatestMediaProcessorByName(MediaProcessorName);
-
- // Read configuration.
- string configuration = File.ReadAllText("batch.config");
-
- // Create a task with the encoding details, using a string preset.
- ITask task = job.Tasks.AddNew("My Indexing Task - Batch Mode",
- processor,
- configuration,
- TaskOptions.None);
-
- // Specify the input asset to be indexed.
- task.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- task.OutputAssets.AddNew("My Indexing Output Asset - Batch Mode", AssetCreationOptions.None);
-
- // Use the following event handler to check job progress.
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(StateChanged);
-
- // Launch the job.
- job.Submit();
-
- // Check job execution and wait for job to finish.
- Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
- progressJobTask.Wait();
-
- // If job state is Error, the event handling
- // method for job progress should log errors. Here we check
- // for error state and exit if needed.
- if (job.State == JobState.Error)
- {
- Console.WriteLine("Exiting method due to job error.");
- return false;
- }
-
- // Download the job outputs.
- DownloadAsset(task.OutputAssets.First(), outputFolder);
-
- return true;
- }
-
- private static IAsset CreateAssetAndUploadMultipleFiles(string[] filePaths, string assetName, AssetCreationOptions options)
- {
- IAsset asset = _context.Assets.Create(assetName, options);
-
- foreach (string filePath in filePaths)
- {
- var assetFile = asset.AssetFiles.Create(Path.GetFileName(filePath));
- assetFile.Upload(filePath);
- }
-
- return asset;
- }
-```
-
-### Partially Succeeded Job
-If not all input media files are indexed successfully, the indexing job fails with error code 4000. For more information, see [Error codes](#error_codes).
-
-The same outputs as for succeeded jobs are generated. You can refer to the output manifest file to find out which input files failed, according to the Error column values. For input files that failed, the resulting TTML, WebVTT, and keyword files are NOT generated. A sketch of screening the manifest programmatically follows.
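-
-As a minimal sketch, assuming the manifest is a tab-separated table with a header row matching the columns shown earlier (InputFile, Alias, MediaLength, Error); verify the delimiter against an actual JobResult.txt before relying on it:
-
-```csharp
-using System.IO;
-using System.Linq;
-
-// Hypothetical helper: list the inputs that failed, based on the JobResult.txt
-// output manifest. Assumes a header row and tab-separated columns
-// (InputFile, Alias, MediaLength, Error); a nonzero Error value marks a failure.
-static string[] GetFailedInputFiles(string jobResultPath)
-{
-    return File.ReadLines(jobResultPath)
-        .Skip(1)                                   // skip the header row
-        .Select(line => line.Split('\t'))
-        .Where(cols => cols.Length >= 4 && cols[3].Trim() != "0")
-        .Select(cols => cols[0])
-        .ToArray();
-}
-```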
-
-### <a id="preset"></a> Task Preset for Azure Media Indexer
-The processing from Azure Media Indexer can be customized by providing an optional task preset alongside the task. The following table describes the format of this configuration XML.
-
-| Name | Required | Description |
-| | | |
-| **input** |false |Asset file(s) that you want to index.</p><p>Azure Media Indexer supports the following media file formats: MP4, WMV, MP3, M4A, WMA, AAC, WAV.</p><p>You can specify the file name(s) in the **name** or **list** attribute of the **input** element (as shown below). If you do not specify which asset file to index, the primary file is picked. If no primary asset file is set, the first file in the input asset is indexed.</p><p>To explicitly specify the asset file name, do:<br/>`<input name="TestFile.wmv">`<br/><br/>You can also index multiple asset files at once (up to 10 files). To do this:<br/><br/><ol class="ordered"><li><p>Create a text file (manifest file) and give it an .lst extension. </p></li><li><p>Add a list of all the asset file names in your input asset to this manifest file. </p></li><li><p>Add (upload) the manifest file to the asset. </p></li><li><p>Specify the name of the manifest file in the input's **list** attribute:<br/>`<input list="input.lst">`</li></ol><br/><br/>Note: If you add more than 10 files to the manifest file, the indexing job will fail with the 2006 error code. |
-| **metadata** |false |Metadata for the specified asset file(s), used for Vocabulary Adaptation. Useful to prepare Indexer to recognize non-standard vocabulary words such as proper nouns.<br/>`<metadata key="..." value="..."/>` <br/><br/>You can supply **values** for predefined **keys**. Currently the following keys are supported:<br/><br/>"title" and "description" - used for vocabulary adaptation to tweak the language model for your job and improve speech recognition accuracy. The values seed Internet searches to find contextually relevant text documents, using the contents to augment the internal dictionary for the duration of your Indexing task.<br/>`<metadata key="title" value="[Title of the media file]" />`<br/>`<metadata key="description" value="[Description of the media file]" />` |
-| **features** <br/><br/> Added in version 1.2. Currently, the only supported feature is speech recognition ("ASR"). |false |The Speech Recognition feature has the following settings keys:<table><tr><th><p>Key</p></th> <th><p>Description</p></th><th><p>Example value</p></th></tr><tr><td><p>Language</p></td><td><p>The natural language to be recognized in the multimedia file.</p></td><td><p>English, Spanish</p></td></tr><tr><td><p>CaptionFormats</p></td><td><p>A semicolon-separated list of the desired output caption formats (if any).</p></td><td><p>ttml;webvtt</p></td></tr><tr><td><p>GenerateKeywords</p></td><td><p>A boolean flag specifying whether or not a keyword XML file is required.</p></td><td><p>True; False. </p></td></tr><tr><td><p>ForceFullCaption</p></td><td><p>A boolean flag specifying whether or not to force full captions (regardless of confidence level). </p><p>Default is false, in which case words and phrases that have less than a 50% confidence level are omitted from the final caption outputs and replaced by ellipses ("..."). The ellipses are useful for caption quality control and auditing.</p></td><td><p>True; False. </p></td></tr></table> |
-
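-Tying this together with the earlier batch sample, the following sketch writes a minimal preset that indexes the files listed in input.lst and requests TTML/WebVTT captions and keywords. The `<input>` and `<features>` element names follow the table above; the exact wrapper structure (`configuration`, `feature`, `settings`) is an assumption to verify against the Task Preset reference before use.
-
-```csharp
-using System.IO;
-
-// Sketch of composing a task preset like the "batch.config" used earlier.
-// The element and key names follow the preset table above; treat the wrapper
-// structure as an assumption, not a definitive format.
-static void WriteBatchConfig(string path)
-{
-    string preset =
-@"<?xml version=""1.0"" encoding=""utf-8""?>
-<configuration version=""2.0"">
-  <input list=""input.lst"" />
-  <features>
-    <feature name=""ASR"">
-      <settings>
-        <add key=""Language"" value=""English"" />
-        <add key=""CaptionFormats"" value=""ttml;webvtt"" />
-        <add key=""GenerateKeywords"" value=""true"" />
-      </settings>
-    </feature>
-  </features>
-</configuration>";
-    File.WriteAllText(path, preset);
-}
-```
-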
-### <a id="error_codes"></a>Error codes
-In the case of an error, Azure Media Indexer reports one of the following error codes:
-
-| Code | Name | Possible Reasons |
-| | | |
-| 2000 |Invalid configuration |Invalid configuration |
-| 2001 |Invalid input assets |Missing input assets or empty asset. |
-| 2002 |Invalid manifest |Manifest is empty or manifest contains invalid items. |
-| 2003 |Failed to download media file |Invalid URL in manifest file. |
-| 2004 |Unsupported protocol |Protocol of media URL is not supported. |
-| 2005 |Unsupported file type |Input media file type is not supported. |
-| 2006 |Too many input files |There are more than 10 files in the input manifest. |
-| 3000 |Failed to decode media file |Unsupported media codec <br/>or<br/> Corrupted media file <br/>or<br/> No audio stream in input media. |
-| 4000 |Batch indexing partially succeeded |Some of the input media files failed to be indexed. For more information, see <a href="#output_files">Output files</a>. |
-| other |Internal errors |Contact the support team: indexer@microsoft.com. |
-
-## <a id="supported_languages"></a>Supported Languages
-Currently, the English and Spanish languages are supported.
-
-## Related links
-[Azure Media Services Analytics Overview](./legacy-components.md)
-
-[Indexing Media Files with Azure Media Indexer 2 Preview](./legacy-components.md)
media-services Media Services Input Metadata Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-input-metadata-schema.md
- Title: Azure Media Services input metadata schema | Microsoft Docs
-description: This article gives an overview of Azure Media Services input metadata schema.
-Previously updated: 03/10/2021
-# Input Metadata
-An encoding job is associated with an input asset (or assets) on which you want to perform some encoding tasks. Upon completion of a task, an output asset is produced. The output asset contains video, audio, thumbnails, manifest, etc. The output asset also contains a file with metadata about the input asset. The name of the metadata XML file has the following format: &lt;asset_id&gt;_metadata.xml (for example, 41114ad3-eb5e-4c57-8d92-5354e2b7d4a4_metadata.xml), where &lt;asset_id&gt; is the AssetId value of the input asset.
-
-Media Services does not pre-emptively scan input Assets to generate metadata. Input metadata is generated only as an artifact when an input Asset is processed in a Job. Hence this artifact is written to the output Asset. Different tools are used to generate metadata for input Assets and output Assets. Therefore, the input metadata has a slightly different schema than the output metadata.
-
-If you want to examine the metadata file, you can create a **SAS** locator and download the file to your local computer. You can find an example of how to create a SAS locator and download a file in [Using the Media Services .NET SDK Extensions](media-services-dotnet-get-started.md).
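-
-As a minimal sketch, assuming the legacy Media Services .NET SDK (`IAsset`, `IAssetFile`) and an existing reference to the job's output asset, the metadata file can be located by its name suffix and downloaded:
-
-```csharp
-using System;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Minimal sketch: find and download the <asset_id>_metadata.xml file from a
-// job's output asset. The "_metadata.xml" suffix follows the naming convention
-// described above.
-static void DownloadInputMetadata(IAsset outputAsset, string localFolder)
-{
-    foreach (IAssetFile file in outputAsset.AssetFiles)
-    {
-        if (file.Name.EndsWith("_metadata.xml", StringComparison.OrdinalIgnoreCase))
-        {
-            file.Download(System.IO.Path.Combine(localFolder, file.Name));
-        }
-    }
-}
-```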
-
-This article discusses the elements and types of the XML schema on which the input metadata (&lt;asset_id&gt;_metadata.xml) is based. For information about the file that contains metadata about the output asset, see [Output Metadata](media-services-output-metadata-schema.md).
-
-You can find the [Schema Code](media-services-input-metadata-schema.md#code) and an [XML example](media-services-input-metadata-schema.md#xml) at the end of this article.
-
-
-## <a name="AssetFiles"></a> AssetFiles element (root element)
-Contains a collection of [AssetFile](media-services-input-metadata-schema.md#AssetFile) elements for the encoding job.
-
-See an XML example at the end of this article: [XML example](media-services-input-metadata-schema.md#xml).
-
-| Name | Description |
-| | |
-| **AssetFile**<br /><br /> minOccurs="1" maxOccurs="unbounded" |A single child element. For more information, see [AssetFile element](media-services-input-metadata-schema.md#AssetFile). |
-
-## <a name="AssetFile"></a> AssetFile element
- Contains attributes and elements that describe an asset file.
-
- See an XML example at the end of this article: [XML example](media-services-input-metadata-schema.md#xml).
-
-### Attributes
-| Name | Type | Description |
-| | | |
-| **Name**<br /><br /> Required |**xs:string** |Asset file name. |
-| **Size**<br /><br /> Required |**xs:long** |Size of the asset file in bytes. |
-| **Duration**<br /><br /> Required |**xs:duration** |Content play back duration. Example: Duration="PT25M37.757S". |
-| **NumberOfStreams**<br /><br /> Required |**xs:int** |Number of streams in the asset file. |
-| **FormatNames**<br /><br /> Required |**xs:string** |Format names. |
-| **FormatVerboseName**<br /><br /> Required |**xs:string** |Format verbose name. |
-| **StartTime** |**xs:duration** |Content start time. Example: StartTime="PT2.669S". |
-| **OverallBitRate** |**xs:int** |Average bitrate of the asset file in kbps. |
-
-> [!NOTE]
-> The following four child elements must appear in a sequence.
->
->
-
-### Child elements
-| Name | Type | Description |
-| | | |
-| **Programs**<br /><br /> minOccurs="0" | |Collection of all [Programs element](media-services-input-metadata-schema.md#Programs) when the asset file is in MPEG-TS format. |
-| **VideoTracks**<br /><br /> minOccurs="0" | |Each physical asset file can contain zero or more video tracks interleaved into an appropriate container format. This element contains a collection of all [VideoTracks](media-services-input-metadata-schema.md#VideoTracks) that are part of the asset file. |
-| **AudioTracks**<br /><br /> minOccurs="0" | |Each physical asset file can contain zero or more audio tracks interleaved into an appropriate container format. This element contains a collection of all [AudioTracks](media-services-input-metadata-schema.md#AudioTracks) that are part of the asset file. |
-| **Metadata**<br /><br /> minOccurs="0" maxOccurs="unbounded" |[MetadataType](media-services-input-metadata-schema.md#MetadataType) |Asset file's metadata, represented as key/value strings. For example:<br /><br /> **&lt;Metadata key="language" value="eng" /&gt;** |
-
-## <a name="TrackType"></a> TrackType
-See an XML example at the end of this article: [XML example](media-services-input-metadata-schema.md#xml).
-
-### Attributes
-| Name | Type | Description |
-| | | |
-| **Id**<br /><br /> Required |**xs:int** |Zero-based index of this audio or video track.<br /><br /> This is not necessarily the TrackID used in an MP4 file. |
-| **Codec** |**xs:string** |Audio or video track codec string. |
-| **CodecLongName** |**xs:string** |Audio or video track codec long name. |
-| **TimeBase**<br /><br /> Required |**xs:string** |Time base. Example: TimeBase="1/48000" |
-| **NumberOfFrames** |**xs:int** |Number of frames (present for video tracks). |
-| **StartTime** |**xs:duration** |Track start time. Example: StartTime="PT2.669S" |
-| **Duration** |**xs:duration** |Track duration. Example: Duration="PT25M37.757S". |
-
-> [!NOTE]
-> The following two child elements must appear in a sequence.
->
->
-
-### Child elements
-| Name | Type | Description |
-| | | |
-| **Disposition**<br /><br /> minOccurs="0" maxOccurs="1" |[StreamDispositionType](media-services-input-metadata-schema.md#StreamDispositionType) |Contains presentation information (for example, whether a particular audio track is for visually impaired viewers). |
-| **Metadata**<br /><br /> minOccurs="0" maxOccurs="unbounded" |[MetadataType](media-services-input-metadata-schema.md#MetadataType) |Generic key/value strings that can be used to hold a variety of information. For example, key="language" and value="eng". |
-
-## <a name="AudioTrackType"></a> AudioTrackType (inherits from TrackType)
- **AudioTrackType** is a global complex type that inherits from [TrackType](media-services-input-metadata-schema.md#TrackType).
-
- The type represents a specific audio track in the asset file.
-
- See an XML example at the end of this article: [XML example](media-services-input-metadata-schema.md#xml).
-
-### Attributes
-| Name | Type | Description |
-| | | |
-| **SampleFormat** |**xs:string** |Sample format. |
-| **ChannelLayout** |**xs:string** |Channel layout. |
-| **Channels**<br /><br /> Required |**xs:int** |Number (0 or more) of audio channels. |
-| **SamplingRate**<br /><br /> Required |**xs:int** |Audio sampling rate in samples/sec or Hz. |
-| **Bitrate** |**xs:int** |Average audio bit rate in bits per second, as calculated from the asset file. Only the elementary stream payload is counted, and the packaging overhead is not included in this count. |
-| **BitsPerSample** |**xs:int** |Bits per sample for the wFormatTag format type. |
-
-## <a name="VideoTrackType"></a> VideoTrackType (inherits from TrackType)
-**VideoTrackType** is a global complex type that inherits from [TrackType](media-services-input-metadata-schema.md#TrackType).
-
-The type represents a specific video track in the asset file.
-
-See an XML example at the end of this article: [XML example](media-services-input-metadata-schema.md#xml).
-
-### Attributes
-| Name | Type | Description |
-| | | |
-| **FourCC**<br /><br /> Required |**xs:string** |Video codec FourCC code. |
-| **Profile** |**xs:string** |Video track's profile. |
-| **Level** |**xs:string** |Video track's level. |
-| **PixelFormat** |**xs:string** |Video track's pixel format. |
-| **Width**<br /><br /> Required |**xs:int** |Encoded video width in pixels. |
-| **Height**<br /><br /> Required |**xs:int** |Encoded video height in pixels. |
-| **DisplayAspectRatioNumerator**<br /><br /> Required |**xs:double** |Video display aspect ratio numerator. |
-| **DisplayAspectRatioDenominator**<br /><br /> Required |**xs:double** |Video display aspect ratio denominator. |
-| **SampleAspectRatioNumerator** |**xs:double** |Video sample aspect ratio numerator. |
-| **SampleAspectRatioDenominator** |**xs:double** |Video sample aspect ratio denominator. |
-| **FrameRate**<br /><br /> Required |**xs:decimal** |Measured video frame rate in .3f format. |
-| **Bitrate** |**xs:int** |Average video bit rate in kilobits per second, as calculated from the asset file. Only the elementary stream payload is counted, and the packaging overhead is not included. |
-| **MaxGOPBitrate** |**xs:int** |Max GOP average bitrate for this video track, in kilobits per second. |
-| **HasBFrames** |**xs:int** |Video track number of B frames. |
-
-## <a name="MetadataType"></a> MetadataType
-**MetadataType** is a global complex type that describes metadata of an asset file as key/value strings. For example, key="language" and value="eng".
-
-See an XML example at the end of this article: [XML example](media-services-input-metadata-schema.md#xml).
-
-### Attributes
-| Name | Type | Description |
-| | | |
-| **key**<br /><br /> Required |**xs:string** |The key in the key/value pair. |
-| **value**<br /><br /> Required |**xs:string** |The value in the key/value pair. |
-
-## <a name="ProgramType"></a> ProgramType
-**ProgramType** is a global complex type that describes a program.
-
-### Attributes
-| Name | Type | Description |
-| | | |
-| **ProgramId**<br /><br /> Required |**xs:int** |Program ID. |
-| **NumberOfPrograms**<br /><br /> Required |**xs:int** |Number of programs. |
-| **PmtPid**<br /><br /> Required |**xs:int** |Program Map Tables (PMTs) contain information about programs. For more information, see [PMT](https://en.wikipedia.org/wiki/MPEG_transport_stream#PMT). |
-| **PcrPid**<br /><br /> Required |**xs:int** |Program Clock Reference (PCR) PID, used by the decoder. For more information, see [PCR](https://en.wikipedia.org/wiki/MPEG_transport_stream#PCR). |
-| **StartPTS** |**xs:long** |Starting presentation time stamp. |
-| **EndPTS** |**xs:long** |Ending presentation time stamp. |
-
-## <a name="StreamDispositionType"></a> StreamDispositionType
-**StreamDispositionType** is a global complex type that describes the stream.
-
-See an XML example at the end of this article: [XML example](media-services-input-metadata-schema.md#xml).
-
-### Attributes
-| Name | Type | Description |
-| | | |
-| **Default**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this is the default presentation. |
-| **Dub**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this is the dubbed presentation. |
-| **Original**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this is the original presentation. |
-| **Comment**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this track contains commentary. |
-| **Lyrics**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this track contains lyrics. |
-| **Karaoke**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this represents the karaoke track (background music, no vocals). |
-| **Forced**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this is the forced presentation. |
-| **HearingImpaired**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this track is for people who are hard of hearing. |
-| **VisualImpaired**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this track is for the visually impaired. |
-| **CleanEffects**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this track has clean effects. |
-| **AttachedPic**<br /><br /> Required |**xs:int** |Set this attribute to 1 to indicate this track has pictures. |
-
-## <a name="Programs"></a> Programs element
-Wrapper element holding multiple **Program** elements.
-
-### Child elements
-| Name | Type | Description |
-| | | |
-| **Program**<br /><br /> minOccurs="0" maxOccurs="unbounded" |[ProgramType](media-services-input-metadata-schema.md#ProgramType) |For asset files that are in MPEG-TS format, contains information about programs in the asset file. |
-
-## <a name="VideoTracks"></a> VideoTracks element
- Wrapper element holding multiple **VideoTrack** elements.
-
- See an XML example at the end of this article: [XML example](media-services-input-metadata-schema.md#xml).
-
-### Child elements
-| Name | Type | Description |
-| | | |
-| **VideoTrack**<br /><br /> minOccurs="0" maxOccurs="unbounded" |[VideoTrackType (inherits from TrackType)](media-services-input-metadata-schema.md#VideoTrackType) |Contains information about video tracks in the asset file. |
-
-## <a name="AudioTracks"></a> AudioTracks element
- Wrapper element holding multiple **AudioTrack** elements.
-
- See an XML example at the end of this article: [XML example](media-services-input-metadata-schema.md#xml).
-
-### Child elements
-| Name | Type | Description |
-| | | |
-| **AudioTrack**<br /><br /> minOccurs="0" maxOccurs="unbounded" |[AudioTrackType (inherits from TrackType)](media-services-input-metadata-schema.md#AudioTrackType) |Contains information about audio tracks in the asset file. |
-
-## <a name="code"></a> Schema Code
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata" version="1.0"
- xmlns="http://schemas.microsoft.com/windowsazure/mediaservices/2014/07/mediaencoder/inputmetadata"
- targetNamespace="http://schemas.microsoft.com/windowsazure/mediaservices/2014/07/mediaencoder/inputmetadata"
- elementFormDefault="qualified">
-
- <xs:complexType name="MetadataType">
- <xs:attribute name="key" type="xs:string" use="required"/>
- <xs:attribute name="value" type="xs:string" use="required"/>
- </xs:complexType>
-
- <xs:complexType name="ProgramType">
- <xs:attribute name="ProgramId" type="xs:int" use="required">
- <xs:annotation>
- <xs:documentation>Program Id</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="NumberOfPrograms" type="xs:int" use="required">
- <xs:annotation>
- <xs:documentation>Number of programs</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="PmtPid" type="xs:int" use="required">
- <xs:annotation>
- <xs:documentation>pmt pid</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="PcrPid" type="xs:int" use="required">
- <xs:annotation>
- <xs:documentation>pcr pid</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="StartPTS" type="xs:long">
- <xs:annotation>
- <xs:documentation>start pts</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="EndPTS" type="xs:long">
- <xs:annotation>
- <xs:documentation>end pts</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- </xs:complexType>
-
- <xs:complexType name="StreamDispositionType">
- <xs:attribute name="Default" type="xs:int" use="required" />
- <xs:attribute name="Dub" type="xs:int" use="required" />
- <xs:attribute name="Original" type="xs:int" use="required" />
- <xs:attribute name="Comment" type="xs:int" use="required" />
- <xs:attribute name="Lyrics" type="xs:int" use="required" />
- <xs:attribute name="Karaoke" type="xs:int" use="required" />
- <xs:attribute name="Forced" type="xs:int" use="required" />
- <xs:attribute name="HearingImpaired" type="xs:int" use="required" />
- <xs:attribute name="VisualImpaired" type="xs:int" use="required" />
- <xs:attribute name="CleanEffects" type="xs:int" use="required" />
- <xs:attribute name="AttachedPic" type="xs:int" use="required" />
- </xs:complexType>
-
- <xs:complexType name="TrackType" abstract="true">
- <xs:sequence>
- <xs:element name="Disposition" type="StreamDispositionType" minOccurs="0" maxOccurs="1"/>
- <xs:element name="Metadata" type="MetadataType" minOccurs="0" maxOccurs="unbounded"/>
- </xs:sequence>
- <xs:attribute name="Id" use="required">
- <xs:annotation>
- <xs:documentation>zero-based index of this video track. Note: this is not necessarily the TrackID as used in an MP4 file</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Codec" type="xs:string">
- <xs:annotation>
- <xs:documentation>video track codec string</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="CodecLongName" type="xs:string">
- <xs:annotation>
- <xs:documentation>video track codec long name</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="TimeBase" type="xs:string" use="required">
- <xs:annotation>
- <xs:documentation>Time base. Example: TimeBase="1/48000"</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="NumberOfFrames">
- <xs:annotation>
- <xs:documentation>number of frames</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="StartTime" type="xs:duration">
- <xs:annotation>
- <xs:documentation>Track start time. Example: StartTime="PT2.669S"</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Duration" type="xs:duration">
- <xs:annotation>
- <xs:documentation>Track duration. Example: Duration="PT25M37.757S"</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- </xs:complexType>
-
- <xs:complexType name="VideoTrackType">
- <xs:annotation>
- <xs:documentation>A specific video track in the parent AssetFile</xs:documentation>
- </xs:annotation>
- <xs:complexContent>
- <xs:extension base="TrackType">
- <xs:attribute name="FourCC" type="xs:string" use="required">
- <xs:annotation>
- <xs:documentation>video codec FourCC code</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Profile" type="xs:string">
- <xs:annotation>
- <xs:documentation>profile</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Level" type="xs:string">
- <xs:annotation>
- <xs:documentation>level</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="PixelFormat" type="xs:string">
- <xs:annotation>
- <xs:documentation>Video track's pixel format</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Width" use="required">
- <xs:annotation>
- <xs:documentation>encoded video width in pixels</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Height" use="required">
- <xs:annotation>
- <xs:documentation>encoded video height in pixels</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="DisplayAspectRatioNumerator" use="required">
- <xs:annotation>
- <xs:documentation>video display aspect ratio numerator</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:double">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="DisplayAspectRatioDenominator" use="required">
- <xs:annotation>
- <xs:documentation>video display aspect ratio denominator</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:double">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="SampleAspectRatioNumerator">
- <xs:annotation>
- <xs:documentation>video sample aspect ratio numerator</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:double">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="SampleAspectRatioDenominator">
- <xs:annotation>
- <xs:documentation>video sample aspect ratio denominator</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:double">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="FrameRate" use="required">
- <xs:annotation>
- <xs:documentation>measured video frame rate in .3f format</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:decimal">
- <xs:minInclusive value="0"/>
- <xs:fractionDigits value="3"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Bitrate">
- <xs:annotation>
- <xs:documentation>average video bit rate in kilobits per second, as calculated from the AssetFile. Counts only the elementary stream payload, and does not include the packaging overhead</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="MaxGOPBitrate">
- <xs:annotation>
- <xs:documentation>Max GOP average bitrate for this video track, in kilobits per second</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="HasBFrames" type="xs:int">
- <xs:annotation>
- <xs:documentation>video track number of B frames</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- </xs:extension>
- </xs:complexContent>
- </xs:complexType>
-
- <xs:complexType name="AudioTrackType">
- <xs:annotation>
- <xs:documentation>a specific audio track in the parent AssetFile</xs:documentation>
- </xs:annotation>
- <xs:complexContent>
- <xs:extension base="TrackType">
- <xs:attribute name="SampleFormat" type="xs:string">
- <xs:annotation>
- <xs:documentation>sample format</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="ChannelLayout" type="xs:string">
- <xs:annotation>
- <xs:documentation>channel layout</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Channels" use="required">
- <xs:annotation>
- <xs:documentation>number of audio channels</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="SamplingRate" use="required">
- <xs:annotation>
- <xs:documentation>audio sampling rate in samples/sec or Hz</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Bitrate">
- <xs:annotation>
- <xs:documentation>average audio bit rate in bits per second, as calculated from the AssetFile. Counts only the elementary stream payload, and does not include the packaging overhead</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="BitsPerSample">
- <xs:annotation>
- <xs:documentation>Bits per sample for the wFormatTag format type</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- </xs:extension>
- </xs:complexContent>
- </xs:complexType>
-
- <xs:element name="AssetFiles">
- <xs:annotation>
- <xs:documentation>Collection of AssetFile entries for the encoding job</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="AssetFile" minOccurs="1" maxOccurs="unbounded">
- <xs:annotation>
- <xs:documentation>asset file</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="Programs" minOccurs="0">
- <xs:annotation>
- <xs:documentation>This is the collection of all programs when file is MPEG-TS</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="Program" type="ProgramType" minOccurs="0" maxOccurs="unbounded" />
- </xs:sequence>
- </xs:complexType>
- </xs:element>
- <xs:element name="VideoTracks" minOccurs="0">
- <xs:annotation>
- <xs:documentation>Each physical AssetFile can contain in it zero or more video tracks interleaved into an appropriate container format. This is the collection of all those video tracks</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="VideoTrack" type="VideoTrackType" minOccurs="0" maxOccurs="unbounded" />
- </xs:sequence>
- </xs:complexType>
- </xs:element>
- <xs:element name="AudioTracks" minOccurs="0">
- <xs:annotation>
- <xs:documentation>each physical AssetFile can contain in it zero or more audio tracks interleaved into an appropriate container format. This is the collection of all those audio tracks</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="AudioTrack" type="AudioTrackType" minOccurs="0" maxOccurs="unbounded" />
- </xs:sequence>
- </xs:complexType>
- </xs:element>
- <xs:element name="Metadata" type="MetadataType" minOccurs="0" maxOccurs="unbounded" />
- </xs:sequence>
- <xs:attribute name="Name" type="xs:string" use="required">
- <xs:annotation>
- <xs:documentation>the media asset file name</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Size" use="required">
- <xs:annotation>
- <xs:documentation>size of file in bytes</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:long">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Duration" type="xs:duration" use="required">
- <xs:annotation>
- <xs:documentation>content play back duration. Example: Duration="PT25M37.757S"</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="NumberOfStreams" type="xs:int" use="required">
- <xs:annotation>
- <xs:documentation>number of streams in asset file</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="FormatNames" type="xs:string" use="required">
- <xs:annotation>
- <xs:documentation>format names</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="FormatVerboseName" type="xs:string" use="required">
- <xs:annotation>
- <xs:documentation>format verbose names</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="StartTime" type="xs:duration">
- <xs:annotation>
- <xs:documentation>content start time. Example: StartTime="PT2.669S"</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="OverallBitRate">
- <xs:annotation>
- <xs:documentation>average bitrate of the asset file in kbps</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- </xs:complexType>
- </xs:element>
- </xs:sequence>
- </xs:complexType>
- </xs:element>
-</xs:schema>
-```
-## <a name="xml"></a> XML example
-The following is an example of the input metadata file.
-
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<AssetFiles xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.microsoft.com/windowsazure/mediaservices/2014/07/mediaencoder/inputmetadata">
- <AssetFile Name="bear.mp4" Size="1973733" Duration="PT12.678S" NumberOfStreams="2" FormatNames="mov,mp4,m4a,3gp,3g2,mj2" FormatVerboseName="QuickTime / MOV" StartTime="PT0S" OverallBitRate="1245">
- <VideoTracks>
- <VideoTrack Id="1" Codec="h264" CodecLongName="H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10" TimeBase="1/29970" NumberOfFrames="375" StartTime="PT0.034S" Duration="PT12.645S" FourCC="avc1" Profile="High" Level="4.1" PixelFormat="yuv420p" Width="512" Height="384" DisplayAspectRatioNumerator="4" DisplayAspectRatioDenominator="3" SampleAspectRatioNumerator="1" SampleAspectRatioDenominator="1" FrameRate="29.656" Bitrate="1043" HasBFrames="1">
- <Disposition Default="1" Dub="0" Original="0" Comment="0" Lyrics="0" Karaoke="0" Forced="0" HearingImpaired="0" VisualImpaired="0" CleanEffects="0" AttachedPic="0" />
- <Metadata key="creation_time" value="2010-03-10 16:11:56" />
- <Metadata key="language" value="eng" />
- <Metadata key="handler_name" value="Mainconcept MP4 Video Media Handler" />
- </VideoTrack>
- </VideoTracks>
- <AudioTracks>
- <AudioTrack Id="0" Codec="aac" CodecLongName="AAC (Advanced Audio Coding)" TimeBase="1/44100" NumberOfFrames="546" StartTime="PT0S" Duration="PT12.678S" SampleFormat="fltp" ChannelLayout="stereo" Channels="2" SamplingRate="44100" Bitrate="156" BitsPerSample="0">
- <Disposition Default="1" Dub="0" Original="0" Comment="0" Lyrics="0" Karaoke="0" Forced="0" HearingImpaired="0" VisualImpaired="0" CleanEffects="0" AttachedPic="0" />
- <Metadata key="creation_time" value="2010-03-10 16:11:56" />
- <Metadata key="language" value="eng" />
- <Metadata key="handler_name" value="Mainconcept MP4 Sound Media Handler" />
- </AudioTrack>
- </AudioTracks>
- <Metadata key="major_brand" value="mp42" />
- <Metadata key="minor_version" value="0" />
- <Metadata key="compatible_brands" value="mp42mp41" />
- <Metadata key="creation_time" value="2010-03-10 16:11:53" />
- <Metadata key="comment" value="Courtesy of National Geographic. Used by Permission." />
- </AssetFile>
-</AssetFiles>
-```
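-
-Once downloaded, the file can be read with standard XML tooling. The following is a minimal LINQ to XML sketch; note that the schema's target namespace (shown above) must be supplied when querying elements. The method name and the fields it prints are illustrative.
-
-```csharp
-using System;
-using System.Xml.Linq;
-
-// Minimal sketch: dump basic facts from an input metadata file with LINQ to XML.
-// The namespace below is the schema's target namespace shown earlier.
-static void DumpInputMetadata(string metadataPath)
-{
-    XNamespace ns = "http://schemas.microsoft.com/windowsazure/mediaservices/2014/07/mediaencoder/inputmetadata";
-    XDocument doc = XDocument.Load(metadataPath);
-
-    foreach (XElement assetFile in doc.Root.Elements(ns + "AssetFile"))
-    {
-        Console.WriteLine("File: {0}, Duration: {1}, Streams: {2}",
-            (string)assetFile.Attribute("Name"),
-            (string)assetFile.Attribute("Duration"),
-            (string)assetFile.Attribute("NumberOfStreams"));
-
-        foreach (XElement track in assetFile.Descendants(ns + "VideoTrack"))
-        {
-            Console.WriteLine("  Video track: {0}, {1}x{2}",
-                (string)track.Attribute("Codec"),
-                (string)track.Attribute("Width"),
-                (string)track.Attribute("Height"));
-        }
-    }
-}
-```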
-
media-services Media Services Inserting Ads On Client Side https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-inserting-ads-on-client-side.md
- Title: Inserting ads on the client side | Microsoft Docs
-description: This article demonstrates how to insert ads into your media on the client side.
-Previously updated: 03/10/2021
-# Inserting ads on the client side
-This article contains information on how to insert various types of ads on the client side.
-
-For information about closed captioning and ad support in Live streaming videos, see [Supported Closed Captioning and Ad Insertion Standards](media-services-live-streaming-with-onprem-encoders.md#cc_and_ads).
-
-> [!NOTE]
-> Azure Media Player does not currently support ads.
->
->
-
-## <a id="insert_ads_into_media"></a>Inserting Ads into your Media
-Azure Media Services provides support for ad insertion through the Microsoft Media Platform: Player Framework. Player frameworks with ad support are available for Windows 8, Silverlight, Windows Phone 8, and iOS devices. Each player framework contains sample code that shows you how to implement a player application. There are three different kinds of ads you can insert into your media:
-
-* **Linear** - full-frame ads that pause the main video.
-* **Nonlinear** - overlay ads that are displayed as the main video is playing, usually a logo or other static image placed within the player.
-* **Companion** - ads that are displayed outside of the player.
-
-Ads can be placed at any point in the main video's timeline. You must tell the player when to play the ad and which ads to play. This is done using a set of standard XML-based files: Video Ad Serving Template (VAST), Digital Video Multiple Ad Playlist (VMAP), Media Abstract Sequencing Template (MAST), and Video Player-Ad Interface Definition (VPAID). VAST files specify what ads to display. VMAP files specify when to play various ads and contain VAST XML. MAST files are another way to sequence ads and can also contain VAST XML. VPAID files define an interface between the video player and the ad or ad server.
-
-Each player framework works differently and each will be covered in its own article. This article describes the basic mechanisms used to insert ads. Video player applications request ads from an ad server. The ad server can respond in a number of ways:
-
-* Return a VAST file
-* Return a VMAP file (with embedded VAST)
-* Return a MAST file (with embedded VAST)
-* Return a VAST file with VPAID ads
-
-### Using a Video Ad Serving Template (VAST) File
-A VAST file specifies what ad or ads to display. The following XML is an example of a VAST file for a linear ad:
-
-```xml
- <VAST version="2.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="oxml.xsd">
- <Ad id="115571748">
- <InLine>
- <AdSystem version="2.0 alpha">Atlas</AdSystem>
- <AdTitle>Unknown</AdTitle>
- <Description>Unknown</Description>
- <Survey></Survey>
- <Error></Error>
- <Impression id="Atlas"><![CDATA[http://www.myserver.com/tracking-resource]]></Impression>
- <Creatives>
- <Creative id="video" sequence="0" AdID="">
- <Linear>
- <Duration>00:00:32</Duration>
- <TrackingEvents>
- <Tracking event="start"><![CDATA[http://www.myserver.com/start-tracking-resource]]></Tracking>
- <Tracking event="midpoint"><![CDATA[http://www.myserver.com/midpoint-tracking-resource]]></Tracking>
-                    <Tracking event="complete"><![CDATA[http://www.myserver.com/complete-tracking-resource]]></Tracking>
- <Tracking event="expand"><![CDATA[http://www.myserver.com/expand-tracking-resource]]></Tracking>
- </TrackingEvents>
- <VideoClicks>
- <ClickThrough id="Atlas Redirect"><![CDATA[http://www.myserver.com/click-resource]]></ClickThrough>
- <ClickTracking id="Spare"></ClickTracking>
- </VideoClicks>
- <MediaFiles>
- <MediaFile apiFramework="Windows Media" id="windows_progressive_200" maintainAspectRatio="true" scalable="true" delivery="progressive" bitrate="200" width="400" height="300" type="video/x-ms-wmv">
- <![CDATA[http://www.myserver.com/media/myad_200_4x3.wmv]]>
- </MediaFile>
- <MediaFile apiFramework="Windows Media" id="windows_progressive_300" maintainAspectRatio="true" scalable="true" delivery="progressive" bitrate="300" width="400" height="300" type="video/x-ms-wmv">
- <![CDATA[http://www.myserver.com/media/myad_300_4x3.wmv]]>
- </MediaFile>
- </MediaFiles>
- </Linear>
- </Creative>
- </Creatives>
- <Extensions>
- <Extension type="Atlas">
- </Extension>
- </Extensions>
- </InLine>
- </Ad>
- </VAST>
-```
-
-The linear ad is described by the <**Linear**> element. It specifies the duration of the ad, tracking events, click-through, click tracking, and a number of **MediaFile** elements. Tracking events are specified within the <**TrackingEvents**> element and allow an ad server to track various events that occur while viewing the ad. In this case, the start, midpoint, complete, and expand events are tracked. The start event occurs when the ad is displayed. The midpoint event occurs when at least 50% of the ad's timeline has been viewed. The complete event occurs when the ad has run to the end. The expand event occurs when the user expands the video player to full screen. Click-throughs are specified with a <**ClickThrough**> element within a <**VideoClicks**> element; it specifies a URI to a resource to display when the user clicks the ad. Click tracking is specified in a <**ClickTracking**> element, also within the <**VideoClicks**> element, which specifies a tracking resource for the player to request when the user clicks the ad. The <**MediaFile**> elements specify information about a specific encoding of an ad. When there is more than one <**MediaFile**> element, the video player can choose the best encoding for the platform.
-
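-To illustrate how a player might use the <**MediaFile**> list, here's a minimal sketch that picks the highest-bitrate encoding of a supported MIME type from a VAST 2.0 document like the one above. The method name and selection policy are illustrative, not part of the VAST specification.
-
-```csharp
-using System.Linq;
-using System.Xml.Linq;
-
-// Illustrative helper: pick the best-fitting MediaFile URI from a VAST 2.0
-// document. VAST 2.x ad responses use no default XML namespace, so elements
-// are queried by local name.
-static string PickMediaFileUri(XDocument vast, string supportedType, int maxBitrate)
-{
-    return vast.Descendants("MediaFile")
-        .Where(m => (string)m.Attribute("type") == supportedType)
-        .Where(m => ((int?)m.Attribute("bitrate") ?? 0) <= maxBitrate)
-        .OrderByDescending(m => (int?)m.Attribute("bitrate") ?? 0)
-        .Select(m => m.Value.Trim())   // the URI is CDATA-wrapped element text
-        .FirstOrDefault();
-}
-```
-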
-Linear ads can be displayed in a specified order. To do this, add additional `<Ad>` elements to the VAST file and specify the order using the sequence attribute. The following example illustrates this:
-
-```xml
- <VAST version="2.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="oxml.xsd">
- <Ad id="1" sequence="0">
- <InLine>
- <AdSystem version="2.0 alpha">Atlas</AdSystem>
- <AdTitle>Unknown</AdTitle>
- <Description>Unknown</Description>
- <Survey></Survey>
- <Error></Error>
- <Impression id="Atlas"><![CDATA[http://myserver.com/Impression/Ad1trackingResource]]></Impression>
- <Creatives>
- <Creative id="video" sequence="0" AdID="">
- <Linear>
- <Duration>00:00:32</Duration>
- <MediaFiles>
- <!-- ... -->
- </MediaFiles>
- </Linear>
- </Creative>
- </Creatives>
- </InLine>
- </Ad>
- <Ad id="2" sequence="1">
- <InLine>
- <AdSystem version="2.0 alpha">Atlas</AdSystem>
- <AdTitle>Unknown</AdTitle>
- <Description>Unknown</Description>
- <Survey></Survey>
- <Error></Error>
- <Impression id="Atlas"><![CDATA[http://myserver.com/Impression/Ad2trackingResource]]></Impression>
- <Creatives>
- <Creative id="video" sequence="0" AdID="">
- <Linear>
- <Duration>00:00:30</Duration>
- <MediaFiles>
- <!-- ... -->
- </MediaFiles>
- </Linear>
- </Creative>
- </Creatives>
- </InLine>
- </Ad>
- </VAST>
-```
-
-Nonlinear ads are specified in a `<Creative>` element as well. The following example shows a `<Creative>` element that describes a nonlinear ad.
-
-```xml
- <Creative id="video" sequence="1" AdID="">
- <NonLinearAds>
- <NonLinear width="216" height="121" minSuggestedDuration="00:00:15">
- <StaticResource creativeType="image/png"><![CDATA[http://myserver/images/image.png]]></StaticResource>
- <StaticResource creativeType="image/jpg"><![CDATA[http://myserver/images/image.jpg]]></StaticResource>
- </NonLinear>
- <TrackingEvents>
-        <Tracking event="acceptInvitation"><![CDATA[http://myserver/tracking/trackingID]]></Tracking>
- <Tracking event="collapse"><![CDATA[http://myserver/tracking/trackingID2]]></Tracking>
- </TrackingEvents>
- </NonLinearAds>
- </Creative>
-```
-
-The <**NonLinearAds**> element can contain one or more <**NonLinear**> elements, each of which can describe a nonlinear ad. The <**NonLinear**> element specifies the resource for the nonlinear ad. The resource can be a <**StaticResource**>, an <**IFrameResource**>, or an <**HTMLResource**>. <**StaticResource**> describes a non-HTML resource and defines a creativeType attribute that specifies how the resource is displayed:
-
-* **image/gif, image/jpeg, image/png** - the resource is displayed in an HTML <**img**> tag.
-* **application/x-javascript** - the resource is displayed in an HTML <**script**> tag.
-* **application/x-shockwave-flash** - the resource is displayed in a Flash player.
-
-**IFrameResource** describes an HTML resource that can be displayed in an IFrame. **HTMLResource** describes a piece of HTML code that can be inserted into a web page. **TrackingEvents** specify tracking events and the URI to request when the event occurs. In this sample, the acceptInvitation and collapse events are tracked. For more information on the **NonLinearAds** element and its children, see IAB.NET/VAST. Note that the **TrackingEvents** element is located within the **NonLinearAds** element rather than the **NonLinear** element.
-
-Companion ads are defined within a `<CompanionAds>` element. The `<CompanionAds>` element can contain one or more `<Companion>` elements. Each `<Companion>` element describes a companion ad and can contain a `<StaticResource>`, `<IFrameResource>`, or `<HTMLResource>`, which are specified in the same way as in a nonlinear ad. A VAST file can contain multiple companion ads, and the player application can choose the most appropriate ad to display. For more information about VAST, see [VAST 3.0](https://www.iab.net/media/file/VASTv3.0.pdf).
-
-### Using a Digital Video Multiple Ad Playlist (VMAP) File
-A VMAP file allows you to specify when ad breaks occur, how long each break is, how many ads can be displayed within a break, and what types of ads may be displayed during a break. The following is an example VMAP file that defines a single ad break:
-
-```xml
- <vmap:VMAP xmlns:vmap="http://www.iab.net/vmap-1.0" version="1.0">
- <vmap:AdBreak breakType="linear" breakId="mypre" timeOffset="start">
- <vmap:AdSource allowMultipleAds="true" followRedirects="true" id="1">
- <vmap:VASTData>
-            <VAST version="2.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="oxml.xsd">
- <Ad id="115571748">
- <InLine>
- <AdSystem version="2.0 alpha">Atlas</AdSystem>
- <AdTitle>Unknown</AdTitle>
- <Description>Unknown</Description>
- <Survey></Survey>
- <Error></Error>
- <Impression id="Atlas"><![CDATA[https://view.atdmt.com/000/sview/115571748/direct;ai.201582527;vt.2/01/634364885739970673]]></Impression>
- <Creatives>
- <Creative id="video" sequence="0" AdID="">
- <Linear>
- <Duration>00:00:32</Duration>
- <MediaFiles>
- <MediaFile apiFramework="Windows Media" id="windows_progressive_200" maintainAspectRatio="true" scalable="true" delivery="progressive" bitrate="200" width="400" height="300" type="video/x-ms-wmv">
- <![CDATA[http://smf.blob.core.windows.net/samples/ads/media/XBOX_HD_DEMO_700_1_000_200_4x3.wmv]]>
- </MediaFile>
- <MediaFile apiFramework="Windows Media" id="windows_progressive_300" maintainAspectRatio="true" scalable="true" delivery="progressive" bitrate="300" width="400" height="300" type="video/x-ms-wmv">
- <![CDATA[http://smf.blob.core.windows.net/samples/ads/media/XBOX_HD_DEMO_700_2_000_300_4x3.wmv]]>
- </MediaFile>
- <MediaFile apiFramework="Windows Media" id="windows_progressive_500" maintainAspectRatio="true" scalable="true" delivery="progressive" bitrate="500" width="400" height="300" type="video/x-ms-wmv">
- <![CDATA[http://smf.blob.core.windows.net/samples/ads/media/XBOX_HD_DEMO_700_1_000_500_4x3.wmv]]>
- </MediaFile>
- <MediaFile apiFramework="Windows Media" id="windows_progressive_700" maintainAspectRatio="true" scalable="true" delivery="progressive" bitrate="700" width="400" height="300" type="video/x-ms-wmv">
- <![CDATA[http://smf.blob.core.windows.net/samples/ads/media/XBOX_HD_DEMO_700_2_000_700_4x3.wmv]]>
- </MediaFile>
- </MediaFiles>
- </Linear>
- </Creative>
- </Creatives>
- </InLine>
- </Ad>
- </VAST>
- </vmap:VASTData>
- </vmap:AdSource>
- <vmap:TrackingEvents>
- <vmap:Tracking event="breakStart">
- http://MyServer.com/breakstart.gif
- </vmap:Tracking>
- </vmap:TrackingEvents>
- </vmap:AdBreak>
- </vmap:VMAP>
-```
-
-A VMAP file begins with a `<VMAP>` element that contains one or more `<AdBreak>` elements, each defining an ad break. Each ad break specifies a break type, break ID, and time offset. The breakType attribute specifies the type of ad that can be played during the break: linear, nonlinear, or display. Display ads map to VAST companion ads. More than one ad type can be specified in a comma-separated list (no spaces). The breakId is an optional identifier for the ad break. The timeOffset attribute specifies when the ad should be displayed. It can be specified in one of the following ways (a parsing sketch follows this list):
-
-1. Time - in hh:mm:ss or hh:mm:ss.mmm format, where mmm is milliseconds. The value of this attribute specifies the time from the beginning of the video timeline to the beginning of the ad break.
-2. Percentage - in n% format, where n is the percentage of the video timeline to play before playing the ad.
-3. Start/End - specifies that an ad should be displayed before or after the video has been displayed.
-4. Position - specifies the order of ad breaks when the timing of the ad breaks is unknown, such as in live streaming. The order of each ad break is specified in the #n format, where n is an integer of 1 or greater. 1 signifies the ad should be played at the first opportunity, 2 signifies the ad should be played at the second opportunity, and so on.
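-
-As an illustration, a player could interpret the four forms like this (a sketch only; positional #n offsets have no fixed time and are flagged for the caller to handle, and the method name is hypothetical):
-
-```csharp
-using System;
-using System.Globalization;
-
-// Illustrative sketch: resolve a VMAP timeOffset value against the content
-// duration. Positional (#n) offsets have no fixed time and return false.
-static bool TryResolveTimeOffset(string value, TimeSpan contentDuration, out TimeSpan offset)
-{
-    offset = TimeSpan.Zero;
-    if (value == "start") return true;                              // before playback
-    if (value == "end") { offset = contentDuration; return true; }  // after playback
-    if (value.StartsWith("#")) return false;                        // positional, e.g. live
-    if (value.EndsWith("%"))                                        // percentage of timeline
-    {
-        double pct = double.Parse(value.TrimEnd('%'), CultureInfo.InvariantCulture);
-        offset = TimeSpan.FromTicks((long)(contentDuration.Ticks * pct / 100.0));
-        return true;
-    }
-    // hh:mm:ss or hh:mm:ss.mmm
-    return TimeSpan.TryParse(value, CultureInfo.InvariantCulture, out offset);
-}
-```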
-
-Within the `<AdBreak>` element, there can be one <**AdSource**> element. The <**AdSource**> element contains the following attributes:
-
-1. id - specifies an identifier for the ad source.
-2. allowMultipleAds - a Boolean value that specifies whether multiple ads can be displayed during the ad break.
-3. followRedirects - an optional Boolean value that specifies whether the video player should honor redirects within an ad response.
-
-The <**AdSource**> element provides the player with an inline ad response or a reference to an ad response. It can contain one of the following elements:
-
-* `<VASTAdData>` - indicates that a VAST ad response is embedded within the VMAP file
-* `<AdTagURI>` - a URI that references an ad response from another system
-* `<CustomAdData>` - an arbitrary string that represents a non-VAST response
-
-In this example, an in-line ad response is specified with a `<VASTAdData>` element that contains a VAST ad response. For more information about the other elements, see [VMAP](https://www.iab.net/guidelines/508676/digitalvideo/vsuite/vmap).
-
-The <**AdBreak**> element can also contain one <**TrackingEvents**> element. The <**TrackingEvents**> element allows you to track the start or end of an ad break, or whether an error occurred during the ad break. The <**TrackingEvents**> element contains one or more <**Tracking**> elements, each of which specifies a tracking event and a tracking URI. The possible tracking events are:
-
-1. breakStart - tracks the beginning of an ad break.
-2. breakEnd - tracks the completion of an ad break.
-3. error - tracks an error that occurred during the ad break.
-
-The following example shows a VMAP file that specifies tracking events:
-
-```xml
- <vmap:VMAP xmlns:vmap="http://www.iab.net/vmap-1.0" version="1.0">
- <vmap:AdBreak breakType="linear" breakId="mypre" timeOffset="start">
- <vmap:AdSource allowMultipleAds="true" followRedirects="true" id="1">
- <vmap:VASTData>
- <!--Inline VAST -->
- </vmap:VASTData>
- </vmap:AdSource>
- <vmap:TrackingEvents>
- <vmap:Tracking event="breakStart">
- http://MyServer.com/breakstart.gif
- </vmap:Tracking>
-            <vmap:Tracking event="breakEnd">
- http://MyServer.com/breakend.gif
- </vmap:Tracking>
- <vmap:Tracking event="error">
- http://MyServer.com/error.gif
- </vmap:Tracking>
- </vmap:TrackingEvents>
- </vmap:AdBreak>
- </vmap:VMAP>
-```
-
-For more information on the <**TrackingEvents**> element and its children, see http://iab.net/VMAP.pdf
-
-### Using a Media Abstract Sequencing Template (MAST) File
-A MAST file allows you to specify triggers that define when an ad is displayed. The following is an example MAST file that contains triggers for a pre-roll ad, a mid-roll ad, and a post-roll ad:
-
-```xml
-    <MAST xsi:schemaLocation="http://openvideoplayer.sf.net/mast http://openvideoplayer.sf.net/mast/mast.xsd" xmlns="http://openvideoplayer.sf.net/mast" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
- <triggers>
- <trigger id="preroll" description="preroll every item" >
- <startConditions>
- <condition type="event" name="OnItemStart" />
- </startConditions>
- <sources>
- <source uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_linear.xml" format="vast">
- <sources />
- </source>
- </sources>
- </trigger>
-
- <trigger id="midroll" description="midroll at 15 sec." >
- <startConditions>
- <condition type="property" name="Position" value="00:00:15.0" operator="GEQ" />
- </startConditions>
- <endConditions>
- <condition type="event" name="OnItemEnd"/>
- <!--This 'resets' the trigger for the next clip-->
- </endConditions>
- <sources>
- <source uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_linear.xml" format="vast">
- <sources />
- </source>
- </sources>
- </trigger>
-
- <trigger id="postroll" description="postroll" >
- <startConditions>
- <condition type="event" name="OnItemEnd"/>
- </startConditions>
- <sources>
- <source uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_linear.xml" format="vast">
- <sources />
- </source>
- </sources>
- </trigger>
- </triggers>
- </MAST>
-```
-A MAST file begins with a **MAST** element that contains one **triggers** element. The **triggers** element contains one or more **trigger** elements that define when an ad should be played.
-
-The **trigger** element contains a **startConditions** element that specifies when an ad should begin to play. The **startConditions** element contains one or more `<condition>` elements. When a `<condition>` evaluates to true, a trigger is initiated or revoked, depending on whether the `<condition>` is contained within a **startConditions** or an **endConditions** element, respectively. When multiple `<condition>` elements are present, they are treated as an implicit OR: any condition evaluating to true causes the trigger to initiate. `<condition>` elements can be nested. When child `<condition>` elements are present, they are treated as an implicit AND: all conditions must evaluate to true for the trigger to initiate (see the evaluation sketch after the following example). The `<condition>` element contains the following attributes that define the condition:
-
-1. **type** - specifies the type of condition: event or property.
-2. **name** - the name of the property or event to be used during evaluation.
-3. **value** - the value that a property is evaluated against.
-4. **operator** - the operation to use during evaluation: EQ (equal), NEQ (not equal), GTR (greater), GEQ (greater or equal), LT (less than), LEQ (less than or equal), MOD (modulo).
-
-**endConditions** elements also contain `<condition>` elements. When a condition evaluates to true, the trigger is reset. The `<trigger>` element also contains a `<sources>` element that contains one or more `<source>` elements. The `<source>` elements define the URI to the ad response and the type of ad response. In this example, a URI is given to a VAST response:
-
-```xml
- <trigger id="postroll" description="postroll" >
- <startConditions>
- <condition/>
- </startConditions>
- <sources>
- <source uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_linear.xml" format="vast">
- <sources />
- </source>
- </sources>
- </trigger>
-```
-
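-The OR/AND semantics described above can be sketched as a small recursive evaluation. This is illustrative only: `evaluateLeaf` is a placeholder for the player's actual property and event checks, and the namespace is MAST's default namespace from the example above.
-
-```csharp
-using System;
-using System.Linq;
-using System.Xml.Linq;
-
-// Illustrative sketch of MAST condition evaluation: sibling <condition>
-// elements are an implicit OR; nested child conditions are an implicit AND.
-static readonly XNamespace Mast = "http://openvideoplayer.sf.net/mast";
-
-static bool ConditionHolds(XElement condition, Func<XElement, bool> evaluateLeaf)
-{
-    // The condition's own check must pass AND every nested condition must hold.
-    return evaluateLeaf(condition)
-        && condition.Elements(Mast + "condition").All(c => ConditionHolds(c, evaluateLeaf));
-}
-
-static bool TriggerShouldInitiate(XElement startConditions, Func<XElement, bool> evaluateLeaf)
-{
-    // Any sibling condition holding is enough to initiate the trigger (implicit OR).
-    return startConditions.Elements(Mast + "condition").Any(c => ConditionHolds(c, evaluateLeaf));
-}
-```
-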
-### Using Video Player-Ad Interface Definition (VPAID)
-VPAID is an API for enabling executable ad units to communicate with a video player. This allows highly interactive ad experiences. The user can interact with the ad and the ad can respond to actions taken by the viewer. For example, an ad may display buttons that allow the user to view more information or a longer version of the ad. The video player must support the VPAID API and the executable ad must implement the API. When a player requests an ad from an ad server the server may respond with a VAST response that contains a VPAID ad.
-
-An executable ad is created in code that must be executed in a runtime environment such as Adobe Flash™ or JavaScript running in a web browser. When an ad server returns a VAST response containing a VPAID ad, the value of the apiFramework attribute in the `<MediaFile>` element must be "VPAID". This attribute specifies that the contained ad is a VPAID executable ad. The type attribute must be set to the MIME type of the executable, such as "application/x-shockwave-flash" or "application/x-javascript". The following XML snippet shows the `<MediaFile>` element from a VAST response containing a VPAID executable ad.
-
-```xml
- <MediaFiles>
-    <MediaFile id="1" delivery="progressive" type="application/x-shockwave-flash"
- width="640" height="480" apiFramework="VPAID">
- <!-- CDATA wrapped URI to executable ad -->
- </MediaFile>
- </MediaFiles>
-```
-
-An executable ad can be initialized using the `<AdParameters>` element within the `<Linear>` or `<NonLinear>` elements in a VAST response. For more information on the `<AdParameters>` element, see [VAST 3.0](https://www.iab.net/media/file/VASTv3.0.pdf). For more information about the VPAID API, see [VPAID 2.0](https://www.iab.net/media/file/VPAID_2.0_Final_04-10-2012.pdf).
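-
-As a sketch of how a player might detect such ads while parsing a VAST document, the following C# fragment uses LINQ to XML to pick out media files whose apiFramework attribute is "VPAID". The element and attribute names follow the VAST schema shown above; the surrounding code is illustrative.
-
-```csharp
-using System.Linq;
-using System.Xml.Linq;
-
-static class VpaidDetector
-{
-    // Returns the URIs of all <MediaFile> entries in a VAST document
-    // that declare the VPAID api framework.
-    public static string[] FindVpaidMediaFiles(XDocument vast) =>
-        vast.Descendants("MediaFile")
-            .Where(mf => (string)mf.Attribute("apiFramework") == "VPAID")
-            .Select(mf => mf.Value.Trim())   // CDATA-wrapped URI to the executable ad
-            .ToArray();
-}
-```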
-
-## Implementing a Windows or Windows Phone 8 Player with Ad Support
-The Microsoft Media Platform: Player Framework for Windows 8 and Windows Phone 8 contains a collection of sample applications that show you how to implement a video player application using the framework. You can download the Player Framework and the samples from [Player Framework for Windows 8 and Windows Phone 8](https://developerpublish.com/player-framework-for-windows-8-preview-6-released/).
-
-When you open the Microsoft.PlayerFramework.Xaml.Samples solution, you will see a number of folders within the project. The Advertising folder contains the sample code relevant to creating a video player with ad support. Inside the Advertising folder are a number of XAML/cs files, each of which shows how to insert ads in a different way. The following list describes each:
-
-* AdPodPage.xaml: Shows how to display an ad pod.
-* AdSchedulingPage.xaml: Shows how to schedule ads.
-* FreeWheelPage.xaml: Shows how to use the FreeWheel plugin to schedule ads.
-* MastPage.xaml: Shows how to schedule ads with a MAST file.
-* ProgrammaticAdPage.xaml: Shows how to programmatically schedule ads into a video.
-* ScheduleClipPage.xaml: Shows how to schedule an ad without a VAST file.
-* VastLinearCompanionPage.xaml: Shows how to insert a linear and a companion ad.
-* VastNonLinearPage.xaml: Shows how to insert a non-linear ad.
-* VmapPage.xaml: Shows how to specify ads with a VMAP file.
-
-Each of these samples uses the MediaPlayer class defined by the player framework. Most samples use plugins that add support for various ad response formats. The ProgrammaticAdPage sample programmatically interacts with a MediaPlayer instance.
-
-### AdPodPage Sample
-This sample uses the AdSchedulerPlugin to define when to display an ad. In this example, a mid-roll advertisement is scheduled to be played after five seconds. The ad pod (a group of ads to display in order) is specified in a VAST file returned from an ad server. The URI to the VAST file is specified in the `<RemoteAdSource>` element.
-
-```xml
- <mmppf:MediaPlayer x:Name="player" Source="http://smf.blob.core.windows.net/samples/videos/bigbuck.mp4">
-
- <mmppf:MediaPlayer.Plugins>
- <ads:AdSchedulerPlugin>
- <ads:AdSchedulerPlugin.Advertisements>
-
- <ads:MidrollAdvertisement Time="00:00:05">
- <ads:MidrollAdvertisement.Source>
- <ads:RemoteAdSource Uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_adpod.xml" Type="vast"/>
- </ads:MidrollAdvertisement.Source>
- </ads:MidrollAdvertisement>
-
- </ads:AdSchedulerPlugin.Advertisements>
- </ads:AdSchedulerPlugin>
- <ads:AdHandlerPlugin/>
- </mmppf:MediaPlayer.Plugins>
- </mmppf:MediaPlayer>
-```
-
-### AdSchedulingPage
-This sample also uses the AdSchedulerPlugin. It schedules three ads: a pre-roll ad, a mid-roll ad, and a post-roll ad. The URI to the VAST file for each ad is specified in a `<RemoteAdSource>` element.
-
-```xml
- <mmppf:MediaPlayer x:Name="player" Source="http://smf.blob.core.windows.net/samples/videos/bigbuck.mp4">
- <mmppf:MediaPlayer.Plugins>
- <ads:AdSchedulerPlugin>
- <ads:AdSchedulerPlugin.Advertisements>
-
- <ads:PrerollAdvertisement>
- <ads:PrerollAdvertisement.Source>
- <ads:RemoteAdSource Uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_linear.xml" Type="vast"/>
- </ads:PrerollAdvertisement.Source>
- </ads:PrerollAdvertisement>
-
- <ads:MidrollAdvertisement Time="00:00:15">
- <ads:MidrollAdvertisement.Source>
- <ads:RemoteAdSource Uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_linear.xml" Type="vast"/>
- </ads:MidrollAdvertisement.Source>
- </ads:MidrollAdvertisement>
-
- <ads:PostrollAdvertisement>
- <ads:PostrollAdvertisement.Source>
- <ads:RemoteAdSource Uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_linear.xml" Type="vast"/>
- </ads:PostrollAdvertisement.Source>
- </ads:PostrollAdvertisement>
-
- </ads:AdSchedulerPlugin.Advertisements>
- </ads:AdSchedulerPlugin>
- <ads:AdHandlerPlugin/>
- </mmppf:MediaPlayer.Plugins>
- </mmppf:MediaPlayer>
-```
-
-### FreeWheelPage
-This sample uses the FreeWheelPlugin, whose Source attribute specifies the URI of a SmartXML file that contains both ad content and ad scheduling information.
-
-```xml
- <mmppf:MediaPlayer x:Name="player" Source="http://smf.blob.core.windows.net/samples/videos/bigbuck.mp4">
- <mmppf:MediaPlayer.Plugins>
- <ads:FreeWheelPlugin Source="http://smf.blob.core.windows.net/samples/win8/ads/freewheel.xml"/>
- <ads:AdHandlerPlugin/>
- </mmppf:MediaPlayer.Plugins>
- </mmppf:MediaPlayer>
-```
-
-### MastPage
-This sample uses the MastSchedulerPlugin, which allows you to use a MAST file. The Source attribute specifies the location of the MAST file.
-
-```xml
- <mmppf:MediaPlayer x:Name="player" Source="http://smf.blob.core.windows.net/samples/videos/bigbuck.mp4">
- <mmppf:MediaPlayer.Plugins>
- <ads:MastSchedulerPlugin Source="http://smf.blob.core.windows.net/samples/win8/ads/mast.xml" />
- <ads:AdHandlerPlugin/>
- </mmppf:MediaPlayer.Plugins>
- </mmppf:MediaPlayer>
-```
-
-### ProgrammaticAdPage
-This sample programmatically interacts with the MediaPlayer. The ProgrammaticAdPage.xaml file instantiates the MediaPlayer:
-
-```xml
- <mmppf:MediaPlayer x:Name="player" Source="http://smf.blob.core.windows.net/samples/videos/bigbuck.mp4"/>
-```
-
-The ProgrammaticAdPage.xaml.cs file creates an AdHandlerPlugin, adds a TimelineMarker to specify when an ad should be displayed, and then adds a handler for the MarkerReached event that loads a RemoteAdSource specifying a URI to a VAST file, and then plays the ad.
-
-```csharp
- public sealed partial class ProgrammaticAdPage : Microsoft.PlayerFramework.Samples.Common.LayoutAwarePage
- {
- AdHandlerPlugin adHandler;
-
- public ProgrammaticAdPage()
- {
- this.InitializeComponent();
- adHandler = new AdHandlerPlugin();
-            player.Plugins.Add(adHandler);  // register the handler created above
- player.Markers.Add(new TimelineMarker() { Time = TimeSpan.FromSeconds(5), Type = "myAd" });
- player.MarkerReached += pf_MarkerReached;
- }
-
- async void pf_MarkerReached(object sender, TimelineMarkerRoutedEventArgs e)
- {
- if (e.Marker.Type == "myAd")
- {
- var adSource = new RemoteAdSource() { Type = VastAdPayloadHandler.AdType, Uri = new Uri("http://smf.blob.core.windows.net/samples/win8/ads/vast_linear.xml") };
- //var adSource = new AdSource() { Type = DocumentAdPayloadHandler.AdType, Payload = SampleAdDocument };
- var progress = new Progress<AdStatus>();
- try
- {
- await player.PlayAd(adSource, progress, CancellationToken.None);
- }
- catch { /* ignore */ }
- }
-        }
-    }
-```
-
-### ScheduleClipPage
-This sample uses the AdSchedulerPlugin to schedule a mid-roll ad by specifying a .wmv file that contains the ad.
-
-```xml
- <mmppf:MediaPlayer x:Name="player" Source="http://smf.cloudapp.net/html5/media/bigbuck.mp4">
- <mmppf:MediaPlayer.Plugins>
- <ads:AdSchedulerPlugin>
- <ads:AdSchedulerPlugin.Advertisements>
-
- <ads:MidrollAdvertisement Time="00:00:05">
- <ads:MidrollAdvertisement.Source>
- <ads:AdSource Type="clip">
- <ads:AdSource.Payload>
- <ads:ClipAdPayload MediaSource="http://smf.blob.core.windows.net/samples/ads/media/XBOX_HD_DEMO_700_2_000_700_4x3.wmv" MimeType="video/x-ms-wmv" />
- </ads:AdSource.Payload>
- </ads:AdSource>
- </ads:MidrollAdvertisement.Source>
- </ads:MidrollAdvertisement>
-
- </ads:AdSchedulerPlugin.Advertisements>
- </ads:AdSchedulerPlugin>
- <ads:AdHandlerPlugin/>
- </mmppf:MediaPlayer.Plugins>
- </mmppf:MediaPlayer>
-```
-
-### VastLinearCompanionPage
-This sample illustrates how to use the AdSchedulerPlugin to schedule a mid-roll linear ad with a companion ad. The `<RemoteAdSource>` element specifies the location of the VAST file.
-
-```xml
- <mmppf:MediaPlayer Grid.Row="1" x:Name="player" Source="http://smf.blob.core.windows.net/samples/videos/bigbuck.mp4">
- <mmppf:MediaPlayer.Plugins>
- <ads:AdSchedulerPlugin>
- <ads:AdSchedulerPlugin.Advertisements>
-
- <ads:MidrollAdvertisement Time="00:00:05">
- <ads:MidrollAdvertisement.Source>
- <ads:RemoteAdSource Uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_linear_companions.xml" Type="vast"/>
- </ads:MidrollAdvertisement.Source>
- </ads:MidrollAdvertisement>
-
- </ads:AdSchedulerPlugin.Advertisements>
- </ads:AdSchedulerPlugin>
- <ads:AdHandlerPlugin/>
- </mmppf:MediaPlayer.Plugins>
- </mmppf:MediaPlayer>
-```
-
-### VastLinearNonLinearPage
-This sample uses the AdSchedulerPlugin to schedule a linear and a non-linear ad. The VAST file location is specified with the `<RemoteAdSource>` element.
-
-```xml
- <mmppf:MediaPlayer x:Name="player" Source="http://smf.blob.core.windows.net/samples/videos/bigbuck.mp4">
- <mmppf:MediaPlayer.Plugins>
- <ads:AdSchedulerPlugin>
- <ads:AdSchedulerPlugin.Advertisements>
-
- <ads:MidrollAdvertisement Time="00:00:05">
- <ads:MidrollAdvertisement.Source>
- <ads:RemoteAdSource Uri="http://smf.blob.core.windows.net/samples/win8/ads/vast_linear_nonlinear.xml" Type="vast"/>
- </ads:MidrollAdvertisement.Source>
- </ads:MidrollAdvertisement>
-
- </ads:AdSchedulerPlugin.Advertisements>
- </ads:AdSchedulerPlugin>
- <ads:AdHandlerPlugin/>
- </mmppf:MediaPlayer.Plugins>
- </mmppf:MediaPlayer>
-```
-
-### VMAPPage
-This sample uses the VmapSchedulerPlugin to schedule ads using a VMAP file. The URI to the VMAP file is specified in the Source attribute of the `<VmapSchedulerPlugin>` element.
-
-```xml
- <mmppf:MediaPlayer x:Name="player" Source="http://smf.blob.core.windows.net/samples/videos/bigbuck.mp4">
- <mmppf:MediaPlayer.Plugins>
- <ads:VmapSchedulerPlugin Source="http://smf.blob.core.windows.net/samples/win8/ads/vmap.xml"/>
- <ads:AdHandlerPlugin/>
- </mmppf:MediaPlayer.Plugins>
- </mmppf:MediaPlayer>
-```
-
-## Implementing an iOS Video Player with Ad Support
-The Microsoft Media Platform: Player Framework for iOS contains a collection of sample applications that show you how to implement a video player application using the framework. You can download the Player Framework and the samples from [Azure Media Player Framework](https://github.com/CloudMetal/azure-media-player-framework). The GitHub page has a link to a Wiki that contains additional information on the player framework and an introduction to the player sample: [Azure Media Player Wiki](https://github.com/CloudMetal/azure-media-player-framework/wiki/How-to-player-use-azure-media-player-how-to-framework).
-
-### Scheduling Ads with VMAP
-The following example shows how to schedule ads using a VMAP file.
-
-```objc
- // How to schedule an Ad using VMAP.
- //First download the VMAP manifest
-
- if (![framework.adResolver downloadManifest:&manifest withURL:[NSURL URLWithString:@"https://portalvhdsq3m25bf47d15c.blob.core.windows.net/vast/PlayerTestVMAP.xml"]])
- {
- [self logFrameworkError];
- }
- else
- {
- // Schedule a list of ads using the downloaded VMAP manifest
- if (![framework scheduleVMAPWithManifest:manifest])
- {
- [self logFrameworkError];
- }
- }
-```
-
-### Scheduling Ads with VAST
-The following sample shows how to schedule a late binding VAST ad.
-
-```objc
- //Example:3 How to schedule a late binding VAST ad.
- // set the start time for the ad
- adLinearTime.startTime = 13;
- adLinearTime.duration = 0;
- // Specify the URI of the VAST file
- NSString *vastAd1=@"https://portalvhdsq3m25bf47d15c.blob.core.windows.net/vast/PlayerTestVAST.xml";
- // Create an AdInfo object
- AdInfo *vastAdInfo1 = [[[AdInfo alloc] init] autorelease];
- // set URL to VAST file
- vastAdInfo1.clipURL = [NSURL URLWithString:vastAd1];
- // set running time of ad
- vastAdInfo1.mediaTime = [[[MediaTime alloc] init] autorelease];
- vastAdInfo1.mediaTime.clipBeginMediaTime = 0;
- vastAdInfo1.mediaTime.clipEndMediaTime = 10;
- vastAdInfo1.policy = @"policy for late binding VAST";
- // specify ad type
- vastAdInfo1.type = AdType_Midroll;
- vastAdInfo1.appendTo=-1;
- adIndex = 0;
- // schedule ad
- if (![framework scheduleClip:vastAdInfo1 atTime:adLinearTime forType:PlaylistEntryType_VAST andGetClipId:&adIndex])
- {
- [self logFrameworkError];
- }
-```
-
-The following sample shows how to schedule an early binding VAST ad.
-
-```objc
- //Example:4 Schedule an early binding VAST ad
- //Download the VAST file
- if (![framework.adResolver downloadManifest:&manifest withURL:[NSURL URLWithString:@"https://portalvhdsq3m25bf47d15c.blob.core.windows.net/vast/PlayerTestVAST.xml"]])
- {
- [self logFrameworkError];
- }
- else
- {
- adLinearTime.startTime = 7;
- adLinearTime.duration = 0;
-
- // Create AdInfo instance
- AdInfo *vastAdInfo2 = [[[AdInfo alloc] init] autorelease];
- vastAdInfo2.mediaTime = [[[MediaTime alloc] init] autorelease];
- vastAdInfo2.policy = @"policy for early binding VAST";
- // specify ad type
- vastAdInfo2.type = AdType_Midroll;
- vastAdInfo2.appendTo=-1;
- // schedule ad
- if (![framework scheduleVASTClip:vastAdInfo2 withManifest:manifest atTime:adLinearTime andGetClipId:&adIndex])
- {
- [self logFrameworkError];
- }
- }
-```
-
-The following sample shows how to insert an ad using Rough Cut Editing (RCE).
-
-```objc
- //Example:1 How to use RCE.
- // specify manifest for ad content
- NSString *secondContent=@"http://wamsblureg001orig-hs.cloudapp.net/6651424c-a9d1-419b-895c-6993f0f48a26/The%20making%20of%20Microsoft%20Surface-m3u8-aapl.ism/Manifest(format=m3u8-aapl)";
-
- // specify ad length
- mediaTime.currentPlaybackPosition = 0;
- mediaTime.clipBeginMediaTime = 0;
- mediaTime.clipEndMediaTime = 80;
- // append ad content
- if (![framework appendContentClip:[NSURL URLWithString:secondContent] withMediaTime:mediaTime andGetClipId:&clipId])
- {
- [self logFrameworkError];
- }
-```
-
-The following example shows how to schedule an ad pod.
-
-```objc
- //Example:5 Schedule an ad Pod.
- // Set start time for ad
- adLinearTime.startTime = 23;
- adLinearTime.duration = 0;
-
- // Specify URL to content
- NSString *adpodSt1=@"https://portalvhdsq3m25bf47d15c.blob.core.windows.net/asset-e47b43fd-05dc-4587-ac87-5916439ad07f/Windows%208_%20Cliffjumpers.mp4?st=2012-11-28T16%3A31%3A57Z&se=2014-11-28T16%3A31%3A57Z&sr=c&si=2a6dbb1e-f906-4187-a3d3-7e517192cbd0&sig=qrXYZBekqlbbYKqwovxzaVZNLv9cgyINgMazSCbdrfU%3D";
- // Create an AdInfo instance
- AdInfo *adpodInfo1 = [[[AdInfo alloc] init] autorelease];
- // set URI to ad content
- adpodInfo1.clipURL = [NSURL URLWithString:adpodSt1];
- // Set ad running time
- adpodInfo1.mediaTime = [[[MediaTime alloc] init] autorelease];
- adpodInfo1.mediaTime.clipBeginMediaTime = 0;
- adpodInfo1.mediaTime.clipEndMediaTime = 17;
- adpodInfo1.policy = @"policy for ad pod 1";
- // Set ad type
- adpodInfo1.type = AdType_Midroll;
- adpodInfo1.appendTo=-1;
- adIndex = 0;
- // Schedule ad
- if (![framework scheduleClip:adpodInfo1 atTime:adLinearTime forType:PlaylistEntryType_Media andGetClipId:&adIndex])
- {
- [self logFrameworkError];
- }
-```
-
-The following example shows how to schedule a non-sticky mid-roll ad. A non-sticky ad is only played once regardless of any seeking the viewer performs.
-
-```objc
- //Example:6 Schedule a single non sticky mid roll Ad
- // specify URL to content
- NSString *oneTimeAd=@"http://wamsblureg001orig-hs.cloudapp.net/5389c0c5-340f-48d7-90bc-0aab664e5f02/Windows%208_%20You%20and%20Me%20Together-m3u8-aapl.ism/Manifest(format=m3u8-aapl)";
-
- // create an AdInfo instance
- AdInfo *oneTimeInfo = [[[AdInfo alloc] init] autorelease];
- // set URL of ad
- oneTimeInfo.clipURL = [NSURL URLWithString:oneTimeAd];
- oneTimeInfo.mediaTime = [[[MediaTime alloc] init] autorelease];
- oneTimeInfo.mediaTime.clipBeginMediaTime = 0;
- oneTimeInfo.mediaTime.clipEndMediaTime = -1;
- oneTimeInfo.policy = @"policy for one-time ad";
- // set ad start time
- adLinearTime.startTime = 43;
- adLinearTime.duration = 0;
- // set ad type
- oneTimeInfo.type = AdType_Midroll;
- // non sticky ad
- oneTimeInfo.deleteAfterPlayed = YES;
- // schedule ad
- if (![framework scheduleClip:oneTimeInfo atTime:adLinearTime forType:PlaylistEntryType_Media andGetClipId:&adIndex])
- {
- [self logFrameworkError];
- }
-```
-
-The following example shows how to schedule a sticky mid-roll ad. A sticky ad is displayed each time the specified point on the video timeline is reached.
-
-```objc
- //Example:7 Schedule a single sticky mid roll Ad
- NSString *stickyAd=@"http://wamsblureg001orig-hs.cloudapp.net/2e4e7d1f-b72a-4994-a406-810c796fc4fc/The%20Surface%20Movement-m3u8-aapl.ism/Manifest(format=m3u8-aapl)";
- // create AdInfo instance
- AdInfo *stickyAdInfo = [[[AdInfo alloc] init] autorelease];
- // set URI to ad
- stickyAdInfo.clipURL = [NSURL URLWithString:stickyAd];
- stickyAdInfo.mediaTime = [[[MediaTime alloc] init] autorelease];
- stickyAdInfo.mediaTime.clipBeginMediaTime = 0;
- stickyAdInfo.mediaTime.clipEndMediaTime = 15;
- stickyAdInfo.policy = @"policy for sticky mid-roll ad";
- // set ad start time
- adLinearTime.startTime = 64;
- adLinearTime.duration = 0;
- // set ad type
- stickyAdInfo.type = AdType_Midroll;
- stickyAdInfo.deleteAfterPlayed = NO;
- // schedule ad
- if (![framework scheduleClip:stickyAdInfo atTime:adLinearTime forType:PlaylistEntryType_Media andGetClipId:&adIndex])
- {
- [self logFrameworkError];
- }
-```
-
-The following sample shows how to schedule a post-roll ad.
-
-```objc
- //Example:8 Schedule Post Roll Ad
- NSString *postAdURLString=@"http://wamsblureg001orig-hs.cloudapp.net/aa152d7f-3c54-487b-ba07-a58e0e33280b/wp-m3u8-aapl.ism/Manifest(format=m3u8-aapl)";
- // create AdInfo instance
- AdInfo *postAdInfo = [[[AdInfo alloc] init] autorelease];
- postAdInfo.clipURL = [NSURL URLWithString:postAdURLString];
- postAdInfo.mediaTime = [[[MediaTime alloc] init] autorelease];
- postAdInfo.mediaTime.clipBeginMediaTime = 0;
- // set ad length
- postAdInfo.mediaTime.clipEndMediaTime = 45;
- postAdInfo.policy = @"policy for post-roll ad";
- // set ad type
- postAdInfo.type = AdType_Postroll;
- adLinearTime.duration = 0;
- if (![framework scheduleClip:postAdInfo atTime:adLinearTime forType:PlaylistEntryType_Media andGetClipId:&adIndex])
- {
- [self logFrameworkError];
- }
-```
-
-The following sample shows how to schedule a pre-roll ad.
-
-```objc
- //Example:9 Schedule Pre Roll Ad
- NSString *adURLString = @"http://wamsblureg001orig-hs.cloudapp.net/2e4e7d1f-b72a-4994-a406-810c796fc4fc/The%20Surface%20Movement-m3u8-aapl.ism/Manifest(format=m3u8-aapl)";
- AdInfo *adInfo = [[[AdInfo alloc] init] autorelease];
- adInfo.clipURL = [NSURL URLWithString:adURLString];
- adInfo.mediaTime = [[[MediaTime alloc] init] autorelease];
- adInfo.mediaTime.currentPlaybackPosition = 0;
-    adInfo.mediaTime.clipBeginMediaTime = 40; // You can play just a portion of an ad.
- adInfo.mediaTime.clipEndMediaTime = 59;
- adInfo.policy = @"policy for pre-roll ad";
- adInfo.appendTo = -1;
- adInfo.type = AdType_Preroll;
- adLinearTime.duration = 0;
- // schedule ad
- if (![framework scheduleClip:adInfo atTime:adLinearTime forType:PlaylistEntryType_Media andGetClipId:&adIndex])
- {
- [self logFrameworkError];
- }
-```
-
-The following sample shows how to schedule a mid-roll overlay ad.
-
-```objc
- // Example10: Schedule a Mid Roll overlay Ad
- NSString *adURLString = @"https://portalvhdsq3m25bf47d15c.blob.core.windows.net/asset-e47b43fd-05dc-4587-ac87-5916439ad07f/Windows%208_%20Cliffjumpers.mp4?st=2012-11-28T16%3A31%3A57Z&se=2014-11-28T16%3A31%3A57Z&sr=c&si=2a6dbb1e-f906-4187-a3d3-7e517192cbd0&sig=qrXYZBekqlbbYKqwovxzaVZNLv9cgyINgMazSCbdrfU%3D";
- //Create AdInfo instance
- AdInfo *adInfo = [[[AdInfo alloc] init] autorelease];
- adInfo.clipURL = [NSURL URLWithString:adURLString];
- adInfo.mediaTime = [[[MediaTime alloc] init] autorelease];
- adInfo.mediaTime.currentPlaybackPosition = 0;
- adInfo.mediaTime.clipBeginMediaTime = 0;
- // specify ad length
- adInfo.mediaTime.clipEndMediaTime = 20;
- adInfo.policy = @"policy for mid-roll overlay ad";
- adInfo.appendTo = -1;
- // specify ad type
- adInfo.type = AdType_Midroll;
- // specify ad start time & duration
- adLinearTime.startTime = 300;
- adLinearTime.duration = 20;
-    // schedule ad
-    if (![framework scheduleClip:adInfo atTime:adLinearTime forType:PlaylistEntryType_Media andGetClipId:&adIndex])
- {
- [self logFrameworkError];
- }
-```
-
media-services Media Services Java How To Use https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-java-how-to-use.md
- Title: Get started using the Java SDK for Azure Media Services | Microsoft Docs
-description: This tutorial walks you through the steps of implementing a basic Video-on-Demand (VoD) content delivery service with Azure Media Services (AMS) application using Java.
-Previously updated: 03/10/2021
-# Get started with the Java client SDK for Azure Media Services
-
-This tutorial walks you through the steps of implementing a basic video content delivery service with Azure Media Services using the Java client SDK.
-
-## Prerequisites
-
-The following are required to complete this tutorial:
-
-* An Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-* A Media Services account. To create a Media Services account, see [How to Create a Media Services Account](media-services-portal-create-account.md).
-* The current [Azure Media Services Java SDK](https://mvnrepository.com/artifact/com.microsoft.azure/azure-media/latest)
-
-## How to: Import the Azure Media Services Java client SDK package
-
-To start using the Media Services SDK for Java, add a reference to the current version (0.9.8) of the `azure-media` package from the [Azure Media Services Java SDK](https://mvnrepository.com/artifact/com.microsoft.azure/azure-media/latest).
-
-For example, if your build tool is `gradle`, add the following dependency to your `build.gradle` file:
-
-`compile group: 'com.microsoft.azure', name: 'azure-media', version: '0.9.8'`
-
->[!IMPORTANT]
->Starting with `azure-media` package version `0.9.8`, the SDK added support for Azure Active Directory (AAD) authentication and removed support for Azure Access Control Service (ACS) authentication. We recommend that you migrate to the Azure AD authentication model as soon as possible. For details on migration, read the article [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
->[!NOTE]
->You can find the source code of the Azure Media Services Java SDK in our [GitHub repository](https://github.com/Azure/azure-sdk-for-java/tree/0.9/services/azure-media). Make sure to switch to the 0.9 branch, and not the main branch.
-
-## How to: Use Azure Media Services with Java
-
->[!NOTE]
->When your Media Services account is created, a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-The following code shows how to create an asset, upload a media file to the asset, run a job with a task to transform the asset, and create a locator to stream your video.
-
-Set up a Media Services account before using this code. For information about setting up an account, see [How to Create a Media Services Account](media-services-portal-create-account.md).
-
-The code connects to the Azure Media Services API using Azure AD service principal authentication. Create an Azure AD application and specify the values for the following variables in the code:
-* `tenant`: The Azure AD tenant domain where the Azure AD application resides
-* `clientId`: The client ID of the Azure AD application
-* `clientKey`: The client key of the Azure AD application
-* `restApiEndpoint`: The REST API endpoint of the Azure Media Services account
-
-You can create an Azure AD application and obtain the preceding configuration values from the Azure portal. For more information, see the **Service principal authentication** section of [Getting started with Azure AD authentication using the Azure portal](./media-services-portal-get-started-with-aad.md).
-
-The code also relies on a locally stored video file. You must edit the code to provide your own local file to upload.
-
-```java
- import java.io.*;
- import java.net.URI;
- import java.security.NoSuchAlgorithmException;
- import java.util.EnumSet;
- import java.util.concurrent.ExecutorService;
- import java.util.concurrent.Executors;
-
- import com.microsoft.windowsazure.Configuration;
- import com.microsoft.windowsazure.exception.ServiceException;
- import com.microsoft.windowsazure.services.media.MediaConfiguration;
- import com.microsoft.windowsazure.services.media.MediaContract;
- import com.microsoft.windowsazure.services.media.MediaService;
- import com.microsoft.windowsazure.services.media.WritableBlobContainerContract;
- import com.microsoft.windowsazure.services.media.authentication.AzureAdClientSymmetricKey;
- import com.microsoft.windowsazure.services.media.authentication.AzureAdTokenCredentials;
- import com.microsoft.windowsazure.services.media.authentication.AzureAdTokenProvider;
- import com.microsoft.windowsazure.services.media.authentication.AzureEnvironments;
- import com.microsoft.windowsazure.services.media.models.AccessPolicy;
- import com.microsoft.windowsazure.services.media.models.AccessPolicyInfo;
- import com.microsoft.windowsazure.services.media.models.AccessPolicyPermission;
- import com.microsoft.windowsazure.services.media.models.Asset;
- import com.microsoft.windowsazure.services.media.models.AssetFile;
- import com.microsoft.windowsazure.services.media.models.AssetFileInfo;
- import com.microsoft.windowsazure.services.media.models.AssetInfo;
- import com.microsoft.windowsazure.services.media.models.Job;
- import com.microsoft.windowsazure.services.media.models.JobInfo;
- import com.microsoft.windowsazure.services.media.models.JobState;
- import com.microsoft.windowsazure.services.media.models.ListResult;
- import com.microsoft.windowsazure.services.media.models.Locator;
- import com.microsoft.windowsazure.services.media.models.LocatorInfo;
- import com.microsoft.windowsazure.services.media.models.LocatorType;
- import com.microsoft.windowsazure.services.media.models.MediaProcessor;
- import com.microsoft.windowsazure.services.media.models.MediaProcessorInfo;
- import com.microsoft.windowsazure.services.media.models.Task;
-
- public class Program
- {
- // Media Services account credentials configuration
- private static String tenant = "tenant.domain.com";
- private static String clientId = "<client id>";
- private static String clientKey = "<client key>";
- private static String restApiEndpoint = "https://account_name.restv2.region_name.media.azure.net/api/";
-
- // Media Services API
- private static MediaContract mediaService;
-
- // Encoder configuration
- // This is using the default Adaptive Streaming encoding preset.
- // You can choose to use a custom preset, or any other sample defined preset.
- // In addition you can use other processors, like Speech Analyzer, or Redactor if desired.
- private static String preferredEncoder = "Media Encoder Standard";
- private static String encodingPreset = "Adaptive Streaming";
-
- public static void main(String[] args)
- {
- ExecutorService executorService = Executors.newFixedThreadPool(1);
-
- try {
- // Setup Azure AD Service Principal Symmetric Key Credentials
- AzureAdTokenCredentials credentials = new AzureAdTokenCredentials(
- tenant,
- new AzureAdClientSymmetricKey(clientId, clientKey),
- AzureEnvironments.AZURE_CLOUD_ENVIRONMENT);
-
- AzureAdTokenProvider provider = new AzureAdTokenProvider(credentials, executorService);
-
- // Create a new configuration with the credentials
- Configuration configuration = MediaConfiguration.configureWithAzureAdTokenProvider(
- new URI(restApiEndpoint),
- provider);
-
- // Create the media service provisioned with the new configuration
- mediaService = MediaService.create(configuration);
-
- // Upload a local file to an Asset
- AssetInfo uploadAsset = uploadFileAndCreateAsset("Video Name", "C:/path/to/video.mp4");
- System.out.println("Uploaded Asset Id: " + uploadAsset.getId());
-
- // Transform the Asset
- AssetInfo encodedAsset = encode(uploadAsset);
- System.out.println("Encoded Asset Id: " + encodedAsset.getId());
-
- // Create the Streaming Origin Locator
- String url = getStreamingOriginLocator(encodedAsset);
-
- System.out.println("Origin Locator URL: " + url);
- System.out.println("Sample completed!");
-
- } catch (ServiceException se) {
- System.out.println("ServiceException encountered.");
- System.out.println(se.toString());
- } catch (Exception e) {
- System.out.println("Exception encountered.");
- System.out.println(e.toString());
- } finally {
- executorService.shutdown();
- }
- }
-
- private static AssetInfo uploadFileAndCreateAsset(String assetName, String fileName)
- throws ServiceException, FileNotFoundException, NoSuchAlgorithmException {
-
- WritableBlobContainerContract uploader;
- AssetInfo resultAsset;
- AccessPolicyInfo uploadAccessPolicy;
- LocatorInfo uploadLocator = null;
-
- // Create an Asset
- resultAsset = mediaService.create(Asset.create().setName(assetName).setAlternateId("altId"));
- System.out.println("Created Asset " + fileName);
-
- // Create an AccessPolicy that provides Write access for 15 minutes
- uploadAccessPolicy = mediaService
- .create(AccessPolicy.create("uploadAccessPolicy", 15.0, EnumSet.of(AccessPolicyPermission.WRITE)));
-
- // Create a Locator using the AccessPolicy and Asset
- uploadLocator = mediaService
- .create(Locator.create(uploadAccessPolicy.getId(), resultAsset.getId(), LocatorType.SAS));
-
- // Create the Blob Writer using the Locator
- uploader = mediaService.createBlobWriter(uploadLocator);
-
- File file = new File(fileName);
-
- // The local file that will be uploaded to your Media Services account
- InputStream input = new FileInputStream(file);
-
- System.out.println("Uploading " + fileName);
-
- // Upload the local file to the media asset
- uploader.createBlockBlob(file.getName(), input);
-
- // Inform Media Services about the uploaded files
- mediaService.action(AssetFile.createFileInfos(resultAsset.getId()));
- System.out.println("Uploaded Asset File " + fileName);
-
- mediaService.delete(Locator.delete(uploadLocator.getId()));
- mediaService.delete(AccessPolicy.delete(uploadAccessPolicy.getId()));
-
- return resultAsset;
- }
-
- // Create a Job that contains a Task to transform the Asset
- private static AssetInfo encode(AssetInfo assetToEncode)
- throws ServiceException, InterruptedException {
-
- // Retrieve the list of Media Processors that match the name
- ListResult<MediaProcessorInfo> mediaProcessors = mediaService
- .list(MediaProcessor.list().set("$filter", String.format("Name eq '%s'", preferredEncoder)));
-
- // Use the latest version of the Media Processor
- MediaProcessorInfo mediaProcessor = null;
- for (MediaProcessorInfo info : mediaProcessors) {
- if (null == mediaProcessor || info.getVersion().compareTo(mediaProcessor.getVersion()) > 0) {
- mediaProcessor = info;
- }
- }
-
- System.out.println("Using Media Processor: " + mediaProcessor.getName() + " " + mediaProcessor.getVersion());
-
- // Create a task with the specified Media Processor
- String outputAssetName = String.format("%s as %s", assetToEncode.getName(), encodingPreset);
- String taskXml = "<taskBody><inputAsset>JobInputAsset(0)</inputAsset>"
- + "<outputAsset assetCreationOptions=\"0\"" // AssetCreationOptions.None
- + " assetName=\"" + outputAssetName + "\">JobOutputAsset(0)</outputAsset></taskBody>";
-
- Task.CreateBatchOperation task = Task.create(mediaProcessor.getId(), taskXml)
- .setConfiguration(encodingPreset).setName("Encoding");
-
- // Create the Job; this automatically schedules and runs it.
- Job.Creator jobCreator = Job.create()
- .setName(String.format("Encoding %s to %s", assetToEncode.getName(), encodingPreset))
- .addInputMediaAsset(assetToEncode.getId()).setPriority(2).addTaskCreator(task);
- JobInfo job = mediaService.create(jobCreator);
-
- String jobId = job.getId();
- System.out.println("Created Job with Id: " + jobId);
-
- // Check to see if the Job has completed
- checkJobStatus(jobId);
- // Done with the Job
-
- // Retrieve the output Asset
- ListResult<AssetInfo> outputAssets = mediaService.list(Asset.list(job.getOutputAssetsLink()));
- return outputAssets.get(0);
- }
-
- public static String getStreamingOriginLocator(AssetInfo asset) throws ServiceException {
- // Get the .ISM AssetFile
- ListResult<AssetFileInfo> assetFiles = mediaService.list(AssetFile.list(asset.getAssetFilesLink()));
- AssetFileInfo streamingAssetFile = null;
- for (AssetFileInfo file : assetFiles) {
- if (file.getName().toLowerCase().endsWith(".ism")) {
- streamingAssetFile = file;
- break;
- }
- }
-
- AccessPolicyInfo originAccessPolicy;
- LocatorInfo originLocator = null;
-
- // Create a 30-day read only AccessPolicy
- double durationInMinutes = 60 * 24 * 30;
- originAccessPolicy = mediaService.create(
- AccessPolicy.create("Streaming policy", durationInMinutes, EnumSet.of(AccessPolicyPermission.READ)));
-
- // Create a Locator using the AccessPolicy and Asset
- originLocator = mediaService
- .create(Locator.create(originAccessPolicy.getId(), asset.getId(), LocatorType.OnDemandOrigin));
-
- // Create a Smooth Streaming base URL
- return originLocator.getPath() + streamingAssetFile.getName() + "/manifest";
- }
-
- private static void checkJobStatus(String jobId) throws InterruptedException, ServiceException {
- boolean done = false;
- JobState jobState = null;
- while (!done) {
- // Sleep for 5 seconds
- Thread.sleep(5000);
-
- // Query the updated Job state
- jobState = mediaService.get(Job.get(jobId)).getState();
- System.out.println("Job state: " + jobState);
-
- if (jobState == JobState.Finished || jobState == JobState.Canceled || jobState == JobState.Error) {
- done = true;
- }
- }
- }
- }
-```
-
-
-## Additional Resources
-For more information about developing Java apps on Azure, see [Azure Java Developer Center][Azure Java Developer Center] and [Azure for Java developers][Azure for Java developers].
-
-For Media Services Javadoc documentation, see [Azure Libraries for Java documentation][Azure Libraries for Java documentation].
-
-<!-- URLs. -->
-
-[Azure Media Services SDK Maven Package]: https://mvnrepository.com/artifact/com.microsoft.azure/azure-media/latest
-[Azure Java Developer Center]: https://azure.microsoft.com/develop/java/
-[Azure for Java developers]: /java/azure/
-[Media Services Client Development]: /previous-versions/azure/dn223283(v=azure.100)
media-services Media Services Live Encoders Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-live-encoders-overview.md
- Title: Configure on-premises encoders when using Azure Media Services to create multi-bitrate streams | Microsoft Docs
-description: This topic lists on-premises live encoders that you can use to capture your live events and send a single bitrate live stream to AMS channels (that are live encoding enabled) for further processing. The topic links to tutorials that show how to configure listed encoders.
-Previously updated: 03/10/2021
-# How to configure on-premises encoders when using Azure Media Services to create multi-bitrate streams
-
-This topic lists on-premises live encoders that you can use to capture your live events and send a single bitrate live stream to AMS channels (that are live encoding enabled) for further processing. The topic also links to tutorials that show how to configure listed encoders.
-
-> [!NOTE]
-> When streaming via RTMP, check firewall and/or proxy settings to confirm that outbound TCP ports 1935 and 1936 are open.
-
-## Haivision KB Encoder
-For information on how to configure the [Haivision KB Encoder](https://www.haivision.com/products/kb-series/) encoder to send a single bitrate live stream to an AMS Channel, see [Configuring Haivision KB Encoder](media-services-configure-kb-live-encoder.md).
-
-## Telestream Wirecast
-For information on how to configure the [Telestream Wirecast](https://www.telestream.net/wirecast/overview.htm) encoder to send a single bitrate live stream to an AMS Channel, see [Configuring Wirecast](media-services-configure-wirecast-live-encoder.md).
-
-## Elemental Live
-For more information, see [Elemental Live](https://www.elemental.com/products/aws-elemental-appliances-software/#elemental-live).
-
-
-## Next steps
-
-[Live streaming using Azure Media Services to create multi-bitrate streams](media-services-manage-live-encoder-enabled-channels.md).
-
media-services Media Services Live Streaming With Onprem Encoders https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-live-streaming-with-onprem-encoders.md
- Title: Stream live with on-premises encoders that create multi-bitrate streams - Azure | Microsoft Docs
-description: This topic describes how to set up a channel that receives a multi-bitrate live stream from an on-premises encoder.
-Previously updated: 03/10/2021
-# Working with Channels that receive multi-bitrate live stream from on-premises encoders
-
-> [!NOTE]
-> Starting May 12, 2018, live channels will no longer support the RTP/MPEG-2 transport stream ingest protocol. Please migrate from RTP/MPEG-2 to RTMP or fragmented MP4 (Smooth Streaming) ingest protocols.
-
-## Overview
-In Azure Media Services, a *channel* represents a pipeline for processing live-streaming content. A channel receives live input streams in one of two ways:
-
-* An on-premises live encoder sends a multi-bitrate RTMP or Smooth Streaming (fragmented MP4) stream to a channel that is not enabled to perform live encoding with Media Services. The ingested streams pass through the channel without any further processing. This method is called *pass-through*. A live encoder can also send a single-bitrate stream to a channel that is not enabled for live encoding, but we don't recommend that. Media Services delivers the stream to customers who request it.
-
- > [!NOTE]
- > Using a pass-through method is the most economical way to do live streaming.
-
-* An on-premises live encoder sends a single-bitrate stream to the channel that is enabled to perform live encoding with Media Services in one of the following formats: RTMP or Smooth Streaming (fragmented MP4). The channel then performs live encoding of the incoming single-bitrate stream to a multi-bitrate (adaptive) video stream. Media Services delivers the stream to customers who request it.
-
-Starting with the Media Services 2.10 release, when you create a channel, you can specify how you want your channel to receive the input stream. You can also specify whether you want the channel to perform live encoding of your stream. You have two options:
-
-* **Pass Through**: Specify this value if you plan to use an on-premises live encoder that has a multi-bitrate stream (a pass-through stream) as output. In this case, the incoming stream passes through to the output without any encoding. This is the behavior of a channel before the 2.10 release. This article gives details about working with channels of this type.
-* **Live Encoding**: Choose this value if you plan to use Media Services to encode your single-bitrate live stream to a multi-bitrate stream. Leaving a live encoding channel in a **Running** state incurs billing charges. We recommend that you immediately stop your running channels after your live-streaming event is complete to avoid extra hourly charges. Media Services delivers the stream to customers who request it.
-
-> [!NOTE]
-> This article discusses attributes of channels that are not enabled to perform live encoding. For information about working with channels that are enabled to perform live encoding, see [Live streaming using Azure Media Services to create multi-bitrate streams](media-services-manage-live-encoder-enabled-channels.md).
->
->For information about recommended on premises encoders, see [Recommended on premises encoders](media-services-recommended-encoders.md).
-
-The following diagram represents a live-streaming workflow that uses an on-premises live encoder to have multi-bitrate RTMP or fragmented MP4 (Smooth Streaming) streams as output.
-
-![Live workflow][live-overview]
-
-## <a id="scenario"></a>Common live-streaming scenario
-The following steps describe the tasks involved in creating common live-streaming applications. A brief sketch of steps 2 through 7, using the Media Services .NET SDK, follows the list.
-
-1. Connect a video camera to a computer. Start and configure an on-premises live encoder that has a multi-bitrate RTMP or fragmented MP4 (Smooth Streaming) stream as output. For more information, see [Azure Media Services RTMP Support and Live Encoders](https://go.microsoft.com/fwlink/?LinkId=532824).
-
- You can also perform this step after you create your channel.
-2. Create and start a channel.
-
-3. Retrieve the channel ingest URL.
-
- The live encoder uses the ingest URL to send the stream to the channel.
-4. Retrieve the channel preview URL.
-
- Use this URL to verify that your channel is properly receiving the live stream.
-5. Create a program.
-
- When you use the Azure portal, creating a program also creates an asset.
-
-    When you use the .NET SDK or REST, you need to create an asset and specify it when you create a program.
-6. Publish the asset that's associated with the program.
-
- >[!NOTE]
- >When your Azure Media Services account is created, a **default** streaming endpoint is added to your account in the **Stopped** state. The streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-7. Start the program when you're ready to start streaming and archiving.
-
-8. Optionally, the live encoder can be signaled to start an advertisement. The advertisement is inserted in the output stream.
-
-9. Stop the program whenever you want to stop streaming and archiving the event.
-
-10. Delete the program (and optionally delete the asset).
-
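-The following C# fragment sketches steps 2 through 7 with the Media Services .NET SDK (v2). It is a minimal outline rather than a complete application: it assumes an authenticated `CloudMediaContext` named `context`, omits error handling and access-control settings, and the exact creation-option shapes can vary slightly across SDK versions.
-
-```csharp
-using System;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Step 2: create and start a channel (pass-through, RTMP ingest).
-IChannel channel = context.Channels.Create(new ChannelCreationOptions
-{
-    Name = "myChannel",
-    Input = new ChannelInput { StreamingProtocol = StreamingProtocol.RTMP },
-    Preview = new ChannelPreview()
-});
-channel.Start();
-
-// Steps 3 and 4: retrieve the ingest and preview URLs.
-string ingestUrl = channel.Input.Endpoints.First().Url.ToString();
-string previewUrl = channel.Preview.Endpoints.First().Url.ToString();
-
-// Step 5: create an asset and a program that archives into it.
-IAsset asset = context.Assets.Create("myEventArchive", AssetCreationOptions.None);
-IProgram program = channel.Programs.Create(new ProgramCreationOptions
-{
-    Name = "myProgram",
-    ArchiveWindowLength = TimeSpan.FromHours(6),
-    AssetId = asset.Id
-});
-
-// Step 6: publish the asset with an OnDemand streaming locator.
-context.Locators.Create(LocatorType.OnDemandOrigin, asset,
-    AccessPermissions.Read, TimeSpan.FromDays(30));
-
-// Step 7: start streaming and archiving.
-program.Start();
-```
-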
-## <a id="channel"></a>Description of a channel and its related components
-### <a id="channel_input"></a>Channel input (ingest) configurations
-#### <a id="ingest_protocols"></a>Ingest streaming protocol
-Media Services supports ingesting live feeds by using multi-bitrate fragmented MP4 and multi-bitrate RTMP as streaming protocols. When the RTMP ingest streaming protocol is selected, two ingest (input) endpoints are created for the channel:
-
-* **Primary URL**: Specifies the fully qualified URL of the channel's primary RTMP ingest endpoint.
-* **Secondary URL** (optional): Specifies the fully qualified URL of the channel's secondary RTMP ingest endpoint.
-
-Use the secondary URL if you want to improve the durability and fault tolerance of your ingest stream (as well as encoder failover and fault tolerance), especially for the following scenarios:
-
-- Single encoder double-pushing to both primary and secondary URLs:
-
- The main purpose of this scenario is to provide more resiliency to network fluctuations and jitters. Some RTMP encoders don't handle network disconnects well. When a network disconnect happens, an encoder might stop encoding and then not send the buffered data when a reconnect happens. This causes discontinuities and data loss. Network disconnects can happen because of a bad network or maintenance on the Azure side. Primary/secondary URLs reduce the network problems and provide a controlled upgrade process. Each time a scheduled network disconnect happens, Media Services manages the primary and secondary disconnects and provides a delayed disconnect between the two. Encoders then have time to keep sending data and reconnect again. The order of the disconnects can be random, but there will always be a delay between primary/secondary or secondary/primary URLs. In this scenario, the encoder is still the single point of failure.
-
-- Multiple encoders, with each encoder pushing to a dedicated point:
-
-  This scenario provides both encoder and ingest redundancy. In this scenario, encoder1 pushes to the primary URL, and encoder2 pushes to the secondary URL. If one encoder fails, the other can keep sending data. Data redundancy is maintained because Media Services does not disconnect the primary and secondary URLs at the same time. This scenario assumes that the encoders are time synced and provide exactly the same data.
-
-- Multiple encoders double-pushing to both primary and secondary URLs:
-
- In this scenario, both encoders push data to both primary and secondary URLs. This provides the best reliability and fault tolerance, as well as data redundancy. This scenario can tolerate both encoder failures and disconnects, even if one encoder stops working. It assumes that encoders are time synced and provide exactly the same data.
-
-For information about RTMP live encoders, see [Azure Media Services RTMP Support and Live Encoders](https://go.microsoft.com/fwlink/?LinkId=532824).
-
-#### Ingest URLs (endpoints)
-A channel provides an input endpoint (ingest URL) that you specify in the live encoder, so the encoder can push streams to your channels.
-
-You can get the ingest URLs when you create the channel. For you to get these URLs, the channel does not have to be in the **Running** state. When you're ready to start pushing data to the channel, the channel must be in the **Running** state. After the channel starts ingesting data, you can preview your stream through the preview URL.
-
-You have an option of ingesting a fragmented MP4 (Smooth Streaming) live stream over a TLS connection. To ingest over TLS, make sure to update the ingest URL to HTTPS. Currently, you cannot ingest RTMP over TLS.
-
-#### <a id="keyframe_interval"></a>Keyframe interval
-When you're using an on-premises live encoder to generate multi-bitrate stream, the keyframe interval specifies the duration of the group of pictures (GOP) as used by that external encoder. After the channel receives this incoming stream, you can deliver your live stream to client playback applications in any of the following formats: Smooth Streaming, Dynamic Adaptive Streaming over HTTP (DASH), and HTTP Live Streaming (HLS). When you're doing live streaming, HLS is always packaged dynamically. By default, Media Services automatically calculates the HLS segment packaging ratio (fragments per segment) based on the keyframe interval that's received from the live encoder.
-
-The following table shows how the segment duration is calculated:
-
-| Keyframe interval | HLS segment packaging ratio (FragmentsPerSegment) | Example |
-| | | |
-| Less than or equal to 3 seconds |3:1 |If KeyFrameInterval (or GOP) is 2 seconds, the default HLS segment packaging ratio is 3 to 1. This creates a 6-second HLS segment. |
-| 3 to 5 seconds |2:1 |If KeyFrameInterval (or GOP) is 4 seconds, the default HLS segment packaging ratio is 2 to 1. This creates an 8-second HLS segment. |
-| Greater than 5 seconds |1:1 |If KeyFrameInterval (or GOP) is 6 seconds, the default HLS segment packaging ratio is 1 to 1. This creates a 6-second HLS segment. |
-
-You can change the fragments-per-segment ratio by configuring the channel's output and setting FragmentsPerSegment on ChannelOutputHls.
-
-You can also change the keyframe interval value by setting the KeyFrameInterval property on ChannelInput. If you explicitly set KeyFrameInterval, the HLS segment packaging ratio FragmentsPerSegment is calculated via the rules described previously.
-
-If you explicitly set both KeyFrameInterval and FragmentsPerSegment, Media Services uses the values that you set.
-
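-As an illustrative sketch with the .NET SDK (v2), explicitly setting both values might look like the following. The property names match the ChannelInput and ChannelOutputHls types mentioned above, but treat this as a sketch rather than a definitive configuration.
-
-```csharp
-using System;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Sketch: pin both the GOP length and the HLS packaging ratio explicitly.
-var input = new ChannelInput
-{
-    StreamingProtocol = StreamingProtocol.FragmentedMP4,
-    KeyFrameInterval = TimeSpan.FromSeconds(2)   // GOP duration produced by the encoder
-};
-
-var output = new ChannelOutput
-{
-    Hls = new ChannelOutputHls { FragmentsPerSegment = 3 }  // 2-second GOP x 3 = 6-second HLS segments
-};
-```
-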
-#### Allowed IP addresses
-You can define the IP addresses that are allowed to publish video to this channel. An allowed IP address can be specified as one of the following:
-
-* A single IP address (for example, 10.0.0.1)
-* An IP range that uses an IP address and a CIDR subnet mask (for example, 10.0.0.1/22)
-* An IP range that uses an IP address and a dotted decimal subnet mask (for example, 10.0.0.1(255.255.252.0))
-
-If no IP addresses are specified and there's no rule definition, then no IP address is allowed. To allow any IP address, create a rule and set 0.0.0.0/0.
-
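-As a sketch with the .NET SDK (v2), an ingest allow list can be expressed with IPRange entries; the dotted-decimal subnet form above maps to the same prefix length. The names used here are illustrative.
-
-```csharp
-using System.Collections.Generic;
-using System.Net;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Sketch: allow a single host plus a /22 range to publish to the channel.
-var inputAccessControl = new ChannelAccessControl
-{
-    IPAllowList = new List<IPRange>
-    {
-        new IPRange { Name = "encoder1",  Address = IPAddress.Parse("10.0.0.1"), SubnetPrefixLength = 32 },
-        new IPRange { Name = "studioNet", Address = IPAddress.Parse("10.0.0.0"), SubnetPrefixLength = 22 }
-        // To allow any address, use Address = IPAddress.Parse("0.0.0.0") and SubnetPrefixLength = 0.
-    }
-};
-```
-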
-### Channel preview
-#### Preview URLs
-Channels provide a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.
-
-You can get the preview URL when you create the channel. For you to get the URL, the channel does not have to be in the **Running** state. After the channel starts ingesting data, you can preview your stream.
-
-Currently, the preview stream can be delivered only in fragmented MP4 (Smooth Streaming) format, regardless of the specified input type. You can use the [Smooth Streaming Health Monitor](https://playready.directtaps.net/smoothstreaming/) player to test the smooth stream. You can also use a player that's hosted in the Azure portal to view your stream.
-
-#### Allowed IP addresses
-You can define the IP addresses that are allowed to connect to the preview endpoint. If no IP addresses are specified, any IP address is allowed. An allowed IP address can be specified as one of the following:
-
-* A single IP address (for example, 10.0.0.1)
-* An IP range that uses an IP address and a CIDR subnet mask (for example, 10.0.0.1/22)
-* An IP range that uses an IP address and a dotted decimal subnet mask (for example, 10.0.0.1(255.255.252.0))
-
-### Channel output
-For information about channel output, see the [Keyframe interval](#keyframe_interval) section.
-
-### Channel-managed programs
-A channel is associated with programs that you can use to control the publishing and storage of segments in a live stream. Channels manage programs. The channel and program relationship is similar to traditional media, where a channel has a constant stream of content and a program is scoped to some timed event on that channel.
-
-You can specify the number of hours of recorded content to retain for the program by setting the **Archive Window** length. This value can be set from a minimum of 5 minutes to a maximum of 25 hours. The archive window length also dictates the maximum amount of time clients can seek back from the current live position. Programs can run longer than the specified amount of time, but content that falls behind the window length is continuously discarded. The value of this property also determines how long the client manifests can grow.
-
-Each program is associated with an asset that stores the streamed content. An asset is mapped to a block blob container in the Azure storage account, and the files in the asset are stored as blobs in that container. To publish the program so your customers can view the stream, you must create an OnDemand locator for the associated asset. You can use this locator to build a streaming URL that you can provide to your clients.
-
-A channel supports up to three concurrently running programs, so you can create multiple archives of the same incoming stream. You can publish and archive different parts of an event as needed. For example, imagine that your business requirement is to archive 6 hours of a program, but to broadcast only the last 10 minutes. To accomplish this, you need to create two concurrently running programs. One program is set to archive six hours of the event, but the program is not published. The other program is set to archive for 10 minutes, and this program is published.
-
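-A sketch of that example with the .NET SDK (v2) follows; it assumes `context` and a running `channel` already exist, gives each program its own archive asset, and publishes only the 10-minute program.
-
-```csharp
-using System;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Sketch: two concurrent programs over the same incoming stream.
-IAsset fullArchive = context.Assets.Create("event-full", AssetCreationOptions.None);
-IProgram archiveProgram = channel.Programs.Create(new ProgramCreationOptions
-{
-    Name = "sixHourArchive",
-    ArchiveWindowLength = TimeSpan.FromHours(6),   // retained but not published
-    AssetId = fullArchive.Id
-});
-
-IAsset liveWindow = context.Assets.Create("event-live", AssetCreationOptions.None);
-IProgram liveProgram = channel.Programs.Create(new ProgramCreationOptions
-{
-    Name = "tenMinuteWindow",
-    ArchiveWindowLength = TimeSpan.FromMinutes(10),
-    AssetId = liveWindow.Id
-});
-
-// Publish only the 10-minute program's asset.
-context.Locators.Create(LocatorType.OnDemandOrigin, liveWindow,
-    AccessPermissions.Read, TimeSpan.FromDays(30));
-
-archiveProgram.Start();
-liveProgram.Start();
-```
-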
-You should not reuse existing programs for new events. Instead, create a new program for each event. Start the program when you're ready to start streaming and archiving. Stop the program whenever you want to stop streaming and archiving the event.
-
-To delete archived content, stop and delete the program, and then delete the associated asset. An asset cannot be deleted if a program uses it. The program must be deleted first.
-
-Even after you stop and delete the program, users can stream your archived content as a video on demand, until you delete the asset. If you want to retain the archived content but not have it available for streaming, delete the streaming locator.
-
-## <a id="states"></a>Channel states and billing
-Possible values for the current state of a channel include:
-
-* **Stopped**: This is the initial state of the channel after its creation. In this state, the channel properties can be updated but streaming is not allowed.
-* **Starting**: The channel is being started. No updates or streaming is allowed during this state. If an error occurs, the channel returns to the **Stopped** state.
-* **Running**: The channel can process live streams.
-* **Stopping**: The channel is being stopped. No updates or streaming is allowed during this state.
-* **Deleting**: The channel is being deleted. No updates or streaming is allowed during this state.
-
-The following table shows how channel states map to the billing mode.
-
-| Channel state | Portal UI indicators | Billed? |
-| | | |
-| **Starting** |**Starting** |No (transient state) |
-| **Running** |**Ready** (no running programs)<p><p>or<p>**Streaming** (at least one running program) |Yes |
-| **Stopping** |**Stopping** |No (transient state) |
-| **Stopped** |**Stopped** |No |
-
-## <a id="cc_and_ads"></a>Closed captioning and ad insertion
-The following table demonstrates supported standards for closed captioning and ad insertion.
-
-| Standard | Notes |
-| | |
-| CEA-708 and EIA-608 (708/608) |CEA-708 and EIA-608 are closed-captioning standards for the United States and Canada.<p><p>Currently, captioning is supported only if carried in the encoded input stream. You need to use a live media encoder that can insert 608 or 708 captions in the encoded stream that's sent to Media Services. Media Services delivers the content with inserted captions to your viewers. |
-| TTML inside .ismt (Smooth Streaming text tracks) |Media Services dynamic packaging enables your clients to stream content in any of the following formats: DASH, HLS, or Smooth Streaming. However, if you ingest fragmented MP4 (Smooth Streaming) with captions inside .ismt (Smooth Streaming text tracks), you can deliver the stream to only Smooth Streaming clients. |
-| SCTE-35 |SCTE-35 is a digital signaling system that's used to cue advertising insertion. Downstream receivers use the signal to splice advertising into the stream for the allotted time. SCTE-35 must be sent as a sparse track in the input stream.<p><p>Currently, the only supported input stream format that carries ad signals is fragmented MP4 (Smooth Streaming). The only supported output format is also Smooth Streaming. |
-
-## <a id="considerations"></a>Considerations
-When you're using an on-premises live encoder to send a multi-bitrate stream to a channel, the following constraints apply:
-
-* Make sure you have sufficient free Internet connectivity to send data to the ingest points.
-* Using a secondary ingest URL requires additional bandwidth.
-* The incoming multi-bitrate stream can have a maximum of 10 video quality levels (layers) and a maximum of 5 audio tracks.
-* The highest average bitrate for any of the video quality levels should be below 10 Mbps.
-* The aggregate of the average bitrates for all the video and audio streams should be below 25 Mbps.
-* You cannot change the input protocol while the channel or its associated programs are running. If you require different protocols, you should create separate channels for each input protocol.
-* You can ingest a single bitrate in your channel. But because the channel does not process the stream, the client applications will also receive a single bitrate stream. (We don't recommend this option.)
-
-Here are other considerations related to working with channels and related components:
-
-* Every time you reconfigure the live encoder, call the **Reset** method on the channel. Before you reset the channel, you have to stop the program. After you reset the channel, restart the program (see the sketch after this list).
-
- > [!NOTE]
- > When you restart the program, you need to associate it with a new asset and create a new locator.
-
-* A channel can be stopped only when it's in the **Running** state and all programs on the channel have been stopped.
-* By default, you can add only five channels to your Media Services account. For more information, see [Quotas and limitations](media-services-quotas-and-limitations.md).
-* You are billed only when your channel is in the **Running** state. For more information, see the [Channel states and billing](media-services-live-streaming-with-onprem-encoders.md#states) section.
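-
-The following C# sketch illustrates that reset sequence. It is a minimal sketch, not the definitive method: it assumes `_context` is a `CloudMediaContext` authenticated with the v2 SDK, and the channel name is illustrative.
-
-```cs
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Minimal sketch: reset a channel after reconfiguring the live encoder.
-// Assumes _context is an authenticated CloudMediaContext; "mychannel" is illustrative.
-IChannel channel = _context.Channels.Where(c => c.Name == "mychannel").First();
-
-// Stop every running program before resetting the channel.
-foreach (IProgram program in channel.Programs.ToList())
-{
-    if (program.State == ProgramState.Running)
-        program.Stop();
-}
-
-channel.Reset();
-// After the reset, restart the program against a new asset and a new locator.
-```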
-
-## Related topics
-[Recommended on-premises encoders](media-services-recommended-encoders.md)
-
-[Azure Media Services fragmented MP4 live ingest specification](media-services-fmp4-live-ingest-overview.md)
-
-[Azure Media Services overview and common scenarios](media-services-overview.md)
-
-[Media Services concepts](media-services-concepts.md)
-
-[live-overview]: ./media/media-services-manage-channels-overview/media-services-live-streaming-current.png
media-services Media Services Manage Channels Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-manage-channels-overview.md
- Title: Overview of Live Streaming using Azure Media Services | Microsoft Docs
-description: This article gives an overview of live streaming using Microsoft Azure Media Services.
-Previously updated: 03/10/2021
-# Overview of Live Streaming using Media Services
-
-## Overview
-
-When you deliver live streaming events with Azure Media Services, the following components are commonly involved:
-
-* A camera that is used to broadcast an event.
-* A live video encoder that converts signals from the camera to streams that are sent to a live streaming service.
-
-    Optionally, multiple time-synchronized live encoders. For critical live events that demand high availability and a high quality of experience, we recommend employing active-active redundant encoders with time synchronization to achieve seamless failover with no data loss.
-* A live streaming service that enables you to do the following:
-
-    * Ingest live content by using various live streaming protocols (for example, RTMP or Smooth Streaming).
-    * (Optionally) encode your stream into an adaptive bitrate stream.
-    * Preview your live stream.
-    * Record and store the ingested content so it can be streamed later (video on demand).
-    * Deliver the content through common streaming protocols (for example, MPEG-DASH, Smooth Streaming, HLS) directly to your customers, or to a content delivery network (CDN) for further distribution.
-
-**Microsoft Azure Media Services** (AMS) provides the ability to ingest, encode, preview, store, and deliver your live streaming content.
-
-With Media Services, you can take advantage of [dynamic packaging](media-services-dynamic-packaging-overview.md), which allows you to broadcast your live streams in MPEG DASH, HLS, and Smooth Streaming formats from the contribution feed that is being sent to the service. Your viewers can play back the live stream with any HLS, DASH, or Smooth Streaming compatible players. You can use Azure Media Player in your web or mobile applications to deliver your stream in any of these protocols.
-
-> [!NOTE]
-> Starting May 12, 2018, live channels will no longer support the RTP/MPEG-2 transport stream ingest protocol. Please migrate from RTP/MPEG-2 to RTMP or fragmented MP4 (Smooth Streaming) ingest protocols.
-
-## Streaming Endpoints, Channels, Programs
-
-In Azure Media Services, **Channels**, **Programs**, and **StreamingEndpoints** handle all the live streaming functionality, including ingest, formatting, DVR, security, scalability, and redundancy.
-
-A **Channel** represents a pipeline for processing live streaming content. A Channel can receive live input streams in the following ways:
-
-* An on-premises live encoder sends a multi-bitrate **RTMP** or **Smooth Streaming** (fragmented MP4) stream to the Channel that is configured for **pass-through** delivery. With **pass-through** delivery, the ingested streams pass through **Channel**s without any further processing. You can use the following live encoders that output multi-bitrate Smooth Streaming: MediaExcel, Ateme, Imagine Communications, Envivio, Cisco, and Elemental. The following live encoders output RTMP: Telestream Wirecast, Haivision, and Teradek encoders. A live encoder can also send a single bitrate stream to a channel that is not enabled for live encoding, but that is not recommended. When requested, Media Services delivers the stream to customers.
-
- > [!NOTE]
- > Using a pass-through method is the most economical way to do live streaming when you are doing multiple events over a long period of time, and you have already invested in on-premises encoders. See [pricing](https://azure.microsoft.com/pricing/details/media-services/) details.
- >
- >
-* An on-premises live encoder sends a single-bitrate stream to the Channel that is enabled to perform live encoding with Media Services in one of the following formats: RTMP or Smooth Streaming (fragmented MP4). The following live encoders with RTMP output are known to work with channels of this type: Telestream Wirecast. The Channel then performs live encoding of the incoming single bitrate stream to a multi-bitrate (adaptive) video stream. When requested, Media Services delivers the stream to customers.
-
-Starting with the Media Services 2.10 release, when you create a Channel, you can specify how you want your channel to receive the input stream and whether you want the channel to perform live encoding of your stream. You have two options:
-
-* **None** (pass-through) - Specify this value if you plan to use an on-premises live encoder that outputs a multi-bitrate stream (a pass-through stream). In this case, the incoming stream passes through to the output without any encoding. This is the behavior of a Channel prior to the 2.10 release.
-* **Standard** - Choose this value if you plan to use Media Services to encode your single bitrate live stream to a multi-bitrate stream. This method is more economical for scaling up quickly for infrequent events. Be aware that live encoding has a billing impact: leaving a live encoding channel in the "Running" state incurs hourly charges. It is recommended that you immediately stop your running channels after your live streaming event is complete to avoid extra hourly charges.
-
-## Comparison of Channel Types
-
-The following table compares the two Channel types supported in Media Services.
-
-| Feature | Pass-through Channel | Standard Channel |
-| | | |
-| Single bitrate input is encoded into multiple bitrates in the cloud |No |Yes |
-| Maximum resolution, number of layers |1080p, 8 layers, 60+fps |720p, 6 layers, 30 fps |
-| Input protocols |RTMP, Smooth Streaming |RTMP, Smooth Streaming |
-| Price |See the [pricing page](https://azure.microsoft.com/pricing/details/media-services/) and click on "Live Video" tab |See the [pricing page](https://azure.microsoft.com/pricing/details/media-services/) |
-| Maximum run time |24x7 |8 hours |
-| Support for inserting slates |No |Yes |
-| Support for ad signaling |No |Yes |
-| Pass-through CEA 608/708 captions |Yes |Yes |
-| Support for non-uniform input GOPs |Yes |No - input must be fixed 2-second GOPs |
-| Support for variable frame rate input |Yes |No - input must be a fixed frame rate.<br/>Minor variations are tolerated, for example, during high motion scenes. But the encoder cannot drop to 10 frames/sec. |
-| Auto-shutoff of Channels when input feed is lost |No |After 12 hours, if there is no Program running |
-
-## Working with Channels that receive multi-bitrate live stream from on-premises encoders (pass-through)
-
-The following diagram shows the major parts of the AMS platform that are involved in the **pass-through** workflow.
-
-![Diagram that shows the major parts of the A M S platform for the "pass-through" workflow.](./media/media-services-live-streaming-workflow/media-services-live-streaming-current.png)
-
-For more information, see [Working with Channels that Receive Multi-bitrate Live Stream from On-premises Encoders](media-services-live-streaming-with-onprem-encoders.md).
-
-## Working with Channels that are enabled to perform live encoding with Azure Media Services
-
-The following diagram shows the major parts of the AMS platform that are involved in Live Streaming workflow where a Channel is enabled to perform live encoding with Media Services.
-
-![Live workflow](./media/media-services-live-streaming-workflow/media-services-live-streaming-new.png)
-
-For more information, see [Working with Channels that are Enabled to Perform Live Encoding with Azure Media Services](media-services-manage-live-encoder-enabled-channels.md).
-
-## Description of a Channel and its related components
-
-### Channel
-
-In Media Services, [Channel](/rest/api/media/operations/channel)s are responsible for processing live streaming content. A Channel provides an input endpoint (ingest URL) that you then provide to a live transcoder. The channel receives live input streams from the live transcoder and makes them available for streaming through one or more StreamingEndpoints. Channels also provide a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.
-
-You can get the ingest URL and the preview URL when you create the channel. To get these URLs, the channel does not have to be in the started state. When you are ready to start pushing data from a live transcoder into the channel, the channel must be started. Once the live transcoder starts ingesting data, you can preview your stream.
-
-Each Media Services account can contain multiple Channels, multiple Programs, and multiple StreamingEndpoints. Depending on the bandwidth and security needs, StreamingEndpoint services can be dedicated to one or more channels. Any StreamingEndpoint can pull from any Channel.
-
-When creating a Channel, you can specify allowed IP addresses in one of the following formats: a single IPv4 address (for example, 10.0.0.1) or a CIDR address range (for example, 10.0.0.1/22).
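-
-For illustration, here is a minimal C# sketch (v2 .NET SDK) that creates a pass-through Channel with an allow list and then reads its ingest and preview URLs. It assumes an authenticated `CloudMediaContext` in `_context`; the channel and range names are placeholders.
-
-```cs
-using System.Collections.Generic;
-using System.Linq;
-using System.Net;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Sketch: 0.0.0.0/0 allows any IP address; narrow this in production.
-var allowAll = new List<IPRange>
-{
-    new IPRange { Name = "allow-all", Address = IPAddress.Parse("0.0.0.0"), SubnetPrefixLength = 0 }
-};
-
-ChannelCreationOptions options = new ChannelCreationOptions
-{
-    Name = "mychannel",                      // illustrative name
-    EncodingType = ChannelEncodingType.None, // pass-through
-    Input = new ChannelInput
-    {
-        StreamingProtocol = StreamingProtocol.RTMP,
-        AccessControl = new ChannelAccessControl { IPAllowList = allowAll }
-    },
-    Preview = new ChannelPreview
-    {
-        AccessControl = new ChannelAccessControl { IPAllowList = allowAll }
-    }
-};
-
-IChannel channel = _context.Channels.Create(options);
-string ingestUrl = channel.Input.Endpoints.First().Url.ToString();
-string previewUrl = channel.Preview.Endpoints.First().Url.ToString();
-```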
-
-### Program
-A [Program](/rest/api/media/operations/program) enables you to control the publishing and storage of segments in a live stream. Channels manage Programs. The Channel and Program relationship is very similar to traditional media where a channel has a constant stream of content and a program is scoped to some timed event on that channel.
-You can specify the number of hours you want to retain the recorded content for the program by setting the **ArchiveWindowLength** property. This value can be set from a minimum of 5 minutes to a maximum of 25 hours.
-
-ArchiveWindowLength also dictates the maximum amount of time clients can seek back in time from the current live position. Programs can run over the specified amount of time, but content that falls behind the window length is continuously discarded. The value of this property also determines how long the client manifests can grow.
-
-Each program is associated with an Asset. To publish the program you must create a locator for the associated asset. Having this locator will enable you to build a streaming URL that you can provide to your clients.
-
-A channel supports up to three concurrently running programs, so you can create multiple archives of the same incoming stream. This allows you to publish and archive different parts of an event as needed. For example, suppose your business requirement is to archive 6 hours of a program but to broadcast only the last 10 minutes. To accomplish this, you need to create two concurrently running programs. One program is set to archive 6 hours of the event, but that program is not published. The other program is set to archive for 10 minutes, and that program is published.
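-
-A minimal sketch of creating and publishing one such program, assuming `_context` is an authenticated `CloudMediaContext` and `channel` is a started `IChannel`; names and durations are illustrative.
-
-```cs
-using System;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Minimal sketch: create a program with a 6-hour archive window and publish its asset.
-IAsset asset = _context.Assets.Create("event-archive", AssetCreationOptions.None);
-IProgram program = channel.Programs.Create("event-archive", TimeSpan.FromHours(6), asset.Id);
-program.Start();
-
-// Publishing: an OnDemandOrigin locator is the basis for the streaming URL.
-IAccessPolicy policy = _context.AccessPolicies.Create("Streaming policy",
-    TimeSpan.FromDays(30), AccessPermissions.Read);
-ILocator locator = _context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset, policy);
-```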
-
-## Billing Implications
-A channel begins billing as soon as its state transitions to "Running" via the API.
-
-The following table shows how Channel states map to billing states in the API and Azure portal. Note that the states are slightly different between the API and Portal UX. As soon as a channel is in the "Running" state via the API, or in the "Ready" or "Streaming" state in the Azure portal, billing will be active.
-
-To stop further billing, you have to stop the Channel via the API or in the Azure portal.
-You are responsible for stopping your channels when you are done with them. Failure to stop a channel results in continued billing; the following sketch shows the stop sequence.
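-
-A minimal sketch of that stop sequence, assuming an authenticated `CloudMediaContext` in `_context`; the channel name is illustrative.
-
-```cs
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Minimal sketch: stop a channel so that billing stops. A channel can be
-// stopped only from the Running state, after all of its programs are stopped.
-IChannel channel = _context.Channels.Where(c => c.Name == "mychannel").First();
-foreach (IProgram program in channel.Programs.ToList())
-{
-    if (program.State == ProgramState.Running)
-        program.Stop();
-}
-if (channel.State == ChannelState.Running)
-    channel.Stop();
-```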
-
-> [!NOTE]
-> When working with Standard channels, AMS automatically shuts off any Channel that is still in the "Running" state 12 hours after the input feed is lost, if there are no Programs running. However, you will still be billed for the time the Channel was in the "Running" state.
->
->
-
-### <a id="states"></a>Channel states and how they map to the billing mode
-Possible values for the current state of a Channel include:
-
-* **Stopped**. This is the initial state of the Channel after its creation (unless autostart was selected in the portal). No billing occurs in this state. In this state, the Channel properties can be updated but streaming is not allowed.
-* **Starting**. The Channel is being started. No billing occurs in this state. No updates or streaming is allowed during this state. If an error occurs, the Channel returns to the Stopped state.
-* **Running**. The Channel can process live streams. Billing is active in this state; you must stop the channel to prevent further billing.
-* **Stopping**. The Channel is being stopped. No billing occurs in this transient state. No updates or streaming is allowed during this state.
-* **Deleting**. The Channel is being deleted. No billing occurs in this transient state. No updates or streaming is allowed during this state.
-
-The following table shows how Channel states map to the billing mode.
-
-| Channel state | Portal UI Indicators | Billed? |
-| | | |
-| Starting |Starting |No (transient state) |
-| Running |Ready (no running programs)<br/>or<br/>Streaming (at least one running program) |Yes |
-| Stopping |Stopping |No (transient state) |
-| Stopped |Stopped |No |
-
-## Related topics
-[Azure Media Services Fragmented MP4 Live Ingest Specification](media-services-fmp4-live-ingest-overview.md)
-
-[Working with Channels that are Enabled to Perform Live Encoding with Azure Media Services](media-services-manage-live-encoder-enabled-channels.md)
-
-[Working with Channels that Receive Multi-bitrate Live Stream from On-premises Encoders](media-services-live-streaming-with-onprem-encoders.md)
-
-[Quotas and limitations](media-services-quotas-and-limitations.md)
-
-[Media Services Concepts](media-services-concepts.md)
media-services Media Services Manage Encoding Speed https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-manage-encoding-speed.md
- Title: Manage speed and concurrency of your encoding with Azure Media Services | Microsoft Docs
-description: This article gives a brief overview of how you can manage speed and concurrency of your encoding jobs/tasks with Azure Media Services.
-Previously updated: 03/10/2021
-# Manage speed and concurrency of your encoding
-
-This article gives a brief overview of how you can manage speed and concurrency of your encoding jobs/tasks.
-
-## Overview
-
-In Media Services, a **Reserved Unit Type** determines the speed with which your media processing tasks are processed. You can pick between the following reserved unit types: **S1**, **S2**, or **S3**. For example, the same encoding job runs faster when you use the **S2** reserved unit type compared to the **S1** type. The [scaling encoding units](media-services-scale-media-processing-overview.md) topic shows a table that helps you decide between the different encoding speeds.
-
-In addition to specifying the reserved unit type, you can provision your account with **Reserved Units**. The number of provisioned reserved units determines the number of media tasks that can be processed concurrently in a given account. For example, if your account has five reserved units, then five media tasks run concurrently as long as there are tasks to be processed. The remaining tasks wait in the queue and get picked up for processing sequentially as running tasks finish. If an account does not have any reserved units provisioned, tasks are picked up sequentially. In this case, the wait time between one task finishing and the next one starting depends on the availability of resources in the system.
-
-For detailed information and examples that show how to scale encoding units, see the [Scale media processing](media-services-scale-media-processing-overview.md) topic.
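-
-As a sketch of the calls involved (v2 .NET SDK), assuming an authenticated `CloudMediaContext` in `_context`:
-
-```cs
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Minimal sketch: set the reserved unit type (speed) and the number of
-// reserved units (concurrency) for the account.
-IEncodingReservedUnit reservedUnit = _context.EncodingReservedUnits.FirstOrDefault();
-reservedUnit.ReservedUnitType = ReservedUnitType.S2; // speed: S1, S2, or S3
-reservedUnit.CurrentReservedUnits = 5;               // up to 5 concurrent tasks
-reservedUnit.Update();
-```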
-
-## Next step
-
-[Scale encoding units](media-services-scale-media-processing-overview.md)
-
media-services Media Services Manage Live Encoder Enabled Channels https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-manage-live-encoder-enabled-channels.md
- Title: Live streaming using Azure Media Services to create multi-bitrate streams | Microsoft Docs
-description: This topic describes how to set up a Channel that receives a single bitrate live stream from an on-premises encoder and then performs live encoding to adaptive bitrate stream with Media Services.
-Previously updated: 03/10/2021
-# Live streaming using Azure Media Services to create multi-bitrate streams
-
-> [!NOTE]
-> Starting May 12, 2018, live channels will no longer support the RTP/MPEG-2 transport stream ingest protocol. Please migrate from RTP/MPEG-2 to RTMP or fragmented MP4 (Smooth Streaming) ingest protocols.
-
-## Overview
-In Azure Media Services (AMS), a **Channel** represents a pipeline for processing live streaming content. A **Channel** receives live input streams in one of two ways:
-
-* An on-premises live encoder sends a single-bitrate stream to the Channel that is enabled to perform live encoding with Media Services in one of the following formats: RTMP or Smooth Streaming (Fragmented MP4). The Channel then performs live encoding of the incoming single bitrate stream to a multi-bitrate (adaptive) video stream. When requested, Media Services delivers the stream to customers.
-* An on-premises live encoder sends a multi-bitrate **RTMP** or **Smooth Streaming** (Fragmented MP4) stream to the Channel that is not enabled to perform live encoding with AMS. The ingested streams pass through **Channel**s without any further processing. This method is called **pass-through**. You can use the following live encoders that output multi-bitrate Smooth Streaming: MediaExcel, Ateme, Imagine Communications, Envivio, Cisco, and Elemental. The following live encoders output RTMP: [Telestream Wirecast](media-services-configure-wirecast-live-encoder.md), Haivision, and Teradek encoders. A live encoder can also send a single bitrate stream to a channel that is not enabled for live encoding, but that is not recommended. When requested, Media Services delivers the stream to customers.
-
- > [!NOTE]
- > Using a pass-through method is the most economical way to do live streaming.
- >
- >
-
-Starting with the Media Services 2.10 release, when you create a Channel, you can specify how you want your channel to receive the input stream and whether you want the channel to perform live encoding of your stream. You have two options:
-
-* **None** - Specify this value if you plan to use an on-premises live encoder that outputs a multi-bitrate stream (a pass-through stream). In this case, the incoming stream passes through to the output without any encoding. This is the behavior of a Channel prior to the 2.10 release. For more detailed information about working with channels of this type, see [Live streaming with on-premises encoders that create multi-bitrate streams](media-services-live-streaming-with-onprem-encoders.md).
-* **Standard** - Choose this value if you plan to use Media Services to encode your single bitrate live stream to a multi-bitrate stream. Be aware that live encoding has a billing impact: leaving a live encoding channel in the "Running" state incurs hourly charges. It is recommended that you immediately stop your running channels after your live streaming event is complete to avoid extra hourly charges.
-
-> [!NOTE]
-> This topic discusses attributes of channels that are enabled to perform live encoding (**Standard** encoding type). For information about working with channels that are not enabled to perform live encoding, see [Live streaming with on-premises encoders that create multi-bitrate streams](media-services-live-streaming-with-onprem-encoders.md).
->
-> Make sure to review the [Considerations](media-services-manage-live-encoder-enabled-channels.md#Considerations) section.
->
->
-
-## Billing Implications
-A live encoding channel begins billing as soon as its state transitions to "Running" via the API. You can also view the state in the Azure portal, or in the Azure Media Services Explorer tool (https://aka.ms/amse).
-
-The following table shows how Channel states map to billing states in the API and Azure portal. The states are slightly different between the API and the portal UX. As soon as a channel is in the "Running" state via the API, or in the "Ready" or "Streaming" state in the Azure portal, billing is active.
-To stop further billing, you have to stop the Channel via the API or in the Azure portal.
-You are responsible for stopping your channels when you are done with the live encoding channel. Failure to stop an encoding channel results in continued billing.
-
-### <a id="states"></a>Channel states and how they map to the billing mode
-Possible values for the current state of a Channel include:
-
-* **Stopped**. This is the initial state of the Channel after its creation (unless autostart was selected in the portal). No billing occurs in this state. In this state, the Channel properties can be updated but streaming is not allowed.
-* **Starting**. The Channel is being started. No billing occurs in this state. No updates or streaming is allowed during this state. If an error occurs, the Channel returns to the Stopped state.
-* **Running**. The Channel can process live streams. Billing is active in this state; you must stop the channel to prevent further billing.
-* **Stopping**. The Channel is being stopped. No billing occurs in this transient state. No updates or streaming is allowed during this state.
-* **Deleting**. The Channel is being deleted. No billing occurs in this transient state. No updates or streaming is allowed during this state.
-
-The following table shows how Channel states map to the billing mode.
-
-| Channel state | Portal UI Indicators | Billed? |
-| | | |
-| Starting |Starting |No (transient state) |
-| Running |Ready (no running programs)<br/>or<br/>Streaming (at least one running program) |Yes |
-| Stopping |Stopping |No (transient state) |
-| Stopped |Stopped |No |
-
-### Automatic shut-off for unused Channels
-Starting January 25, 2016, Media Services rolled out an update that automatically stops a Channel (with live encoding enabled) after it has been running in an unused state for an extended period. This applies to Channels that have no active Programs and that have not received an input contribution feed for an extended period of time.
-
-The threshold for an unused period is nominally 12 hours, but is subject to change.
-
-## Live Encoding Workflow
-The following diagram represents a live streaming workflow where a channel receives a single bitrate stream in one of the following protocols: RTMP or Smooth Streaming; it then encodes the stream to a multi-bitrate stream.
-
-![Live workflow][live-overview]
-
-## <a id="scenario"></a>Common Live Streaming Scenario
-The following are the general steps involved in creating common live streaming applications; a condensed code sketch follows these steps.
-
-> [!NOTE]
-> Currently, the max recommended duration of a live event is 8 hours.
->
-> There is a billing impact for live encoding: leaving a live encoding channel in the "Running" state incurs hourly billing charges. Stop your running channels immediately after your live streaming event is complete to avoid extra hourly charges.
-
-1. Connect a video camera to a computer. Launch and configure an on-premises live encoder that can output a **single** bitrate stream in one of the following protocols: RTMP or Smooth Streaming.
-
- This step could also be performed after you create your Channel.
-2. Create and start a Channel.
-3. Retrieve the Channel ingest URL.
-
- The ingest URL is used by the live encoder to send the stream to the Channel.
-4. Retrieve the Channel preview URL.
-
- Use this URL to verify that your channel is properly receiving the live stream.
-5. Create a program.
-
- When using the Azure portal, creating a program also creates an asset.
-
-    When using the .NET SDK or REST, you need to create an asset and specify that asset when creating a Program.
-6. Publish the asset associated with the program.
-
- >[!NOTE]
-    >When your AMS account is created, a **default** streaming endpoint is added to your account in the **Stopped** state. The streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-7. Start the program when you are ready to start streaming and archiving.
-8. Optionally, the live encoder can be signaled to start an advertisement. The advertisement is inserted in the output stream.
-9. Stop the program whenever you want to stop streaming and archiving the event.
-10. Delete the Program (and optionally delete the asset).
-
-> [!NOTE]
-> It is very important to stop a live encoding Channel when you are done with it. Live encoding incurs hourly billing charges while the channel is in the "Running" state, so stop your running channels immediately after your live streaming event is complete to avoid extra hourly charges.
->
->
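-
-The following C# sketch condenses these steps for a **Standard** channel. It is illustrative only: it assumes an authenticated `CloudMediaContext` in `_context` and a `ChannelCreationOptions` value (`options`) configured with `EncodingType = ChannelEncodingType.Standard`; names and durations are placeholders.
-
-```cs
-using System;
-using System.Linq;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-IChannel channel = _context.Channels.Create(options);                  // step 2
-channel.Start();                                                       // billing starts here
-
-string ingestUrl = channel.Input.Endpoints.First().Url.ToString();    // step 3
-string previewUrl = channel.Preview.Endpoints.First().Url.ToString(); // step 4
-
-IAsset asset = _context.Assets.Create("live-archive", AssetCreationOptions.None); // step 5
-IProgram program = channel.Programs.Create("live-event", TimeSpan.FromHours(8), asset.Id);
-
-IAccessPolicy policy = _context.AccessPolicies.Create("Streaming",     // step 6
-    TimeSpan.FromDays(30), AccessPermissions.Read);
-_context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset, policy);
-
-program.Start();                                                       // step 7
-// ... the live event runs ...
-program.Stop();                                                        // step 9
-channel.Stop();                                                        // stop billing
-```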
-
-## <a id="channel"></a>Channel's input (ingest) configurations
-### <a id="Ingest_Protocols"></a>Ingest streaming protocol
-If the **Encoder Type** is set to **Standard**, valid options are:
-
-* Single bitrate **RTMP**
-* Single bitrate **Fragmented MP4** (Smooth Streaming)
-
-#### <a id="single_bitrate_RTMP"></a>Single bitrate RTMP
-Considerations:
-
-* The incoming stream cannot contain multi-bitrate video
-* The video stream should have an average bitrate below 15 Mbps
-* The audio stream should have an average bitrate below 1 Mbps
-* The following codecs are supported:
-  * MPEG-4 AVC / H.264 video:
-    * Baseline, Main, and High Profiles (8-bit 4:2:0)
-    * High 10 Profile (10-bit 4:2:0)
-    * High 422 Profile (10-bit 4:2:2)
-  * MPEG-2 AAC-LC audio:
-    * Mono, stereo, surround (5.1, 7.1)
-    * 44.1 kHz sampling rate
-    * MPEG-2 style ADTS packaging
-* Recommended encoders include:
-  * [Telestream Wirecast](media-services-configure-wirecast-live-encoder.md)
-  * Flash Media Live Encoder
-
-#### Single bitrate Fragmented MP4 (Smooth Streaming)
-Typical use case:
-
-Use on-premises live encoders from vendors like Elemental Technologies, Ericsson, Ateme, Envivio to send the input stream over the open internet to a nearby Azure data center.
-
-Considerations:
-
-Same as for [single bitrate RTMP](media-services-manage-live-encoder-enabled-channels.md#single_bitrate_RTMP).
-
-#### Other considerations
-* You cannot change the input protocol while the Channel or its associated programs are running. If you require different protocols, you should create separate channels for each input protocol.
-* The maximum resolution for the incoming video stream is 1920x1080, at up to 60 fields/second if interlaced or 30 frames/second if progressive.
-
-### Ingest URLs (endpoints)
-A Channel provides an input endpoint (ingest URL) that you specify in the live encoder, so the encoder can push streams to your Channels.
-
-You can get the ingest URLs once you create a Channel. To get these URLs, the Channel does not have to be in the **Running** state. When you are ready to start pushing data into the Channel, it must be in the **Running** state. Once the Channel starts ingesting data, you can preview your stream through the preview URL.
-
-You have the option of ingesting a Fragmented MP4 (Smooth Streaming) live stream over a TLS connection. To ingest over TLS, make sure to update the ingest URL to HTTPS. Currently, AMS doesn't support TLS with custom domains.
-
-### Allowed IP addresses
-You can define the IP addresses that are allowed to publish video to this channel. Allowed IP addresses can be specified as either a single IP address (for example, '10.0.0.1'), an IP range using an IP address and a CIDR subnet mask (for example, '10.0.0.1/22'), or an IP range using an IP address and a dotted decimal subnet mask (for example, '10.0.0.1(255.255.252.0)').
-
-If no IP addresses are specified and there is no rule definition, then no IP address is allowed. To allow any IP address, create a rule and set 0.0.0.0/0.
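-
-For illustration, here is how those formats might look as v2 .NET SDK `IPRange` values. The SDK expresses a range as an address plus a subnet prefix length, so a dotted-decimal mask such as 255.255.252.0 corresponds to /22; the range names are placeholders.
-
-```cs
-using System.Collections.Generic;
-using System.Net;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Sketch: the allow-list formats described above.
-var allowList = new List<IPRange>
-{
-    new IPRange { Name = "single", Address = IPAddress.Parse("10.0.0.1"), SubnetPrefixLength = 32 }, // one address
-    new IPRange { Name = "range",  Address = IPAddress.Parse("10.0.0.1"), SubnetPrefixLength = 22 }, // 10.0.0.1/22
-    new IPRange { Name = "any",    Address = IPAddress.Parse("0.0.0.0"),  SubnetPrefixLength = 0  }  // allow all
-};
-```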
-
-## Channel preview
-### Preview URLs
-Channels provide a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.
-
-You can get the preview URL when you create the channel. To get the URL, the channel does not have to be in the **Running** state.
-
-Once the Channel starts ingesting data, you can preview your stream.
-
-> [!NOTE]
-> Currently, the preview stream can be delivered only in Fragmented MP4 (Smooth Streaming) format, regardless of the specified input type. You can use a player hosted in the Azure portal to view your stream.
->
->
-
-### Allowed IP Addresses
-You can define the IP addresses that are allowed to connect to the preview endpoint. If no IP addresses are specified, any IP address is allowed. Allowed IP addresses can be specified as either a single IP address (for example, '10.0.0.1'), an IP range using an IP address and a CIDR subnet mask (for example, '10.0.0.1/22'), or an IP range using an IP address and a dotted decimal subnet mask (for example, '10.0.0.1(255.255.252.0)').
-
-## Live encoding settings
-This section describes how the settings for the live encoder within the Channel can be adjusted, when the **Encoding Type** of a Channel is set to **Standard**.
-
-> [!NOTE]
-> Your contribution feed can contain only a single audio track; ingesting multiple audio tracks is currently not supported. When doing live streaming with [on-premises live encoders](media-services-live-streaming-with-onprem-encoders.md), you can send a contribution feed in the Smooth Streaming protocol containing multiple audio tracks.
->
->
-
-### Ad marker source
-You can specify the source for ad marker signals. The default value is **Api**, which indicates that the live encoder within the Channel should listen to an asynchronous **Ad Marker API**.
-
-### CEA 708 Closed Captions
-An optional flag that tells the live encoder to ignore any CEA 708 caption data embedded in the incoming video. When the flag is set to false (the default), the encoder detects and reinserts CEA 708 data into the output video streams.
-
-#### Index
-It is recommended to send in a single program transport stream (SPTS). If the input stream contains multiple programs, the live encoder within the Channel parses the Program Map Table (PMT) in the input, identifies the inputs that have a stream type name of MPEG-2 AAC ADTS or AC-3 System-A or AC-3 System-B or MPEG-2 Private PES or MPEG-1 Audio or MPEG-2 Audio, and arranges them in the order specified in the PMT. The zero-based index is then used to pick up the n-th entry in that arrangement.
-
-#### Language
-The language identifier of the audio stream, conforming to ISO 639-2, such as ENG. If not present, the default is UND (undefined).
-
-### <a id="preset"></a>System Preset
-Specifies the preset to be used by the live encoder within this Channel. Currently, the only allowed value is **Default720p** (default).
-
-**Default720p** will encode the video into the following 6 layers.
-
-#### Output Video Stream
-
-| BitRate | Width | Height | MaxFPS | Profile | Output Stream Name |
-| | | | | | |
-| 3500 |1280 |720 |30 |High |Video_1280x720_3500kbps |
-| 2200 |960 |540 |30 |High |Video_960x540_2200kbps |
-| 1350 |704 |396 |30 |High |Video_704x396_1350kbps |
-| 850 |512 |288 |30 |High |Video_512x288_850kbps |
-| 550 |384 |216 |30 |High |Video_384x216_550kbps |
-| 200 |340 |192 |30 |High |Video_340x192_200kbps |
-
-#### Output Audio Stream
-
-Audio is encoded to stereo AAC-LC at 128 kbps, sampling rate of 48 kHz.
-
-## Signaling Advertisements
-When your Channel has Live Encoding enabled, you have a component in your pipeline that is processing video and can manipulate it. You can signal the Channel to insert slates and/or advertisements into the outgoing adaptive bitrate stream. Slates are still images that you can use to cover up the input live feed in certain cases (for example, during a commercial break). Advertising signals are time-synchronized signals that you embed into the outgoing stream to tell the video player to take a special action, such as switching to an advertisement at the appropriate time. See this [blog](https://codesequoia.wordpress.com/2014/02/24/understanding-scte-35/) for an overview of the SCTE-35 signaling mechanism used for this purpose. Below is a typical scenario you could implement in your live event.
-
-1. Have your viewers get a PRE-EVENT image before the event starts.
-2. Have your viewers get a POST-EVENT image after the event ends.
-3. Have your viewers get an ERROR-EVENT image if there is a problem during the event (for example, power failure in the stadium).
-4. Send an AD-BREAK image to hide the live event feed during a commercial break.
-
-The following are the properties you can set when signaling advertisements.
-
-### Duration
-The duration, in seconds, of the commercial break. This has to be a non-zero positive value in order to start the commercial break. If a commercial break is in progress and the duration is set to zero with a CueId that matches the ongoing break, that break is canceled.
-
-### CueId
-A unique ID for the commercial break, used by downstream applications to take appropriate action. It needs to be a positive integer. You can set this value to any random positive integer or use an upstream system to track the cue IDs. Make certain to normalize any IDs to positive integers before submitting them through the API.
-
-### Show slate
-Optional. Signals the live encoder to switch to the [default slate](media-services-manage-live-encoder-enabled-channels.md#default_slate) image during a commercial break and hide the incoming video feed. Audio is also muted during slate. Default is **false**.
-
-The image used will be the one specified via the default slate asset Id property at the time of the channel creation.
-The slate will be stretched to fit the display image size.
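-
-A minimal sketch of signaling an ad break with these properties, assuming a running Standard channel in `channel` and the v2 SDK's `StartAdvertisement`/`EndAdvertisement` channel operations; the cue ID is illustrative.
-
-```cs
-using System;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Minimal sketch: signal a 60-second ad break, showing the default slate.
-int cueId = 14265; // illustrative positive integer
-channel.StartAdvertisement(TimeSpan.FromSeconds(60), cueId, true); // duration, CueId, show slate
-
-// To cancel the ongoing break early, end the advertisement for the same cue.
-channel.EndAdvertisement(cueId);
-```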
-
-## Insert Slate images
-The live encoder within the Channel can be signaled to switch to a slate image. It can also be signaled to end an on-going slate.
-
-The live encoder can be configured to switch to a slate image and hide the incoming video signal in certain situations (for example, during an ad break). If such a slate is not configured, input video is not masked during that ad break.
-
-### Duration
-The duration of the slate in seconds. This has to be a non-zero positive value in order to start the slate. If there is an ongoing slate and a duration of zero is specified, that ongoing slate is terminated.
-
-### Insert slate on ad marker
-When set to true, this setting configures the live encoder to insert a slate image during an ad break. The default value is true.
-
-### <a id="default_slate"></a>Default slate Asset Id
-
-Optional. Specifies the Asset Id of the Media Services Asset which contains the slate image. Default is null.
-
-> [!NOTE]
-> Before creating the Channel, the slate image with the following constraints should be uploaded as a dedicated asset (no other files should be in this asset). This image is used only when the live encoder is inserting a slate due to an ad break, or has been explicitly signaled to insert a slate.
-> There is currently no option to use a custom image when the live encoder enters such an 'input signal lost' state. You can vote for this feature [here](https://feedback.azure.com/d365community/idea/b249e4f3-0d25-ec11-b6e6-000d3a4f09d0).
-
-* At most 1920x1080 in resolution.
-* At most 3 Mbytes in size.
-* The file name must have a *.jpg extension.
-* The image must be uploaded into an Asset as the only AssetFile in that Asset and this AssetFile should be marked as the primary file. The Asset cannot be storage encrypted.
-
-If the **default slate Asset Id** is not specified, and **insert slate on ad marker** is set to **true**, a default Azure Media Services image will be used to hide the input video stream. Audio is also muted during slate.
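-
-A minimal sketch of explicit slate control, assuming a running Standard channel in `channel`, a dedicated slate asset (uploaded as described above) in `slateAsset`, and the v2 SDK's `ShowSlate`/`HideSlate` channel operations:
-
-```cs
-using System;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-// Minimal sketch: show the slate for five minutes, then end it early.
-channel.ShowSlate(TimeSpan.FromMinutes(5), slateAsset.Id);
-channel.HideSlate();
-```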
-
-## Channel's programs
-A channel is associated with programs that enable you to control the publishing and storage of segments in a live stream. Channels manage Programs. The Channel and Program relationship is very similar to traditional media where a Channel has a constant stream of content and a program is scoped to some timed event on that Channel.
-
-You can specify the number of hours you want to retain the recorded content for the program by setting the **Archive Window** length. This value can be set from a minimum of 5 minutes to a maximum of 25 hours. The archive window length also dictates the maximum amount of time clients can seek back from the current live position. Programs can run over the specified amount of time, but content that falls behind the window length is continuously discarded. The value of this property also determines how long the client manifests can grow.
-
-Each program is associated with an Asset which stores the streamed content. An asset is mapped to a block blob container in the Azure Storage account and the files in the asset are stored as blobs in that container. To publish the program so your customers can view the stream you must create an OnDemand locator for the associated asset. Having this locator will enable you to build a streaming URL that you can provide to your clients.
-
-A Channel supports up to three concurrently running programs so you can create multiple archives of the same incoming stream. This allows you to publish and archive different parts of an event as needed. For example, your business requirement is to archive 6 hours of a program, but to broadcast only last 10 minutes. To accomplish this, you need to create two concurrently running programs. One program is set to archive 6 hours of the event but the program is not published. The other program is set to archive for 10 minutes and this program is published.
-
-You should not reuse existing programs for new events. Instead, create and start a new program for each event as described in the Programming Live Streaming Applications section.
-
-Start the program when you are ready to start streaming and archiving. Stop the program whenever you want to stop streaming and archiving the event.
-
-To delete archived content, stop and delete the program and then delete the associated asset. An asset cannot be deleted if it is used by a program; the program must be deleted first.
-
-Even after you stop and delete the program, users can stream your archived content as a video on demand until you delete the asset.
-
-If you want to retain the archived content but not have it available for streaming, delete the streaming locator.
-
-## Getting a thumbnail preview of a live feed
-When Live Encoding is enabled, you can now get a preview of the live feed as it reaches the Channel. This can be a valuable tool to check whether your live feed is actually reaching the Channel.
-
-## <a id="states"></a>Channel states and how states map to the billing mode
-Possible values for the current state of a Channel include:
-
-* **Stopped**. This is the initial state of the Channel after its creation. In this state, the Channel properties can be updated but streaming is not allowed.
-* **Starting**. The Channel is being started. No updates or streaming is allowed during this state. If an error occurs, the Channel returns to the Stopped state.
-* **Running**. The Channel is capable of processing live streams.
-* **Stopping**. The Channel is being stopped. No updates or streaming is allowed during this state.
-* **Deleting**. The Channel is being deleted. No updates or streaming is allowed during this state.
-
-The following table shows how Channel states map to the billing mode.
-
-| Channel state | Portal UI Indicators | Billed? |
-| | | |
-| Starting |Starting |No (transient state) |
-| Running |Ready (no running programs)<br/>or<br/>Streaming (at least one running program) |Yes |
-| Stopping |Stopping |No (transient state) |
-| Stopped |Stopped |No |
-
-> [!NOTE]
-> Currently, starting a Channel takes about 2 minutes on average, but at times it can take 20 minutes or more. Channel resets can take up to 5 minutes.
->
->
-
-## <a id="Considerations"></a>Considerations
-* When a Channel of **Standard** encoding type experiences a loss of the input source/contribution feed, it compensates by replacing the source video/audio with an error slate and silence. The Channel continues to emit a slate until the input/contribution feed resumes. We recommend that a live channel not be left in such a state for longer than 2 hours. Beyond that point, the behavior of the Channel on input reconnection is not guaranteed, nor is its behavior in response to a Reset command. You will have to stop the Channel, delete it, and create a new one.
-* You cannot change the input protocol while the Channel or its associated programs are running. If you require different protocols, you should create separate channels for each input protocol.
-* Every time you reconfigure the live encoder, call the **Reset** method on the channel. Before you reset the channel, you have to stop the program. After you reset the channel, restart the program.
-* A channel can be stopped only when it is in the Running state, and all programs on the channel have been stopped.
-* By default, you can add only 5 channels to your Media Services account. This is a soft quota on all new accounts. For more information, see [Quotas and Limitations](media-services-quotas-and-limitations.md).
-* You are billed only when your Channel is in the **Running** state. For more information, see the [Channel states](media-services-manage-live-encoder-enabled-channels.md#states) section.
-* Currently, the max recommended duration of a live event is 8 hours.
-* Make sure to have the streaming endpoint from which you want to stream content in the **Running** state.
-* The encoding preset uses the notion of "max frame rate" of 30 fps. So if the input is 60fps/59.94i, the input frames are dropped/de-interlaced to 30/29.97 fps. If the input is 50fps/50i, the input frames are dropped/de-interlaced to 25 fps. If the input is 25 fps, output remains at 25 fps.
-* Don't forget to stop your channels when you are done. If you don't, billing continues.
-
-## Known Issues
-* Channel startup time has been improved to an average of 2 minutes, but at times of increased demand it can still take 20 minutes or more.
-* Slate images should conform to restrictions described [here](media-services-manage-live-encoder-enabled-channels.md#default_slate). If you attempt to create a Channel with a default slate that is larger than 1920x1080, the request will eventually error out.
-* Once again: don't forget to stop your channels when you are done streaming. If you don't, billing continues.
-
-## Need help?
-
-You can open a support ticket by navigating to [New support request](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest)
-
-
-## Related topics
-[Delivering Live Streaming Events with Azure Media Services](media-services-overview.md)
-
-[Create channels that perform live encoding from a single bitrate to an adaptive bitrate stream with the Portal](media-services-portal-creating-live-encoder-enabled-channel.md)
-
-[Create channels that perform live encoding from a single bitrate to an adaptive bitrate stream with the .NET SDK](media-services-dotnet-creating-live-encoder-enabled-channel.md)
-
-[Manage channels with REST API](/rest/api/media/operations/channel)
-
-[Media Services Concepts](media-services-concepts.md)
-
-[Azure Media Services Fragmented MP4 Live Ingest Specification](media-services-fmp4-live-ingest-overview.md)
-
-[live-overview]: ./media/media-services-manage-live-encoder-enabled-channels/media-services-live-streaming-new.png
media-services Media Services Managing Multiple Storage Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-managing-multiple-storage-accounts.md
- Title: Managing Media Services assets across multiple storage accounts | Microsoft Docs
-description: This article gives you guidance on how to manage Media Services assets across multiple storage accounts.
-Previously updated: 03/10/2021
-# Managing Media Services assets across multiple storage accounts
-
-You can attach multiple storage accounts to a single Media Services account. The ability to attach multiple storage accounts provides the following benefits:
-
-* Load balancing your assets across multiple storage accounts.
-* Scaling Media Services for large amounts of content processing (as currently a single storage account has a max limit of 500 TB).
-
-This article demonstrates how to attach multiple storage accounts to a Media Services account using [Azure Resource Manager APIs](/rest/api/media/operations/azure-media-services-rest-api-reference) and [PowerShell](/powershell/module/az.media). It also shows how to specify different storage accounts when creating assets using the Media Services SDK.
-
-## Considerations
-
-When attaching multiple storage accounts to your Media Services account, the following considerations apply:
-
-* The Media Services account and all associated storage accounts must be in the same Azure subscription. It is recommended to use storage accounts in the same location as the Media Services account.
-* Once a storage account is attached to the specified Media Services account, it cannot be detached.
-* The primary storage account is the one specified when the Media Services account is created. Currently, you cannot change the default storage account.
-* If you want to add a Cool Storage account to the AMS account, the storage account must be a Blob type and set to non-primary.
-
-Other considerations:
-
-Media Services uses the value of the **IAssetFile.Name** property when building URLs for the streaming content (for example, http://{WAMSAccount}.origin.mediaservices.windows.net/{GUID}/{IAssetFile.Name}/streamingParameters). For this reason, percent-encoding is not allowed. The value of the Name property cannot have any of the following [percent-encoding-reserved characters](https://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters): !*'();:@&=+$,/?%#[]". Also, there can be only one '.' in the file name, for the file name extension.
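-
-As an illustration, here is a hypothetical pre-upload check for these naming constraints; the helper name and logic are ours, not part of the SDK.
-
-```cs
-using System.Linq;
-
-// Hypothetical helper: rejects names containing percent-encoding-reserved
-// characters or more than one '.' (only the extension separator is allowed).
-static bool IsValidAssetFileName(string name)
-{
-    const string reserved = "!*'();:@&=+$,/?%#[]\"";
-    return name.IndexOfAny(reserved.ToCharArray()) < 0
-        && name.Count(c => c == '.') <= 1;
-}
-```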
-
-## To attach storage accounts
-
-To attach storage accounts to your AMS account, use [Azure Resource Manager APIs](/rest/api/media/operations/azure-media-services-rest-api-reference) and [PowerShell](/powershell/module/az.media), as shown in the following example:
-
-```azurepowershell
-$regionName = "West US"
-$subscriptionId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
-$resourceGroupName = "SkyMedia-USWest-App"
-$mediaAccountName = "sky"
-$storageAccount1Name = "skystorage1"
-$storageAccount2Name = "skystorage2"
-$storageAccount1Id = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Storage/storageAccounts/$storageAccount1Name"
-$storageAccount2Id = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Storage/storageAccounts/$storageAccount2Name"
-$storageAccount1 = New-AzMediaServiceStorageConfig -StorageAccountId $storageAccount1Id -IsPrimary
-$storageAccount2 = New-AzMediaServiceStorageConfig -StorageAccountId $storageAccount2Id
-$storageAccounts = @($storageAccount1, $storageAccount2)
-
-Set-AzMediaService -ResourceGroupName $resourceGroupName -AccountName $mediaAccountName -StorageAccounts $storageAccounts
-```
-
-### Support for Cool Storage
-
-Currently, if you want to add a Cool Storage account to the AMS account, the storage account must be a Blob type and set to non-primary.
-
-## To manage Media Services assets across multiple Storage Accounts
-The following code uses the latest Media Services SDK to perform the following tasks:
-
-1. Display all the storage accounts associated with the specified Media Services account.
-2. Retrieve the name of the default storage account.
-3. Create a new asset in the default storage account.
-4. Create an output asset of the encoding job in the specified storage account.
-
-```cs
-using Microsoft.WindowsAzure.MediaServices.Client;
-using System;
-using System.Collections.Generic;
-using System.Configuration;
-using System.IO;
-using System.Linq;
-using System.Text;
-using System.Threading;
-using System.Threading.Tasks;
-
-namespace MultipleStorageAccounts
-{
- class Program
- {
- // Location of the media file that you want to encode.
- private static readonly string _singleInputFilePath =
- Path.GetFullPath(@"../..\supportFiles\multifile\interview2.wmv");
-
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static CloudMediaContext _context;
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Display the storage accounts associated with
- // the specified Media Services account:
- foreach (var sa in _context.StorageAccounts)
- Console.WriteLine(sa.Name);
-
- // Retrieve the name of the default storage account.
- var defaultStorageName = _context.StorageAccounts.Where(s => s.IsDefault == true).FirstOrDefault();
- Console.WriteLine("Name: {0}", defaultStorageName.Name);
- Console.WriteLine("IsDefault: {0}", defaultStorageName.IsDefault);
-
- // Retrieve the name of a storage account that is not the default one.
- var notDefaultStorageName = _context.StorageAccounts.Where(s => s.IsDefault == false).FirstOrDefault();
- Console.WriteLine("Name: {0}", notDefaultStorageName.Name);
- Console.WriteLine("IsDefault: {0}", notDefaultStorageName.IsDefault);
-
- // Create the original asset in the default storage account.
- IAsset asset = CreateAssetAndUploadSingleFile(AssetCreationOptions.None,
- defaultStorageName.Name, _singleInputFilePath);
- Console.WriteLine("Created the asset in the {0} storage account", asset.StorageAccountName);
-
- // Create an output asset of the encoding job in the other storage account.
- IAsset outputAsset = CreateEncodingJob(asset, notDefaultStorageName.Name, _singleInputFilePath);
- if (outputAsset != null)
- Console.WriteLine("Created the output asset in the {0} storage account", outputAsset.StorageAccountName);
-
- }
-
- static public IAsset CreateAssetAndUploadSingleFile(AssetCreationOptions assetCreationOptions, string storageName, string singleFilePath)
- {
- var assetName = "UploadSingleFile_" + DateTime.UtcNow.ToString();
-
- // If you are creating an asset in the default storage account, you can omit the StorageName parameter.
- var asset = _context.Assets.Create(assetName, storageName, assetCreationOptions);
-
- var fileName = Path.GetFileName(singleFilePath);
-
- var assetFile = asset.AssetFiles.Create(fileName);
-
- Console.WriteLine("Created assetFile {0}", assetFile.Name);
-
- assetFile.Upload(singleFilePath);
-
- Console.WriteLine("Done uploading {0}", assetFile.Name);
-
- return asset;
- }
-
- static IAsset CreateEncodingJob(IAsset asset, string storageName, string inputMediaFilePath)
- {
- // Declare a new job.
- IJob job = _context.Jobs.Create("My encoding job");
- // Get a media processor reference, and pass to it the name of the
- // processor to use for the specific task.
- IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
- // Create a task with the encoding details, using a string preset.
- ITask task = job.Tasks.AddNew("My encoding task",
- processor,
- "Adaptive Streaming",
- Microsoft.WindowsAzure.MediaServices.Client.TaskOptions.ProtectedConfiguration);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(asset);
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is not encrypted.
- task.OutputAssets.AddNew("Output asset", storageName,
- AssetCreationOptions.None);
-
- // Use the following event handler to check job progress.
- job.StateChanged += new
- EventHandler<JobStateChangedEventArgs>(StateChanged);
-
- // Launch the job.
- job.Submit();
-
- // Check job execution and wait for job to finish.
- Task progressJobTask = job.GetExecutionProgressTask(CancellationToken.None);
- progressJobTask.Wait();
-
- // Get an updated job reference.
- job = GetJob(job.Id);
-
- // If job state is Error the event handling
- // method for job progress should log errors. Here we check
- // for error state and exit if needed.
- if (job.State == JobState.Error)
- {
- Console.WriteLine("\nExiting method due to job error.");
- return null;
- }
-
- // Get a reference to the output asset from the job.
- IAsset outputAsset = job.OutputMediaAssets[0];
-
- return outputAsset;
- }
-
- private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
- {
- var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
- ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();
-
- if (processor == null)
-                throw new ArgumentException(string.Format("Unknown media processor: {0}", mediaProcessorName));
-
- return processor;
- }
-
- private static void StateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine("Job state changed event:");
- Console.WriteLine(" Previous state: " + e.PreviousState);
- Console.WriteLine(" Current state: " + e.CurrentState);
-
- switch (e.CurrentState)
- {
- case JobState.Finished:
- Console.WriteLine();
- Console.WriteLine("********************");
- Console.WriteLine("Job is finished.");
- Console.WriteLine("Please wait while local tasks or downloads complete...");
- Console.WriteLine("********************");
- Console.WriteLine();
- Console.WriteLine();
- break;
- case JobState.Canceling:
- case JobState.Queued:
- case JobState.Scheduled:
- case JobState.Processing:
- Console.WriteLine("Please wait...\n");
- break;
- case JobState.Canceled:
- case JobState.Error:
- // Cast sender as a job.
- IJob job = (IJob)sender;
- // Display or log error details as needed.
- Console.WriteLine("An error occurred in {0}", job.Id);
- break;
- default:
- break;
- }
- }
-
- static IJob GetJob(string jobId)
- {
-            // Use a LINQ select query to get an updated
-            // reference by Id.
- var jobInstance =
- from j in _context.Jobs
- where j.Id == jobId
- select j;
-            // Return the job reference as an IJob.
- IJob job = jobInstance.FirstOrDefault();
-
- return job;
- }
- }
-}
-```
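
For orientation, here is a minimal sketch of a driver for the helpers above. The name `CreateAssetAndUploadSingleFile` for the upload helper is an assumption (its signature falls outside this excerpt), and the storage account name and file path are hypothetical placeholders.

```
        static void Main(string[] args)
        {
            // Hypothetical values; replace with your storage account and a real media file.
            string storageAccountName = "yourstorageaccount";
            string inputFile = @"C:\media\example.mp4";

            // Upload the source file into a new, unencrypted asset.
            IAsset inputAsset = CreateAssetAndUploadSingleFile(inputFile, storageAccountName, AssetCreationOptions.None);

            // Encode the asset for adaptive streaming; returns null if the job failed.
            IAsset encodedAsset = CreateEncodingJob(inputAsset, storageAccountName, inputFile);
            if (encodedAsset != null)
                Console.WriteLine("Encoded output asset: {0}", encodedAsset.Id);
        }
```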
-
-
media-services Media Services Media Encoder Standard Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-media-encoder-standard-formats.md
- Title: Media Encoder Standard formats and codecs - Azure
-description: This article provides an overview of Media Encoder Standard formats and codecs.
- Previously updated: 03/10/2021
-# Media Encoder Standard Formats and Codecs
-> [!div class="op_single_selector" title1="Select the version of Media Services that you are using:"]
-> * [Version 2](media-services-media-encoder-standard-formats.md)
-> * [Version 3](../latest/encode-media-encoder-standard-formats-reference.md)
-
-This document contains a list of the most common import and export file formats that you can use with Media Encoder Standard.
-
-## Input container/file formats
-| File formats (file extensions) | Supported |
-| | |
-| FLV (with H.264 and AAC codecs) (.flv) |Yes |
-| MXF (.mxf) |Yes |
-| GXF (.gxf) |Yes |
-| MPEG2-PS, MPEG2-TS, 3GP (.ts, .ps, .3gp, .3gpp, .mpg) |Yes |
-| Windows Media Video (WMV)/ASF (.wmv, .asf) |Yes |
-| AVI (Uncompressed 8bit/10bit) (.avi) |Yes |
-| MP4 (.mp4, .m4a, .m4v)/ISMV (.isma, .ismv) |Yes |
-| [Microsoft Digital Video Recording (DVR-MS)](/previous-versions/windows/desktop/mstv/about-the-dvr-ms-file-format) (.dvr-ms) |Yes |
-| Matroska/WebM (.mkv) |Yes |
-| WAVE/WAV (.wav) |Yes |
-| QuickTime (.mov) |Yes |
-
-> [!NOTE]
-> Above is a list of the more commonly encountered file extensions. Media Encoder Standard does support many others (for example: .m2ts, .mpeg2video, .qt). If you try to encode a file and you get an error message about the format not being supported, provide your feedback [here](https://feedback.azure.com/d365community/forum/a78db44a-0d25-ec11-b6e6-000d3a4f09d0?c=a1733251-0d25-ec11-b6e6-000d3a4f09d0).
->
->
-
-### Audio formats in input containers
-Media Encoder Standard supports carrying the following audio formats in input containers:
-
-* MXF, GXF, and QuickTime files that have audio tracks with interleaved stereo or 5.1 samples
-
-or
-
-* MXF, GXF, and QuickTime files where the audio is carried as separate PCM tracks but the channel mapping (to stereo or 5.1) can be deduced from the file metadata
-
-## Input video codecs
-| Input video codecs | Supported |
-| | |
-| AVC 8-bit/10-bit, up to 4:2:2, including AVCIntra |8-bit 4:2:0 and 4:2:2 |
-| Avid DNxHD (in MXF) |Yes |
-| DVCPro/DVCProHD (in MXF) |Yes |
-| Digital video (DV) (in AVI files) |Yes |
-| JPEG 2000 |Yes |
-| MPEG-2 (up to 422 Profile and High Level; including variants such as XDCAM, XDCAM HD, XDCAM IMX, CableLabs®, and D10) |Up to 422 Profile |
-| MPEG-1 |Yes |
-| VC-1/WMV9 |Yes |
-| Canopus HQ/HQX |No |
-| MPEG-4 Part 2 |Yes |
-| [Theora](https://en.wikipedia.org/wiki/Theora) |Yes |
-| YUV420 uncompressed, or mezzanine |Yes |
-| Apple ProRes 422 |Yes |
-| Apple ProRes 422 LT |Yes |
-| Apple ProRes 422 HQ |Yes |
-| Apple ProRes Proxy |Yes |
-| Apple ProRes 4444 |Yes |
-| Apple ProRes 4444 XQ |Yes |
-| HEVC/H.265 | Main and Main 10 Profiles<br/>Main 10 Profile support is intended for 8-bit 4:2:0 content. |
-
-## Input audio codecs
-| Input audio codecs | Supported |
-| | |
-| AAC (AAC-LC, AAC-HE, and AAC-HEv2; up to 5.1) |Yes |
-| MPEG Layer 2 |Yes |
-| MP3 (MPEG-1 Audio Layer 3) |Yes |
-| Windows Media Audio |Yes |
-| WAV/PCM |Yes |
-| [FLAC](https://en.wikipedia.org/wiki/FLAC) |Yes |
-| [Opus](https://go.microsoft.com/fwlink/?LinkId=822667) |Yes |
-| [Vorbis](https://en.wikipedia.org/wiki/Vorbis) |Yes |
-| AMR (adaptive multi-rate) |Yes |
-| AES (SMPTE 331M and 302M, AES3-2003) |No |
-| Dolby® E |No |
-| Dolby® Digital (AC3) |No |
-| Dolby® Digital Plus (E-AC3) |No |
-
-## Output formats and codecs
-The following table lists the codecs and file formats that are supported for export.
-
-| File Format | Video Codec | Audio Codec |
-| | | |
-| MP4 <br/><br/>(including multi-bitrate MP4 containers) |H.264 (High, Main, and Baseline Profiles) |AAC-LC, HE-AAC v1, HE-AAC v2 |
-| MPEG2-TS |H.264 (High, Main, and Baseline Profiles) |AAC-LC, HE-AAC v1, HE-AAC v2 |
-
-
-## See also
-[Encoding On-Demand Content with Azure Media Services](media-services-encode-asset.md)
-
-[How to encode with Media Encoder Standard](media-services-dotnet-encode-with-media-encoder-standard.md)
media-services Media Services Mes Preset H264 Multiple Bitrate 1080P Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-1080p-Audio-5.1.md
- Title: H264 Multiple Bitrate 1080p Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 1080p Audio 5.1** task preset.
- Previously updated: 03/10/2021
-# H264 Multiple Bitrate 1080p Audio 5.1
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by this `Media Encoder Standard` encoder, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
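
For example, here is a minimal sketch of both options using the v2 .NET SDK, following the `job.Tasks.AddNew` pattern shown in the encoding sample earlier in this digest (`job` and `processor` are assumed to be set up as in that sample; the preset file path is a hypothetical placeholder):

```
// Option 1: pass a built-in preset by name.
ITask task = job.Tasks.AddNew("My encoding task",
    processor,
    "H264 Multiple Bitrate 1080p Audio 5.1",
    TaskOptions.None);

// Option 2: load a custom JSON (or XML) preset and pass its contents instead.
// Requires System.IO; the path below is a placeholder.
string customPreset = File.ReadAllText(@"C:\presets\custom-preset.json");
ITask customTask = job.Tasks.AddNew("My custom encoding task",
    processor,
    customPreset,
    TaskOptions.None);
```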
-
- This topic shows the `H264 Multiple Bitrate 1080p Audio 5.1` preset in XML and JSON format.
-
- This preset produces a set of 8 GOP-aligned MP4 files, ranging from 6000 kbps to 400 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
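-> For instance, 1920x1080, 1280x720, and 640x360 all reduce to 16:9 (1920/1080 = 1280/720 = 640/360 ≈ 1.78), whereas 720x480 reduces to 3:2 (1.5).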
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>6000</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>6000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>4700</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>4700</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>3400</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>3400</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>2250</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2250</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1500</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1000</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>650</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>650</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>320</Width>
- <Height>180</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 6000,
- "MaxBitrate": 6000,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 4700,
- "MaxBitrate": 4700,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 3400,
- "MaxBitrate": 3400,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2250,
- "MaxBitrate": 2250,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1500,
- "MaxBitrate": 1500,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1000,
- "MaxBitrate": 1000,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 650,
- "MaxBitrate": 650,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 320,
- "Height": 180,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
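
Given the `{Basename}_{Width}x{Height}_{VideoBitrate}.mp4` output file-name macro, an input named `video.mp4` would produce one MP4 per layer, from `video_1920x1080_6000.mp4` down to `video_320x180_400.mp4`.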
media-services Media Services Mes Preset H264 Multiple Bitrate 1080P https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-1080p.md
- Title: H264 Multiple Bitrate 1080p Media Encoder Standard preset - Azure | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 1080p** task preset.
- Previously updated: 03/10/2021
-# H264 Multiple Bitrate 1080p
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by this `Media Encoder Standard` encoder, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 1080p` preset in XML and JSON format.
-
- This preset produces a set of 8 GOP-aligned MP4 files, ranging from 6000 kbps to 400 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>6000</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>6000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>4700</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>4700</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>3400</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>3400</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>2250</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2250</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1500</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1000</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>650</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>650</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>320</Width>
- <Height>180</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 6000,
- "MaxBitrate": 6000,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 4700,
- "MaxBitrate": 4700,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 3400,
- "MaxBitrate": 3400,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2250,
- "MaxBitrate": 2250,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1500,
- "MaxBitrate": 1500,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1000,
- "MaxBitrate": 1000,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 650,
- "MaxBitrate": 650,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 320,
- "Height": 180,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Multiple Bitrate 16X9 SD Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-16x9-SD-Audio-5.1.md
- Title: H264 Multiple Bitrate 16x9 SD Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 16x9 SD Audio 5.1** task preset.
- Previously updated: 03/10/2021
-# H264 Multiple Bitrate 16x9 SD Audio 5.1
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by this `Media Encoder Standard` encoder, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 16x9 SD Audio 5.1` preset in XML and JSON format.
-
- This preset produces a set of 5 GOP-aligned MP4 files, ranging from 1900 kbps to 400 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>1900</Bitrate>
- <Width>848</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1900</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1300</Bitrate>
- <Width>848</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1300</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>900</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>900</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>650</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>650</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>432</Width>
- <Height>240</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1900,
- "MaxBitrate": 1900,
- "BufferWindow": "00:00:05",
- "Width": 848,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1300,
- "MaxBitrate": 1300,
- "BufferWindow": "00:00:05",
- "Width": 848,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 900,
- "MaxBitrate": 900,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 650,
- "MaxBitrate": 650,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 432,
- "Height": 240,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Multiple Bitrate 16X9 SD https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-16x9-SD.md
- Title: H264 Multiple Bitrate 16x9 SD | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 16x9 SD** task preset.
- Previously updated: 03/10/2021
-# H264 Multiple Bitrate 16x9 SD
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by this `Media Encoder Standard` encoder, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 16x9 SD` preset in XML and JSON format.
-
- This preset produces a set of 5 GOP-aligned MP4 files, ranging from 1900 kbps to 400 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>1900</Bitrate>
- <Width>848</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1900</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1300</Bitrate>
- <Width>848</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1300</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>900</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>900</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>650</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>650</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>432</Width>
- <Height>240</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1900,
- "MaxBitrate": 1900,
- "BufferWindow": "00:00:05",
- "Width": 848,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1300,
- "MaxBitrate": 1300,
- "BufferWindow": "00:00:05",
- "Width": 848,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 900,
- "MaxBitrate": 900,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 650,
- "MaxBitrate": 650,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 432,
- "Height": 240,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Multiple Bitrate 16X9 For Ios https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-16x9-for-iOS.md
- Title: H264 Multiple Bitrate 16x9 for iOS | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 16x9 for iOS** task preset.
- Previously updated: 03/10/2021
-# H264 Multiple Bitrate 16x9 for iOS
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by this `Media Encoder Standard` encoder, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 16x9 for iOS` preset in XML and JSON format.
-
- This preset produces a set of 8 GOP-aligned MP4 files, ranging from 8500 kbps to 200 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:03</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>8500</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>High</Profile>
- <Level>4</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>8500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>6500</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Main</Profile>
- <Level>3.1</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>6500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>5000</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Main</Profile>
- <Level>3.1</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>5000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>3500</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Main</Profile>
- <Level>3.1</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>3500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1200</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3.1</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1200</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>600</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>600</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>480</Width>
- <Height>270</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>200</Bitrate>
- <Width>416</Width>
- <Height>214</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>200</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>HEAACV2</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>64</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:03",
- "H264Layers": [
- {
- "Profile": "High",
- "Level": "4",
- "Bitrate": 8500,
- "MaxBitrate": 8500,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Main",
- "Level": "3.1",
- "Bitrate": 6500,
- "MaxBitrate": 6500,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Main",
- "Level": "3.1",
- "Bitrate": 5000,
- "MaxBitrate": 5000,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Main",
- "Level": "3.1",
- "Bitrate": 3500,
- "MaxBitrate": 3500,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Baseline",
- "Level": "3.1",
- "Bitrate": 1200,
- "MaxBitrate": 1200,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Baseline",
- "Level": "3",
- "Bitrate": 600,
- "MaxBitrate": 600,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Baseline",
- "Level": "3",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 480,
- "Height": 270,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Baseline",
- "Level": "3",
- "Bitrate": 200,
- "MaxBitrate": 200,
- "BufferWindow": "00:00:05",
- "Width": 416,
- "Height": 214,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "HEAACV2",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 64,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Multiple Bitrate 4K Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-4K-Audio-5.1.md
- Title: H264 Multiple Bitrate 4K Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 4K Audio 5.1** task preset.
- Previously updated: 03/10/2021
-# H264 Multiple Bitrate 4K Audio 5.1
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by this `Media Encoder Standard` encoder, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 4K Audio 5.1` preset in XML and JSON format.
-
- This preset produces a set of 12 GOP-aligned MP4 files, ranging from 20000 kbps to 1000 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
-> [!NOTE]
-> You should get the Premium reserved unit type with 4K encodes. For more information, see [How to Scale Encoding](./media-services-scale-media-processing-overview.md).
-
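As a rough sketch of that recommendation using the v2 .NET SDK (assuming `_context` is an initialized `CloudMediaContext`):

```
// Switch the account's encoding reserved units to the Premium type
// before submitting 4K encoding jobs.
IEncodingReservedUnit encodingUnit = _context.EncodingReservedUnits.FirstOrDefault();
encodingUnit.ReservedUnitType = ReservedUnitType.Premium;
encodingUnit.Update();
```
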
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>20000</Bitrate>
- <Width>4096</Width>
- <Height>2304</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>20000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>18000</Bitrate>
- <Width>3840</Width>
- <Height>2160</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>18000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>16000</Bitrate>
- <Width>3840</Width>
- <Height>2160</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>16000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>14000</Bitrate>
- <Width>3840</Width>
- <Height>2160</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>14000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>12000</Bitrate>
- <Width>2560</Width>
- <Height>1440</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>12000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>10000</Bitrate>
- <Width>2560</Width>
- <Height>1440</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>10000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>8000</Bitrate>
- <Width>2560</Width>
- <Height>1440</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>8000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>6000</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>6000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>4700</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>4700</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>3400</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>3400</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>2250</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2250</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1000</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1000</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 20000,
- "MaxBitrate": 20000,
- "BufferWindow": "00:00:05",
- "Width": 4096,
- "Height": 2304,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 18000,
- "MaxBitrate": 18000,
- "BufferWindow": "00:00:05",
- "Width": 3840,
- "Height": 2160,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 16000,
- "MaxBitrate": 16000,
- "BufferWindow": "00:00:05",
- "Width": 3840,
- "Height": 2160,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 14000,
- "MaxBitrate": 14000,
- "BufferWindow": "00:00:05",
- "Width": 3840,
- "Height": 2160,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 12000,
- "MaxBitrate": 12000,
- "BufferWindow": "00:00:05",
- "Width": 2560,
- "Height": 1440,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 10000,
- "MaxBitrate": 10000,
- "BufferWindow": "00:00:05",
- "Width": 2560,
- "Height": 1440,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 8000,
- "MaxBitrate": 8000,
- "BufferWindow": "00:00:05",
- "Width": 2560,
- "Height": 1440,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 6000,
- "MaxBitrate": 6000,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 4700,
- "MaxBitrate": 4700,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 3400,
- "MaxBitrate": 3400,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2250,
- "MaxBitrate": 2250,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1000,
- "MaxBitrate": 1000,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Multiple Bitrate 4K https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-4K.md
- Title: H264 Multiple Bitrate 4K Media Encoder Standard preset - Azure | Microsoft Docs
-description: The article gives an overview of the Media Encoder Standard **H264 Multiple Bitrate 4K** task preset.
- Previously updated: 03/10/2021
-# H264 Multiple Bitrate 4K
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by this `Media Encoder Standard` encoder, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 4K` preset in XML and JSON format.
-
- This preset produces a set of 12 GOP-aligned MP4 files, ranging from 20000 kbps to 1000 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
-> [!NOTE]
-> You should get the Premium reserved unit type with 4K encodes. For more information, see [How to Scale Encoding](./media-services-scale-media-processing-overview.md).
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>20000</Bitrate>
- <Width>4096</Width>
- <Height>2304</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>20000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>18000</Bitrate>
- <Width>3840</Width>
- <Height>2160</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>18000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>16000</Bitrate>
- <Width>3840</Width>
- <Height>2160</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>16000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>14000</Bitrate>
- <Width>3840</Width>
- <Height>2160</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>14000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>12000</Bitrate>
- <Width>2560</Width>
- <Height>1440</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>12000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>10000</Bitrate>
- <Width>2560</Width>
- <Height>1440</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>10000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>8000</Bitrate>
- <Width>2560</Width>
- <Height>1440</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>8000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>6000</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>6000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>4700</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>4700</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>3400</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>3400</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>2250</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2250</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1000</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1000</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 20000,
- "MaxBitrate": 20000,
- "BufferWindow": "00:00:05",
- "Width": 4096,
- "Height": 2304,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 18000,
- "MaxBitrate": 18000,
- "BufferWindow": "00:00:05",
- "Width": 3840,
- "Height": 2160,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 16000,
- "MaxBitrate": 16000,
- "BufferWindow": "00:00:05",
- "Width": 3840,
- "Height": 2160,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 14000,
- "MaxBitrate": 14000,
- "BufferWindow": "00:00:05",
- "Width": 3840,
- "Height": 2160,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 12000,
- "MaxBitrate": 12000,
- "BufferWindow": "00:00:05",
- "Width": 2560,
- "Height": 1440,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 10000,
- "MaxBitrate": 10000,
- "BufferWindow": "00:00:05",
- "Width": 2560,
- "Height": 1440,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 8000,
- "MaxBitrate": 8000,
- "BufferWindow": "00:00:05",
- "Width": 2560,
- "Height": 1440,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 6000,
- "MaxBitrate": 6000,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 4700,
- "MaxBitrate": 4700,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 3400,
- "MaxBitrate": 3400,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2250,
- "MaxBitrate": 2250,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1000,
- "MaxBitrate": 1000,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
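
The note above about keeping aspect ratios consistent is easy to enforce mechanically. The following is a minimal sketch (Python, not part of the preset format) that loads a saved copy of the JSON preset above and verifies that every layer reduces to the same aspect ratio; the file name is hypothetical.

```
import json
from fractions import Fraction

# Load a saved copy of the JSON preset shown above (file name is hypothetical).
with open("H264_Multiple_Bitrate_4K.json", encoding="utf-8") as f:
    preset = json.load(f)

video = next(c for c in preset["Codecs"] if c["Type"] == "H264Video")
ratios = {Fraction(layer["Width"], layer["Height"]) for layer in video["H264Layers"]}

if len(ratios) > 1:
    raise ValueError(f"Inconsistent aspect ratios across layers: {sorted(ratios)}")
print("All layers share aspect ratio", ratios.pop())
```

For this preset, every layer from 4096x2304 down to 640x360 reduces to 16:9, so the check passes.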
media-services Media Services Mes Preset H264 Multiple Bitrate 4X3 SD Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-4x3-SD-Audio-5.1.md
- Title: H264 Multiple Bitrate 4x3 SD Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 4x3 SD Audio 5.1** task preset.
------ Previously updated : 03/10/2021---
-# H264 Multiple Bitrate 4x3 SD Audio 5.1
--
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 4x3 SD Audio 5.1` preset in XML and JSON format.
-
- This preset produces a set of 5 GOP-aligned MP4 files, ranging from 1600 kbps to 400 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>1600</Bitrate>
- <Width>640</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1600</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1300</Bitrate>
- <Width>640</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1300</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>800</Bitrate>
- <Width>480</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>800</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>600</Bitrate>
- <Width>480</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>600</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>360</Width>
- <Height>240</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1600,
- "MaxBitrate": 1600,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1300,
- "MaxBitrate": 1300,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 800,
- "MaxBitrate": 800,
- "BufferWindow": "00:00:05",
- "Width": 480,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 600,
- "MaxBitrate": 600,
- "BufferWindow": "00:00:05",
- "Width": 480,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 360,
- "Height": 240,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
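
Each `Output` element uses macros in its `FileName`. As a sketch of how `{Basename}_{Width}x{Height}_{VideoBitrate}.mp4` expands for the five layers above, assuming a source file named `movie.mp4`:

```
# Expand the {Basename}_{Width}x{Height}_{VideoBitrate}.mp4 macros for each
# layer of the preset above; "movie" stands in for the source file's base name.
layers = [(640, 480, 1600), (640, 480, 1300), (480, 360, 800),
          (480, 360, 600), (360, 240, 400)]

for width, height, bitrate in layers:
    print(f"movie_{width}x{height}_{bitrate}.mp4")
```

This yields one MP4 per layer, from `movie_640x480_1600.mp4` through `movie_360x240_400.mp4`.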
media-services Media Services Mes Preset H264 Multiple Bitrate 4X3 SD https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-4x3-SD.md
- Title: H264 Multiple Bitrate 4x3 SD | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 4x3 SD** task preset.
------ Previously updated : 03/10/2021---
-# H264 Multiple Bitrate 4x3 SD
--
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 4x3 SD` preset in XML and JSON format.
-
- This preset produces a set of 5 GOP-aligned MP4 files, ranging from 1600 kbps to 400 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>1600</Bitrate>
- <Width>640</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1600</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1300</Bitrate>
- <Width>640</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1300</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>800</Bitrate>
- <Width>480</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>800</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>600</Bitrate>
- <Width>480</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>600</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>360</Width>
- <Height>240</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1600,
- "MaxBitrate": 1600,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1300,
- "MaxBitrate": 1300,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 800,
- "MaxBitrate": 800,
- "BufferWindow": "00:00:05",
- "Width": 480,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 600,
- "MaxBitrate": 600,
- "BufferWindow": "00:00:05",
- "Width": 480,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 360,
- "Height": 240,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
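
If you maintain custom presets in the XML form, note that all elements live in the `https://www.windowsazure.com/media/encoding/Preset/2014/03` namespace declared on the `Preset` root. A minimal Python sketch (file name hypothetical) that lists the per-layer settings of the XML preset above:

```
import xml.etree.ElementTree as ET

# All preset elements are in this namespace (see the xmlns attribute above).
NS = {"p": "https://www.windowsazure.com/media/encoding/Preset/2014/03"}
tree = ET.parse("H264_Multiple_Bitrate_4x3_SD.xml")  # hypothetical file name

for layer in tree.getroot().findall(".//p:H264Layer", NS):
    bitrate = layer.findtext("p:Bitrate", namespaces=NS)
    width = layer.findtext("p:Width", namespaces=NS)
    height = layer.findtext("p:Height", namespaces=NS)
    print(f"{width}x{height} @ {bitrate} kbps")
```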
media-services Media Services Mes Preset H264 Multiple Bitrate 4X3 For Ios https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-4x3-for-iOS.md
- Title: H264 Multiple Bitrate 4x3 for iOS | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 4x3 for iOS** task preset.
------ Previously updated : 03/10/2021---
-# H264 Multiple Bitrate 4x3 for iOS
--
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 4x3 for iOS` preset in XML and JSON format.
-
- This preset produces a set of 8 GOP-aligned MP4 files, ranging from 8500 kbps to 200 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:03</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>8500</Bitrate>
- <Width>1920</Width>
- <Height>1440</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>High</Profile>
- <Level>5</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>8500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>6500</Bitrate>
- <Width>1280</Width>
- <Height>960</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Main</Profile>
- <Level>3.2</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>6500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>5000</Bitrate>
- <Width>1280</Width>
- <Height>960</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Main</Profile>
- <Level>3.2</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>5000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>3500</Bitrate>
- <Width>960</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Main</Profile>
- <Level>3.1</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>3500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1200</Bitrate>
- <Width>640</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3.1</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1200</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>600</Bitrate>
- <Width>640</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>600</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>480</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>200</Bitrate>
- <Width>400</Width>
- <Height>300</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>200</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>HEAACV2</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>64</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:03",
- "H264Layers": [
- {
- "Profile": "High",
- "Level": "5",
- "Bitrate": 8500,
- "MaxBitrate": 8500,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1440,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Main",
- "Level": "3.2",
- "Bitrate": 6500,
- "MaxBitrate": 6500,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 960,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Main",
- "Level": "3.2",
- "Bitrate": 5000,
- "MaxBitrate": 5000,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 960,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Main",
- "Level": "3.1",
- "Bitrate": 3500,
- "MaxBitrate": 3500,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Baseline",
- "Level": "3.1",
- "Bitrate": 1200,
- "MaxBitrate": 1200,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 480,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Baseline",
- "Level": "3",
- "Bitrate": 600,
- "MaxBitrate": 600,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 480,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Baseline",
- "Level": "3",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 480,
- "Height": 360,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Baseline",
- "Level": "3",
- "Bitrate": 200,
- "MaxBitrate": 200,
- "BufferWindow": "00:00:05",
- "Width": 400,
- "Height": 300,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "HEAACV2",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 64,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
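
Unlike the device-agnostic presets, this ladder pins explicit profiles and levels, and its `Baseline` layers disable B-frames and use CAVLC entropy coding, since the H.264 Baseline profile supports neither B-frames nor CABAC. A small sketch (hypothetical file name) that double-checks those constraints in the JSON above:

```
import json

# Verify that Baseline layers respect H.264 Baseline restrictions:
# no B-frames, and CAVLC entropy coding (CABAC needs Main profile or higher).
with open("H264_Multiple_Bitrate_4x3_for_iOS.json", encoding="utf-8") as f:  # hypothetical name
    preset = json.load(f)

video = next(c for c in preset["Codecs"] if c["Type"] == "H264Video")
for layer in video["H264Layers"]:
    if layer["Profile"] == "Baseline":
        assert layer.get("BFrames", 0) == 0, "Baseline can't use B-frames"
        assert layer.get("EntropyMode", "Cavlc") == "Cavlc", "Baseline requires CAVLC"
```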
media-services Media Services Mes Preset H264 Multiple Bitrate 720P Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-720p-Audio-5.1.md
- Title: H264 Multiple Bitrate 720p Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 720p Audio 5.1** task preset.
------ Previously updated : 03/10/2021---
-# H264 Multiple Bitrate 720p Audio 5.1
--
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 720p Audio 5.1` preset in XML and JSON format.
-
- This preset produces a set of 6 GOP-aligned MP4 files, ranging from 3400 kbps to 400 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>3400</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>3400</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>2250</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2250</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1500</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1000</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>650</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>650</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>320</Width>
- <Height>180</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 3400,
- "MaxBitrate": 3400,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2250,
- "MaxBitrate": 2250,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1500,
- "MaxBitrate": 1500,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1000,
- "MaxBitrate": 1000,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 650,
- "MaxBitrate": 650,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 320,
- "Height": 180,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
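
A `FrameRate` of `0/1` tells the encoder to follow the source frame rate. Assuming a 30 fps source for illustration, you can sanity-check the ladder by computing bits per pixel per frame for each layer:

```
# Rough quality check: bits per pixel per frame, assuming a 30 fps source
# (FrameRate "0/1" means the encoder matches the source frame rate).
layers = [(1280, 720, 3400), (960, 540, 2250), (960, 540, 1500),
          (640, 360, 1000), (640, 360, 650), (320, 180, 400)]

for w, h, kbps in layers:
    bpp = kbps * 1000 / (w * h * 30)
    print(f"{w}x{h} @ {kbps} kbps -> {bpp:.3f} bits/pixel/frame")
```

Values in a similar range across layers (here roughly 0.09 to 0.23) suggest the ladder distributes bitrate sensibly as resolution drops.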
media-services Media Services Mes Preset H264 Multiple Bitrate 720P https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Multiple-Bitrate-720p.md
- Title: H264 Multiple Bitrate 720p Media Encoder Standard preset - Azure| Microsoft Docs
-description: The topic gives an overview of the **H264 Multiple Bitrate 720p** task preset.
------ Previously updated : 03/10/2021---
-# H264 Multiple Bitrate 720p
--
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Multiple Bitrate 720p` preset in XML and JSON format.
-
- This preset produces a set of 6 GOP-aligned MP4 files, ranging from 3400 kbps to 400 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
-> [!NOTE]
-> When modifying the `Width` and `Height` values across layers, make sure that the aspect ratio remains consistent. For example: 1920x1080, 1280x720, 1080x576, 640x360. You should not use a mixture of aspect ratios, such as: 1280x720, 720x480, 640x360.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <H264Layers>
- <H264Layer>
- <Bitrate>3400</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>3400</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>2250</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2250</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1500</Bitrate>
- <Width>960</Width>
- <Height>540</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1500</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>1000</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1000</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>650</Bitrate>
- <Width>640</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>650</MaxBitrate>
- </H264Layer>
- <H264Layer>
- <Bitrate>400</Bitrate>
- <Width>320</Width>
- <Height>180</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>400</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 3400,
- "MaxBitrate": 3400,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2250,
- "MaxBitrate": 2250,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1500,
- "MaxBitrate": 1500,
- "BufferWindow": "00:00:05",
- "Width": 960,
- "Height": 540,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1000,
- "MaxBitrate": 1000,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 650,
- "MaxBitrate": 650,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 360,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- },
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 400,
- "MaxBitrate": 400,
- "BufferWindow": "00:00:05",
- "Width": 320,
- "Height": 180,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
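
As noted at the top of this article, custom presets must be saved with UTF-8 or UTF-16 encoding, and the XML declarations shown here specify `utf-16`. When writing an edited copy from a script, make the file encoding match the declaration; a minimal sketch:

```
# The XML presets above declare encoding="utf-16"; write an edited copy in a
# matching encoding so the declaration stays accurate. Content is a stand-in.
xml_text = '<?xml version="1.0" encoding="utf-16"?>\n<Preset />'

with open("custom-preset.xml", "w", encoding="utf-16") as f:
    f.write(xml_text)
```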
media-services Media Services Mes Preset H264 Single Bitrate 1080P Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-1080p-Audio-5.1.md
- Title: H264 Single Bitrate 1080p Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate 1080p Audio 5.1** task preset.
------ Previously updated : 03/10/2021---
-# H264 Single Bitrate 1080p Audio 5.1
---
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 1080p Audio 5.1` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 6750 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>6750</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>6750</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 6750,
- "MaxBitrate": 6750,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
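
As a rough delivery budget for this preset, the muxed stream needs the video plus audio bitrate, ignoring MP4 container overhead:

```
# Video 6750 kbps + AAC 5.1 audio 384 kbps, per the preset above.
video_kbps, audio_kbps = 6750, 384
print(f"total ~ {(video_kbps + audio_kbps) / 1000:.2f} Mbps")  # ~7.13 Mbps
```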
media-services Media Services Mes Preset H264 Single Bitrate 1080P https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-1080p.md
- Title: H264 Single Bitrate 1080p Media Encoder Standard preset - Azure | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate 1080p** task preset.
------ Previously updated : 03/10/2021---
-# H264 Single Bitrate 1080p
--
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 1080p` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 6750 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>6750</Bitrate>
- <Width>1920</Width>
- <Height>1080</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>6750</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 6750,
- "MaxBitrate": 6750,
- "BufferWindow": "00:00:05",
- "Width": 1920,
- "Height": 1080,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate 16X9 SD Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-16x9-SD-Audio-5.1.md
- Title: H264 Single Bitrate 16x9 SD Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate 16x9 SD Audio 5.1** task preset.
------ Previously updated : 03/10/2021---
-# H264 Single Bitrate 16x9 SD Audio 5.1
--
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 16x9 SD Audio 5.1` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 2200 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>2200</Bitrate>
- <Width>848</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2200</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2200,
- "MaxBitrate": 2200,
- "BufferWindow": "00:00:05",
- "Width": 848,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate 16X9 SD https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-16x9-SD.md
- Title: H264 Single Bitrate 16x9 SD Media Encoder Standard preset - Azure | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate 16x9 SD** task preset.
------ Previously updated : 03/10/2021----
-# H264 Single Bitrate 16x9 SD
--
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 16x9 SD` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 2200 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>2200</Bitrate>
- <Width>848</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2200</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 2200,
- "MaxBitrate": 2200,
- "BufferWindow": "00:00:05",
- "Width": 848,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate 4K Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-4K-Audio-5.1.md
- Title: H264 Single Bitrate 4K Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate 4K Audio 5.1** task preset.
------ Previously updated : 03/10/2021---
-# H264 Single Bitrate 4K Audio 5.1
--
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON- or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all the preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 4K Audio 5.1` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 18000 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
-> [!NOTE]
-> You should use the Premium reserved unit type for 4K encodes. For more information, see [How to Scale Encoding](./media-services-scale-media-processing-overview.md).
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>18000</Bitrate>
- <Width>3840</Width>
- <Height>2160</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>18000</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 18000,
- "MaxBitrate": 18000,
- "BufferWindow": "00:00:05",
- "Width": 3840,
- "Height": 2160,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate 4K https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-4K.md
- Title: H264 Single Bitrate 4K Media Encoder Standard preset - Azure | Microsoft Docs
-description: The article gives an overview of the Media Encoder Standard H264 Single Bitrate 4K task preset.
-Previously updated: 03/10/2021
-
-# H264 Single Bitrate 4K
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 4K` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 18000 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
-> [!NOTE]
-> You should get the Premium reserved unit type with 4K encodes. For more information, see [How to Scale Encoding](./media-services-scale-media-processing-overview.md).
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>18000</Bitrate>
- <Width>3840</Width>
- <Height>2160</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>18000</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 18000,
- "MaxBitrate": 18000,
- "BufferWindow": "00:00:05",
- "Width": 3840,
- "Height": 2160,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate 4X3 SD Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-4x3-SD-Audio-5.1.md
- Title: H264 Single Bitrate 4x3 SD Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate 4x3 SD Audio 5.1** task preset.
-Previously updated: 03/10/2021
-
-# H264 Single Bitrate 4x3 SD Audio 5.1
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 4x3 SD Audio 5.1` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 1800 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>1800</Bitrate>
- <Width>640</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1800</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1800,
- "MaxBitrate": 1800,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate 4X3 SD https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-4x3-SD.md
- Title: H264 Single Bitrate 4x3 SD Media Encoder Standard preset - Azure | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate 4x3 SD** task preset.
-Previously updated: 03/10/2021
-
-# H264 Single Bitrate 4x3 SD
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 4x3 SD` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 1800 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>1800</Bitrate>
- <Width>640</Width>
- <Height>480</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>1800</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 1800,
- "MaxBitrate": 1800,
- "BufferWindow": "00:00:05",
- "Width": 640,
- "Height": 480,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate 720P Audio 5.1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-720p-Audio-5.1.md
- Title: H264 Single Bitrate 720p Audio 5.1 | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate 720p Audio 5.1** task preset.
-Previously updated: 03/10/2021
-
-# H264 Single Bitrate 720p Audio 5.1
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 720p Audio 5.1` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 4500 kbps, and AAC 5.1 audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md).
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>4500</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>4500</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>6</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>384</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 4500,
- "MaxBitrate": 4500,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 6,
- "SamplingRate": 48000,
- "Bitrate": 384,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate 720P For Android https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-720p-for-Android.md
- Title: H264 Single Bitrate 720p for Android | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate 720p for Android** task preset.
-Previously updated: 03/10/2021
-
-# H264 Single Bitrate 720p for Android
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
-This topic shows the `H264 Single Bitrate 720p for Android` preset in XML and JSON format.
-
-This preset produces a single MP4 file with a bitrate of 2000 kbps, and stereo AAC. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:05</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>2000</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3.1</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>2000</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>192</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:05",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Baseline",
- "Level": "3.1",
- "Bitrate": 2000,
- "MaxBitrate": 2000,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 192,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate 720P https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-720p.md
- Title: H264 Single Bitrate 720p Media Encoder Standard preset - Azure | Microsoft Docs
-description: This article gives an overview of the Media Encoder Standard H264 Single Bitrate 720p task preset.
-Previously updated: 03/10/2021
-
-# H264 Single Bitrate 720p
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate 720p` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 4500 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:02</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>4500</Bitrate>
- <Width>1280</Width>
- <Height>720</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Auto</Profile>
- <Level>auto</Level>
- <BFrames>3</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>true</AdaptiveBFrame>
- <EntropyMode>Cabac</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>4500</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:02",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Auto",
- "Level": "auto",
- "Bitrate": 4500,
- "MaxBitrate": 4500,
- "BufferWindow": "00:00:05",
- "Width": 1280,
- "Height": 720,
- "BFrames": 3,
- "ReferenceFrames": 3,
- "AdaptiveBFrame": true,
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate High Quality SD For Android https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-High-Quality-SD-for-Android.md
- Title: H264 Single Bitrate High Quality SD for Android | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate High Quality SD for Android** task preset.
-Previously updated: 03/10/2021
-
-# H264 Single Bitrate High Quality SD for Android
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate High Quality SD for Android` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 500 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:05</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>500</Bitrate>
- <Width>480</Width>
- <Height>360</Height>
- <FrameRate>0/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>3</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>500</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>AACLC</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>128</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:05",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Baseline",
- "Level": "3",
- "Bitrate": 500,
- "MaxBitrate": 500,
- "BufferWindow": "00:00:05",
- "Width": 480,
- "Height": 360,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "0/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "AACLC",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 128,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Preset H264 Single Bitrate Low Quality SD For Android https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-preset-H264-Single-Bitrate-Low-Quality-SD-for-Android.md
- Title: H264 Single Bitrate Low Quality SD for Android | Microsoft Docs
-description: The topic gives an overview of the **H264 Single Bitrate Low Quality SD for Android** task preset.
-Previously updated: 03/10/2021
-
-# H264 Single Bitrate Low Quality SD for Android
-`Media Encoder Standard` defines a set of encoding presets you can use when creating encoding jobs. You can either use a `preset name` to specify the format into which you would like to encode your media file, or you can create your own JSON or XML-based preset (using UTF-8 or UTF-16 encoding) and pass the custom preset to the encoder. For the list of all preset names supported by `Media Encoder Standard`, see [Task Presets for Media Encoder Standard](media-services-mes-presets-overview.md).
-
- This topic shows the `H264 Single Bitrate Low Quality SD for Android` preset in XML and JSON format.
-
- This preset produces a single MP4 file with a bitrate of 56 kbps, and stereo AAC audio. For detailed information about profile, bitrate, sampling rate, etc. of this preset, examine the XML or JSON defined below. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
- XML
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
- <Encoding>
- <H264Video>
- <KeyFrameInterval>00:00:05</KeyFrameInterval>
- <SceneChangeDetection>true</SceneChangeDetection>
- <H264Layers>
- <H264Layer>
- <Bitrate>56</Bitrate>
- <Width>176</Width>
- <Height>144</Height>
- <FrameRate>12/1</FrameRate>
- <Profile>Baseline</Profile>
- <Level>2</Level>
- <BFrames>0</BFrames>
- <ReferenceFrames>3</ReferenceFrames>
- <Slices>0</Slices>
- <AdaptiveBFrame>false</AdaptiveBFrame>
- <EntropyMode>Cavlc</EntropyMode>
- <BufferWindow>00:00:05</BufferWindow>
- <MaxBitrate>56</MaxBitrate>
- </H264Layer>
- </H264Layers>
- <Chapters />
- </H264Video>
- <AACAudio>
- <Profile>HEAACV2</Profile>
- <Channels>2</Channels>
- <SamplingRate>48000</SamplingRate>
- <Bitrate>24</Bitrate>
- </AACAudio>
- </Encoding>
- <Outputs>
- <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
- <MP4Format />
- </Output>
- </Outputs>
-</Preset>
-```
-
- JSON
-
-```
-{
- "Version": 1.0,
- "Codecs": [
- {
- "KeyFrameInterval": "00:00:05",
- "SceneChangeDetection": true,
- "H264Layers": [
- {
- "Profile": "Baseline",
- "Level": "2",
- "Bitrate": 56,
- "MaxBitrate": 56,
- "BufferWindow": "00:00:05",
- "Width": 176,
- "Height": 144,
- "ReferenceFrames": 3,
- "EntropyMode": "Cavlc",
- "Type": "H264Layer",
- "FrameRate": "12/1"
- }
- ],
- "Type": "H264Video"
- },
- {
- "Profile": "HEAACV2",
- "Channels": 2,
- "SamplingRate": 48000,
- "Bitrate": 24,
- "Type": "AACAudio"
- }
- ],
- "Outputs": [
- {
- "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
- "Format": {
- "Type": "MP4Format"
- }
- }
- ]
-}
-```
media-services Media Services Mes Presets Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-presets-overview.md
- Title: Task Presets for Media Encoder Standard (MES) | Microsoft Docs
-description: The topic gives an overview of the service-defined sample presets for Media Encoder Standard (MES).
-Previously updated: 03/10/2021
-
-# Sample Presets for Media Encoder Standard (MES)
-**Media Encoder Standard** includes a set of pre-defined system encoding presets you can use when creating encoding jobs. It is recommended to use the "Adaptive Streaming" preset if you want to encode a video for streaming with Media Services. When you specify this preset, Media Encoder Standard will [auto-generate a bitrate ladder](media-services-autogen-bitrate-ladder-with-mes.md).
-
-## Creating Custom Presets from Samples
-Media Services fully supports customizing all values in presets to meet your specific encoding needs and requirements. If you need to customize an encoding preset, start with one of the system presets provided in this section as a template for your custom configuration. For explanations of what each element in these presets means, and the valid values for each element, see the [Media Encoder Standard schema](media-services-mes-schema.md) topic.
-
-> [!NOTE]
-> When using a preset for 4k encodes, you should get the `S3` reserved unit type. For more information, see [How to Scale Encoding](./media-services-scale-media-processing-overview.md).
-
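-For illustration, the following is a trimmed-down custom preset sketch based on the single-bitrate samples in this section. The chosen numbers (3000 kbps, 960x540) are arbitrary examples, not a recommendation, and optional elements omitted here take the defaults documented in the [Media Encoder Standard schema](media-services-mes-schema.md):
-
-```
-<?xml version="1.0" encoding="utf-16"?>
-<Preset xmlns:xsd="https://www.w3.org/2001/XMLSchema" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03">
-  <Encoding>
-    <H264Video>
-      <KeyFrameInterval>00:00:02</KeyFrameInterval>
-      <H264Layers>
-        <H264Layer>
-          <Bitrate>3000</Bitrate>
-          <MaxBitrate>3000</MaxBitrate>
-          <Width>960</Width>
-          <Height>540</Height>
-          <FrameRate>0/1</FrameRate>
-          <Profile>Auto</Profile>
-          <Level>auto</Level>
-        </H264Layer>
-      </H264Layers>
-    </H264Video>
-    <AACAudio>
-      <Profile>AACLC</Profile>
-      <Channels>2</Channels>
-      <SamplingRate>48000</SamplingRate>
-      <Bitrate>128</Bitrate>
-    </AACAudio>
-  </Encoding>
-  <Outputs>
-    <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
-      <MP4Format />
-    </Output>
-  </Outputs>
-</Preset>
-```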
-### Video Rotation Default Setting in Presets
-When working with Media Encoder Standard, video rotation is enabled by default. If your video has been recorded on a mobile device in Portrait mode, these presets rotate it to Landscape mode prior to encoding.
-
-## Available presets
-
- [H264 Multiple Bitrate 1080p Audio 5.1](media-services-mes-preset-H264-Multiple-Bitrate-1080p-Audio-5.1.md) produces a set of 8 GOP-aligned MP4 files, ranging from 6000 kbps to 400 kbps, and AAC 5.1 audio.
-
- [H264 Multiple Bitrate 1080p](media-services-mes-preset-H264-Multiple-Bitrate-1080p.md) produces a set of 8 GOP-aligned MP4 files, ranging from 6000 kbps to 400 kbps, and stereo AAC audio.
-
- [H264 Multiple Bitrate 16x9 for iOS](media-services-mes-preset-H264-Multiple-Bitrate-16x9-for-iOS.md) produces a set of 8 GOP-aligned MP4 files, ranging from 8500 kbps to 200 kbps, and stereo AAC audio.
-
- [H264 Multiple Bitrate 16x9 SD Audio 5.1](media-services-mes-preset-H264-Multiple-Bitrate-16x9-SD-Audio-5.1.md) produces a set of 5 GOP-aligned MP4 files, ranging from 1900 kbps to 400 kbps, and AAC 5.1 audio.
-
- [H264 Multiple Bitrate 16x9 SD](media-services-mes-preset-H264-Multiple-Bitrate-16x9-SD.md) produces a set of 5 GOP-aligned MP4 files, ranging from 1900 kbps to 400 kbps, and stereo AAC audio.
-
- [H264 Multiple Bitrate 4K Audio 5.1](media-services-mes-preset-H264-Multiple-Bitrate-4K-Audio-5.1.md) produces a set of 12 GOP-aligned MP4 files, ranging from 20000 kbps to 1000 kbps, and AAC 5.1 audio.
-
- [H264 Multiple Bitrate 4K](media-services-mes-preset-H264-Multiple-Bitrate-4K.md) produces a set of 12 GOP-aligned MP4 files, ranging from 20000 kbps to 1000 kbps, and stereo AAC audio.
-
- [H264 Multiple Bitrate 4x3 for iOS](media-services-mes-preset-H264-Multiple-Bitrate-4x3-for-iOS.md) produces a set of 8 GOP-aligned MP4 files, ranging from 8500 kbps to 200 kbps, and stereo AAC audio.
-
- [H264 Multiple Bitrate 4x3 SD Audio 5.1](media-services-mes-preset-H264-Multiple-Bitrate-4x3-SD-Audio-5.1.md) produces a set of 5 GOP-aligned MP4 files, ranging from 1600 kbps to 400 kbps, and AAC 5.1 audio.
-
- [H264 Multiple Bitrate 4x3 SD](media-services-mes-preset-H264-Multiple-Bitrate-4x3-SD.md) produces a set of 5 GOP-aligned MP4 files, ranging from 1600 kbps to 400 kbps, and stereo AAC audio.
-
- [H264 Multiple Bitrate 720p Audio 5.1](media-services-mes-preset-H264-Multiple-Bitrate-720p-Audio-5.1.md) produces a set of 6 GOP-aligned MP4 files, ranging from 3400 kbps to 400 kbps, and AAC 5.1 audio.
-
- [H264 Multiple Bitrate 720p](media-services-mes-preset-H264-Multiple-Bitrate-720p.md) produces a set of 6 GOP-aligned MP4 files, ranging from 3400 kbps to 400 kbps, and stereo AAC audio.
-
- [H264 Single Bitrate 1080p Audio 5.1](media-services-mes-preset-H264-Single-Bitrate-1080p-Audio-5.1.md) produces a single MP4 file with a bitrate of 6750 kbps, and AAC 5.1 audio.
-
- [H264 Single Bitrate 1080p](media-services-mes-preset-H264-Single-Bitrate-1080p.md) produces a single MP4 file with a bitrate of 6750 kbps, and stereo AAC audio.
-
- [H264 Single Bitrate 4K Audio 5.1](media-services-mes-preset-H264-Single-Bitrate-4K-Audio-5.1.md) produces a single MP4 file with a bitrate of 18000 kbps, and AAC 5.1 audio.
-
- [H264 Single Bitrate 4K](media-services-mes-preset-H264-Single-Bitrate-4K.md) produces a single MP4 file with a bitrate of 18000 kbps, and stereo AAC audio.
-
- [H264 Single Bitrate 4x3 SD Audio 5.1](media-services-mes-preset-H264-Single-Bitrate-4x3-SD-Audio-5.1.md) produces a single MP4 file with a bitrate of 1800 kbps, and AAC 5.1 audio.
-
- [H264 Single Bitrate 4x3 SD](media-services-mes-preset-H264-Single-Bitrate-4x3-SD.md) produces a single MP4 file with a bitrate of 1800 kbps, and stereo AAC audio.
-
- [H264 Single Bitrate 16x9 SD Audio 5.1](media-services-mes-preset-H264-Single-Bitrate-16x9-SD-Audio-5.1.md) produces a single MP4 file with a bitrate of 2200 kbps, and AAC 5.1 audio.
-
- [H264 Single Bitrate 16x9 SD](media-services-mes-preset-H264-Single-Bitrate-16x9-SD.md) produces a single MP4 file with a bitrate of 2200 kbps, and stereo AAC audio.
-
- [H264 Single Bitrate 720p Audio 5.1](media-services-mes-preset-H264-Single-Bitrate-720p-Audio-5.1.md) produces a single MP4 file with a bitrate of 4500 kbps, and AAC 5.1 audio.
-
 [H264 Single Bitrate 720p for Android](media-services-mes-preset-H264-Single-Bitrate-720p-for-Android.md) produces a single MP4 file with a bitrate of 2000 kbps, and stereo AAC audio.
-
- [H264 Single Bitrate 720p](media-services-mes-preset-H264-Single-Bitrate-720p.md) produces a single MP4 file with a bitrate of 4500 kbps, and stereo AAC audio.
-
 [H264 Single Bitrate High Quality SD for Android](media-services-mes-preset-H264-Single-Bitrate-High-Quality-SD-for-Android.md) produces a single MP4 file with a bitrate of 500 kbps, and stereo AAC audio.
-
- [H264 Single Bitrate Low Quality SD for Android](media-services-mes-preset-H264-Single-Bitrate-Low-Quality-SD-for-Android.md) produces a single MP4 file with a bitrate of 56 kbps, and stereo AAC audio.
-
- For more information related to Media Services encoders, see [Encoding On-Demand with Azure Media Services](./media-services-encode-asset.md).
media-services Media Services Mes Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-mes-schema.md
- Title: Media Encoder Standard schema | Microsoft Docs
-description: This article describes some of the elements and types of the XML schema on which Media Encoder Standard presets are based.
-Previously updated: 03/10/2021
-
-# Media Encoder Standard schema
-This article describes some of the elements and types of the XML schema on which [Media Encoder Standard presets](media-services-mes-presets-overview.md) are based, and explains the elements and their valid values.
-
-## <a name="Preset"></a> Preset (root element)
-Defines an encoding preset.
-
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **Encoding** |[Encoding](media-services-mes-schema.md#Encoding) |Indicates that the input sources are to be encoded. |
-| **Outputs** |[Outputs](media-services-mes-schema.md#Output) |Collection of desired output files. |
-| **StretchMode**<br/>minOccurs="0"<br/>default="AutoSize" |**xs:string** |Control the output video frame size, padding, pixel, or display aspect ratio. **StretchMode** could be one of the following values: **None**, **AutoSize** (default), or **AutoFit**.<br/><br/>**None**: Strictly follow the output resolution (for example, the **Width** and **Height** in the preset) without considering the pixel aspect ratio or display aspect ratio of the input video. Recommended in scenarios such as [cropping](media-services-crop-video.md), where the output video has a different aspect ratio compared to the input. <br/><br/>**AutoSize**: The output resolution will fit inside the window (Width * Height) specified by the preset. However, the encoder produces an output video that has square (1:1) pixel aspect ratio. Therefore, either output Width or output Height could be overridden in order to match the display aspect ratio of the input, without padding. For example, if the input is 1920x1080 and the encoding preset asks for 1280x1280, then the Height value in the preset is overridden, and the output will be at 1280x720, which maintains the input aspect ratio of 16:9. <br/><br/>**AutoFit**: If needed, pad the output video (with either letterbox or pillarbox) to honor the desired output resolution, while ensuring that the active video region in the output has the same aspect ratio as the input. For example, suppose the input is 1920x1080 and the encoding preset asks for 1280x1280. Then the output video will be at 1280x1280, but it will contain an inner 1280x720 rectangle of 'active video' with aspect ratio of 16:9, and letterbox regions 280 pixels high at the top and bottom. For another example, if the input is 1440x1080 and the encoding preset asks for 1280x720, then the output will be at 1280x720, which contains an inner rectangle of 960x720 at aspect ratio of 4:3, and pillar box regions 160 pixels wide at the left and right. |
-
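-A minimal sketch of how **StretchMode** might be used, assuming (per the Elements table above) it appears as a direct child of the **Preset** root; the **AutoFit** value and the element ordering here are illustrative, so check the schema for the exact placement:
-
-```
-<Preset xmlns="https://www.windowsazure.com/media/encoding/Preset/2014/03" Version="1.0">
-  <Encoding>
-    <!-- H264Video / AACAudio settings as in the sample presets -->
-  </Encoding>
-  <StretchMode>AutoFit</StretchMode>
-  <Outputs>
-    <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
-      <MP4Format />
-    </Output>
-  </Outputs>
-</Preset>
-```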
-### Attributes
-
-| Name | Type | Description |
-| | | |
-| **Version**<br/><br/> Required |**xs:decimal** |The preset version. The following restrictions apply: xs:fractionDigits value="1" and xs:minInclusive value="1". For example, **version="1.0"**. |
-
-## <a name="Encoding"></a> Encoding
-Contains a sequence of the following elements:
-
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **H264Video** |[H264Video](media-services-mes-schema.md#H264Video) |Settings for H.264 encoding of video. |
-| **AACAudio** |[AACAudio](media-services-mes-schema.md#AACAudio) |Settings for AAC encoding of audio. |
-| **BmpImage** |[BmpImage](media-services-mes-schema.md#BmpImage) |Settings for Bmp image. |
-| **PngImage** |[PngImage](media-services-mes-schema.md#PngImage) |Settings for Png image. |
-| **JpgImage** |[JpgImage](media-services-mes-schema.md#JpgImage) |Settings for Jpg image. |
-
-## <a name="H264Video"></a> H264Video
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **TwoPass**<br/><br/> minOccurs="0" |**xs:boolean** |Currently, only one-pass encoding is supported. |
-| **KeyFrameInterval**<br/><br/> minOccurs="0"<br/><br/> **default="00:00:02"** |**xs:time** |Determines the fixed spacing between IDR frames in units of seconds. Also referred to as the GOP duration. See **SceneChangeDetection** for controlling whether the encoder can deviate from this value. |
-| **SceneChangeDetection**<br/><br/> minOccurs="0"<br/><br/> default="false" |**xs:boolean** |If set to true, the encoder attempts to detect scene changes in the video and inserts an IDR frame. |
-| **Complexity**<br/><br/> minOccurs="0"<br/><br/> default="Balanced" |**xs:string** |Controls the trade-off between encode speed and video quality. Could be one of the following values: **Speed**, **Balanced**, or **Quality**<br/><br/> Default: **Balanced** |
-| **SyncMode**<br/><br/> minOccurs="0" | |Feature will be exposed in a future release. |
-| **H264Layers**<br/><br/> minOccurs="0" |[H264Layers](media-services-mes-schema.md#H264Layers) |Collection of output video layers. |
-
-### Attributes
-
-| Name | Type | Description |
-| | | |
-| **Condition** |**xs:string** | When the input has no video, you may want to force the encoder to insert a monochrome video track. To do that, use Condition="InsertBlackIfNoVideoBottomLayerOnly" (to insert a video at only the lowest bitrate) or Condition="InsertBlackIfNoVideo" (to insert a video at all output bitrates). For more information, see [this](media-services-advanced-encoding-with-mes.md#no_video) article.|
-
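-For example, the following fragment (a sketch based on the attribute description above) asks the encoder to insert black video at all output bitrates when the input has no video:
-
-```
-<Encoding>
-  <H264Video Condition="InsertBlackIfNoVideo">
-    <!-- H264Layers as in the sample presets -->
-  </H264Video>
-</Encoding>
-```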
-## <a name="H264Layers"></a> H264Layers
-
-By default, if you send an input to the encoder that contains only audio, and no video, the output asset contains files with audio data only. Some players may not be able to handle such output streams. You can use the **InsertBlackIfNoVideo** value of the **H264Video** element's **Condition** attribute to force the encoder to add a video track to the output in that scenario. For more information, see [this](media-services-advanced-encoding-with-mes.md#no_video) article.
-
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **H264Layer**<br/><br/> minOccurs="0" maxOccurs="unbounded" |[H264Layer](media-services-mes-schema.md#H264Layer) |A collection of H264 layers. |
-
-## <a name="H264Layer"></a> H264Layer
-> [!NOTE]
-> Video limits are based on the values described in the [H264 Levels](https://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Levels) table.
->
->
-
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **Profile**<br/><br/> minOccurs="0"<br/><br/> default="Auto" |**xs:string** |Could be one of the following **xs:string** values: **Auto**, **Baseline**, **Main**, **High**. |
-| **Level**<br/><br/> minOccurs="0"<br/><br/> default="Auto" |**xs:string** | |
-| **Bitrate**<br/><br/> minOccurs="0" |**xs:int** |The bitrate used for this video layer, specified in kbps. |
-| **MaxBitrate**<br/><br/> minOccurs="0" |**xs: int** |The maximum bitrate used for this video layer, specified in kbps. |
-| **BufferWindow**<br/><br/> minOccurs="0"<br/><br/> default="00:00:05" |**xs: time** |Length of the video buffer. |
-| **Width**<br/><br/> minOccurs="0" |**xs: int** |Width of the output video frame, in pixels.<br/><br/> Currently, you must specify both Width and Height. The Width and Height need to be even numbers. |
-| **Height**<br/><br/> minOccurs="0" |**xs:int** |Height of the output video frame, in pixels.<br/><br/> Currently, you must specify both Width and Height. The Width and Height need to be even numbers.|
-| **BFrames**<br/><br/> minOccurs="0" |**xs: int** |Number of B frames between reference frames. |
-| **ReferenceFrames**<br/><br/> minOccurs="0"<br/><br/> default="3" |**xs:int** |Number of reference frames in a GOP. |
-| **EntropyMode**<br/><br/> minOccurs="0"<br/><br/> default="Cabac" |**xs:string** |Could be one of the following values: **Cabac** or **Cavlc**. |
-| **FrameRate**<br/><br/> minOccurs="0" |rational number |Determines the frame rate of the output video. Use default of "0/1" to let the encoder use the same frame rate as the input video. Allowed values are expected to be common video frame rates. However, any valid rational is allowed. For example, 1/1 would be 1 fps and is valid.<br/><br/> - 12/1 (12 fps)<br/><br/> - 15/1 (15 fps)<br/><br/> - 24/1 (24 fps)<br/><br/> - 24000/1001 (23.976 fps)<br/><br/> - 25/1 (25 fps)<br/><br/> - 30/1 (30 fps)<br/><br/> - 30000/1001 (29.97 fps) <br/> <br/>**NOTE** If you are creating a custom preset for multiple-bitrate encoding, then all layers of the preset **must** use the same value of FrameRate.|
-| **AdaptiveBFrame**<br/><br/> minOccurs="0" |**xs:boolean** |Specifies whether the encoder may use B frames adaptively (behavior carried over from Azure Media Encoder). |
-| **Slices**<br/><br/> minOccurs="0"<br/><br/> default="0" |**xs:int** |Determines how many slices a frame is divided into. Recommend using default. |
-
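-To illustrate the **FrameRate** note in the table above: in a custom multiple-bitrate preset, every layer must carry the same **FrameRate** value, as in this sketch (the bitrates and resolutions are illustrative):
-
-```
-<H264Layers>
-  <H264Layer>
-    <Bitrate>4500</Bitrate>
-    <Width>1280</Width>
-    <Height>720</Height>
-    <FrameRate>30/1</FrameRate>
-  </H264Layer>
-  <H264Layer>
-    <Bitrate>2200</Bitrate>
-    <Width>848</Width>
-    <Height>480</Height>
-    <FrameRate>30/1</FrameRate>
-  </H264Layer>
-</H264Layers>
-```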
-## <a name="AACAudio"></a> AACAudio
- Contains a sequence of the following elements and groups.
-
- For more information about AAC, see [AAC](https://en.wikipedia.org/wiki/Advanced_Audio_Coding).
-
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **Profile**<br/><br/> minOccurs="0"<br/><br/> default="AACLC" |**xs:string** |Could be one of the following values: **AACLC**, **HEAACV1**, or **HEAACV2**. |
-
-### Attributes
-
-| Name | Type | Description |
-| | | |
-| **Condition** |**xs: string** |To force the encoder to produce an asset that contains a silent audio track when input has no audio, specify the "InsertSilenceIfNoAudio" value.<br/><br/> By default, if you send an input to the encoder that contains only video, and no audio, then the output asset contains files that contain only video data. Some players may not be able to handle such output streams. You can use this setting to force the encoder to add a silent audio track to the output in that scenario. |
-
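-For example, this fragment (a sketch based on the attribute description above) forces a silent stereo AAC track when the input has no audio:
-
-```
-<AACAudio Condition="InsertSilenceIfNoAudio">
-  <Profile>AACLC</Profile>
-  <Channels>2</Channels>
-  <SamplingRate>48000</SamplingRate>
-  <Bitrate>128</Bitrate>
-</AACAudio>
-```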
-### Groups
-
-| Reference | Description |
-| | |
-| [AudioGroup](media-services-mes-schema.md#AudioGroup)<br/><br/> minOccurs="0" |See description of [AudioGroup](media-services-mes-schema.md#AudioGroup) to know the appropriate number of channels, sampling rate, and bit rate that could be set for each profile. |
-
-## <a name="AudioGroup"></a> AudioGroup
-For details about what values are valid for each profile, see the "Audio codec details" table that follows.
-
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **Channels**<br/><br/> minOccurs="0" |**xs: int** |The number of audio channels encoded. The following are valid options: 1, 2, 5, 6, 8.<br/><br/> Default: 2. |
-| **SamplingRate**<br/><br/> minOccurs="0" |**xs: int** |The audio sampling rate, specified in Hz. |
-| **Bitrate**<br/><br/> minOccurs="0" |**xs: int** |The bitrate used when encoding the audio, specified in kbps. |
-
-### Audio codec details
-
-Audio Codec|Details
----|---
-**AACLC** |1:<br/><br/> - 11025: 8 &lt;= bitrate &lt; 16<br/><br/> - 12000: 8 &lt;= bitrate &lt; 16<br/><br/> - 16000: 8 &lt;= bitrate &lt;32<br/><br/>- 22050: 24 &lt;= bitrate &lt; 32<br/><br/> - 24000: 24 &lt;= bitrate &lt; 32<br/><br/> - 32000: 32 &lt;= bitrate &lt;= 192<br/><br/> - 44100: 56 &lt;= bitrate &lt;= 288<br/><br/> - 48000: 56 &lt;= bitrate &lt;= 288<br/><br/> - 88200 : 128 &lt;= bitrate &lt;= 288<br/><br/> - 96000 : 128 &lt;= bitrate &lt;= 288<br/><br/> 2:<br/><br/> - 11025: 16 &lt;= bitrate &lt; 24<br/><br/> - 12000: 16 &lt;= bitrate &lt; 24<br/><br/> - 16000: 16 &lt;= bitrate &lt; 40<br/><br/> - 22050: 32 &lt;= bitrate &lt; 40<br/><br/> - 24000 : 32 &lt;= bitrate &lt; 40<br/><br/> - 32000: 40 &lt;= bitrate &lt;= 384<br/><br/> - 44100: 96 &lt;= bitrate &lt;= 576<br/><br/> - 48000 : 96 &lt;= bitrate &lt;= 576<br/><br/> - 88200: 256 &lt;= bitrate &lt;= 576<br/><br/> - 96000: 256 &lt;= bitrate &lt;= 576<br/><br/> 5/6:<br/><br/> - 32000: 160 &lt;= bitrate &lt;= 896<br/><br/> - 44100: 240 &lt;= bitrate &lt;= 1024<br/><br/> - 48000: 240 &lt;= bitrate &lt;= 1024<br/><br/> - 88200: 640 &lt;= bitrate &lt;= 1024<br/><br/> - 96000: 640 &lt;= bitrate &lt;= 1024<br/><br/> 8:<br/><br/> - 32000 : 224 &lt;= bitrate &lt;= 1024<br/><br/> - 44100 : 384 &lt;= bitrate &lt;= 1024<br/><br/> - 48000: 384 &lt;= bitrate &lt;= 1024<br/><br/> - 88200: 896 &lt;= bitrate &lt;= 1024<br/><br/> - 96000: 896 &lt;= bitrate &lt;= 1024
-**HEAACV1** |1:<br/><br/> - 22050: bitrate = 8<br/><br/> - 24000: 8 &lt;= bitrate &lt;= 10<br/><br/> - 32000: 12 &lt;= bitrate &lt;= 64<br/><br/> - 44100: 20 &lt;= bitrate &lt;= 64<br/><br/> - 48000: 20 &lt;= bitrate &lt;= 64<br/><br/> - 88200: bitrate = 64<br/><br/> 2:<br/><br/> - 32000: 16 &lt;= bitrate &lt;= 128<br/><br/> - 44100: 16 &lt;= bitrate &lt;= 128<br/><br/> - 48000: 16 &lt;= bitrate &lt;= 128<br/><br/> - 88200 : 96 &lt;= bitrate &lt;= 128<br/><br/> - 96000: 96 &lt;= bitrate &lt;= 128<br/><br/> 5/6:<br/><br/> - 32000 : 64 &lt;= bitrate &lt;= 320<br/><br/> - 44100: 64 &lt;= bitrate &lt;= 320<br/><br/> - 48000: 64 &lt;= bitrate &lt;= 320<br/><br/> - 88200 : 256 &lt;= bitrate &lt;= 320<br/><br/> - 96000: 256 &lt;= bitrate &lt;= 320<br/><br/> 8:<br/><br/> - 32000: 96 &lt;= bitrate &lt;= 448<br/><br/> - 44100: 96 &lt;= bitrate &lt;= 448<br/><br/> - 48000: 96 &lt;= bitrate &lt;= 448<br/><br/> - 88200: 384 &lt;= bitrate &lt;= 448<br/><br/> - 96000: 384 &lt;= bitrate &lt;= 448
-**HEAACV2** |2:<br/><br/> - 22050: 8 &lt;= bitrate &lt;= 10<br/><br/> - 24000: 8 &lt;= bitrate &lt;= 10<br/><br/> - 32000: 12 &lt;= bitrate &lt;= 64<br/><br/> - 44100: 20 &lt;= bitrate &lt;= 64<br/><br/> - 48000: 20 &lt;= bitrate &lt;= 64<br/><br/> - 88200: 64 &lt;= bitrate &lt;= 64
-
-## <a name="Clip"></a> Clip
-### Attributes
-
-| Name | Type | Description |
-| | | |
-| **StartTime** |**xs:duration** |Specifies the start time of a presentation. The value of StartTime needs to match the absolute timestamps of the input video. For example, if the first frame of the input video has a timestamp of 12:00:10.000, then StartTime should be at least 12:00:10.000 or greater. |
-| **Duration** |**xs:duration** |Specifies the duration of a presentation (for example, appearance of an overlay in the video). |
-
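-For instance, a clip that starts 10 seconds into the source timeline and lasts 30 seconds might be expressed as follows. This is a sketch that uses the same hh:mm:ss time notation as the StartTime description above; where the **Clip** element sits depends on how your preset uses it:
-
-```
-<Clip StartTime="00:00:10" Duration="00:00:30" />
-```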
-## <a name="Output"></a> Output
-### Attributes
-
-| Name | Type | Description |
-| | | |
-| **FileName** |**xs:string** |The name of the output file.<br/><br/> You can use macros described in the following table to build the output file names. For example:<br/><br/> **"Outputs": [ { "FileName": "{Basename}_{Resolution}_{Bitrate}.mp4", "Format": { "Type": "MP4Format" } } ]** |
-
-### Macros
-
-| Macro | Description |
-| | |
-| **{Basename}** |If you are doing VoD encoding, the {Basename} is the first 32 characters of the AssetFile.Name property of the primary file in the input asset.<br/><br/> If the input asset is a live archive, then the {Basename} is derived from the trackName attributes in the server manifest. If you are submitting a subclip job using the TopBitrate, as in: "<VideoStream\>TopBitrate</VideoStream\>", and the output file contains video, then the {Basename} is the first 32 characters of the trackName of the video layer with the highest bitrate.<br/><br/> If instead you are submitting a subclip job using all of the input bitrates, such as "<VideoStream\>*</VideoStream\>", and the output file contains video, then {Basename} is the first 32 characters of the trackName of the corresponding video layer. |
-| **{Codec}** |Maps to "H264" for video and "AAC" for audio. |
-| **{Bitrate}** |The target video bitrate if the output file contains video and audio, or target audio bitrate if the output file contains audio only. The value used is the bitrate in kbps. |
-| **{Channel}** |Audio channel count if the file contains audio. |
-| **{Width}** |Width of the video, in pixels, in the output file, if the file contains video. |
-| **{Height}** |Height of the video, in pixels, in the output file, if the file contains video. |
-| **{Extension}** |Inherits from the "Type" property for the output file. The output file name has an extension which is one of: "mp4", "ts", "jpg", "png", or "bmp". |
-| **{Index}** |Mandatory for thumbnail. Should only be present once. |
-
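-Putting the macros together: the sample presets in this set all use the output pattern below. Assuming {VideoBitrate} resolves to the target video bitrate in kbps (analogous to the {Bitrate} macro in the table), a 1280x720 layer at 4500 kbps encoded from an input whose primary file is named BigBuckBunny.mp4 would yield a file named BigBuckBunny_1280x720_4500.mp4 (the input name is an invented example):
-
-```
-<Outputs>
-  <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
-    <MP4Format />
-  </Output>
-</Outputs>
-```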
-## <a name="Video"></a> Video (complex type inherits from Codec)
-### Attributes
-
-| Name | Type | Description |
-| | | |
-| **Start** |**xs:string** | |
-| **Step** |**xs:string** | |
-| **Range** |**xs:string** | |
-| **PreserveResolutionAfterRotation** |**xs:boolean** |For detailed explanation, see the following section: [PreserveResolutionAfterRotation](media-services-mes-schema.md#PreserveResolutionAfterRotation) |
-
-### <a name="PreserveResolutionAfterRotation"></a> PreserveResolutionAfterRotation
-It is recommended to use the **PreserveResolutionAfterRotation** flag in combination with resolution values expressed in percentage terms (Width="100%", Height="100%").
-
-By default, the encode resolution settings (Width, Height) in the Media Encoder Standard (MES) presets are targeted at videos with 0-degree rotation. For example, if your input video is 1280x720 with zero-degree rotation, then the default presets ensure that the output has the same resolution.
-
-![MESRoation1](./media/media-services-shemas/media-services-mes-roation1.png)
-
-If the input video has been captured with non-zero rotation (for example, a smartphone or tablet held vertically), then MES by default applies the encode resolution settings (Width, Height) to the input video, and then compensates for the rotation. For example, see the picture that follows. The preset uses Width = "100%", Height = "100%", which MES interprets as requiring the output to be 1280 pixels wide and 720 pixels tall. After rotating the video, it then shrinks the picture to fit into that window, leading to pillar-box areas on the left and right.
-
-![MESRoation2](./media/media-services-shemas/media-services-mes-roation2.png)
-
-Alternatively, you can set the **PreserveResolutionAfterRotation** flag to "true" (the default is "false"). With Width = "100%", Height = "100%", and PreserveResolutionAfterRotation set to "true", an input video that is 1280 pixels wide and 720 pixels tall with 90-degree rotation produces an output with zero-degree rotation that is 720 pixels wide and 1280 pixels tall. See the following picture:
-
-![MESRoation3](./media/media-services-shemas/media-services-mes-roation3.png)
-
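-A rough sketch of how this could look, assuming **H264Video** inherits the **PreserveResolutionAfterRotation** attribute from the **Video** complex type above and that **Width** and **Height** accept the percentage values recommended here; verify the exact placement against the schema:
-
-```
-<H264Video PreserveResolutionAfterRotation="true">
-  <H264Layers>
-    <H264Layer>
-      <Width>100%</Width>
-      <Height>100%</Height>
-    </H264Layer>
-  </H264Layers>
-</H264Video>
-```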
-## <a name="FormatGroup"></a> FormatGroup (group)
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **BmpFormat** |**BmpFormat** | |
-| **PngFormat** |**PngFormat** | |
-| **JpgFormat** |**JpgFormat** | |
-
-## <a name="BmpLayer"></a> BmpLayer
-### Element
-
-| Name | Type | Description |
-| | | |
-| **Width**<br/><br/> minOccurs="0" |**xs:int** | |
-| **Height**<br/><br/> minOccurs="0" |**xs:int** | |
-
-### Attributes
-
-| Name | Type | Description |
-| | | |
-| **Condition** |**xs:string** | |
-
-## <a name="PngLayer"></a> PngLayer
-### Element
-
-| Name | Type | Description |
-| | | |
-| **Width**<br/><br/> minOccurs="0" |**xs:int** | |
-| **Height**<br/><br/> minOccurs="0" |**xs:int** | |
-
-### Attributes
-
-| Name | Type | Description |
-| | | |
-| **Condition** |**xs:string** | |
-
-## <a name="JpgLayer"></a> JpgLayer
-### Element
-
-| Name | Type | Description |
-| | | |
-| **Width**<br/><br/> minOccurs="0" |**xs:int** | |
-| **Height**<br/><br/> minOccurs="0" |**xs:int** | |
-| **Quality**<br/><br/> minOccurs="0" |**xs:int** |Valid values: 1 (worst) to 100 (best) |
-
-### Attributes
-
-| Name | Type | Description |
-| | | |
-| **Condition** |**xs:string** | |
-
-## <a name="PngLayers"></a> PngLayers
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **PngLayer**<br/><br/> minOccurs="0" maxOccurs="unbounded" |[PngLayer](media-services-mes-schema.md#PngLayer) | |
-
-## <a name="BmpLayers"></a> BmpLayers
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **BmpLayer**<br/><br/> minOccurs="0" maxOccurs="unbounded" |[BmpLayer](media-services-mes-schema.md#BmpLayer) | |
-
-## <a name="JpgLayers"></a> JpgLayers
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **JpgLayer**<br/><br/> minOccurs="0" maxOccurs="unbounded" |[JpgLayer](media-services-mes-schema.md#JpgLayer) | |
-
-## <a name="BmpImage"></a> BmpImage (complex type inherits from Video)
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **BmpLayers**<br/><br/> minOccurs="0" |[BmpLayers](media-services-mes-schema.md#BmpLayers) |Bmp layers |
-
-## <a name="JpgImage"></a> JpgImage (complex type inherits from Video)
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **JpgLayers**<br/><br/> minOccurs="0" |[JpgLayers](media-services-mes-schema.md#JpgLayers) |Jpg layers |
-
-## <a name="PngImage"></a> PngImage (complex type inherits from Video)
-### Elements
-
-| Name | Type | Description |
-| | | |
-| **PngLayers**<br/><br/> minOccurs="0" |[PngLayers](media-services-mes-schema.md#PngLayers) |Png layers |
-
-## Examples
-For examples of XML presets built on this schema, see [Task Presets for MES (Media Encoder Standard)](media-services-mes-presets-overview.md).
-
media-services Media Services Output Metadata Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-output-metadata-schema.md
- Title: Azure Media Services output metadata schema | Microsoft Docs
-description: This article gives an overview of Azure Media Services output metadata schema.
-Previously updated: 03/10/2021
-
-# Output Metadata
-## Overview
-An encoding job is associated with an input asset (or assets) on which you want to perform some encoding tasks. For example, encode an MP4 file to H.264 MP4 adaptive bitrate sets; create a thumbnail; create overlays. Upon completion of a task, an output asset is produced. The output asset contains video, audio, thumbnails, etc. The output asset also contains a file with metadata about the output asset. The name of the metadata XML file has the following format: &lt;source_file_name&gt;_manifest.xml (for example, BigBuckBunny_manifest.xml).
-
-Media Services does not pre-emptively scan input Assets to generate metadata. Input metadata is generated only as an artifact when an input Asset is processed in a Job. Hence this artifact is written to the output Asset. Different tools are used to generate metadata for input Assets and output Assets. Therefore, the input metadata has a slightly different schema than the output metadata.
-
-If you want to examine the metadata file, you can create a **SAS** locator and download the file to your local computer.
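-
-For example, the following C# sketch (not part of the original reference) uses the Media Services .NET SDK to create the locator and fetch the manifest. It assumes an authenticated `CloudMediaContext` named `_context`, an `IAsset` named `outputAsset`, and `using` directives for `System`, `System.IO`, `System.Linq`, and `Microsoft.WindowsAzure.MediaServices.Client`:
-
-```csharp
-// Create a read-only SAS locator on the output asset, valid for one hour.
-ILocator sasLocator = _context.Locators.Create(
-    LocatorType.Sas, outputAsset, AccessPermissions.Read, TimeSpan.FromHours(1));
-
-// Find the metadata file by its _manifest.xml suffix.
-IAssetFile manifestFile = outputAsset.AssetFiles.ToList()
-    .First(f => f.Name.EndsWith("_manifest.xml", StringComparison.OrdinalIgnoreCase));
-
-// Combine the locator path and the file name into a downloadable SAS URL.
-var sasUri = new UriBuilder(sasLocator.Path);
-sasUri.Path += "/" + manifestFile.Name;
-Console.WriteLine(sasUri.Uri);
-
-// Or download the file directly to a local folder (the folder must exist).
-manifestFile.Download(Path.Combine(@"C:\temp", manifestFile.Name));
-```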
-
-This article discusses the elements and types of the XML schema on which the output metadata (&lt;source_file_name&gt;_manifest.xml) is based. For information about the file that contains metadata about the input asset, see Input Metadata.
-
-You can find the complete schema code and XML example at the end of this article.
-
-## <a name="AssetFiles"></a> AssetFiles root element
-Collection of AssetFile entries for the encoding job.
-
-### Child elements
-| Name | Description |
-| --- | --- |
-| **AssetFile**<br/><br/> minOccurs="1" maxOccurs="unbounded" |An AssetFile element that is part of the AssetFiles collection. |
-
-## <a name="AssetFile"></a> AssetFile element
-You can find an [XML example](#xml) at the end of this article.
-
-### Attributes
-| Name | Type | Description |
-| --- | --- | --- |
-| **Name**<br/><br/> Required |**xs:string** |The media asset file name. |
-| **Size**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:long** |Size of the asset file in bytes. |
-| **Duration**<br/><br/> Required |**xs:duration** |Content playback duration. |
-
-### Child elements
-| Name | Description |
-| --- | --- |
-| **Sources** |Collection of input/source media files that were processed to produce this AssetFile. For more information, see the Source element. |
-| **VideoTracks**<br/><br/> minOccurs="0" maxOccurs="1" |Each physical AssetFile can contain zero or more video tracks interleaved into an appropriate container format. For more information, see the VideoTracks element. |
-| **AudioTracks**<br/><br/> minOccurs="0" maxOccurs="1" |Each physical AssetFile can contain zero or more audio tracks interleaved into an appropriate container format. This is the collection of all those audio tracks. For more information, see the AudioTracks element. |
-
-## <a name="Sources"></a> Sources element
-Collection of input/source media files that were processed to produce this AssetFile.
-
-You can find an [XML example](#xml) at the end of this article.
-
-### Child elements
-| Name | Description |
-| --- | --- |
-| **Source**<br/><br/> minOccurs="1" maxOccurs="unbounded" |An input/source file used when generating this asset. For more information, see Source element. |
-
-## <a name="Source"></a> Source element
-An input/source file used when generating this asset.
-
-You can find an [XML example](#xml) at the end of this article.
-
-### Attributes
-| Name | Type | Description |
-| --- | --- | --- |
-| **Name**<br/><br/> Required |**xs:string** |Input source file name. |
-
-## <a name="VideoTracks"></a> VideoTracks element
-Each physical AssetFile can contain zero or more video tracks interleaved into an appropriate container format. The **VideoTracks** element represents the collection of all the video tracks.
-
-You can find an [XML example](#xml) at the end of this article.
-
-### Child elements
-| Name | Description |
-| --- | --- |
-| **VideoTrack**<br/><br/> minOccurs="1" maxOccurs="unbounded" |A specific video track in the parent AssetFile. For more information, see VideoTrack element. |
-
-## <a name="VideoTrack"></a> VideoTrack element
-A specific video track in the parent AssetFile.
-
-You can find an [XML example](#xml) at the end of this article.
-
-### Attributes
-| Name | Type | Description |
-| --- | --- | --- |
-| **Id**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Zero-based index of this video track. **Note:** This **Id** is not necessarily the TrackID as used in an MP4 file. |
-| **FourCC**<br/><br/> Required |**xs:string** |Video codec FourCC code. |
-| **Profile** |**xs:string** |H264 profile (only applicable to H264 codec). |
-| **Level** |**xs:string** |H264 level (only applicable to H264 codec). |
-| **Width**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Encoded video width in pixels. |
-| **Height**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Encoded video height in pixels. |
-| **DisplayAspectRatioNumerator**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:double** |Video display aspect ratio numerator. |
-| **DisplayAspectRatioDenominator**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:double** |Video display aspect ratio denominator. |
-| **Framerate**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:decimal** |Measured video frame rate in .3f format. |
-| **TargetFramerate**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:decimal** |Preset target video frame rate in .3f format. |
-| **Bitrate**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Average video bit rate in kilobits per second, as calculated from the AssetFile. Counts only the elementary stream payload, and does not include the packaging overhead. |
-| **TargetBitrate**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Target average bitrate for this video track, as requested via the encoding preset, in kilobits per second. |
-| **MaxGOPBitrate**<br/><br/> minInclusive ="0" |**xs:int** |Max GOP average bitrate for this video track, in kilobits per second. |
-
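-For illustration, the following hedged C# sketch reads a few of these attributes from a downloaded manifest with LINQ to XML (`System.Xml.Linq`); the file path is a placeholder:
-
-```csharp
-// The metadata namespace, as declared in the schema below.
-XNamespace ns = "http://schemas.microsoft.com/windowsazure/mediaservices/2013/05/mediaencoder/metadata";
-XDocument manifest = XDocument.Load(@"C:\temp\BigBuckBunny_manifest.xml");
-
-foreach (XElement track in manifest.Descendants(ns + "VideoTrack"))
-{
-    Console.WriteLine("FourCC={0}, {1}x{2}, {3} kbps (target {4} kbps)",
-        (string)track.Attribute("FourCC"),
-        (int)track.Attribute("Width"),
-        (int)track.Attribute("Height"),
-        (int)track.Attribute("Bitrate"),
-        (int)track.Attribute("TargetBitrate"));
-}
-```
-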
-## <a name="AudioTracks"></a> AudioTracks element
-Each physical AssetFile can contain zero or more audio tracks interleaved into an appropriate container format. The **AudioTracks** element represents the collection of all those audio tracks.
-
-You can find an [XML example](#xml) at the end of this article.
-
-### Child elements
-| Name | Description |
-| --- | --- |
-| **AudioTrack**<br/><br/> minOccurs="1" maxOccurs="unbounded" |A specific audio track in the parent AssetFile. For more information, see AudioTrack element. |
-
-## <a name="AudioTrack"></a> AudioTrack element
-A specific audio track in the parent AssetFile.
-
-You can find an [XML example](#xml) at the end of this article.
-
-### Attributes
-| Name | Type | Description |
-| --- | --- | --- |
-| **Id**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Zero-based index of this audio track. **Note:** This is not necessarily the TrackID as used in an MP4 file. |
-| **Codec** |**xs:string** |Audio track codec string. |
-| **EncoderVersion** |**xs:string** |Optional encoder version string, required for EAC3. |
-| **Channels**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Number of audio channels. |
-| **SamplingRate**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Audio sampling rate in samples/sec or Hz. |
-| **Bitrate**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Average audio bit rate in bits per second, as calculated from the AssetFile. Counts only the elementary stream payload, and does not include the packaging overhead. |
-| **BitsPerSample**<br/><br/> minInclusive ="0"<br/><br/> Required |**xs:int** |Bits per sample for the wFormatTag format type. |
-
-### Child elements
-| Name | Description |
-| --- | --- |
-| **LoudnessMeteringResultParameters**<br/><br/> minOccurs="0" maxOccurs="1" |Loudness metering result parameters. For more information, see LoudnessMeteringResultParameters element. |
-
-## <a name="LoudnessMeteringResultParameters"></a> LoudnessMeteringResultParameters element
-Loudness metering result parameters.
-
-You can find an [XML example](#xml) at the end of this article.
-
-### Attributes
-| Name | Type | Description |
-| --- | --- | --- |
-| **DPLMVersionInformation** |**xs:string** |**Dolby** professional loudness metering development kit version. |
-| **DialogNormalization**<br/><br/> minInclusive="-31" maxInclusive="-1"<br/><br/> Required |**xs:int** |DialogNormalization generated through DPLM, required when LoudnessMetering is set. |
-| **IntegratedLoudness**<br/><br/> minInclusive="-70" maxInclusive="10"<br/><br/> Required |**xs:float** |Integrated loudness. |
-| **IntegratedLoudnessUnit**<br/><br/> Required |**xs:string** |Integrated loudness unit. |
-| **IntegratedLoudnessGatingMethod**<br/><br/> Required |**xs:string** |Gating identifier. |
-| **IntegratedLoudnessSpeechPercentage**<br/><br/> minInclusive ="0" maxInclusive="100" |**xs:float** |Speech content over the program, as a percentage. |
-| **SamplePeak**<br/><br/> Required |**xs:float** |Peak absolute sample value, since reset or since it was last cleared, per channel. Units are dBFS. |
-| **SamplePeakUnit**<br/><br/> fixed="dBFS"<br/><br/> Required |**xs:anySimpleType** |Sample peak unit. |
-| **TruePeak**<br/><br/> Required |**xs:float** |Maximum true peak value, as per ITU-R BS.1770-2, since reset or since it was last cleared, per channel. Units are dBTP. |
-| **TruePeakUnit**<br/><br/> fixed="dBTP"<br/><br/> Required |**xs:anySimpleType** |True peak unit. |
-
-## Schema Code
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata" version="1.2"
- xmlns="http://schemas.microsoft.com/windowsazure/mediaservices/2013/05/mediaencoder/metadata"
- targetNamespace="http://schemas.microsoft.com/windowsazure/mediaservices/2013/05/mediaencoder/metadata"
- elementFormDefault="qualified">
- <xs:element name="AssetFiles">
- <xs:annotation>
- <xs:documentation>Collection of AssetFile entries for the encoding job</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="AssetFile" minOccurs="1" maxOccurs="unbounded">
- <xs:annotation>
- <xs:documentation>asset file</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="Sources">
- <xs:annotation>
- <xs:documentation>Collection of input/source media files, that was processed in order to produce this AssetFile</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="Source" minOccurs="1" maxOccurs="unbounded">
- <xs:annotation>
- <xs:documentation>An input/source file used when generating this asset</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:attribute name="Name" type="xs:string" use="required">
- <xs:annotation>
- <xs:documentation>input source file name</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- </xs:complexType>
- </xs:element>
- </xs:sequence>
- </xs:complexType>
- </xs:element>
- <xs:element name="VideoTracks" minOccurs="0">
- <xs:annotation>
- <xs:documentation>Each physical AssetFile can contain in it zero or more video tracks interleaved into an appropriate container format. This is the collection of all those video tracks</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="VideoTrack" maxOccurs="unbounded">
- <xs:annotation>
- <xs:documentation>A specific video track in the parent AssetFile</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:attribute name="Id" use="required">
- <xs:annotation>
- <xs:documentation>zero-based index of this video track. Note: this is not necessarily the TrackID as used in an MP4 file</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="FourCC" type="xs:string" use="required">
- <xs:annotation>
- <xs:documentation>video codec FourCC code</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Profile" type="xs:string">
- <xs:annotation>
- <xs:documentation>H264 profile (only applicable for H264 codec)</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Level" type="xs:string">
- <xs:annotation>
- <xs:documentation>H264 level (only applicable for H264 codec)</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Width" use="required">
- <xs:annotation>
- <xs:documentation>encoded video width in pixels</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Height" use="required">
- <xs:annotation>
- <xs:documentation>encoded video height in pixels</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="DisplayAspectRatioNumerator" use="required">
- <xs:annotation>
- <xs:documentation>video display aspect ratio numerator</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:double">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="DisplayAspectRatioDenominator" use="required">
- <xs:annotation>
- <xs:documentation>video display aspect ratio denominator</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:double">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Framerate" use="required">
- <xs:annotation>
- <xs:documentation>measured video frame rate in .3f format</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:decimal">
- <xs:minInclusive value="0"/>
- <xs:fractionDigits value="3"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="TargetFramerate" use="required">
- <xs:annotation>
- <xs:documentation>preset target video frame rate in .3f format</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:decimal">
- <xs:minInclusive value="0"/>
- <xs:fractionDigits value="3"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Bitrate" use="required">
- <xs:annotation>
- <xs:documentation>average video bit rate in kilobits per second, as calculated from the AssetFile. Counts only the elementary stream payload, and does not include the packaging overhead</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="TargetBitrate" use="required">
- <xs:annotation>
- <xs:documentation>target average bitrate for this video track, as requested via the encoding preset, in kilobits per second</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="MaxGOPBitrate">
- <xs:annotation>
- <xs:documentation>Max GOP average bitrate for this video track, in kilobits per second</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- </xs:complexType>
- </xs:element>
- </xs:sequence>
- </xs:complexType>
- </xs:element>
- <xs:element name="AudioTracks" minOccurs="0">
- <xs:annotation>
- <xs:documentation>each physical AssetFile can contain in it zero or more audio tracks interleaved into an appropriate container format. This is the collection of all those audio tracks</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="AudioTrack" maxOccurs="unbounded">
- <xs:annotation>
- <xs:documentation>a specific audio track in the parent AssetFile</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:sequence>
- <xs:element name="LoudnessMeteringResultParameters" minOccurs="0" maxOccurs="1">
- <xs:annotation>
- <xs:documentation>Loudness Metering Result Parameters</xs:documentation>
- </xs:annotation>
- <xs:complexType>
- <xs:attribute name="DPLMVersionInformation" type="xs:string">
- <xs:annotation>
- <xs:documentation>Dolby Professional Loudness Metering Development Kit Version</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="DialogNormalization" use="required">
- <xs:annotation>
- <xs:documentation> DialogNormalization generated through DPLM, required when LoudnessMetering is set</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="-31"/>
- <xs:maxInclusive value="-1"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="IntegratedLoudness" use="required">
- <xs:annotation>
- <xs:documentation>Integrated loudness</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:float">
- <xs:minInclusive value="-70"/>
- <xs:maxInclusive value="10"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="IntegratedLoudnessUnit" use="required" type="xs:string">
- </xs:attribute>
- <xs:attribute name="IntegratedLoudnessGatingMethod" use="required" type="xs:string">
- <xs:annotation>
- <xs:documentation>Gating identifier</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="IntegratedLoudnessSpeechPercentage">
- <xs:annotation>
- <xs:documentation>Speech content over the program, as a percentage.</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:float">
- <xs:minInclusive value="0"/>
- <xs:maxInclusive value="100"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="SamplePeak" use="required" type="xs:float">
- <xs:annotation>
- <xs:documentation>Peak absolute sample value, since reset or since it was last cleared, per channel. Units are dBFS.</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="SamplePeakUnit" use="required" fixed="dBFS">
- </xs:attribute>
- <xs:attribute name="TruePeak" use="required" type="xs:float">
- <xs:annotation>
- <xs:documentation>Maximum True Peak value, as per ITU-R BS.1770-2, since reset or since it was last cleared, per channel. Units are dBTP.</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="TruePeakUnit" use="required" fixed="dBTP">
- </xs:attribute>
- </xs:complexType>
- </xs:element>
- </xs:sequence>
- <xs:attribute name="Id" use="required">
- <xs:annotation>
- <xs:documentation>zero-based index of this audio track. Note: this is not necessarily the TrackID as used in an MP4 file</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Codec" type="xs:string">
- <xs:annotation>
- <xs:documentation>audio track codec string</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="EncoderVersion" type="xs:string">
- <xs:annotation>
- <xs:documentation>optional encoder version string, required for EAC3</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Channels" use="required">
- <xs:annotation>
- <xs:documentation>number of audio channels</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="SamplingRate" use="required">
- <xs:annotation>
- <xs:documentation>audio sampling rate in samples/sec or Hz</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Bitrate" use="required">
- <xs:annotation>
- <xs:documentation>average audio bit rate in bits per second, as calculated from the AssetFile. Counts only the elementary stream payload, and does not include the packaging overhead</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="BitsPerSample" use="required">
- <xs:annotation>
- <xs:documentation>Bits per sample for the wFormatTag format type</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:int">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- </xs:complexType>
- </xs:element>
- </xs:sequence>
- </xs:complexType>
- </xs:element>
- </xs:sequence>
- <xs:attribute name="Name" type="xs:string" use="required">
- <xs:annotation>
- <xs:documentation>the media asset file name</xs:documentation>
- </xs:annotation>
- </xs:attribute>
- <xs:attribute name="Size" use="required">
- <xs:annotation>
- <xs:documentation>size of file in bytes</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:long">
- <xs:minInclusive value="0"/>
- </xs:restriction>
- </xs:simpleType>
- </xs:attribute>
- <xs:attribute name="Duration" use="required">
- <xs:annotation>
- <xs:documentation>content play back duration</xs:documentation>
- </xs:annotation>
- <xs:simpleType>
- <xs:restriction base="xs:duration"/>
- </xs:simpleType>
- </xs:attribute>
- </xs:complexType>
- </xs:element>
- </xs:sequence>
- </xs:complexType>
- </xs:element>
-</xs:schema>
-```
-## <a name="xml"></a> XML example
-
-The following XML is an example of the Output metadata file.
-
-```xml
-<AssetFiles xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"
- xmlns="http://schemas.microsoft.com/windowsazure/mediaservices/2013/05/mediaencoder/metadata">
- <AssetFile Name="BigBuckBunny_H264_3400kbps_AAC_und_ch2_96kbps.mp4" Size="4646283" Duration="PT8.4288444S">
- <Sources>
- <Source Name="BigBuckBunny.mp4"/>
- </Sources>
- <VideoTracks>
- <VideoTrack Id="0" FourCC="AVC1" Profile="Main" Level="3.2" Width="1280" Height="720" DisplayAspectRatioNumerator="16" DisplayAspectRatioDenominator="9" Framerate="23.974" TargetFramerate="23.974" Bitrate="4250" TargetBitrate="3400" MaxGOPBitrate="5514"/>
- </VideoTracks>
- <AudioTracks>
- <AudioTrack Id="0" Codec="AacLc" Channels="2" SamplingRate="44100" Bitrate="93" BitsPerSample="16"/>
- </AudioTracks>
- </AssetFile>
- <AssetFile Name="BigBuckBunny_H264_2250kbps_AAC_und_ch2_96kbps.mp4" Size="3166728" Duration="PT8.4288444S">
- <Sources>
- <Source Name="BigBuckBunny.mp4"/>
- </Sources>
- <VideoTracks>
- <VideoTrack Id="0" FourCC="AVC1" Profile="Main" Level="3.1" Width="960" Height="540" DisplayAspectRatioNumerator="16" DisplayAspectRatioDenominator="9" Framerate="23.974" TargetFramerate="23.974" Bitrate="2846" TargetBitrate="2250" MaxGOPBitrate="3630"/>
- </VideoTracks>
- <AudioTracks>
- <AudioTrack Id="0" Codec="AacLc" Channels="2" SamplingRate="44100" Bitrate="93" BitsPerSample="16"/>
- </AudioTracks>
- </AssetFile>
- <AssetFile Name="BigBuckBunny_H264_1500kbps_AAC_und_ch2_96kbps.mp4" Size="2205095" Duration="PT8.4288444S">
- <Sources>
- <Source Name="BigBuckBunny.mp4"/>
- </Sources>
- <VideoTracks>
- <VideoTrack Id="0" FourCC="AVC1" Profile="Main" Level="3.1" Width="960" Height="540" DisplayAspectRatioNumerator="16" DisplayAspectRatioDenominator="9" Framerate="23.974" TargetFramerate="23.974" Bitrate="1932" TargetBitrate="1500" MaxGOPBitrate="2513"/>
- </VideoTracks>
- <AudioTracks>
- <AudioTrack Id="0" Codec="AacLc" Channels="2" SamplingRate="44100" Bitrate="93" BitsPerSample="16"/>
- </AudioTracks>
- </AssetFile>
- <AssetFile Name="BigBuckBunny_H264_1000kbps_AAC_und_ch2_96kbps.mp4" Size="1508567" Duration="PT8.4288444S">
- <Sources>
- <Source Name="BigBuckBunny.mp4"/>
- </Sources>
- <VideoTracks>
- <VideoTrack Id="0" FourCC="AVC1" Profile="Main" Level="3.0" Width="640" Height="360" DisplayAspectRatioNumerator="16" DisplayAspectRatioDenominator="9" Framerate="23.974" TargetFramerate="23.974" Bitrate="1271" TargetBitrate="1000" MaxGOPBitrate="1527"/>
- </VideoTracks>
- <AudioTracks>
- <AudioTrack Id="0" Codec="AacLc" Channels="2" SamplingRate="44100" Bitrate="93" BitsPerSample="16"/>
- </AudioTracks>
- </AssetFile>
- <AssetFile Name="BigBuckBunny_H264_650kbps_AAC_und_ch2_96kbps.mp4" Size="1057155" Duration="PT8.4288444S">
- <Sources>
- <Source Name="BigBuckBunny.mp4"/>
- </Sources>
- <VideoTracks>
- <VideoTrack Id="0" FourCC="AVC1" Profile="Main" Level="3.0" Width="640" Height="360" DisplayAspectRatioNumerator="16" DisplayAspectRatioDenominator="9" Framerate="23.974" TargetFramerate="23.974" Bitrate="843" TargetBitrate="650" MaxGOPBitrate="1086"/>
- </VideoTracks>
- <AudioTracks>
- <AudioTrack Id="0" Codec="AacLc" Channels="2" SamplingRate="44100" Bitrate="93" BitsPerSample="16"/>
- </AudioTracks>
- </AssetFile>
- <AssetFile Name="BigBuckBunny_H264_400kbps_AAC_und_ch2_96kbps.mp4" Size="699262" Duration="PT8.4288444S">
- <Sources>
- <Source Name="BigBuckBunny.mp4"/>
- </Sources>
- <VideoTracks>
- <VideoTrack Id="0" FourCC="AVC1" Profile="Main" Level="1.3" Width="320" Height="180" DisplayAspectRatioNumerator="16" DisplayAspectRatioDenominator="9" Framerate="23.974" TargetFramerate="23.974" Bitrate="503" TargetBitrate="400" MaxGOPBitrate="661"/>
- </VideoTracks>
- <AudioTracks>
- <AudioTrack Id="0" Codec="AacLc" Channels="2" SamplingRate="44100" Bitrate="93" BitsPerSample="16"/>
- </AudioTracks>
- </AssetFile>
- <AssetFile Name="BigBuckBunny_AAC_und_ch2_96kbps.mp4" Size="166780" Duration="PT8.4288444S">
- <Sources>
- <Source Name="BigBuckBunny.mp4"/>
- </Sources>
- <AudioTracks>
- <AudioTrack Id="0" Codec="AacLc" Channels="2" SamplingRate="44100" Bitrate="93" BitsPerSample="16"/>
- </AudioTracks>
- </AssetFile>
- <AssetFile Name="BigBuckBunny_AAC_und_ch2_56kbps.mp4" Size="124576" Duration="PT8.4288444S">
- <Sources>
- <Source Name="BigBuckBunny.mp4"/>
- </Sources>
- <AudioTracks>
- <AudioTrack Id="0" Codec="AacLc" Channels="2" SamplingRate="44100" Bitrate="53" BitsPerSample="16"/>
- </AudioTracks>
- </AssetFile>
-</AssetFiles>
-```
-
media-services Media Services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-overview.md
- Title: Azure Media Services overview | Microsoft Docs
-description: Microsoft Azure Media Services is an extensible cloud-based platform that enables developers to build scalable media management and delivery applications. This article gives an overview of Azure Media Services.
-Previously updated: 03/10/2021
-# Azure Media Services overview
-Microsoft Azure Media Services (AMS) is an extensible cloud-based platform that enables developers to build scalable media management and delivery applications. Media Services is based on REST APIs that enable you to securely upload, store, encode, and package video or audio content for both on-demand and live streaming delivery to various clients (for example, TV, PC, and mobile devices).
-
-You can build end-to-end workflows entirely with Media Services, or you can use third-party components for parts of your workflow. For example, you might encode with a third-party encoder and then upload, protect, package, and deliver with Media Services. You can choose to stream your content live or deliver content on demand.
-## Compliance, Privacy and Security
-
-As an important reminder, you must comply with all applicable laws in your use of Azure Media Services, and you may not use Media Services or any Azure service in a manner that violates the rights of others, or that may be harmful to others.
-
-Before uploading any video/image to Media Services, you must have all the proper rights to use the video/image, including, where required by law, all the necessary consents from individuals (if any) in the video/image, for the use, processing, and storage of their data in Media Services and Azure. Some jurisdictions may impose special legal requirements for the collection, online processing, and storage of certain categories of data, such as biometric data. Before using Media Services and Azure for the processing and storage of any data subject to special legal requirements, you must ensure compliance with any such legal requirements that may apply to you.
-
-To learn about compliance, privacy and security in Media Services please visit the Microsoft [Trust Center](https://www.microsoft.com/trust-center/?rtc=1). For Microsoft's privacy obligations, data handling and retention practices, including how to delete your data, please review Microsoft's [Privacy Statement](https://privacy.microsoft.com/PrivacyStatement), the [Online Services Terms](https://www.microsoft.com/licensing/product-licensing/products?rtc=1) ("OST") and [Data Processing Addendum](https://www.microsoftvolumelicensing.com/DocumentSearch.aspx?Mode=3&DocumentTypeId=67) ("DPA"). By using Media Services, you agree to be bound by the OST, DPA and the Privacy Statement.
-
-## Prerequisites
-
-To start using Azure Media Services, you should have the following:
-
-* An Azure account. If you don't have an account, you can create a free trial account in just a couple of minutes. For details, see [Azure Free Trial](https://azure.microsoft.com).
-* An Azure Media Services account. For more information, see [Create Account](media-services-portal-create-account.md).
-* (Optional) Set up development environment. Choose .NET or REST API for your development environment. For more information, see [Set up environment](media-services-dotnet-how-to-use.md).
-
- Also, learn how to [connect programmatically to AMS API](media-services-use-aad-auth-to-access-ams-api.md).
-* A standard or premium streaming endpoint in the started state. For more information, see [Managing streaming endpoints](media-services-portal-manage-streaming-endpoints.md).
-
-## SDKs and tools
-
-To build Media Services solutions, you can use:
-
-* [Media Services REST API](/rest/api/media/operations/azure-media-services-rest-api-reference)
-* One of the available client SDKs:
- * Azure Media Services SDK for .NET
-
- * [NuGet package](https://www.nuget.org/packages/windowsazure.mediaservices/)
- * [GitHub source code](https://github.com/Azure/azure-sdk-for-media-services)
- * [Azure SDK for Java](https://github.com/Azure/azure-sdk-for-java)
- * [Azure PHP SDK](https://github.com/Azure/azure-sdk-for-php)
- * [Azure Media Services for Node.js](https://github.com/michelle-becker/node-ams-sdk/blob/master/lib/request.js) (This is a non-Microsoft version of a Node.js SDK. It is maintained by the community and currently does not have 100% coverage of the AMS APIs.)
-* Existing tools:
- * [Azure portal](https://portal.azure.com/)
- * [Azure-Media-Services-Explorer](https://github.com/Azure/Azure-Media-Services-Explorer) (Azure Media Services Explorer (AMSE) is a Winforms/C# application for Windows)
-
-> [!NOTE]
-> To get the latest version of Java SDK and get started developing with Java, see [Get started with the Java client SDK for Media Services](./media-services-java-how-to-use.md). <br/>
> To download the latest PHP SDK for Media Services, look for version 0.5.7 of the microsoft/windowsazure package in the [Packagist repository](https://packagist.org/packages/microsoft/windowsazure#v0.5.7).
-
-## Code samples
-
-Find multiple code samples in the **Azure Code Samples** gallery: [Azure Media Services code samples](https://azure.microsoft.com/resources/samples/?service=media-services&sort=0).
-
-## Concepts
-
-For Azure Media Services concepts, see [Concepts](media-services-concepts.md).
-
-## Supported scenarios and availability of Media Services across data centers
-
-For more information about Azure common scenarios, see [AMS scenarios](scenarios-and-availability.md).
-For information about regional availability, see [Media service availability](availability-regions-v-2.md).
-
-## Service Level Agreement (SLA)
-
-For more information, see [Microsoft Azure SLA](https://azure.microsoft.com/support/legal/sla/).
-
-## Support
-
-[Azure Support](https://azure.microsoft.com/support/options/) provides support options for Azure, including Media Services.
-
media-services Media Services Pass Authentication Tokens https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-pass-authentication-tokens.md
- Title: Pass authentication tokens to Azure Media Services | Microsoft Docs
-description: Learn how to send authentication tokens from the client to the Azure Media Services key delivery service
-
-keywords: content protection, DRM, token authentication
-Previously updated: 03/22/2021
-# Learn how clients pass tokens to the Azure Media Services key delivery service
-Customers often ask how a player can pass tokens to the Azure Media Services key delivery service for verification so the player can obtain the key. Media Services supports the simple web token (SWT) and JSON Web Token (JWT) formats. Token authentication is applied to any type of key, regardless of whether you use common encryption or Advanced Encryption Standard (AES) envelope encryption in the system.
-
- Depending on the player and platform you target, you can pass the token with your player in the following ways:
-
-- Through the HTTP Authorization header.
- > [!NOTE]
- > The "Bearer" prefix is expected per the OAuth 2.0 specs. To set the video source, choose **AES (JWT Token)** or **AES (SWT Token)**. The token is passed via the Authorization header.
-
-- Via the addition of a URL query parameter with "token=tokenvalue."
- > [!NOTE]
- > The "Bearer" prefix isn't expected. Because the token is sent through a URL, you need to armor the token string. Here is a C# sample code that shows how to do it:
-
- ```csharp
- string armoredAuthToken = System.Web.HttpUtility.UrlEncode(authToken);
- string uriWithTokenParameter = string.Format("{0}&token={1}", keyDeliveryServiceUri.AbsoluteUri, armoredAuthToken);
- Uri keyDeliveryUrlWithTokenParameter = new Uri(uriWithTokenParameter);
- ```
-
-- Through the CustomData field.
-This option is used for PlayReady license acquisition only, through the CustomData field of the PlayReady License Acquisition Challenge. In this case, the token must be inside the XML document as described here:
-
- ```xml
- <?xml version="1.0"?>
- <CustomData xmlns="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyCustomData/v1">
- <Token></Token>
- </CustomData>
- ```
- Put your authentication token in the Token element.
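-
- A hedged C# sketch of building this document with LINQ to XML (`System.Xml.Linq`), where `authToken` is assumed to be the token string issued by your STS:
-
- ```csharp
- XNamespace ns = "http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyCustomData/v1";
- var customData = new XDocument(
-     new XDeclaration("1.0", "utf-8", null),
-     new XElement(ns + "CustomData",
-         new XElement(ns + "Token", authToken)));
-
- // Serialize the document for the license acquisition challenge.
- string customDataXml = customData.ToString();
- ```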
-
media-services Media Services Playback Content With Existing Players https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-playback-content-with-existing-players.md
- Title: Use existing players to playback your content - Azure | Microsoft Docs
-description: This article lists existing players that you can use to playback your content.
-Previously updated: 03/10/2021
-# Playing your content with existing players
-Azure Media Services supports many popular streaming formats, such as Smooth Streaming, HTTP Live Streaming (HLS), and MPEG-DASH. This topic points you to existing players that you can use to test your streams.
-
-## The Azure portal Media Services content player
-
-The **Azure** portal provides a content player that you can use to test your video.
-
-Click on the desired video (make sure it was [published](media-services-portal-publish.md)) and click the **Play** button at the bottom of the portal.
-
-Some considerations apply:
-
-* The **MEDIA SERVICES CONTENT PLAYER** plays from the default streaming endpoint. If you want to play from a non-default streaming endpoint, use another player. For example, [Azure Media Player](https://aka.ms/azuremediaplayer).
-
-### Azure Media Player
-
-Use [Azure Media Player](https://aka.ms/azuremediaplayer) to playback your content (clear or protected) in any of the following formats:
-
-* Smooth Streaming
-* MPEG DASH
-* HLS
-* Progressive MP4
-
-### Flash Player
-
-#### PlayReady with Token
-
-[http://reference.dashif.org/dash.js/nightly/samples/dash-if-reference-player/index.html](http://reference.dashif.org/dash.js/nightly/samples/dash-if-reference-player/index.html)
-
-### DASH Players
-
-[DASH player](http://players.akamai.com/players/dashjs)
-
-[https://dashif.org](https://dashif.org)
-
-### Other
-
-To test HLS URLs you can also use:
-
-* **Safari** on an iOS device or
-* **3ivx HLS Player** on Windows.
-
media-services Media Services Playready License Template Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-playready-license-template-overview.md
- Title: Media Services PlayReady license template overview
-description: This topic gives an overview of a PlayReady license template that is used to configure PlayReady licenses.
-Previously updated: 03/10/2021
-# Media Services PlayReady license template overview
-Azure Media Services now provides a service for delivering PlayReady licenses. When the player (for example, Silverlight) tries to play your PlayReady-protected content, a request is sent to the license delivery service to obtain a license. If the license service approves the request, it issues the license that is sent to the client and is used to decrypt and play the specified content.
-
-Media Services also provides APIs that you can use to configure your PlayReady licenses. Licenses contain the rights and restrictions that you want the PlayReady digital rights management (DRM) runtime to enforce when a user tries to play back protected content.
-Here are some examples of PlayReady license restrictions that you can specify:
-
-* The date and time from which the license is valid.
-* The DateTime value when the license expires.
-* Whether the license can be saved in persistent storage on the client. Persistent licenses are typically used to allow offline playback of the content.
-* The minimum security level that a player must have to play your content.
-* The output protection level of the output controls for audio/video content.
-* For more information, see the "Output Controls" section (3.5) in the [PlayReady Compliance Rules](https://www.microsoft.com/playready/licensing/compliance/) document.
-
-> [!NOTE]
-> Currently, you can only configure the PlayRight of the PlayReady license. This right is required. The PlayRight gives the client the ability to play back the content. You also can use the PlayRight to configure restrictions specific to playback. For more information, see [PlayReadyPlayRight](media-services-playready-license-template-overview.md#PlayReadyPlayRight).
->
->
-
-To configure PlayReady licenses by using Media Services, you must configure the Media Services PlayReady license template. The template is defined in XML.
-
-The following example shows the simplest (and most common) template that configures a basic streaming license. With this license, your clients can play back your PlayReady-protected content.
-
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<PlayReadyLicenseResponseTemplate xmlns:i="http://www.w3.org/2001/XMLSchema-instance"
- xmlns="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1">
- <LicenseTemplates>
- <PlayReadyLicenseTemplate>
- <ContentKey i:type="ContentEncryptionKeyFromHeader" />
- <PlayRight />
- </PlayReadyLicenseTemplate>
- </LicenseTemplates>
-</PlayReadyLicenseResponseTemplate>
-```
-
-The XML conforms to the PlayReady license template XML schema defined in the "PlayReady license template XML schema" section.
-
-Media Services also defines a set of .NET classes that can be used to serialize and deserialize to and from the XML. For a description of the main classes, see the [Media Services .NET classes](media-services-playready-license-template-overview.md#classes) that are used to configure license templates.
-
-For an end-to-end example that uses .NET classes to configure the PlayReady license template, see [Use PlayReady dynamic encryption and the license delivery service](media-services-protect-with-playready-widevine.md).
-
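-As a hedged illustration, the following sketch produces the same basic template shown earlier by using those classes. It assumes a reference to the Media Services .NET SDK and a `using` directive for `Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization`:
-
-```csharp
-var responseTemplate = new PlayReadyLicenseResponseTemplate();
-
-var licenseTemplate = new PlayReadyLicenseTemplate();
-licenseTemplate.LicenseType = PlayReadyLicenseType.Nonpersistent;
-licenseTemplate.ContentKey = new ContentEncryptionKeyFromHeader();
-responseTemplate.LicenseTemplates.Add(licenseTemplate);
-
-// Serialize to the XML format shown earlier. The template's default PlayRight
-// grants basic playback.
-string licenseTemplateXml = MediaServicesLicenseTemplateSerializer.Serialize(responseTemplate);
-```
-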
-## <a id="classes"></a>Media Services .NET classes that are used to configure license templates
-The following classes are the main .NET classes that are used to configure Media Services PlayReady license templates. These classes map to the types defined in [PlayReady license template XML schema](media-services-playready-license-template-overview.md#schema).
-
-The [MediaServicesLicenseTemplateSerializer](/dotnet/api/microsoft.windowsazure.mediaservices.client.contentkeyauthorization.mediaserviceslicensetemplateserializer#microsoft_windowsazure_mediaservices_client_contentkeyauthorization_mediaserviceslicensetemplateserializer) class is used to serialize and deserialize to and from the Media Services license template XML.
-
-### PlayReadyLicenseResponseTemplate
-[PlayReadyLicenseResponseTemplate](/dotnet/api/microsoft.windowsazure.mediaservices.client.contentkeyauthorization.playreadylicenseresponsetemplate#microsoft_windowsazure_mediaservices_client_contentkeyauthorization_playreadylicenseresponsetemplate): This class represents the template for the response sent back to the user. It contains a field for a custom data string between the license server and the application (which might be useful for custom app logic). It also contains a list of one or more license templates.
-
-As the "top-level" class in the template hierarchy, the response template includes a list of license templates. The license templates include (directly or indirectly) all the other classes that make up the template data to be serialized.
-
-### PlayReadyLicenseTemplate
-[PlayReadyLicenseTemplate](/dotnet/api/microsoft.windowsazure.mediaservices.client.contentkeyauthorization.playreadylicensetemplate#microsoft_windowsazure_mediaservices_client_contentkeyauthorization_playreadylicensetemplate): This class represents a license template that is used to create PlayReady licenses to be returned to users. It contains the data on the content key in the license. It also includes any rights or restrictions that the PlayReady DRM runtime must enforce when the content key is used.
-
-### <a id="PlayReadyPlayRight"></a>PlayReadyPlayRight
-[PlayReadyPlayRight](/dotnet/api/microsoft.windowsazure.mediaservices.client.contentkeyauthorization.playreadyplayright#microsoft_windowsazure_mediaservices_client_contentkeyauthorization_playreadyplayright): This class represents the PlayRight of a PlayReady license. It grants the user the ability to play back the content subject to any restrictions configured in the license and on the PlayRight itself (for playback-specific policy). Much of the policy on a PlayRight concerns output restrictions that control the types of outputs that the content can be played over. It also includes any restrictions that must be put in place when a given output is used. For example, if DigitalVideoOnlyContentRestriction is enabled, the DRM runtime only allows the video to be displayed over digital outputs. (Analog video outputs aren't allowed to pass the content.)
-
-> [!IMPORTANT]
-> These types of restrictions can be powerful, but they also can affect the consumer experience. If the output protections are too restrictive, the content might be unplayable on some clients. For more information, see the [PlayReady Compliance Rules](https://www.microsoft.com/playready/licensing/compliance/).
->
->
-
-For an example of the protection levels that Silverlight supports, see [Silverlight support for output protections](/previous-versions/windows/silverlight/dotnet-windows-silverlight/cc838192(v=vs.95)).
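-
-Continuing the earlier sketch, restrictions of this kind are set on the PlayRight. The following is a hypothetical example; the property names and enumeration values come from the schema below:
-
-```csharp
-PlayReadyPlayRight playRight = licenseTemplate.PlayRight;
-
-// Allow the video to be displayed over digital outputs only.
-playRight.DigitalVideoOnlyContentRestriction = true;
-
-// Don't pass video content to outputs that PlayReady doesn't recognize.
-playRight.AllowPassingVideoContentToUnknownOutput = UnknownOutputPassingOption.NotAllowed;
-
-// Expire the license 48 hours after playback first begins.
-playRight.FirstPlayExpiration = TimeSpan.FromHours(48);
-```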
-
-## <a id="schema"></a>PlayReady license template XML schema
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<xs:schema xmlns:tns="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1" xmlns:ser="http://schemas.microsoft.com/2003/10/Serialization/" elementFormDefault="qualified" targetNamespace="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1" xmlns:xs="http://www.w3.org/2001/XMLSchema">
- <xs:import namespace="http://schemas.microsoft.com/2003/10/Serialization/" />
- <xs:complexType name="AgcAndColorStripeRestriction">
- <xs:sequence>
- <xs:element minOccurs="0" name="ConfigurationData" type="xs:unsignedByte" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="AgcAndColorStripeRestriction" nillable="true" type="tns:AgcAndColorStripeRestriction" />
- <xs:simpleType name="ContentType">
- <xs:restriction base="xs:string">
- <xs:enumeration value="Unspecified" />
- <xs:enumeration value="UltravioletDownload" />
- <xs:enumeration value="UltravioletStreaming" />
- </xs:restriction>
- </xs:simpleType>
- <xs:element name="ContentType" nillable="true" type="tns:ContentType" />
- <xs:complexType name="ExplicitAnalogTelevisionRestriction">
- <xs:sequence>
- <xs:element minOccurs="0" name="BestEffort" type="xs:boolean" />
- <xs:element minOccurs="0" name="ConfigurationData" type="xs:unsignedByte" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ExplicitAnalogTelevisionRestriction" nillable="true" type="tns:ExplicitAnalogTelevisionRestriction" />
- <xs:complexType name="PlayReadyContentKey">
- <xs:sequence />
- </xs:complexType>
- <xs:element name="PlayReadyContentKey" nillable="true" type="tns:PlayReadyContentKey" />
- <xs:complexType name="ContentEncryptionKeyFromHeader">
- <xs:complexContent mixed="false">
- <xs:extension base="tns:PlayReadyContentKey">
- <xs:sequence />
- </xs:extension>
- </xs:complexContent>
- </xs:complexType>
- <xs:element name="ContentEncryptionKeyFromHeader" nillable="true" type="tns:ContentEncryptionKeyFromHeader" />
- <xs:complexType name="ContentEncryptionKeyFromKeyIdentifier">
- <xs:complexContent mixed="false">
- <xs:extension base="tns:PlayReadyContentKey">
- <xs:sequence>
- <xs:element minOccurs="0" name="KeyIdentifier" type="ser:guid" />
- </xs:sequence>
- </xs:extension>
- </xs:complexContent>
- </xs:complexType>
- <xs:element name="ContentEncryptionKeyFromKeyIdentifier" nillable="true" type="tns:ContentEncryptionKeyFromKeyIdentifier" />
- <xs:complexType name="PlayReadyLicenseResponseTemplate">
- <xs:sequence>
- <xs:element name="LicenseTemplates" nillable="true" type="tns:ArrayOfPlayReadyLicenseTemplate" />
- <xs:element minOccurs="0" name="ResponseCustomData" nillable="true" type="xs:string">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- </xs:sequence>
- </xs:complexType>
- <xs:element name="PlayReadyLicenseResponseTemplate" nillable="true" type="tns:PlayReadyLicenseResponseTemplate" />
- <xs:complexType name="ArrayOfPlayReadyLicenseTemplate">
- <xs:sequence>
- <xs:element minOccurs="0" maxOccurs="unbounded" name="PlayReadyLicenseTemplate" nillable="true" type="tns:PlayReadyLicenseTemplate" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ArrayOfPlayReadyLicenseTemplate" nillable="true" type="tns:ArrayOfPlayReadyLicenseTemplate" />
- <xs:complexType name="PlayReadyLicenseTemplate">
- <xs:sequence>
- <xs:element minOccurs="0" name="AllowTestDevices" type="xs:boolean" />
- <xs:element minOccurs="0" name="BeginDate" nillable="true" type="xs:dateTime">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element name="ContentKey" nillable="true" type="tns:PlayReadyContentKey" />
- <xs:element minOccurs="0" name="ContentType" type="tns:ContentType">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ExpirationDate" nillable="true" type="xs:dateTime">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="GracePeriod" nillable="true" type="ser:duration">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="LicenseType" type="tns:PlayReadyLicenseType" />
- <xs:element minOccurs="0" name="PlayRight" nillable="true" type="tns:PlayReadyPlayRight" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="PlayReadyLicenseTemplate" nillable="true" type="tns:PlayReadyLicenseTemplate" />
- <xs:simpleType name="PlayReadyLicenseType">
- <xs:restriction base="xs:string">
- <xs:enumeration value="Nonpersistent" />
- <xs:enumeration value="Persistent" />
- </xs:restriction>
- </xs:simpleType>
- <xs:element name="PlayReadyLicenseType" nillable="true" type="tns:PlayReadyLicenseType" />
- <xs:complexType name="PlayReadyPlayRight">
- <xs:sequence>
- <xs:element minOccurs="0" name="AgcAndColorStripeRestriction" nillable="true" type="tns:AgcAndColorStripeRestriction">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="AllowPassingVideoContentToUnknownOutput" type="tns:UnknownOutputPassingOption">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="AnalogVideoOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="CompressedDigitalAudioOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="CompressedDigitalVideoOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="DigitalVideoOnlyContentRestriction" type="xs:boolean">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ExplicitAnalogTelevisionOutputRestriction" nillable="true" type="tns:ExplicitAnalogTelevisionRestriction">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="FirstPlayExpiration" nillable="true" type="ser:duration">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ImageConstraintForAnalogComponentVideoRestriction" type="xs:boolean">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ImageConstraintForAnalogComputerMonitorRestriction" type="xs:boolean">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="ScmsRestriction" nillable="true" type="tns:ScmsRestriction">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="UncompressedDigitalAudioOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- <xs:element minOccurs="0" name="UncompressedDigitalVideoOpl" nillable="true" type="xs:int">
- <xs:annotation>
- <xs:appinfo>
- <DefaultValue EmitDefaultValue="false" xmlns="http://schemas.microsoft.com/2003/10/Serialization/" />
- </xs:appinfo>
- </xs:annotation>
- </xs:element>
- </xs:sequence>
- </xs:complexType>
- <xs:element name="PlayReadyPlayRight" nillable="true" type="tns:PlayReadyPlayRight" />
- <xs:simpleType name="UnknownOutputPassingOption">
- <xs:restriction base="xs:string">
- <xs:enumeration value="NotAllowed" />
- <xs:enumeration value="Allowed" />
- <xs:enumeration value="AllowedWithVideoConstriction" />
- </xs:restriction>
- </xs:simpleType>
- <xs:element name="UnknownOutputPassingOption" nillable="true" type="tns:UnknownOutputPassingOption" />
- <xs:complexType name="ScmsRestriction">
- <xs:sequence>
- <xs:element minOccurs="0" name="ConfigurationData" type="xs:unsignedByte" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ScmsRestriction" nillable="true" type="tns:ScmsRestriction" />
-</xs:schema>
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
media-services Media Services Portal Check Job Progress https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-check-job-progress.md
- Title: Monitor encoding job progress with the Azure portal
-description: This tutorial walks you through the steps of monitoring your job progress using the Azure portal.
-Previously updated: 03/10/2021
-# Monitor encoding job progress with the Azure portal
-## Overview
-
-When you run jobs, you often require a way to track job progress.
-
-To monitor the progress of the encoding job, click **Settings** (at the top of the page) and then select **Jobs**.
-
-![Screenshot that shows "Jobs" selected from the "Settings" menu.](./media/media-services-portal-vod-get-started/media-services-jobs.png)
-
-You can click the job to see more details.
-
-![Jobs](./media/media-services-portal-vod-get-started/media-services-job-progress2.png)
-
-## Next steps
-After your encoding job is done, you can publish and play your assets, as described [here](media-services-portal-publish.md).
-
media-services Media Services Portal Configure Content Key Auth Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-configure-content-key-auth-policy.md
- Title: Configure a content key authorization policy by using the Azure portal | Microsoft Docs
-description: This article demonstrates how to configure an authorization policy for a content key.
-Previously updated: 03/10/2021
-# Configure a content key authorization policy
-## Overview
- You can use Azure Media Services to deliver MPEG-DASH, Smooth Streaming, and HTTP Live Streaming (HLS) streams protected with Advanced Encryption Standard (AES) by using 128-bit encryption keys or [PlayReady digital rights management (DRM)](https://www.microsoft.com/playready/overview/). With Media Services, you also can deliver DASH streams encrypted with Widevine DRM. Both PlayReady and Widevine are encrypted per the common encryption (ISO/IEC 23001-7 CENC) specification.
-
-Media Services also provides a key/license delivery service from which clients can obtain AES keys or PlayReady/Widevine licenses to play the encrypted content.
-
-This article shows how to use the Azure portal to configure the content key authorization policy. The key can later be used to dynamically encrypt your content. Currently, you can encrypt HLS, MPEG-DASH, and Smooth Streaming formats. You can't encrypt progressive downloads.
-
-When a player requests a stream that is set to be dynamically encrypted, Media Services uses the configured key to dynamically encrypt your content by using AES or DRM encryption. To decrypt the stream, the player requests the key from the key delivery service. To determine whether the user is authorized to get the key, the service evaluates the authorization policies that you specified for the key.
-
-If you plan to have multiple content keys or want to specify a key/license delivery service URL other than the Media Services key delivery service, use the Media Services .NET SDK or REST APIs. For more information, see:
-
-* [Configure a content key authorization policy by using the Media Services .NET SDK](media-services-dotnet-configure-content-key-auth-policy.md)
-* [Configure a content key authorization policy by using the Media Services REST API](media-services-rest-configure-content-key-auth-policy.md)
-
-### Some considerations apply
-* When your Media Services account is created, a default streaming endpoint is added to your account in the "Stopped" state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, your streaming endpoint must be in the "Running" state.
-* Your asset must contain a set of adaptive bitrate MP4s or adaptive bitrate Smooth Streaming files. For more information, see [Encode an asset](media-services-encode-asset.md).
-* The key delivery service caches ContentKeyAuthorizationPolicy and its related objects (policy options and restrictions) for 15 minutes. You can create a ContentKeyAuthorizationPolicy with a token restriction, test it, and then update the policy to the open restriction. It takes roughly 15 minutes before the policy switches to the open version.
-* A Media Services streaming endpoint sets the value of the CORS Access-Control-Allow-Origin header in the preflight response as the wildcard "\*". This value works well with most players, including Azure Media Player, Roku, JWPlayer, and others. However, some players that use dash.js don't work because, with credentials mode set to "include", XMLHttpRequest in their dash.js doesn't allow the wildcard "\*" as the value of Access-Control-Allow-Origin. As a workaround to this limitation in dash.js, if you host your client from a single domain, Media Services can specify that domain in the preflight response header. For assistance, open a support ticket through the Azure portal.
-
-## Configure the key authorization policy
-To configure the key authorization policy, select the **CONTENT PROTECTION** page.
-
-Media Services supports multiple ways to authenticate users who make key requests. The content key authorization policy can have open, token, or IP authorization restrictions. (IP can be configured with REST or the .NET SDK.)
-
-### Open restriction
-The open restriction means the system delivers the key to anyone who makes a key request. This restriction might be useful for testing purposes.
-
-![OpenPolicy][open_policy]
-
-### Token restriction
-To choose the token restricted policy, select the **TOKEN** button.
-
-The token restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the simple web token ([SWT](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_2)) and JSON Web Token (JWT) formats. For more information, see [JWT authentication](http://www.gtrifonov.com/2015/01/03/jwt-token-authentication-in-azure-media-services-and-dynamic-encryption/).
-
-Media Services doesn't provide an STS. You can create a custom STS to issue tokens. The STS must be configured to create a token signed with the specified key and issue claims that you specified in the token restriction configuration. If the token is valid and the claims in the token match those configured for the content key, the Media Services key delivery service returns the encryption key to the client.
-
-When you configure the token-restricted policy, you must specify the primary verification key, issuer, and audience parameters. The primary verification key contains the key that the token was signed with. The issuer is the STS that issues the token. The audience (sometimes called scope) describes the intent of the token or the resource the token authorizes access to. The Media Services key delivery service validates that these values in the token match the values in the template.
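-
-As a rough illustration of the issuing side, the following sketch uses the PyJWT library to mint a token like the one an STS would produce. The signing key, issuer, and audience values are hypothetical placeholders; they must match whatever you configure in the token restriction.
-
-```python
-# Minimal sketch of an STS issuing a token for Media Services key delivery.
-# The key, issuer, and audience below are hypothetical placeholders.
-import datetime
-
-import jwt  # PyJWT
-
-# In practice this is the base64-decoded primary verification key
-# configured in the token restriction.
-signing_key = b"placeholder-shared-secret"
-
-claims = {
-    "iss": "https://sts.contoso.com",  # issuer configured in the policy
-    "aud": "urn:media-services-test",  # audience (scope) in the policy
-    "exp": datetime.datetime.now(datetime.timezone.utc)
-    + datetime.timedelta(minutes=10),
-}
-
-token = jwt.encode(claims, signing_key, algorithm="HS256")
-print(token)  # the client presents this as "Bearer <token>" to key delivery
-```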
-
-### PlayReady
-When you protect your content with PlayReady, one of the things you need to specify in your authorization policy is an XML string that defines the PlayReady license template. By default, the following policy is set:
-
-```xml
-<PlayReadyLicenseResponseTemplate xmlns:i="https://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1">
- <LicenseTemplates>
- <PlayReadyLicenseTemplate><AllowTestDevices>true</AllowTestDevices>
- <ContentKey i:type="ContentEncryptionKeyFromHeader" />
- <LicenseType>Nonpersistent</LicenseType>
- <PlayRight>
- <AllowPassingVideoContentToUnknownOutput>Allowed</AllowPassingVideoContentToUnknownOutput>
- </PlayRight>
- </PlayReadyLicenseTemplate>
- </LicenseTemplates>
-</PlayReadyLicenseResponseTemplate>
-```
-
-You can select the **import policy xml** button and provide a different XML that conforms to the XML schema defined in the [Media Services PlayReady license template overview](media-services-playready-license-template-overview.md).
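-
-If you keep the license template XML outside the portal, you can adjust it programmatically before importing it. The sketch below is illustrative only and uses Python's standard library; the file name is a hypothetical placeholder.
-
-```python
-# Illustrative sketch: toggle AllowTestDevices in a PlayReady license
-# template before importing it in the portal. The file name is hypothetical.
-import xml.etree.ElementTree as ET
-
-PR_NS = "http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1"
-ET.register_namespace("", PR_NS)
-ET.register_namespace("i", "https://www.w3.org/2001/XMLSchema-instance")
-
-tree = ET.parse("playready-template.xml")
-allow = tree.getroot().find(f".//{{{PR_NS}}}AllowTestDevices")
-allow.text = "false"  # disallow test devices before going to production
-
-tree.write("playready-template.xml", xml_declaration=True, encoding="utf-8")
-```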
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-
-[open_policy]: ./media/media-services-portal-configure-content-key-auth-policy/media-services-protect-content-with-open-restriction.png
-[token_policy]: ./media/media-services-key-authorization-policy/media-services-protect-content-with-token-restriction.png
media-services Media Services Portal Create Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-create-account.md
- Title: Create an Azure Media Services account with the Azure portal | Microsoft Docs
-description: This tutorial walks you through the steps of creating an Azure Media Services account with the Azure portal.
-Previously updated: 03/10/2021
-# Create a Media Services account using the Azure portal
---
-The Azure portal provides a way to quickly create an Azure Media Services (AMS) account. You can use your account to access Media Services that enable you to store, encrypt, encode, manage, and stream media content in Azure. At the time you create a Media Services account, you also create an associated storage account (or use an existing one). If you delete a Media Services account, the blobs in your related storage account are not deleted.
-
-The Media Services account and all associated storage accounts must be in the same Azure subscription. It is strongly recommended to use storage accounts in the same location as the Media Services account to avoid additional latency and data egress costs.
-
-This article shows how to create a Media Services account using the Azure portal.
-
-> [!NOTE]
-> For information about availability of Azure Media Services features in different regions, see [AMS features across regions](availability-regions-v-2.md).
-
-## Prerequisites
-
-To complete this tutorial, you need an Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-
-## Create an AMS account
-
-The steps in this section show how to create an AMS account.
-
-1. Sign in to the [Azure portal](https://portal.azure.com/).
-2. Click **+Create a resource** > **Media** > **Media Services**.
-3. In **CREATE MEDIA SERVICES ACCOUNT**, enter the required values.
-
- 1. In **Account Name**, enter the name of the new AMS account. A Media Services account name is all lowercase letters or numbers with no spaces, and is 3 to 24 characters in length (see the validation sketch after these steps).
- 2. In **Subscription**, select among the different Azure subscriptions that you have access to.
- 3. In **Resource Group**, select a new or existing resource group. A resource group is a collection of resources that share lifecycle, permissions, and policies. Learn more [here](../../azure-resource-manager/management/overview.md#resource-groups).
- 4. In **Location**, select the geographic region that will be used to store the media and metadata records for your Media Services account. This region will be used to process and stream your media. Only the available Media Services regions appear in the drop-down list box.
- 5. In **Storage Account**, select a storage account to provide blob storage of the media content from your Media Services account. You can select an existing storage account in the same geographic region as your Media Services account, or you can create a storage account. A new storage account is created in the same region. The rules for storage account names are the same as for Media Services accounts.
-
- Learn more about storage [here](../../storage/common/storage-introduction.md).
- 6. Select **Pin to dashboard** to see the progress of the account deployment.
-4. Click **Create** at the bottom of the form.
-
- Once the account is successfully created, the overview page loads. In the streaming endpoint table, the account will have a default streaming endpoint in the **Stopped** state.
-
- >[!NOTE]
- >When your AMS account is created a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
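-
-As a small sanity check, the naming rules called out in step 3 can be validated before you submit the form. The helper below is an illustrative sketch:
-
-```python
-# Validate a proposed account name against the rules described above:
-# all lowercase letters or numbers, no spaces, 3 to 24 characters long.
-import re
-
-def is_valid_ams_account_name(name: str) -> bool:
-    return re.fullmatch(r"[a-z0-9]{3,24}", name) is not None
-
-print(is_valid_ams_account_name("mymediaaccount01"))  # True
-print(is_valid_ams_account_name("My Media Account"))  # False: uppercase, spaces
-```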
-
-## To manage your AMS account
-
-To manage your AMS account (for example, connect to the AMS API programmatically, upload videos, encode assets, configure content protection, monitor job progress), select **Settings** on the left side of the portal. From **Settings**, navigate to one of the available blades (for example: **API access**, **Assets**, **Jobs**, **Content protection**).
-
-## Next steps
-
-You can now upload files into your AMS account. For more information, see [Upload files](media-services-portal-upload-files.md).
-
-If you plan to access AMS API programmatically, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
media-services Media Services Portal Creating Live Encoder Enabled Channel https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-creating-live-encoder-enabled-channel.md
- Title: Perform live streaming using Azure Media Services to create multi-bitrate streams with Azure portal | Microsoft Docs
-description: This tutorial walks you through the steps of creating a Channel that receives a single-bitrate live stream and encodes it to multi-bitrate stream using the Azure portal.
-Previously updated: 03/10/2021
-# Perform live streaming using Media Services to create multi-bitrate streams with Azure portal
--
-> [!div class="op_single_selector"]
-> * [Portal](media-services-portal-creating-live-encoder-enabled-channel.md)
-> * [.NET](media-services-dotnet-creating-live-encoder-enabled-channel.md)
-> * [REST API](/rest/api/media/operations/channel)
->
--
-This tutorial walks you through the steps of creating a **Channel** that receives a single-bitrate live stream and encodes it to multi-bitrate stream.
-
-For more conceptual information related to Channels that are enabled for live encoding, see [Live streaming using Azure Media Services to create multi-bitrate streams](media-services-manage-live-encoder-enabled-channels.md).
-
-## Common Live Streaming Scenario
-
-The following are general steps involved in creating common live streaming applications.
-
-> [!NOTE]
-> Currently, the max recommended duration of a live event is 8 hours. Please contact amshelp@microsoft.com if you need to run a Channel for longer periods of time.
-
-1. Connect a video camera to a computer. <br/>For setup ideas, check out [Simple and portable event video gear setup]( https://link.medium.com/KNTtiN6IeT).
-
- If you do not have access to a camera, tools such as [Telestream Wirecast](media-services-configure-wirecast-live-encoder.md) can be used to generate a live feed from a video file.
-1. Launch and configure an on-premises live encoder that can output a single bitrate stream in one of the following protocols: RTMP or Smooth Streaming. For more information, see [Azure Media Services RTMP Support and Live Encoders](https://go.microsoft.com/fwlink/?LinkId=532824). <br/>Also, check out this blog: [Live streaming production with OBS](https://link.medium.com/ttuwHpaJeT).
-
- This step could also be performed after you create your Channel.
-1. Create and start a Channel.
-1. Retrieve the Channel ingest URL.
-
- The ingest URL is used by the live encoder to send the stream to the Channel.
-1. Retrieve the Channel preview URL.
-
- Use this URL to verify that your channel is properly receiving the live stream.
-1. Create an event/program (that will also create an asset).
-1. Publish the event (that will create an OnDemand locator for the associated asset).
-1. Start the event when you are ready to start streaming and archiving.
-1. Optionally, the live encoder can be signaled to start an advertisement. The advertisement is inserted in the output stream.
-1. Stop the event whenever you want to stop streaming and archiving the event.
-1. Delete the event (and optionally delete the asset).
-
-## Prerequisites
-
-The following are required to complete the tutorial.
-
-* To complete this tutorial, you need an Azure account. If you don't have an account, you can create a free trial account in just a couple of minutes.
- For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-* A Media Services account. To create a Media Services account, see [Create Account](media-services-portal-create-account.md).
-* A webcam and an encoder that can send a single bitrate live stream.
-
-## Create a channel
-
-1. In the [Azure portal](https://portal.azure.com/), select Media Services and then click on your Media Services account name.
-2. Select **Live Streaming**.
-3. Select **Custom create**. This option will let you create a channel that is enabled for live encoding.
-
- ![Create a channel](./media/media-services-portal-creating-live-encoder-enabled-channel/media-services-create-channel.png)
-4. Click on **Settings**.
-
- 1. Choose the **Live Encoding** channel type. This type specifies that you want to create a Channel that is enabled for live encoding. That means the incoming single bitrate stream is sent to the Channel and encoded into a multi-bitrate stream using specified live encoder settings. For more information, see [Live streaming using Azure Media Services to create multi-bitrate streams](media-services-manage-live-encoder-enabled-channels.md). Click OK.
- 2. Specify a name for the channel.
- 3. Click OK at the bottom of the screen.
-5. Select the **Ingest** tab.
-
- 1. On this page, you can select a streaming protocol. For the **Live Encoding** channel type, valid protocol options are:
-
- * Single bitrate Fragmented MP4 (Smooth Streaming)
- * Single bitrate RTMP
-
- For detailed explanation about each protocol, see [Live streaming using Azure Media Services to create multi-bitrate streams](media-services-manage-live-encoder-enabled-channels.md).
-
- You cannot change the protocol option while the Channel or its associated events/programs are running. If you require different protocols, you should create separate channels for each streaming protocol.
- 2. You can apply IP restriction on the ingest.
-
- You can define the IP addresses that are allowed to ingest a video to this channel. Allowed IP addresses can be specified as either a single IP address (e.g. '10.0.0.1'), an IP range using an IP address and a CIDR subnet mask (e.g. '10.0.0.1/22'), or an IP range using an IP address and a dotted decimal subnet mask (e.g. '10.0.0.1(255.255.252.0)').
-
- If no IP addresses are specified and there is no rule definition, then no IP address will be allowed. To allow any IP address, create a rule and set 0.0.0.0/0.
-6. On the **Preview** tab, apply IP restriction on the preview.
-7. On the **Encoding** tab, specify the encoding preset.
-
- Currently, the only system preset you can select is **Default 720p**. To specify a custom preset, open a Microsoft support ticket. Then, enter the name of the preset created for you.
-
-> [!NOTE]
-> Currently, the Channel start can take up to 30 minutes. Channel reset can take up to 5 minutes.
-
-Once you've created the channel, you can click it and select **Settings** to view the channel's configuration.
-
-For more information, see [Live streaming using Azure Media Services to create multi-bitrate streams](media-services-manage-live-encoder-enabled-channels.md).
-
-## Get ingest URLs
-
-Once the channel is created, you can get ingest URLs that you will provide to the live encoder. The encoder uses these URLs to input a live stream.
-
-![ingest urls](./media/media-services-portal-creating-live-encoder-enabled-channel/media-services-ingest-urls.png)
-
-## Create and manage events
-
-### Overview
-
-A channel is associated with events/programs that enable you to control the publishing and storage of segments in a live stream. Channels manage events/programs. The Channel and Program relationship is very similar to traditional media where a channel has a constant stream of content and a program is scoped to some timed event on that channel.
-
-You can specify the number of hours you want to retain the recorded content for the event by setting the **Archive Window** length. This value can be set from a minimum of 5 minutes to a maximum of 25 hours. Archive window length also dictates the maximum amount of time clients can seek back in time from the current live position. Events can run over the specified amount of time, but content that falls behind the window length is continuously discarded. The value of this property also determines how long the client manifests can grow.
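-
-As a quick sanity check against those bounds, here's an illustrative snippet (the limits are taken from the paragraph above):
-
-```python
-# Illustrative check of the archive window bounds described above.
-from datetime import timedelta
-
-MIN_WINDOW = timedelta(minutes=5)
-MAX_WINDOW = timedelta(hours=25)
-
-def is_valid_archive_window(window: timedelta) -> bool:
-    return MIN_WINDOW <= window <= MAX_WINDOW
-
-print(is_valid_archive_window(timedelta(hours=6)))    # True
-print(is_valid_archive_window(timedelta(minutes=2)))  # False: below 5 minutes
-```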
-
-Each event is associated with an Asset. To publish the event you must create an OnDemand locator for the associated asset. Having this locator will enable you to build a streaming URL that you can provide to your clients.
-
-A channel supports up to three concurrently running events so you can create multiple archives of the same incoming stream. This allows you to publish and archive different parts of an event as needed. For example, your business requirement might be to archive 6 hours of an event but to broadcast only the last 10 minutes. To accomplish this, you need to create two concurrently running events. One event is set to archive 6 hours of the event but is not published. The other event is set to archive for 10 minutes and is published.
-
-You should not reuse existing programs for new events. Instead, create and start a new program for each event.
-
-Start an event/program when you are ready to start streaming and archiving. Stop the event whenever you want to stop streaming and archiving the event.
-
-To delete archived content, stop and delete the event and then delete the associated asset. An asset cannot be deleted if it is used by the event; the event must be deleted first.
-
-Even after you stop and delete the event, users can stream your archived content as video on demand for as long as you do not delete the asset.
-
-If you do want to retain the archived content, but not have it available for streaming, delete the streaming locator.
-
-### Create/start/stop events
-
-Once you have the stream flowing into the Channel you can begin the streaming event by creating an Asset, Program, and Streaming Locator. This will archive the stream and make it available to viewers through the Streaming Endpoint.
-
->[!NOTE]
->When your AMS account is created a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-There are two ways to start an event:
-
-1. From the **Channel** page, press **Live Event** to add a new event.
-
- Specify: event name, asset name, archive window, and encryption option.
-
- ![create program](./media/media-services-portal-creating-live-encoder-enabled-channel/media-services-create-program.png)
-
- If you leave **Publish this live event now** checked, the publishing URLs are created when the event is created.
-
- You can press **Start** whenever you are ready to stream the event.
-
- Once you start the event, you can press **Watch** to start playing the content.
-2. Alternatively, you can use a shortcut and press the **Go Live** button on the **Channel** page. This will create a default Asset, Program, and Streaming Locator.
-
- The event is named **default** and the archive window is set to 8 hours.
-
-You can watch the published event from the **Live event** page.
-
-If you click **Off Air**, it will stop all live events.
-
-## Watch the event
-
-To watch the event, click **Watch** in the Azure portal or copy the streaming URL and use a player of your choice.
-
-![Created](./media/media-services-portal-creating-live-encoder-enabled-channel/media-services-play-event.png)
-
-Live events are automatically converted to on-demand content when stopped.
-
-## Clean up
-
-If you are done streaming events and want to clean up the resources provisioned earlier, use the following procedure.
-
-* Stop pushing the stream from the encoder.
-* Stop the channel. Once the Channel is stopped, it will not incur any charges. When you need to start it again, it will have the same ingest URL so you won't need to reconfigure your encoder.
-* You can stop your Streaming Endpoint, unless you want to continue to provide the archive of your live event as an on-demand stream. If the streaming endpoint is in the stopped state, it will not incur any charges.
-
-## View archived content
-
-Even after you stop and delete the event, users can stream your archived content as video on demand for as long as you do not delete the asset.
-
-> [!WARNING]
-> An asset **should not** be deleted if it is used by an event; the event must be deleted first.
-
-To manage your assets, select **Settings** and click **Assets**.
-
-![Assets](./media/media-services-portal-creating-live-encoder-enabled-channel/media-services-assets.png)
-
-## Considerations
-
-* Currently, the max recommended duration of a live event is 8 hours. Please contact amshelp@microsoft.com if you need to run a Channel for longer periods of time.
-* Make sure the streaming endpoint from which you want to stream your content is in the **Running** state.
-
-## Next steps
-
-Review Media Services learning paths.
--
media-services Media Services Portal Encode https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-encode.md
- Title: Encode an asset by using Media Encoder Standard in the Azure portal | Microsoft Docs
-description: This tutorial walks you through the steps of encoding an asset by using Media Encoder Standard in the Azure portal.
-Previously updated: 03/10/2021
-# Encode an asset by using Media Encoder Standard in the Azure portal
--
-> [!NOTE]
-> To complete this tutorial, you need an Azure account. For details, see [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
->
->
-
-One of the most common scenarios in working with Azure Media Services is delivering adaptive bitrate streaming to your clients. Media Services supports the following adaptive bitrate streaming technologies: Apple HTTP Live Streaming (HLS), Microsoft Smooth Streaming, and Dynamic Adaptive Streaming over HTTP (DASH, also called MPEG-DASH). To prepare your videos for adaptive bitrate streaming, first encode your source video as multi-bitrate files. You can use Media Encoder Standard to encode your videos.
-
-Media Services gives you dynamic packaging. With dynamic packaging, you can deliver your multi-bitrate MP4s in HLS, Smooth Streaming, and MPEG-DASH, without repackaging in these streaming formats. When you use dynamic packaging, you can store and pay for the files in a single storage format. Media Services builds and serves the appropriate response based on a client's request.
-
-To take advantage of dynamic packaging, you must encode your source file into a set of multi-bitrate MP4 files. The encoding steps are demonstrated later in this article.
-
-To learn how to scale media processing, see [Scale media processing by using the Azure portal](media-services-portal-scale-media-processing.md).
-
-## Encode in the Azure portal
-
-To encode your content by using Media Encoder Standard:
-
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. Select **Settings** > **Assets**. Select the asset that you want to encode.
-3. Select the **Encode** button.
-4. In the **Encode an asset** pane, select the **Media Encoder Standard** processor and a preset. For information about presets, see [Auto-generate a bitrate ladder](media-services-autogen-bitrate-ladder-with-mes.md) and [Task presets for Media Encoder Standard](media-services-mes-presets-overview.md). It's important to choose the preset that will work best for your input video. For example, if you know your input video has a resolution of 1920 &#215; 1080 pixels, you might choose the **H264 Multiple Bitrate 1080p** preset. If you have a low-resolution (640 &#215; 360) video, you shouldn't use the **H264 Multiple Bitrate 1080p** preset.
-
- To help you manage your resources, you can edit the name of the output asset and the name of the job.
-
- ![Encode assets](./media/media-services-portal-vod-get-started/media-services-encode1.png)
-5. Select **Create**.
-
-## Next steps
-* [Monitor the progress of your encoding job](media-services-portal-check-job-progress.md) in the Azure portal.
-
media-services Media Services Portal Get Started With Aad https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-get-started-with-aad.md
- Title: Get started with Azure AD authentication by using the Azure portal| Microsoft Docs
-description: Learn how to use the Azure portal to access Azure Active Directory (Azure AD) authentication to consume the Azure Media Services API.
-Previously updated: 03/10/2021
-# Get started with Azure AD authentication by using the Azure portal
---
-Learn how to use the Azure portal to set up Azure Active Directory (Azure AD) authentication for access to the Azure Media Services API.
-
-## Prerequisites
-- An Azure account. If you don't have an account, start with an [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
-- A Media Services account. For more information, see [Create an Azure Media Services account by using the Azure portal](media-services-portal-create-account.md).
-When you use Azure AD authentication with Azure Media Services, you have two authentication options:
-- **Service principal authentication**. Authenticate a service. Applications that commonly use this authentication method are apps that run daemon services, middle-tier services, or scheduled jobs: web apps, function apps, logic apps, APIs, or a microservice.
-- **User authentication**. Authenticate a person who is using the app to interact with Media Services resources. The interactive application should first prompt the user for credentials. An example is a management console app used by authorized users to monitor encoding jobs or live streaming.
-## Access the Media Services API
-
-This page lets you select the authentication method you want to use to connect to the API. The page also provides the values you need to connect to the API.
-
-1. In the [Azure portal](https://portal.azure.com/), select your Media Services account.
-2. Select how to connect to the Media Services API.
-3. Under **Connect to Media Services API**, select the Media Services API version you want to connect to.
-
-## Service principal authentication (recommended)
-
-Authenticates a service by using an Azure Active Directory (Azure AD) app and secret. This is the recommended authentication method for any middle-tier services calling the Media Services API, such as Web Apps, Functions, Logic Apps, APIs, and microservices.
-
-### Manage your Azure AD app and secret
-
-The **Manage your AAD app and secret** section lets you select or create a new Azure AD app and generate a secret. For security purposes, the secret cannot be shown after the blade is closed. The application uses the application ID and secret for authentication to obtain a valid token for Media Services.
-
-Make sure that you have sufficient permissions to register an application with your Azure AD tenant and to assign the application to a role in your Azure subscription. For more information, see [Required permissions](../../active-directory/develop/howto-create-service-principal-portal.md#permissions-required-for-registering-an-app).
-
-### Connect to Media Services API
-
-The **Connect to Media Services API** section provides the values that you use to connect your service principal application. You can get text values or copy the JSON or XML blocks. A rough sketch of using these values follows.
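-
-The following example requests a token with the client credentials grant and then calls the v2 REST API. The tenant, client ID, secret, and REST endpoint below are placeholders; use the real values shown in this section of the portal.
-
-```python
-# Sketch of service principal authentication for the Media Services v2 API.
-# All identifiers and the REST endpoint are placeholders.
-import requests
-
-token_response = requests.post(
-    "https://login.microsoftonline.com/contoso.onmicrosoft.com/oauth2/token",
-    data={
-        "grant_type": "client_credentials",
-        "client_id": "00000000-0000-0000-0000-000000000000",
-        "client_secret": "<application-secret>",
-        "resource": "https://rest.media.azure.net",
-    },
-)
-access_token = token_response.json()["access_token"]
-
-# Example call: list the assets in the account.
-response = requests.get(
-    "https://<account>.restv2.<region>.media.azure.net/api/Assets",
-    headers={
-        "Authorization": f"Bearer {access_token}",
-        "x-ms-version": "2.19",
-        "Accept": "application/json",
-    },
-)
-print(response.status_code)
-```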
-
-## User authentication
-
-This option can be used to authenticate an employee or a member of an Azure Active Directory tenant who is using an app to interact with Media Services resources. The interactive application should first prompt the user for credentials. This authentication method should only be used for management applications.
-
-### Connect to Media Services API
-
-Copy your credentials to connect your user application from the **Connect to Media Services API** section. You can get text values or copy the JSON or XML blocks.
-
-## Next steps
-
-Get started with [uploading files to your account](media-services-portal-upload-files.md).
media-services Media Services Portal Live Passthrough Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-live-passthrough-get-started.md
- Title: Live stream with on-premises encoders using Azure portal | Microsoft Docs
-description: This tutorial walks you through the steps of creating a Channel that is configured for a pass-through delivery.
-Previously updated: 03/10/2021
-# Perform live streaming with on-premises encoders using Azure portal
--
-> [!div class="op_single_selector"]
-> * [Portal](media-services-portal-live-passthrough-get-started.md)
-> * [.NET](media-services-dotnet-live-encode-with-onpremises-encoders.md)
-> * [REST](/rest/api/media/operations/channel)
->
->
--
-This tutorial walks you through the steps of using the Azure portal to create a **Channel** that is configured for a pass-through delivery.
-
-## Prerequisites
-The following are required to complete the tutorial:
-
-* An Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-* A Media Services account. To create a Media Services account, see [How to Create a Media Services Account](media-services-portal-create-account.md).
-* A webcam and an encoder, for example, the [Telestream Wirecast encoder](media-services-configure-wirecast-live-encoder.md).
-
-It is highly recommended to review the following articles:
-
-* [Azure Media Services RTMP Support and Live Encoders](https://azure.microsoft.com/blog/2014/09/18/azure-media-services-rtmp-support-and-live-encoders/)
-* [Overview of Live Streaming using Azure Media Services](media-services-manage-channels-overview.md)
-* [Live streaming with on-premises encoders that create multi-bitrate streams](media-services-live-streaming-with-onprem-encoders.md)
-
-## <a id="scenario"></a>Common live streaming scenario
-
-The following steps describe tasks involved in creating common live streaming applications that use channels that are configured for pass-through delivery. This tutorial shows how to create and manage a pass-through channel and live events.
-
-> [!NOTE]
-> Make sure the streaming endpoint from which you want to stream content is in the **Running** state.
-
-1. Connect a video camera to a computer. <br/>For setup ideas, check out [Simple and portable event video gear setup]( https://link.medium.com/KNTtiN6IeT).
-1. Launch and configure an on-premises live encoder that outputs a multi-bitrate RTMP or Fragmented MP4 stream. For more information, see [Azure Media Services RTMP Support and Live Encoders](https://go.microsoft.com/fwlink/?LinkId=532824).<br/>Also, check out this blog: [Live streaming production with OBS](https://link.medium.com/ttuwHpaJeT).
-
- This step could also be performed after you create your Channel.
-1. Create and start a pass-through Channel.
-1. Retrieve the Channel ingest URL.
-
- The ingest URL is used by the live encoder to send the stream to the Channel.
-1. Retrieve the Channel preview URL.
-
- Use this URL to verify that your channel is properly receiving the live stream.
-1. Create a live event/program.
-
- When using the Azure portal, creating a live event also creates an asset.
-
-1. Start the event/program when you are ready to start streaming and archiving.
-1. Optionally, the live encoder can be signaled to start an advertisement. The advertisement is inserted in the output stream.
-1. Stop the event/program whenever you want to stop streaming and archiving the event.
-1. Delete the event/program (and optionally delete the asset).
-
-> [!IMPORTANT]
-> Please review [Live streaming with on-premises encoders that create multi-bitrate streams](media-services-live-streaming-with-onprem-encoders.md) to learn about concepts and considerations related to live streaming with on-premises encoders and pass-through channels.
->
->
-
-## To view notifications and errors
-If you want to view notifications and errors produced by the Azure portal, click on the Notification icon.
-
-![Notifications](./media/media-services-portal-passthrough-get-started/media-services-notifications.png)
-
-## Create and start pass-through channels and events
-A channel is associated with events/programs that enable you to control the publishing and storage of segments in a live stream. Channels manage events.
-
-You can specify the number of hours you want to retain the recorded content for the program by setting the **Archive Window** length. This value can be set from a minimum of 5 minutes to a maximum of 25 hours. Archive window length also dictates the maximum amount of time clients can seek back in time from the current live position. Events can run over the specified amount of time, but content that falls behind the window length is continuously discarded. The value of this property also determines how long the client manifests can grow.
-
-Each event is associated with an asset. To publish the event, you must create an OnDemand locator for the associated asset. Having this locator enables you to build a streaming URL that you can provide to your clients.
-
-A channel supports up to three concurrently running events so you can create multiple archives of the same incoming stream. This allows you to publish and archive different parts of an event as needed. For example, your business requirement might be to archive 6 hours of a program but to broadcast only the last 10 minutes. To accomplish this, you need to create two concurrently running programs. One program is set to archive 6 hours of the event but is not published. The other program is set to archive for 10 minutes and is published.
-
-You should not reuse existing live events. Instead, create and start a new live event for each broadcast.
-
-Start the event when you are ready to start streaming and archiving. Stop the event whenever you want to stop streaming and archiving it.
-
-To delete archived content, stop and delete the event and then delete the associated asset. An asset cannot be deleted if it is used by an event; the event must be deleted first.
-
-Even after you stop and delete the event, users can stream your archived content as video on demand for as long as you do not delete the asset.
-
-If you do want to retain the archived content, but not have it available for streaming, delete the streaming locator.
-
-### To use the portal to create a channel
-This section shows how to use the **Quick Create** option to create a pass-through channel.
-
-For more details about pass-through channels, see [Live streaming with on-premises encoders that create multi-bitrate streams](media-services-live-streaming-with-onprem-encoders.md).
-
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. In the **Settings** window, click **Live streaming**.
-
- ![Getting started](./media/media-services-portal-passthrough-get-started/media-services-getting-started.png)
-
- The **Live streaming** window appears.
-3. Click **Quick Create** to create a pass-through channel with the RTMP ingest protocol.
-
- The **CREATE A NEW CHANNEL** window appears.
-4. Give the new channel a name and click **Create**.
-
- This creates a pass-through channel with the RTMP ingest protocol.
-
-## Create events
-1. Select a channel to which you want to add an event.
-2. Press the **Live Event** button.
-
-![Event](./media/media-services-portal-passthrough-get-started/media-services-create-events.png)
-
-## Get ingest URLs
-Once the channel is created, you can get ingest URLs that you will provide to the live encoder. The encoder uses these URLs to input a live stream.
-
-![Screenshot that shows the "Live streaming" page with a channel selected and the channel pane displayed.](./media/media-services-portal-passthrough-get-started/media-services-channel-created.png)
-
-## Watch the event
-To watch the event, click **Watch** in the Azure portal or copy the streaming URL and use a player of your choice.
-
-![Created](./media/media-services-portal-passthrough-get-started/media-services-default-event.png)
-
-Live events automatically get converted to on-demand content when stopped.
-
-## Clean up
-For more details about pass-through channels, see [Live streaming with on-premises encoders that create multi-bitrate streams](media-services-live-streaming-with-onprem-encoders.md).
-
-* A channel can be stopped only when all events/programs on the channel have been stopped. Once the Channel is stopped, it does not incur any charges. When you need to start it again, it will have the same ingest URL so you won't need to reconfigure your encoder.
-* A channel can be deleted only when all live events on the channel have been deleted.
-
-## View archived content
-Even after you stop and delete the event, users can stream your archived content as video on demand for as long as you do not delete the asset. An asset cannot be deleted if it is used by an event; the event must be deleted first.
-
-To manage your assets, select **Settings** and click **Assets**.
-
-![Assets](./media/media-services-portal-passthrough-get-started/media-services-assets.png)
-
-## Next step
-Review Media Services learning paths.
--
media-services Media Services Portal Manage Streaming Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-manage-streaming-endpoints.md
- Title: Manage streaming endpoints with the Azure portal | Microsoft Docs
-description: This article demonstrates how to manage streaming endpoints with the Azure portal.
--
-writer: juliako
-Previously updated: 03/10/2021
-# Manage streaming endpoints with the Azure portal
--
-This article shows how to use the Azure portal to manage streaming endpoints.
-
->[!NOTE]
->Make sure to review the [overview](media-services-streaming-endpoints-overview.md) article.
-
-For information about how to scale the streaming endpoint, see [this](media-services-portal-scale-streaming-endpoints.md) article.
-
-## Start managing streaming endpoints
-
-To start managing streaming endpoints for your account, do the following.
-
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. In the **Settings** blade, select **Streaming endpoints**.
-
- ![Screenshot that shows the "Media services" page with "Streaming endpoints" selected from the "Settings" blade.](./media/media-services-portal-manage-streaming-endpoints/media-services-manage-streaming-endpoints1.png)
-
-> [!NOTE]
-> You are only billed when your streaming endpoint is in the running state.
-
-## Add/delete a streaming endpoint
-
->[!NOTE]
->The default streaming endpoint cannot be deleted.
-
-To add or delete a streaming endpoint using the Azure portal, do the following:
-
-1. To add a streaming endpoint, click **+ Endpoint** at the top of the page.
-
- You might want multiple Streaming Endpoints if you plan to have different CDNs or a CDN and direct access.
-
-2. To delete a streaming endpoint, press the **Delete** button.
-3. Click the **Start** button to start the streaming endpoint.
-
- ![Screenshot that shows the "Endpoint" action selected and the "Streaming Endpoint Details" pane displayed.](./media/media-services-portal-manage-streaming-endpoints/media-services-manage-streaming-endpoints2.png)
--
-## <a id="configure_streaming_endpoints"></a>Configuring the Streaming Endpoint
-A streaming endpoint enables you to configure the following properties:
-
-* Access control
-* Cache control
-* Cross site access policies
-
-For detailed information about these properties, see [StreamingEndpoint](/rest/api/media/operations/streamingendpoint).
-
->[!NOTE]
->When CDN is enabled, you cannot configure IP access restrictions. IP access restrictions apply only when CDN is not enabled.
-
-You can configure a streaming endpoint by doing the following:
-
-1. Select the streaming endpoint you want to configure.
-2. Click **Settings**.
-
-A brief description of the fields follows.
-
-![Screenshot that shows the "Settings" action selected for the streaming endpoint.](./media/media-services-portal-manage-streaming-endpoints/media-services-manage-streaming-endpoints4.png)
-
-1. Maximum cache policy: used to configure cache lifetime for assets served through this streaming endpoint. If no value is set, the default is used. The default values can also be defined directly in Azure storage. If Azure CDN is enabled for the streaming endpoint, you should not set the cache policy value to less than 600 seconds.
-2. Allowed IP addresses: used to specify the IP addresses that are allowed to connect to the published streaming endpoint. If no IP addresses are specified, any IP address can connect. IP addresses can be specified as either a single IP address (for example, '10.0.0.1'), an IP range using an IP address and a CIDR subnet mask (for example, '10.0.0.1/22'), or an IP range using an IP address and a dotted decimal subnet mask (for example, '10.0.0.1(255.255.255.0)'). See the parsing sketch after this list.
-3. Configuration for Akamai signature header authentication: used to specify how signature header authentication requests from Akamai servers are configured. Expiration is in UTC.
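-
-The three address formats above can be parsed with Python's standard library; the sketch below is illustrative only:
-
-```python
-# Illustrative parser for the three allowed-IP formats described above.
-import ipaddress
-
-def parse_allowed_ip(entry: str) -> ipaddress.IPv4Network:
-    if "(" in entry:  # dotted decimal mask, e.g. 10.0.0.1(255.255.255.0)
-        address, mask = entry.rstrip(")").split("(")
-        entry = f"{address}/{mask}"
-    # Handles single addresses ("10.0.0.1") and CIDR ranges ("10.0.0.1/22").
-    return ipaddress.ip_network(entry, strict=False)
-
-for sample in ("10.0.0.1", "10.0.0.1/22", "10.0.0.1(255.255.255.0)"):
-    print(sample, "->", parse_allowed_ip(sample))
-```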
-
-## Scale your Premium streaming endpoint
-
-For more information, see [this](media-services-portal-scale-streaming-endpoints.md) article.
-
-## <a id="enable_cdn"></a>Enable Azure CDN integration
-
-When you create a new account, Azure CDN integration is enabled by default on the default streaming endpoint.
-
-If you later want to disable or enable the CDN, your streaming endpoint must be in the **stopped** state. It can take up to two hours for the Azure CDN integration to be enabled and for the changes to become active across all the CDN POPs. However, you can start your streaming endpoint and stream without interruption; once the integration is complete, the stream is delivered from the CDN. During the provisioning period, your streaming endpoint will be in the **starting** state and you might observe degraded performance.
-
-CDN integration is enabled in all the Azure data centers except China and Federal Government regions.
-
-Once it is enabled, the **Access Control**, **Custom hostname**, and **Akamai Signature authentication** configurations are disabled.
-
-> [!IMPORTANT]
-> Azure Media Services integration with Azure CDN is implemented on **Azure CDN from Verizon** for standard streaming endpoints. Premium streaming endpoints can be configured using all **Azure CDN pricing tiers and providers**. For more information about Azure CDN features, see the [CDN overview](../../cdn/cdn-overview.md).
-
-### Additional considerations
-
-* When CDN is enabled for a streaming endpoint, clients cannot request content directly from the origin. If you need the ability to test your content with or without CDN, you can create another streaming endpoint that isn't CDN enabled.
-* Your streaming endpoint hostname remains the same after enabling CDN. You don't need to make any changes to your media services workflow after CDN is enabled. For example, if your streaming endpoint hostname is strasbourg.streaming.mediaservices.windows.net, after enabling CDN, the exact same hostname is used.
-* For new streaming endpoints, you can enable CDN simply by creating a new endpoint; for existing streaming endpoints, you need to first stop the endpoint and then enable/disable the CDN.
-* A standard streaming endpoint can only be configured with the **Verizon Standard CDN provider** by using the Azure classic portal. However, you can enable other Azure CDN providers by using REST APIs.
-
-## Configure CDN profile
-
-You can configure the CDN profile by selecting the **Manage CDN** button from the top.
-
-![Streaming endpoint](./media/media-services-portal-manage-streaming-endpoints/media-services-manage-streaming-endpoints6.png)
-
-## Next steps
-Review Media Services learning paths.
--
media-services Media Services Portal Protect Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-protect-content.md
- Title: Configure content protection policies by using the Azure portal | Microsoft Docs
-description: This article demonstrates how to use the Azure portal to configure content protection policies. The article also shows how to enable dynamic encryption for your assets.
-Previously updated: 03/10/2021
-# Configure content protection policies by using the Azure portal
--
-> [!NOTE]
-> To complete this tutorial, you need an Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-> No new features or functionality are being added to Media Services v2. Check out the latest version, [Media Services v3](../latest/index.yml). Also, see [migration guidance from v2 to v3](../latest/migrate-v-2-v-3-migration-introduction.md).
->
-
- With Azure Media Services, you can secure your media from the time it leaves your computer through storage, processing, and delivery. You can use Media Services to deliver your content encrypted dynamically with the Advanced Encryption Standard (AES) by using 128-bit encryption keys. You also can use it with common encryption (CENC) by using PlayReady and/or Widevine digital rights management (DRM) and Apple FairPlay.
-
-Media Services provides a service for delivering DRM licenses and AES clear keys to authorized clients. You can use the Azure portal to create one key/license authorization policy for all types of encryptions.
-
-This article demonstrates how to configure a content protection policy by using the portal. The article also shows how to apply dynamic encryption to your assets.
-
-## Start to configure content protection
-To use the portal to configure global content protection by using your Media Services account, take the following steps:
-
-1. In the [portal](https://portal.azure.com/), select your Media Services account.
-
-1. Select **Settings** > **Content protection**.
-
- ![Content protection](./media/media-services-portal-content-protection/media-services-content-protection001.png)
-
-## Key/license authorization policy
-Media Services supports multiple ways of authenticating users who make key or license requests. You must configure the content key authorization policy. Your client then must meet the policy before the key/license can be delivered to it. The content key authorization policy can have one or more authorization restrictions, either open or token restrictions.
-
-You can use the portal to create one key/license authorization policy for all types of encryptions.
-
-### Open authorization
-Open restriction means that the system delivers the key to anyone who makes a key request. This restriction might be useful for test purposes.
-
-### Token authorization
-The token-restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the simple web token (SWT) and JSON Web Token (JWT) formats. Media Services doesn't provide an STS. You can create a custom STS or use Azure Access Control Service to issue tokens. The STS must be configured to create a token signed with the specified key and issue claims that you specified in the token restriction configuration. If the token is valid and the claims in the token match those configured for the key (or license), the Media Services key delivery service returns the requested key (or license) to the client.
-
-When you configure the token-restricted policy, you must specify the primary verification key, issuer, and audience parameters. The primary verification key contains the key that the token was signed with. The issuer is the secure token service that issues the token. The audience (sometimes called scope) describes the intent of the token or the resource the token authorizes access to. The Media Services key delivery service validates that these values in the token match the values in the template.
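-
-Conceptually, the check that the key delivery service performs resembles the PyJWT call sketched below; the verification key, issuer, and audience are hypothetical placeholders standing in for the values configured in the policy.
-
-```python
-# Conceptual sketch of the validation key delivery performs on a token.
-# The key, issuer, and audience are hypothetical placeholders.
-import jwt  # PyJWT
-
-def validate_token(incoming_token: str) -> dict:
-    # Raises jwt.InvalidTokenError if the signature, issuer, audience,
-    # or expiry does not match what the policy expects.
-    return jwt.decode(
-        incoming_token,
-        key=b"placeholder-shared-secret",  # primary verification key
-        algorithms=["HS256"],
-        issuer="https://sts.contoso.com",
-        audience="urn:media-services-test",
-    )
-```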
-
-![Key/license authorization policy](./media/media-services-portal-content-protection/media-services-content-protection002.png)
-
-## PlayReady license template
-The PlayReady license template sets the functionality that is enabled on your PlayReady license. For more information about the PlayReady license template, see the [Media Services PlayReady license template overview](media-services-playready-license-template-overview.md).
-
-### Nonpersistent
-If you configure a license as nonpersistent, it's held in memory only while the player uses the license.
-
-![Nonpersistent content protection](./media/media-services-portal-content-protection/media-services-content-protection003.png)
-
-### Persistent
-If you configure a license as persistent, it's saved in persistent storage on the client.
-
-![Persistent content protection](./media/media-services-portal-content-protection/media-services-content-protection004.png)
-
-## Widevine license template
-The Widevine license template sets the functionality that is enabled on your Widevine licenses.
-
-### Basic
-When you select **Basic**, the template is created with all default values.
-
-### Advanced
-For more information about the Widevine rights template, see the [Widevine license template overview](media-services-widevine-license-template-overview.md).
-
-![Advanced content protection](./media/media-services-portal-content-protection/media-services-content-protection005.png)
-
-## FairPlay configuration
-To enable FairPlay encryption, select **FairPlay configuration**. Then select the **App certificate** and enter the **Application Secret Key**. For more information about FairPlay configuration and requirements, see [Protect your HLS content with Apple FairPlay or Microsoft PlayReady](media-services-protect-hls-with-FairPlay.md).
-
-![FairPlay configuration](./media/media-services-portal-content-protection/media-services-content-protection006.png)
-
-## Apply dynamic encryption to your asset
-To take advantage of dynamic encryption, encode your source file into a set of adaptive-bitrate MP4 files.
-
-### Select an asset that you want to encrypt
-To see all your assets, select **Settings** > **Assets**.
-
-![Assets option](./media/media-services-portal-content-protection/media-services-content-protection007.png)
-
-### Encrypt with AES or DRM
-When you select **Encrypt** for an asset, you see two choices: **AES** or **DRM**.
-
-#### AES
-AES clear key encryption is enabled on all streaming protocols: Smooth Streaming, HLS, and MPEG-DASH.
-
-![Encryption configuration](./media/media-services-portal-content-protection/media-services-content-protection008.png)
-
-#### DRM
-1. After you select **DRM**, you see different content protection policies (which must be configured by this point) and a set of streaming protocols:
-
- a. **PlayReady and Widevine with MPEG-DASH** dynamically encrypts your MPEG-DASH stream with PlayReady and Widevine DRMs.
-
- b. **PlayReady and Widevine with MPEG-DASH + FairPlay with HLS** dynamically encrypts your MPEG-DASH stream with PlayReady and Widevine DRMs. This option also encrypts your HLS streams with FairPlay.
-
- c. **PlayReady only with Smooth Streaming, HLS, and MPEG-DASH** dynamically encrypts Smooth Streaming, HLS, and MPEG-DASH streams with PlayReady DRM.
-
- d. **Widevine only with MPEG-DASH** dynamically encrypts your MPEG-DASH with Widevine DRM.
-
- e. **FairPlay only with HLS** dynamically encrypts your HLS stream with FairPlay.
-
-1. To enable FairPlay encryption, on the **Content Protection Global Settings** blade, select **FairPlay configuration**. Then select the **App certificate**, and enter the **Application Secret Key**.
-
- ![Encryption type](./media/media-services-portal-content-protection/media-services-content-protection009.png)
-
-1. After you make the encryption selection, select **Apply**.
-
->[!NOTE]
->If you plan to play an AES-encrypted HLS in Safari, see the blog post [Encrypted HLS in Safari](https://azure.microsoft.com/blog/how-to-make-token-authorized-aes-encrypted-hls-stream-working-in-safari/).
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
media-services Media Services Portal Publish https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-publish.md
- Title: Publish content in the Azure portal | Microsoft Docs
-description: This tutorial walks you through the steps of publishing your content in the Azure portal.
-Previously updated: 03/10/2021
-# Publish content in the Azure portal
--
-> [!div class="op_single_selector"]
-> * [Portal](media-services-portal-publish.md)
-> * [.NET](media-services-deliver-streaming-content.md)
-> * [REST](media-services-rest-deliver-streaming-content.md)
->
->
-
-## Overview
-> [!NOTE]
-> To complete this tutorial, you need an Azure account. For details, see [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
->
->
-
-To provide your users with a URL that they can use to stream or download your content, first you must publish your asset by creating a locator. Locators provide access to asset files. Azure Media Services supports two types of locators:
-
-* **Streaming (OnDemandOrigin) locators**. Streaming locators are used for adaptive streaming. Examples of adaptive streaming include Apple HTTP Live Streaming (HLS), Microsoft Smooth Streaming, and Dynamic Adaptive Streaming over HTTP (DASH, also called MPEG-DASH). To create a streaming locator, your asset must include an .ism file. For example, `http://amstest.streaming.mediaservices.windows.net/61b3da1d-96c7-489e-bd21-c5f8a7494b03/scott.ism/manifest`.
-* **Progressive (shared access signature) locators**. Progressive locators are used to deliver video via progressive download.
-
-To build an HLS streaming URL, append *(format=m3u8-aapl)* to the URL:
-
-`{streaming endpoint name-media services account name}/{locator ID}/{file name}.ism/Manifest(format=m3u8-aapl)`
-
-To build a streaming URL to play Smooth Streaming assets, use the following URL format:
-
-`{streaming endpoint name-media services account name}/{locator ID}/{file name}.ism/Manifest`
-
-To build an MPEG-DASH streaming URL, append *(format=mpd-time-csf)* to the URL:
-
-`{streaming endpoint name-media services account name}/{locator ID}/{file name}.ism/Manifest(format=mpd-time-csf)`
-
-A shared access signature URL has the following format:
-
-`{blob container name}/{asset name}/{file name}/{shared access signature}`
-
-For more information, see the [delivering content overview](media-services-deliver-content-overview.md).
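-
-Putting the formats above together, the following sketch builds the three streaming URL variants from an on-demand locator path. The base path reuses the sample locator shown earlier:
-
-```python
-# Build the streaming URL variants described above from a locator base path.
-def streaming_urls(base_path: str) -> dict:
-    manifest = f"{base_path}/Manifest"
-    return {
-        "smooth": manifest,
-        "hls": manifest + "(format=m3u8-aapl)",
-        "dash": manifest + "(format=mpd-time-csf)",
-    }
-
-base = ("http://amstest.streaming.mediaservices.windows.net/"
-        "61b3da1d-96c7-489e-bd21-c5f8a7494b03/scott.ism")
-for protocol, url in streaming_urls(base).items():
-    print(protocol, url)
-```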
-
-> [!NOTE]
-> Locators that were created in the Azure portal before March 2015 have a two-year expiration date.
->
->
-
-To update an expiration date on a locator, you can use a [REST API](/rest/api/media/operations/locator#update_a_locator) or a [.NET API](/dotnet/api/microsoft.windowsazure.mediaservices.client.ilocator).
-
-> [!NOTE]
-> When you update the expiration date of a shared access signature locator, the URL changes.
-
-### To use the portal to publish an asset
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. Select **Settings** > **Assets**. Select the asset that you want to publish.
-3. Select the **Publish** button.
-4. Select the locator type.
-5. Select **Add**.
-
- ![Publish the video](./media/media-services-portal-vod-get-started/media-services-publish1.png)
-
-The URL is added to the list of **Published URLs**.
-
-## Play content in the portal
-You can test your video on a content player in the Azure portal.
-
-Select the video, and then select the **Play** button.
-
-![Play the video in the Azure portal](./media/media-services-portal-vod-get-started/media-services-play.png)
-
-Some considerations apply:
-
-* Make sure that the video has been published.
-* The Azure portal media player plays from the default streaming endpoint. If you want to play from a non-default streaming endpoint, select and copy the URL, and then paste it into another player. For example, you can test your video on the [Azure Media Player](https://aka.ms/azuremediaplayer).
-* The streaming endpoint from which you are streaming must be running.
-
media-services Media Services Portal Scale Media Processing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-scale-media-processing.md
- Title: Scale media processing using the Azure portal | Microsoft Docs
-description: This tutorial walks you through the steps of scaling media processing using the Azure portal.
-Previously updated: 08/24/2021
-# Change the reserved unit type
--
-> [!div class="op_single_selector"]
-> * [.NET](media-services-dotnet-encoding-units.md)
-> * [Portal](media-services-portal-scale-media-processing.md)
-> * [REST](/rest/api/media/operations/encodingreservedunittype)
-> * [Java](https://github.com/rnrneverdies/azure-sdk-for-media-services-java-samples)
-> * [PHP](https://github.com/Azure/azure-sdk-for-php/tree/master/examples/MediaServices)
->
->
-
-## Overview
-
-Media Reserved Units (MRUs) are no longer needed and are not supported by Azure Media Services. For backward compatibility, the current Azure portal still offers an option to manage and scale MRUs. However, any MRU configuration that you set is not used to control encoding concurrency or performance.
-
-> [!IMPORTANT]
-> Make sure to review the [overview](media-services-scale-media-processing-overview.md) topic for more information about scaling media processing.
-
-## Scale media processing
->[!NOTE]
->Selecting MRUs will not affect concurrency or performance in Azure Media Services V3.
-
-To change the reserved unit type and the number of reserved units, do the following:
-
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. In the **Settings** window, select **Media reserved units**.
-
- To change the number of reserved units for the selected reserved unit type, use the **Media Reserved Units** slider at the top of the screen.
-
- To change the **RESERVED UNIT TYPE**, click on the **Speed of reserved processing units** bar. Then, select the pricing tier you need: S1, S2, or S3.
-
-3. Press the SAVE button to save your changes.
-
- The new reserved units are allocated when you press SAVE.
-
-## Next steps
-Review Media Services learning paths.
media-services Media Services Portal Scale Streaming Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-scale-streaming-endpoints.md
- Title: Scale streaming endpoints with the Azure portal | Microsoft Docs
-description: This tutorial walks you through the steps of scaling streaming endpoints with the Azure portal.
-Previously updated: 03/10/2021
-# Scale streaming endpoints with the Azure portal
--
-## Overview
-
-> [!NOTE]
-> To complete this tutorial, you need an Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
->
->
-
-**Premium** streaming endpoints are suitable for advanced workloads, providing dedicated and scalable bandwidth capacity. Customers that have a **Premium** streaming endpoint get one streaming unit (SU) by default. The streaming endpoint can be scaled by adding SUs. Each SU provides additional bandwidth capacity to the application. For more information about streaming endpoint types and CDN configuration, see the [Streaming Endpoint overview](media-services-streaming-endpoints-overview.md) topic.
-
-This topic shows how to scale a streaming endpoint.
-
-For information about pricing details, see [Media Services Pricing Details](https://azure.microsoft.com/pricing/details/media-services/).
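-
-If you prefer scripting to the portal, the v2 .NET SDK also exposes a scale operation on streaming endpoints. The following is a minimal sketch, assuming `_context` is an authenticated `CloudMediaContext` and `"myendpoint"` is the name of a **Premium** streaming endpoint in your account.
-
-```csharp
-// Locate the Premium streaming endpoint by name (hypothetical name).
-IStreamingEndpoint endpoint = _context.StreamingEndpoints
-    .Where(e => e.Name == "myendpoint")
-    .First();
-
-// Allocate two streaming units; only Premium endpoints can be scaled.
-endpoint.Scale(2);
-```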
-
-## Scale streaming endpoints
-
-To change the number of streaming units, do the following:
-
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. In the **Settings** window, select **Streaming endpoints**.
-3. Click on the streaming endpoint that you want to scale.
-
- > [!NOTE]
- > You can only scale **Premium** streaming endpoints.
-
-4. Move the slider to specify the number of streaming units.
-
- ![Streaming endpoint](./media/media-services-portal-manage-streaming-endpoints/media-services-manage-streaming-endpoints3.png)
-
-## Next steps
-Review Media Services learning paths.
media-services Media Services Portal Upload Files https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-upload-files.md
- Title: Upload files to a Media Services account in the Azure portal | Microsoft Docs
-description: This tutorial walks you through the steps of uploading files to a Media Services account in the Azure portal.
-Previously updated: 03/10/2021
-# Upload files to a Media Services account in the Azure portal
--
-> [!div class="op_single_selector"]
-> * [Portal](media-services-portal-upload-files.md)
-> * [.NET](media-services-dotnet-upload-files.md)
-> * [REST](media-services-rest-upload-files.md)
->
-
-> [!NOTE]
-> No new features or functionality are being added to Media Services v2. For the most current guidance on uploading files with the portal, see [Use portal to upload, encode, and stream content](../latest/asset-create-asset-upload-portal-quickstart.md).<br/>Also check out [Media Services v3](../latest/index.yml) and the [migration guidance from v2 to v3](../latest/migrate-v-2-v-3-migration-introduction.md).
-
-In Azure Media Services, you upload your digital files to an asset. The asset can contain video, audio, images, thumbnail collections, text tracks, and closed caption files (and the metadata for these files). After the files are uploaded, your content is stored securely in the cloud for further processing and streaming.
-
-Media Services has a maximum file size for processing files. For details about file size limits, see [Media Services quotas and limitations](media-services-quotas-and-limitations.md).
-
-To complete this tutorial, you need an Azure account. For details, see [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
-
-## Upload files
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. Select **Settings** > **Assets**. Then, select the **Upload** button.
-
- ![Upload files](./media/media-services-portal-vod-get-started/media-services-upload.png)
-
- The **Upload a video asset** window appears.
-
- > [!NOTE]
- > Media Services doesn't limit the file size for uploading videos.
-
-3. On your computer, go to the video that you want to upload. Select the video, and then select **OK**.
-
- The upload begins. You can see the progress under the file name.
-
-When the upload is finished, the new asset is listed in the **Assets** pane.
-
-
-## Next steps
-* Learn how to [encode your uploaded assets](media-services-portal-encode.md).
-
-* You also can use Azure Functions to trigger an encoding job when a file arrives in the configured container. For more information, see the sample at [Media
media-services Media Services Portal Vod Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-portal-vod-get-started.md
- Title: Get started with delivering video-on-demand by using the Azure portal | Microsoft Docs
-description: This tutorial walks you through the steps of implementing a basic video-on-demand content delivery service with an Azure Media Services application in the Azure portal.
-Previously updated: 03/10/2021
-# Get started with delivering content on demand by using the Azure portal
---
-This tutorial walks you through the steps of implementing a basic video-on-demand content delivery service with an Azure Media Services application in the Azure portal.
-
-## Prerequisites
-The following items are required to complete the tutorial:
-
-* An Azure account. For details, see [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
-* A Media Services account. To create a Media Services account, see [How to create a Media Services account](media-services-portal-create-account.md).
-
-This tutorial includes the following tasks:
-
-1. Start the streaming endpoint.
-2. Upload a video file.
-3. Encode the source file into a set of adaptive bitrate MP4 files.
-4. Publish the asset, and get streaming and progressive download URLs.
-5. Play your content.
-
-## Start the streaming endpoint
-
-One of the most common scenarios when working with Azure Media Services is delivering video via adaptive bitrate streaming. Media Services gives you dynamic packaging. With dynamic packaging, you can deliver your adaptive bitrate MP4 encoded content in just-in-time streaming formats that are supported by Media Services. Examples include Apple HTTP Live Streaming (HLS), Microsoft Smooth Streaming, and Dynamic Adaptive Streaming over HTTP (DASH, also called MPEG-DASH). By using Media Services adaptive bitrate streaming, you can deliver your videos without storing prepackaged versions of each of these streaming formats.
-
-> [!NOTE]
-> When you create your Media Services account, a default streaming endpoint is added to your account in the **Stopped** state. To start streaming your content, and to take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-To start the streaming endpoint:
-
-1. Sign in to the [Azure portal](https://portal.azure.com/).
-2. Select **Settings** > **Streaming endpoints**.
-3. Select the default streaming endpoint. The **DEFAULT STREAMING ENDPOINT DETAILS** window appears.
-4. Select the **Start** icon.
-5. Select the **Save** button.
-
-## Upload files
-To stream videos by using Media Services, you upload the source videos, encode them into multiple bitrates, and then publish the result. The first step is covered in this section.
-
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. Select **Settings** > **Assets**. Then, select the **Upload** button.
-
- ![Upload files](./media/media-services-portal-vod-get-started/media-services-upload.png)
-
- The **Upload a video asset** window appears.
-
- > [!NOTE]
- > Media Services doesn't limit the file size for uploading videos.
- >
- >
-3. On your computer, go to the video that you want to upload. Select the video, and then select **OK**.
-
- The upload begins. You can see the progress under the file name.
-
-When the upload is finished, the new asset is listed in the **Assets** pane.
-
-## Encode assets
-To take advantage of dynamic packaging, you must encode your source file into a set of multi-bitrate MP4 files. The encoding steps are demonstrated in this section.
-
-### Encode assets in the portal
-To encode your content by using Media Encoder Standard in the Azure portal:
-
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. Select **Settings** > **Assets**. Select the asset that you want to encode.
-3. Select the **Encode** button.
-4. In the **Encode an asset** pane, select the **Media Encoder Standard** processor and a preset. For information about presets, see [Auto-generate a bitrate ladder](media-services-autogen-bitrate-ladder-with-mes.md) and [Task presets for Media Encoder Standard](media-services-mes-presets-overview.md). It's important to choose the preset that will work best for your input video. For example, if you know your input video has a resolution of 1920 &#215; 1080 pixels, you might choose the **H264 Multiple Bitrate 1080p** preset. If you have a low-resolution (640 &#215; 360) video, you shouldn't use the **H264 Multiple Bitrate 1080p** preset.
-
- To help you manage your resources, you can edit the name of the output asset and the name of the job.
-
- ![Encode assets](./media/media-services-portal-vod-get-started/media-services-encode1.png)
-5. Select **Create**.
-
-### Monitor encoding job progress
-To monitor the progress of the encoding job, at the top of the page, select **Settings**, and then select **Jobs**.
-
-![Jobs](./media/media-services-portal-vod-get-started/media-services-jobs.png)
-
-## Publish content
-To provide your user with a URL that they can use to stream or download your content, first you must publish your asset by creating a locator. Locators provide access to files that are in the asset. Azure Media Services supports two types of locators:
-
-* **Streaming (OnDemandOrigin) locators**. Streaming locators are used for adaptive streaming. Examples of adaptive streaming include HLS, Smooth Streaming, and MPEG-DASH. To create a streaming locator, your asset must include an .ism file.
-* **Progressive (shared access signature) locators**. Progressive locators are used to deliver video via progressive download.
-
-To build an HLS streaming URL, append *(format=m3u8-aapl)* to the URL:
-
-`{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{file name}.ism/Manifest(format=m3u8-aapl)`
-
-To build a streaming URL to play Smooth Streaming assets, use the following URL format:
-
-`{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{file name}.ism/Manifest`
-
-To build an MPEG-DASH streaming URL, append *(format=mpd-time-csf)* to the URL:
-
-`{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{file name}.ism/Manifest(format=mpd-time-csf)`
-
-A shared access signature URL has the following format:
-
-`{blob container name}/{asset name}/{file name}/{shared access signature}`
-
-> [!NOTE]
-> Locators that were created in the Azure portal before March 2015 have a two-year expiration date.
->
->
-
-To update an expiration date on a locator, you can use a [REST API](/rest/api/media/operations/locator#update_a_locator) or a [.NET API](/dotnet/api/microsoft.windowsazure.mediaservices.client.ilocator).
-
-> [!NOTE]
-> When you update the expiration date of a shared access signature locator, the URL changes.
-
-### To use the portal to publish an asset
-1. In the [Azure portal](https://portal.azure.com/), select your Azure Media Services account.
-2. Select **Settings** > **Assets**. Select the asset that you want to publish.
-3. Select the **Publish** button.
-4. Select the locator type.
-5. Select **Add**.
-
- ![Publish the video](./media/media-services-portal-vod-get-started/media-services-publish1.png)
-
-The URL is added to the list of **Published URLs**.
-
-## Play content from the portal
-You can test your video on a content player in the Azure portal.
-
-Select the video, and then select the **Play** button.
-
-![Play the video in the Azure portal](./media/media-services-portal-vod-get-started/media-services-play.png)
-
-Some considerations apply:
-
-* To begin streaming, start running the default streaming endpoint.
-* Make sure that the video has been published.
-* The Azure portal media player plays from the default streaming endpoint. If you want to play from a non-default streaming endpoint, select and copy the URL, and then paste it into another player. For example, you can test your video on the [Azure Media Player](https://aka.ms/azuremediaplayer).
-
media-services Media Services Powershell Create And Configure Aad App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-powershell-create-and-configure-aad-app.md
- Title: Use PowerShell to create an Azure AD app to access the Azure Media Services API | Microsoft Docs
-description: Learn how to use PowerShell to create an Azure Active Directory (Azure AD) app and set it up to access the Azure Media Services API.
-Previously updated: 03/10/2021
-# Use PowerShell to create an Azure AD app to use with the Azure Media Services API
---
-Learn how to use a PowerShell script to create an Azure Active Directory (Azure AD) application and service principal to access Azure Media Services resources.
-
-## Prerequisites
-- An Azure account. If you don't have an account, start with an [Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
-- A Media Services account. For more information, see [Create an Azure Media Services account in the Azure portal](media-services-portal-create-account.md).
-- Azure PowerShell. For more information, see [How to use Azure PowerShell](/powershell/azure/).
-## Create an Azure AD app by using PowerShell
-
-```powershell
-# Set $SubscriptionId, $ApplicationDisplayName, and $Password before running this script.
-
-# Sign in and select the subscription that contains your Media Services account.
-Connect-AzAccount
-Import-Module Az.Resources
-Set-AzContext -SubscriptionId $SubscriptionId
-
-# Create the Azure AD application and its service principal.
-$ServicePrincipal = New-AzADServicePrincipal -DisplayName $ApplicationDisplayName -Password $Password
-Get-AzADServicePrincipal -ObjectId $ServicePrincipal.Id
-
-# Scope of the role assignment: your Media Services account (replace the placeholders).
-$NewRole = $null
-$Scope = "/subscriptions/your subscription id/resourceGroups/userresourcegroup/providers/microsoft.media/mediaservices/your media account"
-
-# Assign the Contributor role, retrying until the new service principal becomes active.
-$Retries = 0
-While ($null -eq $NewRole -and $Retries -le 6)
-{
-    # Sleep for a few seconds to allow the service principal application to become active (usually it takes only a couple of seconds).
-    Sleep 15
-    New-AzRoleAssignment -RoleDefinitionName Contributor -ServicePrincipalName $ServicePrincipal.ApplicationId -Scope $Scope | Write-Verbose -ErrorAction SilentlyContinue
-    $NewRole = Get-AzRoleAssignment -ServicePrincipalName $ServicePrincipal.ApplicationId -ErrorAction SilentlyContinue
-    $Retries++
-}
-```
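-
-After the script completes, the service principal's application (client) ID and the password that you supplied can be used to authenticate to the Media Services API from .NET. The following is a minimal sketch with placeholder values, following the connection pattern used elsewhere in these articles:
-
-```csharp
-// Placeholder tenant domain, client ID, client secret, and REST API endpoint.
-var tokenCredentials = new AzureAdTokenCredentials(
-    "mytenant.onmicrosoft.com",
-    new AzureAdClientSymmetricKey("<client ID>", "<client secret>"),
-    AzureEnvironments.AzureCloudEnvironment);
-
-var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-var context = new CloudMediaContext(new Uri("<REST API endpoint>"), tokenProvider);
-```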
-
-For more information, see the following articles:
-- [Use Azure PowerShell to create a service principal to access resources](../../active-directory/develop/howto-authenticate-service-principal-powershell.md)
-- [Add or remove Azure role assignments using Azure PowerShell](../../role-based-access-control/role-assignments-powershell.md)
-- [How to manually configure daemon apps by using certificates](https://github.com/azure-samples/active-directory-dotnetcore-daemon-v2)
-## Next steps
-
-Get started with [uploading files to your account](media-services-portal-upload-files.md).
media-services Media Services Protect Hls With Fairplay https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-protect-hls-with-fairplay.md
- Title: Protect HLS content with Microsoft PlayReady or Apple FairPlay - Azure | Microsoft Docs
-description: This topic gives an overview and shows how to use Azure Media Services to dynamically encrypt your HTTP Live Streaming (HLS) content with Apple FairPlay. It also shows how to use the Media Services license delivery service to deliver FairPlay licenses to clients.
-Previously updated: 03/10/2021
-# Protect your HLS content with Apple FairPlay or Microsoft PlayReady
--
-> [!NOTE]
-> To complete this tutorial, you need an Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
->
-> No new features or functionality are being added to Media Services v2. Check out the latest version, [Media Services v3](../latest/index.yml). Also, see [migration guidance from v2 to v3](../latest/migrate-v-2-v-3-migration-introduction.md)
->
-
-Azure Media Services enables you to dynamically encrypt your HTTP Live Streaming (HLS) content by using the following formats:
-
-* **AES-128 envelope clear key**
-
- The entire chunk is encrypted by using the **AES-128 CBC** mode. Stream decryption is supported natively by iOS and OS X players. For more information, see [Using AES-128 dynamic encryption and key delivery service](media-services-playready-license-template-overview.md).
-* **Apple FairPlay**
-
- The individual video and audio samples are encrypted by using the **AES-128 CBC** mode. **FairPlay Streaming** (FPS) is integrated into the device operating systems, with native support on iOS and Apple TV. Safari on OS X enables FPS by using the Encrypted Media Extensions (EME) interface support.
-* **Microsoft PlayReady**
-
-The following image shows the **HLS + FairPlay or PlayReady dynamic encryption** workflow.
-
-![Diagram of dynamic encryption workflow](./media/media-services-content-protection-overview/media-services-content-protection-with-FairPlay.png)
-
-This article demonstrates how to use Media Services to dynamically encrypt your HLS content with Apple FairPlay. It also shows how to use the Media Services license delivery service to deliver FairPlay licenses to clients.
-
-> [!NOTE]
-> If you also want to encrypt your HLS content with PlayReady, you need to create a common content key and associate it with your asset. You also need to configure the content key's authorization policy, as described in [Using PlayReady dynamic common encryption](media-services-protect-with-playready-widevine.md).
->
->
-
-## Requirements and considerations
-
-The following are required when using Media Services to deliver HLS encrypted with FairPlay, and to deliver FairPlay licenses:
-
- * An Azure account. For details, see [Azure free trial](https://azure.microsoft.com/pricing/free-trial/?WT.mc_id=A261C142F).
- * A Media Services account. To create one, see [Create an Azure Media Services account using the Azure portal](media-services-portal-create-account.md).
- * Sign up with [Apple Development Program](https://developer.apple.com/).
- * Apple requires the content owner to obtain the [deployment package](https://developer.apple.com/contact/fps/). State that you already implemented the Key Security Module (KSM) with Media Services and that you are requesting the final FPS package. The final FPS package includes instructions for generating the certificate and obtaining the Application Secret Key (ASK). You use the ASK to configure FairPlay.
- * Azure Media Services .NET SDK version **3.6.0** or later.
-
-The following things must be set on Media Services key delivery side:
-
- * **App Cert (AC)**: This is a .pfx file that contains the private key. You create this file and encrypt it with a password.
-
- When you configure a key delivery policy, you must provide that password and the .pfx file in Base64 format.
-
- The following steps describe how to generate a .pfx certificate file for FairPlay:
-
- 1. Install OpenSSL from https://slproweb.com/products/Win32OpenSSL.html.
-
- Go to the folder that contains the FairPlay certificate and the other files delivered by Apple.
- 2. Run the following command from the command line. This converts the .cer file to a .pem file.
-
- "C:\OpenSSL-Win32\bin\openssl.exe" x509 -inform der -in FairPlay.cer -out FairPlay-out.pem
- 3. Run the following command from the command line. This converts the .pem file to a .pfx file with the private key. OpenSSL then prompts you for the password for the .pfx file.
-
- "C:\OpenSSL-Win32\bin\openssl.exe" pkcs12 -export -out FairPlay-out.pfx -inkey privatekey.pem -in FairPlay-out.pem -passin file:privatekey-pem-pass.txt
- * **App Cert password**: The password for creating the .pfx file.
- * **App Cert password ID**: You must upload the password, similar to how you upload other Media Services keys. Use the **ContentKeyType.FairPlayPfxPassword** enum value to get the Media Services ID. This is the ID to use inside the key delivery policy option.
- * **iv**: This is a random value of 16 bytes. It must match the iv in the asset delivery policy. You generate the iv, and put it in both places: the asset delivery policy and the key delivery policy option.
- * **ASK**: You receive this key when you generate the certificate by using the Apple Developer portal. Each development team receives a unique ASK. Save a copy of the ASK, and store it in a safe place. You need to configure the ASK as FairPlayAsk in Media Services later.
- * **ASK ID**: This ID is obtained when you upload the ASK into Media Services. You must upload the ASK by using the **ContentKeyType.FairPlayAsk** enum value. As a result, the Media Services ID is returned, and this is what should be used when setting the key delivery policy option.
-
-The following things must be set by the FPS client side:
-
- * **App Cert (AC)**: This is a .cer/.der file that contains the public key, which the operating system uses to encrypt the payload. Media Services needs to know about it because the player requires it. The key delivery service decrypts the payload by using the corresponding private key.
-
-To play back a FairPlay encrypted stream, get a real ASK first, and then generate a real certificate. That process creates all three parts:
-
- * .der file
- * .pfx file
- * password for the .pfx
-
-The following clients support HLS with **AES-128 CBC** encryption: Safari on OS X, Apple TV, iOS.
-
-## Configure FairPlay dynamic encryption and license delivery services
-The following are general steps for protecting your assets with FairPlay by using the Media Services license delivery service, and also by using dynamic encryption.
-
-1. Create an asset, and upload files into the asset.
-2. Encode the asset that contains the file to the adaptive bitrate MP4 set.
-3. Create a content key, and associate it with the encoded asset.
-4. Configure the content key's authorization policy. Specify the following:
-
- * The delivery method (in this case, FairPlay).
- * FairPlay policy options configuration. For details on how to configure FairPlay, see the **ConfigureFairPlayPolicyOptions()** method in the sample below.
-
- > [!NOTE]
- > Usually, you would want to configure FairPlay policy options only once, because you will only have one set of a certification and an ASK.
- >
- >
- * Restrictions (open or token).
- * Information specific to the key delivery type that defines how the key is delivered to the client.
-5. Configure the asset delivery policy. The delivery policy configuration includes:
-
- * The delivery protocol (HLS).
- * The type of dynamic encryption (common CBC encryption).
- * The license acquisition URL.
-
- > [!NOTE]
- > If you want to deliver a stream that is encrypted with FairPlay and another Digital Rights Management (DRM) system, you have to configure separate delivery policies:
- >
- > * One IAssetDeliveryPolicy to configure Dynamic Adaptive Streaming over HTTP (DASH) with Common Encryption (CENC) (PlayReady + Widevine), and Smooth with PlayReady
- > * Another IAssetDeliveryPolicy to configure FairPlay for HLS
- >
- >
-6. Create an OnDemand locator to get a streaming URL.
-
-## Use FairPlay key delivery by player apps
-You can develop player apps by using the iOS SDK. To play FairPlay content, you have to implement the license exchange protocol. This protocol is not specified by Apple; each app decides how to send key delivery requests. The Media Services FairPlay key delivery service expects the SPC to arrive as an application/x-www-form-urlencoded POST message, in the following form:
-
-`spc=<Base64 encoded SPC>`
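-
-To make the request shape concrete, here is a hedged C# sketch of a client posting the SPC and reading back the response; the key delivery URL and SPC value are assumptions, and the response body wraps the Base64-encoded CKC in `<ckc></ckc>` tags, as the iOS sample in the offline FairPlay article shows.
-
-```csharp
-using System.Collections.Generic;
-using System.Net.Http;
-using System.Threading.Tasks;
-
-// Posts a Base64-encoded SPC to the FairPlay key delivery service as
-// application/x-www-form-urlencoded and returns the raw response body.
-static async Task<string> RequestCkcAsync(string keyDeliveryUrl, string spcBase64)
-{
-    using (var client = new HttpClient())
-    {
-        var body = new FormUrlEncodedContent(new[]
-        {
-            new KeyValuePair<string, string>("spc", spcBase64)
-        });
-        HttpResponseMessage response = await client.PostAsync(keyDeliveryUrl, body);
-        response.EnsureSuccessStatusCode();
-        return await response.Content.ReadAsStringAsync(); // e.g. <ckc>...</ckc>
-    }
-}
-```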
-
-> [!NOTE]
-> Azure Media Player supports FairPlay playback. See [Azure Media Player documentation](https://amp.azure.net/libs/amp/latest/docs/index.html) for further information.
->
->
-
-## Streaming URLs
-If your asset was encrypted with more than one DRM, you should use an encryption tag in the streaming URL: (format='m3u8-aapl', encryption='xxx').
-
-The following considerations apply:
-
-* At most one encryption type can be specified.
-* The encryption type doesn't have to be specified in the URL if only one encryption was applied to the asset.
-* The encryption type is case insensitive.
-* The following encryption types can be specified:
- * **cenc**: Common encryption (PlayReady or Widevine)
- * **cbcs-aapl**: FairPlay
- * **cbc**: AES envelope encryption
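-
-For example, if an asset carries both common encryption and FairPlay, appending *(format=m3u8-aapl,encryption=cbcs-aapl)* to the manifest URL selects the FairPlay-encrypted HLS stream, while *encryption=cenc* would select the common-encryption stream instead.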
-
-## Create and configure a Visual Studio project
-
-1. Set up your development environment and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-2. Add the following elements to **appSettings** defined in your app.config file:
-
- ```xml
- <add key="Issuer" value="http://testissuer.com"/>
- <add key="Audience" value="urn:test"/>
- ```
-
-## Example
-
-The following sample demonstrates the ability to use Media Services to deliver your content encrypted with FairPlay. This functionality was introduced in the Azure Media Services SDK for .NET version 3.6.0.
-
-Overwrite the code in your Program.cs file with the code shown in this section.
-
->[!NOTE]
->There is a limit of 1,000,000 policies for different AMS policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). Use the same policy ID if you always use the same days and access permissions; for example, for policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) article.
-
-Make sure to update variables to point to folders where your input files are located.
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.Configuration;
-using System.IO;
-using System.Linq;
-using System.Threading;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
-using Microsoft.WindowsAzure.MediaServices.Client.DynamicEncryption;
-using Microsoft.WindowsAzure.MediaServices.Client.FairPlay;
-using Newtonsoft.Json;
-using System.Security.Cryptography.X509Certificates;
-
-namespace DynamicEncryptionWithFairPlay
-{
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static readonly Uri _sampleIssuer =
- new Uri(ConfigurationManager.AppSettings["Issuer"]);
- private static readonly Uri _sampleAudience =
- new Uri(ConfigurationManager.AppSettings["Audience"]);
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- private static readonly string _mediaFiles =
- Path.GetFullPath(@"../..\Media");
-
- private static readonly string _singleMP4File =
- Path.Combine(_mediaFiles, @"BigBuckBunny.mp4");
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- bool tokenRestriction = false;
- string tokenTemplateString = null;
-
- IAsset asset = UploadFileAndCreateAsset(_singleMP4File);
- Console.WriteLine("Uploaded asset: {0}", asset.Id);
-
- IAsset encodedAsset = EncodeToAdaptiveBitrateMP4Set(asset);
- Console.WriteLine("Encoded asset: {0}", encodedAsset.Id);
-
- IContentKey key = CreateCommonCBCTypeContentKey(encodedAsset);
- Console.WriteLine("Created key {0} for the asset {1} ", key.Id, encodedAsset.Id);
- Console.WriteLine("FairPlay License Key delivery URL: {0}", key.GetKeyDeliveryUrl(ContentKeyDeliveryType.FairPlay));
- Console.WriteLine();
-
- if (tokenRestriction)
- tokenTemplateString = AddTokenRestrictedAuthorizationPolicy(key);
- else
- AddOpenAuthorizationPolicy(key);
-
- Console.WriteLine("Added authorization policy: {0}", key.AuthorizationPolicyId);
- Console.WriteLine();
-
- CreateAssetDeliveryPolicy(encodedAsset, key);
- Console.WriteLine("Created asset delivery policy. \n");
- Console.WriteLine();
-
- if (tokenRestriction && !String.IsNullOrEmpty(tokenTemplateString))
- {
- // Deserializes a string containing an Xml representation of a TokenRestrictionTemplate
- // back into a TokenRestrictionTemplate class instance.
- TokenRestrictionTemplate tokenTemplate =
- TokenRestrictionTemplateSerializer.Deserialize(tokenTemplateString);
-
- // Generate a test token based on the data in the given TokenRestrictionTemplate.
- // Note, you need to pass the key id Guid because we specified
- // TokenClaim.ContentKeyIdentifierClaim during the creation of TokenRestrictionTemplate.
- Guid rawkey = EncryptionUtils.GetKeyIdAsGuid(key.Id);
- string testToken = TokenRestrictionTemplateSerializer.GenerateTestToken(tokenTemplate, null, rawkey,
- DateTime.UtcNow.AddDays(365));
- Console.WriteLine("The authorization token is:\nBearer {0}", testToken);
- Console.WriteLine();
- }
-
- string url = GetStreamingOriginLocator(encodedAsset);
- Console.WriteLine("Encrypted HLS URL: {0}/manifest(format=m3u8-aapl)", url);
-
- Console.ReadLine();
- }
-
- static public IAsset UploadFileAndCreateAsset(string singleFilePath)
- {
- if (!File.Exists(singleFilePath))
- {
- Console.WriteLine("File does not exist.");
- return null;
- }
-
- var assetName = Path.GetFileNameWithoutExtension(singleFilePath);
- IAsset inputAsset = _context.Assets.Create(assetName, AssetCreationOptions.None);
-
- var assetFile = inputAsset.AssetFiles.Create(Path.GetFileName(singleFilePath));
-
- Console.WriteLine("Created assetFile {0}", assetFile.Name);
-
- Console.WriteLine("Upload {0}", assetFile.Name);
-
- assetFile.Upload(singleFilePath);
- Console.WriteLine("Done uploading {0}", assetFile.Name);
-
- return inputAsset;
- }
-
- static public IAsset EncodeToAdaptiveBitrateMP4Set(IAsset inputAsset)
- {
- var encodingPreset = "Adaptive Streaming";
-
- IJob job = _context.Jobs.Create(String.Format("Encoding {0}", inputAsset.Name));
-
- var mediaProcessors =
- _context.MediaProcessors.Where(p => p.Name.Contains("Media Encoder Standard")).ToList();
-
- var latestMediaProcessor =
- mediaProcessors.OrderBy(mp => new Version(mp.Version)).LastOrDefault();
-
- ITask encodeTask = job.Tasks.AddNew("Encoding", latestMediaProcessor, encodingPreset, TaskOptions.None);
- encodeTask.InputAssets.Add(inputAsset);
- encodeTask.OutputAssets.AddNew(String.Format("{0} as {1}", inputAsset.Name, encodingPreset), AssetCreationOptions.StorageEncrypted);
-
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
- job.Submit();
- job.GetExecutionProgressTask(CancellationToken.None).Wait();
-
- return job.OutputMediaAssets[0];
- }
-
- static public IContentKey CreateCommonCBCTypeContentKey(IAsset asset)
- {
- // Create HLS SAMPLE AES encryption content key
- Guid keyId = Guid.NewGuid();
- byte[] contentKey = GetRandomBuffer(16);
-
- IContentKey key = _context.ContentKeys.Create(
- keyId,
- contentKey,
- "ContentKey",
- ContentKeyType.CommonEncryptionCbcs);
-
- // Associate the key with the asset.
- asset.ContentKeys.Add(key);
-
- return key;
- }
-
- static public void AddOpenAuthorizationPolicy(IContentKey contentKey)
- {
- // Create ContentKeyAuthorizationPolicy with Open restrictions
- // and create authorization policy
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions = new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Open",
- KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
- Requirements = null
- }
- };
-
- // Configure FairPlay policy option.
- string FairPlayConfiguration = ConfigureFairPlayPolicyOptions();
-
- IContentKeyAuthorizationPolicyOption FairPlayPolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("",
- ContentKeyDeliveryType.FairPlay,
- restrictions,
- FairPlayConfiguration);
-
- IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Deliver Common CBC Content Key with no restrictions").
- Result;
-
- contentKeyAuthorizationPolicy.Options.Add(FairPlayPolicy);
-
- // Associate the content key authorization policy with the content key.
- contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
- contentKey = contentKey.UpdateAsync().Result;
- }
-
- public static string AddTokenRestrictedAuthorizationPolicy(IContentKey contentKey)
- {
- string tokenTemplateString = GenerateTokenRequirements();
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions = new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Token Authorization Policy",
- KeyRestrictionType = (int)ContentKeyRestrictionType.TokenRestricted,
- Requirements = tokenTemplateString,
- }
- };
-
- // Configure FairPlay policy option.
- string FairPlayConfiguration = ConfigureFairPlayPolicyOptions();
-
- IContentKeyAuthorizationPolicyOption FairPlayPolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("Token option",
- ContentKeyDeliveryType.FairPlay,
- restrictions,
- FairPlayConfiguration);
-
- IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Deliver Common CBC Content Key with token restrictions").
- Result;
-
- contentKeyAuthorizationPolicy.Options.Add(FairPlayPolicy);
-
- // Associate the content key authorization policy with the content key
- contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
- contentKey = contentKey.UpdateAsync().Result;
-
- return tokenTemplateString;
- }
-
- private static string ConfigureFairPlayPolicyOptions()
- {
- // For testing you can provide all zeroes for ASK bytes together with the cert from Apple FPS SDK.
- // However, for production you must use a real ASK from Apple bound to a real prod certificate.
- byte[] askBytes = Guid.NewGuid().ToByteArray();
- var askId = Guid.NewGuid();
- // Key delivery retrieves askKey by askId and uses this key to generate the response.
- IContentKey askKey = _context.ContentKeys.Create(
- askId,
- askBytes,
- "askKey",
- ContentKeyType.FairPlayASk);
-
- //Customer password for creating the .pfx file.
- string pfxPassword = "<customer password for creating the .pfx file>";
- // Key delivery retrieves pfxPasswordKey by pfxPasswordId and uses this key to generate the response.
- var pfxPasswordId = Guid.NewGuid();
- byte[] pfxPasswordBytes = System.Text.Encoding.UTF8.GetBytes(pfxPassword);
- IContentKey pfxPasswordKey = _context.ContentKeys.Create(
- pfxPasswordId,
- pfxPasswordBytes,
- "pfxPasswordKey",
- ContentKeyType.FairPlayPfxPassword);
-
- // iv - 16 bytes random value, must match the iv in the asset delivery policy.
- byte[] iv = Guid.NewGuid().ToByteArray();
-
- //Specify the .pfx file created by the customer.
- var appCert = new X509Certificate2("path to the .pfx file created by the customer", pfxPassword, X509KeyStorageFlags.Exportable);
-
- string FairPlayConfiguration =
- Microsoft.WindowsAzure.MediaServices.Client.FairPlay.FairPlayConfiguration.CreateSerializedFairPlayOptionConfiguration(
- appCert,
- pfxPassword,
- pfxPasswordId,
- askId,
- iv);
-
- return FairPlayConfiguration;
- }
-
- static private string GenerateTokenRequirements()
- {
- TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.SWT);
-
- template.PrimaryVerificationKey = new SymmetricVerificationKey();
- template.AlternateVerificationKeys.Add(new SymmetricVerificationKey());
- template.Audience = _sampleAudience.ToString();
- template.Issuer = _sampleIssuer.ToString();
- template.RequiredClaims.Add(TokenClaim.ContentKeyIdentifierClaim);
-
- return TokenRestrictionTemplateSerializer.Serialize(template);
- }
-
- static public void CreateAssetDeliveryPolicy(IAsset asset, IContentKey key)
- {
- var kdPolicy = _context.ContentKeyAuthorizationPolicies.Where(p => p.Id == key.AuthorizationPolicyId).Single();
-
- var kdOption = kdPolicy.Options.Single(o => o.KeyDeliveryType == ContentKeyDeliveryType.FairPlay);
-
- FairPlayConfiguration configFP = JsonConvert.DeserializeObject<FairPlayConfiguration>(kdOption.KeyDeliveryConfiguration);
-
- // Get the FairPlay license service URL.
- Uri acquisitionUrl = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.FairPlay);
-
- // The reason the below code replaces "https://" with "skd://" is because
- // in the IOS player sample code which you obtained in Apple developer account,
- // the player only recognizes a Key URL that starts with skd://.
- // However, if you are using a customized player,
- // you can choose whatever protocol you want.
- // For example, "https".
-
- Dictionary<AssetDeliveryPolicyConfigurationKey, string> assetDeliveryPolicyConfiguration =
- new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
- {
- {AssetDeliveryPolicyConfigurationKey.FairPlayLicenseAcquisitionUrl, acquisitionUrl.ToString().Replace("https://", "skd://")},
- {AssetDeliveryPolicyConfigurationKey.CommonEncryptionIVForCbcs, configFP.ContentEncryptionIV}
- };
-
- var assetDeliveryPolicy = _context.AssetDeliveryPolicies.Create(
- "AssetDeliveryPolicy",
- AssetDeliveryPolicyType.DynamicCommonEncryptionCbcs,
- AssetDeliveryProtocol.HLS,
- assetDeliveryPolicyConfiguration);
-
- // Add AssetDelivery Policy to the asset
- asset.DeliveryPolicies.Add(assetDeliveryPolicy);
-
- }
-
- /// <summary>
- /// Gets the streaming origin locator.
- /// </summary>
- /// <param name="assets"></param>
- /// <returns></returns>
- static public string GetStreamingOriginLocator(IAsset asset)
- {
-
- // Get a reference to the streaming manifest file from the
- // collection of files in the asset.
-
- var assetFile = asset.AssetFiles.ToList().Where(f => f.Name.ToLower().
- EndsWith(".ism")).
- FirstOrDefault();
-
- // Create a 30-day readonly access policy.
- IAccessPolicy policy = _context.AccessPolicies.Create("Streaming policy",
- TimeSpan.FromDays(30),
- AccessPermissions.Read);
-
- // Create a locator to the streaming content on an origin.
- ILocator originLocator = _context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset,
- policy,
- DateTime.UtcNow.AddMinutes(-5));
-
- // Create a URL to the manifest file.
- return originLocator.Path + assetFile.Name;
- }
-
- static private void JobStateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine(string.Format("{0}\n State: {1}\n Time: {2}\n\n",
- ((IJob)sender).Name,
- e.CurrentState,
- DateTime.UtcNow.ToString(@"yyyy_M_d__hh_mm_ss")));
- }
-
- static private byte[] GetRandomBuffer(int length)
- {
- var returnValue = new byte[length];
-
- using (var rng =
- new System.Security.Cryptography.RNGCryptoServiceProvider())
- {
- rng.GetBytes(returnValue);
- }
-
- return returnValue;
- }
- }
-}
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
media-services Media Services Protect Hls With Offline Fairplay https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-protect-hls-with-offline-fairplay.md
- Title: Protect HLS content with offline Apple FairPlay - Azure | Microsoft Docs
-description: This topic gives an overview and shows how to use Azure Media Services to dynamically encrypt your HTTP Live Streaming (HLS) content with Apple FairPlay in offline mode.
-
-keywords: HLS, DRM, FairPlay Streaming (FPS), Offline, iOS 10
-Previously updated: 03/10/2021
-# Offline FairPlay Streaming for iOS
--
-> [!div class="op_single_selector" title1="Select the version of Media Services that you are using:"]
-> * [Version 3](../latest/drm-offline-fairplay-for-ios-concept.md)
-> * [Version 2](media-services-protect-hls-with-offline-fairplay.md)
--
-Azure Media Services provides a set of well-designed [content protection services](https://azure.microsoft.com/services/media-services/content-protection/) that cover:
-- Microsoft PlayReady
-- Google Widevine
-- Apple FairPlay
-- AES-128 encryption
-Digital rights management (DRM)/Advanced Encryption Standard (AES) encryption of content is performed dynamically upon request for various streaming protocols. DRM license/AES decryption key delivery services also are provided by Media Services.
-
-Besides protecting content for online streaming over various streaming protocols, offline mode for protected content is also an often-requested feature. Offline-mode support is needed for the following scenarios:
-
-* Playback when internet connection isn't available, such as during travel.
-* Some content providers might disallow DRM license delivery beyond a country/region's border. If users want to watch content while traveling outside of the country/region, offline download is needed.
-* In some countries/regions, internet availability and/or bandwidth is still limited. Users might choose to download first to be able to watch content in a resolution that is high enough for a satisfactory viewing experience. In this case, the issue typically isn't network availability but limited network bandwidth. Over-the-top (OTT)/online video platform (OVP) providers request offline-mode support.
-
-This article covers FairPlay Streaming (FPS) offline-mode support that targets devices running iOS 10 or later. This feature isn't supported for other Apple platforms, such as watchOS, tvOS, or Safari on macOS.
-
-## Preliminary steps
-Before you implement offline DRM for FairPlay on an iOS 10+ device:
-
-* Become familiar with online content protection for FairPlay. For more information, see the following articles and samples:
-
- - [Apple FairPlay Streaming for Azure Media Services is generally available](https://azure.microsoft.com/blog/apple-FairPlay-streaming-for-azure-media-services-generally-available/)
- - [Protect your HLS content with Apple FairPlay or Microsoft PlayReady](./media-services-protect-hls-with-fairplay.md)
- - [A sample for online FPS streaming](https://azure.microsoft.com/resources/samples/media-services-dotnet-dynamic-encryption-with-FairPlay/)
-
-* Obtain the FPS SDK from the Apple Developer Network. The FPS SDK contains two components:
-
- - The FPS Server SDK, which contains the Key Security Module (KSM), client samples, a specification, and a set of test vectors.
- - The FPS Deployment Pack, which contains the D function specification, along with instructions about how to generate the FPS Certificate, customer-specific private key, and Application Secret Key. Apple issues the FPS Deployment Pack only to licensed content providers.
-
-## Configuration in Media Services
-For FPS offline-mode configuration via the [Media Services .NET SDK](https://www.nuget.org/packages/windowsazure.mediaservices), use the Media Services .NET SDK version 4.0.0.4 or later, which provides the necessary API to configure FPS offline mode.
-You also need working code that configures online-mode FPS content protection. After you have that code, only the following two changes are needed.
-
-## Code change in the FairPlay configuration
-The first change is to define an "enable offline mode" Boolean, called objDRMSettings.EnableOfflineMode, that is set to true to enable the offline DRM scenario. Depending on this indicator, make the following change to the FairPlay configuration:
-
-```csharp
-if (objDRMSettings.EnableOfflineMode)
- {
- FairPlayConfiguration = Microsoft.WindowsAzure.MediaServices.Client.FairPlay.FairPlayConfiguration.CreateSerializedFairPlayOptionConfiguration(
- objX509Certificate2,
- pfxPassword,
- pfxPasswordId,
- askId,
- iv,
- RentalAndLeaseKeyType.PersistentUnlimited,
- 0x9999);
- }
- else
- {
- FairPlayConfiguration = Microsoft.WindowsAzure.MediaServices.Client.FairPlay.FairPlayConfiguration.CreateSerializedFairPlayOptionConfiguration(
- objX509Certificate2,
- pfxPassword,
- pfxPasswordId,
- askId,
- iv);
- }
-```
-
-## Code change in the asset delivery policy configuration
-The second change is to add a third key to the Dictionary<AssetDeliveryPolicyConfigurationKey, string>.
-Add the AssetDeliveryPolicyConfigurationKey as shown here:
-
-```csharp
-// FPS offline mode
- if (drmSettings.EnableOfflineMode)
- {
- objDictionary_AssetDeliveryPolicyConfigurationKey.Add(AssetDeliveryPolicyConfigurationKey.AllowPersistentLicense, "true");
- Console.WriteLine("FPS OFFLINE MODE: AssetDeliveryPolicyConfigurationKey.AllowPersistentLicense added into asset delivery policy configuration.");
- }
-
- // for IAssetDelivery for FPS
- IAssetDeliveryPolicy objIAssetDeliveryPolicy = objCloudMediaContext.AssetDeliveryPolicies.Create(
- drmSettings.AssetDeliveryPolicyName,
- AssetDeliveryPolicyType.DynamicCommonEncryptionCbcs,
- AssetDeliveryProtocol.HLS,
- objDictionary_AssetDeliveryPolicyConfigurationKey);
-```
-
-After this step, the Dictionary<AssetDeliveryPolicyConfigurationKey, string> in the FPS asset delivery policy contains the following three entries:
-
-* AssetDeliveryPolicyConfigurationKey.FairPlayBaseLicenseAcquisitionUrl or AssetDeliveryPolicyConfigurationKey.FairPlayLicenseAcquisitionUrl, depending on factors such as the FPS KSM/key server used and whether you reuse the same asset delivery policy across multiple assets
-* AssetDeliveryPolicyConfigurationKey.CommonEncryptionIVForCbcs
-* AssetDeliveryPolicyConfigurationKey.AllowPersistentLicense
-
-Now your Media Services account is configured to deliver offline FairPlay licenses.
-
-## Sample iOS Player
-FPS offline-mode support is available only on iOS 10 and later. The FPS Server SDK (version 3.0 or later) contains the documentation and sample for FPS offline mode.
-Specifically, it contains the following two items related to offline mode:
-
-* Document: "Offline Playback with FairPlay Streaming and HTTP Live Streaming." Apple, September 14, 2016. In FPS Server SDK version 4.0, this document is merged into the main FPS document.
-* Sample code: HLSCatalog sample for FPS offline mode in the \FairPlay Streaming Server SDK version 3.1\Development\Client\HLSCatalog_With_FPS\HLSCatalog\.
-In the HLSCatalog sample app, the following code files are used to implement offline-mode features:
-
- - AssetPersistenceManager.swift code file: AssetPersistenceManager is the main class in this sample that demonstrates how to:
-
- - Manage downloading HLS streams, such as the APIs used to start and cancel downloads and to delete existing assets off devices.
- - Monitor the download progress.
- - AssetListTableViewController.swift and AssetListTableViewCell.swift code files: AssetListTableViewController is the main interface of this sample. It provides a list of assets the sample can use to play, download, delete, or cancel a download.
-
-These steps show how to set up a running iOS player. Assuming you start from the HLSCatalog sample in FPS Server SDK version 4.0.1, make the following code changes:
-
-In HLSCatalog\Shared\Managers\ContentKeyDelegate.swift, implement the method `requestContentKeyFromKeySecurityModule(spcData: Data, assetID: String)` by using the following code. Let "drmUrl" be a variable assigned the FairPlay license acquisition (key delivery) URL.
-
-```swift
- var ckcData: Data? = nil
-
- let semaphore = DispatchSemaphore(value: 0)
- let postString = "spc=\(spcData.base64EncodedString())&assetId=\(assetIDString)"
-
- if let postData = postString.data(using: .ascii, allowLossyConversion: true), let drmServerUrl = URL(string: self.drmUrl) {
- var request = URLRequest(url: drmServerUrl)
- request.httpMethod = "POST"
- request.setValue(String(postData.count), forHTTPHeaderField: "Content-Length")
- request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
- request.httpBody = postData
-
- URLSession.shared.dataTask(with: request) { (data, _, error) in
- if let data = data, var responseString = String(data: data, encoding: .utf8) {
- responseString = responseString.replacingOccurrences(of: "<ckc>", with: "").replacingOccurrences(of: "</ckc>", with: "")
- ckcData = Data(base64Encoded: responseString)
- } else {
- print("Error encountered while fetching FairPlay license for URL: \(self.drmUrl), \(error?.localizedDescription ?? "Unknown error")")
- }
-
- semaphore.signal()
- }.resume()
- } else {
- fatalError("Invalid post data")
- }
-
- semaphore.wait()
- return ckcData
-```
-
-In HLSCatalog\Shared\Managers\ContentKeyDelegate.swift, implement the method `requestApplicationCertificate()`. This implementation depends on whether you embed the certificate (public key only) with the device or host the certificate on the web. The following implementation uses the hosted application certificate used in the test samples. Let "certUrl" be a variable that contains the URL of the application certificate.
-
-```swift
-func requestApplicationCertificate() throws -> Data {
-
-    // Download the application certificate (public key) from its hosted URL.
-    // Throw on failure so the non-optional return type (Data) is satisfied.
-    do {
-        return try Data(contentsOf: URL(string: certUrl)!)
-    } catch {
-        print("Error loading FairPlay application certificate: \(error)")
-        throw error
-    }
-}
-```
-
-For the final integrated test, both the video URL and the application certificate URL are provided in the section "Integrated Test."
-
-In HLSCatalog\Shared\Resources\Streams.plist, add your test video URL. For the content key ID, use the FairPlay license acquisition URL with the skd protocol as the unique value.
-
-![Offline FairPlay iOS App Streams](media/media-services-protect-hls-with-offline-FairPlay/media-services-offline-FairPlay-ios-app-streams.png)
-
-Use your own test video URL, FairPlay license acquisition URL, and application certificate URL, if you have them set up. Or you can continue to the next section, which contains test samples.
-
-## Integrated test
-Three test samples in Media Services cover the following three scenarios:
-
-* FPS protected, with video, audio, and alternate audio track
-* FPS protected, with video and audio, but no alternate audio track
-* FPS protected, with video only and no audio
-
-You can find these samples at [this demo site](https://aka.ms/poc#22), with the corresponding application certificate hosted in an Azure web app.
-With either the version 3 or version 4 sample of the FPS Server SDK, if a master playlist contains alternate audio, during offline mode it plays audio only. Therefore, you need to strip the alternate audio. In other words, the second and third samples listed previously work in online and offline mode. The sample listed first plays audio only during offline mode, while online streaming works properly.
-
-## FAQ
-The following frequently asked questions provide assistance with troubleshooting:
-- **Why does only audio play but not video during offline mode?** This behavior seems to be by design of the sample app. When an alternate audio track is present (which is the case for HLS) during offline mode, both iOS 10 and iOS 11 default to the alternate audio track. To compensate for this behavior in FPS offline mode, remove the alternate audio track from the stream. To do this in Media Services, add the dynamic manifest filter "audio-only=false." In other words, an HLS URL ends with .ism/manifest(format=m3u8-aapl,audio-only=false).
-- **Why does it still play audio only without video during offline mode after I add audio-only=false?** Depending on the content delivery network (CDN) cache key design, the content might be cached. Purge the cache.
-- **Is FPS offline mode also supported on iOS 11 in addition to iOS 10?** Yes. FPS offline mode is supported for iOS 10 and iOS 11.
-- **Why can't I find the document "Offline Playback with FairPlay Streaming and HTTP Live Streaming" in the FPS Server SDK?** Since FPS Server SDK version 4, this document was merged into the "FairPlay Streaming Programming Guide."
-- **What does the last parameter stand for in the following API for FPS offline mode?**
-`Microsoft.WindowsAzure.MediaServices.Client.FairPlay.FairPlayConfiguration.CreateSerializedFairPlayOptionConfiguration(objX509Certificate2, pfxPassword, pfxPasswordId, askId, iv, RentalAndLeaseKeyType.PersistentUnlimited, 0x9999);`
-
- For the documentation for this API, see [FairPlayConfiguration.CreateSerializedFairPlayOptionConfiguration Method](/dotnet/api/microsoft.windowsazure.mediaservices.client.fairplay.fairplayconfiguration.createserializedfairplayoptionconfiguration). The parameter represents the duration of the offline rental, with second as the unit.
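- For example, the 0x9999 value in the call above is 39,321 in decimal, so it configures roughly 10.9 hours (39,321 seconds) of offline rental.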
-- **What is the downloaded/offline file structure on iOS devices?** The downloaded file structure on an iOS device looks like the following screenshot. The `_keys` folder stores downloaded FPS licenses, with one store file for each license service host. The `.movpkg` folder stores audio and video content. The first folder with a name that ends with a dash followed by a numeric value contains video content. The numeric value is the PeakBandwidth of the video renditions. The second folder with a name that ends with a dash followed by 0 contains audio content. The third folder, named "Data," contains the master playlist of the FPS content. Finally, boot.xml provides a complete description of the `.movpkg` folder content.
-
-![Offline FairPlay iOS sample app file structure](media/media-services-protect-hls-with-offline-FairPlay/media-services-offline-FairPlay-file-structure.png)
-
-A sample boot.xml file:
-```xml
-<?xml version="1.0" encoding="UTF-8"?>
-<HLSMoviePackage xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" xmlns="http://apple.com/IMG/Schemas/HLSMoviePackage" xsi:schemaLocation="http://apple.com/IMG/Schemas/HLSMoviePackage /System/Library/Schemas/HLSMoviePackage.xsd">
- <Version>1.0</Version>
- <HLSMoviePackageType>PersistedStore</HLSMoviePackageType>
- <Streams>
- <Stream ID="1-4DTFY3A3VDRCNZ53YZ3RJ2NPG2AJHNBD-0" Path="1-4DTFY3A3VDRCNZ53YZ3RJ2NPG2AJHNBD-0" NetworkURL="https://willzhanmswest.streaming.mediaservices.windows.net/e7c76dbb-8e38-44b3-be8c-5c78890c4bb4/MicrosoftElite01.ism/QualityLevels(127000)/Manifest(aac_eng_2_127,format=m3u8-aapl)">
- <Complete>YES</Complete>
- </Stream>
- <Stream ID="0-HC6H5GWC5IU62P4VHE7NWNGO2SZGPKUJ-310656" Path="0-HC6H5GWC5IU62P4VHE7NWNGO2SZGPKUJ-310656" NetworkURL="https://willzhanmswest.streaming.mediaservices.windows.net/e7c76dbb-8e38-44b3-be8c-5c78890c4bb4/MicrosoftElite01.ism/QualityLevels(161000)/Manifest(video,format=m3u8-aapl)">
- <Complete>YES</Complete>
- </Stream>
- </Streams>
- <MasterPlaylist>
- <NetworkURL>https://willzhanmswest.streaming.mediaservices.windows.net/e7c76dbb-8e38-44b3-be8c-5c78890c4bb4/MicrosoftElite01.ism/manifest(format=m3u8-aapl,audio-only=false)</NetworkURL>
- </MasterPlaylist>
- <DataItems Directory="Data">
- <DataItem>
- <ID>CB50F631-8227-477A-BCEC-365BBF12BCC0</ID>
- <Category>Playlist</Category>
- <Name>master.m3u8</Name>
- <DataPath>Playlist-master.m3u8-CB50F631-8227-477A-BCEC-365BBF12BCC0.data</DataPath>
- <Role>Master</Role>
- </DataItem>
- </DataItems>
-</HLSMoviePackage>
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Summary
-This document includes the following steps and information you can use to implement FPS offline mode:
-
-* Media Services content protection configuration via the Media Services .NET API configures dynamic FairPlay encryption and FairPlay license delivery in Media Services.
-* An iOS player based on the sample from the FPS Server SDK sets up an iOS player that can play FPS content either in online streaming mode or offline mode.
-* Sample FPS videos are used to test offline mode and online streaming.
-* A FAQ answers questions about FPS offline mode.
-
-## Next steps
-
media-services Media Services Protect With Aes128 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-protect-with-aes128.md
- Title: Use AES-128 dynamic encryption and the key delivery service | Microsoft Docs
-description: This topic shows how to dynamically encrypt with AES-128 and use the key delivery service.
- Previously updated: 03/10/2021
-
-# Use AES-128 dynamic encryption and the key delivery service
-
-> [!div class="op_single_selector"]
-> * [.NET](media-services-protect-with-aes128.md)
-> * [Java](https://github.com/rnrneverdies/azure-sdk-for-media-services-java-samples)
-> * [PHP](https://github.com/Azure/azure-sdk-for-php/tree/master/examples/MediaServices)
->
-
-You can use Media Services to deliver HTTP Live Streaming (HLS) and Smooth Streaming encrypted with AES by using 128-bit encryption keys. Media Services also provides the key delivery service that delivers encryption keys to authorized users. If you want Media Services to encrypt an asset, you associate an encryption key with the asset and also configure authorization policies for the key. When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content by using AES encryption. To decrypt the stream, the player requests the key from the key delivery service. To determine whether the user is authorized to get the key, the service evaluates the authorization policies that you specified for the key.
-
-Media Services supports multiple ways of authenticating users who make key requests. The content key authorization policy can have one or more authorization restrictions, either open or token restrictions. The token-restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the [simple web token](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_2) (SWT) and [JSON Web Token](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_3) (JWT) formats. For more information, see [Configure the content key's authorization policy](media-services-protect-with-aes128.md#configure_key_auth_policy).
-
-To take advantage of dynamic encryption, you need to have an asset that contains a set of multi-bitrate MP4 files or multi-bitrate Smooth Streaming source files. You also need to configure the delivery policy for the asset (described later in this article). Then, based on the format specified in the streaming URL, the on-demand streaming server ensures that the stream is delivered in the protocol you selected. As a result, you need to store and pay only for the files in a single storage format. Media Services builds and serves the appropriate response based on requests from a client.
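-
-For example, appending a format option to the locator path selects the delivery protocol (the host, locator GUID, and asset name below are illustrative placeholders):
-
-```text
-http://test001.origin.mediaservices.windows.net/<locator-GUID>/BigBuckBunny.ism/manifest                       (Smooth Streaming)
-http://test001.origin.mediaservices.windows.net/<locator-GUID>/BigBuckBunny.ism/manifest(format=m3u8-aapl)     (HLS)
-http://test001.origin.mediaservices.windows.net/<locator-GUID>/BigBuckBunny.ism/manifest(format=mpd-time-csf)  (MPEG-DASH)
-```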
-
-This article is useful to developers who work on applications that deliver protected media. The article shows you how to configure the key delivery service with authorization policies so that only authorized clients can receive encryption keys. It also shows how to use dynamic encryption.
-
-For information on how to encrypt content with the Advanced Encryption Standard (AES) for delivery to Safari on macOS, see [this blog post](https://azure.microsoft.com/blog/how-to-make-token-authorized-aes-encrypted-hls-stream-working-in-safari/).
-
-## AES-128 dynamic encryption and key delivery service workflow
-
-Perform the following general steps when you encrypt your assets with AES by using the Media Services key delivery service and also by using dynamic encryption:
-
-1. [Create an asset, and upload files into the asset](media-services-protect-with-aes128.md#create_asset).
-
-2. [Encode the asset that contains the file to the adaptive bitrate MP4 set](media-services-protect-with-aes128.md#encode_asset).
-
-3. [Create a content key, and associate it with the encoded asset](media-services-protect-with-aes128.md#create_contentkey). In Media Services, the content key contains the asset's encryption key.
-
-4. [Configure the content key's authorization policy](media-services-protect-with-aes128.md#configure_key_auth_policy). You must configure the content key authorization policy. The client must meet the policy before the content key is delivered to the client.
-
-5. [Configure the delivery policy for an asset](media-services-protect-with-aes128.md#configure_asset_delivery_policy). The delivery policy configuration includes the key acquisition URL and an initialization vector (IV). (AES-128 requires the same IV for encryption and decryption.) The configuration also includes the delivery protocol (for example, MPEG-DASH, HLS, Smooth Streaming, or all) and the type of dynamic encryption (for example, envelope or no dynamic encryption).
-
- You can apply a different policy to each protocol on the same asset. For example, you can apply PlayReady encryption to Smooth/DASH and an AES envelope to HLS. Any protocols that aren't defined in a delivery policy are blocked from streaming. (An example is if you add a single policy that specifies only HLS as the protocol.) The exception is if you have no asset delivery policy defined at all. Then, all protocols are allowed in the clear.
-
-6. [Create an OnDemand locator](media-services-protect-with-aes128.md#create_locator) to get a streaming URL.
-
-The article also shows [how a client application can request a key from the key delivery service](media-services-protect-with-aes128.md#client_request).
-
-You can find a complete [.NET example](media-services-protect-with-aes128.md#example) at the end of the article.
-
-The following image demonstrates the workflow previously described. Here, the token is used for authentication.
-
-![Protect with AES-128](./media/media-services-content-protection-overview/media-services-content-protection-with-aes.png)
-
-The remainder of this article provides explanations, code examples, and links to topics that show you how to achieve the tasks previously described.
-
-## Current limitations
-If you add or update your asset's delivery policy, you must delete any existing locator and create a new locator.
-
-## <a id="create_asset"></a>Create an asset and upload files into the asset
-To manage, encode, and stream your videos, you must first upload your content into Media Services. After it's uploaded, your content is stored securely in the cloud for further processing and streaming.
-
-For more information, see [Upload files into a Media Services account](media-services-dotnet-upload-files.md).
-
-## <a id="encode_asset"></a>Encode the asset that contains the file to the adaptive bitrate MP4 set
-With dynamic encryption, you create an asset that contains a set of multi-bitrate MP4 files or multi-bitrate Smooth Streaming source files. Then, based on the specified format in the manifest or fragment request, the on-demand streaming server ensures that you receive the stream in the protocol you selected. Then, you only need to store and pay for the files in a single storage format. Media Services builds and serves the appropriate response based on requests from a client. For more information, see [Dynamic packaging overview](media-services-dynamic-packaging-overview.md).
-
->[!NOTE]
->When your Media Services account is created, a default streaming endpoint is added to your account in the "Stopped" state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content must be in the "Running" state.
->
->Also, to use dynamic packaging and dynamic encryption, your asset must contain a set of adaptive bitrate MP4s or adaptive bitrate Smooth Streaming files.
-
-For instructions on how to encode, see [Encode an asset by using Media Encoder Standard](media-services-dotnet-encode-with-media-encoder-standard.md).
-
-## <a id="create_contentkey"></a>Create a content key and associate it with the encoded asset
-In Media Services, the content key contains the key that you want to encrypt an asset with.
-
-For more information, see [Create a content key](media-services-dotnet-create-contentkey.md).
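-
-As a minimal sketch (`_context` is a CloudMediaContext and `encodedAsset` is the encoded asset from the previous step; both variable names are hypothetical), creating an envelope content key and associating it with the asset looks like this:
-
-```csharp
-// Create a 16-byte (128-bit) AES content key with a new key ID.
-Guid keyId = Guid.NewGuid();
-byte[] contentKeyBytes = new byte[16];
-using (var rng = new System.Security.Cryptography.RNGCryptoServiceProvider())
-{
-    rng.GetBytes(contentKeyBytes);
-}
-
-IContentKey contentKey = _context.ContentKeys.Create(
-    keyId,
-    contentKeyBytes,
-    "ContentKey",
-    ContentKeyType.EnvelopeEncryption);
-
-// Associate the key with the encoded asset.
-encodedAsset.ContentKeys.Add(contentKey);
-```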
-
-## <a id="configure_key_auth_policy"></a>Configure the content key's authorization policy
-Media Services supports multiple ways of authenticating users who make key requests. You must configure the content key authorization policy. The client (player) must meet the policy before the key can be delivered to the client. The content key authorization policy can have one or more authorization restrictions: open, token, or IP restrictions.
-
-For more information, see [Configure a content key authorization policy](media-services-dotnet-configure-content-key-auth-policy.md).
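-
-As a hedged sketch, an open (unrestricted) policy for AES envelope key delivery might look like the following. It mirrors the PlayReady/Widevine policy code shown later in this document, swapping in `ContentKeyDeliveryType.BaselineHttp`, the delivery type for envelope keys:
-
-```csharp
-List<ContentKeyAuthorizationPolicyRestriction> restrictions =
-    new List<ContentKeyAuthorizationPolicyRestriction>
-    {
-        new ContentKeyAuthorizationPolicyRestriction
-        {
-            Name = "Open",
-            KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
-            Requirements = null
-        }
-    };
-
-// BaselineHttp delivers AES envelope keys over HTTPS.
-IContentKeyAuthorizationPolicyOption policyOption =
-    _context.ContentKeyAuthorizationPolicyOptions.Create(
-        "Open option",
-        ContentKeyDeliveryType.BaselineHttp,
-        restrictions,
-        "");
-
-IContentKeyAuthorizationPolicy authPolicy = _context
-    .ContentKeyAuthorizationPolicies
-    .CreateAsync("Open authorization policy")
-    .Result;
-
-authPolicy.Options.Add(policyOption);
-
-// Associate the authorization policy with the content key.
-contentKey.AuthorizationPolicyId = authPolicy.Id;
-contentKey = contentKey.UpdateAsync().Result;
-```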
-
-## <a id="configure_asset_delivery_policy"></a>Configure an asset delivery policy
-Configure the delivery policy for your asset. The asset delivery policy configuration includes:
-
-* The key acquisition URL.
-* The initialization vector (IV) to use for the envelope encryption. AES-128 requires the same IV for encryption and decryption.
-* The asset delivery protocol (for example, MPEG-DASH, HLS, Smooth Streaming, or all).
-* The type of dynamic encryption (for example, AES envelope) or no dynamic encryption.
-
-For more information, see [Configure an asset delivery policy](media-services-dotnet-configure-asset-delivery-policy.md).
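-
-As a hedged sketch (modeled on the delivery-policy code shown later in this document, using the envelope-specific configuration keys), configuring dynamic envelope encryption for all three protocols might look like this:
-
-```csharp
-// Get the key delivery (acquisition) URL for envelope keys and strip its query string.
-Uri keyDeliveryUrl = contentKey.GetKeyDeliveryUrl(ContentKeyDeliveryType.BaselineHttp);
-Uri keyAcquisitionUri = new Uri(keyDeliveryUrl.GetLeftPart(UriPartial.Path));
-
-// AES-128 requires the same IV for encryption and decryption.
-byte[] envelopeEncryptionIV = new byte[16];
-using (var rng = new System.Security.Cryptography.RNGCryptoServiceProvider())
-{
-    rng.GetBytes(envelopeEncryptionIV);
-}
-
-Dictionary<AssetDeliveryPolicyConfigurationKey, string> configuration =
-    new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
-    {
-        { AssetDeliveryPolicyConfigurationKey.EnvelopeKeyAcquisitionUrl, keyAcquisitionUri.ToString() },
-        { AssetDeliveryPolicyConfigurationKey.EnvelopeEncryptionIVAsBase64, Convert.ToBase64String(envelopeEncryptionIV) }
-    };
-
-IAssetDeliveryPolicy deliveryPolicy = _context.AssetDeliveryPolicies.Create(
-    "Envelope delivery policy",
-    AssetDeliveryPolicyType.DynamicEnvelopeEncryption,
-    AssetDeliveryProtocol.SmoothStreaming | AssetDeliveryProtocol.HLS | AssetDeliveryProtocol.Dash,
-    configuration);
-
-encodedAsset.DeliveryPolicies.Add(deliveryPolicy);
-```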
-
-## <a id="create_locator"></a>Create an OnDemand streaming locator to get a streaming URL
-You need to provide your user with the streaming URL for Smooth Streaming, DASH, or HLS.
-
-> [!NOTE]
-> If you add or update your asset's delivery policy, you must delete any existing locator and create a new locator.
->
->
-
-For instructions on how to publish an asset and build a streaming URL, see [Build a streaming URL](media-services-deliver-streaming-content.md).
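-
-As a minimal sketch (same hypothetical `_context` and `encodedAsset` variables as above, mirroring the locator code shown later in this document), publishing the asset and composing the base streaming URL looks like this:
-
-```csharp
-// Get a reference to the streaming manifest (.ism) file in the asset.
-var manifestFile = encodedAsset.AssetFiles.ToList()
-    .Where(f => f.Name.ToLower().EndsWith(".ism"))
-    .FirstOrDefault();
-
-// Create a 30-day read-only access policy and an OnDemand (origin) locator.
-IAccessPolicy accessPolicy = _context.AccessPolicies.Create(
-    "Streaming policy", TimeSpan.FromDays(30), AccessPermissions.Read);
-
-ILocator originLocator = _context.Locators.CreateLocator(
-    LocatorType.OnDemandOrigin, encodedAsset, accessPolicy,
-    DateTime.UtcNow.AddMinutes(-5));
-
-// Base URL; append /manifest plus a format option to select HLS, DASH, or Smooth Streaming.
-string streamingUrl = originLocator.Path + manifestFile.Name + "/manifest";
-```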
-
-## Get a test token
-Get a test token based on the token restriction that was used for the key authorization policy.
-
-```csharp
- // Deserializes a string containing an Xml representation of a TokenRestrictionTemplate
- // back into a TokenRestrictionTemplate class instance.
- TokenRestrictionTemplate tokenTemplate =
- TokenRestrictionTemplateSerializer.Deserialize(tokenTemplateString);
-
- // Generate a test token based on the data in the given TokenRestrictionTemplate.
- //The GenerateTestToken method returns the token without the word "Bearer" in front
- //so you have to add it in front of the token string.
- string testToken = TokenRestrictionTemplateSerializer.GenerateTestToken(tokenTemplate);
- Console.WriteLine("The authorization token is:\nBearer {0}", testToken);
-```
-
-You can use the [Azure Media Services Player](https://aka.ms/azuremediaplayer) to test your stream.
-
-## <a id="client_request"></a>How can your client request a key from the key delivery service?
-In the previous step, you constructed the URL that points to a manifest file. Your client needs to extract the necessary information from the streaming manifest files to make a request to the key delivery service.
-
-### Manifest files
-The client needs to extract the key delivery URL (which also contains the content key ID [kid]) from the manifest file. The client then tries to get the encryption key from the key delivery service. The client also needs to extract the IV value and use it to decrypt the stream. The following snippet shows the `<Protection>` element of the Smooth Streaming manifest:
-
-```xml
- <Protection>
- <ProtectionHeader SystemID="B47B251A-2409-4B42-958E-08DBAE7B4EE9">
- <ContentProtection xmlns:sea="urn:mpeg:dash:schema:sea:2012" schemeIdUri="urn:mpeg:dash:sea:2012">
- <sea:SegmentEncryption schemeIdUri="urn:mpeg:dash:sea:aes128-cbc:2013"/>
- <sea:KeySystem keySystemUri="urn:mpeg:dash:sea:keysys:http:2013"/>
- <sea:CryptoPeriod IV="0xD7D7D7D7D7D7D7D7D7D7D7D7D7D7D7D7"
- keyUriTemplate="https://wamsbayclus001kd-hs.cloudapp.net/HlsHandler.ashx?
- kid=da3813af-55e6-48e7-aa9f-a4d6031f7b4d"/>
- </ContentProtection>
- </ProtectionHeader>
- </Protection>
-```
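-
-As an illustrative sketch (not part of the original article; the `manifestXml` variable is hypothetical), the key URL and IV can be pulled from this element with standard XML parsing:
-
-```csharp
-using System;
-using System.Linq;
-using System.Xml.Linq;
-
-// manifestXml holds the <Protection> fragment shown above.
-XDocument doc = XDocument.Parse(manifestXml);
-
-// Match by local name to avoid hard-coding the segment-encryption namespace.
-XElement cryptoPeriod = doc.Descendants()
-    .First(e => e.Name.LocalName == "CryptoPeriod");
-
-// The key delivery URL embeds the content key ID (kid) as a query parameter.
-string keyUrl = (string)cryptoPeriod.Attribute("keyUriTemplate");
-
-// The IV is hex-encoded with a 0x prefix.
-string ivHex = (string)cryptoPeriod.Attribute("IV");
-
-Console.WriteLine("Key URL: {0}", keyUrl);
-Console.WriteLine("IV: {0}", ivHex);
-```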
-
-In the case of HLS, the root manifest is broken into segment files.
-
-For example, the root manifest is: http:\//test001.origin.mediaservices.windows.net/8bfe7d6f-34e3-4d1a-b289-3e48a8762490/BigBuckBunny.ism/manifest(format=m3u8-aapl). It contains a list of segment file names.
-
-```text
-. . .
-#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=630133,RESOLUTION=424x240,CODECS="avc1.4d4015,mp4a.40.2",AUDIO="audio"
-QualityLevels(514369)/Manifest(video,format=m3u8-aapl)
-#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=965441,RESOLUTION=636x356,CODECS="avc1.4d401e,mp4a.40.2",AUDIO="audio"
-QualityLevels(842459)/Manifest(video,format=m3u8-aapl)
-…
-```
-
-If you open one of the segment files in a text editor (for example, http:\//test001.origin.mediaservices.windows.net/8bfe7d6f-34e3-4d1a-b289-3e48a8762490/BigBuckBunny.ism/QualityLevels(514369)/Manifest(video,format=m3u8-aapl)), it contains #EXT-X-KEY, which indicates that the file is encrypted.
-
-```text
-#EXTM3U
-#EXT-X-VERSION:4
-#EXT-X-ALLOW-CACHE:NO
-#EXT-X-MEDIA-SEQUENCE:0
-#EXT-X-TARGETDURATION:9
-#EXT-X-KEY:METHOD=AES-128,
-URI="https://wamsbayclus001kd-hs.cloudapp.net/HlsHandler.ashx?
- kid=da3813af-55e6-48e7-aa9f-a4d6031f7b4d",
- IV=0XD7D7D7D7D7D7D7D7D7D7D7D7D7D7D7D7
-#EXT-X-PROGRAM-DATE-TIME:1970-01-01T00:00:00.000+00:00
-#EXTINF:8.425708,no-desc
-Fragments(video=0,format=m3u8-aapl)
-#EXT-X-ENDLIST
-```
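-
-As an illustrative sketch (the `playlistText` variable is hypothetical; in a real playlist, the #EXT-X-KEY attributes appear on a single line), the key URL and IV can be extracted with simple pattern matching:
-
-```csharp
-using System;
-using System.Text.RegularExpressions;
-
-// playlistText holds the variant playlist shown above.
-Match uriMatch = Regex.Match(playlistText, "URI=\"(?<uri>[^\"]+)\"");
-Match ivMatch = Regex.Match(playlistText, @"IV=(?<iv>0[xX][0-9a-fA-F]+)");
-
-if (uriMatch.Success && ivMatch.Success)
-{
-    Console.WriteLine("Key URL: {0}", uriMatch.Groups["uri"].Value);
-    Console.WriteLine("IV: {0}", ivMatch.Groups["iv"].Value);
-}
-```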
-
->[!NOTE]
->If you plan to play an AES-encrypted HLS in Safari, see [this blog](https://azure.microsoft.com/blog/how-to-make-token-authorized-aes-encrypted-hls-stream-working-in-safari/).
-
-### Request the key from the key delivery service
-
-The following code shows how to send a request to the Media Services key delivery service by using a key delivery Uri (that was extracted from the manifest) and a token. (This article doesn't explain how to get SWTs from an STS.)
-
-```csharp
- private byte[] GetDeliveryKey(Uri keyDeliveryUri, string token)
- {
- HttpWebRequest request = (HttpWebRequest)WebRequest.Create(keyDeliveryUri);
-
- request.Method = "POST";
- request.ContentType = "text/xml";
- if (!string.IsNullOrEmpty(token))
- {
- request.Headers[AuthorizationHeader] = token;
- }
- request.ContentLength = 0;
-
- var response = request.GetResponse();
-
- var stream = response.GetResponseStream();
- if (stream == null)
- {
- throw new NullReferenceException("Response stream is null");
- }
-
-        var buffer = new byte[256];
-        var length = 0;
-        while (stream.CanRead && length < buffer.Length)
-        {
-            var nextByte = stream.ReadByte();
-            if (nextByte == -1)
-            {
-                break;
-            }
-            buffer[length] = (byte)nextByte;
-            length++;
-        }
- response.Close();
-
- // AES keys must be exactly 16 bytes (128 bits).
- var key = new byte[length];
- Array.Copy(buffer, key, length);
- return key;
- }
-```
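-
-A hypothetical call, combining the key URL extracted from the manifest with the test token from the earlier section (the `AuthorizationHeader` constant used in the method above is assumed to be the "Authorization" header name):
-
-```csharp
-byte[] envelopeKey = GetDeliveryKey(new Uri(keyUrl), "Bearer " + testToken);
-Console.WriteLine("Received a {0}-byte key.", envelopeKey.Length); // 16 bytes for AES-128
-```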
-
-## Protect your content with AES-128 by using .NET
-
-### Create and configure a Visual Studio project
-
-1. Set up your development environment, and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-2. Add the following elements to appSettings, as defined in your app.config file:
-
- ```xml
- <add key="Issuer" value="http://testissuer.com"/>
- <add key="Audience" value="urn:test"/>
- ```
-
-### <a id="example"></a>Example
-
-Overwrite the code in your Program.cs file with the code shown in this section.
-
->[!NOTE]
->There is a limit of 1,000,000 policies for different Media Services policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). Use the same policy ID if you always use the same days/access permissions. An example is policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see the "Limit access policies" section in [Manage assets and related entities with the Media Services .NET SDK](media-services-dotnet-manage-entities.md#limit-access-policies).
-
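-For example, a hedged sketch of reusing a named access policy instead of creating a new one per locator (the policy name is hypothetical):
-
-```csharp
-// Look up an existing access policy by name; create it only if it doesn't exist yet.
-IAccessPolicy streamingPolicy = _context.AccessPolicies.ToList()
-    .FirstOrDefault(p => p.Name == "Streaming policy (30-day read)")
-    ?? _context.AccessPolicies.Create(
-        "Streaming policy (30-day read)", TimeSpan.FromDays(30), AccessPermissions.Read);
-```
-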
-Make sure to update variables to point to folders where your input files are located.
-
-[!code-csharp[Main](../../../samples-mediaservices-encryptionaes/DynamicEncryptionWithAES/DynamicEncryptionWithAES/Program.cs)]
-
-## Media Services learning paths
-
-## Provide feedback
media-services Media Services Protect With Playready Widevine https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-protect-with-playready-widevine.md
- Title: Use PlayReady and/or Widevine dynamic common encryption | Microsoft Docs
-description: You can use Azure Media Services to deliver MPEG-DASH, Smooth Streaming, and HTTP Live Streaming (HLS) streams protected with Microsoft PlayReady DRM. You also can use it to deliver DASH encrypted with Widevine DRM. This topic shows how to dynamically encrypt with PlayReady and Widevine DRM.
- Previously updated: 03/10/2021
-
-# Use PlayReady and/or Widevine dynamic common encryption
-
-> [!NOTE]
-> To complete this tutorial, you need an Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
->
-> No new features or functionality are being added to Media Services v2. <br/>Check out the latest version, [Media Services v3](../latest/index.yml). Also, see [migration guidance from v2 to v3](../latest/migrate-v-2-v-3-migration-introduction.md).
->
-
-## Overview
-
- You can use Media Services to deliver MPEG-DASH, Smooth Streaming, and HTTP Live Streaming (HLS) streams protected with [PlayReady digital rights management (DRM)](https://www.microsoft.com/playready/overview/). You also can deliver encrypted DASH streams with Widevine DRM licenses. Both PlayReady and Widevine are encrypted per the common encryption (ISO/IEC 23001-7 CENC) specification. You can use the [Media Services .NET SDK](https://www.nuget.org/packages/windowsazure.mediaservices/) (starting with version 3.5.1) or the REST API to configure AssetDeliveryConfiguration to use Widevine.
-
-Media Services provides a service for delivering PlayReady and Widevine DRM licenses. Media Services also provides APIs that you can use to configure the rights and restrictions that you want the PlayReady or Widevine DRM runtime to enforce when a user plays back protected content. When a user requests DRM-protected content, the player application requests a license from the Media Services license service. If the player application is authorized, the Media Services license service issues a license to the player. A PlayReady or Widevine license contains the decryption key that can be used by the client player to decrypt and stream the content.
-
-You also can use the following Media Services partners to help you deliver Widevine licenses:
-
-* [EZDRM](https://ezdrm.com/)
-* [castLabs](https://castlabs.com/company/partners/azure/)
-
-For more information, see integration with [castLabs](media-services-castlabs-integration.md).
-
-Media Services supports multiple ways of authorizing users who make key requests. The content key authorization policy can have one or more authorization restrictions, either open or token restrictions. The token-restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the [simple web token](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_2) (SWT) and [JSON Web Token](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_3) (JWT) formats.
-
-For more information, see [Configure the content key's authorization policy](media-services-portal-configure-content-key-auth-policy.md).
-
-To take advantage of dynamic encryption, create an asset that contains a set of multi-bitrate MP4 files or multi-bitrate Smooth Streaming source files. You also need to configure the delivery policies for the asset (described later in this topic). Then, based on the format specified in the streaming URL, the on-demand streaming server ensures that the stream is delivered in the protocol you selected. As a result, you store and pay for the files in only a single storage format. Media Services builds and serves the appropriate HTTP response based on each request from a client.
-
-This article is useful to developers who work on applications that deliver media protected with multiple DRMs, such as PlayReady and Widevine. The article shows you how to configure the PlayReady license delivery service with authorization policies so that only authorized clients can receive PlayReady or Widevine licenses. It also shows how to use dynamic encryption with PlayReady or Widevine DRM over DASH.
-
->[!NOTE]
->When your Media Services account is created, a default streaming endpoint is added to your account in the "Stopped" state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content must be in the "Running" state.
-
-## Download the sample
-You can download the sample described in this article from [Azure samples on GitHub](https://github.com/Azure-Samples/media-services-dotnet-dynamic-encryption-with-drm).
-
-## Configure dynamic common encryption and DRM license delivery services
-
-Perform the following general steps when you protect your assets with PlayReady by using the Media Services license delivery service and also by using dynamic encryption:
-
-1. Create an asset, and upload files into the asset.
-
-2. Encode the asset that contains the file to the adaptive bitrate MP4 set.
-
-3. Create a content key, and associate it with the encoded asset. In Media Services, the content key contains the asset's encryption key.
-
-4. Configure the content key's authorization policy. You must configure the content key authorization policy. The client must meet the policy before the content key is delivered to the client.
-
- When you create the content key authorization policy, you must specify the delivery method (PlayReady or Widevine) and the restrictions (open or token). You also must specify information specific to the key delivery type that defines how the key is delivered to the client ([PlayReady](media-services-playready-license-template-overview.md) or [Widevine](media-services-widevine-license-template-overview.md) license template).
-
-5. Configure the delivery policy for an asset. The delivery policy configuration includes the delivery protocol (for example, MPEG-DASH, HLS, Smooth Streaming, or all). The configuration also includes the type of dynamic encryption (for example, common encryption) and the PlayReady or Widevine license acquisition URL.
-
- You can apply a different policy to each protocol on the same asset. For example, you can apply PlayReady encryption to Smooth/DASH and an AES envelope to HLS. Any protocols that aren't defined in a delivery policy (for example, if you add a single policy that specifies only HLS as the protocol) are blocked from streaming. The exception is if you have no asset delivery policy defined at all. Then, all protocols are allowed in the clear.
-
-6. Create an OnDemand locator to get a streaming URL.
-
-You can find a complete .NET example at the end of the article.
-
-The following image demonstrates the workflow previously described. Here, the token is used for authentication.
-
-![Protect with PlayReady](media/media-services-content-protection-overview/media-services-content-protection-with-drm.png)
-
-The remainder of this article provides detailed explanations, code examples, and links to topics that show you how to achieve the tasks previously described.
-
-## Current limitations
-If you add or update an asset delivery policy, you must delete any associated locator and create a new locator.
-
-Currently, multiple content keys aren't supported when you encrypt by using Widevine with Media Services.
-
-## Create an asset and upload files into the asset
-To manage, encode, and stream your videos, you must first upload your content into Media Services. After it's uploaded, your content is stored securely in the cloud for further processing and streaming.
-
-For more information, see [Upload files into a Media Services account](media-services-dotnet-upload-files.md).
-
-## Encode the asset that contains the file to the adaptive bitrate MP4 set
-With dynamic encryption, you create an asset that contains a set of multi-bitrate MP4 files or multi-bitrate Smooth Streaming source files. Then, based on the specified format in the manifest and fragment request, the on-demand streaming server ensures that you receive the stream in the protocol you selected. Then, you store and pay for the files in only a single storage format. Media Services builds and serves the appropriate response based on requests from a client. For more information, see [Dynamic packaging overview](media-services-dynamic-packaging-overview.md).
-
-For instructions on how to encode, see [Encode an asset by using Media Encoder Standard](media-services-dotnet-encode-with-media-encoder-standard.md).
-
-## <a id="create_contentkey"></a>Create a content key and associate it with the encoded asset
-In Media Services, the content key contains the key that you want to encrypt an asset with.
-
-For more information, see [Create a content key](media-services-dotnet-create-contentkey.md).
-
-## <a id="configure_key_auth_policy"></a>Configure the content key's authorization policy
-Media Services supports multiple ways of authenticating users who make key requests. You must configure the content key authorization policy. The client (player) must meet the policy before the key is delivered to the client. The content key authorization policy can have one or more authorization restrictions, either open or token restrictions.
-
-For more information, see [Configure a content key authorization policy](media-services-dotnet-configure-content-key-auth-policy.md#playready-dynamic-encryption).
-
-## <a id="configure_asset_delivery_policy"></a>Configure an asset delivery policy
-Configure the delivery policy for your asset. The asset delivery policy configuration includes:
-
-* The DRM license acquisition URL.
-* The asset delivery protocol (for example, MPEG-DASH, HLS, Smooth Streaming, or all).
-* The type of dynamic encryption (in this case, common encryption).
-
-For more information, see [Configure an asset delivery policy](media-services-dotnet-configure-asset-delivery-policy.md).
-
-## <a id="create_locator"></a>Create an OnDemand streaming locator to get a streaming URL
-You need to provide your user with the streaming URL for Smooth Streaming, DASH, or HLS.
-
-> [!NOTE]
-> If you add or update your asset's delivery policy, you must delete any existing locator and create a new locator.
->
->
-
-For instructions on how to publish an asset and build a streaming URL, see [Build a streaming URL](media-services-deliver-streaming-content.md).
-
-## Get a test token
-Get a test token based on the token restriction that was used for the key authorization policy.
-
-```csharp
-// Deserializes a string containing an XML representation of a TokenRestrictionTemplate
-// back into a TokenRestrictionTemplate class instance.
-TokenRestrictionTemplate tokenTemplate =
-TokenRestrictionTemplateSerializer.Deserialize(tokenTemplateString);
-
-// Generate a test token based on the data in the given TokenRestrictionTemplate.
-//The GenerateTestToken method returns the token without the word "Bearer" in front,
-//so you have to add it in front of the token string.
-string testToken = TokenRestrictionTemplateSerializer.GenerateTestToken(tokenTemplate);
-Console.WriteLine("The authorization token is:\nBearer {0}", testToken);
-```
-
-You can use the [Azure Media Services Player](https://aka.ms/azuremediaplayer) to test your stream.
-
-## Create and configure a Visual Studio project
-
-1. Set up your development environment, and populate the app.config file with connection information, as described in [Media Services development with .NET](media-services-dotnet-how-to-use.md).
-
-2. Add the following elements to **appSettings** defined in your app.config file:
-
- ```xml
- <add key="Issuer" value="http://testissuer.com"/>
- <add key="Audience" value="urn:test"/>
- ```
-
-## Example
-
-The following sample demonstrates functionality that was introduced in the Media Services SDK for .NET version 3.5.2. (Specifically, it includes the ability to define a Widevine license template and request a Widevine license from Media Services.)
-
-Overwrite the code in your Program.cs file with the code shown in this section.
-
->[!NOTE]
->There is a limit of 1 million policies for different Media Services policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). If you always use the same days/access permissions, use the same policy ID. An example is policies for locators that are intended to remain in place for a long time (non-upload policies).
-
-For more information, see [Manage assets and related entities with the Media Services .NET SDK](media-services-dotnet-manage-entities.md#limit-access-policies).
-
-Make sure to update variables to point to folders where your input files are located.
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.Configuration;
-using System.IO;
-using System.Linq;
-using System.Threading;
-using Microsoft.WindowsAzure.MediaServices.Client;
-using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
-using Microsoft.WindowsAzure.MediaServices.Client.DynamicEncryption;
-using Microsoft.WindowsAzure.MediaServices.Client.Widevine;
-using Newtonsoft.Json;
-
-namespace DynamicEncryptionWithDRM
-{
- class Program
- {
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- private static readonly Uri _sampleIssuer =
- new Uri(ConfigurationManager.AppSettings["Issuer"]);
- private static readonly Uri _sampleAudience =
- new Uri(ConfigurationManager.AppSettings["Audience"]);
-
- // Field for service context.
- private static CloudMediaContext _context = null;
-
- private static readonly string _mediaFiles =
-            Path.GetFullPath(@"..\..\Media");
-
- private static readonly string _singleMP4File =
- Path.Combine(_mediaFiles, @"BigBuckBunny.mp4");
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- bool tokenRestriction = false;
- string tokenTemplateString = null;
-
- IAsset asset = UploadFileAndCreateAsset(_singleMP4File);
- Console.WriteLine("Uploaded asset: {0}", asset.Id);
-
- IAsset encodedAsset = EncodeToAdaptiveBitrateMP4Set(asset);
- Console.WriteLine("Encoded asset: {0}", encodedAsset.Id);
-
- IContentKey key = CreateCommonTypeContentKey(encodedAsset);
- Console.WriteLine("Created key {0} for the asset {1} ", key.Id, encodedAsset.Id);
- Console.WriteLine("PlayReady License Key delivery URL: {0}", key.GetKeyDeliveryUrl(ContentKeyDeliveryType.PlayReadyLicense));
- Console.WriteLine();
-
- if (tokenRestriction)
- tokenTemplateString = AddTokenRestrictedAuthorizationPolicy(key);
- else
- AddOpenAuthorizationPolicy(key);
-
- Console.WriteLine("Added authorization policy: {0}", key.AuthorizationPolicyId);
- Console.WriteLine();
-
- CreateAssetDeliveryPolicy(encodedAsset, key);
- Console.WriteLine("Created asset delivery policy. \n");
- Console.WriteLine();
-
- if (tokenRestriction && !String.IsNullOrEmpty(tokenTemplateString))
- {
- // Deserializes a string containing an XML representation of a TokenRestrictionTemplate
- // back into a TokenRestrictionTemplate class instance.
- TokenRestrictionTemplate tokenTemplate =
- TokenRestrictionTemplateSerializer.Deserialize(tokenTemplateString);
-
- // Generate a test token based on the data in the given TokenRestrictionTemplate.
- // Note that you need to pass the key ID GUID because
- // TokenClaim.ContentKeyIdentifierClaim was specified during the creation of TokenRestrictionTemplate.
- Guid rawkey = EncryptionUtils.GetKeyIdAsGuid(key.Id);
- string testToken = TokenRestrictionTemplateSerializer.GenerateTestToken(tokenTemplate, null, rawkey,
- DateTime.UtcNow.AddDays(365));
- Console.WriteLine("The authorization token is:\nBearer {0}", testToken);
- Console.WriteLine();
- }
-
- // You can use the https://amsplayer.azurewebsites.net/azuremediaplayer.html player to test streams.
- // Note that DASH works on Internet Explorer 11 (via PlayReady), Microsoft Edge (via PlayReady), and Chrome (via Widevine).
-
- string url = GetStreamingOriginLocator(encodedAsset);
- Console.WriteLine("Encrypted DASH URL: {0}/manifest(format=mpd-time-csf)", url);
-
- Console.ReadLine();
- }
-
- static public IAsset UploadFileAndCreateAsset(string singleFilePath)
- {
- if (!File.Exists(singleFilePath))
- {
- Console.WriteLine("File does not exist.");
- return null;
- }
-
- var assetName = Path.GetFileNameWithoutExtension(singleFilePath);
- IAsset inputAsset = _context.Assets.Create(assetName, AssetCreationOptions.None);
-
- var assetFile = inputAsset.AssetFiles.Create(Path.GetFileName(singleFilePath));
-
- Console.WriteLine("Created assetFile {0}", assetFile.Name);
-
- Console.WriteLine("Upload {0}", assetFile.Name);
-
- assetFile.Upload(singleFilePath);
- Console.WriteLine("Done uploading {0}", assetFile.Name);
-
- return inputAsset;
- }
-
- static public IAsset EncodeToAdaptiveBitrateMP4Set(IAsset inputAsset)
- {
- var encodingPreset = "Adaptive Streaming";
-
- IJob job = _context.Jobs.Create(String.Format("Encoding into Mp4 {0} to {1}",
- inputAsset.Name,
- encodingPreset));
-
- var mediaProcessors =
- _context.MediaProcessors.Where(p => p.Name.Contains("Media Encoder Standard")).ToList();
-
- var latestMediaProcessor =
- mediaProcessors.OrderBy(mp => new Version(mp.Version)).LastOrDefault();
-
- ITask encodeTask = job.Tasks.AddNew("Encoding", latestMediaProcessor, encodingPreset, TaskOptions.None);
- encodeTask.InputAssets.Add(inputAsset);
- encodeTask.OutputAssets.AddNew(String.Format("{0} as {1}", inputAsset.Name, encodingPreset), AssetCreationOptions.StorageEncrypted);
-
- job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
- job.Submit();
- job.GetExecutionProgressTask(CancellationToken.None).Wait();
-
- return job.OutputMediaAssets[0];
- }
-
- static public IContentKey CreateCommonTypeContentKey(IAsset asset)
- {
-
- Guid keyId = Guid.NewGuid();
- byte[] contentKey = GetRandomBuffer(16);
-
- IContentKey key = _context.ContentKeys.Create(
- keyId,
- contentKey,
- "ContentKey",
- ContentKeyType.CommonEncryption);
-
- // Associate the key with the asset.
- asset.ContentKeys.Add(key);
-
- return key;
- }
-
- static public void AddOpenAuthorizationPolicy(IContentKey contentKey)
- {
-
- // Create ContentKeyAuthorizationPolicy with open restrictions
- // and create an authorization policy.
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions = new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Open",
- KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
- Requirements = null
- }
- };
-
- // Configure PlayReady and Widevine license templates.
- string PlayReadyLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
-
- string WidevineLicenseTemplate = ConfigureWidevineLicenseTemplate();
-
- IContentKeyAuthorizationPolicyOption PlayReadyPolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("",
- ContentKeyDeliveryType.PlayReadyLicense,
- restrictions, PlayReadyLicenseTemplate);
-
- IContentKeyAuthorizationPolicyOption WidevinePolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("",
- ContentKeyDeliveryType.Widevine,
- restrictions, WidevineLicenseTemplate);
-
- IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Deliver Common Content Key with no restrictions").
- Result;
-
- contentKeyAuthorizationPolicy.Options.Add(PlayReadyPolicy);
- contentKeyAuthorizationPolicy.Options.Add(WidevinePolicy);
- // Associate the content key authorization policy with the content key.
- contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
- contentKey = contentKey.UpdateAsync().Result;
- }
-
- public static string AddTokenRestrictedAuthorizationPolicy(IContentKey contentKey)
- {
- string tokenTemplateString = GenerateTokenRequirements();
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions = new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Token Authorization Policy",
- KeyRestrictionType = (int)ContentKeyRestrictionType.TokenRestricted,
- Requirements = tokenTemplateString,
- }
- };
-
- // Configure PlayReady and Widevine license templates.
- string PlayReadyLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
-
- string WidevineLicenseTemplate = ConfigureWidevineLicenseTemplate();
-
- IContentKeyAuthorizationPolicyOption PlayReadyPolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("Token option",
- ContentKeyDeliveryType.PlayReadyLicense,
- restrictions, PlayReadyLicenseTemplate);
-
- IContentKeyAuthorizationPolicyOption WidevinePolicy =
- _context.ContentKeyAuthorizationPolicyOptions.Create("Token option",
- ContentKeyDeliveryType.Widevine,
- restrictions, WidevineLicenseTemplate);
-
- IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Deliver Common Content Key with token restrictions").
- Result;
-
- contentKeyAuthorizationPolicy.Options.Add(PlayReadyPolicy);
- contentKeyAuthorizationPolicy.Options.Add(WidevinePolicy);
-
- // Associate the content key authorization policy with the content key.
- contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
- contentKey = contentKey.UpdateAsync().Result;
-
- return tokenTemplateString;
- }
-
- static private string GenerateTokenRequirements()
- {
- TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.SWT);
-
- template.PrimaryVerificationKey = new SymmetricVerificationKey();
- template.AlternateVerificationKeys.Add(new SymmetricVerificationKey());
- template.Audience = _sampleAudience.ToString();
- template.Issuer = _sampleIssuer.ToString();
- template.RequiredClaims.Add(TokenClaim.ContentKeyIdentifierClaim);
-
- return TokenRestrictionTemplateSerializer.Serialize(template);
- }
-
- static private string ConfigurePlayReadyLicenseTemplate()
- {
- // The following code configures the PlayReady license template by using .NET classes
- // and returns the XML string.
-
- //The PlayReadyLicenseResponseTemplate class represents the template for the response sent back to the end user.
- //It contains a field for a custom data string between the license server and the application
- //(which might be useful for custom app logic) as well as a list of one or more license templates.
- PlayReadyLicenseResponseTemplate responseTemplate = new PlayReadyLicenseResponseTemplate();
-
- // The PlayReadyLicenseTemplate class represents a license template you can use to create PlayReady licenses
- // to be returned to end users.
- //It contains the data on the content key in the license and any rights or restrictions to be
- //enforced by the PlayReady DRM runtime when you use the content key.
- PlayReadyLicenseTemplate licenseTemplate = new PlayReadyLicenseTemplate();
- //Configure whether the license is persistent (saved in persistent storage on the client)
- //or nonpersistent (held in memory only while the player uses the license).
- licenseTemplate.LicenseType = PlayReadyLicenseType.Nonpersistent;
-
- // AllowTestDevices controls whether test devices can use the license or not.
- // If true, the MinimumSecurityLevel property of the license
- // is set to 150. If false (the default), the MinimumSecurityLevel property of the license is set to 2,000.
- licenseTemplate.AllowTestDevices = true;
-
- // You also can configure the PlayRight in the PlayReady license by using the PlayReadyPlayRight class.
- // It grants the user the ability to play back the content subject to the zero or more restrictions
- // configured in the license and on the PlayRight itself (for playback-specific policy).
- // Much of the policy on the PlayRight has to do with output restrictions,
- // which control the types of outputs that the content can be played over and
- // any restrictions that must be put in place when you use a given output.
- // For example, if DigitalVideoOnlyContentRestriction is enabled,
- // the DRM runtime allows the video to be displayed only over digital outputs
- //(analog video outputs aren't allowed to pass the content).
-
- //IMPORTANT: These types of restrictions can be very powerful but also can affect the consumer experience.
- // If output protections are too restrictive,
- // content might be unplayable on some clients. For more information, see the PlayReady Compliance Rules document.
-
- // For example:
- //licenseTemplate.PlayRight.AgcAndColorStripeRestriction = new AgcAndColorStripeRestriction(1);
-
- responseTemplate.LicenseTemplates.Add(licenseTemplate);
-
- return MediaServicesLicenseTemplateSerializer.Serialize(responseTemplate);
- }
-
- private static string ConfigureWidevineLicenseTemplate()
- {
- var template = new WidevineMessage
- {
- allowed_track_types = AllowedTrackTypes.SD_HD,
- content_key_specs = new[]
- {
- new ContentKeySpecs
- {
- required_output_protection = new RequiredOutputProtection { hdcp = Hdcp.HDCP_NONE},
- security_level = 1,
- track_type = "SD"
- }
- },
- policy_overrides = new
- {
- can_play = true,
- can_persist = true,
- can_renew = false
- }
- };
-
- string configuration = JsonConvert.SerializeObject(template);
- return configuration;
- }
-
- static public void CreateAssetDeliveryPolicy(IAsset asset, IContentKey key)
- {
- // Get the PlayReady license service URL.
- Uri acquisitionUrl = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.PlayReadyLicense);
-
- // GetKeyDeliveryUrl for Widevine attaches the KID to the URL.
- // For example: https://amsaccount1.keydelivery.mediaservices.windows.net/Widevine/?KID=268a6dcb-18c8-4648-8c95-f46429e4927c.
- // WidevineBaseLicenseAcquisitionUrl (used in the following) also tells dynamic encryption
-        // to append /?KID=<keyId> to the end of the URL when you create the manifest.
- // As a result, the Widevine license acquisition URL has the KID appended twice,
- // so you need to remove the KID in the URL when you call GetKeyDeliveryUrl.
-
- Uri widevineUrl = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.Widevine);
- UriBuilder uriBuilder = new UriBuilder(widevineUrl);
- uriBuilder.Query = String.Empty;
- widevineUrl = uriBuilder.Uri;
-
- Dictionary<AssetDeliveryPolicyConfigurationKey, string> assetDeliveryPolicyConfiguration =
- new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
- {
- {AssetDeliveryPolicyConfigurationKey.PlayReadyLicenseAcquisitionUrl, acquisitionUrl.ToString()},
- {AssetDeliveryPolicyConfigurationKey.WidevineBaseLicenseAcquisitionUrl, widevineUrl.ToString()}
-
- };
-
- // In this case, we specify only the DASH streaming protocol in the delivery policy.
- // All other protocols are blocked from streaming.
- var assetDeliveryPolicy = _context.AssetDeliveryPolicies.Create(
- "AssetDeliveryPolicy",
- AssetDeliveryPolicyType.DynamicCommonEncryption,
- AssetDeliveryProtocol.Dash,
- assetDeliveryPolicyConfiguration);
-
- // Add AssetDelivery Policy to the asset.
- asset.DeliveryPolicies.Add(assetDeliveryPolicy);
-
- }
-
- /// <summary>
- /// Gets the streaming origin locator.
- /// </summary>
-    /// <param name="asset"></param>
- /// <returns></returns>
- static public string GetStreamingOriginLocator(IAsset asset)
- {
-
- // Get a reference to the streaming manifest file from the
- // collection of files in the asset.
-
- var assetFile = asset.AssetFiles.ToList().Where(f => f.Name.ToLower().
- EndsWith(".ism")).
- FirstOrDefault();
-
- // Create a 30-day read-only access policy.
- IAccessPolicy policy = _context.AccessPolicies.Create("Streaming policy",
- TimeSpan.FromDays(30),
- AccessPermissions.Read);
-
- // Create a locator to the streaming content on an origin.
- ILocator originLocator = _context.Locators.CreateLocator(LocatorType.OnDemandOrigin, asset,
- policy,
- DateTime.UtcNow.AddMinutes(-5));
-
- // Create a URL to the manifest file.
- return originLocator.Path + assetFile.Name;
- }
-
- static private void JobStateChanged(object sender, JobStateChangedEventArgs e)
- {
- Console.WriteLine(string.Format("{0}\n State: {1}\n Time: {2}\n\n",
- ((IJob)sender).Name,
- e.CurrentState,
- DateTime.UtcNow.ToString(@"yyyy_M_d__hh_mm_ss")));
- }
-
- static private byte[] GetRandomBuffer(int length)
- {
- var returnValue = new byte[length];
-
- using (var rng =
- new System.Security.Cryptography.RNGCryptoServiceProvider())
- {
- rng.GetBytes(returnValue);
- }
-
- return returnValue;
- }
- }
-}
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-
-## Provide feedback
-
-## See also
-
-* [Use the CENC with multi-DRM and access control](media-services-cenc-with-multidrm-access-control.md)
-* [Configure Widevine packaging with Media Services](https://mingfeiy.com/how-to-configure-widevine-packaging-with-azure-media-services)
-* [Get started with the Java client SDK for Azure Media Services](./media-services-java-how-to-use.md)
-* To download the latest PHP SDK for Media Services, look for version 0.5.7 of the Microsoft/WindowsAzure package in the [Packagist repository](https://packagist.org/packages/microsoft/windowsazure#v0.5.7).
media-services Media Services Quotas And Limitations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-quotas-and-limitations.md
- Title: Media Services quotas and limitation | Microsoft Docs
-description: This topic describes quotas and limitations associated with Microsoft Azure Media Services.
- Previously updated: 08/24/2021
-
-# Quotas and Limitations
-
-This article describes quotas and limitations associated with Microsoft Azure Media Services.
-
-## Open a Support Ticket to request changes to the default quotas
-To request changes to the default quotas, open a support ticket. In the request, include detailed information on the desired quota changes, use-case scenarios, and required regions.
-
-### How to open a support ticket
-In the Azure portal, go to [Help + support](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest). If you are not logged in to Azure, you will be prompted to enter your credentials.
-
-## Media Services learning paths
-
-## Provide feedback
-
media-services Media Services Recommended Encoders https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-recommended-encoders.md
- Title: Learn about encoders recommended by Azure Media Services | Microsoft Docs
-description: This article lists on premises encoders recommended by Azure Media Services.
-
-keywords: encoding;encoders;media
- Previously updated: 03/10/2021
-
-# Recommended on-premises encoders
-
-When live streaming with Azure Media Services, you can specify how you want your channel to receive the input stream. If you choose to use an on-premises encoder with a live encoding channel, your encoder should push a high-quality single-bitrate stream as output. If you choose to use an on-premises encoder with a pass-through channel, your encoder should push a multi-bitrate stream as output with all desired output qualities. For more information, see [Live streaming with on-premises encoders](media-services-live-streaming-with-onprem-encoders.md).
-
-## Encoder requirements
-
-Encoders must support TLS 1.2 when using HTTPS or RTMPS protocols.
-
-## Live encoders that output RTMP
-
-Azure Media Services recommends using one of the following live encoders that have RTMP as output:
-
-- Adobe Flash Media Live Encoder 3.2
-- Haivision Makito X HEVC
-- Haivision KB
-- Telestream Wirecast (version 13.0.2 or higher due to the TLS 1.2 requirement)
-
-  Encoders must support TLS 1.2 when using RTMPS protocols.
-
-- Teradek Slice 756
-- OBS Studio
-- VMIX
-- xStream
-- Switcher Studio (iOS)
-## Live encoders that output fragmented MP4
-
-Azure Media Services recommends using one of the following live encoders that have multi-bitrate fragmented-MP4 (Smooth Streaming) as output:
-
-- Media Excel Hero Live and Hero 4K (UHD/HEVC)
-- Ateme TITAN Live
-- Cisco Digital Media Encoder 2200
-- Elemental Live (version 2.14.15 and higher due to the TLS 1.2 requirement)
-
-  Encoders must support TLS 1.2 when using HTTPS protocols.
-
-- Envivio 4Caster C4 Gen III
-- Imagine Communications Selenio MCP3
-> [!NOTE]
-> A live encoder can send a single-bitrate stream to a pass through channel, but this configuration is not recommended because it does not allow for adaptive bitrate streaming to the client.
-
-## How to become an on-premises encoder partner
-
-As an Azure Media Services on-premises encoder partner, Media Services promotes your product by recommending your encoder to enterprise customers. To become an on-premises encoder partner, you must verify compatibility of your on-premises encoder with Media Services. To do so, complete the following verifications:
-
-Pass through channel verification
-1. Create or visit your Azure Media Services account
-2. Create and start a **pass-through** channel
-3. Configure your encoder to push a multi-bitrate live stream.
-4. Create a published live event
-5. Run your live encoder for approximately 10 minutes
-6. Stop the live event
-7. Create and start a streaming endpoint, then use a player such as [Azure Media Player](https://aka.ms/azuremediaplayer) to watch the archived asset and ensure that playback has no visible glitches for all quality levels. (Alternatively, watch and validate via the preview URL during the live session before step 6.)
-8. Record the Asset ID, published streaming URL for the live archive, and the settings and version used from your live encoder
-9. Reset the channel state after creating each sample
-10. Repeat steps 3 through 9 for all configurations supported by your encoder (with and without ad signaling/captions/different encoding speeds)
-
-Live encoding channel verification
-1. Create or visit your Azure Media Services account
-2. Create and start a **live encoding** channel
-3. Configure your encoder to push a single-bitrate live stream.
-4. Create a published live event
-5. Run your live encoder for approximately 10 minutes
-6. Stop the live event
-7. Create and start a streaming endpoint, then use a player such as [Azure Media Player](https://aka.ms/azuremediaplayer) to watch the archived asset and ensure that playback has no visible glitches for all quality levels. (Alternatively, watch and validate via the preview URL during the live session before step 6.)
-8. Record the Asset ID, published streaming URL for the live archive, and the settings and version used from your live encoder
-9. Reset the channel state after creating each sample
-10. Repeat steps 3 through 9 for all configurations supported by your encoder (with and without ad signaling/captions/various encoding speeds)
-
-Longevity verification
-1. Create or visit your Azure Media Services account
-2. Create and start a **pass-through** channel
-3. Configure your encoder to push a multi-bitrate live stream.
-4. Create a published live event
-5. Run your live encoder for one week or longer
-6. Use a player such as [Azure Media Player](https://aka.ms/azuremediaplayer) to watch the live streaming from time to time (or archived asset) to ensure that playback has no visible glitches
-7. Stop the live event
-8. Record the Asset ID, published streaming URL for the live archive, and the settings and version used from your live encoder
-
-Lastly, send your recorded settings and live archive parameters to Media Services by emailing amsstreaming@microsoft.com. Upon receipt, Media Services performs verification tests on the samples from your live encoder. You can contact the Media Services team with any questions about this process.
media-services Media Services Redactor Walkthrough https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-redactor-walkthrough.md
- Title: Redact faces with Azure Media Analytics walkthrough | Microsoft Docs
-description: This topic shows step by step instructions on how to run a full redaction workflow using Azure Media Services Explorer (AMSE) and Azure Media Redactor Visualizer (open source tool).
- Previously updated: 03/10/2021
-
-# Redact faces with Azure Media Analytics walkthrough
-
-## Overview
-
-**Azure Media Redactor** is an [Azure Media Analytics](./legacy-components.md) media processor (MP) that offers scalable face redaction in the cloud. Face redaction enables you to modify your video in order to blur faces of selected individuals. You may want to use the face redaction service in public safety and news media scenarios. A few minutes of footage that contains multiple faces can take hours to redact manually, but with this service the face redaction process will require just a few simple steps. For more information, see [this](https://azure.microsoft.com/blog/azure-media-redactor/) blog.
-
-For details about **Azure Media Redactor**, see the [Face redaction overview](media-services-face-redaction.md) topic.
-
-This topic shows step-by-step instructions on how to run a full redaction workflow using Azure Media Services Explorer (AMSE) and Azure Media Redactor Visualizer (an open source tool).
-
-For more information, see [this](https://azure.microsoft.com/blog/redaction-preview-available-globally) blog.
-
-## Azure Media Services Explorer workflow
-
-The easiest way to get started with Redactor is to use the open source AMSE tool on GitHub. You can run a simplified workflow via **combined** mode if you don't need access to the annotation JSON or the face jpg images.
-
-### Download and setup
-
-1. Download the AMSE for AMS v2 tool from [here](https://aka.ms/amseforv2).
-1. Log in to your Media Services account using your service key.
-
-   To obtain the account name and key information, go to the [Azure portal](https://portal.azure.com/) and select your AMS account. Then select Settings > Keys. The Manage keys window shows the account name and the primary and secondary keys. Copy the values of the account name and the primary key.
-
-![Screenshot shows Microsoft Azure Media Services where you can enter your account name and key.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough001.png)
-
-### First pass - analyze mode
-
-1. Upload your media file through Asset -> Upload, or via drag and drop.
-1. Right-click and process your media file using Media Analytics -> Azure Media Redactor -> Analyze mode.
-![Screenshot shows a menu with Process Asset(s) with Azure Media Redactor.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough002.png)
-
-![Screenshot shows Azure Media Redactor with First Pass: Analyze mode selected.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough003.png)
-
-The output will include an annotations json file with face location data, as well as a jpg of each detected face.
-
-![Screenshot shows the output of the analysis.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough004.png)
-
-### Second pass - redact mode
-
-1. Upload your original video file to the output asset from the first pass and set it as the primary asset.
-
- ![Screenshot shows the Upload and Set as Primary buttons.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough005.png)
-
-2. (Optional) Upload a 'Dance_idlist.txt' file which includes a newline delimited list of the IDs you wish to redact.
-
- ![Screenshot shows the option to upload the text file.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough006.png)
-
-3. (Optional) Make any edits to the annotations.json file, such as enlarging the bounding boxes.
-4. Right-click the output asset from the first pass, select the Redactor, and run it in **Redact** mode.
-
- ![Screenshot shows Azure Media Redactor with Second Pass: Redact mode selected.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough007.png)
-
-5. Download or share the final redacted output asset.
-
- ![Screenshot shows the Download button.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough008.png)
-
-## Azure Media Redactor Visualizer open source tool
-
-An open source [visualizer tool](https://github.com/Microsoft/azure-media-redactor-visualizer) helps developers who are just getting started with the annotations format to parse and use the output.
-
-After you clone the repo, in order to run the project, you will need to download FFMPEG from their [official site](https://ffmpeg.org/download.html).
-
-If you are a developer trying to parse the JSON annotation data, look inside Models.MetaData for sample code examples.
-
-### Set up the tool
-
-1. Download and build the entire solution.
-
- ![Screenshot shows Build Solution selected from the menu.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough009.png)
-
-2. Download FFMPEG from [here](https://ffmpeg.org/download.html). This project was originally developed with version be1d324 (2016-10-04) with static linking.
-3. Copy ffmpeg.exe and ffprobe.exe to the same output folder as AzureMediaRedactor.exe.
-
- ![Screenshot shows the contents of the folder, including ffmpeg and ffprobe.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough010.png)
-
-4. Run AzureMediaRedactor.exe.
-
-### Use the tool
-
-1. Process your video in your Azure Media Services account with the Redactor MP on Analyze mode.
-2. Download both the original video file and the output of the Redaction - Analyze job.
-3. Run the visualizer application and choose the files above.
-
- ![Screenshot shows Azure Media Redactor uploading files.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough011.png)
-
-4. Preview your file. Select which faces you'd like to blur via the sidebar on the right.
-
- ![Screenshot shows Azure Media Redactor where you can preview and select faces to blur.](./media/media-services-redactor-walkthrough/media-services-redactor-walkthrough012.png)
-
-5. The bottom text field will update with the face IDs. Create a file called "idlist.txt" with these IDs as a newline-delimited list (see the example after these steps).
-
- >[!NOTE]
-   > The idlist.txt file should be saved in ANSI encoding. You can use Notepad to save it in ANSI.
-
-6. Upload this file to the output asset from step 1. Upload the original video to this asset as well, and set it as the primary asset.
-7. Run a Redaction job on this asset in "Redact" mode to get the final redacted video.
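-
-For example, an idlist.txt that redacts only the faces with IDs 1 and 3 (the IDs are illustrative) would contain:
-
-```
-1
-3
-```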
-
-
-## Related links
-[Azure Media Services Analytics Overview](./legacy-components.md)
-
-[Azure Media Analytics demos](http://amslabs.azurewebsites.net/demos/Analytics.html)
-
-[Announcing Face Redaction for Azure Media Analytics](https://azure.microsoft.com/blog/azure-media-redactor/)
media-services Media Services Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-release-notes.md
- Title: Azure Media Services release notes | Microsoft Docs
-description: This article talks about the Microsoft Azure Media Services v2 release notes.
-Previously updated: 03/10/2021
-# Azure Media Services release notes
-These release notes for Azure Media Services summarize changes from previous releases and known issues.
-We want to hear from our customers so that we can focus on fixing problems that affect you. To report a problem or ask questions, submit a post in the [Microsoft Q&A question page for Azure Media Services].
-
-## <a name="issues"></a>Known issues
-### <a name="general_issues"></a>Media Services general issues
-
-| Issue | Description |
-| | |
-| Several common HTTP headers aren't provided in the REST API. |If you develop Media Services applications by using the REST API, you find that some common HTTP header fields (including CLIENT-REQUEST-ID, REQUEST-ID, and RETURN-CLIENT-REQUEST-ID) aren't supported. The headers will be added in a future update. |
-| Percent-encoding isn't allowed. |Media Services uses the value of the IAssetFile.Name property when building URLs for the streaming content (for example, `http://{AMSAccount}.origin.mediaservices.windows.net/{GUID}/{IAssetFile.Name}/streamingParameters`). For this reason, percent-encoding isn't allowed. The value of the Name property can't have any of the following [percent-encoding-reserved characters](https://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters): !*'();:@&=+$,/?%#[]". Also, there can be only one "." for the file name extension. |
-| The ListBlobs method that is part of the Azure Storage SDK version 3.x fails. |Media Services generates SAS URLs based on the [2012-02-12](/rest/api/storageservices/version-2012-02-12) version. If you want to use the Storage SDK to list blobs in a blob container, use the [CloudBlobContainer.ListBlobs](/dotnet/api/microsoft.azure.storage.blob.cloudblobcontainer.listblobs) method that is part of the Storage SDK version 2.x. |
-| The Media Services throttling mechanism restricts the resource usage for applications that make excessive requests to the service. The service might return the "Service Unavailable" 503 HTTP status code. |For more information, see the description of the 503 HTTP status code in [Media Services error codes](media-services-encoding-error-codes.md). |
-| When you query entities, a limit of 1,000 entities is returned at one time because the public REST version 2 limits query results to 1,000 results. |Use Skip and Take (.NET)/top (REST) as described in [this .NET example](media-services-dotnet-manage-entities.md#enumerating-through-large-collections-of-entities) and [this REST API example](media-services-rest-manage-entities.md#enumerating-through-large-collections-of-entities). A paging sketch follows this table. |
-| Some clients can come across a repeat tag issue in the Smooth Streaming manifest. |For more information, see [this section](media-services-deliver-content-overview.md#known-issues). |
-| Media Services .NET SDK objects can't be serialized and as a result don't work with Azure Cache for Redis. |If you try to serialize the SDK AssetCollection object to add it to Azure Cache for Redis, an exception is thrown. |
-|The REST API responds with the error message "The filter cannot be accessed by this version of REST Api" when you attempt to get an asset-level or account-level filter.|The filter was created or modified with a newer API version than the one being used to get the filter. This can happen if two API versions are used by the code or tools that the customer uses. The best solution is to upgrade the code or tools to use the newer of the two API versions.|
-
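-As an illustration of the Skip/Take workaround called out in the table above, the following minimal sketch pages through all jobs with the v2 .NET SDK. It assumes an authenticated CloudMediaContext named `context` and the usual `System.Linq`, `System.Collections.Generic`, and `Microsoft.WindowsAzure.MediaServices.Client` usings; it is not taken from the linked samples.
-
-```csharp
-// Page through all jobs 1,000 at a time, since the public REST v2 layer
-// caps query results at 1,000 entities per request.
-const int pageSize = 1000;
-int skip = 0;
-var allJobs = new List<IJob>();
-while (true)
-{
-    var page = context.Jobs.Skip(skip).Take(pageSize).ToList();
-    allJobs.AddRange(page);
-    if (page.Count < pageSize)
-        break;              // last (possibly partial) page reached
-    skip += pageSize;
-}
-```
-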
-## <a name="rest_version_history"></a>REST API version history
-For information about the Media Services REST API version history, see the [Azure Media Services REST API reference].
-
-## February 2021
-
-### Azure Media Services v2 API and SDKs deprecation announcement
-
-#### Update your Azure Media Services REST API and SDKs to v3 by 29 February 2024
-
-Because version 3 of the Azure Media Services REST API and client SDKs for .NET and Java offers more capabilities than version 2, we're retiring version 2 of the REST API and the client SDKs for .NET and Java.
-We encourage you to switch sooner to gain the richer benefits of version 3.
-Version 3 provides:
-
-- 24x7 live event support
-- ARM REST APIs, client SDKs for .NET Core, Node.js, Python, Java, Go, and Ruby
-- Customer managed keys, trusted storage integration, private link support, and [more](../latest/migrate-v-2-v-3-migration-benefits.md)
-
-#### Action required
-
-To minimize disruption to your workloads, review the [migration guide](../latest/migrate-v-2-v-3-migration-introduction.md) to transition your code from the version 2 API and SDKs to version 3 API and SDK before 29 February 2024.
-**After 29 February 2024**, Azure Media Services will no longer accept traffic on the version 2 REST API, the ARM account management API version 2015-10-01, or from the version 2 .NET client SDKs. This includes any third-party open-source client SDKs that may call the version 2 API.
-
-See the official [Azure Updates announcement](https://azure.microsoft.com/updates/update-your-azure-media-services-rest-api-and-sdks-to-v3-by-29-february-2024/).
-
-## September 2020
-
-The following v2 properties will no longer be populated with historical job progress data:
-
-* [HistoricalEvents](/dotnet/api/microsoft.windowsazure.mediaservices.client.itask.historicalevents)
-* [PerfMessage](/dotnet/api/microsoft.windowsazure.mediaservices.client.itask.perfmessage)
-
-To get task history, you should use the v2 job notifications via webhooks or queue messages using Notification Endpoints. For more information, see:
-
-* [Use Azure Queue storage to monitor Media Services job notifications](media-services-dotnet-check-job-progress-with-queues.md)
-* [Use Azure Webhooks to monitor Media Services job notifications](media-services-dotnet-check-job-progress-with-webhooks.md)
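-
-For illustration, a minimal sketch of the queue-based approach with the v2 .NET SDK follows; the endpoint and queue names are placeholders, and `context` is assumed to be an authenticated CloudMediaContext.
-
-```csharp
-// Create a notification endpoint that points at an existing Azure Storage
-// queue, then subscribe the job to final-state notifications before submitting.
-INotificationEndPoint endpoint = context.NotificationEndPoints.Create(
-    "JobNotificationEndpoint",           // endpoint name (illustrative)
-    NotificationEndPointType.AzureQueue,
-    "job-notification-queue");           // existing queue name (illustrative)
-
-IJob job = context.Jobs.Create("Encode with notifications");
-// ... add encoding tasks to the job here ...
-job.JobNotificationSubscriptions.AddNew(NotificationJobState.FinalStatesOnly, endpoint);
-job.Submit();
-```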
-
-## February 2020
-
-Some analytics media processors will be retired. For the retirement dates, see the [legacy components](legacy-components.md) topic.
-
-## September 2019
-
-### Deprecation of media processors
-
-We are announcing deprecation of *Azure Media Indexer* and *Azure Media Indexer 2 Preview*. Azure Media Services Video Indexer replaces these legacy media processors.
-
-For the retirement dates, see this [legacy components](legacy-components.md) topic.
-
-Also see [Migrate from Azure Media Indexer and Azure Media Indexer 2 to Azure Media Services Video Indexer](migrate-indexer-v1-v2.md).
-
-## August 2019
-
-### Deprecation of media processors
-
-We are announcing deprecation of the *Windows Azure Media Encoder* (WAME) and *Azure Media Encoder* (AME) media processors. For the retirement dates, see this [legacy components](legacy-components.md) topic.
-
-For details, see [Migrate WAME to Media Encoder Standard](./migrate-windows-azure-media-encoder.md) and [Migrate AME to Media Encoder Standard](./migrate-azure-media-encoder.md).
-
-## March 2019
-
-The Media Hyperlapse Preview feature of Azure Media Services was deprecated.
-
-## December 2018
-
-The Media Hyperlapse Preview feature of Azure Media Services will soon be retired. Starting December 19, 2018, Media Services will no longer make changes or improvements to Media Hyperlapse. On March 29, 2019, it will be retired and no longer available.
-
-## October 2018
-
-### CMAF support
-
-Media Services now supports CMAF and 'cbcs' encryption for Apple HLS (iOS 11+) and for MPEG-DASH players that support CMAF.
-
-### Web VTT thumbnail sprites
-
-You can now use Media Services to generate Web VTT thumbnail sprites using our v2 APIs. For more information, see [Generate a thumbnail sprite](generate-thumbnail-sprite.md).
-
-## July 2018
-
-With the latest service release, there are minor formatting changes to the error messages returned by the service when a job fails; in particular, a message may now be broken across two or more lines.
-
-## May 2018
-
-Starting May 12, 2018, live channels will no longer support the RTP/MPEG-2 transport stream ingest protocol. Please migrate from RTP/MPEG-2 to RTMP or fragmented MP4 (Smooth Streaming) ingest protocols.
-
-## October 2017 release
-> [!IMPORTANT]
-> Media Services is deprecating support for Azure Access Control Service authentication keys. On June 22, 2018, you can no longer authenticate with the Media Services back end via code by using Access Control Service keys. You must update your code to use Azure Active Directory (Azure AD) per [Azure AD-based authentication](media-services-use-aad-auth-to-access-ams-api.md). Watch for warnings about this change in the Azure portal.
-
-### Updates for October 2017
-#### SDKs
-* The .NET SDK was updated to support Azure AD authentication. Support for Access Control Service authentication was removed from the latest .NET SDK on Nuget.org to encourage faster migration to Azure AD.
-* The Java SDK was updated to support Azure AD authentication. For information on how to use the Java SDK with Media Services, see [Get started with the Java client SDK for Azure Media Services](media-services-java-how-to-use.md)
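-
-For example, connecting with the updated .NET SDK looks roughly like the following sketch; the tenant domain and REST API endpoint are placeholders (you can find your account's endpoint in the Azure portal).
-
-```csharp
-// Interactive Azure AD user authentication against the v2 REST API.
-var tokenCredentials = new AzureAdTokenCredentials(
-    "{your-tenant}.onmicrosoft.com", AzureEnvironments.AzureCloudEnvironment);
-var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-var context = new CloudMediaContext(
-    new Uri("https://{account}.restv2.{region}.media.azure.net/api/"), tokenProvider);
-```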
-
-#### File-based encoding
-* You now can use the Premium Encoder to encode your content to the H.265 high-efficiency video coding (HEVC) video codec. There is no pricing impact if you choose H.265 over other codecs, such as H.264. For information about HEVC patent licenses, see [Online Services Terms](https://azure.microsoft.com/support/legal/).
-* For source video that is encoded with the H.265 (HEVC) video codec, such as video captured by using iOS11 or GoPro Hero 6, you now can use either the Premium Encoder or the Standard Encoder to encode those videos. For information about patent licenses, see [Online Services Terms](https://azure.microsoft.com/support/legal/).
-* For content that contains multiple language audio tracks, the language values must be correctly labeled according to the corresponding file format specification (for example, ISO MP4). Then you can use the Standard Encoder to encode the content for streaming. The resultant streaming locator lists the available audio languages.
-* The Standard Encoder now supports two new audio-only system presets, "AAC Audio" and "AAC Good Quality Audio." Both produce stereo advanced audio coding (AAC) output, at bit rates of 128 Kbps and 192 Kbps, respectively.
-* The Premium Encoder now supports QuickTime/MOV file formats as input. The video codec must be one of the [Apple ProRes types listed in this article](./media-services-media-encoder-standard-formats.md). The audio must be either AAC or pulse code modulation (PCM). The Premium Encoder doesn't support, for example, DVC/DVCPro video wrapped in QuickTime/MOV files as input. The Standard Encoder does support these video codecs.
-* The following bug fixes were made in encoders:
-
- * You can now submit jobs by using an input asset. After these jobs finish, you can modify the asset (for example, add, delete, or rename files within the asset), and submit additional jobs.
- * The quality of JPEG thumbnails produced by the Standard Encoder is improved.
- * The Standard Encoder handles input metadata and thumbnail generation better in very short duration videos.
- * Improvements to the H.264 decoder used in the Standard Encoder eliminate certain rare artifacts.
-
-#### Media Analytics
-General availability of the Azure Media Redactor: This media processor performs anonymization by blurring the faces of selected individuals and is ideal for use in public safety and news media scenarios.
-
-For an overview on this new processor, see [this blog post](https://azure.microsoft.com/blog/azure-media-redactor/). For information on documentation and settings, see [Redact faces with Azure Media Analytics](media-services-face-redaction.md).
-## June 2017 release
-
-Media Services now supports [Azure AD-based authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-> [!IMPORTANT]
-> Currently, Media Services supports the Access Control Service authentication model. Access Control Service authorization will be deprecated on June 1, 2018. We recommend that you migrate to the Azure AD authentication model as soon as possible.
-
-## March 2017 release
-
-You can now use the Standard Encoder to [auto-generate a bitrate ladder](media-services-autogen-bitrate-ladder-with-mes.md) by specifying the "Adaptive Streaming" preset string when you create an encoding task. To encode a video for streaming with Media Services, use the "Adaptive Streaming" preset. To customize an encoding preset for your specific scenario, you can begin with [these presets](media-services-mes-presets-overview.md).
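-
-For example, submitting a job that uses the preset with the v2 .NET SDK might look like the following sketch, where `context` is an authenticated CloudMediaContext and `inputAsset` is an existing IAsset (both assumed, along with the usual `System` and `System.Linq` usings).
-
-```csharp
-// Pick the latest Media Encoder Standard processor, then submit a task
-// that uses the built-in "Adaptive Streaming" preset string.
-IMediaProcessor mes = context.MediaProcessors
-    .Where(p => p.Name == "Media Encoder Standard")
-    .ToList()
-    .OrderBy(p => new Version(p.Version))
-    .Last();
-
-IJob job = context.Jobs.Create("Adaptive Streaming encode");
-ITask task = job.Tasks.AddNew("Encode", mes, "Adaptive Streaming", TaskOptions.None);
-task.InputAssets.Add(inputAsset);
-task.OutputAssets.AddNew("Output", AssetCreationOptions.None);
-job.Submit();
-```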
-
-You can now use Media Encoder Standard or Media Encoder Premium Workflow to [create an encoding task that generates fMP4 chunks](media-services-generate-fmp4-chunks.md).
-
-## February 2017 release
-
-Starting April 1, 2017, any job record in your account older than 90 days is automatically deleted, along with its associated task records. Deletion occurs even if the total number of records is below the maximum quota. To archive the job/task information, you can use the code described in [Manage assets and related entities with the Media Services .NET SDK](media-services-dotnet-manage-entities.md).
-
-## January 2017 release
-
-In Media Services, a streaming endpoint represents a streaming service that can deliver content directly to a client player application or to a content delivery network (CDN) for further distribution. Media Services also provides seamless Azure Content Delivery Network integration. The outbound stream from a StreamingEndpoint service can be a live stream, a video on demand, or a progressive download of your asset in your Media Services account. Each Media Services account includes a default streaming endpoint. Additional streaming endpoints can be created under the account.
-
-There are two versions of streaming endpoints, 1.0 and 2.0. Starting January 10, 2017, any newly created Media Services accounts include the version 2.0 default streaming endpoint. Additional streaming endpoints that you add to this account are also version 2.0. This change doesn't affect existing accounts. Existing streaming endpoints are version 1.0 and can be upgraded to version 2.0. There are behavior, billing, and feature changes with this change. For more information, see [Streaming endpoints overview](media-services-streaming-endpoints-overview.md).
-
-Starting with the 2.15 version, Media Services added the following properties to the streaming endpoint entity:
-
-* CdnProvider
-* CdnProfile
-* FreeTrialEndTime
-* StreamingEndpointVersion
-
-For more information on these properties, see [StreamingEndpoint](/rest/api/media/operations/streamingendpoint).
-
-## December 2016 release
-
- You now can use Media Services to access telemetry/metrics data for its services. You can use the current version of Media Services to collect telemetry data for live channel, streaming endpoint, and archive entities. For more information, see [Media Services telemetry](media-services-telemetry-overview.md).
-
-## <a name="july_changes16"></a>July 2016 release
-### Updates to the manifest file (*.ISM) generated by encoding tasks
-When an encoding task is submitted to Media Encoder Standard or Media Encoder Premium, the encoding task generates a [streaming manifest file](media-services-deliver-content-overview.md) (*.ism) in the output asset. With the latest service release, the syntax of this streaming manifest file was updated.
-
-> [!NOTE]
-> The syntax of the streaming manifest (.ism) file is reserved for internal use. It's subject to change in future releases. Do not modify or manipulate the contents of this file.
->
->
-
-### A new client manifest (*.ISMC) file is generated in the output asset when an encoding task outputs one or more MP4 files
-Starting with the latest service release, after the completion of an encoding task that generates one or more MP4 files, the output asset also contains a streaming client manifest (*.ismc) file. The .ismc file helps improve the performance of dynamic streaming.
-
-> [!NOTE]
-> The syntax of the client manifest (.ismc) file is reserved for internal use. It's subject to change in future releases. Do not modify or manipulate the contents of this file.
->
->
-
-For more information, see [this blog](/archive/blogs/randomnumber/encoder-changes-within-azure-media-services-now-create-ismc-file).
-
-### Known issues
-Some clients can come across a repeat tag issue in the Smooth Streaming manifest. For more information, see [this section](media-services-deliver-content-overview.md#known-issues).
-
-## <a id="apr_changes16"></a>April 2016 release
-### Media Analytics
- Media Services introduced Media Analytics for powerful video intelligence. For more information, see [Media Services Analytics overview](./legacy-components.md).
-
-### Apple FairPlay (preview)
-You now can use Media Services to dynamically encrypt your HTTP Live Streaming (HLS) content with Apple FairPlay. You also can use the Media Services license delivery service to deliver FairPlay licenses to clients. For more information, see "Use Azure Media Services to stream your HLS content protected with Apple FairPlay."
-
-## <a id="feb_changes16"></a>February 2016 release
-The latest version of the Media Services SDK for .NET (3.5.3) contains a Google Widevine-related bug fix. It was impossible to reuse AssetDeliveryPolicy for multiple assets encrypted with Widevine. As part of this bug fix, the following property was added to the SDK: WidevineBaseLicenseAcquisitionUrl.
-
-```csharp
-Dictionary<AssetDeliveryPolicyConfigurationKey, string> assetDeliveryPolicyConfiguration =
-    new Dictionary<AssetDeliveryPolicyConfigurationKey, string>
-    {
-        { AssetDeliveryPolicyConfigurationKey.WidevineBaseLicenseAcquisitionUrl, "http://testurl" },
-    };
-```
-
-## <a id="jan_changes_16"></a>January 2016 release
-Encoding reserved units were renamed to reduce confusion with encoder names.
-
-The Basic, Standard, and Premium encoding reserved units were renamed to S1, S2, and S3 reserved units, respectively. Customers who use Basic encoding reserved units today see S1 as the label in the Azure portal (and in the bill). Customers who use Standard and Premium see the labels S2 and S3, respectively.
-
-## <a id="dec_changes_15"></a>December 2015 release
-
-### Media Encoder deprecation announcement
-
- Media Encoder will be deprecated approximately 12 months after the release of Media Encoder Standard.
-
-### Azure SDK for PHP
-The Azure SDK team published a new release of the [Azure SDK for PHP](https://github.com/Azure/azure-sdk-for-php) package that contains updates and new features for Media Services. In particular, the Media Services SDK for PHP now supports the latest [content protection](media-services-content-protection-overview.md) features. These features are dynamic encryption with AES and DRM (PlayReady and Widevine) with and without token restrictions. It also supports scaling [encoding units](media-services-dotnet-encoding-units.md).
-
-For more information, see:
-
-* The following [code samples](https://github.com/Azure/azure-sdk-for-php/tree/master/examples/MediaServices) help you to get started quickly:
-  * **vodworkflow_aes.php**: This PHP file shows how to use AES-128 dynamic encryption and the key delivery service. It's based on the .NET sample explained in [Use AES-128 dynamic encryption and the key delivery service](media-services-protect-with-aes128.md).
-  * **vodworkflow_drm_playready_widevine.php**: This PHP file shows how to use PlayReady dynamic encryption and the license delivery service. It's based on the .NET sample explained in [Use PlayReady and/or Widevine dynamic common encryption](media-services-protect-with-playready-widevine.md).
- * **scale_encoding_units.php**: This PHP file shows how to scale encoding reserved units.
-
-## <a id="nov_changes_15"></a>November 2015 release
- Media Services now offers the Widevine license delivery service in the cloud. For more information, see [this blog](https://azure.microsoft.com/blog/announcing-google-widevine-license-delivery-services-public-preview-in-azure-media-services/). Also, see [this tutorial](media-services-protect-with-playready-widevine.md) and the [GitHub repository](https://github.com/Azure-Samples/media-services-dotnet-dynamic-encryption-with-drm).
-
-Widevine license delivery services provided by Media Services are in preview.
-
-## <a id="oct_changes_15"></a>October 2015 release
-Media Services is now live in the following data centers: Brazil South, India West, India South, and India Central. You can now use the Azure portal to [create Media Service accounts](media-services-portal-create-account.md) and perform various tasks described in the [Media Services documentation webpage](https://azure.microsoft.com/documentation/services/media-services/). Live Encoding isn't enabled in these data centers. Further, not all types of encoding reserved units are available in these data centers.
-
-* Brazil South: Only Standard and Basic encoding reserved units are available.
-* India West, India South, and India Central: Only Basic encoding reserved units are available.
-
-## <a id="september_changes_15"></a>September 2015 release
-Media Services now offers the ability to protect both video on demand and live streams with Widevine modular DRM technology. You can use the following delivery services partners to help you deliver Widevine licenses:
-* [Axinom](https://www.axinom.com)
-* [EZDRM](https://ezdrm.com/)
-* [castLabs](https://castlabs.com/company/partners/azure/)
-
-For more information, see [this blog](https://azure.microsoft.com/blog/azure-media-services-adds-google-widevine-packaging-for-delivering-multi-drm-stream/).
-
-You can use the [Media Services .NET SDK](https://www.nuget.org/packages/windowsazure.mediaservices/) (starting with version 3.5.1) or the REST API to configure AssetDeliveryConfiguration to use Widevine.
-* Media Services added support for Apple ProRes videos. You can now upload your QuickTime source videos files that use Apple ProRes or other codecs. For more information, see [this blog](https://azure.microsoft.com/blog/announcing-support-for-apple-prores-videos-in-azure-media-services/).
-* You can now use Media Encoder Standard to do subclipping and live archive extraction. For more information, see [this blog](https://azure.microsoft.com/blog/sub-clipping-and-live-archive-extraction-with-media-encoder-standard/).
-* The following filtering updates were made:
-
- * You can now use the Apple HLS format with an audio-only filter. You can use this update to remove an audio-only track by specifying (audio-only=false) in the URL.
- * When you define filters for your assets, you now can combine multiple (up to three) filters in a single URL.
-
- For more information, see [this blog](https://azure.microsoft.com/blog/azure-media-services-release-dynamic-manifest-composition-remove-hls-audio-only-track-and-hls-i-frame-track-support/).
-* Media Services now supports I-frames in HLS version 4. I-frame support optimizes fast-forward and rewind operations. By default, all HLS version 4 outputs include the I-frame playlist (EXT-X-I-FRAME-STREAM-INF).
-For more information, see [this blog](https://azure.microsoft.com/blog/azure-media-services-release-dynamic-manifest-composition-remove-hls-audio-only-track-and-hls-i-frame-track-support/).
-
-## <a id="august_changes_15"></a>August 2015 release
-* The Media Services SDK for Java version 0.8.0 and new samples are now available.
-
-* The Azure Media Player was updated with multi-audio stream support. For more information, see [this blog post](https://azure.microsoft.com/blog/2015/08/13/azure-media-player-update-with-multi-audio-stream-support/).
-
-## <a id="july_changes_15"></a>July 2015 release
-* The general availability of Media Encoder Standard was announced. For more information, see [this blog post](https://azure.microsoft.com/blog/2015/07/16/announcing-the-general-availability-of-media-encoder-standard/).
-
- Media Encoder Standard uses presets, as described in [this section](./media-services-mes-presets-overview.md). When you use a preset for 4K encodes, get the Premium reserved unit type. For more information, see [Scale encoding](media-services-scale-media-processing-overview.md).
-* Live real-time captions can be used with Media Services and the Media Player. For more information, see [this blog post](https://azure.microsoft.com/blog/2015/07/08/live-real-time-captions-with-azure-media-services-and-player/).
-
-### Media Services .NET SDK updates
-The Media Services .NET SDK is now version 3.4.0.0. The following updates were made:
-
-* Support was implemented for live archive. You can't download an asset that contains a live archive.
-* Support was implemented for dynamic filters.
-* Functionality was implemented so that users can keep a storage container while they delete an asset.
-* Bug fixes were made related to retry policies in channels.
-* Media Encoder Premium Workflow was enabled.
-
-## <a id="june_changes_15"></a>June 2015 release
-### Media Services .NET SDK updates
-The Media Services .NET SDK is now version 3.3.0.0. The following updates were made:
-
-* Support was added for the OpenId Connect discovery spec.
-* Support was added for handling keys rollover on the identity provider side.
-
-If you use an identity provider that exposes an OpenID Connect discovery document (as Azure AD, Google, and Salesforce do), you can instruct Media Services to obtain signing keys for validation of JSON Web Tokens (JWTs) from the OpenID Connect discovery spec.
-
-For more information, see [Use JSON web keys from the OpenID Connect discovery spec to work with JWT authentication in Media Services](http://gtrifonov.com/2015/06/07/using-json-web-keys-from-openid-connect-discovery-spec-to-work-with-jwt-token-authentication-in-azure-media-services/).
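-
-As a sketch of what this looks like with the v2 .NET SDK (the discovery URL, issuer, and audience values below are placeholders):
-
-```csharp
-// Validate JWTs against signing keys fetched from an OpenID Connect
-// discovery document instead of embedding static keys in the template.
-TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.JWT)
-{
-    OpenIdConnectDiscoveryDocument = new OpenIdConnectDiscoveryDocument(
-        "https://login.microsoftonline.com/{tenant}/.well-known/openid-configuration"),
-    Issuer = "https://sts.windows.net/{tenant}/",   // placeholder
-    Audience = "urn:myservice"                      // placeholder
-};
-// The serialized template is used in a ContentKeyAuthorizationPolicyRestriction.
-string requirements = TokenRestrictionTemplateSerializer.Serialize(template);
-```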
-
-## <a id="may_changes_15"></a>May 2015 release
-The following new features were announced:
-
-* [A preview of live encoding with Media Services](media-services-manage-live-encoder-enabled-channels.md)
-* [Dynamic manifest](media-services-dynamic-manifest-overview.md)
-
-## <a id="april_changes_15"></a>April 2015 release
-### General Media Services updates
-* [Media Player](https://azure.microsoft.com/blog/2015/04/15/announcing-azure-media-player/) was announced.
-* Starting with the Media Services REST 2.10, channels that are configured to ingest a Real-Time Messaging Protocol (RTMP) are created with primary and secondary ingest URLs. For more information, see [Channel ingest configurations](media-services-live-streaming-with-onprem-encoders.md#channel_input).
-* Azure Media Indexer was updated.
-* Support for Spanish language was added.
-* A new configuration for the XML format was added.
-
-For more information, see [this blog](https://azure.microsoft.com/blog/2015/04/13/azure-media-indexer-spanish-v1-2/).
-
-### Media Services .NET SDK updates
-The Media Services .NET SDK is now version 3.2.0.0. The following updates were made:
-
-* Breaking change: TokenRestrictionTemplate.Issuer and TokenRestrictionTemplate.Audience were changed to be of a string type.
-* Updates were made related to creating custom retry policies.
-* Bug fixes were made related to uploading and downloading files.
-* The MediaServicesCredentials class now accepts primary and secondary access control endpoints to authenticate against.
-
-## <a id="march_changes_15"></a>March 2015 release
-### General Media Services updates
-* Media Services now provides Content Delivery Network integration. To support the integration, the CdnEnabled property was added to StreamingEndpoint. CdnEnabled can be used with REST APIs starting with version 2.9 and with the .NET SDK starting with version 3.1.0.2. For more information, see [StreamingEndpoint](/rest/api/media/operations/streamingendpoint).
-* The Media Encoder Premium Workflow was announced. For more information, see [Introducing Premium encoding in Azure Media Services](https://azure.microsoft.com/blog/2015/03/05/introducing-premium-encoding-in-azure-media-services/).
-
-## <a id="february_changes_15"></a>February 2015 release
-### General Media Services updates
-The Media Services REST API is now version 2.9. Starting with this version, you can enable the Content Delivery Network integration with streaming endpoints. For more information, see [StreamingEndpoint](/rest/api/media/operations/streamingendpoint).
-
-## <a id="january_changes_15"></a>January 2015 release
-### General Media Services updates
-The general availability of content protection with dynamic encryption was announced. For more information, see [Media Services enhances streaming security with general availability of DRM technology](https://azure.microsoft.com/blog/2015/01/29/azure-media-services-enhances-streaming-security-with-general-availability-of-drm-technology/).
-
-### Media Services .NET SDK updates
-The Media Services .NET SDK is now version 3.1.0.1.
-
-This release marked the default Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization.TokenRestrictionTemplate constructor as obsolete. The new constructor takes TokenType as an argument.
-
-```csharp
-TokenRestrictionTemplate template = new TokenRestrictionTemplate(TokenType.SWT);
-```
--
-## <a id="december_changes_14"></a>December 2014 release
-### General Media Services updates
-* Some updates and new features were added to the Media Indexer. For more information, see [Azure Media Indexer version 1.1.6.7 release notes](https://azure.microsoft.com/blog/2014/12/03/azure-media-indexer-version-1-1-6-7-release-notes/).
-* A new REST API was added that you can use to update encoding reserved units. For more information, see [EncodingReservedUnitType with REST](/rest/api/media/operations/encodingreservedunittype).
-* CORS support was added for the key delivery service.
-* Performance improvements were made to querying authorization policy options.
-* In the China data center, the [key delivery URL](/rest/api/media/operations/contentkey#get_delivery_service_url) is now per customer (just like in other data centers).
-* HLS auto target duration was added. When doing live streaming, HLS is always packaged dynamically. By default, Media Services automatically calculates the HLS segment packaging ratio (FragmentsPerSegment) based on the keyframe interval (KeyFrameInterval), also referred to as the group of pictures (GOP), received from the live encoder. For more information, see [Work with Media Services live streaming](/previous-versions/azure/dn783466(v=azure.100)).
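-
-As an illustrative calculation (the numbers are hypothetical): if the live encoder sends a 2-second GOP and the service targets roughly 6-second HLS segments, then
-
-```
-FragmentsPerSegment = target segment duration / KeyFrameInterval
-                    = 6 s / 2 s
-                    = 3 fragments per HLS segment
-```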
-
-### Media Services .NET SDK updates
-The [Media Services .NET SDK](https://www.nuget.org/packages/windowsazure.mediaservices/) is now version 3.1.0.0. The following updates were made:
-
-* The .NET SDK dependency was upgraded to the .NET 4.5 Framework.
-* A new API that you can use to update encoding reserved units was added. For more information, see [Update reserved unit type and increase encoding reserved units by using .NET](media-services-dotnet-encoding-units.md).
-* JWT support for token authentication was added. For more information, see [JWT token authentication in Media Services and dynamic encryption](http://www.gtrifonov.com/2015/01/03/jwt-token-authentication-in-azure-media-services-and-dynamic-encryption/).
-* Relative offsets for BeginDate and ExpirationDate in the PlayReady license template were added.
-
-## <a id="november_changes_14"></a>November 2014 release
-* You now can use Media Services to ingest live Smooth Streaming (fMP4) content over a TLS connection. To ingest over TLS, make sure to update the ingest URL to HTTPS. Currently, Media Services doesn't support TLS with custom domains. For more information about live streaming, see [Work with Azure Media Services Live Streaming](/previous-versions/azure/dn783466(v=azure.100)).
-* Currently, you can't ingest an RTMP live stream over a TLS connection.
-* You can stream over TLS only if the streaming endpoint from which you deliver your content was created after September 10, 2014. If your streaming URLs are based on the streaming endpoints created after September 10, 2014, the URL contains "streaming.mediaservices.windows.net" (the new format). Streaming URLs that contain "origin.mediaservices.windows.net" (the old format) don't support TLS. If your URL is in the old format and you want to stream over TLS, [create a new streaming endpoint](media-services-portal-manage-streaming-endpoints.md). To stream your content over TLS, use URLs based on the new streaming endpoint.
-
-### <a id="oct_sdk"></a>Media Services .NET SDK
-The Media Services SDK for .NET extensions is now version 2.0.0.3.
-
-The Media Services SDK for .NET is now version 3.0.0.8. The following updates were made:
-
-* Refactoring was implemented in retry policy classes.
-* A user agent string was added to HTTP request headers.
-* A NuGet restore build step was added.
-* Scenario tests were fixed to use x509 cert from repository.
-* Validation settings were added for channel and streaming endpoint updates.
-
-### New GitHub repository to host Media Services samples
-Samples are in the [Media Services samples GitHub repository](https://github.com/Azure/Azure-Media-Services-Samples).
-
-## <a id="september_changes_14"></a>September 2014 release
-The Media Services REST metadata is now version 2.7. For more information about the latest REST updates, see the [Media Services REST API reference](/rest/api/media/operations/azure-media-services-rest-api-reference).
-
-The Media Services SDK for .NET is now version 3.0.0.7.
-
-### <a id="sept_14_breaking_changes"></a>Breaking changes
-* Origin was renamed to [StreamingEndpoint].
-* A change was made in the default behavior when you use the Azure portal to encode and then publish MP4 files.
-
-### <a id="sept_14_GA_changes"></a>New features/scenarios that are part of the general availability release
-* The Media Indexer media processor was introduced. For more information, see [Index media files with the Media Indexer](/previous-versions/azure/dn783455(v=azure.100)).
-* You can use the [StreamingEndpoint] entity to add custom domain (host) names.
-
- To use a custom domain name as the Media Services streaming endpoint name, add custom host names to your streaming endpoint. Use the Media Services REST APIs or the .NET SDK to add custom host names.
-
- The following considerations apply:
-
- * You must have the ownership of the custom domain name.
-   * The ownership of the domain name must be validated by Media Services. To validate the domain, create a CNAME record that maps (MediaServicesAccountId).(parent domain) to verifydns.(mediaservices-dns-zone).
- * You must create another CName that maps the custom host name (for example, sports.contoso.com) to your Media Services StreamingEndpoint host name (for example, amstest.streaming.mediaservices.windows.net).
-
- For more information, see the CustomHostNames property in the [StreamingEndpoint](/rest/api/media/operations/streamingendpoint) article.
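-
-   For example, the two CNAME records might look as follows in zone-file notation; all names are illustrative and reuse the placeholders above.
-
-   ```
-   ; Ownership validation record
-   (MediaServicesAccountId).contoso.com.  IN CNAME  verifydns.(mediaservices-dns-zone).
-   ; Traffic-routing record for the custom host name
-   sports.contoso.com.                    IN CNAME  amstest.streaming.mediaservices.windows.net.
-   ```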
-
-### <a id="sept_14_preview_changes"></a>New features/scenarios that are part of the public preview release
-* Live streaming preview. For more information, see [Work with Media Services live streaming](/previous-versions/azure/dn783466(v=azure.100)).
-* Key delivery service. For more information, see [Use AES-128 dynamic encryption and the key delivery service](/previous-versions/azure/dn783457(v=azure.100)).
-* AES dynamic encryption. For more information, see [Use AES-128 dynamic encryption and the key delivery service](/previous-versions/azure/dn783457(v=azure.100)).
-* PlayReady license delivery service.
-* PlayReady dynamic encryption.
-* Media Services PlayReady license template. For more information, see the [Media Services PlayReady license template overview].
-* Stream storage-encrypted assets. For more information, see [Stream storage-encrypted content](/previous-versions/azure/dn783451(v=azure.100)).
-
-## <a id="august_changes_14"></a>August 2014 release
-When you encode an asset, an output asset is produced when the encoding job is finished. Until this release, the Media Services Encoder produced metadata about output assets. Starting with this release, the encoder also produces metadata about input assets. For more information, see [Input metadata] and [Output metadata].
-
-## <a id="july_changes_14"></a>July 2014 release
-The following bug fixes were made for the Azure Media Services Packager and Encryptor:
-
-* When a live archive asset is transmitted to HLS, only audio plays back: This issue was fixed, and now both audio and video can play.
-* When an asset is packaged to HLS and AES 128-bit envelope encryption, the packaged streams don't play back on Android devices: This bug was fixed, and the packaged stream plays back on Android devices that support HLS.
-
-## <a id="may_changes_14"></a>May 2014 release
-### <a id="may_14_changes"></a>General Media Services updates
-You can now use [dynamic packaging] to stream HLS version 3. To stream HLS version 3, add the following format to the origin locator path: *.ism/manifest(format=m3u8-aapl-v3). For more information, see [this forum](https://social.msdn.microsoft.com/Forums/13b8a776-9519-4145-b9ed-d2b632861fde/dynamic-packaging-to-hls-v3).
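-
-For example, a version 3 HLS URL built on an on-demand origin locator might look like this; the host name, locator GUID, and asset name are illustrative.
-
-```
-https://amstest.streaming.mediaservices.windows.net/<locator-GUID>/BigBuckBunny.ism/manifest(format=m3u8-aapl-v3)
-```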
-
-Dynamic packaging now also supports delivering HLS (version 3 and version 4) encrypted with PlayReady based on Smooth Streaming statically encrypted with PlayReady. For information on how to encrypt Smooth Streaming with PlayReady, see [Protect Smooth Streaming with PlayReady](/previous-versions/azure/dn189154(v=azure.100)).
-
-### <a name="may_14_donnet_changes"></a>Media Services .NET SDK updates
-The Media Services .NET SDK is now version 3.0.0.5. The following updates were made:
-
-* Speed and resilience are better when you upload and download media assets.
-* Improvements were made in retry logic and transient exception handling:
-
- * Transient error detection and retry logic were improved for exceptions that are caused when you query, save changes, and upload or download files.
- * When you get web exceptions (for example, during an Access Control Service token request), fatal errors fail faster now.
-
-For more information, see [Retry logic in the Media Services SDK for .NET].
-
-## <a id="jan_feb_changes_14"></a>January/February 2014 releases
-### <a name="jan_fab_14_donnet_changes"></a>Media Services .NET SDK 3.0.0.1, 3.0.0.2 and 3.0.0.3
-The changes in 3.0.0.1 and 3.0.0.2 include:
-
-* Issues related to the usage of LINQ queries with OrderBy statements were fixed.
-* Test solutions in [GitHub] were split into unit-based tests and scenario-based tests.
-
-For more information about the changes, see the [Media Services .NET SDK 3.0.0.1 and 3.0.0.2 releases](http://www.gtrifonov.com/2014/02/07/windows-azure-media-services-.net-sdk-3.0.0.2-release/).
-
-The following changes were made in version 3.0.0.3:
-
-* Azure storage dependencies were upgraded to use version 3.0.3.0.
-* A backward-compatibility issue was fixed for 3.0.*.* releases.
-
-## <a id="december_changes_13"></a>December 2013 release
-### <a name="dec_13_donnet_changes"></a>Media Services .NET SDK 3.0.0.0
-> [!NOTE]
-> The 3.0.x.x releases are not backward compatible with 2.4.x.x releases.
->
->
-
-The latest version of the Media Services SDK is now 3.0.0.0. You can download the latest package from NuGet or get the bits from [GitHub].
-
-Starting with the Media Services SDK version 3.0.0.0, you can reuse the [Azure AD Access Control Service](/previous-versions/azure/azure-services/hh147631(v=azure.100)) tokens. For more information, see the section "Reuse Access Control Service tokens" in [Connect to Media Services with the Media Services SDK for .NET](/previous-versions/azure/jj129571(v=azure.100)).
-
-### <a name="dec_13_donnet_ext_changes"></a>Media Services .NET SDK extensions 2.0.0.0
- The Media Services .NET SDK extensions are a set of extension methods and helper functions that simplify your code and make it easier to develop with Media Services. You can get the latest bits from [Media Services .NET SDK extensions](https://github.com/Azure/azure-sdk-for-media-services-extensions/tree/dev).
-
-## <a id="november_changes_13"></a>November 2013 release
-### <a name="nov_13_donnet_changes"></a>Media Services .NET SDK changes
-Starting with this version, the Media Services SDK for .NET handles transient fault errors that might occur when calls are made to the Media Services REST API layer.
-
-## <a id="august_changes_13"></a>August 2013 release
-### <a name="aug_13_powershell_changes"></a>Media Services PowerShell cmdlets included in Azure SDK tools
-The following Media Services PowerShell cmdlets are now included in [Azure SDK tools](https://github.com/Azure/azure-sdk-tools):
-
-* Get-AzureMediaServicesAccount
-
- For example: `Get-AzureMediaServicesAccount`
-* New-AzureMediaServicesAccount
-
- For example: `New-AzureMediaServicesAccount -Name "MediaAccountName" -Location "Region" -StorageAccountName "StorageAccountName"`
-* New-AzureMediaServicesKey
-
- For example: `New-AzureMediaServicesKey -Name "MediaAccountName" -KeyType Secondary -Force`
-* Remove-AzureMediaServicesAccount
-
- For example: `Remove-AzureMediaServicesAccount -Name "MediaAccountName" -Force`
-
-## <a id="june_changes_13"></a>June 2013 release
-### <a name="june_13_general_changes"></a>Media Services changes
-The following updates are included in the June 2013 Media Services release:
-
-* Ability to link multiple storage accounts to a Media Service account.
- * StorageAccount
- * Asset.StorageAccountName and Asset.StorageAccount
-* Ability to update Job.Priority.
-* Notification-related entities and properties:
- * JobNotificationSubscription
- * NotificationEndPoint
- * Job
-* Asset.Uri
-* Locator.Name
-
-### <a name="june_13_dotnet_changes"></a>Media Services .NET SDK changes
-The following changes are included in the June 2013 Media Services SDK releases. The latest Media Services SDK is available on GitHub.
-
-* Starting with version 2.3.0.0, the Media Services SDK supports linking multiple storage accounts to a Media Services account. The following APIs support this feature:
-
- * IStorageAccount type
- * Microsoft.WindowsAzure.MediaServices.Client.CloudMediaContext.StorageAccounts property
- * StorageAccount property
- * StorageAccountName property
-
- For more information, see [Manage Media Services assets across multiple storage accounts](/previous-versions/azure/dn271889(v=azure.100)).
-* Notification-related APIs. Starting with version 2.2.0.0, you can listen to Azure Queue storage notifications. For more information, see [Handle Media Services job notifications](/previous-versions/azure/dn261241(v=azure.100)).
-
- * Microsoft.WindowsAzure.MediaServices.Client.IJob.JobNotificationSubscriptions property
- * Microsoft.WindowsAzure.MediaServices.Client.INotificationEndPoint type
- * Microsoft.WindowsAzure.MediaServices.Client.IJobNotificationSubscription type
- * Microsoft.WindowsAzure.MediaServices.Client.NotificationEndPointCollection type
- * Microsoft.WindowsAzure.MediaServices.Client.NotificationEndPointType type
-* Dependency on the Storage client SDK 2.0 (Microsoft.WindowsAzure.StorageClient.dll)
-* Dependency on OData 5.5 (Microsoft.Data.OData.dll)
-
-## <a id="december_changes_12"></a>December 2012 release
-### <a name="dec_12_dotnet_changes"></a>Media Services .NET SDK changes
-* IntelliSense: Missing IntelliSense documentation was added for many types.
-* Microsoft.Practices.TransientFaultHandling.Core: An issue was fixed where the SDK still had a dependency to an old version of this assembly. The SDK now references version 5.1.1209.1 of this assembly.
-
-Fixes for issues found in the November 2012 SDK:
-
-* IAsset.Locators.Count: This count is now correctly reported on new IAsset interfaces after all locators are deleted.
-* IAssetFile.ContentFileSize: This value is now properly set after an upload by IAssetFile.Upload(filepath).
-* IAssetFile.ContentFileSize: This property can now be set when you create an asset file. It was previously read only.
-* IAssetFile.Upload(filepath): An issue was fixed where this synchronous upload method was throwing the following error when multiple files were uploaded to the asset. The error was "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature."
-* IAssetFile.UploadAsync: An issue was fixed that limited the simultaneous upload of files to five files.
-* IAssetFile.UploadProgressChanged: This event is now provided by the SDK.
-* IAssetFile.DownloadAsync(string, BlobTransferClient, ILocator, CancellationToken): This method overload is now provided.
-* IAssetFile.DownloadAsync: An issue was fixed that limited the simultaneous download of files to five files.
-* IAssetFile.Delete(): An issue was fixed where calling delete might throw an exception if no file was uploaded for the IAssetFile.
-* Jobs: An issue was fixed where chaining an "MP4 to Smooth Streams task" with a "PlayReady Protection Task" by using a job template didn't create any tasks at all.
-* EncryptionUtils.GetCertificateFromStore(): This method no longer throws a null reference exception due to a failure in finding the certificate based on certificate configuration issues.
-
-## <a id="november_changes_12"></a>November 2012 release
-The changes mentioned in this section were updates included in the November 2012 (version 2.0.0.0) SDK. These changes might require any code written for the June 2012 preview SDK release to be modified or rewritten.
-
-* Assets
-
- * IAsset.Create(assetName) is the *only* asset creation function. IAsset.Create no longer uploads files as part of the method call. Use IAssetFile for uploading.
- * The IAsset.Publish method and the AssetState.Publish enumeration value were removed from the Services SDK. Any code that relies on this value must be rewritten.
-* FileInfo
-
- * This class was removed and replaced by IAssetFile.
-
-* IAssetFiles
-
-   * IAssetFile replaces FileInfo and behaves differently. To use it, instantiate an IAssetFile object, then upload the file by using either the Media Services SDK or the Storage SDK (see the sketch after this list). The following IAssetFile.Upload overloads can be used:
-
- * IAssetFile.Upload(filePath): This synchronous method blocks the thread, and we recommend it only when you upload a single file.
- * IAssetFile.UploadAsync(filePath, blobTransferClient, locator, cancellationToken): This asynchronous method is the preferred upload mechanism.
-
- Known bug: If you use the cancellation token, the upload is canceled. The tasks can have many cancellation states. You must properly catch and handle exceptions.
-* Locators
-
- * The origin-specific versions were removed. The SAS-specific context.Locators.CreateSasLocator (asset, accessPolicy) will be marked deprecated or removed by general availability. See the "Locators" section under "New functionality" for updated behavior.
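-
-A minimal sketch of the post-2.0 upload pattern (referenced from the IAssetFiles item above); `context` is an authenticated CloudMediaContext, and the names and paths are illustrative.
-
-```csharp
-// Create the asset and its IAssetFile first, then upload the bits.
-IAsset asset = context.Assets.Create("SampleAsset", AssetCreationOptions.None);
-IAssetFile assetFile = asset.AssetFiles.Create("video.mp4");
-assetFile.Upload(@"C:\media\video.mp4");   // synchronous; fine for a single file
-```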
-
-## <a id="june_changes_12"></a>June 2012 preview release
-The following functionality was new in the November release of the SDK:
-
-* Deleting entities
-
- * IAsset, IAssetFile, ILocator, IAccessPolicy, and IContentKey objects are now deleted at the object level, that is, IObject.Delete(), instead of requiring a delete in the Collection, that is, cloudMediaContext.ObjCollection.Delete(objInstance).
-* Locators
-
-   * Locators now must be created by using the CreateLocator method. They must use the LocatorType.SAS or LocatorType.OnDemandOrigin enum values as an argument for the specific type of locator you want to create (see the sketch after this list).
- * New properties were added to locators to make it easier to obtain usable URIs for your content. This redesign of locators provides more flexibility for future third-party extensibility and increases the ease of use for media client applications.
-* Asynchronous method support
-
- * Asynchronous support was added to all methods.
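-
-A sketch of the CreateLocator pattern described in the Locators item above; the access policy name and duration are illustrative, and `asset` is an existing IAsset.
-
-```csharp
-// A locator needs an access policy; here, 30 days of read access.
-IAccessPolicy policy = context.AccessPolicies.Create(
-    "Streaming policy", TimeSpan.FromDays(30), AccessPermissions.Read);
-ILocator locator = context.Locators.CreateLocator(
-    LocatorType.OnDemandOrigin, asset, policy);
-Console.WriteLine(locator.Path);   // base URI for building streaming URLs
-```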
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-
-[Microsoft Q&A question page for Azure Media Services]: /answers/topics/azure-media-services.html
-[Azure Media Services REST API reference]: /rest/api/media/operations/azure-media-services-rest-api-reference
-[Media Services pricing details]: https://azure.microsoft.com/pricing/details/media-services/
-[Input metadata]: ./media-services-input-metadata-schema.md
-[Output metadata]: ./media-services-output-metadata-schema.md
-[Deliver content]: /previous-versions/azure/hh973618(v=azure.100)
-[Index media files with the Azure Media Indexer]: /previous-versions/azure/dn783455(v=azure.100)
-[StreamingEndpoint]: /rest/api/media/operations/streamingendpoint
-[Work with Media Services live streaming]: /previous-versions/azure/dn783466(v=azure.100)
-[Use AES-128 dynamic encryption and the key delivery service]: /previous-versions/azure/dn783457(v=azure.100)
-[Use PlayReady dynamic encryption and the license delivery service]: /previous-versions/azure/dn783467(v=azure.100)
-[Preview features]: https://azure.microsoft.com/services/preview/
-[Media Services PlayReady license template overview]: /previous-versions/azure/dn783459(v=azure.100)
-[Stream storage-encrypted content]: /previous-versions/azure/dn783451(v=azure.100)
-[Azure portal]: https://portal.azure.com
-[Dynamic packaging]: /previous-versions/azure/jj889436(v=azure.100)
-[Nick Drouin's blog]: http://blog-ndrouin.azurewebsites.net/hls-v3-new-old-thing/
-[Protect Smooth Streaming with PlayReady]: /previous-versions/azure/dn189154(v=azure.100)
-[Retry logic in the Media Services SDK for .NET]: ./media-services-retry-logic-in-dotnet-sdk.md
-[Grass Valley announces EDIUS 7 streaming through the cloud]: https://www.streamingmedia.com/Producer/Articles/ReadArticle.aspx?ArticleID=96351&utm_source=dlvr.it&utm_medium=twitter
-[Control Media Services Encoder output file names]: /previous-versions/azure/dn303341(v=azure.100)
-[Create overlays]: /previous-versions/azure/dn640496(v=azure.100)
-[Stitch video segments]: /previous-versions/azure/dn640504(v=azure.100)
-[Azure Media Services .NET SDK 3.0.0.1 and 3.0.0.2 releases]: http://www.gtrifonov.com/2014/02/07/windows-azure-media-services-.net-sdk-3.0.0.2-release/
-[Azure AD Access Control Service]: /previous-versions/azure/azure-services/hh147631(v=azure.100)
-[Connect to Media Services with the Media Services SDK for .NET]: /previous-versions/azure/jj129571(v=azure.100)
-[Media Services .NET SDK extensions]: https://github.com/Azure/azure-sdk-for-media-services-extensions/tree/dev
-[Azure SDK tools]: https://github.com/Azure/azure-sdk-tools
-[GitHub]: https://github.com/Azure/azure-sdk-for-media-services
-[Manage Media Services assets across multiple Storage accounts]: /previous-versions/azure/dn271889(v=azure.100)
-[Handle Media Services job notifications]: /previous-versions/azure/dn261241(v=azure.100)
media-services Media Services Rest Check Job Progress https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-check-job-progress.md
- Title: How to check job progress using REST API | Microsoft Docs
-description: This article demonstrates how to check job progress using Azure Media Services v2 REST API.
-Previously updated: 03/10/2021
-# How to: check job progress
-
-When you run jobs, you often need a way to track their progress. You can find out a job's status by using its State property. For more information on the State property, see [Job Entity Properties](/rest/api/media/operations/job#job_entity_properties).
-
-## Connect to Media Services
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-## Check job progress
-
-Request:
-
-```console
-GET https://media.windows.net/api/Jobs()?$filter=Id%20eq%20'nb%3Ajid%3AUUID%3Af3c43f94-327f-2347-90bb-3bf79f8559f1'&$top=1 HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-```
-
-Response:
-
-```output
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 450
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: d9b83c57-e26c-4d10-a20b-2be634c4b6a8
-request-id: 91d2be35-20ed-4e1c-a147-e82cd000c193
-x-ms-request-id: 91d2be35-20ed-4e1c-a147-e82cd000c193
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-
-{"odata.metadata":"https://media.windows.net/api/$metadata#Jobs","value":[{"Id":"nb:jid:UUID:f3c43f94-327f-2347-90bb-3bf79f8559f1","Name":"Encoding BigBuckBunny into to H264 Adaptive Bitrate MP4 Set 720p","Created":"2015-02-11T01:46:08.897","LastModified":"2015-02-11T01:46:08.897","EndTime":null,"Priority":0,"RunningDuration":0.0,"StartTime":"2015-02-11T01:46:16.58","State":2,"TemplateId":null,"JobNotificationSubscriptions":[]}]}
-```
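-
-The same check can be scripted. The following C# sketch issues the GET shown above in a loop until the job reaches a terminal state; the access token, job ID, and 10-second polling interval are placeholders, and a production client would parse the State value from the JSON rather than string-matching it.
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Net.Http.Headers;
-using System.Threading.Tasks;
-
-class JobPoller
-{
-    static async Task Main()
-    {
-        const string token = "<ENCODED JWT TOKEN>"; // placeholder
-        const string jobId = "nb:jid:UUID:f3c43f94-327f-2347-90bb-3bf79f8559f1"; // placeholder
-
-        using var client = new HttpClient { BaseAddress = new Uri("https://media.windows.net/api/") };
-        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);
-        client.DefaultRequestHeaders.Add("x-ms-version", "2.19");
-        client.DefaultRequestHeaders.Add("DataServiceVersion", "1.0;NetFx");
-        client.DefaultRequestHeaders.Add("MaxDataServiceVersion", "3.0;NetFx");
-        client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
-
-        while (true)
-        {
-            // Same $filter query as the raw request above.
-            string json = await client.GetStringAsync(
-                $"Jobs()?$filter=Id eq '{Uri.EscapeDataString(jobId)}'&$top=1");
-            Console.WriteLine(json);
-
-            // Terminal states in the v2 API: 3 = Finished, 4 = Error, 5 = Canceled.
-            if (json.Contains("\"State\":3") || json.Contains("\"State\":4") || json.Contains("\"State\":5"))
-                break;
-
-            await Task.Delay(TimeSpan.FromSeconds(10)); // polling interval (placeholder)
-        }
-    }
-}
-```
-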
-## See also
-
-[Media Services operations REST API overview](media-services-rest-how-to-use.md)
media-services Media Services Rest Configure Asset Delivery Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-configure-asset-delivery-policy.md
- Title: Configuring asset delivery policies using Media Services REST API | Microsoft Docs
-description: This topic shows how to configure different asset delivery policies using Media Services REST API.
-Previously updated: 03/10/2021
-# Configuring asset delivery policies
-
-If you plan to deliver dynamically encrypted assets, one of the steps in the Media Services content delivery workflow is configuring delivery policies for assets. The asset delivery policy tells Media Services how you want your asset to be delivered: which streaming protocols your asset should be dynamically packaged into (for example, MPEG DASH, HLS, Smooth Streaming, or all), and whether or not you want to dynamically encrypt your asset and how (envelope or common encryption).
-
-This topic discusses why and how to create and configure asset delivery policies.
-
-> [!NOTE]
-> When your AMS account is created a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
->
-> Also, to be able to use dynamic packaging and dynamic encryption your asset must contain a set of adaptive bitrate MP4s or adaptive bitrate Smooth Streaming files.
-
-You could apply different policies to the same asset. For example, you could apply PlayReady encryption to Smooth Streaming and AES Envelope encryption to MPEG DASH and HLS. Any protocols that are not defined in a delivery policy (for example, you add a single policy that only specifies HLS as the protocol) will be blocked from streaming. The exception to this is if you have no asset delivery policy defined at all. Then, all protocols will be allowed in the clear.
-
-If you want to deliver a storage-encrypted asset, you must configure the asset's delivery policy. Before your asset can be streamed, the streaming server removes the storage encryption and streams your content using the specified delivery policy. For example, to deliver your asset encrypted with an Advanced Encryption Standard (AES) envelope encryption key, set the policy type to **DynamicEnvelopeEncryption**. To remove storage encryption and stream the asset in the clear, set the policy type to **NoDynamicEncryption**. Examples that show how to configure these policy types follow.
-
-Depending on how you configure the asset delivery policy, you can dynamically package, dynamically encrypt, and stream content over the following protocols: Smooth Streaming, HLS, and MPEG DASH.
-
-The following list shows the formats that you use to stream Smooth Streaming, HLS, and MPEG DASH.
-
-Smooth Streaming:
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest
-
-HLS:
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=m3u8-aapl)
-
-MPEG DASH:
-
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=mpd-time-csf)
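-
-For example, with a hypothetical streaming endpoint host, locator ID, and filename (all three values below are placeholders), the three formats yield URLs like the following:
-
-```console
-https://amstest.streaming.mediaservices.windows.net/3c5fe231-bd57-4f54-a902-4df71e7e7228/BigBuckBunny.ism/Manifest
-https://amstest.streaming.mediaservices.windows.net/3c5fe231-bd57-4f54-a902-4df71e7e7228/BigBuckBunny.ism/Manifest(format=m3u8-aapl)
-https://amstest.streaming.mediaservices.windows.net/3c5fe231-bd57-4f54-a902-4df71e7e7228/BigBuckBunny.ism/Manifest(format=mpd-time-csf)
-```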
-
-For instructions on how to publish an asset and build a streaming URL, see [Build a streaming URL](media-services-deliver-streaming-content.md).
-
-## Considerations
-* You cannot delete an AssetDeliveryPolicy associated with an asset while an OnDemand (streaming) locator exists for that asset. The recommendation is to remove the policy from the asset before deleting the policy.
-* A streaming locator cannot be created on a storage-encrypted asset when no asset delivery policy is set. If the asset isn't storage encrypted, the system will let you create a locator and stream the asset in the clear without an asset delivery policy.
-* You can have multiple asset delivery policies associated with a single asset, but you can only specify one way to handle a given AssetDeliveryProtocol. If you try to link two delivery policies that both specify the AssetDeliveryProtocol.SmoothStreaming protocol, you'll get an error because the system doesn't know which one to apply when a client makes a Smooth Streaming request.
-* If you have an asset with an existing streaming locator, you cannot link a new policy to the asset, unlink an existing policy from the asset, or update a delivery policy associated with the asset. You first have to remove the streaming locator, adjust the policies, and then re-create the streaming locator. You can use the same locatorId when you recreate the streaming locator, but you should ensure that it won't cause issues for clients, since content can be cached by the origin or a downstream CDN.
-
-> [!NOTE]
->
-> When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
-
-## Connect to Media Services
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-## Clear asset delivery policy
-### <a id="create_asset_delivery_policy"></a>Create asset delivery policy
-The following HTTP request creates an asset delivery policy that specifies to not apply dynamic encryption and to deliver the stream in any of the following protocols: MPEG DASH, HLS, and Smooth Streaming protocols.
-
-For information on what values you can specify when creating an AssetDeliveryPolicy, see the [Types used when defining AssetDeliveryPolicy](#types) section.
-
-Request:
-
-```console
-POST https://media.windows.net/api/AssetDeliveryPolicies HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 4651882c-d7ad-4d5e-86ab-f07f47dcb41e
-Host: media.windows.net
-
-{"Name":"Clear Policy",
-"AssetDeliveryProtocol":7,
-"AssetDeliveryPolicyType":2,
-"AssetDeliveryConfiguration":null}
-```
-
-Response:
-
-```output
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 363
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://media.windows.net/api/AssetDeliveryPolicies('nb%3Aadpid%3AUUID%3A92b0f6ba-3c9f-49b6-a5fa-2a8703b04ecd')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: 4651882c-d7ad-4d5e-86ab-f07f47dcb41e
-request-id: 6aedbf93-4bc2-4586-8845-fd45590136af
-x-ms-request-id: 6aedbf93-4bc2-4586-8845-fd45590136af
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Sun, 08 Feb 2015 06:21:27 GMT
-
-{"odata.metadata":"https://media.windows.net/api/$metadata#AssetDeliveryPolicies/@Element",
-"Id":"nb:adpid:UUID:92b0f6ba-3c9f-49b6-a5fa-2a8703b04ecd",
-"Name":"Clear Policy",
-"AssetDeliveryProtocol":7,
-"AssetDeliveryPolicyType":2,
-"AssetDeliveryConfiguration":null,
-"Created":"2015-02-08T06:21:27.6908329Z",
-"LastModified":"2015-02-08T06:21:27.6908329Z"}
-```
-
-### <a id="link_asset_with_asset_delivery_policy"></a>Link asset with asset delivery policy
-The following HTTP request links the specified asset to the asset delivery policy.
-
-Request:
-
-```console
-POST https://media.windows.net/api/Assets('nb%3Acid%3AUUID%3A86933344-9539-4d0c-be7d-f842458693e0')/$links/DeliveryPolicies HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Content-Type: application/json
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 56d2763f-6e72-419d-ba3c-685f6db97e81
-Host: media.windows.net
-
-{"uri":"https://media.windows.net/api/AssetDeliveryPolicies('nb%3Aadpid%3AUUID%3A92b0f6ba-3c9f-49b6-a5fa-2a8703b04ecd')"}
-```
-
-Response:
-
-```output
-HTTP/1.1 204 No Content
-```
-
-## DynamicEnvelopeEncryption asset delivery policy
-### Create content key of the EnvelopeEncryption type and link it to the asset
-When specifying the DynamicEnvelopeEncryption delivery policy, make sure to link your asset to a content key of the EnvelopeEncryption type. For more information, see [Creating a content key](media-services-rest-create-contentkey.md).
-
-### <a id="get_delivery_url"></a>Get delivery URL
-Get the delivery URL for the specified delivery method of the content key created in the previous step. A client uses the returned URL to request an AES key or a PlayReady license in order to play back the protected content.
-
-Specify the type of URL to get in the body of the HTTP request. If you are protecting your content with PlayReady, request a Media Services PlayReady license acquisition URL, using 1 for the keyDeliveryType: {"keyDeliveryType":1}. If you are protecting your content with envelope encryption, request a key acquisition URL by specifying 2 for keyDeliveryType: {"keyDeliveryType":2}.
-
-Request:
-
-```console
-POST https://media.windows.net/api/ContentKeys('nb:kid:UUID:dc88f996-2859-4cf7-a279-c52a9d6b2f04')/GetKeyDeliveryUrl HTTP/1.1
-Content-Type: application/json
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 569d4b7c-a446-4edc-b77c-9fb686083dd8
-Host: media.windows.net
-Content-Length: 21
-
-{"keyDeliveryType":2}
-```
-
-Response:
-
-```output
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 198
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: 569d4b7c-a446-4edc-b77c-9fb686083dd8
-request-id: d26f65d2-fe65-4136-8fcf-31545be68377
-x-ms-request-id: d26f65d2-fe65-4136-8fcf-31545be68377
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Sun, 08 Feb 2015 21:42:30 GMT
-
-{"odata.metadata":"media.windows.net/api/$metadata#Edm.String","value":"https://amsaccount1.keydelivery.mediaservices.windows.net/?KID=dc88f996-2859-4cf7-a279-c52a9d6b2f04"}
-```
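-
-The same call can be made from code. Here is a minimal C# sketch of the POST shown above; the access token and content key ID are placeholders, and keyDeliveryType 2 requests the envelope (AES) key acquisition URL as in the example request.
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Net.Http.Headers;
-using System.Text;
-using System.Threading.Tasks;
-
-class KeyDeliveryUrlExample
-{
-    static async Task<string> GetKeyDeliveryUrlAsync(string token, string contentKeyId)
-    {
-        using var client = new HttpClient();
-        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);
-        client.DefaultRequestHeaders.Add("x-ms-version", "2.19");
-        client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
-
-        // 2 = BaselineHttp (AES key acquisition); use 1 for a PlayReady license acquisition URL.
-        var body = new StringContent("{\"keyDeliveryType\":2}", Encoding.UTF8, "application/json");
-        var response = await client.PostAsync(
-            $"https://media.windows.net/api/ContentKeys('{contentKeyId}')/GetKeyDeliveryUrl", body);
-
-        // The response body is JSON whose "value" is the key acquisition URL.
-        return await response.Content.ReadAsStringAsync();
-    }
-}
-```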
-
-### Create asset delivery policy
-The following HTTP request creates the **AssetDeliveryPolicy** that is configured to apply dynamic envelope encryption (**DynamicEnvelopeEncryption**) to the **HLS** protocol (in this example, other protocols will be blocked from streaming).
-
-For information on what values you can specify when creating an AssetDeliveryPolicy, see the [Types used when defining AssetDeliveryPolicy](#types) section.
-
-Request:
-
-```console
-POST https://media.windows.net/api/AssetDeliveryPolicies HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: fff319f6-71dd-4f6c-af27-b675c0066fa7
-Host: media.windows.net
-
-{"Name":"AssetDeliveryPolicy","AssetDeliveryProtocol":4,"AssetDeliveryPolicyType":3,"AssetDeliveryConfiguration":"[{\"Key\":2,\"Value\":\"https:\\/\\/amsaccount1.keydelivery.mediaservices.windows.net\\/\"}]"}
-```
-
-Response:
-
-```output
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 460
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: media.windows.net/api/AssetDeliveryPolicies('nb%3Aadpid%3AUUID%3Aec9b994e-672c-4a5b-8490-a464eeb7964b')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: fff319f6-71dd-4f6c-af27-b675c0066fa7
-request-id: c2a1ac0e-9644-474f-b38f-b9541c3a7c5f
-x-ms-request-id: c2a1ac0e-9644-474f-b38f-b9541c3a7c5f
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Mon, 09 Feb 2015 05:24:38 GMT
-
-{"odata.metadata":"media.windows.net/api/$metadata#AssetDeliveryPolicies/@Element","Id":"nb:adpid:UUID:ec9b994e-672c-4a5b-8490-a464eeb7964b","Name":"AssetDeliveryPolicy","AssetDeliveryProtocol":4,"AssetDeliveryPolicyType":3,"AssetDeliveryConfiguration":"[{\"Key\":2,\"Value\":\"https:\\/\\/amsaccount1.keydelivery.mediaservices.windows.net\\/\"}]","Created":"2015-02-09T05:24:38.9167436Z","LastModified":"2015-02-09T05:24:38.9167436Z"}
-```
-
-### Link asset with asset delivery policy
-See [Link asset with asset delivery policy](#link_asset_with_asset_delivery_policy).
-
-## DynamicCommonEncryption asset delivery policy
-### Create content key of the CommonEncryption type and link it to the asset
-When specifying the DynamicCommonEncryption delivery policy, make sure to link your asset to a content key of the CommonEncryption type. For more information, see [Creating a content key](media-services-rest-create-contentkey.md).
-
-### Get Delivery URL
-Get the delivery URL for the PlayReady delivery method of the content key created in the previous step. A client uses the returned URL to request a PlayReady license in order to play back the protected content. For more information, see [Get Delivery URL](#get_delivery_url).
-
-### Create asset delivery policy
-The following HTTP request creates the **AssetDeliveryPolicy** that is configured to apply dynamic common encryption (**DynamicCommonEncryption**) to the **Smooth Streaming** protocol (in this example, other protocols will be blocked from streaming).
-
-For information on what values you can specify when creating an AssetDeliveryPolicy, see the [Types used when defining AssetDeliveryPolicy](#types) section.
-
-Request:
-
-```console
-POST https://media.windows.net/api/AssetDeliveryPolicies HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: fff319f6-71dd-4f6c-af27-b675c0066fa7
-Host: media.windows.net
-
-{"Name":"AssetDeliveryPolicy","AssetDeliveryProtocol":1,"AssetDeliveryPolicyType":4,"AssetDeliveryConfiguration":"[{\"Key\":2,\"Value\":\"https:\\/\\/amsaccount1.keydelivery.mediaservices.windows.net\/PlayReady\/"}]"}
-```
-
-If you want to protect your content using Widevine DRM, update the AssetDeliveryConfiguration values to use WidevineLicenseAcquisitionUrl (which has the value of 7) and specify the URL of a license delivery service. You can use the following AMS partners to help you deliver Widevine licenses: [Axinom](https://www.axinom.com), [EZDRM](https://ezdrm.com/), [castLabs](https://castlabs.com/company/partners/azure/).
-
-For example:
-
-```console
-{"Name":"AssetDeliveryPolicy","AssetDeliveryProtocol":2,"AssetDeliveryPolicyType":4,"AssetDeliveryConfiguration":"[{\"Key\":7,\"Value\":\"https:\\/\\/example.net\/WidevineLicenseAcquisition\/"}]"}
-```
-
-> [!NOTE]
-> When encrypting with Widevine, you can only deliver using MPEG DASH. Make sure to specify DASH (2) in the asset delivery protocol.
-
-### Link asset with asset delivery policy
-See [Link asset with asset delivery policy](#link_asset_with_asset_delivery_policy).
-
-## <a id="types"></a>Types used when defining AssetDeliveryPolicy
-
-### AssetDeliveryProtocol
-
-The following enum describes values you can set for the asset delivery protocol.
-
-```csharp
-[Flags]
-public enum AssetDeliveryProtocol
-{
- /// <summary>
- /// No protocols.
- /// </summary>
- None = 0x0,
-
- /// <summary>
- /// Smooth streaming protocol.
- /// </summary>
- SmoothStreaming = 0x1,
-
- /// <summary>
- /// MPEG Dynamic Adaptive Streaming over HTTP (DASH)
- /// </summary>
- Dash = 0x2,
-
- /// <summary>
- /// Apple HTTP Live Streaming protocol.
- /// </summary>
- HLS = 0x4,
-
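-    /// <summary>
-    /// Progressive download.
-    /// </summary>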
- ProgressiveDownload = 0x10,
-
- /// <summary>
- /// Include all protocols.
- /// </summary>
- All = 0xFFFF
-}
-```
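-
-Because this is a [Flags] enum, the AssetDeliveryProtocol value of 7 sent in the clear-policy request earlier is simply the combination of the first three protocols. A quick sketch, assuming the enum above is in scope:
-
-```csharp
-var protocols = AssetDeliveryProtocol.SmoothStreaming  // 0x1
-              | AssetDeliveryProtocol.Dash             // 0x2
-              | AssetDeliveryProtocol.HLS;             // 0x4
-System.Console.WriteLine((int)protocols); // prints 7
-```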
-
-### AssetDeliveryPolicyType
-
-The following enum describes values you can set for the asset delivery policy type.
-
-```csharp
-public enum AssetDeliveryPolicyType
-{
- /// <summary>
- /// Delivery Policy Type not set. An invalid value.
- /// </summary>
- None,
-
- /// <summary>
- /// The Asset should not be delivered via this AssetDeliveryProtocol.
- /// </summary>
- Blocked,
-
- /// <summary>
- /// Do not apply dynamic encryption to the asset.
- /// </summary>
- NoDynamicEncryption,
-
- /// <summary>
- /// Apply Dynamic Envelope encryption.
- /// </summary>
- DynamicEnvelopeEncryption,
-
- /// <summary>
- /// Apply Dynamic Common encryption.
- /// </summary>
- DynamicCommonEncryption
-}
-```
-
-### ContentKeyDeliveryType
-
-The following enum describes values you can use to configure the delivery method of the content key to the client.
-
-```csharp
-public enum ContentKeyDeliveryType
-{
-    /// <summary>
-    /// None.
-    /// </summary>
-    None = 0,
-
-    /// <summary>
-    /// Use PlayReady License acquisition protocol.
-    /// </summary>
-    PlayReadyLicense = 1,
-
-    /// <summary>
-    /// Use MPEG Baseline HTTP key protocol.
-    /// </summary>
-    BaselineHttp = 2,
-
-    /// <summary>
-    /// Use Widevine License acquisition protocol.
-    /// </summary>
-    Widevine = 3
-}
-```
-
-### AssetDeliveryPolicyConfigurationKey
-
-The following enum describes values you can set to configure keys used to get specific configuration for an asset delivery policy.
-
-```csharp
-public enum AssetDeliveryPolicyConfigurationKey
-{
- /// <summary>
- /// No policies.
- /// </summary>
- None,
-
- /// <summary>
- /// Exact Envelope key URL.
- /// </summary>
- EnvelopeKeyAcquisitionUrl,
-
- /// <summary>
- /// Base key url that will have KID=<Guid> appended for Envelope.
- /// </summary>
- EnvelopeBaseKeyAcquisitionUrl,
-
- /// <summary>
- /// The initialization vector to use for envelope encryption in Base64 format.
- /// </summary>
- EnvelopeEncryptionIVAsBase64,
-
- /// <summary>
- /// The PlayReady License Acquisition Url to use for common encryption.
- /// </summary>
- PlayReadyLicenseAcquisitionUrl,
-
- /// <summary>
- /// The PlayReady Custom Attributes to add to the PlayReady Content Header
- /// </summary>
- PlayReadyCustomAttributes,
-
- /// <summary>
- /// The initialization vector to use for envelope encryption.
- /// </summary>
- EnvelopeEncryptionIV,
-
- /// <summary>
- /// Widevine DRM acquisition url
- /// </summary>
- WidevineLicenseAcquisitionUrl
-}
-```
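-
-The AssetDeliveryConfiguration property in the requests above is itself a serialized JSON string: an array of {Key, Value} pairs in which Key is the integer value of one of these enum members. A minimal sketch of producing that string (the key value 2 and the URL mirror the envelope example earlier):
-
-```csharp
-using System;
-using System.Text.Json;
-
-class DeliveryConfigurationExample
-{
-    static void Main()
-    {
-        var pairs = new[]
-        {
-            new { Key = 2, Value = "https://amsaccount1.keydelivery.mediaservices.windows.net/" }
-        };
-
-        // Prints [{"Key":2,"Value":"https://..."}]; embed this string (JSON-escaped)
-        // as the "AssetDeliveryConfiguration" property of the policy request body.
-        Console.WriteLine(JsonSerializer.Serialize(pairs));
-    }
-}
-```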
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
media-services Media Services Rest Configure Content Key Auth Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-configure-content-key-auth-policy.md
- Title: Configure a content key authorization policy with REST - Azure | Microsoft Docs
-description: Learn how to configure an authorization policy for a content key by using the Media Services REST API.
-Previously updated: 03/10/2021
-# Dynamic encryption: Configure a content key authorization policy
-
-
-
-## Overview
- You can use Azure Media Services to deliver your content encrypted (dynamically) with the Advanced Encryption Standard (AES) by using 128-bit encryption keys and PlayReady or Widevine digital rights management (DRM). Media Services also provides a service for delivering keys and PlayReady/Widevine licenses to authorized clients.
-
-If you want Media Services to encrypt an asset, you need to associate an encryption key (CommonEncryption or EnvelopeEncryption) with the asset. For more information, see [Create content keys with REST](media-services-rest-create-contentkey.md). You also need to configure authorization policies for the key (as described in this article).
-
-When a stream is requested by a player, Media Services uses the specified key to dynamically encrypt your content by using AES or PlayReady encryption. To decrypt the stream, the player requests the key from the key delivery service. To determine whether the user is authorized to get the key, the service evaluates the authorization policies that you specified for the key.
-
-Media Services supports multiple ways of authenticating users who make key requests. The content key authorization policy can have one or more authorization restrictions by using either the open or token restriction. The token-restricted policy must be accompanied by a token issued by a security token service (STS). Media Services supports tokens in the simple web token ([SWT](/previous-versions/azure/azure-services/gg185950(v=azure.100)#BKMK_2)) and JSON Web Token (JWT) formats.
-
-Media Services doesn't provide an STS. You can create a custom STS or use Azure Active Directory (Azure AD) to issue tokens. The STS must be configured to create a token signed with the specified key and issue claims that you specified in the token restriction configuration (as described in this article). If the token is valid and the claims in the token match those configured for the content key, the Media Services key delivery service returns the encryption key to the client.
-
-For more information, see the following articles:
-- [JWT token authentication](http://www.gtrifonov.com/2015/01/03/jwt-token-authentication-in-azure-media-services-and-dynamic-encryption/)
-- [Integrate an Azure Media Services OWIN MVC-based app with Azure Active Directory, and restrict content key delivery based on JWT claims](http://www.gtrifonov.com/2015/01/24/mvc-owin-azure-media-services-ad-integration/)
-### Considerations
-* To use dynamic packaging and dynamic encryption, make sure the streaming endpoint from which you want to stream your content is in the "Running" state.
-* Your asset must contain a set of adaptive bitrate MP4s or adaptive bitrate Smooth Streaming files. For more information, see [Encode an asset](media-services-encode-asset.md).
-* Upload and encode your assets by using the AssetCreationOptions.StorageEncrypted option.
-* If you plan to have multiple content keys that require the same policy configuration, we recommend that you create a single authorization policy and reuse it with multiple content keys.
-* The key delivery service caches ContentKeyAuthorizationPolicy and its related objects (policy options and restrictions) for 15 minutes. If you create a ContentKeyAuthorizationPolicy with a token restriction, test it, and then update the policy to an open restriction, it takes roughly 15 minutes before the policy switches to the open version.
-* If you add or update your asset's delivery policy, you must delete any existing locator and create a new locator.
-* Currently, you can't encrypt progressive downloads.
-* Media Services streaming endpoint sets the value of the CORS Access-Control-Allow-Origin header in preflight response as the wildcard "\*." This value works well with most players, including Azure Media Player, Roku and JWPlayer, and others. However, some players that use dash.js don't work because, with the credentials mode set to "include," XMLHttpRequest in their dash.js doesn't allow the wildcard "\*" as the value of Access-Control-Allow-Origin. As a workaround to this limitation in dash.js, if you host your client from a single domain, Media Services can specify that domain in the preflight response header. For assistance, open a support ticket through the Azure portal.
-
-## AES-128 dynamic encryption
-> [!NOTE]
-> When you work with the Media Services REST API, the following considerations apply.
->
-> When you access entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API development](media-services-rest-how-to-use.md).
-
-### Open restriction
-Open restriction means the system delivers the key to anyone who makes a key request. This restriction might be useful for testing purposes.
-
-The following example creates an open authorization policy and adds it to the content key.
-
-#### <a id="ContentKeyAuthorizationPolicies"></a>Create ContentKeyAuthorizationPolicies
-Request:
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicies HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: d732dbfa-54fc-474c-99d6-9b46a006f389
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 36
-
-{"Name":"Open Authorization Policy"}
-```
-
-Response:
-
-```output
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 211
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicies('nb%3Ackpid%3AUUID%3Adb4593da-f4d1-4cc5-a92a-d20eacbabee4')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: d732dbfa-54fc-474c-99d6-9b46a006f389
-request-id: aabfa731-e884-4bf3-8314-492b04747ac4
-x-ms-request-id: aabfa731-e884-4bf3-8314-492b04747ac4
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Tue, 10 Feb 2015 08:25:56 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#ContentKeyAuthorizationPolicies/@Element","Id":"nb:ckpid:UUID:db4593da-f4d1-4cc5-a92a-d20eacbabee4","Name":"Open Authorization Policy"}
-```
-
-#### <a id="ContentKeyAuthorizationPolicyOptions"></a>Create ContentKeyAuthorizationPolicyOptions
-Request:
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicyOptions HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 3.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: d225e357-e60e-4f42-add8-9d93aba1409a
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 168
-
-{"Name":"policy","KeyDeliveryType":2,"KeyDeliveryConfiguration":"","Restrictions":[{"Name":"HLS Open Authorization Policy","KeyRestrictionType":0,"Requirements":null}]}
-```
-
-Response:
-
-```output
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 349
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicyOptions('nb%3Ackpoid%3AUUID%3A57829b17-1101-4797-919b-f816f4a007b7')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: d225e357-e60e-4f42-add8-9d93aba1409a
-request-id: 81bcad37-295b-431f-972f-b23f2e4172c9
-x-ms-request-id: 81bcad37-295b-431f-972f-b23f2e4172c9
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Tue, 10 Feb 2015 08:56:40 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#ContentKeyAuthorizationPolicyOptions/@Element","Id":"nb:ckpoid:UUID:57829b17-1101-4797-919b-f816f4a007b7","Name":"policy","KeyDeliveryType":2,"KeyDeliveryConfiguration":"","Restrictions":[{"Name":"HLS Open Authorization Policy","KeyRestrictionType":0,"Requirements":null}]}
-```
-
-#### <a id="LinkContentKeyAuthorizationPoliciesWithOptions"></a>Link ContentKeyAuthorizationPolicies with Options
-Request:
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicies('nb%3Ackpid%3AUUID%3A0baa438b-8ac2-4c40-a53c-4d4722b78715')/$links/Options HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Content-Type: application/json
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 9847f705-f2ca-4e95-a478-8f823dbbaa29
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 154
-
-{"uri":"https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicyOptions('nb%3Ackpoid%3AUUID%3A57829b17-1101-4797-919b-f816f4a007b7')"}
-```
-
-Response:
-
-```output
-HTTP/1.1 204 No Content
-```
-
-#### <a id="AddAuthorizationPolicyToKey"></a>Add an authorization policy to the content key
-Request:
-
-```console
-PUT https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeys('nb%3Akid%3AUUID%3A2e6d36a7-a17c-4e9a-830d-eca23ad1a6f9') HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: e613efff-cb6a-41b4-984a-f4f8fb6e76a4
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 78
-
-{"AuthorizationPolicyId":"nb:ckpid:UUID:c06cebb8-c4f0-4d1a-ba00-3273fb2bc3ad"}
-```
-
-Response:
-
-```output
-HTTP/1.1 204 No Content
-```
-
-### Token restriction
-This section describes how to create a content key authorization policy and associate it with the content key. The authorization policy describes what authorization requirements must be met to determine if the user is authorized to receive the key. For example, does the verification key list contain the key that the token was signed with?
-
-To configure the token restriction option, you need to use XML to describe the token's authorization requirements. The token restriction configuration XML must conform to the following XML schema:
-
-#### <a id="schema"></a>Token restriction schema
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<xs:schema xmlns:tns="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/TokenRestrictionTemplate/v1" elementFormDefault="qualified" targetNamespace="http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/TokenRestrictionTemplate/v1" xmlns:xs="https://www.w3.org/2001/XMLSchema">
- <xs:complexType name="TokenClaim">
- <xs:sequence>
- <xs:element name="ClaimType" nillable="true" type="xs:string" />
- <xs:element minOccurs="0" name="ClaimValue" nillable="true" type="xs:string" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="TokenClaim" nillable="true" type="tns:TokenClaim" />
- <xs:complexType name="TokenRestrictionTemplate">
- <xs:sequence>
- <xs:element minOccurs="0" name="AlternateVerificationKeys" nillable="true" type="tns:ArrayOfTokenVerificationKey" />
- <xs:element name="Audience" nillable="true" type="xs:anyURI" />
- <xs:element name="Issuer" nillable="true" type="xs:anyURI" />
- <xs:element name="PrimaryVerificationKey" nillable="true" type="tns:TokenVerificationKey" />
- <xs:element minOccurs="0" name="RequiredClaims" nillable="true" type="tns:ArrayOfTokenClaim" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="TokenRestrictionTemplate" nillable="true" type="tns:TokenRestrictionTemplate" />
- <xs:complexType name="ArrayOfTokenVerificationKey">
- <xs:sequence>
- <xs:element minOccurs="0" maxOccurs="unbounded" name="TokenVerificationKey" nillable="true" type="tns:TokenVerificationKey" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ArrayOfTokenVerificationKey" nillable="true" type="tns:ArrayOfTokenVerificationKey" />
- <xs:complexType name="TokenVerificationKey">
- <xs:sequence />
- </xs:complexType>
- <xs:element name="TokenVerificationKey" nillable="true" type="tns:TokenVerificationKey" />
- <xs:complexType name="ArrayOfTokenClaim">
- <xs:sequence>
- <xs:element minOccurs="0" maxOccurs="unbounded" name="TokenClaim" nillable="true" type="tns:TokenClaim" />
- </xs:sequence>
- </xs:complexType>
- <xs:element name="ArrayOfTokenClaim" nillable="true" type="tns:ArrayOfTokenClaim" />
- <xs:complexType name="SymmetricVerificationKey">
- <xs:complexContent mixed="false">
- <xs:extension base="tns:TokenVerificationKey">
- <xs:sequence>
- <xs:element name="KeyValue" nillable="true" type="xs:base64Binary" />
- </xs:sequence>
- </xs:extension>
- </xs:complexContent>
- </xs:complexType>
- <xs:element name="SymmetricVerificationKey" nillable="true" type="tns:SymmetricVerificationKey" />
-</xs:schema>
-```
-
-When you configure the token-restricted policy, you must specify the primary verification key, issuer, and audience parameters. The primary verification key contains the key that the token was signed with. The issuer is the STS that issues the token. The audience (sometimes called scope) describes the intent of the token or the resource the token authorizes access to. The Media Services key delivery service validates that these values in the token match the values in the template.
-
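-The KeyValue elements in this template carry base64-encoded symmetric signing keys (the examples below use 64-byte keys). As a minimal sketch, you could generate a suitable key like this; the choice of 64 bytes simply matches the sample values:
-
-```csharp
-using System;
-using System.Security.Cryptography;
-
-class SigningKeyGenerator
-{
-    static void Main()
-    {
-        // Generate a 64-byte symmetric key and print it base64-encoded,
-        // suitable for a <KeyValue> element in the token restriction template.
-        byte[] key = RandomNumberGenerator.GetBytes(64);
-        Console.WriteLine(Convert.ToBase64String(key));
-    }
-}
-```
-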
-The following example creates an authorization policy with a token restriction. In this example, the client must present a token that contains the signing key (VerificationKey), a token issuer, and required claims.
-
-### Create ContentKeyAuthorizationPolicies
-Create a token restriction policy, as shown in the section "[Create ContentKeyAuthorizationPolicies](#ContentKeyAuthorizationPolicies)."
-
-### Create ContentKeyAuthorizationPolicyOptions
-Request:
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicyOptions HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 3.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 2643d836-bfe7-438e-9ba2-bc6ff28e4a53
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 1079
-
-{"Name":"Token option for HLS","KeyDeliveryType":2,"KeyDeliveryConfiguration":null,"Restrictions":[{"Name":"Token Authorization Policy","KeyRestrictionType":1,"Requirements":"<TokenRestrictionTemplate xmlns:i=\"https://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/TokenRestrictionTemplate/v1\"><AlternateVerificationKeys><TokenVerificationKey i:type=\"SymmetricVerificationKey\"><KeyValue>BklyAFiPTQsuJNKriQJBZHYaKM2CkCTDQX2bw9sMYuvEC9sjW0W7GUIBygQL/+POEeUqCYPnmEU2g0o1GW2Oqg==</KeyValue></TokenVerificationKey></AlternateVerificationKeys><Audience>urn:test</Audience><Issuer>http://testissuer.com/</Issuer><PrimaryVerificationKey i:type=\"SymmetricVerificationKey\"><KeyValue>E5BUHiN4vBdzUzdP0IWaHFMMU3D1uRZgF16TOhSfwwHGSw+Kbf0XqsHzEIYk11M372viB9vbiacsdcQksA0ftw==</KeyValue></PrimaryVerificationKey><RequiredClaims><TokenClaim><ClaimType>urn:microsoft:azure:media
-```
-
-Response:
-
-```output
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 1260
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicyOptions('nb%3Ackpoid%3AUUID%3Ae1ef6145-46e8-4ee6-9756-b1cf96328c23')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: 2643d836-bfe7-438e-9ba2-bc6ff28e4a53
-request-id: 2310b716-aeaa-421e-913e-3ce2f6f685ca
-x-ms-request-id: 2310b716-aeaa-421e-913e-3ce2f6f685ca
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Tue, 10 Feb 2015 09:10:37 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#ContentKeyAuthorizationPolicyOptions/@Element","Id":"nb:ckpoid:UUID:e1ef6145-46e8-4ee6-9756-b1cf96328c23","Name":"Token option for HLS","KeyDeliveryType":2,"KeyDeliveryConfiguration":null,"Restrictions":[{"Name":"Token Authorization Policy","KeyRestrictionType":1,"Requirements":"<TokenRestrictionTemplate xmlns:i=\"https://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/TokenRestrictionTemplate/v1\"><AlternateVerificationKeys><TokenVerificationKey i:type=\"SymmetricVerificationKey\"><KeyValue>BklyAFiPTQsuJNKriQJBZHYaKM2CkCTDQX2bw9sMYuvEC9sjW0W7GUIBygQL/+POEeUqCYPnmEU2g0o1GW2Oqg==</KeyValue></TokenVerificationKey></AlternateVerificationKeys><Audience>urn:test</Audience><Issuer>http://testissuer.com/</Issuer><PrimaryVerificationKey i:type=\"SymmetricVerificationKey\"><KeyValue>E5BUHiN4vBdzUzdP0IWaHFMMU3D1uRZgF16TOhSfwwHGSw+Kbf0XqsHzEIYk11M372viB9vbiacsdcQksA0ftw==</KeyValue></PrimaryVerificationKey><RequiredClaims><TokenClaim><ClaimType>urn:microsoft:azure:media
-```
-
-#### Link ContentKeyAuthorizationPolicies with options
-Link ContentKeyAuthorizationPolicies with options, as shown in the section "[Link ContentKeyAuthorizationPolicies with Options](#LinkContentKeyAuthorizationPoliciesWithOptions)."
-
-#### Add an authorization policy to the content key
-Add AuthorizationPolicy to ContentKey, as shown in the section "[Add an authorization policy to the content key](#AddAuthorizationPolicyToKey)."
-
-## PlayReady dynamic encryption
-You can use Media Services to configure the rights and restrictions that you want the PlayReady DRM runtime to enforce when a user tries to play back protected content.
-
-When you protect your content with PlayReady, one of the things you need to specify in your authorization policy is an XML string that defines the [PlayReady license template](media-services-playready-license-template-overview.md).
-
-### Open restriction
-Open restriction means the system delivers the key to anyone who makes a key request. This restriction might be useful for testing purposes.
-
-The following example creates an open authorization policy and adds it to the content key.
-
-#### <a id="ContentKeyAuthorizationPolicies2"></a>Create ContentKeyAuthorizationPolicies
-Request:
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicies HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 9e7fa407-f84e-43aa-8f05-9790b46e279b
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 58
-
-{"Name":"Deliver Common Content Key"}
-```
-
-Response:
-
-```output
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 233
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicies('nb%3Ackpid%3AUUID%3Acc3c64a8-e2fc-4e09-bf60-ac954251a387')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: 9e7fa407-f84e-43aa-8f05-9790b46e279b
-request-id: b3d33c1b-a9cb-4120-ac0c-18f64846c147
-x-ms-request-id: b3d33c1b-a9cb-4120-ac0c-18f64846c147
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Tue, 10 Feb 2015 09:26:00 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#ContentKeyAuthorizationPolicies/@Element","Id":"nb:ckpid:UUID:cc3c64a8-e2fc-4e09-bf60-ac954251a387","Name":"Deliver Common Content Key"}
-```
-
-#### Create ContentKeyAuthorizationPolicyOptions
-Request:
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicyOptions HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 3.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: f160ad25-b457-4bc6-8197-315604c5e585
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 593
-
-{"Name":"","KeyDeliveryType":1,"KeyDeliveryConfiguration":"<PlayReadyLicenseResponseTemplate xmlns:i=\"https://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1\"><LicenseTemplates><PlayReadyLicenseTemplate><AllowTestDevices>false</AllowTestDevices><ContentKey i:type=\"ContentEncryptionKeyFromHeader\" /><LicenseType>Nonpersistent</LicenseType><PlayRight /></PlayReadyLicenseTemplate></LicenseTemplates></PlayReadyLicenseResponseTemplate>","Restrictions":[{"Name":"Open","KeyRestrictionType":0,"Requirements":null}]}
-```
-
-Response:
-
-```output
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 774
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicyOptions('nb%3Ackpoid%3AUUID%3A1052308c-4df7-4fdb-8d21-4d2141fc2be0')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: f160ad25-b457-4bc6-8197-315604c5e585
-request-id: 563f5a42-50a4-4c4a-add8-a833f8364231
-x-ms-request-id: 563f5a42-50a4-4c4a-add8-a833f8364231
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Tue, 10 Feb 2015 09:23:24 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#ContentKeyAuthorizationPolicyOptions/@Element","Id":"nb:ckpoid:UUID:1052308c-4df7-4fdb-8d21-4d2141fc2be0","Name":"","KeyDeliveryType":1,"KeyDeliveryConfiguration":"<PlayReadyLicenseResponseTemplate xmlns:i=\"https://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1\"><LicenseTemplates><PlayReadyLicenseTemplate><AllowTestDevices>false</AllowTestDevices><ContentKey i:type=\"ContentEncryptionKeyFromHeader\" /><LicenseType>Nonpersistent</LicenseType><PlayRight /></PlayReadyLicenseTemplate></LicenseTemplates></PlayReadyLicenseResponseTemplate>","Restrictions":[{"Name":"Open","KeyRestrictionType":0,"Requirements":null}]}
-```
-
-#### Link ContentKeyAuthorizationPolicies with options
-Link ContentKeyAuthorizationPolicies with options, as shown in the section "[Link ContentKeyAuthorizationPolicies with Options](#LinkContentKeyAuthorizationPoliciesWithOptions)."
-
-#### Add an authorization policy to the content key
-Add AuthorizationPolicy to ContentKey, as shown in the section "[Add an authorization policy to the content key](#AddAuthorizationPolicyToKey)."
-
-### Token restriction
-To configure the token restriction option, you need to use XML to describe the token's authorization requirements. The token restriction configuration XML must conform to the XML schema shown in the section "[Token restriction schema](#schema)."
-
-#### Create ContentKeyAuthorizationPolicies
-Create ContentKeyAuthorizationPolicies, as shown in the section "[Create ContentKeyAuthorizationPolicies](#ContentKeyAuthorizationPolicies2)."
-
-#### Create ContentKeyAuthorizationPolicyOptions
-Request:
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicyOptions HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 3.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: ab079b0e-2ba9-4cf1-b549-a97bfa6cd2d3
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 1525
-
-{"Name":"Token option","KeyDeliveryType":1,"KeyDeliveryConfiguration":"<PlayReadyLicenseResponseTemplate xmlns:i=\"https://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1\"><LicenseTemplates><PlayReadyLicenseTemplate><AllowTestDevices>false</AllowTestDevices><ContentKey i:type=\"ContentEncryptionKeyFromHeader\" /><LicenseType>Nonpersistent</LicenseType><PlayRight /></PlayReadyLicenseTemplate></LicenseTemplates></PlayReadyLicenseResponseTemplate>","Restrictions":[{"Name":"Token Authorization Policy","KeyRestrictionType":1,"Requirements":"<TokenRestrictionTemplate xmlns:i=\"https://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/TokenRestrictionTemplate/v1\"><AlternateVerificationKeys><TokenVerificationKey i:type=\"SymmetricVerificationKey\"><KeyValue>w52OyHVqXT8aaupGxuJ3NGt8M6opHDOtx132p4r6q4hLI6ffnLusgEGie1kedUewVoIe1tqDkVE6xsIV7O91KA==</KeyValue></TokenVerificationKey></AlternateVerificationKeys><Audience>urn:test</Audience><Issuer>http://testissuer.com/</Issuer><PrimaryVerificationKey i:type=\"SymmetricVerificationKey\"><KeyValue>dYwLKIEMBljLeY9VM7vWdlhps31Fbt0XXhqP5VyjQa33bJXleBtkzQ6dF5AtwI9gDcdM2dV2TvYNhCilBKjMCg==</KeyValue></PrimaryVerificationKey><RequiredClaims><TokenClaim><ClaimType>urn:microsoft:azure:media
-```
-
-Response:
-
-```output
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 1706
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeyAuthorizationPolicyOptions('nb%3Ackpoid%3AUUID%3Ae42bbeae-de42-4077-90e9-a844f297ef70')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: ab079b0e-2ba9-4cf1-b549-a97bfa6cd2d3
-request-id: ccf8a4ba-731e-4124-8192-079592c251cc
-x-ms-request-id: ccf8a4ba-731e-4124-8192-079592c251cc
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Tue, 10 Feb 2015 09:58:47 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#ContentKeyAuthorizationPolicyOptions/@Element","Id":"nb:ckpoid:UUID:e42bbeae-de42-4077-90e9-a844f297ef70","Name":"Token option","KeyDeliveryType":1,"KeyDeliveryConfiguration":"<PlayReadyLicenseResponseTemplate xmlns:i=\"https://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/PlayReadyTemplate/v1\"><LicenseTemplates><PlayReadyLicenseTemplate><AllowTestDevices>false</AllowTestDevices><ContentKey i:type=\"ContentEncryptionKeyFromHeader\" /><LicenseType>Nonpersistent</LicenseType><PlayRight /></PlayReadyLicenseTemplate></LicenseTemplates></PlayReadyLicenseResponseTemplate>","Restrictions":[{"Name":"Token Authorization Policy","KeyRestrictionType":1,"Requirements":"<TokenRestrictionTemplate xmlns:i=\"https://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://schemas.microsoft.com/Azure/MediaServices/KeyDelivery/TokenRestrictionTemplate/v1\"><AlternateVerificationKeys><TokenVerificationKey i:type=\"SymmetricVerificationKey\"><KeyValue>w52OyHVqXT8aaupGxuJ3NGt8M6opHDOtx132p4r6q4hLI6ffnLusgEGie1kedUewVoIe1tqDkVE6xsIV7O91KA==</KeyValue></TokenVerificationKey></AlternateVerificationKeys><Audience>urn:test</Audience><Issuer>http://testissuer.com/</Issuer><PrimaryVerificationKey i:type=\"SymmetricVerificationKey\"><KeyValue>dYwLKIEMBljLeY9VM7vWdlhps31Fbt0XXhqP5VyjQa33bJXleBtkzQ6dF5AtwI9gDcdM2dV2TvYNhCilBKjMCg==</KeyValue></PrimaryVerificationKey><RequiredClaims><TokenClaim><ClaimType>urn:microsoft:azure:media
-```
-
-#### Link ContentKeyAuthorizationPolicies with options
-Link ContentKeyAuthorizationPolicies with options, as shown in the section "[Link ContentKeyAuthorizationPolicies with Options](#LinkContentKeyAuthorizationPoliciesWithOptions)."
-
-#### Add an authorization policy to the content key
-Add AuthorizationPolicy to ContentKey, as shown in the section "[Add an authorization policy to the content key](#AddAuthorizationPolicyToKey)."
-
-## <a id="types"></a>Types used when you define ContentKeyAuthorizationPolicy
-### <a id="ContentKeyRestrictionType"></a>ContentKeyRestrictionType
-```csharp
-public enum ContentKeyRestrictionType
-{
- Open = 0,
- TokenRestricted = 1,
- IPRestricted = 2, // IP restriction on content key is not currently supported, reserved for future.
-}
-```
-
-> [!NOTE]
-> IP restriction on content key authorization policies is not yet available in the service.
-
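-These integer values appear directly as KeyRestrictionType in the option JSON shown earlier (0 for open, 1 for token-restricted). As a sketch, the open-restriction option body from the AES-128 walkthrough can be composed like this (property names mirror those requests; the enum is redeclared locally so the snippet stands alone):
-
-```csharp
-using System;
-using System.Text.Json;
-
-class OpenOptionBody
-{
-    enum ContentKeyRestrictionType { Open = 0, TokenRestricted = 1, IPRestricted = 2 }
-
-    static void Main()
-    {
-        var option = new
-        {
-            Name = "policy",
-            KeyDeliveryType = 2, // BaselineHttp (see the next enum)
-            KeyDeliveryConfiguration = "",
-            Restrictions = new[]
-            {
-                new
-                {
-                    Name = "HLS Open Authorization Policy",
-                    KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
-                    Requirements = (string)null
-                }
-            }
-        };
-        Console.WriteLine(JsonSerializer.Serialize(option));
-    }
-}
-```
-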
-### <a id="ContentKeyDeliveryType"></a>ContentKeyDeliveryType
-```csharp
-public enum ContentKeyDeliveryType
-{
- None = 0,
- PlayReadyLicense = 1,
- BaselineHttp = 2,
- Widevine = 3
-}
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-Now that you have configured a content key's authorization policy, see [Configure asset delivery policy](media-services-rest-configure-asset-delivery-policy.md).
media-services Media Services Rest Connect With Aad https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-connect-with-aad.md
- Title: Use Azure AD authentication to access Azure Media Services API with REST | Microsoft Docs
-description: Learn how to access Azure Media Services API with Azure Active Directory authentication by using REST.
-Previously updated: 03/10/2021
-# Use Azure AD authentication to access the Media Services API with REST
-
-When you're using Azure AD authentication with Azure Media Services, you can authenticate in one of two ways:
-
-- **User authentication** authenticates a person who is using the app to interact with Azure Media Services resources. The interactive application should first prompt the user for credentials. An example is a management console app that's used by authorized users to monitor encoding jobs or live streaming.
-- **Service principal authentication** authenticates a service. Applications that commonly use this authentication method are apps that run daemon services, middle-tier services, or scheduled jobs, such as web apps, function apps, logic apps, APIs, or microservices.
-This tutorial shows you how to use Azure AD **service principal** authentication to access the AMS API with REST.
-
- > [!NOTE]
- > **Service principal** is the recommended best practice for most applications connecting to Azure Media Services.
-
-In this tutorial, you learn how to:
-
-> [!div class="checklist"]
-> * Get the authentication information from the Azure portal
-> * Get the access token using Postman
-> * Test the **Assets** API using the access token
-
-> [!IMPORTANT]
-> Currently, Media Services supports the Azure Access Control services authentication model. However, Access Control authentication will be deprecated on June 1, 2018. We recommend that you migrate to the Azure AD authentication model as soon as possible.
-
-## Prerequisites
-
-- If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
-- [Create an Azure Media Services account using the Azure portal](media-services-portal-create-account.md).
-- Review the [Accessing Azure Media Services API with Azure AD authentication overview](media-services-use-aad-auth-to-access-ams-api.md) article.
-- Install the [Postman](https://www.getpostman.com/) REST client to execute the REST APIs shown in this article.
-In this tutorial, we use **Postman**, but any REST tool would be suitable. Alternatives include **Visual Studio Code** with a REST plugin and **Telerik Fiddler**.
-
-## Get the authentication information from the Azure portal
-
-### Overview
-
-To access Media Services API, you need to collect the following data points.
-
-|Setting|Example|Description|
-|---|---|---|
-|Azure Active Directory tenant domain|microsoft.onmicrosoft.com|The Azure AD Secure Token Service (STS) endpoint is created using the following format: <https://login.microsoftonline.com/{your-ad-tenant-name.onmicrosoft.com}/oauth2/token>. Azure AD issues a JWT for accessing resources (an access token).|
-|REST API endpoint|<https://amshelloworld.restv2.westus.media.azure.net/api/>|This is the endpoint against which all Media Services REST API calls in your application are made.|
-|Client ID (Application ID)|f7fbbb29-a02d-4d91-bbc6-59a2579259d2|Azure AD application (client) ID. The client ID is required to get the access token. |
-|Client Secret|+mUERiNzVMoJGggD6aV1etzFGa1n6KeSlLjIq+Dbim0=|Azure AD application keys (client secret). The client secret is required to get the access token.|
-
-### Get Azure Active Directory auth info from the Azure portal
-
-To get the information, follow these steps:
-
-1. Log in to the [Azure portal](https://portal.azure.com).
-2. Navigate to your AMS instance.
-3. Select **API access**.
-4. Click on **Connect to Azure Media Services API with service principal**.
-
- ![Screenshot that shows "A P I access" selected from the "Media Services" menu and "Connect to Azure Media Services A P I with service principal" selected from the right pane.](./media/connect-with-rest/connect-with-rest01.png)
-
-5. Select an existing **Azure AD application** or create a new one (shown below).
-
- > [!NOTE]
- > For the Azure Media REST request to succeed, the calling user must have a **Contributor** or **Owner** role for the Media Services account it is trying to access. If you get an exception that says "The remote server returned an error: (401) Unauthorized," see [Access control](media-services-use-aad-auth-to-access-ams-api.md#access-control).
-
- If you need to create a new AD app, follow these steps:
-
- 1. Press **Create New**.
- 2. Enter a name.
- 3. Press **Create New** again.
- 4. Press **Save**.
-
- ![Screenshot that shows the "Create New" dialog with the "Create app" text box highlighted, and the "Save" button selected.](./media/connect-with-rest/new-app.png)
-
- The new app shows up on the page.
-
-6. Get the **Client ID** (Application ID).
-
- 1. Select the application.
- 2. Get the **Client ID** from the window on the right.
-
- ![Screenshot that shows the "Azure A D app" and "Manage application" selected, and the "Client I D" highlighted in the right pane.](./media/connect-with-rest/existing-client-id.png)
-
-7. Get the application's **Key** (client secret).
-
- 1. Click the **Manage application** button (notice that the Client ID info is under **Application ID**).
- 2. Press **Keys**.
-
- ![Screenshot that shows the "Manage application" button selected, the "Application I D" in the center pane highlighted, and "Keys" selected in the right pane.](./media/connect-with-rest/manage-app.png)
- 3. Generate the app key (client secret) by filling in **DESCRIPTION** and **EXPIRES** and pressing **Save**.
-
- Once the **Save** button is pressed, the key value appears. Copy the key value before leaving the blade.
-
- ![API access](./media/connect-with-rest/connect-with-rest03.png)
-
-You can add the values of these Azure AD connection parameters to your web.config or app.config file to use later in your code.
-
-> [!IMPORTANT]
-> The **client secret** is an important secret and should be properly secured in a key vault or encrypted in production.
-
-## Get the access token using Postman
-
-This section shows how to use **Postman** to execute a REST API call that returns a JWT bearer token (access token). To call any Media Services REST API, you need to add an "Authorization" header with the value "Bearer *your_access_token*" to each call (as shown in the next section of this tutorial).
-
-1. Open **Postman**.
-2. Select **POST**.
-3. Enter the token URL that includes your tenant name, in the following format (the tenant name should end with **.onmicrosoft.com**, and the URL should end with **oauth2/token**):
-
- `https://login.microsoftonline.com/{your-aad-tenant-name.onmicrosoft.com}/oauth2/token`
-
-4. Select the **Headers** tab.
-5. Enter the **Headers** information using the "Key/Value" data grid.
-
- ![Screenshot that shows the "Headers" tab and the "Bulk Edit" action selected.](./media/connect-with-rest/headers-data-grid.png)
-
-    Alternatively, click the **Bulk Edit** link on the right of the Postman window and paste the following code.
-
-    ```console
- Content-Type:application/x-www-form-urlencoded
- Keep-Alive:true
- ```
-
-6. Press the **Body** tab.
-7. Enter the body information using the "Key/Value" data grid (replace the client ID and secret values).
-
- ![Data Grid](./media/connect-with-rest/data-grid.png)
-
- Alternatively, click **Bulk Edit** on the right of the Postman window and paste the following body (replace the client ID and secret values):
-
-    ```console
- grant_type:client_credentials
- client_id:{Your Client ID that you got from your Azure AD Application}
- client_secret:{Your client secret that you got from your Azure AD Application's Keys}
- resource:https://rest.media.azure.net
- ```
-
-8. Press **Send**.
-
-    ![Screenshot that shows the "Post" text box, "Headers" and "Body" tabs, and "access_token" highlighted, and the "Send" button selected.](./media/connect-with-rest/connect-with-rest04.png)
-
-The returned response contains the **access token** that you need to use to access any AMS APIs.
-
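-If you prefer to get the token in code, the following minimal C# sketch sends the same request with **HttpClient**. The tenant name, client ID, and client secret values are placeholders for the data points you collected earlier.
-
-```csharp
-using System;
-using System.Collections.Generic;
-using System.Net.Http;
-using System.Threading.Tasks;
-
-class TokenExample
-{
-    static async Task Main()
-    {
-        // Placeholder values -- replace with your own tenant, client ID, and client secret.
-        string tenant = "your-aad-tenant-name.onmicrosoft.com";
-        var form = new FormUrlEncodedContent(new Dictionary<string, string>
-        {
-            ["grant_type"] = "client_credentials",
-            ["client_id"] = "<your client ID>",
-            ["client_secret"] = "<your client secret>",
-            ["resource"] = "https://rest.media.azure.net"
-        });
-
-        using (var client = new HttpClient())
-        {
-            // POST to the Azure AD token endpoint; the JSON response contains "access_token".
-            HttpResponseMessage response = await client.PostAsync(
-                $"https://login.microsoftonline.com/{tenant}/oauth2/token", form);
-            Console.WriteLine(await response.Content.ReadAsStringAsync());
-        }
-    }
-}
-```
-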
-## Test the **Assets** API using the access token
-
-This section shows how to access the **Assets** API using **Postman**.
-
-1. Open **Postman**.
-2. Select **GET**.
-3. Paste the REST API endpoint (for example, https://amshelloworld.restv2.westus.media.azure.net/api/Assets)
-4. Select the **Authorization** tab.
-5. Select **Bearer Token**.
-6. Paste the token that was created in the previous section.
-
- ![get token](./media/connect-with-rest/connect-with-rest05.png)
-
- > [!NOTE]
-    > The Postman UX can differ between Mac and PC. If the Mac version does not have the "Bearer Token" option in the **Authorization** dropdown, add the **Authorization** header manually on the Mac client.
-
- ![Auth header](./media/connect-with-rest/auth-header.png)
-
-7. Select **Headers**.
-8. Click the **Bulk Edit** link on the right of the Postman window.
-9. Paste the following headers:
-
-    ```console
- x-ms-version:2.19
- Accept:application/json
- Content-Type:application/json
- DataServiceVersion:3.0
- MaxDataServiceVersion:3.0
- ```
-
-10. Press **Send**.
-
-The returned response contains the assets that are in your account.
-
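-The same test can be run in code. The following is a minimal C# sketch of the call, assuming you substitute your own REST API endpoint and the access token acquired in the previous section:
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Net.Http.Headers;
-using System.Threading.Tasks;
-
-class AssetsExample
-{
-    static async Task Main()
-    {
-        string apiEndpoint = "https://amshelloworld.restv2.westus.media.azure.net/api/"; // your endpoint
-        string accessToken = "<your access token>";
-
-        using (var client = new HttpClient())
-        {
-            client.DefaultRequestHeaders.Authorization =
-                new AuthenticationHeaderValue("Bearer", accessToken);
-            client.DefaultRequestHeaders.Add("x-ms-version", "2.19");
-            client.DefaultRequestHeaders.Add("DataServiceVersion", "3.0");
-            client.DefaultRequestHeaders.Add("MaxDataServiceVersion", "3.0");
-            client.DefaultRequestHeaders.Accept.Add(
-                new MediaTypeWithQualityHeaderValue("application/json"));
-
-            // GET the Assets collection; the response lists the assets in your account.
-            Console.WriteLine(await client.GetStringAsync(apiEndpoint + "Assets"));
-        }
-    }
-}
-```
-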
-## Next steps
-
-* Try this sample code in [Azure AD Authentication for Azure Media Services Access: Both via REST API](https://github.com/willzhan/WAMSRESTSoln)
-* [Upload files with .NET](media-services-dotnet-upload-files.md)
media-services Media Services Rest Create Contentkey https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-create-contentkey.md
- Title: Create content keys with REST | Microsoft Docs
-description: This article demonstrates how to create content keys that provide secure access to assets.
- Previously updated: 03/10/2021
-# Create content keys with REST
--
-> [!div class="op_single_selector"]
-> * [REST](media-services-rest-create-contentkey.md)
-> * [.NET](media-services-dotnet-create-contentkey.md)
->
->
-
-Media Services enables you to deliver encrypted assets. A **ContentKey** provides secure access to your **Asset**s.
-
-When you create a new asset (for example, before you [upload files](media-services-rest-upload-files.md)), you can specify the following encryption options: **StorageEncrypted**, **CommonEncryptionProtected**, or **EnvelopeEncryptionProtected**.
-
-When you deliver assets to your clients, you can [configure the assets to be dynamically encrypted](media-services-rest-configure-asset-delivery-policy.md) with one of the following two encryption types: **DynamicEnvelopeEncryption** or **DynamicCommonEncryption**.
-
-Encrypted assets have to be associated with **ContentKey**s. This article describes how to create a content key.
-
-The following are general steps for generating content keys that you associate with assets that you want to be encrypted.
-
-1. Randomly generate a 16-byte AES key (for common and envelope encryption) or a 32-byte AES key (for storage encryption).
-
- This is the content key for your asset, which means all files associated with that asset need to use the same content key during decryption.
-2. Call the [GetProtectionKeyId](/rest/api/media/operations/rest-api-functions#getprotectionkeyid) and [GetProtectionKey](/rest/api/media/operations/rest-api-functions#getprotectionkey) methods to get the correct X.509 Certificate that must be used to encrypt your content key.
-3. Encrypt your content key with the public key of the X.509 Certificate.
-
-    The Media Services .NET SDK uses RSA with OAEP to perform the encryption. You can see an example in the [EncryptSymmetricKeyData function](https://github.com/Azure/azure-sdk-for-media-services/blob/dev/src/net/Client/Common/Common.FileEncryption/EncryptionUtils.cs).
-4. Create a checksum value (based on the PlayReady AES key checksum algorithm) calculated using the key identifier and content key. For more information, see the "PlayReady AES Key Checksum Algorithm" section of the PlayReady Header Object document located [here](https://www.microsoft.com/playready/documents/).
-
- The following .NET example calculates the checksum using the GUID part of the key identifier and the clear content key.
-
-    ```csharp
- public static string CalculateChecksum(byte[] contentKey, Guid keyId)
- {
- byte[] array = null;
- using (AesCryptoServiceProvider aesCryptoServiceProvider = new AesCryptoServiceProvider())
- {
- aesCryptoServiceProvider.Mode = CipherMode.ECB;
- aesCryptoServiceProvider.Key = contentKey;
- aesCryptoServiceProvider.Padding = PaddingMode.None;
- ICryptoTransform cryptoTransform = aesCryptoServiceProvider.CreateEncryptor();
- array = new byte[16];
- cryptoTransform.TransformBlock(keyId.ToByteArray(), 0, 16, array, 0);
- }
- byte[] array2 = new byte[8];
- Array.Copy(array, array2, 8);
- return Convert.ToBase64String(array2);
- }
- ```
-
-5. Create the Content key with the **EncryptedContentKey** (converted to base64-encoded string), **ProtectionKeyId**, **ProtectionKeyType**, **ContentKeyType**, and **Checksum** values you have received in previous steps.
-6. Associate the **ContentKey** entity with your **Asset** entity through the $links operation.
-
-This article does not show how to generate an AES key or how to encrypt it with the certificate's public key; the checksum calculation is shown in the example above.
-
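-As a starting point, the following is a minimal .NET sketch of those two steps. It assumes you have already retrieved the protection certificate (for example, with **GetProtectionKey**) and loaded it into an **X509Certificate2**:
-
-```csharp
-using System;
-using System.Security.Cryptography;
-using System.Security.Cryptography.X509Certificates;
-
-class ContentKeyHelper
-{
-    // Generate a random content key: 16 bytes for common and envelope encryption,
-    // 32 bytes for storage encryption.
-    public static byte[] GenerateContentKey(int sizeInBytes)
-    {
-        byte[] key = new byte[sizeInBytes];
-        using (var rng = RandomNumberGenerator.Create())
-        {
-            rng.GetBytes(key);
-        }
-        return key;
-    }
-
-    // Encrypt the content key with the certificate's RSA public key using OAEP,
-    // then base64-encode it for the EncryptedContentKey property.
-    public static string EncryptContentKey(byte[] contentKey, X509Certificate2 cert)
-    {
-        using (RSA rsa = cert.GetRSAPublicKey())
-        {
-            byte[] encrypted = rsa.Encrypt(contentKey, RSAEncryptionPadding.OaepSHA1);
-            return Convert.ToBase64String(encrypted);
-        }
-    }
-}
-```
-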
-> [!NOTE]
->
-> When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
-
-## Connect to Media Services
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-## Retrieve the ProtectionKeyId
-The following example shows how to retrieve the ProtectionKeyId (a certificate thumbprint) for the certificate that you must use when encrypting your content key. Do this step to make sure that you already have the appropriate certificate on your machine.
-
-Request:
-
-```console
-GET https://media.windows.net/api/GetProtectionKeyId?contentKeyType=0 HTTP/1.1
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-```
-
-Response:
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 139
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Server: Microsoft-IIS/8.5
-request-id: 2b6aa7a4-3a09-4b08-b581-26b55667f817
-x-ms-request-id: 2b6aa7a4-3a09-4b08-b581-26b55667f817
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 04 Feb 2015 02:42:52 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Edm.String","value":"7D9BB04D9D0A4A24800CADBFEF232689E048F69C"}
-```
-
-## Retrieve the ProtectionKey for the ProtectionKeyId
-The following example shows how to retrieve the X.509 certificate using the ProtectionKeyId you received in the previous step.
-
-Request:
-
-```console
-GET https://media.windows.net/api/GetProtectionKey?ProtectionKeyId='7D9BB04D9D0A4A24800CADBFEF232689E048F69C' HTTP/1.1
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 78d1247a-58d7-40e5-96cc-70ff0dfa7382
-Host: media.windows.net
-```
--
-Response:
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 1227
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: 78d1247a-58d7-40e5-96cc-70ff0dfa7382
-request-id: 1523e8f3-8ed2-40fe-8a9a-5d81eb572cc8
-x-ms-request-id: 1523e8f3-8ed2-40fe-8a9a-5d81eb572cc8
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Thu, 05 Feb 2015 07:52:30 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Edm.String", "value":"MIIDSTCCAjGgAwIBAgIQqf92wku/HLJGCbMAU8GEnDANBgkqhkiG9w0BAQQFADAuMSwwKgYDVQQDEyN3YW1zYmx1cmVnMDAxZW5jcnlwdGFsbHNlY3JldHMtY2VydDAeFw0xMjA1MjkwNzAwMDBaFw0zMjA1MjkwNzAwMDBaMC4xLDAqBgNVBAMTI3dhbXNibHVyZWcwMDFlbmNyeXB0YWxsc2VjcmV0cy1jZXJ0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAzR0SEbXefvUjb9wCUfkEiKtGQ5Gc328qFPrhMjSo+YHe0AVviZ9YaxPPb0m1AaaRV4dqWpST2+JtDhLOmGpWmmA60tbATJDdmRzKi2eYAyhhE76MgJgL3myCQLP42jDusWXWSMabui3/tMDQs+zfi1sJ4Ch/lm5EvksYsu6o8sCv29VRwxfDLJPBy2NlbV4GbWz5Qxp2tAmHoROnfaRhwp6WIbquk69tEtu2U50CpPN2goLAqx2PpXAqA+prxCZYGTHqfmFJEKtZHhizVBTFPGS3ncfnQC9QIEwFbPw6E5PO5yNaB68radWsp5uvDg33G1i8IT39GstMW6zaaG7cNQIDAQABo2MwYTBfBgNVHQEEWDBWgBCOGT2hPhsvQioZimw8M+jOoTAwLjEsMCoGA1UEAxMjd2Ftc2JsdXJlZzAwMWVuY3J5cHRhbGxzZWNyZXRzLWNlcnSCEKn/dsJLvxyyRgmzAFPBhJwwDQYJKoZIhvcNAQEEBQADggEBABcrQPma2ekNS3Wc5wGXL/aHyQaQRwFGymnUJ+VR8jVUZaC/U/f6lR98eTlwycjVwRL7D15BfClGEHw66QdHejaViJCjbEIJJ3p2c9fzBKhjLhzB3VVNiLIaH6RSI1bMPd2eddSCqhDIn3VBN605GcYXMzhYp+YA6g9+YMNeS1b+LxX3fqixMQIxSHOLFZ1G/H2xfNawv0VikH3djNui3EKT1w/8aRkUv/AAV0b3rYkP/jA1I0CPn0XFk7STYoiJ3gJoKq9EMXhit+Iwfz0sMkfhWG12/XO+TAWqsK1ZxEjuC9OzrY7pFnNxs4Mu4S8iinehduSpY+9mDd3dHynNwT4="}
-```
-
-## Create the ContentKey
-After you have retrieved the X.509 certificate and used its public key to encrypt your content key, create a **ContentKey** entity and set its property values accordingly.
-
-One of the values that you must set when creating the content key is the type. Choose one of the following values:
-
-```csharp
-public enum ContentKeyType
-{
- /// <summary>
- /// Specifies a content key for common encryption.
- /// </summary>
- /// <remarks>This is the default value.</remarks>
- CommonEncryption = 0,
-
- /// <summary>
- /// Specifies a content key for storage encryption.
- /// </summary>
- StorageEncryption = 1,
-
- /// <summary>
- /// Specifies a content key for configuration encryption.
- /// </summary>
- ConfigurationEncryption = 2,
-
- /// <summary>
- /// Specifies a content key for Envelope encryption. Only used internally.
- /// </summary>
- EnvelopeEncryption = 4
- }
-```
-
-The following example shows how to create a **ContentKey** with a **ContentKeyType** set for storage encryption ("1") and the **ProtectionKeyType** set to "0" to indicate that the protection key ID is the X.509 certificate thumbprint.
-
-Request:
-
-```console
-POST https://media.windows.net/api/ContentKeys HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-
-{
-"Name":"ContentKey",
-"ProtectionKeyId":"7D9BB04D9D0A4A24800CADBFEF232689E048F69C",
-"ContentKeyType":"1",
-"ProtectionKeyType":"0",
-"EncryptedContentKey":"your encrypted content key",
-"Checksum":"calculated checksum"
-}
-```
-
-Response:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 777
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://media.windows.net/api/ContentKeys('nb%3Akid%3AUUID%3A9c8ea9c6-52bd-4232-8a43-8e43d8564a99')
-Server: Microsoft-IIS/8.5
-request-id: 76e85e0f-5cf1-44cb-b689-b3455888682c
-x-ms-request-id: 76e85e0f-5cf1-44cb-b689-b3455888682c
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 04 Feb 2015 02:37:46 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#ContentKeys/@Element",
-"Id":"nb:kid:UUID:9c8ea9c6-52bd-4232-8a43-8e43d8564a99","Created":"2015-02-04T02:37:46.9684379Z",
-"LastModified":"2015-02-04T02:37:46.9684379Z",
-"ContentKeyType":1,
-"EncryptedContentKey":"your encrypted content key",
-"Name":"ContentKey",
-"ProtectionKeyId":"7D9BB04D9D0A4A24800CADBFEF232689E048F69C",
-"ProtectionKeyType":0,
-"Checksum":"calculated checksum"}
-```
-
-## Associate the ContentKey with an Asset
-After creating the ContentKey, associate it with your Asset using the $links operation, as shown in the following example:
-
-Request:
-
-```console
-POST https://media.windows.net/api/Assets('nb%3Acid%3AUUID%3Afbd7ce05-1087-401b-aaae-29f16383c801')/$links/ContentKeys HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Content-Type: application/json
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-
-{"uri":"https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeys('nb%3Akid%3AUUID%3A01e6ea36-2285-4562-91f1-82c45736047c')"}
-```
-
-Response:
-
-```console
-HTTP/1.1 204 No Content
-```
-
media-services Media Services Rest Deliver Streaming Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-deliver-streaming-content.md
- Title: Publish Azure Media Services content using REST
-description: Learn how to create a locator that is used to build a streaming URL. The code uses REST API.
- Previously updated: 03/10/2021
-# Publish Azure Media Services content using REST
--
-> [!div class="op_single_selector"]
-> * [.NET](media-services-deliver-streaming-content.md)
-> * [REST](media-services-rest-deliver-streaming-content.md)
-> * [Portal](media-services-portal-publish.md)
->
->
-
-You can stream an adaptive bitrate MP4 set by creating an OnDemand streaming locator and building a streaming URL. The [encoding an asset](media-services-rest-encode-asset.md) article shows how to encode into an adaptive bitrate MP4 set. If your content is encrypted, configure asset delivery policy (as described in [this](media-services-rest-configure-asset-delivery-policy.md) article) before creating a locator.
-
-You can also use an OnDemand streaming locator to build URLs that point to MP4 files that can be progressively downloaded.
-
-This article shows how to create an OnDemand streaming locator in order to publish your asset and build Smooth Streaming, MPEG DASH, and HLS streaming URLs. It also shows how to build progressive download URLs.
-
-The [following](#types) section shows the enum types whose values are used in the REST calls.
-
-> [!NOTE]
-> When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
->
-
-## Connect to Media Services
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
->[!NOTE]
->After successfully connecting to `https://media.windows.net`, you will receive a 301 redirect specifying another Media Services URI. You must make subsequent calls to the new URI.
-
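-One way to capture the redirected URI in code is to turn off automatic redirects and read the **Location** header. The following minimal C# sketch shows the idea; the access token is a placeholder:
-
-```csharp
-using System;
-using System.Net;
-using System.Net.Http;
-using System.Threading.Tasks;
-
-class RedirectExample
-{
-    static async Task Main()
-    {
-        // Disable auto-redirect so the 301 response (with its Location header) is visible.
-        var handler = new HttpClientHandler { AllowAutoRedirect = false };
-        using (var client = new HttpClient(handler))
-        {
-            client.DefaultRequestHeaders.Add("Authorization", "Bearer <your access token>");
-            client.DefaultRequestHeaders.Add("x-ms-version", "2.19");
-
-            HttpResponseMessage response = await client.GetAsync("https://media.windows.net/api/");
-            if (response.StatusCode == HttpStatusCode.MovedPermanently)
-            {
-                // Use this URI as the base address for all subsequent calls.
-                Console.WriteLine(response.Headers.Location);
-            }
-        }
-    }
-}
-```
-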
-## Create an OnDemand streaming locator
-To create the OnDemand streaming locator and get URLs, you need to do the following:
-
-1. If the content is encrypted, define an access policy.
-2. Create an OnDemand streaming locator.
-3. If you plan to stream, get the streaming manifest file (.ism) in the asset.
-
- If you plan to progressively download, get the names of MP4 files in the asset.
-4. Build URLs to the manifest file or MP4 files.
-
->[!NOTE]
->You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.
-
-### Create an access policy
-
->[!NOTE]
->There is a limit of 1,000,000 policies for different AMS policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). Use the same policy ID if you are always using the same days / access permissions, for example, policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) article.
-
-Request:
-
-```console
-POST https://media.windows.net/api/AccessPolicies HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 6bcfd511-a561-448d-a022-a319a89ecffa
-Host: media.windows.net
-Content-Length: 68
-
-{"Name":"access policy","DurationInMinutes":43200.0,"Permissions":1}
-```
-
-Response:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 311
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://media.windows.net/api/AccessPolicies('nb%3Apid%3AUUID%3A69c80d98-7830-407f-a9af-e25f4b0d3e5f')
-Server: Microsoft-IIS/8.5
-request-id: a877528a-bdb4-4414-9862-273f8e64f882
-x-ms-request-id: a877528a-bdb4-4414-9862-273f8e64f882
-x-ms-client-request-id: 6bcfd511-a561-448d-a022-a319a89ecffa
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 18 Feb 2015 06:52:09 GMT
-
-{"odata.metadata":"https://media.windows.net/api/$metadata#AccessPolicies/@Element","Id":"nb:pid:UUID:69c80d98-7830-407f-a9af-e25f4b0d3e5f","Created":"2015-02-18T06:52:09.8862191Z","LastModified":"2015-02-18T06:52:09.8862191Z","Name":"access policy","DurationInMinutes":43200.0,"Permissions":1}
-```
-
-### Create an OnDemand streaming locator
-Create the locator for the specified asset and asset policy.
-
-Request:
-
-```console
-POST https://media.windows.net/api/Locators HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: ac159492-9a0c-40c3-aacc-551b1b4c5f62
-Host: media.windows.net
-Content-Length: 181
-
-{"AccessPolicyId":"nb:pid:UUID:1480030d-c481-430a-9687-535c6a5cb272","AssetId":"nb:cid:UUID:cc1e445d-1500-80bd-538e-f1e4b71b465e","StartTime":"2015-02-18T06:34:47.267872Z","Type":2}
-```
-Response:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 637
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://media.windows.net/api/Locators('nb%3Alid%3AUUID%3Abe245661-2bbd-4fc6-b14f-9cf9a1492e5e')
-Server: Microsoft-IIS/8.5
-request-id: 5bd5864a-0afd-44c0-a67a-4044a2c9043b
-x-ms-request-id: 5bd5864a-0afd-44c0-a67a-4044a2c9043b
-x-ms-client-request-id: ac159492-9a0c-40c3-aacc-551b1b4c5f62
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 18 Feb 2015 06:58:37 GMT
-
-{"odata.metadata":"https://media.windows.net/api/$metadata#Locators/@Element","Id":"nb:lid:UUID:be245661-2bbd-4fc6-b14f-9cf9a1492e5e","ExpirationDateTime":"2015-03-20T06:34:47.267872+00:00","Type":2,"Path":"https://amstest1.streaming.mediaservices.windows.net/be245661-2bbd-4fc6-b14f-9cf9a1492e5e/","BaseUri":"https://amstest1.streaming.mediaservices.windows.net","ContentAccessComponent":"be245661-2bbd-4fc6-b14f-9cf9a1492e5e","AccessPolicyId":"nb:pid:UUID:1480030d-c481-430a-9687-535c6a5cb272","AssetId":"nb:cid:UUID:cc1e445d-1500-80bd-538e-f1e4b71b465e","StartTime":"2015-02-18T06:34:47.267872+00:00","Name":null}
-```
-
-### Build streaming URLs
-Use the **Path** value returned after the creation of the locator to build the Smooth, HLS, and MPEG DASH URLs.
-
-Smooth Streaming: **Path** + manifest file name + "/manifest"
-
-example:
-
-`https://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny.ism/manifest`
-
-HLS: **Path** + manifest file name + "/manifest(format=m3u8-aapl)"
-
-example:
-
-`https://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny.ism/manifest(format=m3u8-aapl)`
--
-DASH: **Path** + manifest file name + "/manifest(format=mpd-time-csf)"
-
-example:
-
-`https://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny.ism/manifest(format=mpd-time-csf)`
--
-### Build progressive download URLs
-Use the **Path** value returned after the creation of the locator to build the progressive download URL.
-
-URL: **Path** + asset file mp4 name
-
-example:
-
-`https://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/BigBuckBunny_H264_650kbps_AAC_und_ch2_96kbps.mp4`
-
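-The following minimal C# sketch shows these concatenations, using the example **Path** value and file names from above:
-
-```csharp
-// Build playback URLs from the locator's Path value and the manifest file name.
-string path = "https://amstest1.streaming.mediaservices.windows.net/3c5fe676-199c-4620-9b03-ba014900f214/";
-string manifest = "BigBuckBunny.ism";
-
-string smoothUrl = path + manifest + "/manifest";
-string hlsUrl    = path + manifest + "/manifest(format=m3u8-aapl)";
-string dashUrl   = path + manifest + "/manifest(format=mpd-time-csf)";
-
-// Progressive download: Path plus the name of an MP4 file in the asset.
-string downloadUrl = path + "BigBuckBunny_H264_650kbps_AAC_und_ch2_96kbps.mp4";
-```
-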
-## <a id="types"></a>Enum types
-
-```csharp
-[Flags]
-public enum AccessPermissions
-{
- None = 0,
- Read = 1,
- Write = 2,
- Delete = 4,
- List = 8,
-}
-
-public enum LocatorType
-{
- None = 0,
- Sas = 1,
- OnDemandOrigin = 2,
-}
-```
-
-## See also
-[Media Services operations REST API overview](media-services-rest-how-to-use.md)
-
-[Configure asset delivery policy](media-services-rest-configure-asset-delivery-policy.md)
-
media-services Media Services Rest Dynamic Manifest https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-dynamic-manifest.md
- Title: Creating Filters with Azure Media Services REST API | Microsoft Docs
-description: This topic describes how to create filters so your client can use them to stream specific sections of a stream. Media Services REST API creates dynamic manifests to achieve this selective streaming.
- Previously updated: 03/10/2021
-ms.reviewer: cenkdin
-
-# Creating Filters with Azure Media Services REST API
--
-> [!div class="op_single_selector"]
-> * [.NET](media-services-dotnet-dynamic-manifest.md)
-> * [REST](media-services-rest-dynamic-manifest.md)
->
->
-
-Starting with the 2.17 release, Media Services enables you to define filters for your assets. These filters are server-side rules that allow your customers to do things like play back only a section of a video (instead of the whole video), or specify only a subset of audio and video renditions that their device can handle (instead of all the renditions associated with the asset). This filtering of your assets is achieved through **Dynamic Manifest**s, which are created upon your customer's request to stream a video based on the specified filter(s).
-
-For more detailed information related to filters and Dynamic Manifest, see [Dynamic manifests overview](media-services-dynamic-manifest-overview.md).
-
-This article shows how to use REST APIs to create, update, and delete filters.
-
-## Types used to create filters
-The following types are used when creating filters:
-
-* [Filter](/rest/api/media/operations/filter)
-* [AssetFilter](/rest/api/media/operations/assetfilter)
-* [PresentationTimeRange](/rest/api/media/operations/presentationtimerange)
-* [FilterTrackSelect and FilterTrackPropertyCondition](/rest/api/media/operations/filtertrackselect)
-
-> [!NOTE]
->
-> When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
-
-## Connect to Media Services
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-## Create filters
-### Create global Filters
-To create a global Filter, use the following HTTP requests:
-
-#### HTTP Request
-Request Headers
-
-```console
-POST https://media.windows.net/API/Filters HTTP/1.1
-DataServiceVersion:3.0
-MaxDataServiceVersion: 3.0
-Content-Type: application/json
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-Host:media.windows.net
-```
-
-Request body
-
-```console
-{
- "Name":"GlobalFilter",
- "PresentationTimeRange":{
- "StartTimestamp":"0",
- "EndTimestamp":"9223372036854775807",
- "PresentationWindowDuration":"12000000000",
- "LiveBackoffDuration":"0",
- "Timescale":"10000000"
- },
- "Tracks":[
- {
- "PropertyConditions":
- [
- {
- "Property":"Type",
- "Value":"audio",
- "Operator":"Equal"
- },
- {
- "Property":"Bitrate",
- "Value":"0-2147483647",
- "Operator":"Equal"
- }
- ]
- }
- ]
-}
-```
---
-#### HTTP Response
-
-```console
-HTTP/1.1 201 Created
-```
-
-### Create local AssetFilters
-To create a local AssetFilter, use the following HTTP requests:
-
-#### HTTP Request
-Request Headers
-
-```console
-POST https://media.windows.net/API/AssetFilters HTTP/1.1
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Content-Type: application/json
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-Host: media.windows.net
-```
-
-Request body
-
-```console
-{
- "Name":"AssetFilter",
- "ParentAssetId":"nb:cid:UUID:536e555d-1500-80c3-92dc-f1e4fdc6c592",
- "PresentationTimeRange":{
- "StartTimestamp":"0",
- "EndTimestamp":"9223372036854775807",
- "PresentationWindowDuration":"12000000000",
- "LiveBackoffDuration":"0",
- "Timescale":"10000000"
- },
- "Tracks":[
- {
- "PropertyConditions":
- [
- {
- "Property":"Type",
- "Value":"audio",
- "Operator":"Equal"
- },
- {
- "Property":"Bitrate",
- "Value":"0-2147483647",
- "Operator":"Equal"
- }
- ]
- }
- ]
-}
-```
-
-#### HTTP Response
-
-```console
-HTTP/1.1 201 Created
-. . .
-```
-
-## List filters
-### Get all global **Filter**s in the AMS account
-To list filters, use the following HTTP requests:
-
-#### HTTP Request
-
-```console
-GET https://media.windows.net/API/Filters HTTP/1.1
-DataServiceVersion:3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-```
-
-### Get **AssetFilter**s associated with an asset
-#### HTTP Request
-
-```console
-GET https://media.windows.net/API/Assets('nb%3Acid%3AUUID%3A536e555d-1500-80c3-92dc-f1e4fdc6c592')/AssetFilters HTTP/1.1
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-Host: media.windows.net
-```
-
-### Get an **AssetFilter** based on its Id
-#### HTTP Request
-
-```console
-GET https://media.windows.net/API/AssetFilters('nb%3Acid%3AUUID%3A536e555d-1500-80c3-92dc-f1e4fdc6c592__%23%23%23__TestFilter') HTTP/1.1
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 00000000
-```
-
-## Update filters
-Use PATCH, PUT, or MERGE to update a filter with new property values. For more information about these operations, see [PATCH, PUT, MERGE](/openspecs/windows_protocols/ms-odata/59d5abd3-7b12-490a-a0e2-9d9324b91893).
-
-If you update a filter, it can take up to two minutes for the streaming endpoint to refresh the rules. If the content was served using this filter (and cached in proxies and CDN caches), updating the filter can result in player failures. Clear the cache after updating the filter. If clearing the cache is not an option, consider using a different filter.
-
-### Update global Filters
-To update a global filter, use the following HTTP requests:
-
-#### HTTP Request
-Request headers:
-
-```console
-MERGE https://media.windows.net/API/Filters('filterName') HTTP/1.1
-DataServiceVersion:3.0
-MaxDataServiceVersion: 3.0
-Content-Type: application/json
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-Host: media.windows.net
-Content-Length: 384
-```
-
-Request body:
-
-```console
-{
- "Tracks":[
- {
- "PropertyConditions":
- [
- {
- "Property":"Type",
- "Value":"audio",
- "Operator":"Equal"
- },
- {
- "Property":"Bitrate",
- "Value":"0-2147483647",
- "Operator":"Equal"
- }
- ]
- }
- ]
-}
-```
-
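-Because **MERGE** is not a built-in verb in most HTTP clients, you typically construct the request explicitly. The following minimal C# sketch sends the global filter update shown above; the filter name and access token are placeholders:
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Text;
-using System.Threading.Tasks;
-
-class UpdateFilterExample
-{
-    static async Task Main()
-    {
-        string body = @"{""Tracks"":[{""PropertyConditions"":[
-            {""Property"":""Type"",""Value"":""audio"",""Operator"":""Equal""},
-            {""Property"":""Bitrate"",""Value"":""0-2147483647"",""Operator"":""Equal""}]}]}";
-
-        // MERGE is a custom verb, so build the HttpRequestMessage by hand.
-        var request = new HttpRequestMessage(
-            new HttpMethod("MERGE"),
-            "https://media.windows.net/API/Filters('filterName')")
-        {
-            Content = new StringContent(body, Encoding.UTF8, "application/json")
-        };
-        request.Headers.Add("Authorization", "Bearer <your access token>");
-        request.Headers.Add("x-ms-version", "2.19");
-        request.Headers.Add("DataServiceVersion", "3.0");
-        request.Headers.Add("MaxDataServiceVersion", "3.0");
-
-        using (var client = new HttpClient())
-        {
-            HttpResponseMessage response = await client.SendAsync(request);
-            Console.WriteLine(response.StatusCode); // expect NoContent or OK on success
-        }
-    }
-}
-```
-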
-### Update local AssetFilters
-To update a local filter, use the following HTTP requests:
-
-#### HTTP Request
-Request headers:
-
-```console
-MERGE https://media.windows.net/API/AssetFilters('nb%3Acid%3AUUID%3A536e555d-1500-80c3-92dc-f1e4fdc6c592__%23%23%23__TestFilter') HTTP/1.1
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Content-Type: application/json
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-Host: media.windows.net
-```
-
-Request body:
-
-```console
-{
- "Tracks":[
- {
- "PropertyConditions":
- [
- {
- "Property":"Type",
- "Value":"audio",
- "Operator":"Equal"
- },
- {
- "Property":"Bitrate",
- "Value":"0-2147483647",
- "Operator":"Equal"
- }
- ]
- }
- ]
-}
-```
-
-## Delete filters
-### Delete global Filters
-To delete a global Filter, use the following HTTP requests:
-
-#### HTTP Request
-
-```console
-DELETE https://media.windows.net/api/Filters('GlobalFilter') HTTP/1.1
-DataServiceVersion:3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-```
-
-### Delete local AssetFilters
-To delete a local AssetFilter, use the following HTTP requests:
-
-#### HTTP Request
-
-```console
-DELETE https://media.windows.net/API/AssetFilters('nb%3Acid%3AUUID%3A536e555d-1500-80c3-92dc-f1e4fdc6c592__%23%23%23__LocalFilter') HTTP/1.1
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-```
-
-## Build streaming URLs that use filters
-For information on how to publish and deliver your assets, see [Delivering Content to Customers Overview](media-services-deliver-content-overview.md).
-
-The following examples show how to add filters to your streaming URLs.
-
-**MPEG DASH**
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf, filter=MyFilter)`
-
-**Apple HTTP Live Streaming (HLS) V4**
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=m3u8-aapl, filter=MyFilter)`
-
-**Apple HTTP Live Streaming (HLS) V3**
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=m3u8-aapl-v3, filter=MyFilter)`
-
-**Smooth Streaming**
-
-`http://testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(filter=MyFilter)`
-
-
-## See also
-[Dynamic manifests overview](media-services-dynamic-manifest-overview.md)
media-services Media Services Rest Encode Asset https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-encode-asset.md
- Title: How to encode an Azure asset by using Media Encoder Standard | Microsoft Docs
-description: Learn how to use Media Encoder Standard to encode media content on Azure Media Services. Code samples use REST API.
- Previously updated: 03/10/2021
-# How to encode an asset by using Media Encoder Standard
--
-> [!div class="op_single_selector"]
-> * [.NET](media-services-dotnet-encode-with-media-encoder-standard.md)
-> * [REST](media-services-rest-encode-asset.md)
-> * [Portal](media-services-portal-encode.md)
->
->
-
-## Overview
-
-To deliver digital video over the Internet, you must compress the media. Digital video files are large and may be too large to deliver over the Internet, or for your customers' devices to display properly. Encoding is the process of compressing video and audio so your customers can view your media.
-
-Encoding jobs are one of the most common processing operations in Azure Media Services. You create encoding jobs to convert media files from one encoding to another. When you encode, you can use the Media Services built-in encoder (Media Encoder Standard). You can also use an encoder provided by a Media Services partner. Third-party encoders are available through the Azure Marketplace. You can specify the details of encoding tasks by using preset strings defined for your encoder, or by using preset configuration files. To see the types of presets that are available, see [Task Presets for Media Encoder Standard](./media-services-mes-presets-overview.md).
-
-Each job can have one or more tasks depending on the type of processing that you want to accomplish. Through the REST API, you can create jobs and their related tasks in one of two ways:
-
-* Tasks can be defined inline through the Tasks navigation property on Job entities.
-* Use OData batch processing.
-
-We recommend that you always encode your source files into an adaptive bitrate MP4 set, and then convert the set to the desired format by using [dynamic packaging](media-services-dynamic-packaging-overview.md).
-
-If your output asset is storage encrypted, you must configure the asset delivery policy. For more information, see [Configuring asset delivery policy](media-services-rest-configure-asset-delivery-policy.md).
-
-## Considerations
-
-When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
-
-Before you start referencing media processors, verify that you have the correct media processor ID. For more information, see [Get media processors](media-services-rest-get-media-processor.md).
-
-## Connect to Media Services
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-## Create a job with a single encoding task
-
-> [!NOTE]
-> When you're working with the Media Services REST API, the following considerations apply:
->
-> When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API development](media-services-rest-how-to-use.md).
->
-> When using JSON and specifying to use the **__metadata** keyword in the request (for example, to reference a linked object), you must set the **Accept** header to [JSON Verbose format](https://www.odata.org/documentation/odata-version-3-0/json-verbose-format/): Accept: application/json;odata=verbose.
->
->
-
-The following example shows you how to create and post a job with one task set to encode a video at a specific resolution and quality. When you encode with Media Encoder Standard, you can use task configuration presets specified [here](./media-services-mes-presets-overview.md).
-
-Request:
-
-```console
-POST https://media.windows.net/API/Jobs HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-Host: media.windows.net
-
-{"Name" : "NewTestJob", "InputMediaAssets" : [{"__metadata" : {"uri" : "https://media.windows.net/api/Assets('nb%3Acid%3AUUID%3Aaab7f15b-3136-4ddf-9962-e9ecb28fb9d2')"}}], "Tasks" : [{"Configuration" : "Adaptive Streaming", "MediaProcessorId" : "nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56", "TaskBody" : "<?xml version=\"1.0\" encoding=\"utf-8\"?><taskBody><inputAsset>JobInputAsset(0)</inputAsset><outputAsset>JobOutputAsset(0)</outputAsset></taskBody>"}]}
-```
-
-Response:
-
-```console
-HTTP/1.1 201 Created
-
-. . .
-```
-
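-For reference, the following minimal C# sketch posts the same kind of single-task job in code. The input asset URI, media processor ID, and access token are placeholders:
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Net.Http.Headers;
-using System.Text;
-using System.Threading.Tasks;
-
-class CreateJobExample
-{
-    static async Task Main()
-    {
-        // Same JSON shape as the request above; __metadata requires verbose OData JSON.
-        string jobJson = @"{
-          ""Name"" : ""NewTestJob"",
-          ""InputMediaAssets"" : [{ ""__metadata"" : { ""uri"" : ""<your input asset URI>"" } }],
-          ""Tasks"" : [{ ""Configuration"" : ""Adaptive Streaming"",
-                         ""MediaProcessorId"" : ""<your media processor ID>"",
-                         ""TaskBody"" : ""<?xml version=\""1.0\"" encoding=\""utf-8\""?><taskBody><inputAsset>JobInputAsset(0)</inputAsset><outputAsset>JobOutputAsset(0)</outputAsset></taskBody>"" }]
-        }";
-
-        var content = new StringContent(jobJson, Encoding.UTF8);
-        content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/json;odata=verbose");
-
-        using (var client = new HttpClient())
-        {
-            client.DefaultRequestHeaders.Add("Authorization", "Bearer <your access token>");
-            client.DefaultRequestHeaders.Add("x-ms-version", "2.19");
-            client.DefaultRequestHeaders.Add("DataServiceVersion", "3.0");
-            client.DefaultRequestHeaders.Add("MaxDataServiceVersion", "3.0");
-            client.DefaultRequestHeaders.TryAddWithoutValidation("Accept", "application/json;odata=verbose");
-
-            HttpResponseMessage response = await client.PostAsync("https://media.windows.net/API/Jobs", content);
-            Console.WriteLine(response.StatusCode); // expect 201 Created
-        }
-    }
-}
-```
-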
-### Set the output asset's name
-The following example shows how to set the assetName attribute:
-
-```console
-{ "TaskBody" : "<?xml version=\"1.0\" encoding=\"utf-8\"?><taskBody><inputAsset>JobInputAsset(0)</inputAsset><outputAsset assetName=\"CustomOutputAssetName\">JobOutputAsset(0)</outputAsset></taskBody>"}
-```
-
-### Considerations
-* TaskBody properties must use literal XML to define the number of input or output assets that are used by the task. The task article contains the XML Schema Definition for the XML.
-* In the TaskBody definition, each inner value for `<inputAsset>` and `<outputAsset>` must be set as JobInputAsset(value) or JobOutputAsset(value).
-* A task can have multiple output assets. One JobOutputAsset(x) can only be used once as an output of a task in a job.
-* You can specify JobInputAsset or JobOutputAsset as an input asset of a task.
-* Tasks must not form a cycle.
-* The value parameter that you pass to JobInputAsset or JobOutputAsset represents the index value for an asset. The actual assets are defined in the InputMediaAssets and OutputMediaAssets navigation properties on the job entity definition.
-* Because Media Services is built on OData v3, the individual assets in the InputMediaAssets and OutputMediaAssets navigation property collections are referenced through a "__metadata: uri" name-value pair.
-* InputMediaAssets maps to one or more assets that you created in Media Services. OutputMediaAssets are created by the system. They don't reference an existing asset.
-* OutputMediaAssets can be named by using the assetName attribute. If this attribute is not present, then the name of the OutputMediaAsset is whatever the inner text value of the `<outputAsset>` element is with a suffix of either the Job Name value, or the Job Id value (in the case where the Name property isn't defined). For example, if you set a value for assetName to "Sample," then the OutputMediaAsset Name property is set to "Sample." However, if you didn't set a value for assetName, but did set the job name to "NewJob," then the OutputMediaAsset Name would be "JobOutputAsset(value)_NewJob."
-
-## Create a job with chained tasks
-In many application scenarios, developers want to create a series of processing tasks. In Media Services, you can create a series of chained tasks. Each task performs different processing steps and can use different media processors. The chained tasks can hand off an asset from one task to another, performing a linear sequence of tasks on the asset. However, the tasks performed in a job are not required to be in a sequence. When you create a chained task, the chained **ITask** objects are created in a single **IJob** object.
-
-> [!NOTE]
-> There is currently a limit of 30 tasks per job. If you need to chain more than 30 tasks, create more than one job to contain the tasks.
->
->
-
-```console
-POST https://media.windows.net/api/Jobs HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-
-{
- "Name":"NewTestJob",
- "InputMediaAssets":[
- {
- "__metadata":{
- "uri":"https://testrest.cloudapp.net/api/Assets('nb%3Acid%3AUUID%3A910ffdc1-2e25-4b17-8a42-61ffd4b8914c')"
- }
- }
- ],
- "Tasks":[
- {
- "Configuration":"H264 Adaptive Bitrate MP4 Set 720p",
- "MediaProcessorId":"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56",
- "TaskBody":"<?xml version=\"1.0\" encoding=\"utf-8\"?><taskBody><inputAsset>JobInputAsset(0)</inputAsset><outputAsset>JobOutputAsset(0)</outputAsset></taskBody>"
- },
- {
- "Configuration":"H264 Smooth Streaming 720p",
- "MediaProcessorId":"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56",
- "TaskBody":"<?xml version=\"1.0\" encoding=\"utf-16\"?><taskBody><inputAsset>JobOutputAsset(0)</inputAsset><outputAsset>JobOutputAsset(1)</outputAsset></taskBody>"
- }
- ]
-}
-```
-
-### Considerations
-To enable task chaining:
-
-* A job must have at least two tasks.
-* There must be at least one task whose input is the output of another task in the job.
-
-## Use OData batch processing
-The following example shows how to use OData batch processing to create a job and tasks. For information on batch processing, see [Open Data Protocol (OData) Batch Processing](https://www.odata.org/documentation/odata-version-3-0/batch-processing/).
-
-```console
-POST https://media.windows.net/api/$batch HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Content-Type: multipart/mixed; boundary=batch_a01a5ec4-ba0f-4536-84b5-66c5a5a6d34e
-Accept: multipart/mixed
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-Host: media.windows.net
---batch_a01a5ec4-ba0f-4536-84b5-66c5a5a6d34e
-Content-Type: multipart/mixed; boundary=changeset_122fb0a4-cd80-4958-820f-346309967e4d
---changeset_122fb0a4-cd80-4958-820f-346309967e4d
-Content-Type: application/http
-Content-Transfer-Encoding: binary
-
-POST https://media.windows.net/api/Jobs HTTP/1.1
-Content-ID: 1
-Content-Type: application/json
-Accept: application/json
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-
-{"Name" : "NewTestJob", "InputMediaAssets@odata.bind":["https://media.windows.net/api/Assets('nb%3Acid%3AUUID%3A2a22445d-1500-80c6-4b34-f1e5190d33c6')"]}
---changeset_122fb0a4-cd80-4958-820f-346309967e4d
-Content-Type: application/http
-Content-Transfer-Encoding: binary
-
-POST https://media.windows.net/api/$1/Tasks HTTP/1.1
-Content-ID: 2
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 00000000-0000-0000-0000-000000000000
-
-{
- "Configuration":"H264 Adaptive Bitrate MP4 Set 720p",
- "MediaProcessorId":"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56",
- "TaskBody":"<?xml version=\"1.0\" encoding=\"utf-8\"?><taskBody><inputAsset>JobInputAsset(0)</inputAsset><outputAsset assetName=\"Custom output name\">JobOutputAsset(0)</outputAsset></taskBody>"
-}
---changeset_122fb0a4-cd80-4958-820f-346309967e4d--
---batch_a01a5ec4-ba0f-4536-84b5-66c5a5a6d34e--
-```
--
-## Create a job by using a JobTemplate
-When you process multiple assets by using a common set of tasks, use a JobTemplate to specify the default task presets, or to set the order of tasks.
-
-The following example shows how to create a JobTemplate with a TaskTemplate that is defined inline. The TaskTemplate uses the Media Encoder Standard as the MediaProcessor to encode the asset file. However, other MediaProcessors can be used as well.
-
-```console
-POST https://media.windows.net/API/JobTemplates HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
--
-{"Name" : "NewJobTemplate25", "JobTemplateBody" : "<?xml version=\"1.0\" encoding=\"utf-8\"?><jobTemplate><taskBody taskTemplateId=\"nb:ttid:UUID:071370A3-E63E-4E81-A099-AD66BCAC3789\"><inputAsset>JobInputAsset(0)</inputAsset><outputAsset>JobOutputAsset(0)</outputAsset></taskBody></jobTemplate>", "TaskTemplates" : [{"Id" : "nb:ttid:UUID:071370A3-E63E-4E81-A099-AD66BCAC3789", "Configuration" : "H264 Smooth Streaming 720p", "MediaProcessorId" : "nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56", "Name" : "SampleTaskTemplate2", "NumberofInputAssets" : 1, "NumberofOutputAssets" : 1}] }
-```
-
-> [!NOTE]
-> Unlike other Media Services entities, you must define a new GUID identifier for each TaskTemplate and place it in the taskTemplateId and Id property in your request body. The content identification scheme must follow the scheme described in Identify Azure Media Services Entities. Also, JobTemplates cannot be updated. Instead, you must create a new one with your updated changes.
->
->
-
-If successful, the following response is returned:
-
-```console
-HTTP/1.1 201 Created
-
-. . .
-```
-
-The following example shows how to create a job that references a JobTemplate Id:
-
-```console
-POST https://media.windows.net/API/Jobs HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
--
-{"Name" : "NewTestJob", "InputMediaAssets" : [{"__metadata" : {"uri" : "https://media.windows.net/api/Assets('nb%3Acid%3AUUID%3A3f1fe4a2-68f5-4190-9557-cd45beccef92')"}}], "TemplateId" : "nb:jtid:UUID:15e6e5e6-ac85-084e-9dc2-db3645fbf0aa"}
-```
-
-If successful, the following response is returned:
-
-```console
-HTTP/1.1 201 Created
-
-. . .
-```
--
-## Advanced Encoding Features to explore
-* [How to generate thumbnails](media-services-dotnet-generate-thumbnail-with-mes.md)
-* [Generating thumbnails during encoding](media-services-dotnet-generate-thumbnail-with-mes.md#example-of-generating-a-thumbnail-while-encoding)
-* [Crop videos during encoding](media-services-crop-video.md)
-* [Customizing encoding presets](media-services-custom-mes-presets-with-dotnet.md)
-* [Overlay or watermark a video with an image](media-services-advanced-encoding-with-mes.md#overlay)
-
-## Next steps
-Now that you know how to create a job to encode an asset, see [How to check job progress with Media Services](media-services-rest-check-job-progress.md).
-
-## See also
-[Get Media Processors](media-services-rest-get-media-processor.md)
media-services Media Services Rest Get Media Processor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-get-media-processor.md
- Title: How to get a Media Processor instance using REST | Microsoft Docs
-description: Learn how to create a media processor component to encode, convert format, encrypt, or decrypt media content for Azure Media Services.
- Previously updated: 03/10/2021
-# How to get a Media Processor instance
--
-> [!div class="op_single_selector"]
-> * [.NET](media-services-get-media-processor.md)
-> * [REST](media-services-rest-get-media-processor.md)
--
-## Overview
-
-A media processor is a component that handles a specific video or audio processing task, such as encoding, format conversion, encrypting, or decrypting media content. All tasks submitted to Media Services require a media processor to encode, encrypt, or convert the video or audio content.
-
-## Azure media processors
-
-The following articles provide lists of media processors:
-
-* [Encoding media processors](scenarios-and-availability.md)
-* [Analytics media processors](scenarios-and-availability.md)
-
->[!NOTE]
->When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
-
-## Connect to Media Services
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
--
-## Get a media processor
-
-The following REST call shows how to get a media processor instance by name (in this case, **Media Encoder Standard**).
-
-Request:
-
-```console
-GET https://media.windows.net/api/MediaProcessors()?$filter=Name%20eq%20'Media%20Encoder%20Standard' HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <token>
-x-ms-version: 2.19
-Host: media.windows.net
-```
-
-Response:
-
-```console
-. . .
-
-{
- "odata.metadata":"https://media.windows.net/api/$metadata#MediaProcessors",
- "value":[
- {
- "Id":"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56",
- "Description":"Media Encoder Standard",
- "Name":"Media Encoder Standard",
- "Sku":"",
- "Vendor":"Microsoft",
- "Version":"1.1"
- }
- ]
-}
-```
--
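-The following minimal C# sketch performs the same lookup and extracts the processor Id from the response. It uses **System.Text.Json** for parsing (adjust for your preferred JSON library); the access token is a placeholder:
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Text.Json;
-using System.Threading.Tasks;
-
-class GetProcessorExample
-{
-    static async Task Main()
-    {
-        string url = "https://media.windows.net/api/MediaProcessors()" +
-                     "?$filter=Name%20eq%20'Media%20Encoder%20Standard'";
-
-        using (var client = new HttpClient())
-        {
-            client.DefaultRequestHeaders.Add("Authorization", "Bearer <your access token>");
-            client.DefaultRequestHeaders.Add("x-ms-version", "2.19");
-            client.DefaultRequestHeaders.Add("Accept", "application/json");
-
-            string json = await client.GetStringAsync(url);
-
-            // The response wraps results in a "value" array; take the first match's Id.
-            using (JsonDocument doc = JsonDocument.Parse(json))
-            {
-                string processorId = doc.RootElement.GetProperty("value")[0]
-                                        .GetProperty("Id").GetString();
-                Console.WriteLine(processorId); // e.g. nb:mpid:UUID:ff4df607-...
-            }
-        }
-    }
-}
-```
-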
-## Next steps
-Now that you know how to get a media processor instance, go to the [How to Encode an Asset](media-services-rest-get-started.md) article, which demonstrates how to use Media Encoder Standard to encode an asset.
-
media-services Media Services Rest Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-get-started.md
- Title: Get started with delivering content on demand using REST | Microsoft Docs
-description: This tutorial walks you through the steps of implementing an on demand content delivery application with Azure Media Services using REST API.
- Previously updated: 03/10/2021
-# Get started with delivering content on demand using REST
---
-This quickstart walks you through the steps of implementing a Video-on-Demand (VoD) content delivery application using Azure Media Services (AMS) REST APIs.
-
-The tutorial introduces the basic Media Services workflow and the most common programming objects and tasks required for Media Services development. At the completion of the tutorial, you will be able to stream or progressively download a sample media file that you uploaded and encoded.
-
-The following image shows some of the most commonly used objects when developing VoD applications against the Media Services OData model.
-
-Click the image to view it full size.
-
-[![Diagram showing some of the most commonly used objects in the Azure Media Services object data model for developing Video on Demand applications.](./media/media-services-rest-get-started/media-services-overview-object-model-small.png)](./media/media-services-rest-get-started/media-services-overview-object-model.png#lightbox)
-
-## Prerequisites
-The following prerequisites are required to start developing with Media Services with REST APIs.
-
-* An Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-* A Media Services account. To create a Media Services account, see [How to Create a Media Services Account](media-services-portal-create-account.md).
-* Understanding of how to develop with Media Services REST API. For more information, see [Media Services REST API overview](media-services-rest-how-to-use.md).
-* An application of your choice that can send HTTP requests and responses. This tutorial uses [Fiddler](https://www.telerik.com/download/fiddler).
-
-The following tasks are shown in this quickstart.
-
-1. Start the streaming endpoint (using the Azure portal).
-2. Connect to the Media Services account with REST API.
-3. Create a new asset and upload a video file with REST API.
-4. Encode the source file into a set of adaptive bitrate MP4 files with REST API.
-5. Publish the asset and get streaming and progressive download URLs with REST API.
-6. Play your content.
-
->[!NOTE]
->There is a limit of 1,000,000 policies for different AMS policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). Use the same policy ID if you are always using the same days / access permissions, for example, policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) article.
-
-For details about AMS REST entities used in this article, see [Azure Media Services REST API Reference](/rest/api/medi).
-
->[!NOTE]
->When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
-
-## Start streaming endpoints using the Azure portal
-
-When working with Azure Media Services, one of the most common scenarios is delivering video via adaptive bitrate streaming. Media Services provides dynamic packaging, which allows you to deliver your adaptive bitrate MP4 encoded content in streaming formats supported by Media Services (MPEG DASH, HLS, Smooth Streaming) just-in-time, without you having to store pre-packaged versions of each of these streaming formats.
-
->[!NOTE]
->When your AMS account is created, a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-To start the streaming endpoint, do the following:
-
-1. Log in to the [Azure portal](https://portal.azure.com/).
-2. In the Settings window, click Streaming endpoints.
-3. Click the default streaming endpoint.
-
- The DEFAULT STREAMING ENDPOINT DETAILS window appears.
-
-4. Click the Start icon.
-5. Click the Save button to save your changes.
-
-## <a id="connect"></a>Connect to the Media Services account with REST API
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-## <a id="upload"></a>Create a new asset and upload a video file with REST API
-
-In Media Services, you upload your digital files into an asset. The **Asset** entity can contain video, audio, images, thumbnail collections, text tracks, and closed caption files (and the metadata about these files). Once the files are uploaded into the asset, your content is stored securely in the cloud for further processing and streaming.
-
-One of the values that you have to provide when creating an asset is asset creation options. The **Options** property is an enumeration value that describes the encryption options that an Asset can be created with. A valid value is one of the values from the list below, not a combination of values from this list:
-
-* **None** = **0** - No encryption is used. When using this option your content is not protected in transit or at rest in storage.
- If you plan to deliver an MP4 using progressive download, use this option.
-* **StorageEncrypted** = **1** - Encrypts your clear content locally using AES-256 bit encryption and then uploads it to Azure Storage where it is stored encrypted at rest. Assets protected with Storage Encryption are automatically unencrypted and placed in an encrypted file system prior to encoding, and optionally re-encrypted prior to uploading back as a new output asset. The primary use case for Storage Encryption is when you want to secure your high-quality input media files with strong encryption at rest on disk.
-* **CommonEncryptionProtected** = **2** - Use this option if you are uploading content that has already been encrypted and protected with Common Encryption or PlayReady DRM (for example, Smooth Streaming protected with PlayReady DRM).
-* **EnvelopeEncryptionProtected** = **4** - Use this option if you are uploading HLS encrypted with AES. The files must have been encoded and encrypted by Transform Manager.
-
-### Create an asset
-An asset is a container for multiple types or sets of objects in Media Services, including video, audio, images, thumbnail collections, text tracks, and closed caption files. In the REST API, creating an Asset requires sending a POST request to Media Services and placing any property information about your asset in the request body.
-
-The following example shows how to create an asset.
-
-**HTTP Request**
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/Assets HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: c59de965-bc89-4295-9a57-75d897e5221e
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 45
-
-{"Name":"BigBuckBunny.mp4", "Options":"0"}
-```
-
-**HTTP Response**
-
-If successful, the following is returned:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 452
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/Assets('nb%3Acid%3AUUID%3A9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: c59de965-bc89-4295-9a57-75d897e5221e
-request-id: e98be122-ae09-473a-8072-0ccd234a0657
-x-ms-request-id: e98be122-ae09-473a-8072-0ccd234a0657
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Sun, 18 Jan 2015 22:06:40 GMT
-
-{
-   "odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Assets/@Element",
- "Id":"nb:cid:UUID:9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1",
- "State":0,
- "Created":"2015-01-18T22:06:40.6010903Z",
- "LastModified":"2015-01-18T22:06:40.6010903Z",
- "AlternateId":null,
-   "Name":"BigBuckBunny.mp4",
- "Options":0,
- "Uri":"https://storagetestaccount001.blob.core.windows.net/asset-9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1",
- "StorageAccountName":"storagetestaccount001"
-}
-```
-
-### Create an AssetFile
-The [AssetFile](/rest/api/media/operations/assetfile) entity represents a video or audio file that is stored in a blob container. An asset file is always associated with an asset, and an asset may contain one or many AssetFiles. The Media Services Encoder task fails if an asset file object is not associated with a digital file in a blob container.
-
-After you upload your digital media file into a blob container, you use the **MERGE** HTTP request to update the AssetFile with information about your media file (as shown later in the topic).
-
-**HTTP Request**
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/Files HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 164
-
-{
- "IsEncrypted":"false",
- "IsPrimary":"false",
- "MimeType":"video/mp4",
- "Name":"BigBuckBunny.mp4",
- "ParentAssetId":"nb:cid:UUID:9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1"
-}
-```
-
-**HTTP Response**
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 535
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/Files('nb%3Acid%3AUUID%3Af13a0137-0a62-9d4c-b3b9-ca944b5142c5')
-Server: Microsoft-IIS/8.5
-request-id: 98a30e2d-f379-4495-988e-0b79edc9b80e
-x-ms-request-id: 98a30e2d-f379-4495-988e-0b79edc9b80e
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Mon, 19 Jan 2015 00:34:07 GMT
-
-{
- "odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Files/@Element",
- "Id":"nb:cid:UUID:f13a0137-0a62-9d4c-b3b9-ca944b5142c5",
- "Name":"BigBuckBunny.mp4",
- "ContentFileSize":"0",
- "ParentAssetId":"nb:cid:UUID:9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1",
- "EncryptionVersion":null,
- "EncryptionScheme":null,
- "IsEncrypted":false,
- "EncryptionKeyId":null,
- "InitializationVector":null,
- "IsPrimary":false,
- "LastModified":"2015-01-19T00:34:08.1934137Z",
- "Created":"2015-01-19T00:34:08.1934137Z",
- "MimeType":"video/mp4",
- "ContentChecksum":null
-}
-```
-
-### Creating the AccessPolicy with write permission
-Before uploading any files into blob storage, set the access policy rights for writing to an asset. To do that, POST an HTTP request to the AccessPolicies entity set. Define a DurationInMinutes value upon creation, or you'll receive a 500 Internal Server Error response. For more information on AccessPolicies, see [AccessPolicy](/rest/api/media/operations/accesspolicy).
-
-The following example shows how to create an AccessPolicy:
-
-**HTTP Request**
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/AccessPolicies HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 74
-
-{"Name":"NewUploadPolicy", "DurationInMinutes":"440", "Permissions":"2"}
-```
-
-**HTTP Response**
-
-If successful, the following response is returned:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 312
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/AccessPolicies('nb%3Apid%3AUUID%3Abe0ac48d-af7d-4877-9d60-1805d68bffae')
-Server: Microsoft-IIS/8.5
-request-id: 74c74545-7e0a-4cd6-a440-c1c48074a970
-x-ms-request-id: 74c74545-7e0a-4cd6-a440-c1c48074a970
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Sun, 18 Jan 2015 22:18:06 GMT
-
-{
- "odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#AccessPolicies/@Element",
- "Id":"nb:pid:UUID:be0ac48d-af7d-4877-9d60-1805d68bffae",
- "Created":"2015-01-18T22:18:06.6370575Z",
- "LastModified":"2015-01-18T22:18:06.6370575Z",
- "Name":"NewUploadPolicy",
- "DurationInMinutes":440.0,
- "Permissions":2
-}
-```
-
-### Get the Upload URL
-
-To receive the actual upload URL, create a SAS Locator. Locators define the start time and type of connection endpoint for clients that want to access Files in an Asset. You can create multiple Locator entities for a given AccessPolicy and Asset pair to handle different client requests and needs. Each of these Locators uses the StartTime value plus the DurationInMinutes value of the AccessPolicy to determine the length of time a URL can be used. For more information, see [Locator](/rest/api/media/operations/locator).
-
-A SAS URL has the following format:
-
-`{https://myaccount.blob.core.windows.net}/{asset name}/{video file name}?{SAS signature}`
-
-Some considerations apply:
-
-* You cannot have more than five unique Locators associated with a given Asset at one time.
-* If you need to upload your files immediately, set your StartTime value to five minutes before the current time, because there may be clock skew between your client machine and Media Services (see the sketch after this list). Also, your StartTime value must be in the following DateTime format: YYYY-MM-DDTHH:mm:ssZ (for example, "2014-05-23T17:53:50Z").
-* There may be a 30-40 second delay after a Locator is created to when it is available for use. This issue applies to both [SAS URL](../../storage/common/storage-sas-overview.md) and Origin Locators.
-
-The following example shows how to create a SAS URL Locator, as defined by the Type property in the request body ("1" for a SAS locator and "2" for an On-Demand origin locator). The **Path** property returned contains the URL that you must use to upload your file.
-
-**HTTP Request**
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/Locators HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 178
-
-{
- "AccessPolicyId":"nb:pid:UUID:be0ac48d-af7d-4877-9d60-1805d68bffae",
- "AssetId":"nb:cid:UUID:9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1",
- "StartTime":"2015-02-18T16:45:53",
- "Type":1
-}
-```
-
-**HTTP Response**
-
-If successful, the following response is returned:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 949
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/Locators('nb%3Alid%3AUUID%3Aaf57bdd8-6751-4e84-b403-f3c140444b54')
-Server: Microsoft-IIS/8.5
-request-id: 2adeb1f8-89c5-4cc8-aa4f-08cdfef33ae0
-x-ms-request-id: 2adeb1f8-89c5-4cc8-aa4f-08cdfef33ae0
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Mon, 19 Jan 2015 03:01:29 GMT
-
-{
- "odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Locators/@Element",
- "Id":"nb:lid:UUID:af57bdd8-6751-4e84-b403-f3c140444b54",
- "ExpirationDateTime":"2015-02-19T00:05:53",
- "Type":1,
- "Path":"https://storagetestaccount001.blob.core.windows.net/asset-f438649c-313c-46e2-8d68-7d2550288247?sv=2012-02-12&sr=c&si=af57bdd8-6751-4e84-b403-f3c140444b54&sig=fE4btwEfZtVQFeC0Wh3Kwks2OFPQfzl5qTMW5YytiuY%3D&st=2015-02-18T16%3A45%3A53Z&se=2015-02-19T00%3A05%3A53Z",
- "BaseUri":"https://storagetestaccount001.blob.core.windows.net/asset-f438649c-313c-46e2-8d68-7d2550288247",
- "ContentAccessComponent":"?sv=2012-02-12&sr=c&si=af57bdd8-6751-4e84-b403-f3c140444b54&sig=fE4btwEfZtVQFeC0Wh3Kwks2OFPQfzl5qTMW5YytiuY%3D&st=2015-02-18T16%3A45%3A53Z&se=2015-02-19T00%3A05%3A53Z",
- "AccessPolicyId":"nb:pid:UUID:be0ac48d-af7d-4877-9d60-1805d68bffae",
- "AssetId":"nb:cid:UUID:9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1",
- "StartTime":"2015-02-18T16:45:53",
- "Name":null
-}
-```
-
-### Upload a file into a blob storage container
-Once you have the AccessPolicy and Locator set, the actual file is uploaded to an Azure blob storage container using the Azure Storage REST APIs. You must upload the files as block blobs. Page blobs are not supported by Azure Media Services.
-
-> [!NOTE]
-> You must add the file name for the file you want to upload to the Locator **Path** value received in the previous section. For example, `https://storagetestaccount001.blob.core.windows.net/asset-e7b02da4-5a69-40e7-a8db-e8f4f697aac0/BigBuckBunny.mp4?`.
->
->
-
-For more information on working with Azure storage blobs, see [Blob Service REST API](/rest/api/storageservices/blob-service-rest-api).
-
-### Update the AssetFile
-Now that you've uploaded your file, update the AssetFile size (and other) information. For example:
-
-```console
-MERGE https://wamsbayclus001rest-hs.cloudapp.net/api/Files('nb%3Acid%3AUUID%3Af13a0137-0a62-9d4c-b3b9-ca944b5142c5') HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: wamsbayclus001rest-hs.cloudapp.net
-
-{
- "ContentFileSize":"1186540",
- "Id":"nb:cid:UUID:f13a0137-0a62-9d4c-b3b9-ca944b5142c5",
- "MimeType":"video/mp4",
- "Name":"BigBuckBunny.mp4",
- "ParentAssetId":"nb:cid:UUID:9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1"
-}
-```
-
-**HTTP Response**
-
-If successful, the following is returned:
-
-```console
-HTTP/1.1 204 No Content
-...
-```
-
-### Delete the Locator and AccessPolicy
-**HTTP Request**
-
-```console
-DELETE https://wamsbayclus001rest-hs.cloudapp.net/api/Locators('nb%3Alid%3AUUID%3Aaf57bdd8-6751-4e84-b403-f3c140444b54') HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: wamsbayclus001rest-hs.cloudapp.net
-```
-
-**HTTP Response**
-
-If successful, the following is returned:
-
-```console
-HTTP/1.1 204 No Content
-...
-```
-
-**HTTP Request**
-
-```console
-DELETE https://wamsbayclus001rest-hs.cloudapp.net/api/AccessPolicies('nb%3Apid%3AUUID%3Abe0ac48d-af7d-4877-9d60-1805d68bffae') HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: wamsbayclus001rest-hs.cloudapp.net
-```
-
-**HTTP Response**
-
-If successful, the following is returned:
-
-```console
-HTTP/1.1 204 No Content
-...
-```
-
-## <a id="encode"></a>Encode the source file into a set of adaptive bitrate MP4 files
-
-After ingesting Assets into Media Services, media can be encoded, transmuxed, watermarked, and so on, before it is delivered to clients. These activities are scheduled and run against multiple background role instances to ensure high performance and availability. These activities are called Jobs, and each Job is composed of atomic Tasks that do the actual work on the Asset file (for more information, see the [Job](/rest/api/media/operations/job) and [Task](/rest/api/media/operations/task) descriptions).
-
-As was mentioned earlier, when working with Azure Media Services one of the most common scenarios is delivering adaptive bitrate streaming to your clients. Media Services can dynamically package a set of adaptive bitrate MP4 files into one of the following formats: HTTP Live Streaming (HLS), Smooth Streaming, MPEG DASH.
-
-The following section shows how to create a job that contains one encoding task. The task specifies to transcode the mezzanine file into a set of adaptive bitrate MP4s using **Media Encoder Standard**. The section also shows how to monitor the job processing progress. When the job is complete, you can create the locators that are needed to access your assets.
-
-### Get a media processor
-In Media Services, a media processor is a component that handles a specific processing task, such as encoding, format conversion, encrypting, or decrypting media content. For the encoding task shown in this tutorial, we are going to use the Media Encoder Standard.
-
-The following code requests the encoder's ID.
-
-**HTTP Request**
-
-```console
- GET https://wamsbayclus001rest-hs.cloudapp.net/api/MediaProcessors()?$filter=Name%20eq%20'Media%20Encoder%20Standard' HTTP/1.1
- DataServiceVersion: 1.0;NetFx
- MaxDataServiceVersion: 3.0;NetFx
- Accept: application/json
- Accept-Charset: UTF-8
- Authorization: Bearer <ENCODED JWT TOKEN>
- x-ms-version: 2.19
- Host: wamsbayclus001rest-hs.cloudapp.net
-```
-
-**HTTP Response**
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 273
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Server: Microsoft-IIS/8.5
-request-id: 6beb04b4-55a7-480d-8aa8-e5c5d59ffa1f
-x-ms-request-id: 6beb04b4-55a7-480d-8aa8-e5c5d59ffa1f
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Mon, 19 Jan 2015 07:54:09 GMT
-
-{
- "odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#MediaProcessors",
- "value":[
- {
- "Id":"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56",
- "Description":"Media Encoder Standard",
- "Name":"Media Encoder Standard",
- "Sku":"",
- "Vendor":"Microsoft",
- "Version":"1.1"
- }
- ]
-}
-```
-
-### Create a job
-Each Job can have one or more Tasks depending on the type of processing that you want to accomplish. Through the REST API, you can create Jobs and their related Tasks in one of two ways: Tasks can be defined inline through the Tasks navigation property on Job entities, or through OData batch processing. The Media Services SDK uses batch processing. However, for the readability of the code examples in this article, tasks are defined inline. For information on batch processing, see [Open Data Protocol (OData) Batch Processing](https://www.odata.org/documentation/odata-version-3-0/batch-processing/).
-
-The following example shows you how to create and post a Job with one Task set to encode a video at a specific resolution and quality. For the list of all the [task presets](./media-services-mes-presets-overview.md) supported by the Media Encoder Standard processor, see the linked overview.
-
-**HTTP Request**
-
-```console
-POST https://wamsbayclus001rest-hs.cloudapp.net/api/Jobs HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: wamsbayclus001rest-hs.cloudapp.net
-Content-Length: 482
-
-{
- "Name":"NewTestJob",
- "InputMediaAssets":[
- {
- "__metadata":{
-            "uri":"https://wamsbayclus001rest-hs.cloudapp.net/api/Assets('nb%3Acid%3AUUID%3A9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1')"
- }
- }
- ],
- "Tasks":[
- {
- "Configuration":"Adaptive Streaming",
- "MediaProcessorId":"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56",
- "TaskBody":"<?xml version=\"1.0\" encoding=\"utf-8\"?><taskBody><inputAsset>JobInputAsset(0)</inputAsset>
- <outputAsset>JobOutputAsset(0)</outputAsset></taskBody>"
- }
- ]
-}
-```
-
-**HTTP Response**
-
-If successful, the following response is returned:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 1215
-Content-Type: application/json;odata=verbose;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/Jobs('nb%3Ajid%3AUUID%3A71d2dd33-efdf-ec43-8ea1-136a110bd42c')
-Server: Microsoft-IIS/8.5
-request-id: 532ac1ec-a475-4dce-b2d5-7c8ce94ac87c
-x-ms-request-id: 532ac1ec-a475-4dce-b2d5-7c8ce94ac87c
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Mon, 19 Jan 2015 08:04:35 GMT
-
-{
- "d":{
- "__metadata":{
- "id":"https://wamsbayclus001rest-hs.cloudapp.net/api/Jobs('nb%3Ajid%3AUUID%3A71d2dd33-efdf-ec43-8ea1-136a110bd42c')",
- "uri":"https://wamsbayclus001rest-hs.cloudapp.net/api/Jobs('nb%3Ajid%3AUUID%3A71d2dd33-efdf-ec43-8ea1-136a110bd42c')",
- "type":"Microsoft.Cloud.Media.Vod.Rest.Data.Models.Job"
- },
- "Tasks":{
- "__deferred":{
- "uri":"https://wamsbayclus001rest-hs.cloudapp.net/api/Jobs('nb%3Ajid%3AUUID%3A71d2dd33-efdf-ec43-8ea1-136a110bd42c')/Tasks"
- }
- },
- "OutputMediaAssets":{
- "__deferred":{
- "uri":"https://wamsbayclus001rest-hs.cloudapp.net/api/Jobs('nb%3Ajid%3AUUID%3A71d2dd33-efdf-ec43-8ea1-136a110bd42c')/OutputMediaAssets"
- }
- },
- "InputMediaAssets":{
- "__deferred":{
-        "uri":"https://wamsbayclus001rest-hs.cloudapp.net/api/Jobs('nb%3Ajid%3AUUID%3A71d2dd33-efdf-ec43-8ea1-136a110bd42c')/InputMediaAssets"
- }
- },
- "Id":"nb:jid:UUID:71d2dd33-efdf-ec43-8ea1-136a110bd42c",
- "Name":"NewTestJob",
- "Created":"2015-01-19T08:04:34.3287057Z",
- "LastModified":"2015-01-19T08:04:34.3287057Z",
- "EndTime":null,
- "Priority":0,
- "RunningDuration":0,
- "StartTime":null,
- "State":0,
- "TemplateId":null,
- "JobNotificationSubscriptions":{
- "__metadata":{
- "type":"Collection(Microsoft.Cloud.Media.Vod.Rest.Data.Models.JobNotificationSubscription)"
- },
- "results":[
-
- ]
- }
- }
-}
-```
-
-There are a few important things to note in any Job request:
-
-* TaskBody properties MUST use literal XML to define the number of input and output assets that are used by the Task. The Task article contains the XML Schema Definition for the XML.
-* In the TaskBody definition, each inner value for `<inputAsset>` and `<outputAsset>` must be set as JobInputAsset(value) or JobOutputAsset(value).
-* A task can have multiple output assets. One JobOutputAsset(x) can only be used once as an output of a task in a job.
-* You can specify JobInputAsset or JobOutputAsset as an input asset of a task.
-* Tasks must not form a cycle.
-* The value parameter that you pass to JobInputAsset or JobOutputAsset represents the index value for an Asset. The actual Assets are defined in the InputMediaAssets and OutputMediaAssets navigation properties on the Job entity definition.
-
-> [!NOTE]
-> Because Media Services is built on OData v3, the individual assets in InputMediaAssets and OutputMediaAssets navigation property collections are referenced through a "__metadata : uri" name-value pair.
->
->
-
-* InputMediaAssets maps to one or more Assets you have created in Media Services. OutputMediaAssets are created by the system. They do not reference an existing asset.
-* OutputMediaAssets can be named using the assetName attribute. If this attribute is not present, the name of the OutputMediaAsset is the inner text value of the `<outputAsset>` element, suffixed with either the Job Name value or, when the Name property isn't defined, the Job Id value. For example, if you set assetName to "Sample", the OutputMediaAsset Name property would be set to "Sample". However, if you did not set a value for assetName but did set the job name to "NewJob", the OutputMediaAsset Name would be "JobOutputAsset(value)_NewJob".
-
- The following example shows how to set the assetName attribute:
-
- ```console
- "<?xml version=\"1.0\" encoding=\"utf-8\"?><taskBody><inputAsset>JobInputAsset(0)</inputAsset><outputAsset assetName=\"CustomOutputAssetName\">JobOutputAsset(0)</outputAsset></taskBody>"
- ```
-* To enable task chaining:
-
- * A job must have at least two tasks
- * There must be at least one task whose input is output of another task in the job.
-
-For more information, see [Creating an Encoding Job with the Media Services REST API](media-services-rest-encode-asset.md).
-
-### Monitor Processing Progress
-You can retrieve the Job status by using the State property, as shown in the following example:
-
-**HTTP Request**
-
-```console
-GET https://wamsbayclus001rest-hs.net/api/Jobs('nb%3Ajid%3AUUID%3A71d2dd33-efdf-ec43-8ea1-136a110bd42c')/State HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: wamsbayclus001rest-hs.net
-Content-Length: 0
-```
-
-**HTTP Response**
-
-If successful, the following response is returned:
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 17
-Content-Type: application/json;odata=verbose;charset=utf-8
-Server: Microsoft-IIS/7.5
-x-ms-request-id: 01324d81-18e2-4493-a3e5-c6186209f0c8
-X-Content-Type-Options: nosniff
-DataServiceVersion: 1.0;
-X-AspNet-Version: 4.0.30319
-X-Powered-By: ASP.NET
-Date: Sun, 13 May 2012 05:16:53 GMT
-
-{"d":{"State":2}}
-```
-
-### Cancel a job
-Media Services allows you to cancel running jobs through the CancelJob function. This call returns a 400 error code if you try to cancel a Job when its state is canceled, canceling, error, or finished.
-
-The following example shows how to call CancelJob.
-
-**HTTP Request**
-
-```console
-GET https://wamsbayclus001rest-hs.net/API/CancelJob?jobid='nb%3ajid%3aUUID%3a71d2dd33-efdf-ec43-8ea1-136a110bd42c' HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: wamsbayclus001rest-hs.net
-```
-
-If successful, a 204 response code is returned with no message body.
-
-> [!NOTE]
-> You must URL-encode the job ID (normally nb:jid:UUID:somevalue) when passing it in as a parameter to CancelJob.
->
->
-
-### Get the output asset
-The following code shows how to request the output asset Id.
-
-**HTTP Request**
-
-```console
-GET https://wamsbayclus001rest-hs.cloudapp.net/api/Jobs('nb%3Ajid%3AUUID%3A71d2dd33-efdf-ec43-8ea1-136a110bd42c')/OutputMediaAssets() HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: wamsbayclus001rest-hs.cloudapp.net
-```
-
-**HTTP Response**
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 445
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Server: Microsoft-IIS/8.5
-request-id: 73cd605d-066a-46f1-8358-f4bd25a9220a
-x-ms-request-id: 73cd605d-066a-46f1-8358-f4bd25a9220a
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Mon, 19 Jan 2015 08:28:13 GMT
-
-{
- "odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Assets",
- "value":[
- {
- "Id":"nb:cid:UUID:71d2dd33-efdf-ec43-8ea1-136a110bd42c",
- "State":0,
- "Created":"2015-01-19T07:52:15.603",
- "LastModified":"2015-01-19T07:52:15.603",
- "AlternateId":null,
- "Name":"Multibitrate MP4s",
- "Options":0,
- "Uri":"https://storagetestaccount001.blob.core.windows.net/asset-71d2dd33-efdf-ec43-8ea1-136a110bd42c",
- "StorageAccountName":"storagetestaccount001"
- }
- ]
-}
-```
-
-## <a id="publish_get_urls"></a>Publish the asset and get streaming and progressive download URLs with REST API
-
-To stream or download an asset, you first need to "publish" it by creating a locator. Locators provide access to files contained in the asset. Media Services supports two types of locators: OnDemandOrigin locators, used to stream media (for example, MPEG DASH, HLS, or Smooth Streaming), and Shared Access Signature (SAS) locators, used to download media files.
-
-Once you create the locators, you can build the URLs that are used to stream or download your files.
-
->[!NOTE]
->When your AMS account is created a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-A streaming URL for Smooth Streaming has the following format:
-
-```console
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest
-```
-
-A streaming URL for HLS has the following format:
-
-```console
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=m3u8-aapl)
-```
-
-A streaming URL for MPEG DASH has the following format:
-
-```console
-{streaming endpoint name-media services account name}.streaming.mediaservices.windows.net/{locator ID}/{filename}.ism/Manifest(format=mpd-time-csf)
-```
-
-A SAS URL used to download files has the following format:
-
-```console
-{blob container name}/{asset name}/{file name}?{SAS signature}
-```
-
-This section shows how to perform the following tasks necessary to "publish" your assets.
-
-* Creating the AccessPolicy with read permission
-* Creating a SAS URL for downloading content
-* Creating an origin URL for streaming content
-
-### Creating the AccessPolicy with read permission
-Before downloading or streaming any media content, first define an AccessPolicy with read permissions and create the appropriate Locator entity that specifies the type of delivery mechanism you want to enable for your clients. For more information on the properties available, see [AccessPolicy Entity Properties](/rest/api/media/operations/accesspolicy#accesspolicy_properties).
-
-The following example shows how to specify the AccessPolicy for read permissions for a given Asset.
-
-```console
-POST https://wamsbayclus001rest-hs.net/API/AccessPolicies HTTP/1.1
-Content-Type: application/json
-Accept: application/json
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: wamsbayclus001rest-hs.net
-Content-Length: 74
-Expect: 100-continue
-
-{"Name": "DownloadPolicy", "DurationInMinutes" : "300", "Permissions" : 1}
-```
-
-If successful, a 201 success code is returned describing the AccessPolicy entity that you created. You then use the AccessPolicy Id along with the Asset Id of the asset that contains the file you want to deliver (such as an output asset) to create the Locator entity.
-
-> [!NOTE]
-> This basic workflow is the same as uploading a file when ingesting an Asset (as was discussed earlier in this topic). Also, like uploading files, if you (or your clients) need to access your files immediately, set your StartTime value to five minutes before the current time. This action is necessary because there may be clock skew between the client and Media Services. The StartTime value must be in the following DateTime format: YYYY-MM-DDTHH:mm:ssZ (for example, "2014-05-23T17:53:50Z").
->
->
-
-### Creating a SAS URL for downloading content
-The following code shows how to get a URL that can be used to download a media file created and uploaded previously. The AccessPolicy has read permissions set and the Locator path refers to a SAS download URL.
-
-```console
-POST https://wamsbayclus001rest-hs.net/API/Locators HTTP/1.1
-Content-Type: application/json
-Accept: application/json
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: wamsbayclus001rest-hs.net
-Content-Length: 182
-Expect: 100-continue
-
-{"AccessPolicyId": "nb:pid:UUID:38c71dd0-44c5-4c5f-8418-08bb6fbf7bf8", "AssetId" : "nb:cid:UUID:71d2dd33-efdf-ec43-8ea1-136a110bd42c", "StartTime" : "2014-05-17T16:45:53", "Type":1}
-```
-
-If successful, the following response is returned:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 1150
-Content-Type: application/json;odata=verbose;charset=utf-8
-Location: https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A8e5a821d-2194-4d00-8884-adf979856874')
-Server: Microsoft-IIS/7.5
-x-ms-request-id: 8cfd8556-3064-416a-97f2-67d9e35f0911
-X-Content-Type-Options: nosniff
-DataServiceVersion: 1.0;
-X-AspNet-Version: 4.0.30319
-X-Powered-By: ASP.NET
-Date: Mon, 14 May 2012 21:41:32 GMT
-
-{
- "d":{
- "__metadata":{
- "id":"https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A8e5a821d-2194-4d00-8884-adf979856874')",
- "uri":"https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A8e5a821d-2194-4d00-8884-adf979856874')",
- "type":"Microsoft.Cloud.Media.Vod.Rest.Data.Models.Locator"
- },
- "AccessPolicy":{
- "__deferred":{
- "uri":"https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A8e5a821d-2194-4d00-8884-adf979856874')/AccessPolicy"
- }
- },
- "Asset":{
- "__deferred":{
-        "uri":"https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A8e5a821d-2194-4d00-8884-adf979856874')/Asset"
- }
- },
- "Id":"nb:lid:UUID:8e5a821d-2194-4d00-8884-adf979856874",
- "ExpirationDateTime":"\/Date(1337049393000)\/",
- "Type":1,
- "Path":"https://storagetestaccount001.blob.core.windows.net/asset-71d2dd33-efdf-ec43-8ea1-136a110bd42c?st=2012-05-14T21%3A36%3A33Z&se=2012-05-15T02%3A36%3A33Z&sr=c&si=8e5a821d-2194-4d00-8884-adf979856874&sig=y75dViDpC5V8WutrXM%2B%2FGpR3uOtqmlISiNlHU1YUBOg%3D",
- "AccessPolicyId":"nb:pid:UUID:38c71dd0-44c5-4c5f-8418-08bb6fbf7bf8",
- "AssetId":"nb:cid:UUID:71d2dd33-efdf-ec43-8ea1-136a110bd42c",
- "StartTime":"\/Date(1337031393000)\/"
- }
-}
-```
-
-The returned **Path** property contains the SAS URL.
-
-> [!NOTE]
-> If you download storage encrypted content, you must manually decrypt it before rendering it, or use the Storage Decryption MediaProcessor in a processing task to output processed files in the clear to an OutputAsset and then download from that Asset. For more information on processing, see Creating an Encoding Job with the Media Services REST API. Also, SAS URL Locators cannot be updated after they have been created. For example, you cannot reuse the same Locator with an updated StartTime value. This is because of the way SAS URLs are created. If you want to access an asset for download after a Locator has expired, then you must create a new one with a new StartTime.
->
->
-
-### Download files
-Once you have the AccessPolicy and Locator set, you can download files using the Azure Storage REST APIs.
-
-> [!NOTE]
-> You must add the file name for the file you want to download to the Locator **Path** value received in the previous section. For example, `https://storagetestaccount001.blob.core.windows.net/asset-e7b02da4-5a69-40e7-a8db-e8f4f697aac0/BigBuckBunny.mp4?`.
-
-For more information on working with Azure storage blobs, see [Blob Service REST API](/rest/api/storageservices/blob-service-rest-api).
-
-As a result of the encoding job that you performed earlier (encoding into Adaptive MP4 set), you have multiple MP4 files that you can progressively download. For example:
-
-* `https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_650kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-* `https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_400kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-* `https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_3400kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-* `https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_2250kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-* `https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_1500kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-* `https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_H264_1000kbps_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-* `https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_AAC_und_ch2_96kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-* `https://storagetestaccount001.blob.core.windows.net/asset-38058602-a4b8-4b33-b9f0-6880dc1490ea/BigBuckBunny_AAC_und_ch2_56kbps.mp4?sv=2012-02-12&sr=c&si=166d5154-b801-410b-a226-ee2f8eac1929&sig=P2iNZJAvAWpp%2Bj9yV6TQjoz5DIIaj7ve8ARynmEM6Xk%3D&se=2015-02-14T01:13:05Z`
-
-### Creating a streaming URL for streaming content
-The following code shows how to create a streaming URL Locator:
-
-```console
-POST https://wamsbayclus001rest-hs/API/Locators HTTP/1.1
-Content-Type: application/json
-Accept: application/json
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: wamsbayclus001rest-hs
-Content-Length: 182
-Expect: 100-continue
-
-{"AccessPolicyId": "nb:pid:UUID:38c71dd0-44c5-4c5f-8418-08bb6fbf7bf8", "AssetId" : "nb:cid:UUID:eb5540a2-116e-4d36-b084-7e9958f7f3c3", "StartTime" : "2014-05-17T16:45:53",, "Type":2}
-```
-
-If successful, the following response is returned:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 981
-Content-Type: application/json;odata=verbose;charset=utf-8
-Location: https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A52034bf6-dfae-4d83-aad3-3bd87dcb1a5d')
-Server: Microsoft-IIS/7.5
-x-ms-request-id: 2eac4158-fc78-4fa2-81ee-c9f582dc385f
-X-Content-Type-Options: nosniff
-DataServiceVersion: 1.0;
-X-AspNet-Version: 4.0.30319
-X-Powered-By: ASP.NET
-Date: Mon, 14 May 2012 21:41:39 GMT
-
-{
- "d":{
- "__metadata":{
- "id":"https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A52034bf6-dfae-4d83-aad3-3bd87dcb1a5d')",
- "uri":"https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A52034bf6-dfae-4d83-aad3-3bd87dcb1a5d')",
- "type":"Microsoft.Cloud.Media.Vod.Rest.Data.Models.Locator"
- },
- "AccessPolicy":{
- "__deferred":{
- "uri":"https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A52034bf6-dfae-4d83-aad3-3bd87dcb1a5d')/AccessPolicy"
- }
- },
- "Asset":{
- "__deferred":{
- "uri":"https://wamsbayclus001rest-hs.net/api/Locators('nb%3Alid%3AUUID%3A52034bf6-dfae-4d83-aad3-3bd87dcb1a5d')/Asset"
- }
- },
- "Id":"nb:lid:UUID:52034bf6-dfae-4d83-aad3-3bd87dcb1a5d",
- "ExpirationDateTime":"\/Date(1337049395000)\/",
- "Type":2,
- "Path":"http://wamsbayclus001rest-hs.net/52034bf6-dfae-4d83-aad3-3bd87dcb1a5d/",
- "AccessPolicyId":"nb:pid:UUID:38c71dd0-44c5-4c5f-8418-08bb6fbf7bf8",
- "AssetId":"nb:cid:UUID:eb5540a2-116e-4d36-b084-7e9958f7f3c3",
- "StartTime":"\/Date(1337031395000)\/"
- }
-}
-```
-
-To stream a Smooth Streaming origin URL in a streaming media player, you must append the Path property with the name of the Smooth Streaming manifest file, followed by "/manifest".
-
-`http://amstestaccount001.streaming.mediaservices.windows.net/ebf733c4-3e2e-4a68-b67b-cc5159d1d7f2/BigBuckBunny.ism/manifest`
-
-To stream HLS, append (format=m3u8-aapl) after the "/manifest".
-
-`http://amstestaccount001.streaming.mediaservices.windows.net/ebf733c4-3e2e-4a68-b67b-cc5159d1d7f2/BigBuckBunny.ism/manifest(format=m3u8-aapl)`
-
-To stream MPEG DASH, append (format=mpd-time-csf) after the "/manifest".
-
-`http://amstestaccount001.streaming.mediaservices.windows.net/ebf733c4-3e2e-4a68-b67b-cc5159d1d7f2/BigBuckBunny.ism/manifest(format=mpd-time-csf)`
-
-## <a id="play"></a>Play your content
-To stream your video, use [Azure Media Services Player](https://aka.ms/azuremediaplayer).
-
-To test progressive download, paste a URL into a browser (for example, IE, Chrome, Safari).
-
media-services Media Services Rest How To Use https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-how-to-use.md
- Title: Media Services operations REST API overview | Microsoft Docs
-description: The Media Services Operations REST API is used for creating Jobs, Assets, Live Channels and other resources in a Media Services account. This article provides an Azure Media Services v2 REST API overview.
-Previously updated: 3/10/2021
-# Media Services operations REST API overview
-
-The **Media Services Operations REST** API is used for creating Jobs, Assets, Live Channels and other resources in a Media Services account. For more information, see [Media Services Operations REST API reference](/rest/api/media/operations/azure-media-services-rest-api-reference).
-
-Media Services provides a REST API that accepts both JSON and atom+pub XML formats. The Media Services REST API requires specific HTTP headers that each client must send when connecting to Media Services, as well as a set of optional headers. The following sections describe the headers and HTTP verbs you can use when creating requests and receiving responses from Media Services.
-
-Authentication to the Media Services REST API is done through Azure Active Directory authentication, which is outlined in the article [Use Azure AD authentication to access the Azure Media Services API with REST](media-services-rest-connect-with-aad.md).
-
-## Considerations
-
-The following considerations apply when using REST.
-
-* When querying entities, there is a limit of 1000 entities returned at one time because public REST v2 limits query results to 1000 results. You need to use **Skip** and **Take** (.NET) / **skip** and **top** (REST) as described in [this .NET example](media-services-dotnet-manage-entities.md#enumerating-through-large-collections-of-entities) and [this REST API example](media-services-rest-manage-entities.md#enumerating-through-large-collections-of-entities).
-* When using JSON and specifying to use the **__metadata** keyword in the request (for example, to reference a linked object), you MUST set the **Accept** header to [JSON Verbose format](https://www.odata.org/documentation/odata-version-3-0/json-verbose-format/) (see the following example). OData does not understand the **__metadata** property in the request unless you set it to verbose.
-
- ```console
- POST https://media.windows.net/API/Jobs HTTP/1.1
- Content-Type: application/json;odata=verbose
- Accept: application/json;odata=verbose
- DataServiceVersion: 3.0
- MaxDataServiceVersion: 3.0
- x-ms-version: 2.19
- Authorization: Bearer <ENCODED JWT TOKEN>
- Host: media.windows.net
-
- {
- "Name" : "NewTestJob",
- "InputMediaAssets" :
- [{"__metadata" : {"uri" : "https://media.windows.net/api/Assets('nb%3Acid%3AUUID%3Aba5356eb-30ff-4dc6-9e5a-41e4223540e7')"}}]
- . . .
- ```
-
-## Standard HTTP request headers supported by Media Services
-For every call you make into Media Services, there is a set of required headers you must include in your request and also a set of optional headers you may want to include. The table below lists the required headers:
-
-| Header | Type | Value |
-| | | |
-| Authorization |Bearer |Bearer is the only accepted authorization mechanism. The value must also include the access token provided by Azure Active Directory. |
-| x-ms-version |Decimal |2.17 (or most recent version)|
-| DataServiceVersion |Decimal |3.0 |
-| MaxDataServiceVersion |Decimal |3.0 |
-
-> [!NOTE]
-> Because Media Services uses OData to expose its REST APIs, the DataServiceVersion and MaxDataServiceVersion headers should be included in all requests; however, if they are not, then currently Media Services assumes the DataServiceVersion value in use is 3.0.
->
->
-
-The following is a set of optional headers:
-
-| Header | Type | Value |
-| | | |
-| Date |RFC 1123 date |Timestamp of the request |
-| Accept |Content type |The requested content type for the response, such as the following:<p>- application/json;odata=verbose<p>- application/atom+xml<p>Responses may have a different content type, such as a blob fetch, where a successful response contains the blob stream as the payload. |
-| Accept-Encoding |Gzip, deflate |GZIP and DEFLATE encoding, when applicable. Note: For large resources, Media Services may ignore this header and return noncompressed data. |
-| Accept-Language |"en", "es", and so on. |Specifies the preferred language for the response. |
-| Accept-Charset |Charset type like "UTF-8" |Default is UTF-8. |
-| X-HTTP-Method |HTTP Method |Allows clients or firewalls that do not support HTTP methods like PUT or DELETE to use these methods, tunneled via a GET call. |
-| Content-Type |Content type |Content type of the request body in PUT or POST requests. |
-| client-request-id |String |A caller-defined value that identifies the given request. If specified, this value will be included in the response message as a way to map the request. <p><p>**Important**<p>Values should be capped at 2096b (2k). |
-
-## Standard HTTP response headers supported by Media Services
-The following is a set of headers that may be returned to you depending on the resource you were requesting and the action you intended to perform.
-
-| Header | Type | Value |
-| | | |
-| request-id |String |A unique identifier for the current operation, service generated. |
-| client-request-id |String |An identifier specified by the caller in the original request, if present. |
-| Date |RFC 1123 date |The date/time that the request was processed. |
-| Content-Type |Varies |The content type of the response body. |
-| Content-Encoding |Varies |Gzip or deflate, as appropriate. |
-
-## Standard HTTP verbs supported by Media Services
-The following is a complete list of HTTP verbs that can be used when making HTTP requests:
-
-| Verb | Description |
-| | |
-| GET |Returns the current value of an object. |
-| POST |Creates an object based on the data provided, or submits a command. |
-| PUT |Replaces an object, or creates a named object (when applicable). |
-| DELETE |Deletes an object. |
-| MERGE |Updates an existing object with named property changes. |
-| HEAD |Returns metadata of an object for a GET response. |
-
-## Discover and browse the Media Services entity model
-To make Media Services entities more discoverable, the $metadata operation can be used. It allows you to retrieve all valid entity types, entity properties, associations, functions, actions, and so on. By adding the $metadata operation to the end of your Media Services REST API endpoint, you can access this discovery service.
-
- /api/$metadata.
-
-Append "?api-version=2.x" to the end of the URI if you want to view the metadata in a browser, or if you do not include the x-ms-version header in your request.
-
-## Authenticate with Media Services REST using Azure Active Directory
-
-Authentication on the REST API is done through Azure Active Directory (AAD).
-For details on how to get the required authentication details for your Media Services account from the Azure portal, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-For details on writing code that connects to the REST API using Azure AD authentication, see the article [Use Azure AD authentication to access the Azure Media Services API with REST](media-services-rest-connect-with-aad.md).
-
-## Next steps
-To learn how to use Azure AD authentication with Media Services REST API, see [Use Azure AD authentication to access the Azure Media Services API with REST](media-services-rest-connect-with-aad.md).
-
media-services Media Services Rest Manage Entities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-manage-entities.md
- Title: Managing Media Services entities with REST | Microsoft Docs
-description: This article demonstrates how to manage Media Services entities with REST API.
-Previously updated: 3/10/2021
-# Managing Media Services entities with REST
-
-> [!div class="op_single_selector"]
-> * [REST](media-services-rest-manage-entities.md)
-> * [.NET](media-services-dotnet-manage-entities.md)
->
->
-
-Microsoft Azure Media Services is a REST-based service built on OData v3. You can add, query, update, and delete entities in much the same way as you can on any other OData service. Exceptions will be called out when applicable. For more information on OData, see [Open Data Protocol documentation](https://www.odata.org/documentation/).
-
-This topic shows you how to manage Azure Media Services entities with REST.
-
->[!NOTE]
-> Starting April 1, 2017, any Job record in your account older than 90 days will be automatically deleted, along with its associated Task records, even if the total number of records is below the maximum quota. For example, on April 1, 2017, any Job record in your account older than December 31, 2016, will be automatically deleted. If you need to archive the job/task information, you can use the code described in this topic.
-
-## Considerations
-
-When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
-
-## Connect to Media Services
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-## Adding entities
-Every entity in Media Services is added to an entity set, such as Assets, through a POST HTTP request.
-
-The following example shows how to create an AccessPolicy.
-
-```console
-POST https://media.windows.net/API/AccessPolicies HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
-Content-Length: 74
-Expect: 100-continue
-
-{"Name": "DownloadPolicy", "DurationInMinutes" : "300", "Permissions" : 1}
-```
-
-## Querying entities
-Querying and listing entities is straightforward and only involves a GET HTTP request and optional OData operations.
-The following example retrieves a list of all MediaProcessor entities.
-
-```console
-GET https://media.windows.net/API/MediaProcessors HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
-```
-
-You can also retrieve a specific entity or all entity sets associated with a specific entity, such as in the following examples:
-
-```console
-GET https://media.windows.net/API/JobTemplates('nb:jtid:UUID:e81192f5-576f-b247-b781-70a790c20e7c') HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
-
-GET https://media.windows.net/API/JobTemplates('nb:jtid:UUID:e81192f5-576f-b247-b781-70a790c20e7c')/TaskTemplates HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
-```
-
-The following example returns only the State property of all Jobs.
-
-```console
-GET https://media.windows.net/API/Jobs?$select=State HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
-```
-
-The following example returns all JobTemplates whose Name starts with "SampleTemplate".
-
-```console
-GET https://media.windows.net/API/JobTemplates?$filter=startswith(Name,%20'SampleTemplate') HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
-```
-
-> [!NOTE]
-> The $expand operation is not supported in Media Services, nor are the LINQ methods described in LINQ Considerations (WCF Data Services).
->
->
-
-## Enumerating through large collections of entities
-When querying entities, there is a limit of 1000 entities returned at one time because public REST v2 limits query results to 1000 results. Use **skip** and **top** to enumerate through the large collection of entities.
-
-The following example shows how to use **skip** and **top** to skip the first 2000 jobs and get the next 1000 jobs.
-
-```console
-GET https://media.windows.net/api/Jobs()?$skip=2000&$top=1000 HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
-```
-
-## Updating entities
-Depending on the entity type and the state that it is in, you can update properties on that entity through PATCH, PUT, or MERGE HTTP requests. For more information about these operations, see [PATCH/PUT/MERGE](/openspecs/windows_protocols/ms-odata/59d5abd3-7b12-490a-a0e2-9d9324b91893).
-
-The following code example shows how to update the Name property on an Asset entity.
-
-```console
-MERGE https://media.windows.net/API/Assets('nb:cid:UUID:80782407-3f87-4e60-a43e-5e4454232f60') HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
-Content-Length: 21
-Expect: 100-continue
-
-{"Name" : "NewName" }
-```
-
-## Deleting entities
-Entities can be deleted in Media Services by using a DELETE HTTP request. Depending on the entity, the order in which you delete entities may be important. For example, entities such as Assets require that you revoke (or delete) all Locators that reference that particular Asset before deleting the Asset.
-
-The following example shows how to delete a Locator that was used to upload a file into blob storage.
-
-```console
-DELETE https://media.windows.net/API/Locators('nb:lid:UUID:76dcc8e8-4230-463d-97b0-ce25c41b5c8d') HTTP/1.1
-Content-Type: application/json;odata=verbose
-Accept: application/json;odata=verbose
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-x-ms-version: 2.19
-Authorization: Bearer <ENCODED JWT TOKEN>
-Host: media.windows.net
-Content-Length: 0
-```
-
media-services Media Services Rest Storage Encryption https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-storage-encryption.md
- Title: Encrypting your content with storage encryption using AMS REST API
-description: Learn how to encrypt your content with storage encryption using AMS REST APIs.
-Previously updated: 3/10/2021
-# Encrypting your content with storage encryption
-
-> [!NOTE]
-> To complete this tutorial, you need an Azure account. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/).
-> No new features or functionality are being added to Media Services v2. <br/>Check out the latest version, [Media Services v3](../latest/index.yml). Also, see [migration guidance from v2 to v3](../latest/migrate-v-2-v-3-migration-introduction.md).
->
-
-This article gives an overview of AMS storage encryption and shows you how to upload storage-encrypted content:
-
-* Create a content key.
-* Create an Asset. Set the AssetCreationOption to StorageEncryption when creating the Asset.
-
- Encrypted assets are associated with content keys.
-* Link the content key to the asset.
-* Set the encryption-related parameters on the AssetFile entities.
-
-## Considerations
-
-If you want to deliver a storage encrypted asset, you must configure the asset's delivery policy. Before your asset can be streamed, the streaming server removes the storage encryption and streams your content using the specified delivery policy. For more information, see [Configuring Asset Delivery Policies](media-services-rest-configure-asset-delivery-policy.md).
-
-When accessing entities in Media Services, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md).
-
-### Storage side encryption
-
-|Encryption option|Description|Media Services v2|Media Services v3|
-|||||
-|Media Services Storage Encryption|AES-256 encryption, key managed by Media Services|Supported<sup>(1)</sup>|Not supported<sup>(2)</sup>|
-|[Storage Service Encryption for Data at Rest](../../storage/common/storage-service-encryption.md)|Server-side encryption offered by Azure Storage, key managed by Azure or by customer|Supported|Supported|
-|[Storage Client-Side Encryption](../../storage/common/storage-client-side-encryption.md)|Client-side encryption offered by Azure storage, key managed by customer in Key Vault|Not supported|Not supported|
-
-<sup>1</sup> While Media Services does support handling of content in the clear/without any form of encryption, doing so is not recommended.
-
-<sup>2</sup> In Media Services v3, storage encryption (AES-256 encryption) is only supported for backwards compatibility when your Assets were created with Media Services v2. Meaning v3 works with existing storage encrypted assets but will not allow creation of new ones.
-
-## Connect to Media Services
-
-For information on how to connect to the AMS API, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-## Storage encryption overview
-The AMS storage encryption applies **AES-CTR** mode encryption to the entire file. AES-CTR mode is a block cipher mode that can encrypt arbitrary length data without need for padding. It operates by encrypting a counter block with the AES algorithm and then XOR-ing the output of AES with the data to encrypt or decrypt.
-
-The counter block used is constructed by copying the value of the InitializationVector to bytes 0 to 7 of the counter value; bytes 8 to 15 of the counter value are set to zero. Of the 16-byte counter block, bytes 8 to 15 (that is, the least significant bytes) are used as a simple 64-bit unsigned integer that is incremented by one for each subsequent block of data processed and is kept in network byte order. If this integer reaches the maximum value (0xFFFFFFFFFFFFFFFF), then incrementing it resets the block counter to zero (bytes 8 to 15) without affecting the other 64 bits of the counter (that is, bytes 0 to 7).
-
-In order to maintain the security of the AES-CTR mode encryption, the InitializationVector value for a given Key Identifier for each content key shall be unique for each file, and files shall be less than 2^64 blocks in length. This uniqueness ensures that a counter value is never reused with a given key. For more information about the CTR mode, see [this wiki page](https://en.wikipedia.org/wiki/Block_cipher_mode_of_operation#CTR) (the wiki article uses the term "Nonce" instead of "InitializationVector").
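-
-As a minimal sketch (illustrative only, not the service's implementation), the 16-byte counter block described above can be constructed as follows, given an 8-byte InitializationVector and a 64-bit block counter:
-
-```csharp
-using System;
-
-static class CounterBlockSketch
-{
-    static byte[] BuildCounterBlock(byte[] initializationVector, ulong blockCounter)
-    {
-        byte[] counterBlock = new byte[16];
-        // Bytes 0..7: the InitializationVector.
-        Array.Copy(initializationVector, 0, counterBlock, 0, 8);
-        // Bytes 8..15: the block counter in network byte order (big endian).
-        byte[] counterBytes = BitConverter.GetBytes(blockCounter);
-        if (BitConverter.IsLittleEndian)
-        {
-            Array.Reverse(counterBytes);
-        }
-        Array.Copy(counterBytes, 0, counterBlock, 8, 8);
-        return counterBlock;
-    }
-}
-```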
-
-Use **Storage Encryption** to encrypt your clear content locally using AES 256-bit encryption and then upload it to Azure Storage where it is stored encrypted at rest. Assets protected with storage encryption are automatically decrypted and placed in an encrypted file system prior to encoding, and optionally re-encrypted prior to uploading back as a new output asset. The primary use case for storage encryption is when you want to secure your high-quality input media files with strong encryption at rest on disk.
-
-In order to deliver a storage encrypted asset, you must configure the asset's delivery policy so Media Services knows how you want to deliver your content. Before your asset can be streamed, the streaming server removes the storage encryption and streams your content using the specified delivery policy (for example, AES, common encryption, or no encryption).
-
-## Create ContentKeys used for encryption
-Encrypted assets are associated with Storage Encryption keys. Create the content key to be used for encryption before creating the asset files. This section describes how to create a content key.
-
-The following are general steps for generating content keys that you associate with assets that you want to be encrypted.
-
-1. For storage encryption, randomly generate a 32-byte AES key.
-
- The 32-byte AES Key is the content key for your asset, which means all files associated with that asset need to use the same content key during decryption.
-2. Call the [GetProtectionKeyId](/rest/api/media/operations/rest-api-functions#getprotectionkeyid) and [GetProtectionKey](/rest/api/media/operations/rest-api-functions#getprotectionkey) methods to get the correct X.509 Certificate that must be used to encrypt your content key.
-3. Encrypt your content key with the public key of the X.509 Certificate (a sketch of this step follows the table below).
-
- Media Services .NET SDK uses RSA with OAEP when doing the encryption. You can see a .NET example in the [EncryptSymmetricKeyData function](https://github.com/Azure/azure-sdk-for-media-services/blob/dev/src/net/Client/Common/Common.FileEncryption/EncryptionUtils.cs).
-4. Create a checksum value calculated using the key identifier and content key. The following .NET example calculates the checksum using the GUID part of the key identifier and the clear content key.
-
- ```csharp
- public static string CalculateChecksum(byte[] contentKey, Guid keyId)
- {
- const int ChecksumLength = 8;
- const int KeyIdLength = 16;
-
- byte[] encryptedKeyId = null;
-
- // Checksum is computed by AES-ECB encrypting the KID
- // with the content key.
- using (AesCryptoServiceProvider rijndael = new AesCryptoServiceProvider())
- {
- rijndael.Mode = CipherMode.ECB;
- rijndael.Key = contentKey;
- rijndael.Padding = PaddingMode.None;
-
- ICryptoTransform encryptor = rijndael.CreateEncryptor();
- encryptedKeyId = new byte[KeyIdLength];
- encryptor.TransformBlock(keyId.ToByteArray(), 0, KeyIdLength, encryptedKeyId, 0);
- }
-
- byte[] retVal = new byte[ChecksumLength];
- Array.Copy(encryptedKeyId, retVal, ChecksumLength);
-
- return Convert.ToBase64String(retVal);
- }
- ```
-
-5. Create the Content key with the **EncryptedContentKey** (converted to base64-encoded string), **ProtectionKeyId**, **ProtectionKeyType**, **ContentKeyType**, and **Checksum** values you have received in previous steps.
-
- For storage encryption, the following properties should be included in the request body.
-
-    Request body property | Description
-    --- | ---
-    Id | The ContentKey ID is generated using the following format: "nb:kid:UUID:\<NEW GUID>".
-    ContentKeyType | The content key type is an integer that defines the key. For the storage encryption format, the value is 1.
-    EncryptedContentKey | We create a new content key value that is a 256-bit (32-byte) value. The key is encrypted using the storage encryption X.509 certificate that we retrieve from Microsoft Azure Media Services by executing an HTTP GET request for the GetProtectionKeyId and GetProtectionKey methods. As an example, see the following .NET code: the **EncryptSymmetricKeyData** method defined [here](https://github.com/Azure/azure-sdk-for-media-services/blob/dev/src/net/Client/Common/Common.FileEncryption/EncryptionUtils.cs).
-    ProtectionKeyId | This is the protection key ID for the storage encryption X.509 certificate that was used to encrypt our content key.
-    ProtectionKeyType | This is the encryption type for the protection key that was used to encrypt the content key. This value is StorageEncryption(1) for our example.
-    Checksum | The checksum for the content key. It is computed by AES-ECB encrypting the key identifier (KID) with the content key, as the example code above demonstrates.
--
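-
-The following minimal sketch shows steps 1 and 3 from the list above: randomly generating a 32-byte content key and encrypting it with the protection certificate's RSA public key using OAEP, as the SDK's **EncryptSymmetricKeyData** method does. The class and method names here are illustrative, not part of the SDK:
-
-```csharp
-using System;
-using System.Security.Cryptography;
-using System.Security.Cryptography.X509Certificates;
-
-static class ContentKeySketch
-{
-    // Step 1: randomly generate a 32-byte AES content key.
-    public static byte[] GenerateContentKey()
-    {
-        byte[] contentKey = new byte[32];
-        using (var rng = RandomNumberGenerator.Create())
-        {
-            rng.GetBytes(contentKey);
-        }
-        return contentKey;
-    }
-
-    // Step 3: encrypt the content key with the certificate's RSA public key (OAEP).
-    public static string EncryptContentKey(byte[] contentKey, X509Certificate2 protectionCert)
-    {
-        using (RSA rsa = protectionCert.GetRSAPublicKey())
-        {
-            byte[] encrypted = rsa.Encrypt(contentKey, RSAEncryptionPadding.OaepSHA1);
-            return Convert.ToBase64String(encrypted);
-        }
-    }
-}
-```
-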
-### Retrieve the ProtectionKeyId
-The following example shows how to retrieve the ProtectionKeyId, a certificate thumbprint, for the certificate you must use when encrypting your content key. Do this step to make sure that you already have the appropriate certificate on your machine.
-
-Request:
-
-```console
-GET https://media.windows.net/api/GetProtectionKeyId?contentKeyType=0 HTTP/1.1
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-```
-
-Response:
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 139
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Server: Microsoft-IIS/8.5
-request-id: 2b6aa7a4-3a09-4b08-b581-26b55667f817
-x-ms-request-id: 2b6aa7a4-3a09-4b08-b581-26b55667f817
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 04 Feb 2015 02:42:52 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Edm.String","value":"7D9BB04D9D0A4A24800CADBFEF232689E048F69C"}
-```
-
-### Retrieve the ProtectionKey for the ProtectionKeyId
-The following example shows how to retrieve the X.509 certificate using the ProtectionKeyId you received in the previous step.
-
-Request:
-
-```console
-GET https://media.windows.net/api/GetProtectionKey?ProtectionKeyId='7D9BB04D9D0A4A24800CADBFEF232689E048F69C' HTTP/1.1
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-x-ms-client-request-id: 78d1247a-58d7-40e5-96cc-70ff0dfa7382
-Host: media.windows.net
-```
-
-Response:
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 1227
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: 78d1247a-58d7-40e5-96cc-70ff0dfa7382
-request-id: 1523e8f3-8ed2-40fe-8a9a-5d81eb572cc8
-x-ms-request-id: 1523e8f3-8ed2-40fe-8a9a-5d81eb572cc8
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Thu, 05 Feb 2015 07:52:30 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Edm.String",
-"value":"MIIDSTCCAjGgAwIBAgIQqf92wku/HLJGCbMAU8GEnDANBgkqhkiG9w0BAQQFADAuMSwwKgYDVQQDEyN3YW1zYmx1cmVnMDAxZW5jcnlwdGFsbHNlY3JldHMtY2VydDAeFw0xMjA1MjkwNzAwMDBaFw0zMjA1MjkwNzAwMDBaMC4xLDAqBgNVBAMTI3dhbXNibHVyZWcwMDFlbmNyeXB0YWxsc2VjcmV0cy1jZXJ0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAzR0SEbXefvUjb9wCUfkEiKtGQ5Gc328qFPrhMjSo+YHe0AVviZ9YaxPPb0m1AaaRV4dqWpST2+JtDhLOmGpWmmA60tbATJDdmRzKi2eYAyhhE76MgJgL3myCQLP42jDusWXWSMabui3/tMDQs+zfi1sJ4Ch/lm5EvksYsu6o8sCv29VRwxfDLJPBy2NlbV4GbWz5Qxp2tAmHoROnfaRhwp6WIbquk69tEtu2U50CpPN2goLAqx2PpXAqA+prxCZYGTHqfmFJEKtZHhizVBTFPGS3ncfnQC9QIEwFbPw6E5PO5yNaB68radWsp5uvDg33G1i8IT39GstMW6zaaG7cNQIDAQABo2MwYTBfBgNVHQEEWDBWgBCOGT2hPhsvQioZimw8M+jOoTAwLjEsMCoGA1UEAxMjd2Ftc2JsdXJlZzAwMWVuY3J5cHRhbGxzZWNyZXRzLWNlcnSCEKn/dsJLvxyyRgmzAFPBhJwwDQYJKoZIhvcNAQEEBQADggEBABcrQPma2ekNS3Wc5wGXL/aHyQaQRwFGymnUJ+VR8jVUZaC/U/f6lR98eTlwycjVwRL7D15BfClGEHw66QdHejaViJCjbEIJJ3p2c9fzBKhjLhzB3VVNiLIaH6RSI1bMPd2eddSCqhDIn3VBN605GcYXMzhYp+YA6g9+YMNeS1b+LxX3fqixMQIxSHOLFZ1G/H2xfNawv0VikH3djNui3EKT1w/8aRkUv/AAV0b3rYkP/jA1I0CPn0XFk7STYoiJ3gJoKq9EMXhit+Iwfz0sMkfhWG12/XO+TAWqsK1ZxEjuC9OzrY7pFnNxs4Mu4S8iinehduSpY+9mDd3dHynNwT4="}
-```
-
-### Create the content key
-After you have retrieved the X.509 certificate and used its public key to encrypt your content key, create a **ContentKey** entity and set its property values accordingly.
-
-One of the values that you must set when creating the content key is the type. When using storage encryption, the value should be set to '1'.
-
-The following example shows how to create a **ContentKey** with a **ContentKeyType** set for storage encryption ("1") and the **ProtectionKeyType** set to "0" to indicate that the protection key ID is the X.509 certificate thumbprint.
-
-Request
-
-```console
-POST https://media.windows.net/api/ContentKeys HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-User-Agent: Microsoft ADO.NET Data Services
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-
-{
-"Name":"ContentKey",
-"ProtectionKeyId":"7D9BB04D9D0A4A24800CADBFEF232689E048F69C",
-"ContentKeyType":"1",
-"ProtectionKeyType":"0",
-"EncryptedContentKey":"your encrypted content key",
-"Checksum":"calculated checksum"
-}
-```
-
-Response:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 777
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://media.windows.net/api/ContentKeys('nb%3Akid%3AUUID%3A9c8ea9c6-52bd-4232-8a43-8e43d8564a99')
-Server: Microsoft-IIS/8.5
-request-id: 76e85e0f-5cf1-44cb-b689-b3455888682c
-x-ms-request-id: 76e85e0f-5cf1-44cb-b689-b3455888682c
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 04 Feb 2015 02:37:46 GMT
-
-{"odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#ContentKeys/@Element",
-"Id":"nb:kid:UUID:9c8ea9c6-52bd-4232-8a43-8e43d8564a99","Created":"2015-02-04T02:37:46.9684379Z",
-"LastModified":"2015-02-04T02:37:46.9684379Z",
-"ContentKeyType":1,
-"EncryptedContentKey":"your encrypted content key",
-"Name":"ContentKey",
-"ProtectionKeyId":"7D9BB04D9D0A4A24800CADBFEF232689E048F69C",
-"ProtectionKeyType":0,
-"Checksum":"calculated checksum"}
-```
-
-## Create an asset
-The following example shows how to create an asset.
-
-**HTTP Request**
-
-```console
-POST https://media.windows.net/api/Assets HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-
-{"Name":"BigBuckBunny" "Options":1}
-```
-
-**HTTP Response**
-
-If successful, the following response is returned:
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 452
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/Assets('nb%3Acid%3AUUID%3A9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1')
-Server: Microsoft-IIS/8.5
-x-ms-client-request-id: c59de965-bc89-4295-9a57-75d897e5221e
-request-id: e98be122-ae09-473a-8072-0ccd234a0657
-x-ms-request-id: e98be122-ae09-473a-8072-0ccd234a0657
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Sun, 18 Jan 2015 22:06:40 GMT
-
-{
- "odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Assets/@Element",
- "Id":"nb:cid:UUID:9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1",
- "State":0,
- "Created":"2015-01-18T22:06:40.6010903Z",
- "LastModified":"2015-01-18T22:06:40.6010903Z",
- "AlternateId":null,
- "Name":"BigBuckBunny.mp4",
- "Options":1,
- "Uri":"https://storagetestaccount001.blob.core.windows.net/asset-9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1",
- "StorageAccountName":"storagetestaccount001"
-}
-```
-
-## Associate the ContentKey with an Asset
-After creating the ContentKey, associate it with your Asset using the $links operation, as shown in the following example:
-
-Request:
-
-```console
-POST https://media.windows.net/api/Assets('nb%3Acid%3AUUID%3Afbd7ce05-1087-401b-aaae-29f16383c801')/$links/ContentKeys HTTP/1.1
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Content-Type: application/json
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-
-{"uri":"https://wamsbayclus001rest-hs.cloudapp.net/api/ContentKeys('nb%3Akid%3AUUID%3A01e6ea36-2285-4562-91f1-82c45736047c')"}
-```
-
-Response:
-
-```console
-HTTP/1.1 204 No Content
-```
-
-## Create an AssetFile
-The [AssetFile](/rest/api/media/operations/assetfile) entity represents a video or audio file that is stored in a blob container. An asset file is always associated with an asset, and an asset may contain one or many asset files. The Media Services Encoder task fails if an asset file object is not associated with a digital file in a blob container.
-
-The **AssetFile** instance and the actual media file are two distinct objects. The AssetFile instance contains metadata about the media file, while the media file contains the actual media content.
-
-After you upload your digital media file into a blob container, you will use the **MERGE** HTTP request to update the AssetFile with information about your media file (not shown in this article).
-
-**HTTP Request**
-
-```console
-POST https://media.windows.net/api/Files HTTP/1.1
-Content-Type: application/json
-DataServiceVersion: 1.0;NetFx
-MaxDataServiceVersion: 3.0;NetFx
-Accept: application/json
-Accept-Charset: UTF-8
-Authorization: Bearer <ENCODED JWT TOKEN>
-x-ms-version: 2.19
-Host: media.windows.net
-Content-Length: 164
-
-{
- "IsEncrypted":"true",
- "EncryptionScheme" : "StorageEncryption",
- "EncryptionVersion" : "1.0",
- "EncryptionKeyId" : "nb:kid:UUID:32e6efaf-5fba-4538-b115-9d1cefe43510",
-    "InitializationVector" : "397304628502661816",
- "Options":0,
- "IsPrimary":"false",
- "MimeType":"video/mp4",
- "Name":"BigBuckBunny.mp4",
- "ParentAssetId":"nb:cid:UUID:9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1"
-}
-```
-
-**HTTP Response**
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 535
-Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
-Location: https://wamsbayclus001rest-hs.cloudapp.net/api/Files('nb%3Acid%3AUUID%3Af13a0137-0a62-9d4c-b3b9-ca944b5142c5')
-Server: Microsoft-IIS/8.5
-request-id: 98a30e2d-f379-4495-988e-0b79edc9b80e
-x-ms-request-id: 98a30e2d-f379-4495-988e-0b79edc9b80e
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Mon, 19 Jan 2015 00:34:07 GMT
-
-{
- "odata.metadata":"https://wamsbayclus001rest-hs.cloudapp.net/api/$metadata#Files/@Element",
- "Id":"nb:cid:UUID:f13a0137-0a62-9d4c-b3b9-ca944b5142c5",
- "Name":"BigBuckBunny.mp4",
- "ContentFileSize":"0",
- "ParentAssetId":"nb:cid:UUID:9bc8ff20-24fb-4fdb-9d7c-b04c7ee573a1",
- "EncryptionVersion": "1.0",
- "EncryptionScheme": "StorageEncryption",
- "IsEncrypted":true,
- "EncryptionKeyId":"nb:kid:UUID:32e6efaf-5fba-4538-b115-9d1cefe43510",
-    "InitializationVector":"397304628502661816",
- "IsPrimary":false,
- "LastModified":"2015-01-19T00:34:08.1934137Z",
- "Created":"2015-01-19T00:34:08.1934137Z",
- "MimeType":"video/mp4",
- "ContentChecksum":null
-}
-```
media-services Media Services Rest Telemetry https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-telemetry.md
- Title: Configuring Azure Media Services telemetry with REST | Microsoft Docs
-description: This article shows you how to configure Azure Media Services telemetry using the REST API.
-Previously updated: 3/10/2021
-# Configuring Azure Media Services telemetry with REST
--
-This topic describes general steps that you might take when configuring Azure Media Services (AMS) telemetry using the REST API.
-
->[!NOTE]
->For a detailed explanation of what AMS telemetry is and how to consume it, see the [overview](media-services-telemetry-overview.md) topic.
-
-The steps described in this topic are:
-
-- Getting the storage account associated with a Media Services account
-- Getting the notification endpoints
-- Creating a notification endpoint for monitoring
-
-    To create a notification endpoint, set the EndPointType to AzureTable (2) and the EndPointAddress to the storage table (for example, https://telemetryvalidationstore.table.core.windows.net/).
-- Getting the monitoring configurations
-
-    Create monitoring configuration settings for the services you want to monitor. No more than one monitoring configuration is allowed.
-- Adding a monitoring configuration
-
-## Get the storage account associated with a Media Services account
-
-### Request
-
-```console
-GET https://wamsbnp1clus001rest-hs.cloudapp.net/api/StorageAccounts HTTP/1.1
-x-ms-version: 2.19
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json; odata=verbose
-Authorization: (redacted)
-Host: wamsbnp1clus001rest-hs.cloudapp.net
-```
-
-### Response
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 370
-Content-Type: application/json;odata=verbose;charset=utf-8
-Server: Microsoft-IIS/8.5
-request-id: 8206e222-2a59-482c-a6a9-de6b8bda57fb
-x-ms-request-id: 8206e222-2a59-482c-a6a9-de6b8bda57fb
-X-Content-Type-Options: nosniff
-DataServiceVersion: 2.0;
-access-control-expose-headers: request-id, x-ms-request-id
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 02 Dec 2015 05:10:40 GMT
-
-{"d":{"results":[{"__metadata":{"id":"https://wamsbnp1clus001rest-hs.cloudapp.net/api/StorageAccounts('telemetryvalidationstore')","uri":"https://wamsbnp1clus001rest-hs.cloudapp.net/api/StorageAccounts('telemetryvalidationstore')","type":"Microsoft.Cloud.Media.Vod.Rest.Data.Models.StorageAccount"},"Name":"telemetryvalidationstore","IsDefault":true,"BytesUsed":null}]}}
-```
-
-## Get the Notification Endpoints
-
-### Request
-
-```console
-GET https://wamsbnp1clus001rest-hs.cloudapp.net/api/NotificationEndPoints HTTP/1.1
-x-ms-version: 2.19
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json; odata=verbose
-Authorization: (redacted)
-Host: wamsbnp1clus001rest-hs.cloudapp.net
-```
-
-### Response
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 20
-Content-Type: application/json;odata=verbose;charset=utf-8
-Server: Microsoft-IIS/8.5
-request-id: c68de2b3-0be1-4823-b622-6ca6f94a96b5
-x-ms-request-id: c68de2b3-0be1-4823-b622-6ca6f94a96b5
-X-Content-Type-Options: nosniff
-DataServiceVersion: 2.0;
-access-control-expose-headers: request-id, x-ms-request-id
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 02 Dec 2015 05:10:40 GMT
-
-{
- "d":{
- "results":[]
- }
-}
-```
-
-## Create a Notification Endpoint for monitoring
-
-### Request
-
-```console
-POST https://wamsbnp1clus001rest-hs.cloudapp.net/api/NotificationEndPoints HTTP/1.1
-x-ms-version: 2.19
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json; odata=verbose
-Authorization: (redacted)
-Content-Type: application/json; charset=utf-8
-Host: wamsbnp1clus001rest-hs.cloudapp.net
-Content-Length: 115
-
-{
- "Name":"monitoring",
-    "EndPointAddress":"https://telemetryvalidationstore.table.core.windows.net/",
- "EndPointType":2
-}
-```
-
-> [!NOTE]
-> Don't forget to change the "https://telemetryvalidationstore.table.core.windows.net" value to your storage account.
-
-### Response
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 578
-Content-Type: application/json;odata=verbose;charset=utf-8
-Location: https://wamsbnp1clus001rest-hs.cloudapp.net/api/NotificationEndPoints('nb%3Anepid%3AUUID%3A76bb4faf-ea29-4815-840a-9a8e20102fc4')
-Server: Microsoft-IIS/8.5
-request-id: e8fa5a60-7d8b-4b00-a7ee-9b0f162fe0a9
-x-ms-request-id: e8fa5a60-7d8b-4b00-a7ee-9b0f162fe0a9
-X-Content-Type-Options: nosniff
-DataServiceVersion: 1.0;
-access-control-expose-headers: request-id, x-ms-request-id
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 02 Dec 2015 05:10:42 GMT
-
-{"d":{"__metadata":{"id":"https://wamsbnp1clus001rest-hs.cloudapp.net/api/NotificationEndPoints('nb%3Anepid%3AUUID%3A76bb4faf-ea29-4815-840a-9a8e20102fc4')","uri":"https://wamsbnp1clus001rest-hs.cloudapp.net/api/NotificationEndPoints('nb%3Anepid%3AUUID%3A76bb4faf-ea29-4815-840a-9a8e20102fc4')","type":"Microsoft.Cloud.Media.Vod.Rest.Data.Models.NotificationEndPoint"},"Id":"nb:nepid:UUID:76bb4faf-ea29-4815-840a-9a8e20102fc4","Name":"monitoring","Created":"\/Date(1449033042667)\/","EndPointAddress":"https://telemetryvalidationstore.table.core.windows.net/","EndPointType":2}}
-```
-
-## Get the monitoring configurations
-
-### Request
-
-```console
-GET https://wamsbnp1clus001rest-hs.cloudapp.net/api/MonitoringConfigurations HTTP/1.1
-x-ms-version: 2.19
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json; odata=verbose
-Authorization: (redacted)
-Host: wamsbnp1clus001rest-hs.cloudapp.net
-```
-
-### Response
-
-```console
-HTTP/1.1 200 OK
-Cache-Control: no-cache
-Content-Length: 20
-Content-Type: application/json;odata=verbose;charset=utf-8
-Server: Microsoft-IIS/8.5
-request-id: 00a3ee37-bb19-4fca-b5c7-a92b629d4416
-x-ms-request-id: 00a3ee37-bb19-4fca-b5c7-a92b629d4416
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-access-control-expose-headers: request-id, x-ms-request-id
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 02 Dec 2015 05:10:42 GMT
-
-{"d":{"results":[]}}
-```
-
-## Add a monitoring configuration
-
-### Request
-
-```console
-POST https://wamsbnp1clus001rest-hs.cloudapp.net/api/MonitoringConfigurations HTTP/1.1
-x-ms-version: 2.19
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json; odata=verbose
-Authorization: (redacted)
-Content-Type: application/json; charset=utf-8
-Host: wamsbnp1clus001rest-hs.cloudapp.net
-Content-Length: 133
-
-{
- "NotificationEndPointId":"nb:nepid:UUID:76bb4faf-ea29-4815-840a-9a8e20102fc4",
- "Settings":[
- {
- "Component":"Channel",
- "Level":"Normal"
- }
- ]
-}
-```
-
-### Response
-
-```console
-HTTP/1.1 201 Created
-Cache-Control: no-cache
-Content-Length: 825
-Content-Type: application/json;odata=verbose;charset=utf-8
-Location: https://wamsbnp1clus001rest-hs.cloudapp.net/api/MonitoringConfigurations('nb%3Amcid%3AUUID%3A1a8931ae-799f-45fd-8aeb-9641740295c2')
-Server: Microsoft-IIS/8.5
-request-id: daede9cb-8684-41b0-a921-a3af66430cbe
-x-ms-request-id: daede9cb-8684-41b0-a921-a3af66430cbe
-X-Content-Type-Options: nosniff
-DataServiceVersion: 3.0;
-access-control-expose-headers: request-id, x-ms-request-id
-X-Powered-By: ASP.NET
-Strict-Transport-Security: max-age=31536000; includeSubDomains
-Date: Wed, 02 Dec 2015 05:10:43 GMT
-
-{"d":{"__metadata":{"id":"https://wamsbnp1clus001rest-hs.cloudapp.net/api/MonitoringConfigurations('nb%3Amcid%3AUUID%3A1a8931ae-799f-45fd-8aeb-9641740295c2')","uri":"https://wamsbnp1clus001rest-hs.cloudapp.net/api/MonitoringConfigurations('nb%3Amcid%3AUUID%3A1a8931ae-799f-45fd-8aeb-9641740295c2')","type":"Microsoft.Cloud.Media.Vod.Rest.Data.Models.MonitoringConfiguration"},"Id":"nb:mcid:UUID:1a8931ae-799f-45fd-8aeb-9641740295c2","NotificationEndPointId":"nb:nepid:UUID:76bb4faf-ea29-4815-840a-9a8e20102fc4","Created":"2015-12-02T05:10:43.7680396Z","LastModified":"2015-12-02T05:10:43.7680396Z","Settings":{"__metadata":{"type":"Collection(Microsoft.Cloud.Media.Vod.Rest.Data.Models.ComponentMonitoringSettings)"},"results":[{"Component":"Channel","Level":"Normal"},{"Component":"StreamingEndpoint","Level":"Disabled"}]}}}
-```
-
-## Stop telemetry
-
-### Request
-
-```console
-DELETE https://wamsbnp1clus001rest-hs.cloudapp.net/api/MonitoringConfigurations('nb%3Amcid%3AUUID%3A1a8931ae-799f-45fd-8aeb-9641740295c2') HTTP/1.1
-x-ms-version: 2.19
-DataServiceVersion: 3.0
-MaxDataServiceVersion: 3.0
-Accept: application/json; odata=verbose
-Authorization: (redacted)
-Content-Type: application/json; charset=utf-8
-Host: wamsbnp1clus001rest-hs.cloudapp.net
-```
-
-## Consuming telemetry information
-
-For information about consuming telemetry data, see the [telemetry overview](media-services-telemetry-overview.md) topic.
-
-
media-services Media Services Rest Upload Files https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-rest-upload-files.md
- Title: Upload files into an Azure Media Services account using REST | Microsoft Docs
-description: Learn how to get media content into Media Services by creating and uploading assets using REST.
-Previously updated: 3/10/2021
-# Upload files into a Media Services account using REST
--
-> [!div class="op_single_selector"]
-> * [.NET](media-services-dotnet-upload-files.md)
-> * [REST](media-services-rest-upload-files.md)
-> * [Portal](media-services-portal-upload-files.md)
->
-
-In Media Services, you upload your digital files into an asset. The [Asset](/rest/api/media/operations/asset) entity can contain video, audio, images, thumbnail collections, text tracks, and closed caption files (and the metadata about these files). Once the files are uploaded into the asset, your content is stored securely in the cloud for further processing and streaming.
-
-In this tutorial, you learn how to upload a file and perform the other operations associated with it:
-
-> [!div class="checklist"]
-> * Set up Postman for all the upload operations
-> * Connect to Media Services
-> * Create an access policy with write permission
-> * Create an asset
-> * Create a SAS locator and create the upload URL
-> * Upload a file to blob storage using the upload URL
-> * Create file metadata in the asset for the media file you uploaded
-
-## Prerequisites
-
-- If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
-- [Create an Azure Media Services account using the Azure portal](media-services-portal-create-account.md).
-- Review the [Accessing Azure Media Services API with AAD authentication overview](media-services-use-aad-auth-to-access-ams-api.md) article.
-- For more information, review the [Use Azure AD authentication to access the Media Services API with REST](./media-services-rest-connect-with-aad.md) article.
-- Configure **Postman** as described in [Configure Postman for Media Services REST API calls](media-rest-apis-with-postman.md).
-
-## Considerations
-
-The following considerations apply when using Media Services REST API:
-
-* When accessing entities using Media Services REST API, you must set specific header fields and values in your HTTP requests. For more information, see [Setup for Media Services REST API Development](media-services-rest-how-to-use.md). <br/>The Postman collection used in this tutorial takes care of setting all the necessary headers.
-* Media Services uses the value of the IAssetFile.Name property when building URLs for the streaming content (for example, http://{AMSAccount}.origin.mediaservices.windows.net/{GUID}/{IAssetFile.Name}/streamingParameters.) For this reason, percent-encoding is not allowed. The value of the **Name** property cannot have any of the following [percent-encoding-reserved characters](https://en.wikipedia.org/wiki/Percent-encoding#Percent-encoding_reserved_characters): !*'();:@&=+$,/?%#[]". Also, there can only be one '.' for the file name extension.
-* The length of the name should not be greater than 260 characters.
-* There is a limit to the maximum file size supported for processing in Media Services. See [this](media-services-quotas-and-limitations.md) article for details about the file size limitation.
-
-## Set up Postman
-
-For steps on how to set up Postman for this tutorial, see [Configure Postman](media-rest-apis-with-postman.md).
-
-## Connect to Media Services
-
-1. Add connection values to your environment.
-
- Some variables that are part of the **MediaServices** [environment](postman-environment.md) need to be set manually before you can start executing operations defined in the [collection](postman-collection.md).
-
- To get values for the first five variables, see [Access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
- ![Screenshot that shows the "Cog" icon selected from the top-right, and the first five variables selected from the "Management Environments" tab.](./media/media-services-rest-upload-files/postman-import-env.png)
-2. Specify the value for the **MediaFileName** environment variable.
-
-    Specify the file name of the media you are planning to upload. In this example, we are going to upload the BigBuckBunny.mp4 file.
-3. Examine the **AzureMediaServices.postman_environment.json** file. You will see that almost all operations in the collection execute a "test" script. The scripts take some values returned by the response and set appropriate environment variables.
-
-    For example, the first operation gets an access token and sets it on the **AccessToken** environment variable that is used in all other operations.
-
- ```
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "var json = JSON.parse(responseBody);",
- "postman.setEnvironmentVariable(\"AccessToken\", json.access_token);"
- ]
- }
- ```
-4. On the left of the **Postman** window, click on **1. Get AAD Auth token** -> **Get Azure AD Token for Service Principal**.
-
- The URL portion is filled with the **AzureADSTSEndpoint** environment variable (earlier in the tutorial, you set the values of environment variables that support the collection).
-
- ![Screenshot that shows "1. Get A A D Auth token - Get Azure A D Token for Service Principal" selected from the "Postman" window, and the "Send" button selected.](./media/media-services-rest-upload-files/postment-get-token.png)
-
-5. Press **Send**.
-
- You can see the response that contains "access_token". The "test" script takes this value and sets the **AccessToken** environment variable (as described above). If you examine your environment variables, you will see that this variable now contains the access token (bearer token) value that is used in the rest of the operations.
-
-    If the token expires, go through the "Get Azure AD Token for Service Principal" step again.
-
-## Create an access policy with write permission
-
-### Overview
-
->[!NOTE]
->There is a limit of 1,000,000 policies for different AMS policies (for example, for Locator policy or ContentKeyAuthorizationPolicy). You should reuse the same policy ID if you are always using the same duration / access permissions, for example, policies for locators that are intended to remain in place for a long time (non-upload policies). For more information, see [this](media-services-dotnet-manage-entities.md#limit-access-policies) article.
-
-Before uploading any files into blob storage, set the access policy rights for writing to an asset. To do that, POST an HTTP request to the AccessPolicies entity set. Define a DurationInMinutes value upon creation or you receive a 500 Internal Server error message back in response. For more information on AccessPolicies, see [AccessPolicy](/rest/api/media/operations/accesspolicy).
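-
-A minimal C# sketch of that POST request (the policy name and duration values here are illustrative, and the bearer-token placeholder is the access token you acquired earlier; Permissions value 2 grants Write):
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Net.Http.Headers;
-using System.Text;
-using System.Threading.Tasks;
-
-class AccessPolicySketch
-{
-    static async Task Main()
-    {
-        using (var client = new HttpClient())
-        {
-            client.DefaultRequestHeaders.Authorization =
-                new AuthenticationHeaderValue("Bearer", "<ENCODED JWT TOKEN>");
-            client.DefaultRequestHeaders.Add("x-ms-version", "2.19");
-            client.DefaultRequestHeaders.Accept.Add(
-                new MediaTypeWithQualityHeaderValue("application/json"));
-
-            // Permissions = 2 grants Write; omitting DurationInMinutes causes
-            // the 500 Internal Server error mentioned above.
-            var body = new StringContent(
-                "{\"Name\":\"UploadPolicy\",\"DurationInMinutes\":\"440\",\"Permissions\":\"2\"}",
-                Encoding.UTF8, "application/json");
-
-            HttpResponseMessage response = await client.PostAsync(
-                "https://media.windows.net/api/AccessPolicies", body);
-            Console.WriteLine(await response.Content.ReadAsStringAsync());
-        }
-    }
-}
-```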
-
-### Create an access policy
-
-1. Select **AccessPolicy** -> **Create AccessPolicy for Upload**.
-2. Press **Send**.
-
- ![Screenshot that shows "AccessPolicy - Create AccessPolicy for Upload" selected from the left-side menu, and the "Send" button selected.](./media/media-services-rest-upload-files/postman-access-policy.png)
-
- The "test" script gets the AccessPolicy Id and sets the appropriate environment variable.
-
-## Create an asset
-
-### Overview
-
-An [asset](/rest/api/media/operations/asset) is a container for multiple types or sets of objects in Media Services, including video, audio, images, thumbnail collections, text tracks, and closed caption files. In the REST API, creating an Asset requires sending a POST request to Media Services and placing any property information about your asset in the request body.
-
-One of the properties that you can add when creating an asset is **Options**. You can specify one of the following encryption options: **None** (default, no encryption is used), **StorageEncrypted** (for content that has been pre-encrypted with client-side storage encryption), **CommonEncryptionProtected**, or **EnvelopeEncryptionProtected**. When you have an encrypted asset, you need to configure a delivery policy. For more information, see [Configuring asset delivery policies](media-services-rest-configure-asset-delivery-policy.md).
-
-If your asset is encrypted, you must create a **ContentKey** and link it to your asset as described in the following article: [How to create a ContentKey](media-services-rest-create-contentkey.md). After you upload the files into the asset, you need to update the encryption properties on the **AssetFile** entity with the values you got during the **Asset** encryption. Do it by using the **MERGE** HTTP request.
-
-In this example, we are creating an unencrypted asset.
-
-### Create an asset
-
-1. Select **Assets** -> **Create Asset**.
-2. Press **Send**.
-
- ![Screenshot that shows "Assets - Create Asset" selected from the "Collections" menu, and the "Send" button selected.](./media/media-services-rest-upload-files/postman-create-asset.png)
-
- The "test" script gets the Asset Id and sets the appropriate environment variable.
-
-## Create a SAS locator and create the Upload URL
-
-### Overview
-
-Once you have the AccessPolicy and Locator set, the actual file is uploaded to an Azure Blob Storage container using the Azure Storage REST APIs. You must upload the files as block blobs. Page blobs are not supported by Azure Media Services.
-
-For more information on working with Azure storage blobs, see [Blob Service REST API](/rest/api/storageservices/blob-service-rest-api).
-
-To receive the actual upload URL, create a SAS Locator (shown below). Locators define the start time and type of connection endpoint for clients that want to access Files in an Asset. You can create multiple Locator entities for a given AccessPolicy and Asset pair to handle different client requests and needs. Each of these Locators uses the StartTime value plus the DurationInMinutes value of the AccessPolicy to determine the length of time a URL can be used. For more information, see [Locator](/rest/api/media/operations/locator).
-
-A SAS URL has the following format:
-
-`{https://myaccount.blob.core.windows.net}/{asset name}/{video file name}?{SAS signature}`
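-
-A minimal sketch of assembling this URL (assuming the values come from the **BaseUri** and **ContentAccessComponent** properties of the SAS locator response, and the file name is the one you set in the **MediaFileName** environment variable):
-
-```csharp
-static class UploadUrlSketch
-{
-    static string BuildUploadUrl(string baseUri, string contentAccessComponent, string fileName)
-    {
-        // baseUri                e.g. "https://myaccount.blob.core.windows.net/asset-<GUID>"
-        // contentAccessComponent e.g. "?sv=...&sig=..."
-        // fileName               e.g. "BigBuckBunny.mp4"
-        return $"{baseUri}/{fileName}{contentAccessComponent}";
-    }
-}
-```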
-
-### Considerations
-
-Some considerations apply:
-
-* You cannot have more than five unique Locators associated with a given Asset at one time. For more information, see Locator.
-* If you need to upload your files immediately, you should set your StartTime value to five minutes before the current time. This is because there may be clock skew between your client machine and Media Services. Also, your StartTime value must be in the following DateTime format: YYYY-MM-DDTHH:mm:ssZ (for example, "2014-05-23T17:53:50Z").
-* There may be a 30-40 second delay after a Locator is created to when it is available for use.
-
-### Create a SAS locator
-
-1. Select **Locator** -> **Create SAS Locator**.
-2. Press **Send**.
-
- The "test" script creates the "Upload URL" based on the media file name you specified and SAS locator information and sets the appropriate environment variable.
-
- ![Screenshot that shows "Locator - Create S A S Locator" selected from the "Collections" menu, and the "Send" button selected.](./media/media-services-rest-upload-files/postman-create-sas-locator.png)
-
-## Upload a file to blob storage using the upload URL
-
-### Overview
-
-Now that you have the upload URL, you need to write some code using the Azure Blob APIs directly to upload your file to the SAS container. For more information, see the following articles:
-
-- [Using the Azure Storage REST API](../../storage/common/storage-rest-api-auth.md?toc=%2fazure%2fstorage%2fblobs%2ftoc.json)
-- [PUT Blob](/rest/api/storageservices/put-blob)
-- [Upload blobs to Blob storage](/previous-versions/azure/storage/storage-use-azcopy#upload-blobs-to-blob-storage)
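-
-As an alternative to the Postman steps below, here is a minimal C# sketch of the PUT request (assuming `uploadUrl` is the SAS upload URL built in the previous step and `filePath` points at a small media file):
-
-```csharp
-using System.IO;
-using System.Net.Http;
-using System.Threading.Tasks;
-
-class UploadSketch
-{
-    static async Task UploadFileAsync(string uploadUrl, string filePath)
-    {
-        using (var client = new HttpClient())
-        {
-            var request = new HttpRequestMessage(HttpMethod.Put, uploadUrl)
-            {
-                Content = new ByteArrayContent(File.ReadAllBytes(filePath))
-            };
-            // Azure Storage requires this header when creating a block blob with Put Blob.
-            request.Headers.Add("x-ms-blob-type", "BlockBlob");
-            HttpResponseMessage response = await client.SendAsync(request);
-            response.EnsureSuccessStatusCode(); // expect 201 Created
-        }
-    }
-}
-```
-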
-### Upload a file with Postman
-
-As an example, we use Postman to upload a small .mp4 file. There may be a file size limit on uploading binary through Postman.
-
-The upload request is not part of the **AzureMedia** collection.
-
-Create and set up a new request:
-
-1. Press **+** to create a new request tab.
-2. Select the **PUT** operation and paste **{{UploadURL}}** in the URL.
-3. Leave the **Authorization** tab as is (do not set it to the **Bearer Token**).
-4. In the **Headers** tab, specify **Key**: "x-ms-blob-type" and **Value**: "BlockBlob".
-5. In the **Body** tab, click **binary**.
-6. Choose the file with the name that you specified in the **MediaFileName** environment variable.
-7. Press **Send**.
-
- ![Screenshot that shows the "(UploadU R L)" tab selected.](./media/media-services-rest-upload-files/postman-upload-file.png)
-
-## Create file metadata in the asset
-
-Once the file has been uploaded, you need to create metadata in the asset for the media file you uploaded into the blob storage associated with your asset.
-
-1. Select **AssetFiles** -> **CreateFileInfos**.
-2. Press **Send**.
-
- ![Upload a file](./media/media-services-rest-upload-files/postman-create-file-info.png)
-
-The file should be uploaded and its metadata set.
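-
-Under the hood, the collection's **CreateFileInfos** request is a function call against the REST API. A minimal C# sketch (assuming the asset ID is the one stored in the **LastAssetId** environment variable, and the token placeholder is your access token):
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Net.Http.Headers;
-using System.Threading.Tasks;
-
-class CreateFileInfosSketch
-{
-    static async Task Main()
-    {
-        using (var client = new HttpClient())
-        {
-            client.DefaultRequestHeaders.Authorization =
-                new AuthenticationHeaderValue("Bearer", "<ENCODED JWT TOKEN>");
-            client.DefaultRequestHeaders.Add("x-ms-version", "2.19");
-            client.DefaultRequestHeaders.Accept.Add(
-                new MediaTypeWithQualityHeaderValue("application/json"));
-
-            // The asset ID must be URL-encoded inside the quotes.
-            string assetId = Uri.EscapeDataString("nb:cid:UUID:<YOUR ASSET GUID>");
-            HttpResponseMessage response = await client.GetAsync(
-                $"https://media.windows.net/api/CreateFileInfos?assetid='{assetId}'");
-            response.EnsureSuccessStatusCode();
-        }
-    }
-}
-```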
-
-## Validate
-
-To validate that the file has been uploaded successfully, you might want to query the [AssetFile](/rest/api/media/operations/assetfile) and compare the **ContentFileSize** (or other details) to what you expect to see in the new asset.
-
-For example, the following **GET** operation brings file data for your asset file (in our case, the BigBuckBunny.mp4 file). The query is using the [environment variables](postman-environment.md) that you set earlier.
-
-`{{RESTAPIEndpoint}}/Assets('{{LastAssetId}}')/Files`
-
-The response contains the size, name, and other information.
-
-```console
-"Id": "nb:cid:UUID:69e72ede-2886-4f2a-8d36-80a59da09913",
-"Name": "BigBuckBunny.mp4",
-"ContentFileSize": "3186542",
-"ParentAssetId": "nb:cid:UUID:0b8f3b04-72fb-4f38-8e7b-d7dd78888938",
-```
-
-## Next steps
-
-You can now encode your uploaded assets. For more information, see [Encode assets](media-services-portal-encode.md).
-
-You can also use Azure Functions to trigger an encoding job based on a file arriving in the configured container. For more information, see [this sample](https://azure.microsoft.com/resources/samples/media-services-dotnet-functions-integration/ ).
media-services Media Services Retry Logic In Dotnet Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-retry-logic-in-dotnet-sdk.md
- Title: Retry logic in the Media Services SDK for .NET | Microsoft Docs
-description: The topic gives an overview of retry logic in the Media Services SDK for .NET.
-Previously updated: 3/10/2021
-# Retry logic in the Media Services SDK for .NET
--
-When working with Microsoft Azure services, transient faults can occur. If a transient fault occurs, in most cases, after a few retries the operation succeeds. The Media Services SDK for .NET implements the retry logic to handle transient faults associated with exceptions and errors that are caused by web requests, executing queries, saving changes, and storage operations. By default, the Media Services SDK for .NET executes four retries before re-throwing the exception to your application. The code in your application must then handle this exception properly.
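-
-A minimal sketch of the general pattern (illustrative only; see the link at the end of this topic for the SDK's actual implementation): retry an operation up to four times, backing off exponentially on exception types that the tables below mark as retryable:
-
-```csharp
-using System;
-using System.Net;
-using System.Threading.Tasks;
-
-static class TransientRetrySketch
-{
-    public static async Task<T> ExecuteAsync<T>(Func<Task<T>> operation, int maxRetries = 4)
-    {
-        for (int attempt = 0; ; attempt++)
-        {
-            try
-            {
-                return await operation();
-            }
-            catch (Exception ex) when (
-                attempt < maxRetries &&
-                (ex is WebException || ex is TimeoutException))
-            {
-                // Exponential backoff: 1s, 2s, 4s, 8s.
-                await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
-            }
-        }
-    }
-}
-```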
-
- The following is a brief guideline of Web Request, Storage, Query, and SaveChanges policies:
-
-* The Storage policy is used for blob storage operations (uploads or downloads of asset files).
-* The Web Request policy is used for generic web requests (for example, for getting an authentication token and resolving the user's cluster endpoint).
-* The Query policy is used for querying entities from REST (for example, mediaContext.Assets.Where(…)).
-* The SaveChanges policy is used for doing anything that changes data within the service (for example, creating an entity, updating an entity, or calling a service function for an operation).
-
- This topic lists exception types and error codes that are handled by the Media Services SDK for .NET retry logic.
-
-## Exception types
-The following table describes exceptions that the Media Services SDK for .NET handles or does not handle for some operations that may cause transient faults.
-
-| Exception | Web Request | Storage | Query | SaveChanges |
-| | | | | |
-| WebException<br/>For more information, see the [WebException status codes](media-services-retry-logic-in-dotnet-sdk.md#WebExceptionStatus) section. |Yes |Yes |Yes |Yes |
-| DataServiceClientException<br/> For more information, see [HTTP error status codes](media-services-retry-logic-in-dotnet-sdk.md#HTTPStatusCode). |No |Yes |Yes |Yes |
-| DataServiceQueryException<br/> For more information, see [HTTP error status codes](media-services-retry-logic-in-dotnet-sdk.md#HTTPStatusCode). |No |Yes |Yes |Yes |
-| DataServiceRequestException<br/> For more information, see [HTTP error status codes](media-services-retry-logic-in-dotnet-sdk.md#HTTPStatusCode). |No |Yes |Yes |Yes |
-| DataServiceTransportException |No |No |Yes |Yes |
-| TimeoutException |Yes |Yes |Yes |No |
-| SocketException |Yes |Yes |Yes |Yes |
-| StorageException |No |Yes |No |No |
-| IOException |No |Yes |No |No |
-
-### <a name="WebExceptionStatus"></a> WebException status codes
-The following table shows for which WebException error codes the retry logic is implemented. The [WebExceptionStatus](/dotnet/api/system.net.webexceptionstatus) enumeration defines the status codes.
-
-| Status | Web Request | Storage | Query | SaveChanges |
-| | | | | |
-| ConnectFailure |Yes |Yes |Yes |Yes |
-| NameResolutionFailure |Yes |Yes |Yes |Yes |
-| ProxyNameResolutionFailure |Yes |Yes |Yes |Yes |
-| SendFailure |Yes |Yes |Yes |Yes |
-| PipelineFailure |Yes |Yes |Yes |No |
-| ConnectionClosed |Yes |Yes |Yes |No |
-| KeepAliveFailure |Yes |Yes |Yes |No |
-| UnknownError |Yes |Yes |Yes |No |
-| ReceiveFailure |Yes |Yes |Yes |No |
-| RequestCanceled |Yes |Yes |Yes |No |
-| Timeout |Yes |Yes |Yes |No |
-| ProtocolError <br/>The retry on ProtocolError is controlled by the HTTP status code handling. For more information, see [HTTP error status codes](media-services-retry-logic-in-dotnet-sdk.md#HTTPStatusCode). |Yes |Yes |Yes |Yes |
-
-### <a name="HTTPStatusCode"></a> HTTP error status codes
-When Query and SaveChanges operations throw DataServiceClientException, DataServiceQueryException, or DataServiceRequestException, the HTTP error status code is returned in the StatusCode property. The following table shows for which error codes the retry logic is implemented.
-
-| Status | Web Request | Storage | Query | SaveChanges |
-| | | | | |
-| 401 |No |Yes |No |No |
-| 403 |No |Yes<br/>Handling retries with longer waits. |No |No |
-| 408 |Yes |Yes |Yes |Yes |
-| 429 |Yes |Yes |Yes |Yes |
-| 500 |Yes |Yes |Yes |No |
-| 502 |Yes |Yes |Yes |No |
-| 503 |Yes |Yes |Yes |Yes |
-| 504 |Yes |Yes |Yes |No |
-
-If you want to take a look at the actual implementation of the Media Services SDK for .NET retry logic, see [azure-sdk-for-media-services](https://github.com/Azure/azure-sdk-for-media-services/tree/dev/src/net/Client/TransientFaultHandling).
-
media-services Media Services Roll Storage Access Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-roll-storage-access-keys.md
- Title: Update Media Services after rolling storage access keys | Microsoft Docs
-description: This article gives you guidance on how to update Media Services after rolling storage access keys.
-Previously updated: 3/10/2021
-# Update Media Services after rolling storage access keys
--
-When you create a new Azure Media Services (AMS) account, you are also asked to select an Azure Storage account that is used to store your media content. You can add more than one storage account to your Media Services account. This article shows how to rotate storage keys. It also shows how to add storage accounts to a media account.
-
-To perform the actions described in this article, you should be using [Azure Resource Manager APIs](/rest/api/medi).
--
-## Overview
-
-When a new storage account is created, Azure generates two 512-bit storage access keys, which are used to authenticate access to your storage account. To keep your storage connections more secure, it is recommended to periodically regenerate and rotate your storage access key. Two access keys (primary and secondary) are provided in order to enable you to maintain connections to the storage account using one access key while you regenerate the other access key. This procedure is also called "rolling access keys".
-
-Media Services depends on a storage key provided to it. Specifically, the locators that are used to stream or download your assets depend on the specified storage access key. When an AMS account is created, it takes a dependency on the primary storage access key by default, but you can update which storage key AMS uses. You must let Media Services know which key to use by following the steps described in this article.
-
->[!NOTE]
-> If you have multiple storage accounts, you would perform this procedure with each storage account. The order in which you rotate storage keys is not fixed. You can rotate the secondary key first and then the primary key or vice versa.
->
-> Before executing steps described in this article on a production account, make sure to test them on a pre-production account.
->
-
-## Steps to rotate storage keys
-
- 1. Change the storage account's primary key through the PowerShell cmdlet or the [Azure portal](https://portal.azure.com/).
- 2. Call the Sync-AzMediaServiceStorageKeys cmdlet with the appropriate parameters to force the media account to pick up the storage account keys.
-
- The following example shows how to sync keys to storage accounts.
-
- `Sync-AzMediaServiceStorageKeys -ResourceGroupName $resourceGroupName -AccountName $mediaAccountName -StorageAccountId $storageAccountId`
-
- 3. Wait an hour or so. Verify the streaming scenarios are working.
- 4. Change the storage account's secondary key through the PowerShell cmdlet or the Azure portal.
- 5. Call the Sync-AzMediaServiceStorageKeys cmdlet with the appropriate parameters to force the media account to pick up the new storage account keys.
- 6. Wait an hour or so. Verify the streaming scenarios are working.
-
-### A PowerShell cmdlet example
-
-The following example demonstrates how to get the storage account and sync it with the AMS account.
-
-```console
-$regionName = "West US"
-$resourceGroupName = "SkyMedia-USWest-App"
-$mediaAccountName = "sky"
-$storageAccountName = "skystorage"
-$storageAccountId = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Storage/storageAccounts/$storageAccountName"
-
-Sync-AzMediaServiceStorageKeys -ResourceGroupName $resourceGroupName -AccountName $mediaAccountName -StorageAccountId $storageAccountId
-```
-
-## Steps to add storage accounts to your AMS account
-
-The following article shows how to add storage accounts to your AMS account: [Attach multiple storage accounts to a Media Services account](./media-services-managing-multiple-storage-accounts.md).
-
-
-### Acknowledgments
-We would like to acknowledge the following people who contributed towards creating this document: Cenk Dingiloglu, Milan Gada, Seva Titov.
media-services Media Services Scale Media Processing Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-scale-media-processing-overview.md
- Title: Media reserved units overview | Microsoft Docs
-description: This article is an overview of scaling Media Processing with Azure Media Services.
-Previously updated: 08/24/2021
-# Media reserved units
--
-Media Reserved Units (MRUs) were previously used to control encoding concurrency and performance. MRUs are only being used for the following legacy media processors that are to be deprecated soon. See [Azure Media Services legacy components](legacy-components.md) for retirement info for these legacy processors:
-
-* Media Encoder Premium Workflow
-* Media Indexer V1 and V2
-
-For all other media processors, you no longer need to manage MRUs or request quota increases for any Media Services account, as the system automatically scales up and down based on load. You will also see performance that is equal to or better than when you used MRUs.
-
-## Billing
-
-While there were previously charges for Media Reserved Units, as of April 17, 2021 there are no longer any charges for accounts that have configuration for Media Reserved Units.
-
-## Scaling MRUs
-
-For compatibility purposes, you can continue to use the Azure portal or the following APIs to manage and scale MRUs:
-
-- [.NET](media-services-dotnet-encoding-units.md)
-- [Portal](media-services-portal-scale-media-processing.md)
-- [REST](/rest/api/media/operations/encodingreservedunittype)
-- [Java](https://github.com/rnrneverdies/azure-sdk-for-media-services-java-samples)
-- [PHP](https://github.com/Azure/azure-sdk-for-php/tree/master/examples/MediaServices)
-
-However, by default none of the MRU configuration that you set will be used to control encoding concurrency or performance. The only exception to this configuration is if you are encoding with one of the following legacy media processors: Media Encoder Premium Workflow or Media Indexer V1.
media-services Media Services Set Up Computer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-set-up-computer.md
- Title: Set up your Azure Media Services development environment | Microsoft Docs
-description: This article describes how to set up your environment for developing with Azure Media Services.
-Previously updated: 3/10/2021
-# Set up your Media Services development environment
---
-The following steps describe prerequisites required for developing with Azure Media Services.
-
-## Create a Media Services account
-Use the Azure portal, the .NET SDK, or the REST API to create an Azure Media Services account.
-
-<a id="setup_dev_env"></a>
-
-## Set up the development environment
-
-To set up a .NET dev environment, see [this](media-services-dotnet-how-to-use.md) topic.
-
-To set up a REST dev environment, see [this](media-services-rest-how-to-use.md) topic.
-
-<a id="connect"></a>
-
-## Connect programmatically
-
-To connect to the Azure Media Services API, see [access the Azure Media Services API with Azure AD authentication](media-services-use-aad-auth-to-access-ams-api.md).
-
-## Next steps
-
-Find multiple code samples in the **Azure Code Samples** gallery: [Azure Media Services code samples](https://azure.microsoft.com/resources/samples/?service=media-services&sort=0).
-
media-services Media Services Specifications Live Timed Metadata https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-specifications-live-timed-metadata.md
- Title: Azure Media Services - Signaling Timed Metadata in Live Streaming
-description: This specification outlines methods for signaling timed metadata when ingesting and streaming to Azure Media Services. This includes support for generic timed metadata signals (ID3), as well as SCTE-35 signaling for ad insertion and splice condition signaling.
-Previously updated: 08/22/2019
-# Signaling Timed Metadata in Live Streaming
-
-Last Updated: 2019-08-22
-
-### Conformance Notation
-
-The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in RFC 2119.
-
-## 1. Introduction
-
-In order to signal the insertion of advertisements or custom metadata events on a client player, broadcasters often make use of timed metadata embedded within the video. To enable these scenarios, Media Services provides support for the transport of timed metadata from the ingest point of the live streaming channel to the client application.
-This specification outlines several modes that are supported by Media Services for timed metadata within live streaming signals.
-
-1. [SCTE-35] signaling that complies with the standards outlined by [SCTE-35], [SCTE-214-1], [SCTE-214-3] and [RFC8216]
-
-2. [SCTE-35] signaling that complies with the legacy [Adobe-Primetime] specification for RTMP ad signaling.
-
-3. A generic timed metadata signaling mode, for messages that are **NOT** [SCTE-35] and could carry [ID3v2] or other custom schemas defined by the application developer.
-
-## 1.1 Terms Used
-
-| Term | Definition |
-| --- | --- |
-| Ad Break | A location or point in time where one or more ads may be scheduled for delivery; same as avail and placement opportunity. |
-| Ad Decision Service | External service that decides which ad(s) and durations will be shown to the user. The service is typically provided by a partner and is out of scope for this document. |
-| Cue | Indication of time and parameters of the upcoming ad break. Note that cues can indicate a pending switch to an ad break, pending switch to the next ad within an ad break, and pending switch from an ad break to the main content. |
-| Packager | The Azure Media Services "Streaming Endpoint" provides dynamic packaging capabilities for DASH and HLS and is referred to as a "Packager" in the media industry. |
-| Presentation Time | The time that an event is presented to a viewer. The time represents the moment on the media timeline that a viewer would see the event. For example, the presentation time of a SCTE-35 splice_info() command message is the splice_time(). |
-| Arrival Time | The time that an event message arrives. The time is typically distinct from the presentation time of the event, since event messages are sent ahead of the presentation time of the event. |
-| Sparse track | A media track that is not continuous and is time-synchronized with a parent or control track. |
-| Origin | The Azure Media Streaming Service |
-| Channel Sink | The Azure Media Live Streaming Service |
-| HLS | Apple HTTP Live Streaming protocol |
-| DASH | Dynamic Adaptive Streaming Over HTTP |
-| Smooth | Smooth Streaming Protocol |
-| MPEG2-TS | MPEG 2 Transport Streams |
-| RTMP | Real-Time Messaging Protocol |
-| uimsbf | Unsigned integer, most significant bit first. |
-
-## 1.2 Normative References
-
-The following documents contain provisions, which, through reference in this text, constitute provisions of this document. All documents are subject to revision by the standards bodies, and readers are encouraged to investigate the possibility of applying the most recent editions of the documents listed below. Readers are also reminded that newer editions of the referenced documents might not be compatible with this version of the timed metadata specification for Azure Media Services.
-
-| Standard | Definition |
-| -- | -- |
-| [Adobe-Primetime] | Primetime Digital Program Insertion Signaling Specification 1.2 |
-| [Adobe-Flash-AS] | [FLASH ActionScript Language Reference](https://help.adobe.com/archive/en_US/as2/flashlite_2.x_3.x_aslr.pdf) |
-| [AMF0] | ["Action Message Format AMF0"](https://download.macromedia.com/pub/labs/amf/amf0_spec_121207.pdf) |
-| [DASH-IF-IOP] | DASH Industry Forum Interop Guidance v 4.2 [https://dashif-documents.azurewebsites.net/DASH-IF-IOP/master/DASH-IF-IOP.html](https://dashif-documents.azurewebsites.net/DASH-IF-IOP/master/DASH-IF-IOP.html) |
-| [HLS-TMD] | Timed Metadata for HTTP Live Streaming - [https://developer.apple.com/streaming](https://developer.apple.com/streaming) |
-| [CMAF-ID3] | [Timed Metadata in the Common Media Application Format (CMAF)](https://github.com/AOMediaCodec/id3-emsg) |
-| [ID3v2] | ID3 Tag version 2.4.0 [http://id3.org/id3v2.4.0-structure](http://id3.org/id3v2.4.0-structure) |
-| [ISO-14496-12] | ISO/IEC 14496-12: Part 12 ISO base media file format, Fourth Edition 2012-07-15 |
-| [MPEGDASH] | Information technology -- Dynamic adaptive streaming over HTTP (DASH) -- Part 1: Media presentation description and segment formats. May 2014. Published. URL: https://www.iso.org/standard/65274.html |
-| [MPEGCMAF] | Information technology -- Multimedia application format (MPEG-A) -- Part 19: Common media application format (CMAF) for segmented media. January 2018. Published. URL: https://www.iso.org/standard/71975.html |
-| [MPEGCENC] | Information technology -- MPEG systems technologies -- Part 7: Common encryption in ISO base media file format files. February 2016. Published. URL: https://www.iso.org/standard/68042.html |
-| [MS-SSTR] | ["Microsoft Smooth Streaming Protocol", May 15, 2014](/openspecs/windows_protocols/ms-sstr/8383f27f-7efe-4c60-832a-387274457251) |
-| [MS-SSTR-Ingest] | [Azure Media Services Fragmented MP4 Live Ingest Specification](./media-services-fmp4-live-ingest-overview.md) |
-| [RFC8216] | R. Pantos, Ed.; W. May. HTTP Live Streaming. August 2017. Informational. [https://tools.ietf.org/html/rfc8216](https://tools.ietf.org/html/rfc8216) |
-| [RFC4648] | The Base16, Base32, and Base64 Data Encodings - [https://tools.ietf.org/html/rfc4648](https://tools.ietf.org/html/rfc4648) |
-| [RTMP] | ["Adobe's Real-Time Messaging Protocol", December 21, 2012](https://rtmp.veriskope.com/docs/spec/) |
-| [SCTE-35-2019] | SCTE 35: 2019 - Digital Program Insertion Cueing Message for Cable - https://scte-cms-resource-storage.s3.amazonaws.com/ANSI_SCTE-35-2019a-1582645390859.pdf |
-| [SCTE-214-1] | SCTE 214-1 2016 – MPEG DASH for IP-Based Cable Services Part 1: MPD Constraints and Extensions |
-| [SCTE-214-3] | SCTE 214-3 2015 MPEG DASH for IP-Based Cable Services Part 3: DASH/FF Profile |
-| [SCTE-224] | SCTE 224 2018r1 – Event Scheduling and Notification Interface |
-| [SCTE-250] | Event and Signaling Management API (ESAM) |
-
-## 2. Timed Metadata Ingest
-
-Azure Media Services supports real-time in-band metadata for both [RTMP] and Smooth Streaming [MS-SSTR-Ingest] protocols. Real-time metadata can be used to define custom events, with your own unique custom schemas (JSON, Binary, XML), as well as industry defined formats like ID3, or SCTE-35 for ad signaling in a broadcast stream.
-
-This article provides the details for how to send custom timed metadata signals using the supported ingest protocols of Azure Media Services. The article also explains how the manifests for HLS, DASH, and Smooth Streaming are decorated with the timed metadata signals, as well as how it is carried in-band when the content is delivered using CMAF (MP4 fragments) or Transport Stream (TS) segments for HLS.
-
-Common use case scenarios for timed metadata include ad insertion cues ([SCTE-35]), [ID3v2] metadata, and custom application-defined events.
-
-Azure Media Services Live Events and Packager are capable of receiving these timed metadata signals and converting them into a stream of metadata that can reach client applications using standards-based protocols like HLS and DASH.
-
-## 2.1 RTMP Timed Metadata
-
-The [RTMP] protocol allows for timed metadata signals to be sent for various scenarios including custom metadata, and SCTE-35 ad signals.
-
-Advertising signals (cue messages) are sent as [AMF0] cue messages embedded within the [RTMP] stream. The cue messages may be sent some time before the actual event or [SCTE-35] ad splice signal needs to occur. To support this scenario, the actual presentation timestamp of the event is sent within the cue message. For more information, see [AMF0].
-
-The following [AMF0] commands are supported by Azure Media Services for RTMP ingest:
-
-- **onUserDataEvent** - used for custom metadata or [ID3v2] timed metadata
-- **onAdCue** - used primarily for signaling an advertisement placement opportunity in the live stream. Two forms of the cue are supported, a simple mode and a "SCTE-35" mode.
-- **onCuePoint** - supported by certain on-premises hardware encoders, like the Elemental Live encoder, to signal [SCTE-35] messages.
-
-
-The following table describes the format of the AMF message payload that Media Services will ingest for both "simple" and [SCTE35] message modes.
-
-The name of the [AMF0] message can be used to differentiate multiple event streams of the same type. For both [SCTE-35] messages and "simple" mode, the name of the AMF message MUST be "onAdCue" as required in the [Adobe-Primetime] specification. Any fields not listed below SHALL be ignored by Azure Media Services at ingest.
-
-## 2.1.1 RTMP with custom metadata using "onUserDataEvent"
-
-If you want to provide custom metadata feeds from your upstream encoder, IP Camera, Drone, or device using the RTMP protocol, use the "onUserDataEvent" [AMF0] data message command type.
-
-The **"onUserDataEvent"** data message command MUST carry a message payload with the following definition to be captured by Media Services and packaged into the in-band file format as well as the manifests for HLS, DASH and Smooth Streaming.
-It is recommended to send timed-metadata messages no more frequently than once every 0.5 seconds (500 ms); otherwise, stability issues with the live stream may occur. Each message can aggregate metadata from multiple frames if you need to provide frame-level metadata.
-If you are sending multi-bitrate streams, it is recommended that you provide the metadata on a single bitrate only, to reduce bandwidth and avoid interference with video/audio processing.
-
-The payload for the **"onUserDataEvent"** should be an [MPEGDASH] EventStream XML format message. This makes it easy to pass in custom defined schemas that can be carried in 'emsg' payloads in-band for CMAF [MPEGCMAF] content that is delivered over HLS or DASH protocols.
-Each DASH Event Stream message contains a schemeIdUri that functions as a URN message scheme identifier and defines the payload of the message. Some schemes such as "https://aomedia.org/emsg/ID3" for [ID3v2], or **urn:scte:scte35:2013:bin** for [SCTE-35] are standardized by industry consortia for interoperability. Any application provider can define their own custom scheme using a URL that they control (owned domain) and may provide a specification at that URL if they choose. If a player has a handler for the defined scheme, then that is the only component that needs to understand the payload and protocol.
-
-The schema for the [MPEGDASH] EventStream XML payload is defined as follows (excerpt from DASH ISO/IEC 23009-1, 3rd edition).
-Note that only one "EventType" per "EventStream" is supported at this time. Only the first **Event** element will be processed if multiple events are provided in the **EventStream**.
-
-```xml
- <!-- Event Stream -->
- <xs:complexType name="EventStreamType">
- <xs:sequence>
- <xs:element name="Event" type="EventType" minOccurs="0" maxOccurs="unbounded"/>
- <xs:any namespace="##other" processContents="lax" minOccurs="0" maxOccurs="unbounded"/>
- </xs:sequence>
- <xs:attribute ref="xlink:href"/>
- <xs:attribute ref="xlink:actuate" default="onRequest"/>
- <xs:attribute name="schemeIdUri" type="xs:anyURI" use="required"/>
- <xs:attribute name="value" type="xs:string"/>
- <xs:attribute name="timescale" type="xs:unsignedInt"/>
- </xs:complexType>
- <!-- Event -->
- <xs:complexType name="EventType">
- <xs:sequence>
- <xs:any namespace="##other" processContents="lax" minOccurs="0" maxOccurs="unbounded"/>
- </xs:sequence>
- <xs:attribute name="presentationTime" type="xs:unsignedLong" default="0"/>
- <xs:attribute name="duration" type="xs:unsignedLong"/>
- <xs:attribute name="id" type="xs:unsignedInt"/>
- <xs:attribute name="contentEncoding" type="ContentEncodingType"/>
- <xs:attribute name="messageData" type="xs:string"/>
- <xs:anyAttribute namespace="##other" processContents="lax"/>
- </xs:complexType>
-```
-
-### Example XML Event Stream with ID3 schema ID and base64-encoded data payload.
-```xml
- <?xml version="1.0" encoding="UTF-8"?>
- <EventStream schemeIdUri="https://aomedia.org/emsg/ID3">
- <Event contentEncoding="Base64">
- -- base64 encoded ID3v2 full payload here per [CMAF-ID3] --
- </Event>
- </EventStream>
-```
-
-### Example Event Stream with custom schema ID and base64-encoded binary data
-```xml
- <?xml version="1.0" encoding="UTF-8"?>
- <EventStream schemeIdUri="urn:example.org:custom:binary">
- <Event contentEncoding="Base64">
- -- base64 encoded custom binary data message --
- </Event>
- </EventStream>
-```
-
-### Example Event Stream with custom schema ID and custom JSON
-```xml
- <?xml version="1.0" encoding="UTF-8"?>
- <EventStream schemeIdUri="urn:example.org:custom:JSON">
- <Event>
- [
- {"key1" : "value1"},
- {"key2" : "value2"}
- ]
- </Event>
- </EventStream>
-```
-
-### Built-in supported Scheme ID URIs
-| Scheme ID URI | Description |
-| -- | - |
-| https:\//aomedia.org/emsg/ID3 | Describes how [ID3v2] metadata can be carried as timed metadata in a CMAF-compatible [MPEGCMAF] fragmented MP4. For more information see the [Timed Metadata in the Common Media Application Format (CMAF)](https://github.com/AOMediaCodec/id3-emsg) |
-
-### Event processing and manifest signaling
-
-On receipt of a valid **"onUserDataEvent"** event, Azure Media Services will look for a valid XML payload that matches the EventStreamType (defined in [MPEGDASH]), parse the XML payload, and convert it into an [MPEGCMAF] MP4 fragment 'emsg' version 1 box for storage in the live archive and transmission to the Media Services Packager. The Packager will detect the 'emsg' box in the live stream and:
-
-- (a) "dynamically package" it into TS segments for delivery to HLS clients in compliance with the HLS timed metadata specification [HLS-TMD], or
-- (b) pass it through for delivery in CMAF fragments via HLS or DASH, or
-- (c) convert it into a sparse track signal for delivery via Smooth Streaming [MS-SSTR].
-
-In addition to the in-band 'emsg' format CMAF or TS PES packets for HLS, the manifests for DASH (MPD), and Smooth Streaming will contain a reference to the in-band event streams (also known as sparse stream track in Smooth Streaming).
-
-Individual events or their data payloads are NOT output directly in the HLS, DASH, or Smooth manifests.
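
For illustration, the following Python sketch packs an 'emsg' version 1 box using the field layout defined in [MPEGDASH] (ISO/IEC 23009-1). The helper name and sample values are illustrative only; they are not part of any Media Services API.

```python
import struct

def build_emsg_v1(scheme_id_uri: str, value: str, timescale: int,
                  presentation_time: int, duration: int, event_id: int,
                  message_data: bytes) -> bytes:
    """Pack an 'emsg' version 1 box per ISO/IEC 23009-1 (illustrative sketch)."""
    body = struct.pack(">BBBB", 1, 0, 0, 0)            # version=1, flags=0
    body += struct.pack(">I", timescale)               # timescale (32-bit)
    body += struct.pack(">Q", presentation_time)       # presentation_time (64-bit)
    body += struct.pack(">II", duration, event_id)     # event_duration, id
    body += scheme_id_uri.encode("utf-8") + b"\x00"    # null-terminated string
    body += value.encode("utf-8") + b"\x00"            # null-terminated string
    body += message_data                               # opaque event payload
    return struct.pack(">I", 8 + len(body)) + b"emsg" + body

# Example: a 30-second ID3 event at a 10 MHz timescale (values follow the MPD
# example later in this document); the payload bytes here are a placeholder.
box = build_emsg_v1("https://aomedia.org/emsg/ID3", "", 10000000,
                    1583497601000000, 300000000, 1085900, b"ID3")
```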
-
-### Additional informational constraints and defaults for onUserDataEvent events
-
-- If the timescale is not set in the EventStream element, the RTMP 1 kHz timescale is used by default.
-- Delivery of an onUserDataEvent message is limited to a maximum of once every 500 ms. If you send events more frequently, it can impact the bandwidth and the stability of the live feed.
-
-## 2.1.2 RTMP ad cue signaling with "onAdCue"
-
-Azure Media Services can listen and respond to several [AMF0] message types which can be used to signal various real time synchronized metadata in the live stream. The [Adobe-Primetime] specification defines two cue types called "simple" and "SCTE-35" mode. For "simple" mode, Media Services supports a single AMF cue message called "onAdCue" using a payload that matches the table below defined for the "Simple Mode" signal.
-
-The following section shows the RTMP "simple" mode payload, which can be used to signal a basic "spliceOut" ad signal that will be carried through to the client manifest for HLS, DASH, and Microsoft Smooth Streaming. This is very useful for scenarios where the customer does not have a complex SCTE-35 based ad signaling deployment or insertion system, and is using a basic on-premises encoder to send in the cue message via an API. Typically, the on-premises encoder will support a REST-based API to trigger this signal, which will also "splice-condition" the video stream by inserting an IDR frame into the video and starting a new GOP.
-
-## 2.1.3 RTMP ad cue signaling with "onAdCue" - Simple Mode
-
-| Field Name | Field Type | Required? | Descriptions |
-| --- | --- | --- | --- |
-| type | String | Required | The event message. Shall be "SpliceOut" to designate a simple mode splice. |
-| id | String | Required | A unique identifier describing the splice or segment. Identifies this instance of the message |
-| duration | Number | Required | The duration of the splice. Units are fractional seconds. |
-| elapsed | Number | Optional | When the signal is being repeated in order to support tune-in, this field shall be the amount of presentation time that has elapsed since the splice began. Units are fractional seconds. When using simple mode, this value should not exceed the original duration of the splice. |
-| time | Number | Required | Shall be the time of the splice, in presentation time. Units are fractional seconds. |
-
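On the wire, the cue is an [AMF0]-encoded command named "onAdCue". Conceptually, the simple-mode field set looks like the following Python sketch (field values are taken from the HLS playlist example later in this section; the dictionary form is illustrative and is not an AMF0 encoding):

```python
# Conceptual "simple" mode onAdCue field set; the actual wire format is an
# AMF0-encoded command named "onAdCue" per [Adobe-Primetime].
simple_mode_cue = {
    "type": "SpliceOut",        # designates a simple mode splice
    "id": "95766",              # unique identifier for this splice
    "duration": 30.0,           # fractional seconds
    "time": 158348769.966667,   # presentation time, fractional seconds
    # "elapsed" is added when the cue is repeated to support tune-in
}
```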
-#### Example MPEG DASH manifest output when using Adobe RTMP simple mode
-
-See example [3.3.2.1 MPEG DASH .mpd EventStream using Adobe simple mode](#3321-example-mpeg-dash-mpd-manifest-signaling-of-rtmp-streaming-using-adobe-simple-mode)
-
-See example [3.3.3.1 DASH manifest with single period and Adobe simple mode](#3331-example-mpeg-dash-manifest-mpd-with-single-period-eventstream-using-adobe-simple-mode-signals)
-
-#### Example HLS manifest output when using Adobe RTMP simple mode
-
-See example [3.2.2 HLS manifest using Adobe simple mode and EXT-X-CUE tag](#322-apple-hls-with-adobe-primetime-ext-x-cue)
-
-## 2.1.4 RTMP ad cue signaling with "onAdCue" - SCTE-35 Mode
-
-When you are working with a more advanced broadcast production workflow that requires the full SCTE-35 payload message to be carried through to the HLS or DASH manifest, it is best to use the "SCTE-35 Mode" of the [Adobe-Primetime] specification. This mode supports in-band SCTE-35 signals being sent directly into an on-premises live encoder, which then encodes the signals out into the RTMP stream using the "SCTE-35 Mode" specified in the [Adobe-Primetime] specification.
-
-Typically SCTE-35 messages can appear only in MPEG-2 transport stream (TS) inputs on an on-premises encoder. Check with your encoder manufacturer for details on how to configure a transport stream ingest that contains SCTE-35 and enable it for pass-through to RTMP in Adobe SCTE-35 mode.
-
-In this scenario, the following payload MUST be sent from the on-premises encoder using the **"onAdCue"** [AMF0] message type.
-
-| Field Name | Field Type | Required? | Descriptions |
-| --- | --- | --- | --- |
-| cue | String | Required | The event message. For [SCTE-35] messages, this MUST be the base64-encoded [RFC4648] binary splice_info_section() in order for messages to be sent to HLS, Smooth, and Dash clients. |
-| type | String | Required | A URN or URL identifying the message scheme. For [SCTE-35] messages, this **SHOULD** be **"scte35"** in order for messages to be sent to HLS, Smooth, and Dash clients, in compliance with [Adobe-Primetime]. Optionally, the URN "urn:scte:scte35:2013:bin" may also be used to signal a [SCTE-35] message. |
-| id | String | Required | A unique identifier describing the splice or segment. Identifies this instance of the message. Messages with equivalent semantics shall have the same value. |
-| duration | Number | Required | The duration of the event or ad splice-segment, if known. If unknown, the value **SHOULD** be 0. |
-| elapsed | Number | Optional | When the [SCTE-35] ad signal is being repeated in order to tune in, this field shall be the amount of presentation time that has elapsed since the splice began. Units are fractional seconds. In [SCTE-35] mode, this value may exceed the original specified duration of the splice or segment. |
-| time | Number | Required | The presentation time of the event or ad splice. The presentation time and duration **SHOULD** align with Stream Access Points (SAP) of type 1 or 2, as defined in [ISO-14496-12] Annex I. For HLS egress, time and duration **SHOULD** align with segment boundaries. The presentation time and duration of different event messages within the same event stream **MUST NOT** overlap. Units are fractional seconds. |
-
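As the table above notes, the `cue` field carries the binary splice_info_section() base64-encoded per [RFC4648]. The following Python sketch shows the encoding step, reusing the CUE value from the HLS manifest examples in this document (the dictionary form is illustrative, not an AMF0 encoding):

```python
import base64

# Binary splice_info_section() received from the upstream SCTE-35 source;
# reconstructed here from the base64 CUE value used in the examples below.
splice_info_section = base64.b64decode(
    "/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==")

scte35_mode_cue = {
    "cue": base64.b64encode(splice_info_section).decode("ascii"),  # [RFC4648]
    "type": "scte35",
    "id": "1002",
    "duration": 59.993278,    # fractional seconds
    "time": 259.509244,       # presentation time, fractional seconds
}
```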
-<!--
-#### Example MPEG DASH .mpd manifest with SCTE-35 mode
-See [Section 3.3.3.2 example DASH manifest with SCTE-35](#3332-example-mpeg-dash-manifest-mpd-with-multi-period-eventstream-using-adobe-scte35-mode-signaling)
--->
-
-#### Example HLS manifest .m3u8 with SCTE-35 mode signal
-See [Section 3.2.1.1 example HLS manifest with SCTE-35](#3211-example-hls-manifest-m3u8-showing-ext-x-cue-signaling-of-scte-35)
-
-## 2.1.5 RTMP Ad signaling with "onCuePoint" for Elemental Live
-
-The Elemental Live on-premises encoder supports ad markers in the RTMP signal. Azure Media Services currently only supports the "onCuePoint" ad marker type for RTMP. This can be enabled in the Adobe RTMP Group Settings in the Elemental Live encoder settings or API by setting "**ad_markers**" to "onCuePoint". Refer to the Elemental Live documentation for details.
-Enabling this feature in the RTMP Group will pass SCTE-35 signals to the Adobe RTMP outputs to be processed by Azure Media Services.
-
-The "onCuePoint" message type is defined in [Adobe-Flash-AS] and has the following payload structure when sent from the Elemental Live RTMP output.
-
-| Property | Description |
-| - | - |
-| name | The name SHOULD be set to '**scte35**' by Elemental Live. |
-| time | The time, in seconds, at which the cue point occurs on the video timeline. |
-| type | The type of cue point SHOULD be set to "**event**". |
-| parameters | An associative array of name/value pair strings containing the information from the SCTE-35 message, including Id and duration. These values are parsed out by Azure Media Services and included in the manifest decoration tag. |
-
-When this mode of ad marker is used, the HLS manifest output is similar to Adobe "Simple" mode.
-
-#### Example MPEG DASH MPD, single period, Adobe Simple mode signals
-
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<MPD xmlns="urn:mpeg:dash:schema:mpd:2011"
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="dynamic" publishTime="2020-01-07T18:58:03Z" minimumUpdatePeriod="PT0S" timeShiftBufferDepth="PT58M56S" availabilityStartTime="2020-01-07T17:44:47Z" minBufferTime="PT7S">
- <Period start="PT0S">
- <EventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="scte35" timescale="10000000">
- <Event presentationTime="1583497601000000" duration="300000000" id="1085900"/>
- <Event presentationTime="1583500901666666" duration="300000000" id="1415966"/>
- <Event presentationTime="1583504202333333" duration="300000000" id="1746033"/>
- <Event presentationTime="1583507502666666" duration="300000000" id="2076066"/>
- <Event presentationTime="1583510803333333" duration="300000000" id="2406133"/>
- <Event presentationTime="1583514104000000" duration="300000000" id="2736200"/>
- <Event presentationTime="1583517404666666" duration="300000000" id="3066266"/>
- <Event presentationTime="1583520705333333" duration="300000000" id="3396333"/>
- <Event presentationTime="1583524006000000" duration="300000000" id="3726400"/>
- <Event presentationTime="1583527306666666" duration="300000000" id="4056466"/>
- <Event presentationTime="1583530607333333" duration="300000000" id="4386533"/>
- </EventStream>
- <AdaptationSet id="1" group="1" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="video" mimeType="video/mp4" codecs="avc1.4D400C" maxWidth="256" maxHeight="144" startWithSAP="1">
- <InbandEventStream schemeIdUri="urn:mpeg:dash:event:2012" value="1"/>
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="scte35"/>
- <SegmentTemplate timescale="10000000" presentationTimeOffset="1583486678426666" media="QualityLevels($Bandwidth$)/Fragments(video=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(video=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="1583495318000000" d="64000000" r="34"/>
- <S d="43000000"/>
- <S d="21000000"/>
- <!-- ... Truncated for brevity of sample-->
-
- </SegmentTimeline>
- </SegmentTemplate>
- <ProducerReferenceTime id="1583495318000000" type="0" wallClockTime="2020-01-07T17:59:10.957Z" presentationTime="1583495318000000"/>
- <Representation id="1_V_video_3750956353252827751" bandwidth="149952" width="256" height="144"/>
- </AdaptationSet>
- <AdaptationSet id="2" group="5" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="audio" mimeType="audio/mp4" codecs="mp4a.40.2" lang="en">
- <InbandEventStream schemeIdUri="urn:mpeg:dash:event:2012" value="1"/>
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="scte35"/>
- <Label>ambient</Label>
- <SegmentTemplate timescale="10000000" presentationTimeOffset="1583486678426666" media="QualityLevels($Bandwidth$)/Fragments(ambient=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(ambient=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="1583495254426666" d="64000000" r="35"/>
- <S d="43093334"/>
- <S d="20906666"/>
- <!-- ... Truncated for brevity of sample-->
-
- </SegmentTimeline>
- </SegmentTemplate>
- <ProducerReferenceTime id="1583495254426666" type="0" wallClockTime="2020-01-07T17:59:04.600Z" presentationTime="1583495254426666"/>
- <Representation id="5_A_ambient_9125670592623055209" bandwidth="96000" audioSamplingRate="48000"/>
- </AdaptationSet>
- </Period>
-</MPD>
-```
-
-#### Example HLS playlist, Adobe Simple mode signals using EXT-X-CUE tag (truncated "..." for brevity)
-
-The following example shows the output from the Media Services dynamic packager for an RTMP ingest stream using Adobe "simple" mode signals and the legacy [Adobe-Primetime] EXT-X-CUE tag.
-
-```
-#EXTM3U
-#EXT-X-VERSION:8
-#EXT-X-MEDIA-SEQUENCE:0
-#EXT-X-TARGETDURATION:7
-#EXT-X-INDEPENDENT-SEGMENTS
-#EXT-X-PROGRAM-DATE-TIME:2020-01-07T17:44:47Z
-#EXTINF:6.400000,no-desc
-Fragments(video=1583486742000000,format=m3u8-aapl-v8)
-#EXTINF:6.400000,no-desc
-Fragments(video=1583486806000000,format=m3u8-aapl-v8)
-...
-#EXTINF:6.166667,no-desc
-Fragments(video=1583487638000000,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID=95766,TYPE="SpliceOut",DURATION=30.000000,TIME=158348769.966667
-#EXTINF:0.233333,no-desc
-Fragments(video=1583487699666666,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID=95766,TYPE="SpliceOut",DURATION=30.000000,TIME=158348769.966667,ELAPSED=0.233333
-#EXTINF:6.400000,no-desc
-Fragments(video=1583487702000000,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID=95766,TYPE="SpliceOut",DURATION=30.000000,TIME=158348769.966667,ELAPSED=6.633333
-#EXTINF:6.400000,no-desc
-Fragments(video=1583487766000000,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID=95766,TYPE="SpliceOut",DURATION=30.000000,TIME=158348769.966667,ELAPSED=13.033333
-#EXTINF:6.400000,no-desc
-Fragments(video=1583487830000000,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID=95766,TYPE="SpliceOut",DURATION=30.000000,TIME=158348769.966667,ELAPSED=19.433333
-#EXTINF:6.400000,no-desc
-Fragments(video=1583487894000000,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID=95766,TYPE="SpliceOut",DURATION=30.000000,TIME=158348769.966667,ELAPSED=25.833333
-#EXTINF:4.166667,no-desc
-Fragments(video=1583487958000000,format=m3u8-aapl-v8)
-#EXTINF:2.233333,no-desc
-Fragments(video=1583487999666666,format=m3u8-aapl-v8)
-#EXTINF:6.400000,no-desc
-Fragments(video=1583488022000000,format=m3u8-aapl-v8)
-...
-```
-
-## 2.1.6 Cancellation and Updates
-
-Messages can be canceled or updated by sending multiple messages with the same
-presentation time and ID. The presentation time and ID uniquely identify the
-event, and the last message received for a specific presentation time that meets
-pre-roll constraints is the message that is acted upon. The updated event replaces any
-previously received messages. The pre-roll constraint is four seconds. Messages
-received at least four seconds prior to the presentation time will be acted upon.
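
The rule can be sketched in Python as follows (illustrative only; it assumes arrival and presentation times are expressed on the same clock, in seconds):

```python
PRE_ROLL_SECONDS = 4.0
pending = {}  # (presentation_time, id) -> last accepted message

def on_cue_message(msg: dict, arrival_time: float) -> None:
    """Keep the last message received for a (time, id) pair, provided it
    arrives at least PRE_ROLL_SECONDS before its presentation time."""
    if msg["time"] - arrival_time >= PRE_ROLL_SECONDS:
        pending[(msg["time"], msg["id"])] = msg  # update/cancel replaces prior messages
```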
-
-## 2.2 Fragmented MP4 Ingest (Smooth Streaming)
-
-Refer to [MS-SSTR-Ingest] for requirements on live stream ingest. The following sections provide details regarding ingest of timed presentation metadata. Timed presentation metadata is ingested as a sparse track, which is defined in both the Live Server Manifest Box (see [MS-SSTR-Ingest]) and the Movie Box ('moov').
-
-Each sparse fragment consists of a Movie Fragment Box ('moof') and Media Data Box ('mdat'), where the 'mdat' box is the binary message.
-
-In order to achieve frame-accurate insertion of ads, the encoder MUST split the fragment at the presentation time where the cue is required to be inserted. A new fragment MUST be created that begins with a newly created IDR frame, or Stream Access Points (SAP) of type 1 or 2, as defined in [ISO-14496-12] Annex I.
-<!-- This allows the Azure Media Packager to properly generate an HLS manifest and a DASH multi-period manifest where the new Period begins at the frame-accurate splice conditioned presentation time. -->
-
-### 2.2.1 Live Server Manifest Box
-
-The sparse track **MUST** be declared in the Live Server Manifest box with a
-**\<textstream\>** entry and it **MUST** have the following attributes set:
-
-| **Attribute Name** | **Field Type** | **Required?** | **Description** |
-| --- | --- | --- | --- |
-| systemBitrate | Number | Required | **MUST** be "0", indicating a track with unknown, variable bitrate. |
-| parentTrackName | String | Required | **MUST** be the name of the parent track, to which the sparse track time codes are timescale aligned. The parent track cannot be a sparse track. |
-| manifestOutput | Boolean | Required | **MUST** be "true", to indicate that the sparse track will be embedded in the Smooth client manifest. |
-| Subtype | String | Required | **MUST** be the four character code "DATA". |
-| Scheme | String | Required | **MUST** be a URN or URL identifying the message scheme. For [SCTE-35] messages, this **MUST** be "urn:scte:scte35:2013:bin" in order for messages to be sent to HLS, Smooth, and Dash clients in compliance with [SCTE-35]. |
-| trackName | String | Required | **MUST** be the name of the sparse track. The trackName can be used to differentiate multiple event streams with the same scheme. Each unique event stream **MUST** have a unique track name. |
-| timescale | Number | Optional | **MUST** be the timescale of the parent track. |
-
-### 2.2.2 Movie Box
-
-The Movie Box ('moov') follows the Live Server Manifest Box as part of the
-stream header for a sparse track.
-
-The 'moov' box **SHOULD** contain a **TrackHeaderBox ('tkhd')** box as defined in
-[ISO-14496-12] with the following constraints:
-
-| **Field Name** | **Field Type** | **Required?** | **Description** |
-| --- | --- | --- | --- |
-| duration | 64-bit unsigned integer | Required | **SHOULD** be 0, since the track box has zero samples and the total duration of the samples in the track box is 0. |
-
-The 'moov' box **SHOULD** contain a **HandlerBox ('hdlr')** as defined in
-[ISO-14496-12] with the following constraints:
-
-| **Field Name** | **Field Type** | **Required?** | **Description** |
-| --- | --- | --- | --- |
-| handler_type | 32-bit unsigned integer | Required | **SHOULD** be 'meta'. |
-
-The 'stsd' box **SHOULD** contain a MetaDataSampleEntry box with a coding name as defined in [ISO-14496-12]. For example, for SCTE-35 messages the coding name **SHOULD** be 'scte'.
-
-### 2.2.3 Movie Fragment Box and Media Data Box
-
-Sparse track fragments consist of a Movie Fragment Box ('moof') and a Media Data
-Box ('mdat').
-
-> [!NOTE]
-> In order to achieve frame-accurate insertion of ads, the encoder MUST split the fragment at the presentation time where the cue is
-> required to be inserted. A new fragment MUST be created that begins with a newly created IDR frame, or Stream Access Points (SAP) of
-> type 1 or 2, as defined in [ISO-14496-12] Annex I.
->
-
-The MovieFragmentBox ('moof') box **MUST** contain a
-**TrackFragmentExtendedHeaderBox ('uuid')** box as defined in [MS-SSTR] with the
-following fields:
-
-| **Field Name** | **Field Type** | **Required?** | **Description** |
-| --- | --- | --- | --- |
-| fragment_absolute_time | 64-bit unsigned integer | Required | **MUST** be the arrival time of the event. This value aligns the message with the parent track. |
-| fragment_duration | 64-bit unsigned integer | Required | **MUST** be the duration of the event. The duration can be zero to indicate that the duration is unknown. |
-
-The MediaDataBox ('mdat') box **MUST** have the following format:
-
-| **Field Name** | **Field Type** | **Required?** | **Description** |
-| --- | --- | --- | --- |
-| version | 32-bit unsigned integer (uimsbf) | Required | Determines the format of the contents of the 'mdat' box. Unrecognized versions will be ignored. Currently the only supported version is 1. |
-| id | 32-bit unsigned integer (uimsbf) | Required | Identifies this instance of the message. Messages with equivalent semantics shall have the same value; that is, processing any one event message box with the same id is sufficient. |
-| presentation_time_delta | 32-bit unsigned integer (uimsbf) | Required | The sum of the fragment_absolute_time, specified in the TrackFragmentExtendedHeaderBox, and the presentation_time_delta **MUST** be the presentation time of the event. The presentation time and duration **SHOULD** align with Stream Access Points (SAP) of type 1 or 2, as defined in [ISO-14496-12] Annex I. For HLS egress, time and duration **SHOULD** align with segment boundaries. The presentation time and duration of different event messages within the same event stream **MUST NOT** overlap. |
-| message | byte array | Required | The event message. For [SCTE-35] messages, the message MUST be the binary splice_info_section() in order for messages to be sent to HLS, Smooth, and DASH clients in compliance with [SCTE-35]. The binary splice_info_section() is the payload of the 'mdat' box, and it is **NOT** base64 encoded. |
-
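A minimal Python sketch of packing this 'mdat' payload, with the big-endian (uimsbf) fields from the table above (the helper name is illustrative):

```python
import struct

def build_sparse_mdat(event_id: int, presentation_time_delta: int,
                      message: bytes) -> bytes:
    """Pack a sparse-track 'mdat' box: version=1, id, delta, then the message."""
    body = struct.pack(">III", 1, event_id, presentation_time_delta) + message
    return struct.pack(">I", 8 + len(body)) + b"mdat" + body

# For [SCTE-35], `message` is the raw binary splice_info_section(), NOT base64.
```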
-### 2.2.4 Cancellation and Updates
-
-Messages can be canceled or updated by sending multiple messages with the same presentation time and ID. The presentation time and ID uniquely identify the event. The last message received for a specific presentation time, that meets pre-roll constraints, is the message that is acted upon. The updated message replaces any previously received messages. The pre-roll constraint is four seconds. Messages received at least four seconds prior to the presentation time will be acted upon.
-
-## 3 Timed Metadata Delivery
-
-Event stream data is opaque to Media Services. Media Services merely passes the information between the ingest endpoint and the client endpoint. The following properties are delivered to the client, in compliance with [SCTE-35] and/or the client's streaming protocol:
-
-1. Scheme – a URN or URL identifying the scheme of the message.
-2. Presentation Time – the presentation time of the event on the media timeline.
-3. Duration – the duration of the event.
-4. ID – an optional unique identifier for the event.
-5. Message – the event data.
-
-## 3.1 Microsoft Smooth Streaming Manifest
-
-Refer to sparse track handling [MS-SSTR] for details on how to format a sparse message track.
-For [SCTE-35] messages, Smooth Streaming will output the base64-encoded splice_info_section() into a sparse fragment.
-The StreamIndex **MUST** have a Subtype of "DATA", and the CustomAttributes **MUST** contain an Attribute with Name="Scheme" and Value="urn:scte:scte35:2013:bin".
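As an illustration, a client could locate the sparse track in the client manifest and decode its payloads as in the following Python sketch (the element names match the example manifest below; the file name is hypothetical):

```python
import base64
import xml.etree.ElementTree as ET

root = ET.parse("manifest.ismc").getroot()   # hypothetical local manifest copy
for stream in root.iter("StreamIndex"):
    if stream.get("Subtype") != "DATA":
        continue                             # sparse metadata tracks use Subtype="DATA"
    for frag in stream.iter("f"):
        payload = base64.b64decode(frag.text)  # decoded event payload bytes
```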
-
-#### Smooth Client Manifest Example showing base64-encoded [SCTE-35] splice_info_section()
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<SmoothStreamingMedia MajorVersion="2" MinorVersion="0" TimeScale="10000000" IsLive="true" Duration="0"
- LookAheadFragmentCount="2" DVRWindowLength="6000000000">
-
- <StreamIndex Type="video" Name="video" Subtype="" Chunks="0" TimeScale="10000000"
- Url="QualityLevels({bitrate})/Fragments(video={start time})">
- <QualityLevel Index="0" Bitrate="230000"
- CodecPrivateData="250000010FC3460B50878A0B5821FF878780490800704704DC0000010E5A67F840" FourCC="WVC1"
- MaxWidth="364" MaxHeight="272"/>
- <QualityLevel Index="1" Bitrate="305000"
- CodecPrivateData="250000010FC3480B50878A0B5821FF87878049080894E4A7640000010E5A67F840" FourCC="WVC1"
- MaxWidth="364" MaxHeight="272"/>
- <c t="0" d="20000000" r="300" />
- </StreamIndex>
- <StreamIndex Type="audio" Name="audio" Subtype="" Chunks="0" TimeScale="10000000"
- Url="QualityLevels({bitrate})/Fragments(audio={start time})">
- <QualityLevel Index="0" Bitrate="96000" CodecPrivateData="1000030000000000000000000000E00042C0"
- FourCC="WMAP" AudioTag="354" Channels="2" SamplingRate="44100" BitsPerSample="16" PacketSize="4459"/>
- <c t="0" d="20000000" r="300" />
- </StreamIndex>
- <StreamIndex Type="text" Name="scte35-sparse-stream" Subtype="DATA" Chunks="0" TimeScale="10000000"
- ParentStreamIndex="video" ManifestOutput="true"
- Url="QualityLevels({bitrate})/Fragments(captions={start time})">
- <QualityLevel Index="0" Bitrate="0" CodecPrivateData="" FourCC="">
- <CustomAttributes>
- <Attribute Name="Scheme" Value="urn:scte:scte35:2013:bin"/>
- </CustomAttributes>
- </QualityLevel>
- <!-- The following <c> and <f> fragments contains the base64-encoded [SCTE35] splice_info_section() message -->
- <c t="600000000" d="300000000"> <f>PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz48QWNxdWlyZWRTaWduYWwgeG1sbnM9InVybjpjYWJsZWxhYnM6bWQ6eHNkOnNpZ25hbGluZzozLjAiIGFjcXVpc2l0aW9uUG9pbnRJZGVudGl0eT0iRVNQTl9FYXN0X0FjcXVpc2l0aW9uX1BvaW50XzEiIGFjcXVpc2l0aW9uU2lnbmFsSUQ9IjRBNkE5NEVFLTYyRkExMUUxQjFDQTg4MkY0ODI0MDE5QiIgYWNxdWlzaXRpb25UaW1lPSIyMDEyLTA5LTE4VDEwOjE0OjI2WiI+PFVUQ1BvaW50IHV0Y1BvaW50PSIyMDEyLTA5LTE4VDEwOjE0OjM0WiIvPjxTQ1RFMzVQb2ludERlc2NyaXB0b3Igc3BsaWNlQ29tbWFuZFR5cGU9IjUiPjxTcGxpY2VJbnNlcnQgc3BsaWNlRXZlbnRJRD0iMzQ0NTY4NjkxIiBvdXRPZk5ldHdvcmtJbmRpY2F0b3I9InRydWUiIHVuaXF1ZVByb2dyYW1JRD0iNTUzNTUiIGR1cmF0aW9uPSJQVDFNMFMiIGF2YWlsTnVtPSIxIiBhdmFpbHNFeHBlY3RlZD0iMTAiLz48L1NDVEUzNVBvaW50RGVzY3JpcHRvcj48U3RyZWFtVGltZXM+PFN0cmVhbVRpbWUgdGltZVR5cGU9IkhTUyIgdGltZVZhbHVlPSI1MTUwMDAwMDAwMDAiLz48L1N0cmVhbVRpbWVzPjwvQWNxdWlyZWRTaWduYWw+</f>
- </c>
- <c t="1200000000" d="400000000"> <f>PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz48QWNxdWlyZWRTaWduYWwgeG1sbnM9InVybjpjYWJsZWxhYnM6bWQ6eHNkOnNpZ25hbGluZzozLjAiIGFjcXVpc2l0aW9uUG9pbnRJZGVudGl0eT0iRVNQTl9FYXN0X0FjcXVpc2l0aW9uX1BvaW50XzEiIGFjcXVpc2l0aW9uU2lnbmFsSUQ9IjRBNkE5NEVFLTYyRkExMUUxQjFDQTg4MkY0ODI0MDE5QiIgYWNxdWlzaXRpb25UaW1lPSIyMDEyLTA5LTE4VDEwOjE0OjI2WiI+PFVUQ1BvaW50IHV0Y1BvaW50PSIyMDEyLTA5LTE4VDEwOjE0OjM0WiIvPjxTQ1RFMzVQb2ludERlc2NyaXB0b3Igc3BsaWNlQ29tbWFuZFR5cGU9IjUiPjxTcGxpY2VJbnNlcnQgc3BsaWNlRXZlbnRJRD0iMzQ0NTY4NjkxIiBvdXRPZk5ldHdvcmtJbmRpY2F0b3I9InRydWUiIHVuaXF1ZVByb2dyYW1JRD0iNTUzNTUiIGR1cmF0aW9uPSJQVDFNMFMiIGF2YWlsTnVtPSIxIiBhdmFpbHNFeHBlY3RlZD0iMTAiLz48L1NDVEUzNVBvaW50RGVzY3JpcHRvcj48U3RyZWFtVGltZXM+PFN0cmVhbVRpbWUgdGltZVR5cGU9IkhTUyIgdGltZVZhbHVlPSI1MTYyMDAwMDAwMDAiLz48L1N0cmVhbVRpbWVzPjwvQWNxdWlyZWRTaWduYWw+</f>
- </c>
- </StreamIndex>
-</SmoothStreamingMedia>
-```
-
-## 3.2 Apple HLS Manifest Decoration
-
-Azure Media Services supports the following HLS manifest tags for signaling ad avail information during a live or on-demand event.
-
-<!-- EXT-X-DATERANGE as defined in Apple HLS [RFC8216] -->
-- EXT-X-CUE as defined in [Adobe-Primetime]
-<!-- This mode is considered "legacy". Customers should adopt the EXT-X-DATERANGE tag when possible. -->
-
-The data output to each tag will vary based on the ingest signal mode used. For example, RTMP ingest with Adobe Simple mode does not contain the full SCTE-35 base64-encoded payload.
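
As an example of client-side handling, the following Python sketch parses an EXT-X-CUE line and recovers the binary splice_info_section() from its base64 CUE attribute (the parsing approach is illustrative, not a full M3U parser):

```python
import base64
import re

def parse_ext_x_cue(line: str) -> dict:
    """Split an #EXT-X-CUE tag into its attributes (illustrative parser)."""
    _, attr_list = line.split(":", 1)
    attrs = {key: value.strip('"')
             for key, value in re.findall(r'([A-Z0-9\-]+)=("[^"]*"|[^,]*)', attr_list)}
    if attrs.get("TYPE") == "scte35" and "CUE" in attrs:
        attrs["splice_info_section"] = base64.b64decode(attrs["CUE"])
    return attrs

cue = parse_ext_x_cue('#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,'
                      'TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw=="')
```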
-
-<!--
-## 3.2.1 Apple HLS with EXT-X-DATERANGE (recommended)
-
-The Apple HTTP Live Streaming [RFC8216] specification allows for signaling of [SCTE-35] messages. The messages are inserted into the segment playlist in an EXT-X-DATERANGE tag per the [RFC8216] section titled "Mapping SCTE-35 into EXT-X-DATERANGE". The client application layer can parse the M3U playlist and process M3U tags, or receive the events through the Apple player framework.
-
-The **RECOMMENDED** approach in Azure Media Services (version 3 API) is to follow [RFC8216] and use the EXT-X-DATERANGE tag for [SCTE-35] ad avail decoration in the manifest.
--->
-
-## 3.2.1.1 Example HLS manifest .m3u8 showing EXT-X-CUE signaling of SCTE-35
-
-The following example HLS manifest output from the Media Services dynamic packager shows the EXT-X-CUE tag for [Adobe-Primetime] in SCTE-35 mode.
-
-```
-#EXTM3U
-#EXT-X-VERSION:8
-#EXT-X-MEDIA-SEQUENCE:0
-#EXT-X-TARGETDURATION:2
-#EXT-X-INDEPENDENT-SEGMENTS
-#EXT-X-PROGRAM-DATE-TIME:2020-01-07T19:40:50Z
-#EXTINF:1.501500,no-desc
-Fragments(video=22567545,format=m3u8-aapl-v8)
-#EXTINF:1.501500,no-desc
-Fragments(video=22702680,format=m3u8-aapl-v8)
-#EXTINF:1.501500,no-desc
-Fragments(video=22837815,format=m3u8-aapl-v8)
-#EXTINF:1.501500,no-desc
-Fragments(video=22972950,format=m3u8-aapl-v8)
-#EXTINF:1.501500,no-desc
-Fragments(video=23108085,format=m3u8-aapl-v8)
-#EXTINF:1.234567,no-desc
-Fragments(video=23243220,format=m3u8-aapl-v8)
-#EXTINF:0.016689,no-desc
-Fragments(video=23354331,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=0.000022
-#EXTINF:0.250244,no-desc
-Fragments(video=23355833,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=0.250267
-#EXTINF:0.850856,no-desc
-Fragments(video=23378355,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=1.101122
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=0.000000,TIME=260.610344,CUE="/DAgAAAAAAXdAP/wDwUAAAPqf0/+AWXk0wABAQEAAGB86Fo="
-#EXTINF:0.650644,no-desc
-Fragments(video=23454932,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=1.751767
-#EXTINF:0.050044,no-desc
-Fragments(video=23513490,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=1.801811
-#EXTINF:1.451456,no-desc
-Fragments(video=23517994,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=3.253267
-#EXTINF:1.501500,no-desc
-Fragments(video=23648625,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=4.754767
-#EXTINF:1.501500,no-desc
-Fragments(video=23783760,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=6.256267
-#EXTINF:1.501500,no-desc
-Fragments(video=23918895,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=7.757767
-#EXTINF:1.501500,no-desc
-Fragments(video=24054030,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=9.259267
-#EXTINF:1.501500,no-desc
-Fragments(video=24189165,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=10.760767
-#EXTINF:1.501500,no-desc
-Fragments(video=24324300,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=12.262267
-#EXTINF:1.501500,no-desc
-Fragments(video=24459435,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=13.763767
-#EXTINF:1.501500,no-desc
-Fragments(video=24594570,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=15.265267
-#EXTINF:1.501500,no-desc
-Fragments(video=24729705,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=16.766767
-#EXTINF:1.501500,no-desc
-Fragments(video=24864840,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=18.268267
-#EXTINF:1.501500,no-desc
-Fragments(video=24999975,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=19.769767
-#EXTINF:1.501500,no-desc
-Fragments(video=25135110,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=21.271267
-#EXTINF:1.501500,no-desc
-Fragments(video=25270245,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=22.772767
-#EXTINF:1.501500,no-desc
-Fragments(video=25405380,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=24.274267
-#EXTINF:1.501500,no-desc
-Fragments(video=25540515,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=25.775767
-#EXTINF:1.501500,no-desc
-Fragments(video=25675650,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=27.277267
-#EXTINF:1.501500,no-desc
-Fragments(video=25810785,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=28.778767
-#EXTINF:1.501500,no-desc
-Fragments(video=25945920,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=30.280267
-#EXTINF:1.501500,no-desc
-Fragments(video=26081055,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=31.781767
-#EXTINF:1.501500,no-desc
-Fragments(video=26216190,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=33.283267
-#EXTINF:1.501500,no-desc
-Fragments(video=26351325,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=34.784767
-#EXTINF:1.501500,no-desc
-Fragments(video=26486460,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=36.286267
-#EXTINF:1.501500,no-desc
-Fragments(video=26621595,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=37.787767
-#EXTINF:1.501500,no-desc
-Fragments(video=26756730,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=39.289267
-#EXTINF:1.501500,no-desc
-Fragments(video=26891865,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=40.790767
-#EXTINF:1.501500,no-desc
-Fragments(video=27027000,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=42.292267
-#EXTINF:1.501500,no-desc
-Fragments(video=27162135,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=43.793767
-#EXTINF:1.501500,no-desc
-Fragments(video=27297270,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=45.295267
-#EXTINF:1.501500,no-desc
-Fragments(video=27432405,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=46.796767
-#EXTINF:1.501500,no-desc
-Fragments(video=27567540,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=48.298267
-#EXTINF:1.501500,no-desc
-Fragments(video=27702675,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=49.799767
-#EXTINF:1.501500,no-desc
-Fragments(video=27837810,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=51.301267
-#EXTINF:1.501500,no-desc
-Fragments(video=27972945,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=52.802767
-#EXTINF:1.501500,no-desc
-Fragments(video=28108080,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=54.304267
-#EXTINF:1.501500,no-desc
-Fragments(video=28243215,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=55.805767
-#EXTINF:1.501500,no-desc
-Fragments(video=28378350,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=57.307267
-#EXTINF:1.501500,no-desc
-Fragments(video=28513485,format=m3u8-aapl-v8)
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=58.808767
-#EXTINF:1.501500,no-desc
-Fragments(video=28648620,format=m3u8-aapl-v8)
-
-```
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=31.781767
-#EXTINF:1.501500,no-desc
-Fragments(video=26216190,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=33.283267
-#EXTINF:1.501500,no-desc
-Fragments(video=26351325,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=34.784767
-#EXTINF:1.501500,no-desc
-Fragments(video=26486460,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=36.286267
-#EXTINF:1.501500,no-desc
-Fragments(video=26621595,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=37.787767
-#EXTINF:1.501500,no-desc
-Fragments(video=26756730,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=39.289267
-#EXTINF:1.501500,no-desc
-Fragments(video=26891865,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=40.790767
-#EXTINF:1.501500,no-desc
-Fragments(video=27027000,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=42.292267
-#EXTINF:1.501500,no-desc
-Fragments(video=27162135,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=43.793767
-#EXTINF:1.501500,no-desc
-Fragments(video=27297270,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=45.295267
-#EXTINF:1.501500,no-desc
-Fragments(video=27432405,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=46.796767
-#EXTINF:1.501500,no-desc
-Fragments(video=27567540,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=48.298267
-#EXTINF:1.501500,no-desc
-Fragments(video=27702675,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=49.799767
-#EXTINF:1.501500,no-desc
-Fragments(video=27837810,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=51.301267
-#EXTINF:1.501500,no-desc
-Fragments(video=27972945,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=52.802767
-#EXTINF:1.501500,no-desc
-Fragments(video=28108080,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=54.304267
-#EXTINF:1.501500,no-desc
-Fragments(video=28243215,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=55.805767
-#EXTINF:1.501500,no-desc
-Fragments(video=28378350,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=57.307267
-#EXTINF:1.501500,no-desc
-Fragments(video=28513485,format=m3u8-aapl-v8)
-#EXT-X-DATERANGE:ID="1002",START-DATE="2020-01-07T19:45:09.509Z",SCTE35-OUT=0xFC30250000000005DD00FFF01405000003EA7FEFFE016461B8FE00526363000101010000F20D5E37
-#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=58.808767
-#EXTINF:1.501500,no-desc
-Fragments(video=28648620,format=m3u8-aapl-v8)
-
-~~~
-
-
-### 3.2.2 Apple HLS with Adobe Primetime EXT-X-CUE
-
-Media Services (version 2 and 3 API) supports the output of the EXT-X-CUE tag as defined in [Adobe-Primetime] "SCTE-35 Mode". In this mode, Azure Media Services embeds the base64-encoded [SCTE-35] splice_info_section() in the EXT-X-CUE tag.
-
-The "legacy" EXT-X-CUE tag is defined as below and also can be normative referenced in the [Adobe-Primetime] specification. This should only be used for legacy SCTE35 signaling where needed, otherwise the recommended tag is defined in [RFC8216] as EXT-X-DATERANGE.
-
-| **Attribute Name** | **Type** | **Required?** | **Description** |
-| --- | --- | --- | --- |
-| CUE | quoted string | Required | The message encoded as a base64-encoded string as described in [RFC4648]. For [SCTE-35] messages, this is the base64-encoded splice_info_section(). |
-| TYPE | quoted string | Required | A URN or URL identifying the message scheme. For [SCTE-35] messages, the type takes the special value "scte35". |
-| ID | quoted string | Required | A unique identifier for the event. If the ID is not specified when the message is ingested, Azure Media Services will generate a unique ID. |
-| DURATION | decimal floating point number | Required | The duration of the event. If unknown, the value **SHOULD** be 0. Units are fractional seconds. |
-| ELAPSED | decimal floating point number | Optional, but required for sliding window | When the signal is being repeated to support a sliding presentation window, this field **MUST** be the amount of presentation time that has elapsed since the event began. Units are fractional seconds. This value may exceed the original specified duration of the splice or segment. |
-| TIME | decimal floating point number | Required | The presentation time of the event. Units are fractional seconds. |
-
-The HLS player application layer will use the TYPE to identify the format of the message, decode the message, apply the necessary time conversions, and process the event. The events are time synchronized in the segment playlist of the parent track, according to the event timestamp. They are inserted before the nearest segment (#EXTINF tag).
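-
-As a non-normative illustration of that player-side handling, the following minimal Python sketch parses an EXT-X-CUE line and base64-decodes the [SCTE-35] payload. The attribute names come from the table above; the helper name and the sample line (taken from the 3.2.1.1 example) are illustrative only.
-
-```python
-import base64
-
-def parse_ext_x_cue(line: str) -> dict:
-    """Parse an #EXT-X-CUE attribute list into a dict (illustrative helper).
-    The naive comma split is safe here because base64 and the attribute
-    values in this tag never contain commas."""
-    attrs = {}
-    for part in line.removeprefix("#EXT-X-CUE:").split(","):  # Python 3.9+
-        key, _, value = part.partition("=")
-        attrs[key] = value.strip('"')
-    return attrs
-
-cue = parse_ext_x_cue(
-    '#EXT-X-CUE:ID="1002",TYPE="scte35",DURATION=59.993278,TIME=259.509244,'
-    'CUE="/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==",ELAPSED=3.253267'
-)
-if cue["TYPE"] == "scte35":
-    splice_info_section = base64.b64decode(cue["CUE"])  # binary splice_info_section()
-    assert splice_info_section[0] == 0xFC               # SCTE-35 table_id
-```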
-
-### 3.2.3 HLS .m3u8 manifest example using Adobe Primetime EXT-X-CUE
-
-The following example shows HLS manifest decoration using the Adobe Primetime EXT-X-CUE tag. Here the EXT-X-CUE tag carries only the TYPE and DURATION timing properties, with no base64-encoded CUE attribute, which means this was an RTMP source using Adobe "simple" mode signaling. If this had been a SCTE-35 mode signal, the tag would also include the base64-encoded binary SCTE-35 payload, as seen in the [3.2.1.1 example](#3211-example-hls-manifest-m3u8-showing-ext-x-daterange-signaling-of-scte-35).
-
-```
-#EXTM3U
-#EXT-X-VERSION:4
-#EXT-X-PLAYLIST-TYPE:VOD
-#EXT-X-ALLOW-CACHE:NO
-#EXT-X-MEDIA-SEQUENCE:0
-#EXT-X-TARGETDURATION:11
-#EXT-X-PROGRAM-DATE-TIME:2019-12-10T09:18:14Z
-#EXTINF:10.010000,no-desc
-Fragments(video=4011540820,format=m3u8-aapl)
-#EXTINF:10.010000,no-desc
-Fragments(video=4011550830,format=m3u8-aapl)
-#EXTINF:10.010000,no-desc
-Fragments(video=4011560840,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000
-#EXTINF:8.008000,no-desc
-Fragments(video=4011570850,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=0.593000
-#EXTINF:4.170000,no-desc
-Fragments(video=4011578858,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=4.763000
-#EXTINF:9.844000,no-desc
-Fragments(video=4011583028,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=14.607000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011592872,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=24.617000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011602882,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=34.627000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011612892,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=44.637000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011622902,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=54.647000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011632912,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=64.657000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011642922,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=74.667000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011652932,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=84.677000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011662942,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=94.687000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011672952,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=104.697000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011682962,format=m3u8-aapl)
-#EXT-X-CUE:ID=4011578265,TYPE="SpliceOut",DURATION=119.987000,TIME=4011578.265000,ELAPSED=114.707000
-#EXTINF:10.010000,no-desc
-Fragments(video=4011692972,format=m3u8-aapl)
-#EXTINF:8.008000,no-desc
-Fragments(video=4011702982,format=m3u8-aapl)
-
-```
-
-### 3.2.4 HLS Message Handling for Adobe Primetime EXT-X-CUE
-
-Events are signaled in the segment playlist of each video and audio track. The
-position of the EXT-X-CUE tag **MUST** always be either immediately before the first
-HLS segment (for splice out or segment start) or immediately after the last HLS
-segment (for splice in or segment end) to which its TIME and DURATION
-attributes refer, as required by [Adobe-Primetime].
-
-When a sliding presentation window is enabled, the EXT-X-CUE tag **MUST** be
-repeated often enough that the splice or segment is always fully described in
-the segment playlist, and the ELAPSED attribute **MUST** be used to indicate the
-amount of time the splice or segment has been active, as required by [Adobe-Primetime].
-
-When a sliding presentation window is enabled, the EXT-X-CUE tags are removed
-from the segment playlist when the media time that they refer to has rolled out
-of the sliding presentation window.
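-
-As a minimal sketch of these two rules (the names are illustrative, not part of the specification):
-
-```python
-def cue_still_needed(cue_time: float, cue_duration: float, window_start: float) -> bool:
-    """Keep repeating the EXT-X-CUE tag while any part of the splice
-    (TIME .. TIME + DURATION) is still inside the sliding window."""
-    return cue_time + cue_duration >= window_start
-
-def elapsed_for(cue_time: float, playlist_position: float) -> float:
-    """ELAPSED is the presentation time that has elapsed since the event began."""
-    return max(0.0, playlist_position - cue_time)
-```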
-
-## 3.3 DASH Manifest Decoration (MPD)
-
-[MPEGDASH] provides three ways to signal events:
-
-1. Events signaled in the MPD EventStream
-2. Events signaled in-band using the Event Message Box ('emsg')
-3. A combination of both 1 and 2
-
-Events signaled in the MPD EventStream are useful for VOD streaming because clients have
-access to all the events as soon as the MPD is downloaded. They are also useful for SSAI signaling, where the downstream SSAI vendor needs to parse the signals from the MPD manifest and insert ad content dynamically. The in-band ('emsg') solution is useful for live streaming, where clients don't need to download the MPD again or there is no SSAI manifest manipulation happening between the client and the origin.
-
-The default behavior of Azure Media Services for DASH is to signal both in the MPD EventStream and in-band using the Event Message Box ('emsg').
-
-Cue messages ingested over [RTMP] or [MS-SSTR-Ingest] are mapped into DASH events, using in-band 'emsg' boxes and/or in-MPD EventStreams.
-
-In-band SCTE-35 signaling for DASH follows the definition and requirements defined in [SCTE-214-3] and also in [DASH-IF-IOP] section 13.12.2 ('SCTE35 Events').
-
-For in-band [SCTE-35] carriage, the Event Message box ('emsg') uses the schemeId = "urn:scte:scte35:2013:bin".
-For MPD manifest decoration the EventStream schemeId uses "urn:scte:scte35:2014:xml+bin". This format is an XML representation of the event which includes a binary base64-encoded output of the complete SCTE-35 message that arrived at ingest.
-
-Normative reference definitions of carriage of [SCTE-35] cue messages in DASH are available in [SCTE-214-1] sec 6.7.4 (MPD) and [SCTE-214-3] sec 7.3.2 (Carriage of SCTE 35 cue messages).
-
-### 3.3.1 MPEG DASH (MPD) EventStream Signaling
-
-Manifest (MPD) decoration of events will be signaled in the MPD using the EventStream element, which appears within the Period element. The schemeId used is "urn:scte:scte35:2014:xml+bin".
-
-> [!NOTE]
-> For brevity purposes [SCTE-35] allows use of the base64-encoded section in Signal.Binary element (rather than the
-> Signal.SpliceInfoSection element) as an alternative to
-> carriage of a completely parsed cue message.
-> Azure Media Services uses this 'xml+bin' approach to signaling in the MPD manifest.
-> This is also the recommended method used in the [DASH-IF-IOP] - see section titled ['Ad insertion event streams' of the DASH IF IOP guideline](https://dashif-documents.azurewebsites.net/DASH-IF-IOP/master/DASH-IF-IOP.html#ads-insertion-event-streams)
->
-
-The EventStream element has the following attributes:
-
-| **Attribute Name** | **Type** | **Required?** | **Description** |
-| --- | --- | --- | --- |
-| schemeIdUri | string | Required | Identifies the scheme of the message. The scheme is set to the value of the Scheme attribute in the Live Server Manifest box. The value **SHOULD** be a URN or URL identifying the message scheme. The supported output schemeIdUri is "urn:scte:scte35:2014:xml+bin" per [SCTE-214-1] sec 6.7.4 (MPD), as the service supports only "xml+bin" at this time for brevity in the MPD. |
-| value | string | Optional | An additional string value used by the owners of the scheme to customize the semantics of the message. To differentiate multiple event streams with the same scheme, the value **MUST** be set to the name of the event stream (trackName for [MS-SSTR-Ingest] or AMF message name for [RTMP] ingest). |
-| timescale | 32-bit unsigned integer | Required | The timescale, in ticks per second. |
-
-### 3.3.2 Example Event Streams for MPEG DASH
-
-#### 3.3.2.1 Example MPEG DASH .mpd manifest signaling of RTMP streaming using Adobe simple mode
-
-The following example shows an excerpt of the EventStream element produced by the Media Services dynamic packager for an RTMP stream using Adobe "simple" mode signaling.
-
-```xml
-<!-- Example EventStream element using "urn:com:adobe:dpi:simple:2015" Adobe simple signaling per [Adobe-Primetime] -->
- <EventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="simplesignal" timescale="10000000">
- <Event presentationTime="1583497601000000" duration="300000000" id="1085900"/>
- <Event presentationTime="1583500901666666" duration="300000000" id="1415966"/>
- <Event presentationTime="1583504202333333" duration="300000000" id="1746033"/>
- <Event presentationTime="1583507502666666" duration="300000000" id="2076066"/>
- <Event presentationTime="1583510803333333" duration="300000000" id="2406133"/>
- <Event presentationTime="1583514104000000" duration="300000000" id="2736200"/>
- <Event presentationTime="1583517404666666" duration="300000000" id="3066266"/>
- <Event presentationTime="1583520705333333" duration="300000000" id="3396333"/>
- <Event presentationTime="1583524006000000" duration="300000000" id="3726400"/>
- <Event presentationTime="1583527306666666" duration="300000000" id="4056466"/>
- <Event presentationTime="1583530607333333" duration="300000000" id="4386533"/>
- </EventStream>
-```
-
-#### 3.3.2.2 Example MPEG DASH .mpd manifest signaling of an RTMP stream using Adobe SCTE-35 mode
-
-The following example shows an excerpt of the EventStream element produced by the Media Services dynamic packager for an RTMP stream using Adobe SCTE-35 mode signaling.
-
-Example EventStream element using xml+bin style signaling per [SCTE-214-1]
-
-```xml
-
- <EventStream schemeIdUri="urn:scte:scte35:2014:xml+bin" value="scte35" timescale="10000000">
- <Event presentationTime="2595092444" duration="11011000" id="1002">
- <Signal xmlns="http://www.scte.org/schemas/35/2016">
- <Binary>/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==</Binary>
- </Signal>
- </Event>
- <Event presentationTime="2606103444" id="1002">
- <Signal xmlns="http://www.scte.org/schemas/35/2016">
- <Binary>/DAgAAAAAAXdAP/wDwUAAAPqf0/+AWXk0wABAQEAAGB86Fo=</Binary>
- </Signal>
- </Event>
- </EventStream>
-```
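-
-As a non-normative illustration, the following Python sketch extracts the Event elements from an EventStream excerpt like the one above and base64-decodes the Signal.Binary payload back into the binary splice_info_section(). The namespace and element names follow the excerpt; everything else is an assumption.
-
-```python
-import base64
-import xml.etree.ElementTree as ET
-
-SCTE35_NS = "{http://www.scte.org/schemas/35/2016}"
-
-def parse_event_stream(xml_text: str):
-    """Yield (presentation_time_seconds, event_id, splice_info_section bytes)
-    from a standalone EventStream excerpt (a full MPD would add the MPD namespace)."""
-    root = ET.fromstring(xml_text)
-    timescale = int(root.get("timescale", "1"))
-    for event in root.iter("Event"):
-        pt = int(event.get("presentationTime", "0")) / timescale
-        binary = event.find(f"{SCTE35_NS}Signal/{SCTE35_NS}Binary")
-        payload = base64.b64decode(binary.text.strip())
-        assert payload[0] == 0xFC  # splice_info_section table_id
-        yield pt, event.get("id"), payload
-```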
-
-> [!IMPORTANT]
-> Note that presentationTime is the presentation time of the [SCTE-35] event translated to be relative to the Period start time, not the arrival time of the message.
-> [MPEGDASH] defines Event@presentationTime as "Specifies the presentation time of the event relative to the start of the Period.
-> The value of the presentation time in seconds is the division of the value of this attribute and the value of the EventStream@timescale attribute.
-> If not present, the value of the presentation time is 0."
-
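-For example, the first event in the excerpt above has presentationTime="2595092444" and its EventStream declares timescale="10000000", so the event's presentation time is 2595092444 / 10000000 ≈ 259.509 seconds after the Period start. This corresponds to the TIME=259.509244 value signaled for the same event (ID "1002") in the HLS EXT-X-CUE examples earlier in this document.
-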
-#### 3.3.3.1 Example MPEG DASH manifest (MPD) with single-period, EventStream, using Adobe simple mode signals
-
-The following example shows the output from the Media Services dynamic packager for a source RTMP stream using the Adobe "simple" mode ad signal method. The output is a single-period manifest showing an EventStream with the schemeIdUri set to "urn:com:adobe:dpi:simple:2015" and the value property set to "simplesignal".
-Each simple signal is provided in an Event element with the @presentationTime, @duration, and @id properties populated based on the incoming simple signals.
-
-```xml
-<?xml version="1.0" encoding="utf-8"?>
-<MPD xmlns="urn:mpeg:dash:schema:mpd:2011"
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="static" mediaPresentationDuration="PT28M1.680S" minBufferTime="PT3S">
- <Period>
- <EventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="simplesignal" timescale="1000">
- <Event presentationTime="4011578265" duration="119987" id="4011578265"/>
- </EventStream>
- <AdaptationSet id="1" group="1" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="video" mimeType="video/mp4" codecs="avc1.4D4028" maxWidth="1920" maxHeight="1080" startWithSAP="1">
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="simplesignal"/>
- <ProducerReferenceTime id="4011460740" type="0" wallClockTime="2020-01-25T19:35:54.740Z" presentationTime="4011460740"/>
- <SegmentTemplate timescale="1000" presentationTimeOffset="4011460740" media="QualityLevels($Bandwidth$)/Fragments(video=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(video=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="4011460740" d="2002" r="57"/>
- <S d="1401"/>
- <S d="601"/>
- <S d="2002"/>
-
-                    <!-- ... video segments truncated for sample brevity -->
-
- </SegmentTimeline>
- </SegmentTemplate>
- <Representation id="1_V_video_14759481473095519504" bandwidth="6000000" width="1920" height="1080"/>
- <Representation id="1_V_video_1516803357996956148" bandwidth="4000000" codecs="avc1.4D401F" width="1280" height="720"/>
- <Representation id="1_V_video_5430608182379669372" bandwidth="2600000" codecs="avc1.4D401F" width="960" height="540"/>
- <Representation id="1_V_video_3780180650986497347" bandwidth="1000000" codecs="avc1.4D401E" width="640" height="360"/>
- <Representation id="1_V_video_13759117363700265707" bandwidth="699000" codecs="avc1.4D4015" width="480" height="270"/>
- <Representation id="1_V_video_6140004908920393176" bandwidth="400000" codecs="avc1.4D4015" width="480" height="270"/>
- <Representation id="1_V_video_10673801877453424365" bandwidth="200000" codecs="avc1.4D400D" width="320" height="180"/>
- </AdaptationSet>
- <AdaptationSet id="2" group="5" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="audio" mimeType="audio/mp4" codecs="mp4a.40.2">
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="simplesignal"/>
- <ProducerReferenceTime id="4011460761" type="0" wallClockTime="2020-01-25T19:35:54.761Z" presentationTime="4011460761"/>
- <Label>audio</Label>
- <SegmentTemplate timescale="1000" presentationTimeOffset="4011460740" media="QualityLevels($Bandwidth$)/Fragments(audio=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(audio=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="4011460761" d="1984"/>
- <S d="2005" r="1"/>
- <S d="2006"/>
-
-                    <!-- ... audio segments truncated for example brevity -->
-
- </SegmentTimeline>
- </SegmentTemplate>
- <Representation id="5_A_audio_17504386117102112482" bandwidth="128000" audioSamplingRate="48000"/>
- </AdaptationSet>
- </Period>
-</MPD>
-
-```
-
-#### 3.3.3.2 Example MPEG DASH manifest (MPD) with multi-period, EventStream, using Adobe SCTE35 mode signaling
-
-The following example shows the output from the Media Services dynamic packager for a source RTMP stream using Adobe SCTE35 mode signaling.
-In this case, the output manifest is a multi-period DASH .mpd with an EventStream element whose @schemeIdUri property is set to "urn:scte:scte35:2014:xml+bin" and whose @value property is set to "scte35". Each Event element in the EventStream contains the full base64-encoded binary SCTE35 signal.
-
-~~~ xml
-<?xml version="1.0" encoding="utf-8"?>
-<MPD xmlns="urn:mpeg:dash:schema:mpd:2011"
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="dynamic" publishTime="2020-01-07T19:42:44Z" minimumUpdatePeriod="PT0S" timeShiftBufferDepth="PT58M56S" availabilityStartTime="2020-01-07T19:40:50Z" minBufferTime="PT4S">
- <Period start="PT2M48.168S" id="main-content_0">
- <AdaptationSet id="1" group="1" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="video" mimeType="video/mp4" codecs="avc1.640020" maxWidth="1280" maxHeight="720" startWithSAP="1">
- <InbandEventStream schemeIdUri="urn:mpeg:dash:event:2012" value="1"/>
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="amssignal"/>
- <SegmentTemplate timescale="90000" presentationTimeOffset="15135120" media="QualityLevels($Bandwidth$)/Fragments(video=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(video=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="15135120" d="135135" r="59"/>
- <S d="111111"/>
- <S d="1502"/>
- </SegmentTimeline>
- </SegmentTemplate>
- <ProducerReferenceTime id="15135120" type="0" wallClockTime="2020-01-07T19:40:50Z" presentationTime="15135120"/>
- <Representation id="1_V_video_5322324134428436312" bandwidth="3500000" width="1280" height="720"/>
- <Representation id="1_V_video_16981495139092747609" bandwidth="2200000" width="960" height="540"/>
- <Representation id="1_V_video_1384718563016940751" bandwidth="1350000" codecs="avc1.64001F" width="704" height="396"/>
- <Representation id="1_V_video_4425970933904124207" bandwidth="850000" codecs="avc1.64001E" width="512" height="288"/>
- <Representation id="1_V_video_11952982975776937431" bandwidth="550000" codecs="avc1.640016" width="384" height="216"/>
- <Representation id="1_V_video_10673801877453424365" bandwidth="200000" codecs="avc1.640015" width="340" height="192"/>
- </AdaptationSet>
- <AdaptationSet id="2" group="5" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="audio" mimeType="audio/mp4" codecs="mp4a.40.5" lang="en">
- <InbandEventStream schemeIdUri="urn:mpeg:dash:event:2012" value="1"/>
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="amssignal"/>
- <Label>audio</Label>
- <SegmentTemplate timescale="44100" presentationTimeOffset="7416208" media="QualityLevels($Bandwidth$)/Fragments(audio=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(audio=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="7417856" d="133120"/>
- <S d="132096" r="1"/>
- </SegmentTimeline>
- </SegmentTemplate>
- <ProducerReferenceTime id="7417856" type="0" wallClockTime="2020-01-07T19:40:50.037Z" presentationTime="7417856"/>
- <Representation id="5_A_audio_17504386117102112482" bandwidth="128000" audioSamplingRate="44100"/>
- </AdaptationSet>
- </Period>
- <Period start="PT4M19.509S" id="scte-35_0">
- <EventStream schemeIdUri="urn:scte:scte35:2014:xml+bin" value="scte35" timescale="10000000">
- <Event presentationTime="2595092444" duration="11011000" id="1002">
- <Signal xmlns="http://www.scte.org/schemas/35/2016">
- <Binary>/DAlAAAAAAXdAP/wFAUAAAPqf+/+AWRhuP4AUmNjAAEBAQAA8g1eNw==</Binary>
- </Signal>
- </Event>
- <Event presentationTime="2606103444" id="1002">
- <Signal xmlns="http://www.scte.org/schemas/35/2016">
- <Binary>/DAgAAAAAAXdAP/wDwUAAAPqf0/+AWXk0wABAQEAAGB86Fo=</Binary>
- </Signal>
- </Event>
- </EventStream>
- <AdaptationSet id="1" group="1" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="video" mimeType="video/mp4" codecs="avc1.640020" maxWidth="1280" maxHeight="720" startWithSAP="1">
- <InbandEventStream schemeIdUri="urn:mpeg:dash:event:2012" value="1"/>
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="amssignal"/>
- <SegmentTemplate timescale="90000" presentationTimeOffset="23355832" media="QualityLevels($Bandwidth$)/Fragments(video=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(video=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="23355833" d="22522"/>
- <S d="76577"/>
- </SegmentTimeline>
- </SegmentTemplate>
- <ProducerReferenceTime id="23355833" type="0" wallClockTime="2020-01-07T19:42:21.341Z" presentationTime="23355833"/>
- <Representation id="1_V_video_5322324134428436312" bandwidth="3500000" width="1280" height="720"/>
- <Representation id="1_V_video_16981495139092747609" bandwidth="2200000" width="960" height="540"/>
- <Representation id="1_V_video_1384718563016940751" bandwidth="1350000" codecs="avc1.64001F" width="704" height="396"/>
- <Representation id="1_V_video_4425970933904124207" bandwidth="850000" codecs="avc1.64001E" width="512" height="288"/>
- <Representation id="1_V_video_11952982975776937431" bandwidth="550000" codecs="avc1.640016" width="384" height="216"/>
- <Representation id="1_V_video_10673801877453424365" bandwidth="200000" codecs="avc1.640015" width="340" height="192"/>
- </AdaptationSet>
- <AdaptationSet id="2" group="5" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="audio" mimeType="audio/mp4" codecs="mp4a.40.5" lang="en">
- <InbandEventStream schemeIdUri="urn:mpeg:dash:event:2012" value="1"/>
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="amssignal"/>
- <Label>audio</Label>
- <SegmentTemplate timescale="44100" presentationTimeOffset="11444358" media="QualityLevels($Bandwidth$)/Fragments(audio=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(audio=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="11446272" d="49152"/>
- </SegmentTimeline>
- </SegmentTemplate>
- <ProducerReferenceTime id="11446272" type="0" wallClockTime="2020-01-07T19:42:21.384Z" presentationTime="11446272"/>
- <Representation id="5_A_audio_17504386117102112482" bandwidth="128000" audioSamplingRate="44100"/>
- </AdaptationSet>
- </Period>
- <Period start="PT4M20.610S" id="main-content_1">
- <AdaptationSet id="1" group="1" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="video" mimeType="video/mp4" codecs="avc1.640020" maxWidth="1280" maxHeight="720" startWithSAP="1">
- <InbandEventStream schemeIdUri="urn:mpeg:dash:event:2012" value="1"/>
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="amssignal"/>
- <SegmentTemplate timescale="90000" presentationTimeOffset="23454931" media="QualityLevels($Bandwidth$)/Fragments(video=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(video=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="23454932" d="58558"/>
- <S d="4504"/>
- <S d="130631"/>
- <S d="135135" r="12"/>
- </SegmentTimeline>
- </SegmentTemplate>
- <ProducerReferenceTime id="23454932" type="0" wallClockTime="2020-01-07T19:42:22.442Z" presentationTime="23454932"/>
- <Representation id="1_V_video_5322324134428436312" bandwidth="3500000" width="1280" height="720"/>
- <Representation id="1_V_video_16981495139092747609" bandwidth="2200000" width="960" height="540"/>
- <Representation id="1_V_video_1384718563016940751" bandwidth="1350000" codecs="avc1.64001F" width="704" height="396"/>
- <Representation id="1_V_video_4425970933904124207" bandwidth="850000" codecs="avc1.64001E" width="512" height="288"/>
- <Representation id="1_V_video_11952982975776937431" bandwidth="550000" codecs="avc1.640016" width="384" height="216"/>
- <Representation id="1_V_video_10673801877453424365" bandwidth="200000" codecs="avc1.640015" width="340" height="192"/>
- </AdaptationSet>
- <AdaptationSet id="2" group="5" profiles="ccff" bitstreamSwitching="false" segmentAlignment="true" contentType="audio" mimeType="audio/mp4" codecs="mp4a.40.5" lang="en">
- <InbandEventStream schemeIdUri="urn:mpeg:dash:event:2012" value="1"/>
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="amssignal"/>
- <Label>audio</Label>
- <SegmentTemplate timescale="44100" presentationTimeOffset="11492916" media="QualityLevels($Bandwidth$)/Fragments(audio=$Time$,format=mpd-time-csf)" initialization="QualityLevels($Bandwidth$)/Fragments(audio=i,format=mpd-time-csf)">
- <SegmentTimeline>
- <S t="11495424" d="28672"/>
- <S d="1024"/>
- <S d="131072"/>
- <S d="132096"/>
- <S d="133120"/>
- <S d="132096" r="1"/>
- <S d="133120"/>
- </SegmentTimeline>
- </SegmentTemplate>
- <ProducerReferenceTime id="11495424" type="0" wallClockTime="2020-01-07T19:42:22.499Z" presentationTime="11495424"/>
- <Representation id="5_A_audio_17504386117102112482" bandwidth="128000" audioSamplingRate="44100"/>
- </AdaptationSet>
- </Period>
-</MPD>
-
-~~~
-
-
-### 3.3.4 MPEG DASH In-band Event Message Box Signaling
-
-An in-band event stream requires the MPD to have an InbandEventStream element at the Adaptation Set level. This element has a mandatory schemeIdUri attribute and an optional timescale attribute, which also appear in the Event Message Box ('emsg'). Event message boxes with scheme identifiers that are not defined in the MPD **SHOULD NOT** be present.
-
-For in-band [SCTE-35] carriage, signals **MUST** use the schemeId = "urn:scte:scte35:2013:bin".
-Normative definitions of carriage of [SCTE-35] in-band messages are defined in [SCTE-214-3] sec 7.3.2 (Carriage of SCTE 35 cue messages).
-
-The following details outline the specific values the client should expect in the 'emsg' in compliance with [SCTE-214-3]:
-
-| **Field Name** | **Field Type** | **Required?** | **Description** |
-| --- | --- | --- | --- |
-| scheme_id_uri | string | Required | Identifies the scheme of the message. The scheme is set to the value of the Scheme attribute in the Live Server Manifest box. The value **MUST** be a URN identifying the message scheme. For [SCTE-35] messages, this **MUST** be "urn:scte:scte35:2013:bin" in compliance with [SCTE-214-3]. |
-| value | string | Required | An additional string value used by the owners of the scheme to customize the semantics of the message. To differentiate multiple event streams with the same scheme, the value will be set to the name of the event stream (trackName for Smooth ingest or AMF message name for RTMP ingest). |
-| timescale | 32-bit unsigned integer | Required | The timescale, in ticks per second, of the times and duration fields within the 'emsg' box. |
-| presentation_time_delta | 32-bit unsigned integer | Required | The media presentation time delta of the presentation time of the event and the earliest presentation time in this segment. The presentation time and duration **SHOULD** align with Stream Access Points (SAP) of type 1 or 2, as defined in [ISO-14496-12] Annex I. |
-| event_duration | 32-bit unsigned integer | Required | The duration of the event, or 0xFFFFFFFF to indicate an unknown duration. |
-| id | 32-bit unsigned integer | Required | Identifies this instance of the message. Messages with equivalent semantics shall have the same value. If the ID is not specified when the message is ingested, Azure Media Services will generate a unique ID. |
-| message_data | byte array | Required | The event message. For [SCTE-35] messages, the message data is the binary splice_info_section() in compliance with [SCTE-214-3]. |
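-
-As a rough, non-normative sketch, the following Python function parses a version 0 'emsg' box body laid out with the fields in the table above; it assumes the box payload has already been located inside the segment.
-
-```python
-import struct
-
-def parse_emsg_v0(payload: bytes) -> dict:
-    """Parse a version 0 'emsg' box body: version/flags, two null-terminated
-    strings, four 32-bit fields, then the message data (sketch, unvalidated)."""
-    assert payload[0] == 0, "this sketch only handles emsg version 0"
-    pos = 4  # skip version (1 byte) + flags (3 bytes)
-
-    def read_cstring(p: int):
-        end = payload.index(0, p)
-        return payload[p:end].decode("utf-8"), end + 1
-
-    scheme_id_uri, pos = read_cstring(pos)
-    value, pos = read_cstring(pos)
-    timescale, pt_delta, duration, event_id = struct.unpack_from(">IIII", payload, pos)
-    return {
-        "scheme_id_uri": scheme_id_uri,      # "urn:scte:scte35:2013:bin" for SCTE-35
-        "value": value,
-        "timescale": timescale,
-        "presentation_time_delta": pt_delta,
-        "event_duration": duration,          # 0xFFFFFFFF means unknown
-        "id": event_id,
-        "message_data": payload[pos + 16:],  # binary splice_info_section() for SCTE-35
-    }
-```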
-
-#### Example InbandEventStream element for Adobe simple mode
-```xml
-
- <InbandEventStream schemeIdUri="urn:com:adobe:dpi:simple:2015" value="amssignal"/>
-```
-
-### 3.3.5 DASH Message Handling
-
-Events are signaled in-band, within the 'emsg' box, for both video and audio tracks. The signaling occurs for all segment requests for which the presentation_time_delta is 15 seconds or less.
-
-When a sliding presentation window is enabled, event messages are removed from the MPD when the sum of the time and duration of the event message is less than the time of the media data in the manifest. In other words, the event messages are removed from the manifest when the media time to which they refer has rolled out of the sliding presentation window.
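-
-A minimal sketch of both rules above, assuming all times have already been converted to seconds on the media timeline (the names are illustrative):
-
-```python
-def should_signal_emsg(event_time: float, segment_earliest_time: float) -> bool:
-    """Include the 'emsg' in segment responses while the event starts within
-    15 seconds of the earliest presentation time in the segment."""
-    return 0 <= event_time - segment_earliest_time <= 15
-
-def still_in_manifest(event_time: float, event_duration: float, window_start: float) -> bool:
-    """Keep the event until time + duration rolls out of the sliding window."""
-    return event_time + event_duration >= window_start
-```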
-
-## 4. SCTE-35 Ingest Implementation Guidance for Encoder Vendors
-
-The following guidelines address common issues that can impact an encoder vendor's implementation of this specification. They have been collected from real-world partner feedback to make the specification easier for others to implement.
-
-[SCTE-35] messages are ingested in binary format using the Scheme
-**"urn:scte:scte35:2013:bin"** for [MS-SSTR-Ingest] and the type **"scte35"** for
-[RTMP] ingest. To facilitate the conversion of [SCTE-35] timing, which is based on
-MPEG-2 transport stream presentation time stamps (PTS), a mapping between PTS
-(pts_time + pts_adjustment of the splice_time()) and the media timeline is
-provided by the event presentation time (the fragment_absolute_time field for
-Smooth ingest and the time field for RTMP ingest). The mapping is necessary because the
-33-bit PTS value rolls over approximately every 26.5 hours.
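-
-To make the rollover concrete: a 33-bit counter at 90 kHz wraps every 2^33 / 90000 ≈ 95,443.7 seconds, roughly 26.5 hours. The sketch below, with hypothetical names, shows one way to apply the anchor mapping described above.
-
-```python
-PTS_MODULUS = 2 ** 33   # 33-bit MPEG-2 TS presentation time stamps
-PTS_TIMESCALE = 90_000  # 90 kHz
-
-def scte35_pts_to_media_time(pts: int, anchor_pts: int, anchor_media_time: float) -> float:
-    """Map pts_time + pts_adjustment of the splice_time() onto the media timeline,
-    using the event presentation time (fragment_absolute_time for Smooth ingest,
-    time for RTMP ingest) as the anchor. Assumes the splice point lies within
-    half a rollover period (about 13 hours) of the anchor."""
-    delta = (pts - anchor_pts) % PTS_MODULUS
-    if delta > PTS_MODULUS // 2:  # treat large deltas as small negative offsets
-        delta -= PTS_MODULUS
-    return anchor_media_time + delta / PTS_TIMESCALE
-```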
-
-Smooth Streaming ingest [MS-SSTR-Ingest] requires that the Media Data Box ('mdat') **MUST** contain the **splice_info_section()** defined in [SCTE-35].
-
-For RTMP ingest, the cue attribute of the AMF message is set to the base64-encoded **splice_info_section()** defined in [SCTE-35].
-
-When the messages have the format described above, they are sent to HLS, Smooth, and DASH clients as defined above.
-
-When testing your implementation with the Azure Media Services platform, start testing with a "pass-through" LiveEvent before moving on to an encoding LiveEvent.
-
-## Change History
-
-| Date | Changes |
-| --- | --- |
-| 07/02/19 | Revised RTMP ingest support, added RTMP "onCuePoint" for Elemental Live |
-| 08/22/19 | Updated to add OnUserDataEvent to RTMP for custom metadata |
-| 01/08/20 | Fixed error on RTMP Simple and RTMP SCTE35 mode. Changed from "onCuePoint" to "onAdCue". Updated Simple mode table. |
-| 08/04/20 | Removed support for DATERANGE tag to match the implementation in production service. |
media-services Media Services Specifications Ms Sstr Amendment Hevc https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-specifications-ms-sstr-amendment-hevc.md
- Title: Smooth Streaming Protocol (MS-SSTR) Amendment for HEVC - Azure
-description: This specification describes the protocol and format for fragmented MP4-based live streaming with HEVC in Azure Media Services. Only the changes required to deliver HEVC are specified in this article, except where "(No Change)" indicates text is copied for clarification only.
------- Previously updated : 08/19/2019---
-# Smooth Streaming Protocol (MS-SSTR) Amendment for HEVC
-
-## 1 Introduction
-
-This article provides detailed amendments to be applied to the Smooth Streaming Protocol
-specification [MS-SSTR] to enable Smooth Streaming of HEVC encoded video. In this specification, we
-outline only the changes required to deliver the HEVC video codec. The article follows the same
-numbering scheme as the [MS-SSTR] specification. The empty headlines presented throughout the article are provided to orient the reader to their position in the [MS-SSTR] specification. "(No Change)" indicates text is copied for clarification purposes only.
-
-The article provides technical implementation requirements for signaling the HEVC video codec (using either 'hev1' or 'hvc1' format tracks) in a Smooth Streaming manifest. Normative references have been updated to the current MPEG standards that include HEVC and Common Encryption of HEVC, and box names for the ISO Base Media File Format have been updated to be consistent with the latest specifications.
-
-The referenced Smooth Streaming Protocol specification [MS-SSTR] describes the
-wire format used to deliver live and on-demand digital media,
-such as audio and video, in the following manners: from an encoder to a web
-server, from a server to another server, and from a server to an HTTP client.
-The use of an MPEG-4 ([MPEG4-RA](https://go.microsoft.com/fwlink/?LinkId=327787))-based data
-structure delivery over HTTP allows seamless switching in near real time between
-different quality levels of compressed media content. The result is a constant
-playback experience for the HTTP client end user, even if network and video
-rendering conditions change for the client computer or device.
-
-## 1.1 Glossary
-
-The following terms are defined in *[MS-GLOS]*:
-
-> **globally unique identifier (GUID)**, **universally unique identifier (UUID)**
-
-The following terms are specific to this document:
-
-> **composition time:** The time a sample is presented at the client,
-> as defined in
-> [[ISO/IEC-14496-12]](https://go.microsoft.com/fwlink/?LinkId=183695).
->
-> **CENC**: Common Encryption, as defined in [ISO/IEC 23001-7] Second Edition.
->
-> **decode time:** The time a sample is required to be decoded on the client,
-> as defined in
-> [[ISO/IEC 14496-12:2008]](https://go.microsoft.com/fwlink/?LinkId=183695).
-
-> **fragment:** An independently downloadable unit of **media** that comprises
-> one or more **samples**.
-
-> **HEVC:** High Efficiency Video Coding, as defined in [ISO/IEC 23008-2]
->
-> **manifest:** Metadata about the **presentation** that allows a client to
-> make requests for **media**.
->
-> **media:** Compressed audio, video, and text data used by the client to
-> play a **presentation**.
->
-> **media format:** A well-defined format for representing audio or video as
-> a compressed **sample**.
->
-> **presentation:** The set of all **streams** and related metadata needed to
-> play a single movie.
->
-> **request:** An HTTP message sent from the client to the server, as defined
-> in [[RFC2616]](https://go.microsoft.com/fwlink/?LinkId=90372)
->
-> **response:** An HTTP message sent from the server to the client, as
-> defined in [[RFC2616]](https://go.microsoft.com/fwlink/?LinkId=90372)
->
-> **sample:** The smallest fundamental unit (such as a frame) in which
-> **media** is stored and processed.
->
-> **MAY, SHOULD, MUST, SHOULD NOT, MUST NOT:** These terms (in all caps) are
-> used as described in
-> [[RFC2119]](https://go.microsoft.com/fwlink/?LinkId=90317) All statements of
-> optional behavior use either MAY, SHOULD, or SHOULD NOT.
-
-## 1.2 References
-
-> References to Microsoft Open Specifications documentation do not include a
-> publishing year because links are to the latest version of the documents,
-> which are updated frequently. References to other documents include a
-> publishing year when one is available.
-
-### 1.2.1 Normative References
-
-> [MS-SSTR] Smooth Streaming Protocol *v20140502*
-> [https://msdn.microsoft.com/library/ff469518.aspx](/openspecs/windows_protocols/ms-sstr/8383f27f-7efe-4c60-832a-387274457251)
->
-> [ISO/IEC 14496-12] International Organization for Standardization,
-> "Information technology -- Coding of audio-visual objects -- Part 12: ISO
-> Base Media File Format", ISO/IEC 14496-12:2014, Edition 4, Plus Corrigendum
-> 1, Amendments 1 & 2.
-> <https://standards.iso.org/ittf/PubliclyAvailableStandards/c061988_ISO_IEC_14496-12_2012.zip>
->
-> [ISO/IEC 14496-15] International Organization for Standardization,
-> "Information technology -- Coding of audio-visual objects -- Part 15:
-> Carriage of NAL unit structured video in the ISO Base Media File Format",
-> ISO 14496-15:2015, Edition 3.
-> <https://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=65216>
->
-> [ISO/IEC 23008-2] Information technology -- High efficiency coding and media
-> delivery in heterogeneous environments -- Part 2: High efficiency video
-> coding: 2013 or newest edition
-> <https://standards.iso.org/ittf/PubliclyAvailableStandards/c035424_ISO_IEC_23008-2_2013.zip>
->
-> [ISO/IEC 23001-7] Information technology -- MPEG systems technologies -- Part
-> 7: Common encryption in ISO base media file format files, CENC Edition
-> 2:2015 <https://www.iso.org/iso/catalogue_detail.htm?csnumber=65271>
->
-> [RFC-6381] IETF RFC-6381, "The 'Codecs' and 'Profiles' Parameters for
-> 'Bucket' Media Types" <https://tools.ietf.org/html/rfc6381>
->
-> [MPEG4-RA] The MP4 Registration Authority, "MP4REG", [http://www.mp4ra.org](https://go.microsoft.com/fwlink/?LinkId=327787)
->
-> [RFC2119] Bradner, S., "Key words for use in RFCs to Indicate Requirement
-> Levels", BCP 14, RFC 2119, March 1997,
-> [https://www.rfc-editor.org/rfc/rfc2119.txt](https://go.microsoft.com/fwlink/?LinkId=90317)
-
-### 1.2.2 Informative References
-
-> [MS-GLOS] Microsoft Corporation, "*Windows Protocols Master Glossary*."
->
-> [RFC3548] Josefsson, S., Ed., "The Base16, Base32, and Base64 Data
-> Encodings", RFC 3548, July 2003, [https://www.ietf.org/rfc/rfc3548.txt](https://go.microsoft.com/fwlink/?LinkId=90432)
->
-> [RFC5234] Crocker, D., Ed., and Overell, P., "Augmented BNF for Syntax
-> Specifications: ABNF", STD 68, RFC 5234, January 2008,
-> [https://www.rfc-editor.org/rfc/rfc5234.txt](https://go.microsoft.com/fwlink/?LinkId=123096)
-
-## 1.3 Overview
-
-> Only changes to the Smooth Streaming specification required for the delivery
-> of HEVC are specified below. Unchanged section headers are listed to
-> maintain location in the referenced Smooth Streaming specification
-> [MS-SSTR].
-
-## 1.4 Relationship to Other Protocols
-
-## 1.5 Prerequisites/Preconditions
-
-## 1.6 Applicability Statement
-
-## 1.7 Versioning and Capability Negotiation
-
-## 1.8 Vendor-Extensible Fields
-
-> The following method SHALL be used to identify streams using the HEVC video
-> format:
->
-> * **Custom Descriptive Codes for Media Formats:** This capability is
-> provided by the **FourCC** field, as specified in section *2.2.2.5*.
-> Implementers can ensure that extensions do not conflict by registering
-> extension codes with the MPEG4-RA, as specified in [[ISO/IEC-14496-12]](https://go.microsoft.com/fwlink/?LinkId=183695)
-
-## 1.9 Standards Assignments
-
-## 2 Messages
-
-## 2.1 Transport
-
-## 2.2 Message Syntax
-
-### 2.2.1 Manifest Request
-
-### 2.2.2 Manifest Response
-
-#### 2.2.2.1 SmoothStreamingMedia
-
-> **MinorVersion (variable):** The minor version of the Manifest Response
-> message. MUST be set to 2. (No Change)
->
-> **TimeScale (variable):** The time scale of the Duration attribute,
-> specified as the number of increments in one second. The default value is
-> 1. (No Change)
->
-> The recommended value is 90000 for representing the exact duration of video
-> frames and fragments containing fractional framerate video (for example, 30/1.001
-> Hz).
-
-#### 2.2.2.2 ProtectionElement
-
-The ProtectionElement SHALL be present when Common Encryption (CENC) has been
-applied to video or audio streams. HEVC encrypted streams SHALL conform to
-Common Encryption 2nd Edition [ISO/IEC 23001-7]. Only slice data in VCL NAL
-Units SHALL be encrypted.
-
-#### 2.2.2.3 StreamElement
-
-> **StreamTimeScale (variable):** The time scale for duration and time values
-> in this stream, specified as the number of increments in one second. A value
-> of 90000 is recommended for HEVC streams. A value matching the waveform
-> sample frequency (for example, 48000 or 44100) is recommended for audio streams.
-
-##### 2.2.2.3.1 StreamProtectionElement
-
-#### 2.2.2.4 UrlPattern
-
-#### 2.2.2.5 TrackElement
-
-> **FourCC (variable):** A four-character code that identifies which media
-> format is used for each sample. The following range of values is reserved
-> with the following semantic meanings:
->
-> * "hev1ΓÇ¥: Video samples for this track use HEVC video, using the ΓÇÿhev1ΓÇÖ
-> sample description format specified in [ISO/IEC-14496-15].
->
-> * "hvc1ΓÇ¥: Video samples for this track use HEVC video, using the ΓÇÿhvc1ΓÇÖ
-> sample description format specified in [ISO/IEC-14496-15].
->
-> **CodecPrivateData (variable):** Data that specifies parameters specific to
-> the media format and common to all samples in the track, represented as a
-> string of hex-coded bytes. The format and semantic meaning of byte sequence
-> varies with the value of the **FourCC** field as follows:
->
-> * When a TrackElement describes HEVC video, the **FourCC** field SHALL equal
-> **"hev1"** or **"hvc1"**
->
-> The **CodecPrivateData** field SHALL contain a hex-coded string
-> representation of the following byte sequence, specified in ABNF
-> [[RFC5234]:](https://go.microsoft.com/fwlink/?LinkId=123096) (no change from
-> MS-SSTR)
->
-> * %x00 %x00 %x00 %x01 SPSField %x00 %x00 %x00 %x01 PPSField
->
-> * SPSField contains the Sequence Parameter Set (SPS).
->
-> * PPSField contains the Picture Parameter Set (PPS).
->
-> Note: The Video Parameter Set (VPS) is not contained in CodecPrivateData,
-> but should be contained in the file header of stored files in the 'hvcC'
-> box. Systems using Smooth Streaming Protocol must signal additional decoding
-> parameters (for example, HEVC Tier) using the Custom Attribute "codecs."
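-
-A small illustrative Python sketch (not part of [MS-SSTR]) that builds the CodecPrivateData hex string from raw SPS and PPS NAL units according to the ABNF above; the sample byte strings are placeholders, not real parameter sets.
-
-```python
-START_CODE = bytes([0x00, 0x00, 0x00, 0x01])
-
-def build_codec_private_data(sps: bytes, pps: bytes) -> str:
-    """Hex-coded string of: start code + SPSField + start code + PPSField."""
-    return (START_CODE + sps + START_CODE + pps).hex().upper()
-
-# Placeholder NAL unit payloads, for illustration only:
-print(build_codec_private_data(bytes.fromhex("42010101"), bytes.fromhex("4401C0")))
-# -> 0000000142010101000000014401C0
-```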
-
-##### 2.2.2.5.1 CustomAttributesElement
-
-#### 2.2.2.6 StreamFragmentElement
-
-> The **SmoothStreamingMedia's MajorVersion** field MUST be set to 2, and
-> the **MinorVersion** field MUST be set to 2. (No Change)
-
-##### 2.2.2.6.1 TrackFragmentElement
-
-### 2.2.3 Fragment Request
-
-> **Note**: The default media format requested for **MinorVersion** 2 and 'hev1' or 'hvc1'
-> is the 'iso8' brand ISO Base Media File Format specified in [ISO/IEC 14496-12]
-> ISO Base Media File Format Fourth Edition, and [ISO/IEC 23001-7] Common
-> Encryption Second Edition.
-
-### 2.2.4 Fragment Response
-
-#### 2.2.4.1 MoofBox
-
-#### 2.2.4.2 MfhdBox
-
-#### 2.2.4.3 TrafBox
-
-#### 2.2.4.4 TfxdBox
-
-> The **TfxdBox** is deprecated, and its function is replaced by the Track
-> Fragment Decode Time Box ('tfdt') specified in [ISO/IEC 14496-12] section
-> 8.8.12.
->
-> **Note**: A client may calculate the duration of a fragment by summing the
-> sample durations listed in the Track Run Box ('trun') or multiplying the
-> number of samples by the default sample duration. The baseMediaDecodeTime
-> in 'tfdt' plus the fragment duration equals the URL time parameter for the
-> next fragment.
->
-> A Producer Reference Time Box ('prft') SHOULD be inserted prior to a Movie
-> Fragment Box ('moof') as needed, to indicate the UTC time corresponding to
-> the Track Fragment Decode Time of the first sample referenced by the Movie
-> Fragment Box, as specified in [ISO/IEC 14496-12] section 8.16.5.
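-
-A minimal sketch of the calculation in the note above; the argument names stand for values read from the 'tfdt' and 'trun' boxes and are illustrative.
-
-```python
-def next_fragment_url_time(base_media_decode_time: int, sample_durations: list) -> int:
-    """baseMediaDecodeTime ('tfdt') plus the fragment duration (the sum of the
-    sample durations in 'trun') gives the URL time parameter of the next fragment."""
-    return base_media_decode_time + sum(sample_durations)
-
-# Example in 90000-ticks-per-second units, one 135135-tick fragment at t=15135120:
-print(next_fragment_url_time(15135120, [135135]))  # -> 15270255
-```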
-
-#### 2.2.4.5 TfrfBox
-
-> The **TfrfBox** is deprecated, and its function is replaced by the Track
-> Fragment Decode Time Box ('tfdt') specified in [ISO/IEC 14496-12] section
-> 8.8.12.
->
-> **Note**: A client may calculate the duration of a fragment by summing the
-> sample durations listed in the Track Run Box ('trun') or multiplying the
-> number of samples by the default sample duration. The baseMediaDecodeTime
-> in 'tfdt' plus the fragment duration equals the URL time parameter for the
-> next fragment. Look-ahead addresses are deprecated because they delay live
-> streaming.
-
-#### 2.2.4.6 TfhdBox
-
-> The **TfhdBox** and related fields encapsulate defaults for per sample
-> metadata in the fragment. The syntax of the **TfhdBox** field is a strict
-> subset of the syntax of the Track Fragment Header Box defined in
-> [[ISO/IEC-14496-12]](https://go.microsoft.com/fwlink/?LinkId=183695) section
-> 8.8.7.
->
-> **BaseDataOffset (8 bytes):** The offset, in bytes, from the beginning of
-> the **MdatBox** field to the sample field in the **MdatBox** field. To
-> signal this restriction, the default-base-is-moof flag (0x020000) must be
-> set.
-
-#### 2.2.4.7 TrunBox
-
-> The **TrunBox** and related fields encapsulate per sample metadata for the
-> requested fragment. The syntax of **TrunBox** is a strict subset of the
-> Version 1 Track Fragment Run Box defined in
-> [[ISO/IEC-14496-12]](https://go.microsoft.com/fwlink/?LinkId=183695)
-> section 8.8.8.
->
-> **SampleCompositionTimeOffset (4 bytes):** The Sample Composition Time
-> offset of each sample, adjusted so that the presentation time of the first
-> presented sample in the fragment is equal to the decode time of the first
-> decoded sample. Negative video sample composition offsets SHALL be used,
-> as defined in
-> [[ISO/IEC-14496-12]](https://go.microsoft.com/fwlink/?LinkId=183695).
->
-> Note: This avoids a video synchronization error caused by video lagging
-> audio equal to the largest decoded picture buffer removal delay, and
-> maintains presentation timing between alternative fragments that may have
-> different removal delays.
->
-> The syntax of the fields defined in this section, specified in ABNF
-> [[RFC5234],](https://go.microsoft.com/fwlink/?LinkId=123096) remains the
-> same, except as follows:
->
-> SampleCompositionTimeOffset = SIGNED_INT32
-
-#### 2.2.4.8 MdatBox
-
-#### 2.2.4.9 Fragment Response Common Fields
-
-### 2.2.5 Sparse Stream Pointer
-
-### 2.2.6 Fragment Not Yet Available
-
-### 2.2.7 Live Ingest
-
-#### 2.2.7.1 FileType
-
-> **FileType (variable):** specifies the subtype and intended use of the
-> MPEG-4 ([[MPEG4-RA]](https://go.microsoft.com/fwlink/?LinkId=327787)) file,
-> and high-level attributes.
->
-> **MajorBrand (variable):** The major brand of the media file. MUST be set to
-> "isml."
->
-> **MinorVersion (variable):** The minor version of the media file. MUST be
-> set to 1.
->
-> **CompatibleBrands (variable):** Specifies the supported brands of MPEG-4.
-> MUST include "ccff" and "iso8."
->
-> The syntax of the fields defined in this section, specified in ABNF
-> [[RFC5234]](https://go.microsoft.com/fwlink/?LinkId=123096), is as follows:
-
-```properties
-FileType = MajorBrand MinorVersion CompatibleBrands
-MajorBrand = STRING_UINT32
-MinorVersion = STRING_UINT32
-CompatibleBrands = "ccff" "iso8" 0\*(STRING_UINT32)
-```
-
-**Note**: The compatibility brands 'ccff' and 'iso8' indicate that fragments conform
-to "Common Container File Format" and Common Encryption [ISO/IEC 23001-7] and
-ISO Base Media File Format Edition 4 [ISO/IEC 14496-12].
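-
-A minimal sketch of serializing this box, assuming the standard ISO BMFF 'ftyp' layout (32-bit big-endian size, box type, major brand, minor version, compatible brands); the helper name is illustrative only:
-
-```csharp
-using System.IO;
-using System.Text;
-
-static class IngestFileType
-{
-    public static byte[] Build()
-    {
-        var brands = new[] { "ccff", "iso8" };       // MUST include "ccff" and "iso8"
-        using (var ms = new MemoryStream())
-        using (var w = new BinaryWriter(ms))
-        {
-            // ISO BMFF integer fields are big-endian.
-            void WriteUInt32(uint v) => w.Write(new byte[]
-                { (byte)(v >> 24), (byte)(v >> 16), (byte)(v >> 8), (byte)v });
-
-            WriteUInt32((uint)(16 + 4 * brands.Length)); // box size
-            w.Write(Encoding.ASCII.GetBytes("ftyp"));    // box type
-            w.Write(Encoding.ASCII.GetBytes("isml"));    // MajorBrand MUST be "isml"
-            WriteUInt32(1);                              // MinorVersion MUST be 1
-            foreach (var b in brands)
-                w.Write(Encoding.ASCII.GetBytes(b));
-            return ms.ToArray();
-        }
-    }
-}
-```
-
-With the values above this produces a 24-byte box: an 8-byte header plus four 4-byte fields (major brand, minor version, and the two compatible brands).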
-
-#### 2.2.7.2 StreamManifestBox
-
-##### 2.2.7.2.1 StreamSMIL
-
-#### 2.2.7.3 LiveServerManifestBox
-
-##### 2.2.7.3.1 LiveSMIL
-
-#### 2.2.7.4 MoovBox
-
-#### 2.2.7.5 Fragment
-
-##### 2.2.7.5.1 Track Fragment Extended Header
-
-### 2.2.8 Server-to-Server Ingest
-
-## 3 Protocol Details
-
-## 3.1 Client Details
-
-### 3.1.1 Abstract Data Model
-
-#### 3.1.1.1 Presentation Description
-
-> The Presentation Description data element encapsulates all metadata for the
-> presentation.
->
-> Presentation Metadata: A set of metadata that is common to all streams in
-> the presentation. Presentation Metadata comprises the following fields,
-> specified in section *2.2.2.1*:
->
-> * **MajorVersion**
-> * **MinorVersion**
-> * **TimeScale**
-> * **Duration**
-> * **IsLive**
-> * **LookaheadCount**
-> * **DVRWindowLength**
->
-> Presentations containing HEVC Streams SHALL set:
-
-```properties
-MajorVersion = 2
-MinorVersion = 2
-```
-
-> LookaheadCount = 0 (**Note**: the look-ahead boxes are deprecated)
->
-> Presentations SHOULD also set:
-
-```properties
-TimeScale = 90000
-```
-
-> Stream Collection: A collection of Stream Description data elements, as
-> specified in section *3.1.1.1.2*.
->
-> Protection Description: A collection of Protection System Metadata
-> Description data elements, as specified in section *3.1.1.1.1*.
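-
-To make the abstract data model concrete, here is a minimal sketch in C# (the types simply mirror the field names above; they are illustrative, not SDK types):
-
-```csharp
-using System;
-using System.Collections.Generic;
-
-public class PresentationDescription
-{
-    public int MajorVersion { get; set; } = 2;    // SHALL be 2 for HEVC presentations
-    public int MinorVersion { get; set; } = 2;    // SHALL be 2 for HEVC presentations
-    public long TimeScale { get; set; } = 90000;  // SHOULD be 90000
-    public long Duration { get; set; }
-    public bool IsLive { get; set; }
-    public int LookaheadCount { get; set; } = 0;  // look-ahead boxes are deprecated
-    public long DVRWindowLength { get; set; }
-
-    // Stream Collection (section 3.1.1.1.2) and Protection Description (3.1.1.1.1).
-    public List<StreamDescription> Streams { get; } = new List<StreamDescription>();
-    public List<ProtectionHeaderDescription> ProtectionHeaders { get; } =
-        new List<ProtectionHeaderDescription>();
-}
-
-public class ProtectionHeaderDescription
-{
-    public Guid SystemID { get; set; }
-    public byte[] ProtectionHeaderContent { get; set; }
-}
-
-public class StreamDescription { /* see section 3.1.1.1.2 */ }
-```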
-
-##### 3.1.1.1.1 Protection System Metadata Description
-
-> The Protection System Metadata Description data element encapsulates
-> metadata specific to a single Content Protection System.
->
-> Protection Header Description: Content protection metadata that pertains to
-> a single Content Protection System. Protection Header Description comprises
-> the following fields, specified in section *2.2.2.2*:
->
-> * **SystemID**
-> * **ProtectionHeaderContent**
-
-##### 3.1.1.1.2 Stream Description
-
-###### 3.1.1.1.2.1 Track Description
-
-###### 3.1.1.1.2.1.1 Custom Attribute Description
-
-#### 3.1.1.3 Fragment Reference Description
-
-##### 3.1.1.3.1 Track-Specific Fragment Reference Description
-
-#### 3.1.1.2 Fragment Description
-
-##### 3.1.1.2.1 Sample Description
-
-### 3.1.2 Timers
-
-### 3.1.3 Initialization
-
-### 3.1.4 Higher-Layer Triggered Events
-
-#### 3.1.4.1 Open Presentation
-
-#### 3.1.4.2 Get Fragment
-
-#### 3.1.4.3 Close Presentation
-
-### 3.1.5 Processing Events and Sequencing Rules
-
-#### 3.1.5.1 Manifest Request and Manifest Response
-
-#### 3.1.5.2 Fragment Request and Fragment Response
-
-## 3.2 Server Details
-
-## 3.3 Live Encoder Details
-
-## 4 Protocol Examples
-
-## 5 Security
-
-## 5.1 Security Considerations for Implementers
-
-> If the content transported using this protocol has high commercial value, a
-> Content Protection System should be used to prevent unauthorized use of the
-> content. The **ProtectionElement** can be used to carry metadata related to
-> the use of a Content Protection System. Protected audio and video content
-> SHALL be encrypted as specified by MPEG Common Encryption Second
-> Edition: 2015 [ISO/IEC 23001-7].
->
-> **Note**: For HEVC video, only slice data in VCL NALs is encrypted. Slice
-> headers and other NALs are accessible to presentation applications prior to
-> decryption. In a secure video path, encrypted information is not available
-> to presentation applications.
-
-## 5.2 Index of Security Parameters
-
-| **Security parameter** | **Section** |
-|---|---|
-| ProtectionElement | *2.2.2.2* |
-| Common Encryption Boxes | *[ISO/IEC 23001-7]* |
-
-## 5.3 Common Encryption Boxes
-
-The following boxes may be present in fragment responses when Common Encryption
-is applied, and are specified in [ISO/IEC 23001-7] or [ISO/IEC 14496-12]:
-
-1. Protection System Specific Header Box ('pssh')
-
-2. Sample Encryption Box ('senc')
-
-3. Sample Auxiliary Information Offset Box ('saio')
-
-4. Sample Auxiliary Information Size Box ('saiz')
-
-5. Sample Group Description Box ('sgpd')
-
-6. Sample to Group Box ('sbgp')
----
-[image1]: ./media/media-services-fmp4-live-ingest-overview/media-services-image1.png
-[image2]: ./media/media-services-fmp4-live-ingest-overview/media-services-image2.png
-[image3]: ./media/media-services-fmp4-live-ingest-overview/media-services-image3.png
-[image4]: ./media/media-services-fmp4-live-ingest-overview/media-services-image4.png
-[image5]: ./media/media-services-fmp4-live-ingest-overview/media-services-image5.png
-[image6]: ./media/media-services-fmp4-live-ingest-overview/media-services-image6.png
-[image7]: ./media/media-services-fmp4-live-ingest-overview/media-services-image7.png
media-services Media Services Sspk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-sspk.md
- Title: Licensing Microsoft&reg; Smooth Streaming Client Porting Kit
-description: Learn how to license the Microsoft&reg; Smooth Streaming Client Porting Kit.
-Previously updated: 3/10/2021
-# Licensing Microsoft&reg; Smooth Streaming Client Porting Kit
-
-
-## Overview
-Microsoft Smooth Streaming Client Porting Kit (**SSPK** for short) is a Smooth Streaming client implementation that is optimized to help embedded device manufacturers, cable and mobile operators, content service providers, handset manufacturers, independent software vendors (ISVs), and solution providers to create products and services for streaming adaptive content in Smooth Streaming format. SSPK is a device and platform-independent implementation of Smooth Streaming client that can be ported by the licensee to any device and platform.
-
-A high-level architecture diagram is included below. The IIS Smooth Streaming Porting Kit box is the Smooth Streaming Client implementation provided by Microsoft and includes all the core logic for playback of Smooth Streaming content. Partners then port this implementation to a specific device or platform by implementing the appropriate interfaces.
-
-![SSPK](./media/media-services-sspk/sspk-arch.png)
-
-## Description
-SSPK is licensed on terms that offer excellent business value. The SSPK license provides the industry with:
-
-* Smooth Streaming Porting Kit source in C++
- * implements Smooth Streaming Client functionality
- * adds format parsing, heuristics, buffering logic, etc.
-* Player application APIs
- * programming interfaces for interaction with a media player application
-* Platform Abstraction Layer (PAL) Interface
- * programming interfaces for interaction with the operating system (threads, sockets)
-* Hardware Abstraction Layer (HAL) Interface
- * programming interfaces for interaction with hardware A/V decoders (decoding, rendering)
-* Digital Rights Management (DRM) Interface
- * programming interfaces for handling DRM through the DRM Abstraction Layer (DAL)
- * Microsoft PlayReady Porting Kit ships separately but integrates through this interface. [See more details on Microsoft PlayReady Device licensing](https://www.microsoft.com/playready/licensing/device_technology.mspx#pddipdl).
-* Implementation samples
- * sample PAL implementation for Linux
- * sample HAL implementation for GStreamer
-
-## Licensing Options
-Microsoft Smooth Streaming Client Porting Kit is made available to licensees under two distinct license agreements: one for developing Smooth Streaming Client Interim Products and another for distributing Smooth Streaming Client Final Products to end users.
-
-* For chipset manufacturers, system integrators, or independent software vendors (ISVs) who require a source code porting kit to develop Interim Products, a Microsoft Smooth Streaming Client Porting Kit **Interim Product License** should be executed.
-* For device manufacturers or ISVs who require distribution rights for Smooth Streaming Client Final Products to end users, the Microsoft Smooth Streaming Client Porting Kit **Final Product License** should be executed.
-
-### Microsoft Smooth Streaming Client Porting Kit Interim Product License
-Under this license, Microsoft offers a Smooth Streaming Client Porting Kit and the necessary intellectual property rights to develop and distribute Smooth Streaming Client Interim Products to other Smooth Streaming Client Porting Kit device licensees that distribute Smooth Streaming Client Final Products.
-
-#### Fee structure
-A U.S. $50,000 one-time license fee provides access to the Smooth Streaming Client Porting Kit.
-
-### Microsoft Smooth Streaming Client Porting Kit Final Product License
-Under this license, Microsoft offers all necessary intellectual property rights to receive Smooth Streaming Client Interim Products from other Smooth Streaming Client Porting Kit licensees and to distribute company-branded Smooth Streaming Client Final Products to end users.
-
-#### Fee structure
-The Smooth Streaming Client Final Product is offered under a royalty model, as follows:
-
-* $0.10 per device implementation shipped
-* The royalty is capped at $50,000 each year
-* No royalty for first 10,000 device implementations each year
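-
-For example, under this model a licensee shipping 600,000 device implementations in a year would owe (600,000 - 10,000) × $0.10 = $59,000, which the annual cap then reduces to $50,000; shipping 10,000 or fewer device implementations in a year incurs no royalty.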
-
-## Licensing Procedure and SSPK access
-Email [sspkinfo@microsoft.com](mailto:sspkinfo@microsoft.com) for all licensing queries.
-
-The SSPK Distribution portal is accessible to registered Interim licensees.
-
-Interim and Final SSPK licensees can submit technical questions to [smoothpk@microsoft.com](mailto:smoothpk@microsoft.com).
-
-## Microsoft Smooth Streaming Client Interim Product Agreement Licensees
-View current licensee list here: https://go.microsoft.com/fwlink/?linkid=301271
-
-## Microsoft Smooth Streaming Client Final Product Agreement Licensees
-View current licensee list here: https://go.microsoft.com/fwlink/?linkid=301271
-
media-services Media Services Static Packaging https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-static-packaging.md
- Title: Using Azure Media Packager to accomplish static packaging tasks | Microsoft Docs
-description: This topic shows various tasks that are accomplished with Azure Media Packager.
-Previously updated: 3/10/2021
-# Using Azure Media Packager to accomplish static packaging tasks
-## Overview
-
-In order to deliver digital video over the internet, you must compress the media. Digital video files are large and may be too large to deliver over the internet or for your customers' devices to display properly. Encoding is the process of compressing video and audio so your customers can view your media. Once a video has been encoded, it can be placed into different file containers. The process of placing encoded media into a container is called packaging. For example, you can take an MP4 file and convert it into Smooth Streaming or HLS content by using the Azure Media Packager.
-
-Media Services supports dynamic and static packaging. When using static packaging, you need to create a copy of your content in each format required by your customers. With dynamic packaging, all you need is to create an asset that contains a set of adaptive bitrate MP4 or Smooth Streaming files. Then, based on the format specified in the manifest or fragment request, the On-Demand Streaming server ensures that your users receive the stream in the protocol they have chosen. As a result, you only need to store and pay for the files in a single storage format, and Media Services will build and serve the appropriate response based on requests from a client.
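-
-For example, a single asset published for dynamic packaging at a URL ending in `/BigBuckBunny.ism/manifest` (Smooth Streaming) can typically also be requested as `/BigBuckBunny.ism/manifest(format=m3u8-aapl)` for HLS or `/BigBuckBunny.ism/manifest(format=mpd-time-csf)` for MPEG-DASH, with no additional stored copies.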
-
-> [!NOTE]
-> It is recommended to use [dynamic packaging](media-services-dynamic-packaging-overview.md).
->
->
-
-However, there are some scenarios that require static packaging:
-
-* Validating adaptive bitrate MP4s encoded with external encoders (for example, using third-party encoders).
-
-You can also use static packaging to perform the following tasks (however, dynamic encryption is recommended):
-
-* Using static encryption to protect your Smooth and MPEG DASH with PlayReady
-* Using static encryption to protect HLSv3 with AES-128
-* Using static encryption to protect HLSv3 with PlayReady
-
-## Validating Adaptive Bitrate MP4s Encoded with External Encoders
-If you want to use a set of adaptive bitrate (multi-bitrate) MP4 files that were not encoded with Media Services' encoders, you should validate your files before further processing. The Media Services Packager can validate an asset that contains a set of MP4 files and check whether the asset can be packaged to Smooth Streaming or HLS. If the validation task fails, the job that was processing the task completes with an error. The XML that defines the preset for the validation task can be found in the [Task Preset for Azure Media Packager](/previous-versions/azure/reference/hh973635(v=azure.100)) article.
-
-> [!NOTE]
-> Use Media Encoder Standard to produce your content, or the Media Services Packager to validate it, in order to avoid runtime issues. If the On-Demand Streaming server is not able to parse your source files at runtime, you receive the HTTP 1.1 error "415 Unsupported Media Type." Repeatedly causing the server to fail to parse your source files affects performance of the On-Demand Streaming server and may reduce the bandwidth available for serving other requests. Azure Media Services offers a Service Level Agreement (SLA) on its On-Demand Streaming services; however, this SLA cannot be honored if the server is misused in the fashion described above.
->
->
-
-This section shows how to process the validation task. It also shows how to see the status and the error message of a job that completes with JobState.Error.
-
-To validate your MP4 files with Media Services Packager, you must create your own manifest (.ism) file and upload it together with the source files into the Media Services account. Below is a sample of the .ism file produced by the Media Encoder Standard. The file names are case-sensitive. Also, make sure the text in the .ism file is encoded with UTF-8.
-
-```xml
- <?xml version="1.0" encoding="utf-8" standalone="yes"?>
- <smil xmlns="https://www.w3.org/2001/SMIL20/Language">
- <head>
- <!-- Tells the server that these input files are MP4s - specific to Dynamic Packaging -->
- <meta name="formats" content="mp4" />
- </head>
- <body>
- <switch>
- <video src="BigBuckBunny_1000.mp4" />
- <video src="BigBuckBunny_1500.mp4" />
- <video src="BigBuckBunny_2250.mp4" />
- <video src="BigBuckBunny_3400.mp4" />
- <video src="BigBuckBunny_400.mp4" />
- <video src="BigBuckBunny_650.mp4" />
- <audio src="BigBuckBunny_400.mp4" />
- </switch>
- </body>
- </smil>
-```
-
-Once you have the adaptive bitrate MP4 set, you can take advantage of Dynamic Packaging. Dynamic Packaging allows you to deliver streams in the specified protocol without further packaging. For more information, see [dynamic packaging](media-services-dynamic-packaging-overview.md).
-
-The following code sample uses Azure Media Services .NET SDK Extensions. Make sure to update the code to point to the folder where your input MP4 files and .ism file are located, and also to where your MediaPackager_ValidateTask.xml file is located. This XML file is defined in the [Task Preset for Azure Media Packager](/previous-versions/azure/reference/hh973635(v=azure.100)) article.
-
-```csharp
- using Microsoft.WindowsAzure.MediaServices.Client;
- using System;
- using System.Collections.Generic;
- using System.Configuration;
- using System.IO;
- using System.Linq;
- using System.Text;
- using System.Threading;
- using System.Threading.Tasks;
- using System.Xml.Linq;
-
- namespace MediaServicesStaticPackaging
- {
- class Program
- {
- private static readonly string _mediaFiles =
- Path.GetFullPath(@"../..\Media");
-
- // The MultibitrateMP4Files folder should also
- // contain the .ism manifest file.
- private static readonly string _multibitrateMP4s =
- Path.Combine(_mediaFiles, @"MultibitrateMP4Files");
-
- // XML Configuration files path.
- private static readonly string _configurationXMLFiles = @"../..\Configurations";
-
- private static MediaServicesCredentials _cachedCredentials = null;
- private static CloudMediaContext _context = null;
-
- // Read values from the App.config file.
-
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Ingest a set of multibitrate MP4s.
- //
- // Use the SDK extension method to create a new asset by
- // uploading files from a local directory.
- IAsset multibitrateMP4sAsset = _context.Assets.CreateFromFolder(
- _multibitrateMP4s,
- AssetCreationOptions.None,
- (af, p) =>
- {
- Console.WriteLine("Uploading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);
- });
-
- // Use Azure Media Packager to validate the files.
- IAsset validatedMP4s =
- ValidateMultibitrateMP4s(multibitrateMP4sAsset);
-
- // Publish the asset.
- _context.Locators.Create(
- LocatorType.OnDemandOrigin,
- validatedMP4s,
- AccessPermissions.Read,
- TimeSpan.FromDays(30));
-
- // Get the streaming URLs.
- Console.WriteLine("Smooth Streaming URL:");
- Console.WriteLine(validatedMP4s.GetSmoothStreamingUri().ToString());
- Console.WriteLine("MPEG DASH URL:");
- Console.WriteLine(validatedMP4s.GetMpegDashUri().ToString());
- Console.WriteLine("HLS URL:");
- Console.WriteLine(validatedMP4s.GetHlsUri().ToString());
- }
-
- public static IAsset ValidateMultibitrateMP4s(IAsset multibitrateMP4sAsset)
- {
- // Set .ism as a primary file
- // in a multibitrate MP4 set.
- SetISMFileAsPrimary(multibitrateMP4sAsset);
-
- // Create a new job.
- IJob job = _context.Jobs.Create("MP4 validation and conversion to Smooth Stream job.");
-
- // Read the task configuration data into a string.
- string configMp4Validation = File.ReadAllText(Path.Combine(
- _configurationXMLFiles,
- "MediaPackager_ValidateTask.xml"));
-
- // Get the SDK extension method to get a reference to the Azure Media Packager.
- IMediaProcessor processor = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.WindowsAzureMediaPackager);
-
- // Create a task with the conversion details, using the configuration data.
- ITask task = job.Tasks.AddNew("Mp4 Validation Task",
- processor,
- configMp4Validation,
- TaskOptions.None);
-
- // Specify the input asset to be validated.
- task.InputAssets.Add(multibitrateMP4sAsset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is in the clear (unencrypted).
- task.OutputAssets.AddNew("Validated output asset",
- AssetCreationOptions.None);
-
- // Submit the job and wait until it is completed.
- job.Submit();
- job = job.StartExecutionProgressTask(
- j =>
- {
- Console.WriteLine("Job state: {0}", j.State);
- Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
- },
- CancellationToken.None).Result;
-
- // If the validation task fails and job completes with JobState.Error,
- // display the error message and throw an exception.
- if (job.State == JobState.Error)
- {
- Console.WriteLine(" Job ID: " + job.Id);
- Console.WriteLine(" Name: " + job.Name);
- Console.WriteLine(" State: " + job.State);
-
- foreach (var jobTask in job.Tasks)
- {
- Console.WriteLine(" Task Id: " + jobTask.Id);
- Console.WriteLine(" Name: " + jobTask.Name);
- Console.WriteLine(" Progress: " + jobTask.Progress);
- Console.WriteLine(" Configuration: " + jobTask.Configuration);
- Console.WriteLine(" Running time: " + jobTask.RunningDuration);
- if (jobTask.ErrorDetails != null)
- {
- foreach (var errordetail in jobTask.ErrorDetails)
- {
-
- Console.WriteLine(" Error Message:" + errordetail.Message);
- Console.WriteLine(" Error Code:" + errordetail.Code);
- }
- }
- }
- throw new Exception("The specified multi-bitrate MP4 set is not valid.");
- }
-
- return job.OutputMediaAssets[0];
- }
-
- static void SetISMFileAsPrimary(IAsset asset)
- {
- var ismAssetFiles = asset.AssetFiles.ToList().
- Where(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase));
-
- // The following code assigns the first .ism file as the primary file in the asset.
- // An asset should have one .ism file.
- var ismFile = ismAssetFiles.First();
- ismFile.IsPrimary = true;
- ismFile.Update();
- }
- }
- }
-```
-
-## Using Static Encryption to Protect your Smooth and MPEG DASH with PlayReady
-If you want to protect your content with PlayReady, you have a choice of using [dynamic encryption](media-services-protect-with-playready-widevine.md) (the recommended option) or static encryption (as described in this section).
-
-The example in this section encodes a mezzanine file (in this case MP4) into adaptive bitrate MP4 files. It then packages MP4s into Smooth Streaming and then encrypts Smooth Streaming with PlayReady. As a result you are able to stream Smooth Streaming or MPEG DASH.
-
-Media Services now provides a service for delivering Microsoft PlayReady licenses. The example in this article shows how to configure the Media Services PlayReady license delivery service (see the ConfigureLicenseDeliveryService method defined in the code below). For more information about Media Services PlayReady license delivery service, see [Using PlayReady Dynamic Encryption and License Delivery Service](media-services-protect-with-playready-widevine.md).
-
-> [!NOTE]
-> To deliver MPEG DASH encrypted with PlayReady, make sure to use CENC options by setting the useSencBox and adjustSubSamples properties (described in the [Task Preset for Azure Media Encryptor](/previous-versions/azure/reference/hh973610(v=azure.100)) article) to true.
->
->
-
-Make sure to update the following code to point to the folder where your input MP4 file is located, and also to where your MediaPackager_MP4ToSmooth.xml and MediaEncryptor_PlayReadyProtection.xml files are located. MediaPackager_MP4ToSmooth.xml is defined in [Task Preset for Azure Media Packager](/previous-versions/azure/reference/hh973635(v=azure.100)) and MediaEncryptor_PlayReadyProtection.xml is defined in the [Task Preset for Azure Media Encryptor](/previous-versions/azure/reference/hh973610(v=azure.100)) article.
-
-The example defines the UpdatePlayReadyConfigurationXMLFile method that you can use to dynamically update the MediaEncryptor_PlayReadyProtection.xml file. If you have the key seed available, you can use the CommonEncryption.GeneratePlayReadyContentKey method to generate the content key based on the keySeedValue and KeyId values.
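-
-For reference, the following standalone sketch follows the publicly documented PlayReady key-seed derivation (it is illustrative, not the SDK's implementation; verify against the PlayReady documentation before relying on it):
-
-```csharp
-using System;
-using System.Linq;
-using System.Security.Cryptography;
-
-static class PlayReadyKeySeed
-{
-    // Derives a 16-byte content key from a key seed (at least 30 bytes) and a KeyId.
-    public static byte[] GenerateContentKey(byte[] keySeed, Guid keyId)
-    {
-        byte[] seed = keySeed.Take(30).ToArray();  // the key seed is truncated to 30 bytes
-        byte[] kid = keyId.ToByteArray();          // little-endian GUID byte order
-
-        using (var sha = SHA256.Create())
-        {
-            byte[] a = sha.ComputeHash(seed.Concat(kid).ToArray());
-            byte[] b = sha.ComputeHash(seed.Concat(kid).Concat(seed).ToArray());
-            byte[] c = sha.ComputeHash(seed.Concat(kid).Concat(seed).Concat(kid).ToArray());
-
-            // XOR the two halves of each hash together to form the key.
-            var contentKey = new byte[16];
-            for (int i = 0; i < 16; i++)
-            {
-                contentKey[i] = (byte)(a[i] ^ a[i + 16] ^ b[i] ^ b[i + 16] ^ c[i] ^ c[i + 16]);
-            }
-            return contentKey;
-        }
-    }
-}
-```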
-
-```csharp
- using System;
- using System.Collections.Generic;
- using System.Configuration;
- using System.IO;
- using System.Linq;
- using System.Text;
- using System.Threading;
- using System.Threading.Tasks;
- using Microsoft.WindowsAzure.MediaServices.Client;
- using System.Xml.Linq;
- using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
- using Microsoft.WindowsAzure.MediaServices.Client.DynamicEncryption;
-
- namespace PlayReadyStaticEncryptAndKeyDeliverySvc
- {
- class Program
- {
-
- private static readonly string _mediaFiles =
- Path.GetFullPath(@"../..\Media");
-
- private static readonly string _singleMP4File =
- Path.Combine(_mediaFiles, @"BigBuckBunny.mp4");
-
- // XML Configuration files path.
- private static readonly string _configurationXMLFiles = @"../..\Configurations\";
-
- private static CloudMediaContext _context = null;
-
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Encoding and encrypting assets //////////////////////
- // Load a single MP4 file.
- IAsset asset = IngestSingleMP4File(_singleMP4File, AssetCreationOptions.None);
-
- // Encode an MP4 file to a set of multibitrate MP4s.
- // Then, package a set of MP4s to clear Smooth Streaming.
- IAsset clearSmoothStreamAsset =
- ConvertMP4ToMultibitrateMP4sToSmoothStreaming(asset);
-
- // Create a common encryption content key that is used
- // a) to set the key values in the MediaEncryptor_PlayReadyProtection.xml file
- // that is used for encryption.
- // b) to configure the license delivery service and
- //
- Guid keyId;
- byte[] contentKey;
-
- IContentKey key = CreateCommonEncryptionKey(out keyId, out contentKey);
-
- // The content key authorization policy must be configured by you
- // and met by the client in order for the PlayReady license
- // to be delivered to the client.
- // In this example the Media Services PlayReady license delivery service is used.
- ConfigureLicenseDeliveryService(key);
-
- // Get the Media Services PlayReady license delivery URL.
- // This URL will be assigned to the licenseAcquisitionUrl property
- // of the MediaEncryptor_PlayReadyProtection.xml file.
- Uri acquisitionUrl = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.PlayReadyLicense);
-
- // Update the MediaEncryptor_PlayReadyProtection.xml file with the key and URL info.
- UpdatePlayReadyConfigurationXMLFile(keyId, contentKey, acquisitionUrl);
-
- // Encrypt your clear Smooth Streaming to Smooth Streaming with PlayReady.
- IAsset outputAsset = CreateSmoothStreamEncryptedWithPlayReady(clearSmoothStreamAsset);
-
- // You can use the http://smf.cloudapp.net/healthmonitor player
- // to test the smoothStreamURL URL.
- string smoothStreamURL = outputAsset.GetSmoothStreamingUri().ToString();
- Console.WriteLine("Smooth Streaming URL:");
- Console.WriteLine(smoothStreamURL);
-
- // You can use the http://dashif.org/reference/players/javascript/ player
- // to test the dashURL URL.
- string dashURL = outputAsset.GetMpegDashUri().ToString();
- Console.WriteLine("MPEG DASH URL:");
- Console.WriteLine(dashURL);
- }
-
- /// <summary>
- /// Creates a job with 2 tasks:
- /// 1 task - encodes a single MP4 to multibitrate MP4s,
- /// 2 task - packages MP4s to Smooth Streaming.
- /// </summary>
- /// <returns>The output asset.</returns>
- public static IAsset ConvertMP4ToMultibitrateMP4sToSmoothStreaming(IAsset asset)
- {
- // Create a new job.
- IJob job = _context.Jobs.Create("Convert MP4 to Smooth Streaming.");
-
- // Add task 1 - Encode single MP4 into multibitrate MP4s.
- IAsset MP4sAsset = EncodeMP4IntoMultibitrateMP4sTask(job, asset);
- // Add task 2 - Package a multibitrate MP4 set to Clear Smooth Stream.
- IAsset packagedAsset = PackageMP4ToSmoothStreamingTask(job, MP4sAsset);
-
- // Submit the job and wait until it is completed.
- job.Submit();
- job = job.StartExecutionProgressTask(
- j =>
- {
- Console.WriteLine("Job state: {0}", j.State);
- Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
- },
- CancellationToken.None).Result;
-
- // Get the output asset that contains the Smooth Streaming asset.
- return job.OutputMediaAssets[1];
- }
-
- /// <summary>
- /// Encrypts Smooth Stream with PlayReady.
- /// Then creates a Smooth Streaming Url.
- /// </summary>
- /// <param name="clearSmoothAsset">Asset that contains clear Smooth Streaming.</param>
- /// <returns>The output asset.</returns>
- public static IAsset CreateSmoothStreamEncryptedWithPlayReady(IAsset clearSmoothStreamAsset)
- {
- // Create a job.
- IJob job = _context.Jobs.Create("Encrypt to PlayReady Smooth Streaming.");
-
- // Add task 1 - Encrypt Smooth Streaming with PlayReady
- IAsset encryptedSmoothAsset =
- EncryptSmoothStreamWithPlayReadyTask(job, clearSmoothStreamAsset);
-
- // Submit the job and wait until it is completed.
- job.Submit();
- job = job.StartExecutionProgressTask(
- j =>
- {
- Console.WriteLine("Job state: {0}", j.State);
- Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
- },
- CancellationToken.None).Result;
-
- // The OutputMediaAssets[0] contains the desired asset.
- _context.Locators.Create(
- LocatorType.OnDemandOrigin,
- job.OutputMediaAssets[0],
- AccessPermissions.Read,
- TimeSpan.FromDays(30));
-
- return job.OutputMediaAssets[0];
- }
-
- /// <summary>
- /// Create a common encryption content key that is used
- /// to set the key values in the MediaEncryptor_PlayReadyProtection.xml file
- /// that is used for encryption.
- /// </summary>
- /// <param name="keyId"></param>
- /// <param name="contentKey"></param>
- /// <returns></returns>
- public static IContentKey CreateCommonEncryptionKey(out Guid keyId, out byte[] contentKey)
- {
- keyId = Guid.NewGuid();
- contentKey = GetRandomBuffer(16);
-
- IContentKey key = _context.ContentKeys.Create(
- keyId,
- contentKey,
- "ContentKey",
- ContentKeyType.CommonEncryption);
-
- return key;
- }
-
- /// <summary>
- /// Update your configuration .xml file dynamically.
- /// </summary>
- public static void UpdatePlayReadyConfigurationXMLFile(Guid keyId, byte[] keyValue, Uri licenseAcquisitionUrl)
- {
- string xmlFileName = Path.Combine(_configurationXMLFiles,
- @"MediaEncryptor_PlayReadyProtection.xml");
-
- XNamespace xmlns = "http://schemas.microsoft.com/iis/media/v4/TM/TaskDefinition#";
-
- // Prepare the encryption task template
- XDocument doc = XDocument.Load(xmlFileName);
-
- var licenseAcquisitionUrlEl = doc
- .Descendants(xmlns + "property")
- .Where(p => p.Attribute("name").Value == "licenseAcquisitionUrl")
- .FirstOrDefault();
- var contentKeyEl = doc
- .Descendants(xmlns + "property")
- .Where(p => p.Attribute("name").Value == "contentKey")
- .FirstOrDefault();
- var keyIdEl = doc
- .Descendants(xmlns + "property")
- .Where(p => p.Attribute("name").Value == "keyId")
- .FirstOrDefault();
-
- // Update the "value" property.
- if (licenseAcquisitionUrlEl != null)
- licenseAcquisitionUrlEl.Attribute("value").SetValue(licenseAcquisitionUrl.ToString());
-
- if (contentKeyEl != null)
- contentKeyEl.Attribute("value").SetValue(Convert.ToBase64String(keyValue));
-
- if (keyIdEl != null)
- keyIdEl.Attribute("value").SetValue(keyId);
-
- doc.Save(xmlFileName);
- }
-
- /// <summary>
- /// Uploads a single file.
- /// </summary>
- /// <param name="fileDir">The location of the files.</param>
- /// <param name="assetCreationOptions">
- /// You can specify the following encryption options for the AssetCreationOptions.
- /// None: no encryption.
- /// StorageEncrypted: storage encryption. Encrypts a clear input file
- /// before it is uploaded to Azure storage.
- /// CommonEncryptionProtected: for Common Encryption Protected (CENC) files.
- /// For example, a set of files that are already PlayReady encrypted.
- /// EnvelopeEncryptionProtected: for HLS with AES encryption files.
- /// NOTE: The files must have been encoded and encrypted by Transform Manager.
- /// </param>
- /// <returns>Returns an asset that contains a single file.</returns>
- private static IAsset IngestSingleMP4File(string fileDir, AssetCreationOptions assetCreationOptions)
- {
- // Use the SDK extension method to create a new asset by
- // uploading a mezzanine file from a local path.
- IAsset asset = _context.Assets.CreateFromFile(
- fileDir,
- assetCreationOptions,
- (af, p) =>
- {
- Console.WriteLine("Uploading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);
- });
-
- return asset;
- }
-
- /// <summary>
- /// Creates a task to encode to Adaptive Bitrate.
- /// Adds the new task to a job.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The input asset.</param>
- /// <returns>The output asset.</returns>
- private static IAsset EncodeMP4IntoMultibitrateMP4sTask(IJob job, IAsset asset)
- {
- // Get the SDK extension method to get a reference to the Media Encoder Standard.
- IMediaProcessor encoder = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.MediaEncoderStandard);
-
- ITask adaptiveBitrateTask = job.Tasks.AddNew("MP4 to Adaptive Bitrate Task",
- encoder,
- "Adaptive Streaming",
- TaskOptions.None);
-
- // Specify the input Asset
- adaptiveBitrateTask.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is in the clear (unencrypted).
- IAsset abrAsset = adaptiveBitrateTask.OutputAssets.AddNew("Multibitrate MP4s",
- AssetCreationOptions.None);
-
- return abrAsset;
- }
-
- /// <summary>
- /// Creates a task to convert the MP4 file(s) to a Smooth Streaming asset.
- /// Adds the new task to a job.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The input asset.</param>
- /// <returns>The output asset.</returns>
- private static IAsset PackageMP4ToSmoothStreamingTask(IJob job, IAsset asset)
- {
- // Get the SDK extension method to get a reference to the Azure Media Packager.
- IMediaProcessor packager = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.WindowsAzureMediaPackager);
-
- // Azure Media Packager does not accept string presets, so load xml configuration
- string smoothConfig = File.ReadAllText(Path.Combine(
- _configurationXMLFiles,
- "MediaPackager_MP4toSmooth.xml"));
-
- // Create a new Task to convert adaptive bitrate to Smooth Streaming.
- ITask smoothStreamingTask = job.Tasks.AddNew("MP4 to Smooth Task",
- packager,
- smoothConfig,
- TaskOptions.None);
-
- // Specify the input Asset, which is the output Asset from the first task
- smoothStreamingTask.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is in the clear (unencrypted).
- IAsset smoothOutputAsset =
- smoothStreamingTask.OutputAssets.AddNew("Clear Smooth Stream",
- AssetCreationOptions.None);
-
- return smoothOutputAsset;
- }
-
- /// <summary>
- /// Creates a task to encrypt Smooth Streaming with PlayReady.
- /// Note: To deliver DASH, make sure to set the useSencBox and adjustSubSamples
- /// configuration properties to true.
- /// In this example, MediaEncryptor_PlayReadyProtection.xml contains configuration.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The input asset.</param>
- /// <returns>The output asset.</returns>
- private static IAsset EncryptSmoothStreamWithPlayReadyTask(IJob job, IAsset asset)
- {
- // Get the SDK extension method to get a reference to the Azure Media Encryptor.
- IMediaProcessor playreadyProcessor = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.WindowsAzureMediaEncryptor);
-
- // Read the configuration XML.
- //
- // Note that the configuration defined in MediaEncryptor_PlayReadyProtection.xml
- // is using keySeedValue. It is recommended that you do this only for testing
- // and not in production. For more information, see
- // https://msdn.microsoft.com/library/windowsazure/dn189154.aspx.
- //
- string configPlayReady = File.ReadAllText(Path.Combine(_configurationXMLFiles,
- @"MediaEncryptor_PlayReadyProtection.xml"));
-
- ITask playreadyTask = job.Tasks.AddNew("My PlayReady Task",
- playreadyProcessor,
- configPlayReady,
- TaskOptions.ProtectedConfiguration);
-
- playreadyTask.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.CommonEncryptionProtected.
- IAsset playreadyAsset = playreadyTask.OutputAssets.AddNew(
- "PlayReady Smooth Streaming",
- AssetCreationOptions.CommonEncryptionProtected);
-
- return playreadyAsset;
- }
-
- /// <summary>
- /// Configures authorization policy for the content key.
- /// </summary>
- /// <param name="contentKey">The content key.</param>
- static public void ConfigureLicenseDeliveryService(IContentKey contentKey)
- {
- // Create ContentKeyAuthorizationPolicy with Open restrictions
- // and create authorization policy
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions = new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Open",
- KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
- Requirements = null
- }
- };
-
- // Configure PlayReady license template.
- string newLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
-
- IContentKeyAuthorizationPolicyOption policyOption =
- _context.ContentKeyAuthorizationPolicyOptions.Create("",
- ContentKeyDeliveryType.PlayReadyLicense,
- restrictions, newLicenseTemplate);
-
- IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Deliver Common Content Key with no restrictions").
- Result;
-
- contentKeyAuthorizationPolicy.Options.Add(policyOption);
-
- // Associate the content key authorization policy with the content key.
- contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
- contentKey = contentKey.UpdateAsync().Result;
- }
-
- static private string ConfigurePlayReadyLicenseTemplate()
- {
- // The following code configures PlayReady License Template using .NET classes
- // and returns the XML string.
-
- PlayReadyLicenseResponseTemplate responseTemplate = new PlayReadyLicenseResponseTemplate();
- PlayReadyLicenseTemplate licenseTemplate = new PlayReadyLicenseTemplate();
-
- responseTemplate.LicenseTemplates.Add(licenseTemplate);
-
- return MediaServicesLicenseTemplateSerializer.Serialize(responseTemplate);
- }
-
- static private byte[] GetRandomBuffer(int length)
- {
- var returnValue = new byte[length];
-
- using (var rng =
- new System.Security.Cryptography.RNGCryptoServiceProvider())
- {
- rng.GetBytes(returnValue);
- }
-
- return returnValue;
- }
- }
- }
-```
-
-## Using Static Encryption to Protect HLSv3 with AES-128
-If you want to encrypt your HLS with AES-128, you have a choice of using dynamic encryption (the recommended option) or static encryption (as shown in this section). If you decide to use dynamic encryption, see [Using AES-128 Dynamic Encryption and Key Delivery Service](media-services-playready-license-template-overview.md).
-
-> [!NOTE]
-> In order to convert your content into HLS, you must first convert/encode your content into Smooth Streaming.
-> Also, for the HLS to get encrypted with AES, make sure to set the following properties in your MediaPackager_SmoothToHLS.xml file: set the encrypt property to true, set the key value, and set the keyuri value to point to your authentication/authorization server.
-> Media Services creates a key file and places it in the asset container. You should copy the /asset-containerguid/*.key file to your server (or create your own key file) and then delete the *.key file from the asset container.
->
->
-
-The example in this section encodes a mezzanine file (in this case MP4) into multibitrate MP4 files and then packages the MP4s into Smooth Streaming. It then packages the Smooth Streaming into HTTP Live Streaming (HLS) encrypted with Advanced Encryption Standard (AES) 128-bit stream encryption. Make sure to update the following code to point to the folder where your input MP4 file is located, and also to where your MediaPackager_MP4ToSmooth.xml and MediaPackager_SmoothToHLS.xml configuration files are located. You can find the definition for these files in the [Task Preset for Azure Media Packager](/previous-versions/azure/reference/hh973635(v=azure.100)) article.
-
-```csharp
- using System;
- using System.Collections.Generic;
- using System.Configuration;
- using System.IO;
- using System.Linq;
- using System.Text;
- using System.Threading;
- using System.Threading.Tasks;
- using Microsoft.WindowsAzure.MediaServices.Client;
- using System.Xml.Linq;
-
- namespace MediaServicesContentProtection
- {
- class Program
- {
- // Paths to support files (within the above base path). You can use
- // the provided sample media files from the "SupportFiles" folder, or
- // provide paths to your own media files below to run these samples.
-
- private static readonly string _mediaFiles =
- Path.GetFullPath(@"../..\Media");
-
- private static readonly string _singleMP4File =
- Path.Combine(_mediaFiles, @"SingleMP4\BigBuckBunny.mp4");
-
- // XML Configuration files path.
- private static readonly string _configurationXMLFiles = @"../..\Configurations\";
-
- private static CloudMediaContext _context = null;
-
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Encoding and encrypting assets //////////////////////
-
- // Load an MP4 file.
- IAsset asset = IngestSingleMP4File(_singleMP4File, AssetCreationOptions.None);
-
- // Encode an MP4 file to a set of multibitrate MP4s.
- // Then, package a set of MP4s to clear Smooth Streaming.
- IAsset clearSmoothStreamAsset = ConvertMP4ToMultibitrateMP4sToSmoothStreaming(asset);
-
- // Create HLS encrypted with AES.
- IAsset HLSEncryptedWithAESAsset = CreateHLSEncryptedWithAES(clearSmoothStreamAsset);
-
- // You can use the following player to test the HLS with AES stream.
- // https://apps.microsoft.com/windows/app/3ivx-hls-player/f79ce7d0-2993-4658-bc4e-83dc182a0614
- string hlsWithAESURL = HLSEncryptedWithAESAsset.GetHlsUri().ToString();
- Console.WriteLine("HLS with AES URL:");
- Console.WriteLine(hlsWithAESURL);
- }
-
- /// <summary>
- /// Creates a job with 2 tasks:
- /// 1 task - encodes a single MP4 to multibitrate MP4s,
- /// 2 task - packages MP4s to Smooth Streaming.
- /// </summary>
- /// <returns>The output asset.</returns>
- public static IAsset ConvertMP4ToMultibitrateMP4sToSmoothStreaming(IAsset asset)
- {
- // Create a new job.
- IJob job = _context.Jobs.Create("Convert MP4 to Smooth Streaming.");
-
- // Add task 1 - Encode single MP4 into multibitrate MP4s.
- IAsset MP4sAsset = EncodeSingleMP4IntoMultibitrateMP4sTask(job, asset);
- // Add task 2 - Package a multibitrate MP4 set to Clear Smooth Streaming.
- IAsset packagedAsset = PackageMP4ToSmoothStreamingTask(job, MP4sAsset);
-
- // Submit the job and wait until it is completed.
- job.Submit();
- job = job.StartExecutionProgressTask(
- j =>
- {
- Console.WriteLine("Job state: {0}", j.State);
- Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
- },
- CancellationToken.None).Result;
-
- // Get the output asset that contains Smooth Streaming.
- return job.OutputMediaAssets[1];
- }
-
- /// <summary>
- /// Encrypts an HLS with AES-128.
- /// </summary>
- /// <param name="clearSmoothAsset">Asset that contains clear Smooth Streaming.</param>
- /// <returns>The output asset.</returns>
- public static IAsset CreateHLSEncryptedWithAES(IAsset clearSmoothStreamAsset)
- {
- IJob job = _context.Jobs.Create("Encrypt to HLS with AES.");
-
- // Add task 1 - Package clear Smooth Streaming to HLS with AES.
- PackageSmoothStreamToHLS(job, clearSmoothStreamAsset);
-
- // Submit the job and wait until it is completed.
- job.Submit();
- job = job.StartExecutionProgressTask(
- j =>
- {
- Console.WriteLine("Job state: {0}", j.State);
- Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
- },
- CancellationToken.None).Result;
-
- // The OutputMediaAssets[0] contains the desired asset.
- _context.Locators.Create(
- LocatorType.OnDemandOrigin,
- job.OutputMediaAssets[0],
- AccessPermissions.Read,
- TimeSpan.FromDays(30));
-
- return job.OutputMediaAssets[0];
- }
-
- /// <summary>
- /// Uploads a single file.
- /// </summary>
- /// <param name="fileDir">The location of the files.</param>
- /// <param name="assetCreationOptions">
- /// You can specify the following encryption options for the AssetCreationOptions.
- /// None: no encryption.
- /// StorageEncrypted: storage encryption. Encrypts a clear input file
- /// before it is uploaded to Azure storage.
- /// CommonEncryptionProtected: for Common Encryption Protected (CENC) files.
- /// For example, a set of files that are already PlayReady encrypted.
- /// EnvelopeEncryptionProtected: for HLS with AES encryption files.
- /// NOTE: The files must have been encoded and encrypted by Transform Manager.
- /// </param>
- /// <returns>Returns an asset that contains a single file.</returns>
- private static IAsset IngestSingleMP4File(string fileDir, AssetCreationOptions assetCreationOptions)
- {
- // Use the SDK extension method to create a new asset by
- // uploading a mezzanine file from a local path.
- IAsset asset = _context.Assets.CreateFromFile(
- fileDir,
- assetCreationOptions,
- (af, p) =>
- {
- Console.WriteLine("Uploading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);
- });
-
- return asset;
- }
-
- /// <summary>
- /// Creates a task to encode to Adaptive Bitrate.
- /// Adds the new task to a job.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The input asset.</param>
- /// <returns>The output asset.</returns>
- private static IAsset EncodeSingleMP4IntoMultibitrateMP4sTask(IJob job, IAsset asset)
- {
- // Get the SDK extension method to get a reference to the Media Encoder Standard.
- IMediaProcessor encoder = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.MediaEncoderStandard);
-
- ITask adaptiveBitrateTask = job.Tasks.AddNew("MP4 to Adaptive Bitrate Task",
- encoder,
- "Adaptive Streaming",
- TaskOptions.None);
-
- // Specify the input Asset
- adaptiveBitrateTask.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is in the clear (unencrypted).
- IAsset abrAsset = adaptiveBitrateTask.OutputAssets.AddNew("Multibitrate MP4s",
- AssetCreationOptions.None);
-
- return abrAsset;
- }
-
- /// <summary>
- /// Creates a task to convert the MP4 file(s) to a Smooth Streaming asset.
- /// Adds the new task to a job.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The input asset.</param>
- /// <returns>The output asset.</returns>
- private static IAsset PackageMP4ToSmoothStreamingTask(IJob job, IAsset asset)
- {
- // Get the SDK extension method to get a reference to the Azure Media Packager.
- IMediaProcessor packager = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.WindowsAzureMediaPackager);
-
- // Azure Media Packager does not accept string presets, so load xml configuration
- string smoothConfig = File.ReadAllText(Path.Combine(
- _configurationXMLFiles,
- "MediaPackager_MP4toSmooth.xml"));
-
- // Create a new Task to convert adaptive bitrate to Smooth Streaming.
- ITask smoothStreamingTask = job.Tasks.AddNew("MP4 to Smooth Task",
- packager,
- smoothConfig,
- TaskOptions.None);
-
- // Specify the input Asset, which is the output Asset from the first task
- smoothStreamingTask.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is in the clear (unencrypted).
- IAsset smoothOutputAsset =
- smoothStreamingTask.OutputAssets.AddNew("Clear Smooth Streaming",
- AssetCreationOptions.None);
-
- return smoothOutputAsset;
- }
-
- /// <summary>
- /// Converts Smooth Streaming to HLS.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The Smooth Streaming asset.</param>
- /// <returns>The asset that was packaged to HLS.</returns>
- private static IAsset PackageSmoothStreamToHLS(IJob job, IAsset smoothStreamAsset)
- {
- // Get the SDK extension method to get a reference to the Azure Media Packager.
- IMediaProcessor processor = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.WindowsAzureMediaPackager);
-
- // Read the configuration data into a string.
- // For the HLS to get encrypted with AES make sure to set the
- // encrypt configuration property to true.
- //
- // In production, it is recommended to do the following:
- // Set a Key url for your authn/authz server.
- // Copy the /asset-containerguid/*.key file to your server (or craft a key file for yourself).
- // Delete *.key from the asset container.
- //
- string configuration = File.ReadAllText(Path.Combine(_configurationXMLFiles, @"MediaPackager_SmoothToHLS.xml"));
-
- // Create a task with the encoding details, using a configuration file.
- ITask task = job.Tasks.AddNew("My Smooth Streaming to HLS Task",
- processor,
- configuration,
- TaskOptions.ProtectedConfiguration);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(smoothStreamAsset);
-
- // Add an output asset to contain the results of the job.
- IAsset outputAsset =
- task.OutputAssets.AddNew("HLS asset", AssetCreationOptions.None);
-
- return outputAsset;
- }
- }
- }
-```
-
-## Using Static Encryption to Protect HLSv3 with PlayReady
-If you want to protect your content with PlayReady, you have a choice of using [dynamic encryption](media-services-protect-with-playready-widevine.md) (the recommended option) or static encryption (as described in this section).
-
-> [!NOTE]
-> In order to protect your content using PlayReady you must first convert/encode your content into a Smooth Streaming format.
->
->
-
-The example in this section encodes a mezzanine file (in this case MP4) into multibitrate MP4 files. It then packages MP4s into Smooth Streaming and encrypts Smooth Streaming with PlayReady. To produce HTTP Live Streaming (HLS) encrypted with PlayReady, the PlayReady Smooth Streaming asset needs to be packaged into HLS. This article demonstrates how to perform all these steps.
-
-Media Services now provides a service for delivering Microsoft PlayReady licenses. The example in this article shows how to configure the Media Services PlayReady license delivery service (see the **ConfigureLicenseDeliveryService** method defined in the code below).
-
-Make sure to update the following code to point to the folder where your input MP4 file is located, and also to where your MediaPackager_MP4ToSmooth.xml, MediaPackager_SmoothToHLS.xml, and MediaEncryptor_PlayReadyProtection.xml files are located. MediaPackager_MP4ToSmooth.xml and MediaPackager_SmoothToHLS.xml are defined in [Task Preset for Azure Media Packager](/previous-versions/azure/reference/hh973635(v=azure.100)) and MediaEncryptor_PlayReadyProtection.xml is defined in the [Task Preset for Azure Media Encryptor](/previous-versions/azure/reference/hh973610(v=azure.100)) article.
-
-```csharp
- using System;
- using System.Collections.Generic;
- using System.Configuration;
- using System.IO;
- using System.Linq;
- using System.Text;
- using System.Threading;
- using System.Threading.Tasks;
- using Microsoft.WindowsAzure.MediaServices.Client;
- using System.Xml.Linq;
- using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
- using Microsoft.WindowsAzure.MediaServices.Client.DynamicEncryption;
-
- namespace MediaServicesContentProtection
- {
- class Program
- {
- // Paths to support files (within the above base path). You can use
- // the provided sample media files from the "SupportFiles" folder, or
- // provide paths to your own media files below to run these samples.
-
- private static readonly string _mediaFiles =
- Path.GetFullPath(@"../..\Media");
-
- private static readonly string _singleMP4File =
- Path.Combine(_mediaFiles, @"SingleMP4\BigBuckBunny.mp4");
-
- // XML Configuration files path.
- private static readonly string _configurationXMLFiles = @"../..\Configurations\";
-
- private static CloudMediaContext _context = null;
-
- // Read values from the App.config file.
- private static readonly string _AADTenantDomain =
- ConfigurationManager.AppSettings["AMSAADTenantDomain"];
- private static readonly string _RESTAPIEndpoint =
- ConfigurationManager.AppSettings["AMSRESTAPIEndpoint"];
- private static readonly string _AMSClientId =
- ConfigurationManager.AppSettings["AMSClientId"];
- private static readonly string _AMSClientSecret =
- ConfigurationManager.AppSettings["AMSClientSecret"];
-
- static void Main(string[] args)
- {
- AzureAdTokenCredentials tokenCredentials =
- new AzureAdTokenCredentials(_AADTenantDomain,
- new AzureAdClientSymmetricKey(_AMSClientId, _AMSClientSecret),
- AzureEnvironments.AzureCloudEnvironment);
-
- var tokenProvider = new AzureAdTokenProvider(tokenCredentials);
-
- _context = new CloudMediaContext(new Uri(_RESTAPIEndpoint), tokenProvider);
-
- // Load an MP4 file.
- IAsset asset = IngestSingleMP4File(_singleMP4File, AssetCreationOptions.None);
-
- // Encode an MP4 file to a set of multibitrate MP4s.
- // Then, package a set of MP4s to clear Smooth Streaming.
- IAsset clearSmoothStreamAsset = ConvertMP4ToMultibitrateMP4sToSmoothStreaming(asset);
-
- // Create a common encryption content key that is used
- // a) to set the key values in the MediaEncryptor_PlayReadyProtection.xml file
- // that is used for encryption.
- // b) to configure the license delivery service and
- //
- Guid keyId;
- byte[] contentKey;
-
- IContentKey key = CreateCommonEncryptionKey(out keyId, out contentKey);
-
- // The content key authorization policy must be configured by you
- // and met by the client in order for the PlayReady license
- // to be delivered to the client.
- // In this example the Media Services PlayReady license delivery service is used.
- ConfigureLicenseDeliveryService(key);
-
- // Get the Media Services PlayReady license delivery URL.
- // This URL will be assigned to the licenseAcquisitionUrl property
- // of the MediaEncryptor_PlayReadyProtection.xml file.
- Uri acquisitionUrl = key.GetKeyDeliveryUrl(ContentKeyDeliveryType.PlayReadyLicense);
-
- // Update the MediaEncryptor_PlayReadyProtection.xml file with the key and URL info.
- UpdatePlayReadyConfigurationXMLFile(keyId, contentKey, acquisitionUrl);
-
- // Create HLS encrypted with PlayReady.
- IAsset playReadyHLSAsset = CreateHLSEncryptedWithPlayReady(clearSmoothStreamAsset);
- //
- string hlsWithPlayReadyURL = playReadyHLSAsset.GetHlsUri().ToString();
- Console.WriteLine("HLS with PlayReady URL:");
- Console.WriteLine(hlsWithPlayReadyURL);
- }
-
- /// <summary>
- /// Creates a job with 2 tasks:
-            /// Task 1 - encodes a single MP4 to multibitrate MP4s.
-            /// Task 2 - packages the MP4s to Smooth Streaming.
- /// </summary>
- /// <returns>The output asset.</returns>
- public static IAsset ConvertMP4ToMultibitrateMP4sToSmoothStreaming(IAsset asset)
- {
- // Create a new job.
- IJob job = _context.Jobs.Create("Convert MP4 to Smooth Streaming.");
-
- // Add task 1 - Encode single MP4 into multibitrate MP4s.
- IAsset MP4sAsset = EncodeSingleMP4IntoMultibitrateMP4sTask(job, asset);
- // Add task 2 - Package a multibitrate MP4 set to Clear Smooth Streaming.
- IAsset packagedAsset = PackageMP4ToSmoothStreamingTask(job, MP4sAsset);
-
- // Submit the job and wait until it is completed.
- job.Submit();
- job = job.StartExecutionProgressTask(
- j =>
- {
- Console.WriteLine("Job state: {0}", j.State);
- Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
- },
- CancellationToken.None).Result;
-
- // Get the output asset that contains Smooth Streaming.
- return job.OutputMediaAssets[1];
- }
-
- /// <summary>
- /// Create a common encryption content key that is used
- /// to set the key values in the MediaEncryptor_PlayReadyProtection.xml file
- /// that is used for encryption.
- /// </summary>
- /// <param name="keyId"></param>
- /// <param name="contentKey"></param>
- /// <returns></returns>
- public static IContentKey CreateCommonEncryptionKey(out Guid keyId, out byte[] contentKey)
- {
- keyId = Guid.NewGuid();
- contentKey = GetRandomBuffer(16);
-
- IContentKey key = _context.ContentKeys.Create(
- keyId,
- contentKey,
- "ContentKey",
- ContentKeyType.CommonEncryption);
-
- return key;
- }
-
- /// <summary>
- /// Update your configuration .xml file dynamically.
- /// </summary>
- public static void UpdatePlayReadyConfigurationXMLFile(Guid keyId, byte[] keyValue, Uri licenseAcquisitionUrl)
- {
- string xmlFileName = Path.Combine(_configurationXMLFiles,
- @"MediaEncryptor_PlayReadyProtection.xml");
-
- XNamespace xmlns = "http://schemas.microsoft.com/iis/media/v4/TM/TaskDefinition#";
-
- // Prepare the encryption task template
- XDocument doc = XDocument.Load(xmlFileName);
-
- var licenseAcquisitionUrlEl = doc
- .Descendants(xmlns + "property")
- .Where(p => p.Attribute("name").Value == "licenseAcquisitionUrl")
- .FirstOrDefault();
- var contentKeyEl = doc
- .Descendants(xmlns + "property")
- .Where(p => p.Attribute("name").Value == "contentKey")
- .FirstOrDefault();
- var keyIdEl = doc
- .Descendants(xmlns + "property")
- .Where(p => p.Attribute("name").Value == "keyId")
- .FirstOrDefault();
-
- // Update the "value" property.
- if (licenseAcquisitionUrlEl != null)
- licenseAcquisitionUrlEl.Attribute("value").SetValue(licenseAcquisitionUrl.ToString());
-
- if (contentKeyEl != null)
- contentKeyEl.Attribute("value").SetValue(Convert.ToBase64String(keyValue));
-
- if (keyIdEl != null)
- keyIdEl.Attribute("value").SetValue(keyId);
-
- doc.Save(xmlFileName);
- }
-
- /// <summary>
-            /// Encrypts clear Smooth Streaming to Smooth Streaming with PlayReady.
-            /// Then, packages the PlayReady Smooth Streaming to HLS with PlayReady.
-            /// </summary>
-            /// <param name="clearSmoothStreamAsset">Asset that contains clear Smooth Streaming.</param>
- /// <returns>The output asset.</returns>
- public static IAsset CreateHLSEncryptedWithPlayReady(IAsset clearSmoothStreamAsset)
- {
- IJob job = _context.Jobs.Create("Encrypt to HLS with PlayReady.");
-
- // Add task 1 - Encrypt Smooth Streaming with PlayReady
- IAsset encryptedSmoothAsset =
- EncryptSmoothStreamWithPlayReadyTask(job, clearSmoothStreamAsset);
-
- // Add task 2 - Package to HLS with PlayReady.
- PackageSmoothStreamToHLS(job, encryptedSmoothAsset);
-
- // Submit the job and wait until it is completed.
- job.Submit();
- job = job.StartExecutionProgressTask(
- j =>
- {
- Console.WriteLine("Job state: {0}", j.State);
- Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
- },
- CancellationToken.None).Result;
-
- // Since we had two tasks, the OutputMediaAssets[1]
- // contains the desired asset.
- _context.Locators.Create(
- LocatorType.OnDemandOrigin,
- job.OutputMediaAssets[1],
- AccessPermissions.Read,
- TimeSpan.FromDays(30));
-
- return job.OutputMediaAssets[1];
- }
-
- /// <summary>
- /// Uploads a single file.
- /// </summary>
- /// <param name="fileDir">The location of the files.</param>
- /// <param name="assetCreationOptions">
- /// You can specify the following encryption options for the AssetCreationOptions.
- /// None: no encryption.
- /// StorageEncrypted: storage encryption. Encrypts a clear input file
- /// before it is uploaded to Azure storage.
- /// CommonEncryptionProtected: for Common Encryption Protected (CENC) files.
- /// For example, a set of files that are already PlayReady encrypted.
- /// EnvelopeEncryptionProtected: for HLS with AES encryption files.
- /// NOTE: The files must have been encoded and encrypted by Transform Manager.
- /// </param>
-            /// <returns>Returns an asset that contains a single file.</returns>
- private static IAsset IngestSingleMP4File(string fileDir, AssetCreationOptions assetCreationOptions)
- {
- // Use the SDK extension method to create a new asset by
- // uploading a mezzanine file from a local path.
- IAsset asset = _context.Assets.CreateFromFile(
- fileDir,
- assetCreationOptions,
- (af, p) =>
- {
- Console.WriteLine("Uploading '{0}' - Progress: {1:0.##}%", af.Name, p.Progress);
- });
-
- return asset;
-
- }
- /// <summary>
- /// Creates a task to encode to Adaptive Bitrate.
- /// Adds the new task to a job.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The input asset.</param>
- /// <returns>The output asset.</returns>
- private static IAsset EncodeSingleMP4IntoMultibitrateMP4sTask(IJob job, IAsset asset)
- {
- // Get the SDK extension method to get a reference to the Media Encoder Standard.
- IMediaProcessor encoder = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.MediaEncoderStandard);
-
- ITask adaptiveBitrateTask = job.Tasks.AddNew("MP4 to Adaptive Bitrate Task",
- encoder,
- "Adaptive Streaming",
- TaskOptions.None);
-
- // Specify the input Asset
- adaptiveBitrateTask.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is in the clear (unencrypted).
- IAsset abrAsset = adaptiveBitrateTask.OutputAssets.AddNew("Multibitrate MP4s",
- AssetCreationOptions.None);
-
- return abrAsset;
- }
-
- /// <summary>
- /// Creates a task to convert the MP4 file(s) to a Smooth Streaming asset.
- /// Adds the new task to a job.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The input asset.</param>
- /// <returns>The output asset.</returns>
- private static IAsset PackageMP4ToSmoothStreamingTask(IJob job, IAsset asset)
- {
- // Get the SDK extension method to get a reference to the Azure Media Packager.
- IMediaProcessor packager = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.WindowsAzureMediaPackager);
-
- // Azure Media Packager does not accept string presets, so load xml configuration
- string smoothConfig = File.ReadAllText(Path.Combine(
- _configurationXMLFiles,
- "MediaPackager_MP4toSmooth.xml"));
-
- // Create a new Task to convert adaptive bitrate to Smooth Streaming.
- ITask smoothStreamingTask = job.Tasks.AddNew("MP4 to Smooth Task",
- packager,
- smoothConfig,
- TaskOptions.None);
-
- // Specify the input Asset, which is the output Asset from the first task
- smoothStreamingTask.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.None, which
- // means the output asset is in the clear (unencrypted).
- IAsset smoothOutputAsset =
- smoothStreamingTask.OutputAssets.AddNew("Clear Smooth Streaming",
- AssetCreationOptions.None);
-
- return smoothOutputAsset;
- }
-
- /// <summary>
- /// Converts Smooth Stream to HLS.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The Smooth Stream asset.</param>
- /// <returns>The asset that was packaged to HLS.</returns>
- private static IAsset PackageSmoothStreamToHLS(IJob job, IAsset smoothStreamAsset)
- {
- // Get the SDK extension method to get a reference to the Azure Media Packager.
- IMediaProcessor processor = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.WindowsAzureMediaPackager);
-
- // Read the configuration data into a string.
- //
- string configuration = File.ReadAllText(
- Path.Combine(_configurationXMLFiles,
- @"MediaPackager_SmoothToHLS.xml"));
-
- // Create a task with the encoding details, using a configuration file.
- ITask task = job.Tasks.AddNew("My Smooth to HLS Task",
- processor,
- configuration,
- TaskOptions.ProtectedConfiguration);
-
- // Specify the input asset to be encoded.
- task.InputAssets.Add(smoothStreamAsset);
-
- // Add an output asset to contain the results of the job.
- IAsset outputAsset =
- task.OutputAssets.AddNew("HLS asset", AssetCreationOptions.None);
-
- return outputAsset;
- }
-
- /// <summary>
- /// Creates a task to encrypt Smooth Streaming with PlayReady.
-            /// Note: To deliver DASH, make sure to set the useSencBox and adjustSubSamples
- /// configuration properties to true.
- /// </summary>
- /// <param name="job">The job to which to add the new task.</param>
- /// <param name="asset">The input asset.</param>
- /// <returns>The output asset.</returns>
- private static IAsset EncryptSmoothStreamWithPlayReadyTask(IJob job, IAsset asset)
- {
- // Get the SDK extension method to get a reference to the Azure Media Encryptor.
- IMediaProcessor playreadyProcessor = _context.MediaProcessors.GetLatestMediaProcessorByName(
- MediaProcessorNames.WindowsAzureMediaEncryptor);
-
- // Read the configuration XML.
- //
- // Note that the configuration defined in MediaEncryptor_PlayReadyProtection.xml
- // is using keySeedValue. It is recommended that you do this only for testing
- // and not in production. For more information, see
- // https://msdn.microsoft.com/library/windowsazure/dn189154.aspx.
- //
- string configPlayReady = File.ReadAllText(Path.Combine(_configurationXMLFiles,
- @"MediaEncryptor_PlayReadyProtection.xml"));
-
- ITask playreadyTask = job.Tasks.AddNew("My PlayReady Task",
- playreadyProcessor,
- configPlayReady,
- TaskOptions.ProtectedConfiguration);
-
- playreadyTask.InputAssets.Add(asset);
-
- // Add an output asset to contain the results of the job.
- // This output is specified as AssetCreationOptions.CommonEncryptionProtected.
- IAsset playreadyAsset = playreadyTask.OutputAssets.AddNew(
- "PlayReady Smooth Streaming",
- AssetCreationOptions.CommonEncryptionProtected);
-
- return playreadyAsset;
- }
-
- /// <summary>
- /// Configures authorization policy for the content key.
- /// </summary>
- /// <param name="contentKey">The content key.</param>
- static public void ConfigureLicenseDeliveryService(IContentKey contentKey)
- {
- // Create ContentKeyAuthorizationPolicy with Open restrictions
- // and create authorization policy
-
- List<ContentKeyAuthorizationPolicyRestriction> restrictions = new List<ContentKeyAuthorizationPolicyRestriction>
- {
- new ContentKeyAuthorizationPolicyRestriction
- {
- Name = "Open",
- KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
- Requirements = null
- }
- };
-
- // Configure PlayReady license template.
- string newLicenseTemplate = ConfigurePlayReadyLicenseTemplate();
-
- IContentKeyAuthorizationPolicyOption policyOption =
- _context.ContentKeyAuthorizationPolicyOptions.Create("",
- ContentKeyDeliveryType.PlayReadyLicense,
- restrictions, newLicenseTemplate);
-
- IContentKeyAuthorizationPolicy contentKeyAuthorizationPolicy = _context.
- ContentKeyAuthorizationPolicies.
- CreateAsync("Deliver Common Content Key with no restrictions").
- Result;
-
- contentKeyAuthorizationPolicy.Options.Add(policyOption);
-
- // Associate the content key authorization policy with the content key.
- contentKey.AuthorizationPolicyId = contentKeyAuthorizationPolicy.Id;
- contentKey = contentKey.UpdateAsync().Result;
- }
-
- static private string ConfigurePlayReadyLicenseTemplate()
- {
- // The following code configures PlayReady License Template using .NET classes
- // and returns the XML string.
-
- PlayReadyLicenseResponseTemplate responseTemplate = new PlayReadyLicenseResponseTemplate();
- PlayReadyLicenseTemplate licenseTemplate = new PlayReadyLicenseTemplate();
-
- responseTemplate.LicenseTemplates.Add(licenseTemplate);
-
- return MediaServicesLicenseTemplateSerializer.Serialize(responseTemplate);
- }
- static private byte[] GetRandomBuffer(int length)
- {
- var returnValue = new byte[length];
-
- using (var rng =
- new System.Security.Cryptography.RNGCryptoServiceProvider())
- {
- rng.GetBytes(returnValue);
- }
-
- return returnValue;
- }
-
- }
- }
-```
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
media-services Media Services Streaming Endpoints Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-streaming-endpoints-overview.md
- Title: Azure Media Services Streaming Endpoint overview | Microsoft Docs
-description: This article gives an overview of Azure Media Services streaming endpoints.
-writer: juliako
- Previously updated: 3/10/2021
-# Streaming endpoints overview
-
-In Microsoft Azure Media Services (AMS), a **Streaming Endpoint** represents a streaming service that can deliver content directly to a client player application, or to a Content Delivery Network (CDN) for further distribution. Media Services also provides seamless Azure CDN integration. The outbound stream from a StreamingEndpoint service can be a live stream, a video on demand, or a progressive download of your asset in your Media Services account. Each Azure Media Services account includes a default StreamingEndpoint; additional StreamingEndpoints can be created under the account. There are two versions of StreamingEndpoints, 1.0 and 2.0. Starting January 10, 2017, any newly created AMS account includes a version 2.0 **default** StreamingEndpoint, and additional streaming endpoints that you add to the account are also version 2.0. This change does not impact existing accounts; existing StreamingEndpoints are version 1.0 and can be upgraded to version 2.0. With this change come behavior, billing, and feature changes (for more information, see the **Streaming types and versions** section documented below).
-
-Azure Media Services added the following properties to the Streaming Endpoint entity: **CdnProvider**, **CdnProfile**, **StreamingEndpointVersion**. For a detailed overview of these properties, see [this](/rest/api/media/operations/streamingendpoint) article.
-
-When you create an Azure Media Services account, a default standard streaming endpoint is created for you in the **Stopped** state. You cannot delete the default streaming endpoint. Depending on Azure CDN availability in the targeted region, the newly created default streaming endpoint also includes "StandardVerizon" CDN provider integration by default.
-
-> [!NOTE]
-> Azure CDN integration can be disabled before starting the streaming endpoint. The `hostname` and the streaming URL remain the same whether or not you enable CDN.
-
-This topic gives an overview of the main functionalities that are provided by streaming endpoints.
-
-## Naming conventions
-
-For the default endpoint: `{AccountName}.streaming.mediaservices.windows.net`
-
-For any additional endpoints: `{EndpointName}-{AccountName}.streaming.mediaservices.windows.net`
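-
-For illustration, here's a minimal sketch (the account and endpoint names are hypothetical placeholders) that composes these hostnames:
-
-```csharp
-using System;
-
-class NamingConventions
-{
-    static void Main()
-    {
-        // Placeholders: replace with your own AMS account and endpoint names.
-        string accountName = "myamsaccount";
-        string endpointName = "myendpoint";
-
-        // Default endpoint: {AccountName}.streaming.mediaservices.windows.net
-        Console.WriteLine($"{accountName}.streaming.mediaservices.windows.net");
-
-        // Additional endpoints: {EndpointName}-{AccountName}.streaming.mediaservices.windows.net
-        Console.WriteLine($"{endpointName}-{accountName}.streaming.mediaservices.windows.net");
-    }
-}
-```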
-
-## Streaming types and versions
-
-### Standard/Premium types (version 2.0)
-
-Starting with the January 2017 release of Media Services, you have two streaming types: **Standard** (preview) and **Premium**. These types are part of the Streaming endpoint version "2.0".
-
-|Type|Description|
-|--|--|
-|**Standard**|The default Streaming Endpoint is a **Standard** type; it can be changed to the Premium type by adjusting streaming units.|
-|**Premium** |This option is suitable for professional scenarios that require higher scale or control. You move to a **Premium** type by adjusting streaming units.<br/>Dedicated Streaming Endpoints live in an isolated environment and do not compete for resources.|
-
-For customers looking to deliver content to large internet audiences, we recommend that you enable CDN on the Streaming Endpoint.
-
-For more detailed information, see the following [Comparing streaming types](#comparing-streaming-types) section.
-
-### Classic type (version 1.0)
-
-For users who created AMS accounts prior to the January 10, 2017 release, you have a **Classic** type of streaming endpoint. This type is part of streaming endpoint version "1.0".
-
-If your **version "1.0"** streaming endpoint has >=1 premium streaming units (SU), it will be premium streaming endpoint and will provide all AMS features (just like the **Standard/Premium** type) without any additional configuration steps.
-
->[!NOTE]
->**Classic** streaming endpoints (version "1.0" with 0 SU) provide limited features and don't include an SLA. We recommend migrating to the **Standard** type to get a better experience and to use features like dynamic packaging or encryption, and other features that come with the **Standard** type. To migrate to the **Standard** type, go to the [Azure portal](https://portal.azure.com/) and select **Opt-in to Standard**. For more information about migration, see the [migration](#migration-between-types) section.
->
->Beware that this operation cannot be rolled back and has a pricing impact.
->
-
-## Comparing streaming types
-
-### Versions
-
-|Type|StreamingEndpointVersion|ScaleUnits|CDN|Billing|
-|--|-|--|--|--|
-|Classic|1.0|0|NA|Free|
-|Standard Streaming Endpoint (preview)|2.0|0|Yes|Paid|
-|Premium Streaming Units|1.0|>0|Yes|Paid|
-|Premium Streaming Units|2.0|>0|Yes|Paid|
-
-### Features
-
-Feature|Standard|Premium
-||
-Throughput |Up to 600 Mbps and can provide a much higher effective throughput when a CDN is used.|200 Mbps per streaming unit (SU). Can provide a much higher effective throughput when a CDN is used.
-CDN|Azure CDN, third party CDN, or no CDN.|Azure CDN, third party CDN, or no CDN.
-Billing is prorated| Daily|Daily
-Dynamic encryption|Yes|Yes
-Dynamic packaging|Yes|Yes
-Scale|Auto scales up to the targeted throughput.|Additional streaming units.
-IP filtering/G20/Custom host <sup>1</sup>|Yes|Yes
-Progressive download|Yes|Yes
-Recommended usage |Recommended for the vast majority of streaming scenarios.|Professional usage.
-
-<sup>1</sup> Only used directly on the Streaming Endpoint when the CDN is not enabled on the endpoint.<br/>
-
-For SLA information, see [Pricing and SLA](https://azure.microsoft.com/pricing/details/media-services/).
-
-## Migration between types
-
-From | To | Action
-||
Classic|Standard|Need to opt-in
Classic|Premium|Scale (add streaming units)
Standard/Premium|Classic|Not available (if the streaming endpoint version is 1.0, you can change it to Classic by setting the scale units to "0")
Standard (with/without CDN)|Premium with the same configuration|Allowed in the **started** state (via the Azure portal)
Premium (with/without CDN)|Standard with the same configuration|Allowed in the **started** state (via the Azure portal)
Standard (with/without CDN)|Premium with a different configuration|Allowed in the **stopped** state (via the Azure portal). Not allowed in the running state.
Premium (with/without CDN)|Standard with a different configuration|Allowed in the **stopped** state (via the Azure portal). Not allowed in the running state.
Version 1.0 with SU >= 1 with CDN|Standard/Premium with no CDN|Allowed in the **stopped** state. Not allowed in the **started** state.
Version 1.0 with SU >= 1 with CDN|Standard with/without CDN|Allowed in the **stopped** state. Not allowed in the **started** state. The version 1.0 CDN is deleted, and a new one is created and started.
Version 1.0 with SU >= 1 with CDN|Premium with/without CDN|Allowed in the **stopped** state. Not allowed in the **started** state. The Classic CDN is deleted, and a new one is created and started.
-
-## Next steps
-Review Media Services learning paths.
media-services Media Services Telemetry Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-telemetry-overview.md
- Title: Azure Media Services Telemetry | Microsoft Docs
-description: This article provides an overview of Microsoft Azure Media Services telemetry.
- Previously updated: 3/10/2021
-# Azure Media Services telemetry
-
-Azure Media Services (AMS) enables you to access telemetry/metrics data for its services. The current version of AMS lets you collect telemetry data for live **Channel**, **StreamingEndpoint**, and live **Archive** entities.
-
-Telemetry is written to a storage table in an Azure Storage account that you specify (typically, you would use the storage account associated with your AMS account).
-
-The telemetry system does not manage data retention. You can remove the old telemetry data by deleting the storage tables.
-
-This topic discusses how to configure and consume the AMS telemetry.
-
-## Configuring telemetry
-
-You can configure telemetry at a component-level granularity. There are two detail levels, "Normal" and "Verbose". Currently, both levels return the same information. It is recommended to use "Normal".
-
-The following topics show how to enable telemetry:
-
-[Enabling telemetry with .NET](media-services-dotnet-telemetry.md)
-
-[Enabling telemetry with REST](media-services-rest-telemetry.md)
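-
-As a rough sketch of what the .NET configuration looks like (based on the .NET topic above; it assumes an authenticated `CloudMediaContext` and your storage account's table endpoint):
-
-```csharp
-using System.Collections.Generic;
-using Microsoft.WindowsAzure.MediaServices.Client;
-
-static class TelemetrySetup
-{
-    // Sketch only; see "Enabling telemetry with .NET" for the authoritative steps.
-    public static void EnableTelemetry(CloudMediaContext context, string tableEndpoint)
-    {
-        INotificationEndPoint endpoint = context.NotificationEndPoints.Create(
-            "monitoring", NotificationEndPointType.AzureTable, tableEndpoint);
-
-        context.MonitoringConfigurations.Create(
-            endpoint.Id,
-            new List<ComponentMonitoringSetting>()
-            {
-                // "Normal" is the recommended detail level.
-                new ComponentMonitoringSetting(MonitoringComponent.Channel, MonitoringLevel.Normal),
-                new ComponentMonitoringSetting(MonitoringComponent.StreamingEndpoint, MonitoringLevel.Normal)
-            });
-    }
-}
-```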
-
-## Consuming telemetry information
-
-Telemetry is written to an Azure Storage Table in the storage account that you specified when you configured telemetry for the Media Services account. This section describes the storage tables for the metrics.
-
-You can consume telemetry data in one of the following ways:
-
-- Read data directly from Azure Table Storage (for example, by using the Storage SDK). For the description of telemetry storage tables, see the **Consuming telemetry information** section in [this](/previous-versions/azure/mt742089(v=azure.100)) topic.
-
-Or
-
-- Use the support in the Media Services .NET SDK for reading storage data, as described in [this](media-services-dotnet-telemetry.md) topic.
-
-The telemetry schema described below is designed to give good performance within the limits of Azure Table Storage:
-
-- Data is partitioned by account ID and service ID to allow telemetry from each service to be queried independently.
-- Partitions contain the date to give a reasonable upper bound on the partition size.
-- Row keys are in reverse time order to allow the most recent telemetry items to be queried for a given service.
-
-This should allow many of the common queries to be efficient:
-
-- Parallel, independent downloading of data for separate services.
-- Retrieving all data for a given service in a date range.
-- Retrieving the most recent data for a service.
-
-### Telemetry table storage output schema
-
-Telemetry data is stored in aggregate in one table, "TelemetryMetrics20160321", where "20160321" is the date the table was created. The telemetry system creates a separate table for each new day, starting at 00:00 UTC. The table is used to store recurring values, such as the ingest bitrate within a given window of time, bytes sent, and so on.
-
-Property|Value|Examples/notes
-||
PartitionKey|{account ID}_{entity ID}|e49bef329c29495f9b9570989682069d_64435281c50a4dd8ab7011cb0f4cdf66<br/><br/>The account ID is included in the partition key to simplify workflows where multiple Media Services accounts are writing to the same storage account.
-RowKey|{seconds to midnight}_{random value}|01688_00199<br/><br/>The row key starts with the number of seconds to midnight to allow top n style queries within a partition. For more information, see [this](../../cosmos-db/table-storage-design-guide.md#log-tail-pattern) article.
-Timestamp|Date/Time|Auto timestamp from the Azure table 2016-09-09T22:43:42.241Z
-Type|The type of the entity providing telemetry data|Channel/StreamingEndpoint/Archive<br/><br/>Event type is just a string value.
-Name|The name of the telemetry event|ChannelHeartbeat/StreamingEndpointRequestLog
ObservedTime|The time the telemetry event occurred (UTC)|2016-09-09T22:42:36.924Z<br/><br/>The observed time is provided by the entity sending the telemetry (for example, a channel). There can be time synchronization issues between components, so this value is approximate.
-ServiceID|{service ID}|f70bd731-691d-41c6-8f2d-671d0bdc9c7e
-Entity-specific properties|As defined by the event|StreamName: stream1, Bitrate 10123, …<br/><br/>The remaining properties are defined for the given event type. Azure Table content is key value pairs. (that is, different rows in the table have different sets of properties).
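-
-As a rough illustration, the following sketch uses the classic Azure Storage .NET SDK (the `WindowsAzure.Storage` package) to read the most recent telemetry rows for one service from a day's table; the connection string, account ID, and entity ID are placeholders:
-
-```csharp
-using System;
-using Microsoft.WindowsAzure.Storage;
-using Microsoft.WindowsAzure.Storage.Table;
-
-class ReadTelemetry
-{
-    static void Main()
-    {
-        CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<storage-connection-string>");
-        CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
-
-        // One table per UTC day: TelemetryMetrics{yyyyMMdd}.
-        CloudTable table = tableClient.GetTableReference(
-            "TelemetryMetrics" + DateTime.UtcNow.ToString("yyyyMMdd"));
-
-        // PartitionKey is {account ID}_{entity ID}.
-        string partitionKey = "<accountId>_<entityId>";
-
-        TableQuery<DynamicTableEntity> query = new TableQuery<DynamicTableEntity>()
-            .Where(TableQuery.GenerateFilterCondition(
-                "PartitionKey", QueryComparisons.Equal, partitionKey))
-            .Take(100); // row keys are in reverse time order, so these are the most recent rows
-
-        foreach (DynamicTableEntity row in table.ExecuteQuery(query))
-        {
-            Console.WriteLine("{0} {1}", row.RowKey, row.Properties["Name"].StringValue);
-        }
-    }
-}
-```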
-
-### Entity-specific schema
-
-There are three types of entity-specific telemetric data entries each pushed with the following frequency:
-
-- Streaming endpoints: Every 30 seconds
-- Live channels: Every minute
-- Live archive: Every minute
-
-**Streaming Endpoint**
-
-Property|Value|Examples
-||
-PartitionKey|PartitionKey|e49bef329c29495f9b9570989682069d_64435281c50a4dd8ab7011cb0f4cdf66
-RowKey|RowKey|01688_00199
-Timestamp|Timestamp|Auto timestamp from Azure Table 2016-09-09T22:43:42.241Z
-Type|Type|StreamingEndpoint
-Name|Name|StreamingEndpointRequestLog
-ObservedTime|ObservedTime|2016-09-09T22:42:36.924Z
-ServiceID|Service ID|f70bd731-691d-41c6-8f2d-671d0bdc9c7e
-HostName|Hostname of the endpoint|builddemoserver.origin.mediaservices.windows.net
-StatusCode|Records HTTP status|200
-ResultCode|Result code detail|S_OK
-RequestCount|Total request in the aggregation|3
-BytesSent|Aggregated bytes sent|2987358
-ServerLatency|Average server latency (including storage)|129
-E2ELatency|Average end-to-end latency|250
-
-**Live channel**
-
-Property|Value|Examples/notes
-||
-PartitionKey|PartitionKey|e49bef329c29495f9b9570989682069d_64435281c50a4dd8ab7011cb0f4cdf66
-RowKey|RowKey|01688_00199
-Timestamp|Timestamp|Auto timestamp from the Azure table 2016-09-09T22:43:42.241Z
-Type|Type|Channel
-Name|Name|ChannelHeartbeat
-ObservedTime|ObservedTime|2016-09-09T22:42:36.924Z
-ServiceID|Service ID|f70bd731-691d-41c6-8f2d-671d0bdc9c7e
-TrackType|Type of track video/audio/text|video/audio
-TrackName|Name of the track|video/audio_1
-Bitrate|Track bitrate|785000
-CustomAttributes||
-IncomingBitrate|Actual incoming bitrate|784548
-OverlapCount|Overlap in the ingest|0
-DiscontinuityCount|Discontinuity for track|0
-LastTimestamp|Last ingested data timestamp|1800488800
-NonincreasingCount|Count of fragments discarded due to non-increasing timestamp|2
UnalignedKeyFrames|Whether we received fragment(s) (across quality levels) where key frames are not aligned|True
-UnalignedPresentationTime|Whether we received fragment(s) (across quality levels/tracks) where presentation time is not aligned|True
-UnexpectedBitrate|True, if calculated/actual bitrate for audio/video track > 40,000 bps and IncomingBitrate == 0 OR IncomingBitrate and actualBitrate differ by 50% |True
-Healthy|True, if <br/>overlapCount, <br/>DiscontinuityCount, <br/>NonIncreasingCount, <br/>UnalignedKeyFrames, <br/>UnalignedPresentationTime, <br/>UnexpectedBitrate<br/> are all 0|True<br/><br/>Healthy is a composite function that returns false when any of the following conditions hold:<br/><br/>- OverlapCount > 0<br/>- DiscontinuityCount > 0<br/>- NonincreasingCount > 0<br/>- UnalignedKeyFrames == True<br/>- UnalignedPresentationTime == True<br/>- UnexpectedBitrate == True
-
-**Live archive**
-
-Property|Value|Examples/notes
-||
-PartitionKey|PartitionKey|e49bef329c29495f9b9570989682069d_64435281c50a4dd8ab7011cb0f4cdf66
-RowKey|RowKey|01688_00199
-Timestamp|Timestamp|Auto timestamp from the Azure table 2016-09-09T22:43:42.241Z
-Type|Type|Archive
-Name|Name|ArchiveHeartbeat
-ObservedTime|ObservedTime|2016-09-09T22:42:36.924Z
-ServiceID|Service ID|f70bd731-691d-41c6-8f2d-671d0bdc9c7e
ManifestName|Program URL|asset-eb149703-ed0a-483c-91c4-e4066e72cce3/a0a5cfbf-71ec-4bd2-8c01-a92a2b38c9ba.ism
-TrackName|Name of the track|audio_1
-TrackType|Type of the track|Audio/video
-CustomAttribute|Hex string that differentiates between different track with same name and bitrate (multi camera angle)|
-Bitrate|Track bitrate|785000
-Healthy|True, if FragmentDiscardedCount == 0 && ArchiveAcquisitionError == False|True (these two values are not present in the metric but they are present in the source event)<br/><br/>Healthy is a composite function that returns false when any of the following conditions hold:<br/><br/>- FragmentDiscardedCount > 0<br/>- ArchiveAcquisitionError == True
-
-## General Q&A
-
-### How to consume metrics data?
-
-Metrics data is stored as a series of Azure Tables in the customer's storage account. This data can be consumed using the following tools:
-
-- AMS SDK
-- Microsoft Azure Storage Explorer (supports export to comma-separated value format, which can be processed in Excel)
-- REST API
-
-### How to find average bandwidth consumption?
-
-The average bandwidth consumption is the average of BytesSent over a span of time.
-
-### How to define streaming unit count?
-
-The streaming unit count can be defined as the peak throughput from the service's streaming endpoints divided by the peak throughput of one streaming endpoint. The peak usable throughput of one streaming endpoint is 160 Mbps.
-For example, suppose the peak throughput from a customer's service is 40 MBps (the maximum value of BytesSent over a span of time). Then, the streaming unit count is equal to (40 MBps)*(8 bits/byte)/(160 Mbps) = 2 streaming units.
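-
-The same arithmetic as a tiny sketch:
-
-```csharp
-using System;
-
-class StreamingUnitCount
-{
-    static void Main()
-    {
-        // Worked example from above: 40 MBps peak => (40 * 8) / 160 = 2 streaming units.
-        double peakMegabytesPerSecond = 40;                        // peak of BytesSent over time
-        double peakMegabitsPerSecond = peakMegabytesPerSecond * 8; // bytes -> bits
-        double streamingUnits = Math.Ceiling(peakMegabitsPerSecond / 160); // 160 Mbps per unit
-        Console.WriteLine(streamingUnits); // 2
-    }
-}
-```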
-
-### How to find average requests/second?
-
-To find the average number of requests/second, compute the average number of requests (RequestCount) over a span of time.
-
-### How to define channel health?
-
-Channel health can be defined as a composite Boolean function such that it is false when any of the following conditions hold:
-
-- OverlapCount > 0
-- DiscontinuityCount > 0
-- NonincreasingCount > 0
-- UnalignedKeyFrames == True
-- UnalignedPresentationTime == True
-- UnexpectedBitrate == True
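-
-A minimal sketch of that composite check (the parameter values are hypothetical, read from a ChannelHeartbeat telemetry row):
-
-```csharp
-static class ChannelHealth
-{
-    // Returns false when any of the conditions listed above holds.
-    public static bool IsHealthy(
-        long overlapCount, long discontinuityCount, long nonincreasingCount,
-        bool unalignedKeyFrames, bool unalignedPresentationTime, bool unexpectedBitrate)
-    {
-        return overlapCount == 0
-            && discontinuityCount == 0
-            && nonincreasingCount == 0
-            && !unalignedKeyFrames
-            && !unalignedPresentationTime
-            && !unexpectedBitrate;
-    }
-}
-```
-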
-### How to detect discontinuities?
-
-To detect discontinuities, find all Channel data entries where DiscontinuityCount > 0. The corresponding ObservedTime timestamp indicates the times at which the discontinuities occurred.
-
-### How to detect timestamp overlaps?
-
-To detect timestamp overlaps, find all Channel data entries where OverlapCount > 0. The corresponding ObservedTime timestamp indicates the times at which the timestamp overlaps occurred.
-
-### How to find streaming request failures and reasons?
-
-To find streaming request failures and reasons, find all Streaming Endpoint data entries where ResultCode is not equal to S_OK. The corresponding StatusCode field indicates the reason for the request failure.
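-
-Reusing the table-query types from the sketch in the **Consuming telemetry information** section, a hedged example of this filter:
-
-```csharp
-using System;
-using Microsoft.WindowsAzure.Storage.Table;
-
-static class FailedRequests
-{
-    // Sketch: list streaming requests whose ResultCode is not S_OK.
-    public static void Print(CloudTable table)
-    {
-        TableQuery<DynamicTableEntity> query = new TableQuery<DynamicTableEntity>().Where(
-            TableQuery.CombineFilters(
-                TableQuery.GenerateFilterCondition("Name", QueryComparisons.Equal, "StreamingEndpointRequestLog"),
-                TableOperators.And,
-                TableQuery.GenerateFilterCondition("ResultCode", QueryComparisons.NotEqual, "S_OK")));
-
-        foreach (DynamicTableEntity row in table.ExecuteQuery(query))
-        {
-            // StatusCode indicates the reason for the failure.
-            Console.WriteLine("{0}: HTTP {1}", row.RowKey, row.Properties["StatusCode"].PropertyAsObject);
-        }
-    }
-}
-```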
-
-### How to consume data with external tools?
-
-Telemetric data can be processed and visualized with the following tools:
-
-- PowerBI
-- Application Insights
-- Azure Monitor (formerly Shoebox)
-- AMS Live Dashboard
-- Azure portal (pending release)
-
-### How to manage data retention?
-
-The telemetry system does not provide data retention management or automatic deletion of old records. You need to manage and delete old records manually from the storage table; refer to the Storage SDK for how to do this.
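-
-As a hedged sketch (the connection string is a placeholder and the 90-day window is an arbitrary example), pruning old tables with the classic Storage SDK might look like this:
-
-```csharp
-using System;
-using System.Globalization;
-using Microsoft.WindowsAzure.Storage;
-using Microsoft.WindowsAzure.Storage.Table;
-
-class PruneTelemetryTables
-{
-    static void Main()
-    {
-        CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<storage-connection-string>");
-        CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
-
-        foreach (CloudTable table in tableClient.ListTables("TelemetryMetrics"))
-        {
-            // Table names embed the UTC creation date: TelemetryMetrics{yyyyMMdd}.
-            string datePart = table.Name.Substring("TelemetryMetrics".Length);
-            if (DateTime.TryParseExact(datePart, "yyyyMMdd", CultureInfo.InvariantCulture,
-                    DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal, out DateTime day)
-                && day < DateTime.UtcNow.AddDays(-90))
-            {
-                table.Delete(); // drops the whole day's telemetry
-            }
-        }
-    }
-}
-```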
-
-
media-services Media Services Troubleshooting Live Streaming https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-troubleshooting-live-streaming.md
- Title: Troubleshooting guide for live streaming | Microsoft Docs
-description: This article gives suggestions on how to troubleshoot Azure Media Services live streaming problems.
- Previously updated: 3/10/2021
-# Troubleshooting guide for live streaming
-
-This article gives suggestions on how to troubleshoot some live streaming problems.
-
-## Issues related to on-premises encoders
-This section gives suggestions on how to troubleshoot problems related to on-premises encoders that are configured to send a single bitrate stream to AMS channels that are enabled for live encoding.
-
-### Problem: Would like to see logs
-* **Potential issue**: Can't find encoder logs that might help in debugging issues.
-
- * **Telestream Wirecast**: You can usually find logs under C:\Users\{username}\AppData\Roaming\Wirecast\
-   * **Elemental Live**: The management portal has links to logs. Click **Stats**, then **Logs**. On the **Log Files** page, you will see a list of logs for all the LiveEvent items; select the one matching your current session.
- * **Flash Media Live Encoder**: You can find the **Log Directory...** by navigating to the **Encoding Log** tab.
-
-### Problem: There is no option for outputting a progressive stream
-* **Potential issue**: The encoder being used doesn't automatically deinterlace.
-
- **Troubleshooting steps**: Look for a de-interlacing option within the encoder interface. Once de-interlacing is enabled, check again for progressive output settings.
-
-### Problem: Tried several encoder output settings and still unable to connect.
-* **Potential issue**: Azure encoding channel was not properly reset.
-
-  **Troubleshooting steps**: Make sure the encoder is no longer pushing to AMS, then stop and reset the channel. Once the channel is running again, try connecting your encoder with the new settings. If this still does not correct the issue, try creating a new channel entirely; channels can become corrupt after several failed attempts.
-* **Potential issue**: The GOP size or key frame settings are not optimal.
-
-  **Troubleshooting steps**: The recommended GOP size or key frame interval is two seconds. Some encoders calculate this setting in number of frames, while others use seconds. For example, when outputting 30 fps, the GOP size would be 60 frames, which is equivalent to 2 seconds.
-* **Potential issue**: Closed ports are blocking the stream.
-
- **Troubleshooting steps**: When streaming via RTMP, check firewall and/or proxy settings to confirm that outbound ports 1935 and 1936 are open.
-
-> [!NOTE]
-> If after following the troubleshooting steps you still cannot successfully stream, submit a support ticket using the Azure portal.
->
->
-
-
media-services Media Services Upload Files From Storsimple https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-upload-files-from-storsimple.md
- Title: Upload files into an Azure Media Services account from Azure StorSimple | Microsoft Docs
-description: This article gives a brief overview of Azure StorSimple Data Manager. The article also links to tutorials that show you how to extract data from StorSimple and upload it as assets to an Azure Media Services account.
- Previously updated: 3/10/2021
-# Upload files into an Azure Media Services account from Azure StorSimple
-
-> [!NOTE]
-> Azure StorSimple Data Manager is currently in private preview.
->
-
-## Overview
-
-In Media Services, you upload your digital files into an asset. The asset can contain video, audio, images, thumbnail collections, text tracks, and closed-caption files (and the metadata about these files). Once the files are uploaded, your content is stored securely in the cloud for further processing and streaming.
-
-[Azure StorSimple](../../storsimple/index.yml) uses cloud storage as an extension of the on-premises solution and automatically tiers data across the on-premises storage and cloud storage. The StorSimple device dedupes and compresses your data before sending it to the cloud making it very efficient for sending large files to the cloud. The [StorSimple Data Manager](../../storsimple/storsimple-data-manager-overview.md) service provides APIs that enable you to extract data from StorSimple and present it as AMS assets.
-
-## Get started
-
-1. [Create a Media Services account](media-services-portal-create-account.md) into which you want to transfer the assets.
-2. Sign up for Data Manager preview, as described in the [StorSimple Data Manager](../../storsimple/storsimple-data-manager-overview.md) article.
-3. Create a StorSimple Data Manager account.
-4. Create a data transformation job that, when run, extracts data from a StorSimple device and transfers it into an AMS account as assets.
-
-    When the job starts running, a storage queue is created. This queue is populated with messages about transformed blobs as they become ready. The name of this queue is the same as the name of the job definition. You can use this queue to determine when an asset is ready and then call your desired Media Services operation to run on it. For example, you can use this queue to trigger an Azure Function that has the necessary Media Services code in it, as shown in the sketch below.
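-
-As a rough sketch (the connection string and queue name are placeholders), polling that queue with the classic Azure Storage SDK might look like this:
-
-```csharp
-using System;
-using Microsoft.WindowsAzure.Storage;
-using Microsoft.WindowsAzure.Storage.Queue;
-
-class PollDataManagerQueue
-{
-    static void Main()
-    {
-        CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<storage-connection-string>");
-        CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
-
-        // The queue name is the same as the name of the job definition.
-        CloudQueue queue = queueClient.GetQueueReference("<job-definition-name>");
-
-        CloudQueueMessage message = queue.GetMessage();
-        if (message != null)
-        {
-            // The message describes a transformed blob that is ready as an asset.
-            Console.WriteLine(message.AsString);
-
-            // Trigger your Media Services operation here, then remove the message.
-            queue.DeleteMessage(message);
-        }
-    }
-}
-```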
-
-## See also
-
-[Use the .NET SDK to trigger jobs in the Data Manager](../../storsimple/storsimple-data-manager-dotnet-jobs.md)
-
-
-## Next steps
-
-You can now encode your uploaded assets. For more information, see [Encode assets](media-services-portal-encode.md).
media-services Media Services Use Aad Auth To Access Ams Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-use-aad-auth-to-access-ams-api.md
- Title: Access Azure Media Services API with Azure Active Directory authentication | Microsoft Docs
-description: Learn about concepts and steps to take to use Azure Active Directory (Azure AD) to authenticate access to the Azure Media Services API.
- Previously updated: 3/10/2021
-# Access the Azure Media Services API with Azure AD authentication
-
-The Azure Media Services API is a RESTful API. You can use it to perform operations on media resources by using a REST API or by using available client SDKs. Azure Media Services offers a Media Services client SDK for Microsoft .NET. To be authorized to access Media Services resources and the Media Services API, you must first be authenticated.
-
-Media Services supports [Azure Active Directory (Azure AD)-based authentication](../../active-directory/fundamentals/active-directory-whatis.md). The Azure Media REST service requires that the user or application that makes the REST API requests have either the **Contributor** or **Owner** role to access the resources. For more information, see [What is Azure role-based access control (Azure RBAC)](../../role-based-access-control/overview.md).
-
-This document gives an overview of how to access the Media Services API by using REST or .NET APIs.
-
-> [!NOTE]
-> Access Control authorization was deprecated on June 1, 2018.
-
-## Access control
-
-For the Azure Media REST request to succeed, the calling user must have a Contributor or Owner role for the Media Services account it is trying to access.
-Only a user with the Owner role can give media resource (account) access to new users or apps. The Contributor role can access only the media resource.
-Unauthorized requests fail with a status code of 401. If you see this error code, check whether your user has the Contributor or Owner role assigned for the Media Services account. You can check this in the Azure portal: search for your media account, and then click the **Access control** tab.
-
-![Access control tab](./media/media-services-use-aad-auth-to-access-ams-api/media-services-access-control.png)
-
-## Types of authentication
-
-When you use Azure AD authentication with Azure Media Services, you have two authentication options:
-
-- **User authentication**. Authenticate a person who is using the app to interact with Media Services resources. The interactive application should first prompt the user for the user's credentials. An example is a management console app used by authorized users to monitor encoding jobs or live streaming.
-- **Service principal authentication**. Authenticate a service. Applications that commonly use this authentication method are apps that run daemon services, middle-tier services, or scheduled jobs. Examples are web apps, function apps, logic apps, APIs, and microservices.
-
-### User authentication
-
-Applications that should use the user authentication method are management or monitoring native apps: mobile apps, Windows apps, and Console applications. This type of solution is useful when you want human interaction with the service in one of the following scenarios:
-
-- Monitoring dashboard for your encoding jobs.
-- Monitoring dashboard for your live streams.
-- Management application for desktop or mobile users to administer resources in a Media Services account.
-
-> [!NOTE]
-> This authentication method should not be used for consumer-facing applications.
-
-A native application must first acquire an access token from Azure AD, and then use it when making HTTP requests to the Media Services REST API. Add the access token to the request header.
-
-The following diagram shows a typical interactive application authentication flow:
-
-![Native apps diagram](./media/media-services-use-aad-auth-to-access-ams-api/media-services-native-aad-app1.png)
-
-In the preceding diagram, the numbers represent the flow of the requests in chronological order.
-
-> [!NOTE]
-> When you use the user authentication method, all apps share the same (default) native application client ID and native application redirect URI.
-
-1. Prompt a user for credentials.
-2. Request an Azure AD access token with the following parameters:
-
- * Azure AD tenant endpoint.
-
- The tenant information can be retrieved from the Azure portal. Place your cursor over the name of the signed-in user in the top right corner.
- * Media Services resource URI.
-
- This URI is the same for Media Services accounts that are in the same Azure environment (for example, https:\//rest.media.azure.net).
-
- * Media Services (native) application client ID.
- * Media Services (native) application redirect URI.
- * Resource URI for REST Media Services.
-
- The URI represents the REST API endpoint (for example, https://test03.restv2.westus.media.azure.net/api/).
-
- To get values for these parameters, see [Use the Azure portal to access Azure AD authentication settings](media-services-portal-get-started-with-aad.md) using the user authentication option.
-
-3. The Azure AD access token is sent to the client.
-4. The client sends a request to the Azure Media REST API with the Azure AD access token.
-5. The client gets back the data from Media Services.
-
-For information about how to use Azure AD authentication to communicate with REST requests by using the Media Services .NET client SDK, see [Use Azure AD authentication to access the Media Services API with .NET](media-services-dotnet-get-started-with-aad.md).
-
-If you are not using the Media Services .NET client SDK, you must manually create an Azure AD access token request by using the parameters described in step 2. For more information, see [How to use the Azure AD Authentication Library to get the Azure AD token](../../active-directory/azuread-dev/active-directory-authentication-libraries.md).
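-
-As a hedged sketch of such a manual request with the Azure AD Authentication Library (ADAL) for .NET, where every endpoint and ID value is a placeholder you replace with the values from step 2:
-
-```csharp
-using System;
-using System.Net.Http;
-using System.Net.Http.Headers;
-using System.Threading.Tasks;
-using Microsoft.IdentityModel.Clients.ActiveDirectory;
-
-class UserAuthExample
-{
-    static async Task Main()
-    {
-        // Azure AD tenant endpoint (placeholder).
-        var authContext = new AuthenticationContext(
-            "https://login.microsoftonline.com/<tenant>.onmicrosoft.com");
-
-        // Media Services resource URI, native app client ID, and redirect URI from step 2.
-        AuthenticationResult result = await authContext.AcquireTokenAsync(
-            "https://rest.media.azure.net",
-            "<native-application-client-id>",
-            new Uri("<native-application-redirect-uri>"),
-            new PlatformParameters(PromptBehavior.Auto)); // prompts the user for credentials
-
-        // Add the access token to the request header of each REST call.
-        var client = new HttpClient();
-        client.DefaultRequestHeaders.Authorization =
-            new AuthenticationHeaderValue("Bearer", result.AccessToken);
-        // For example: GET https://<account>.restv2.<region>.media.azure.net/api/Assets
-    }
-}
-```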
-
-### Service principal authentication
-
-Applications that commonly use this authentication method are apps that run middle-tier services and scheduled jobs: web apps, function apps, logic apps, APIs, and microservices. This authentication method also is suitable for interactive applications in which you might want to use a service account to manage resources.
-
-When you use the service principal authentication method to build consumer scenarios, authentication typically is handled in the middle tier (through some API) and not directly in a mobile or desktop application.
-
-To use this method, create an Azure AD application and service principal in its own tenant. After you create the application, give the app Contributor or Owner role access to the Media Services account. You can do this in the Azure portal, by using the Azure CLI, or with a PowerShell script. You also can use an existing Azure AD application. You can register and manage your Azure AD app and service principal [in the Azure portal](media-services-portal-get-started-with-aad.md). You also can do this by using [Azure CLI](media-services-use-aad-auth-to-access-ams-api.md) or [PowerShell](media-services-powershell-create-and-configure-aad-app.md).
-
-![Middle-tier apps](./media/media-services-use-aad-auth-to-access-ams-api/media-services-principal-service-aad-app1.png)
-
-After you create your Azure AD application, you get values for the following settings. You need these values for authentication:
-
-- Client ID
-- Client secret
-
-In the preceding figure, the numbers represent the flow of the requests in chronological order:
-
-1. A middle-tier app (web API or web application) requests an Azure AD access token that has the following parameters:
-
- * Azure AD tenant endpoint.
-
- The tenant information can be retrieved from the Azure portal. Place your cursor over the name of the signed-in user in the top right corner.
- * Media Services resource URI.
-
- This URI is the same for Media Services accounts that are located in the same Azure environment (for example, https:\//rest.media.azure.net).
-
- * Resource URI for REST Media Services.
-
- The URI represents the REST API endpoint (for example, https://test03.restv2.westus.media.azure.net/api/).
-
- * Azure AD application values: the client ID and client secret.
-
- To get values for these parameters, see [Use the Azure portal to access Azure AD authentication settings](media-services-portal-get-started-with-aad.md) by using the service principal authentication option.
-
-2. The Azure AD access token is sent to the middle tier.
-3. The middle tier sends a request to the Azure Media REST API with the Azure AD token.
-4. The middle tier gets back the data from Media Services.
-
-For more information about how to use Azure AD authentication to communicate with REST requests by using the Media Services .NET client SDK, see [Use Azure AD authentication to access Azure Media Services API with .NET](media-services-dotnet-get-started-with-aad.md).
-
-If you are not using the Media Services .NET client SDK, you must manually create an Azure AD token request by using parameters described in step 1. For more information, see [How to use the Azure AD Authentication Library to get the Azure AD token](../../active-directory/azuread-dev/active-directory-authentication-libraries.md).
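-
-A comparable hedged sketch for the service principal flow with ADAL (the tenant endpoint, client ID, and client secret are placeholders for the values from step 1):
-
-```csharp
-using System.Threading.Tasks;
-using Microsoft.IdentityModel.Clients.ActiveDirectory;
-
-static class ServicePrincipalAuthExample
-{
-    public static async Task<string> GetAccessTokenAsync()
-    {
-        var authContext = new AuthenticationContext(
-            "https://login.microsoftonline.com/<tenant>.onmicrosoft.com");
-        var credential = new ClientCredential("<client-id>", "<client-secret>");
-
-        // No user prompt: the app authenticates as itself.
-        AuthenticationResult result = await authContext.AcquireTokenAsync(
-            "https://rest.media.azure.net", credential);
-
-        return result.AccessToken; // send as "Authorization: Bearer <token>"
-    }
-}
-```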
-
-## Troubleshooting
-
-Exception: "The remote server returned an error: (401) Unauthorized."
-
-Solution: For the Media Services REST request to succeed, the calling user must have a Contributor or Owner role for the Media Services account it is trying to access. For more information, see the [Access control](media-services-use-aad-auth-to-access-ams-api.md#access-control) section.
-
-## Resources
-
-The following articles are overviews of Azure AD authentication concepts:
-
-- [Authentication scenarios addressed by Azure AD](../../active-directory/develop/authentication-vs-authorization.md)
-- [Add, update, or remove an application in Azure AD](../../active-directory/develop/quickstart-register-app.md)
-- [Add or remove Azure role assignments using Azure PowerShell](../../role-based-access-control/role-assignments-powershell.md)
-
-## Next steps
-
-* Use the Azure portal to [access Azure AD authentication to consume Azure Media Services API](media-services-portal-get-started-with-aad.md).
-* Use Azure AD authentication to [access Azure Media Services API with .NET](media-services-dotnet-get-started-with-aad.md).
media-services Media Services Use Osmf Smooth Streaming Client Plugin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-use-osmf-smooth-streaming-client-plugin.md
- Title: Smooth Streaming Plugin for the Open Source Media Framework
-description: Learn how to use the Azure Media Services Smooth Streaming plugin for the Adobe Open Source Media Framework.
- Previously updated: 3/10/2021
-# How to Use the Microsoft Smooth Streaming Plugin for the Adobe Open Source Media Framework
-
-## Overview
-The Microsoft Smooth Streaming plugin for Open Source Media Framework 2.0 (SS for OSMF) extends the default capabilities of OSMF and adds Microsoft Smooth Streaming content playback to new and existing OSMF players. The plugin also adds Smooth Streaming playback capabilities to Strobe Media Playback (SMP).
-
-SS for OSMF includes two versions of the plugin:
-
-* Static Smooth Streaming plugin for OSMF (.swc)
-* Dynamic Smooth Streaming plugin for OSMF (.swf)
-
-This document assumes that the reader has a general working knowledge of OSMF and OSMF plug-ins. For more information about OSMF, please see the documentation on the official OSMF site.
-
-### Smooth Streaming plugin for OSMF 2.0
-The plugin supports loading and playback of on-demand Smooth Streaming content with the following features:
-
-* On-demand Smooth Streaming playback (Play, Pause, Seek, Stop)
-* Live Smooth Streaming playback (Play)
-* Live DVR functions (Pause, Seek, DVR Playback, Go-to-Live)
-* Support for video codecs - H.264
-* Support for Audio codecs - AAC
-* Multiple audio language switching with OSMF built-in APIs
-* Max playback quality selection with OSMF built-in APIs
-* Sidecar closed captions with OSMF captions plugin
-* Adobe&reg; Flash&reg; Player 11.4 or higher.
-* This version only supports OSMF 2.0.
-
-## Supported features and known issues
-For a full list of supported features, unsupported features and known issues, refer to [this document](https://azure.microsoft.com/blog/microsoft-adaptive-streaming-plugin-for-osmf-update/).
-
-## Loading the Plugin
-OSMF plugins can be loaded statically (at compile time) or dynamically (at run-time). The Smooth Streaming plugin for OSMF download includes both dynamic and static versions.
-
-* Static loading: To load statically, a static library (SWC) file is required. Static plugins are added as a reference to the project and are merged into the final output file at compile time.
-* Dynamic loading: To load dynamically, a precompiled (SWF) file is required. Dynamic plugins are loaded at runtime and are not included in the project (compiled) output. Dynamic plugins can be loaded using the HTTP and FILE protocols.
-
-For more information on static and dynamic loading, see the official [OSMF plugin page](https://www.ibm.com/support/knowledgecenter/SSLTBW_2.3.0/com.ibm.zos.v2r3.izua300/IZUHPINFO_PluginsPlanning.htm).
-
-### SS for OSMF Static Loading
-The code snippet below shows how to load the SS plugin for OSMF statically and play a basic video using OSMF MediaFactory class. Before including the SS for OSMF code, please ensure that the project reference includes the "MSAdaptiveStreamingPlugin-v1.0.3-osmf2.0.swc" static plugin.
-
-```actionscript
-package
-{
-
- import com.microsoft.azure.media.AdaptiveStreamingPluginInfo;
-
- import flash.display.*;
- import org.osmf.media.*;
- import org.osmf.containers.MediaContainer;
- import org.osmf.events.MediaErrorEvent;
- import org.osmf.events.MediaFactoryEvent;
- import org.osmf.events.MediaPlayerStateChangeEvent;
- import org.osmf.layout.*;
-
- [SWF(width="1024", height="768", backgroundColor='#405050', frameRate="25")]
- public class TestPlayer extends Sprite
- {
- public var _container:MediaContainer;
- public var _mediaFactory:DefaultMediaFactory;
- private var _mediaPlayerSprite:MediaPlayerSprite;
-
- public function TestPlayer( )
- {
- stage.quality = StageQuality.HIGH;
-
- initMediaPlayer();
-
- }
-
- private function initMediaPlayer():void
- {
-
- // Create the container (sprite) for managing display and layout
- _mediaPlayerSprite = new MediaPlayerSprite();
- _mediaPlayerSprite.addEventListener(MediaErrorEvent.MEDIA_ERROR, onPlayerFailed);
- _mediaPlayerSprite.addEventListener(MediaPlayerStateChangeEvent.MEDIA_PLAYER_STATE_CHANGE, onPlayerStateChange);
- _mediaPlayerSprite.scaleMode = ScaleMode.NONE;
- _mediaPlayerSprite.width = stage.stageWidth;
- _mediaPlayerSprite.height = stage.stageHeight;
- //Adds the container to the stage
- addChild(_mediaPlayerSprite);
-
- // Create a mediafactory instance
- _mediaFactory = new DefaultMediaFactory();
-
- // Add the listeners for PLUGIN_LOADING
- _mediaFactory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD,onPluginLoaded);
- _mediaFactory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD_ERROR, onPluginLoadFailed );
-
- // Load the plugin class
- loadAdaptiveStreamingPlugin( );
-
- }
-
- private function loadAdaptiveStreamingPlugin( ):void
- {
- var pluginResource:MediaResourceBase;
-
- pluginResource = new PluginInfoResource(new AdaptiveStreamingPluginInfo( ));
- _mediaFactory.loadPlugin( pluginResource );
- }
-
- private function onPluginLoaded( event:MediaFactoryEvent ):void
- {
- // The plugin is loaded successfully.
- // Your web server needs to host a valid crossdomain.xml file to allow plugin to download Smooth Streaming files.
- loadMediaSource("http://devplatem.vo.msecnd.net/Sintel/Sintel_H264.ism/manifest")
-
- }
-
- private function onPluginLoadFailed( event:MediaFactoryEvent ):void
- {
-            // The plugin failed to load ...
- }
-
- private function onPlayerStateChange(event:MediaPlayerStateChangeEvent) : void
- {
- var state:String;
-
- state = event.state;
-
- switch (state)
- {
- case MediaPlayerState.LOADING:
-
-                    // A new source has started to load.
-
- break;
-
- case MediaPlayerState.READY :
-                    // Add code to handle Player Ready when it is hit on the first load after a source is loaded.
-
- break;
-
- case MediaPlayerState.BUFFERING :
-
- break;
-
- case MediaPlayerState.PAUSED :
- break;
- // other states ...
- }
- }
-
- private function onPlayerFailed(event:MediaErrorEvent) : void
- {
-            // The Media Player failed.
- }
-
- private function loadMediaSource(sourceURL : String):void
- {
-            // Take a URL of a SmoothStreamingSource's manifest and add it to the page.
-
- var resource:URLResource= new URLResource( sourceURL );
-
- var element:MediaElement = _mediaFactory.createMediaElement( resource );
- _mediaPlayerSprite.scaleMode = ScaleMode.LETTERBOX;
- _mediaPlayerSprite.width = stage.stageWidth;
- _mediaPlayerSprite.height = stage.stageHeight;
-
- // Add the media element
- _mediaPlayerSprite.media = element;
- }
-
- }
-}
-```
-
-### SS for OSMF Dynamic Loading
-The code snippet below shows how to load the SS plugin for OSMF dynamically and play a basic video using the OSMF MediaFactory class. Before including the SS for OSMF code, copy the "MSAdaptiveStreamingPlugin-v1.0.3-osmf2.0.swf" dynamic plugin to the project folder if you want to load using FILE protocol, or copy under a web server for HTTP load. There is no need to include "MSAdaptiveStreamingPlugin-v1.0.3-osmf2.0.swc" in the project references.
-
-```actionscript
-package
-{
-
- import flash.display.*;
- import org.osmf.media.*;
- import org.osmf.containers.MediaContainer;
- import org.osmf.events.MediaErrorEvent;
- import org.osmf.events.MediaFactoryEvent;
- import org.osmf.events.MediaPlayerStateChangeEvent;
- import org.osmf.layout.*;
- import flash.events.Event;
- import flash.system.Capabilities;
-
- //Sets the size of the SWF
-
- [SWF(width="1024", height="768", backgroundColor='#405050', frameRate="25")]
- public class TestPlayer extends Sprite
- {
- public var _container:MediaContainer;
- public var _mediaFactory:DefaultMediaFactory;
- private var _mediaPlayerSprite:MediaPlayerSprite;
-
- public function TestPlayer( )
- {
- stage.quality = StageQuality.HIGH;
- initMediaPlayer();
- }
-
- private function initMediaPlayer():void
- {
-
- // Create the container (sprite) for managing display and layout
- _mediaPlayerSprite = new MediaPlayerSprite();
- _mediaPlayerSprite.addEventListener(MediaErrorEvent.MEDIA_ERROR, onPlayerFailed);
- _mediaPlayerSprite.addEventListener(MediaPlayerStateChangeEvent.MEDIA_PLAYER_STATE_CHANGE, onPlayerStateChange);
-
- //Adds the container to the stage
- addChild(_mediaPlayerSprite);
-
- // Create a mediafactory instance
- _mediaFactory = new DefaultMediaFactory();
-
- // Add the listeners for PLUGIN_LOADING
- _mediaFactory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD,onPluginLoaded);
- _mediaFactory.addEventListener(MediaFactoryEvent.PLUGIN_LOAD_ERROR, onPluginLoadFailed );
-
- // Load the plugin class
- loadAdaptiveStreamingPlugin( );
-
- }
-
- private function loadAdaptiveStreamingPlugin( ):void
- {
- var pluginResource:MediaResourceBase;
- var adaptiveStreamingPluginUrl:String;
-
- // Your dynamic plugin web server needs to host a valid crossdomain.xml file to allow loading plugins.
-
- adaptiveStreamingPluginUrl = "http://yourdomain/MSAdaptiveStreamingPlugin-v1.0.3-osmf2.0.swf";
- pluginResource = new URLResource(adaptiveStreamingPluginUrl);
- _mediaFactory.loadPlugin( pluginResource );
-
- }
-
- private function onPluginLoaded( event:MediaFactoryEvent ):void
- {
- // The plugin loaded successfully.
-
- // Your web server needs to host a valid crossdomain.xml file to allow the plugin to download Smooth Streaming files.
-
- loadMediaSource("http://devplatem.vo.msecnd.net/Sintel/Sintel_H264.ism/manifest");
- }
-
- private function onPluginLoadFailed( event:MediaFactoryEvent ):void
- {
- // The plugin failed to load.
- }
-
- private function onPlayerStateChange(event:MediaPlayerStateChangeEvent) : void
- {
- var state:String;
-
- state = event.state;
-
- switch (state)
- {
- case MediaPlayerState.LOADING:
-
- // A new source has started loading.
-
- break;
-
- case MediaPlayerState.READY :
- // Add code to handle the READY state, which is first reached after a source finishes loading.
-
- break;
-
- case MediaPlayerState.BUFFERING :
-
- break;
-
- case MediaPlayerState.PAUSED :
- break;
- // other states ...
- }
- }
-
- private function onPlayerFailed(event:MediaErrorEvent) : void
- {
- // The media player failed.
- }
-
- private function loadMediaSource(sourceURL : String):void
- {
- // Take the URL of a Smooth Streaming manifest and create a playable media element from it.
-
- var resource:URLResource = new URLResource( sourceURL );
-
- var element:MediaElement = _mediaFactory.createMediaElement( resource );
- _mediaPlayerSprite.scaleMode = ScaleMode.LETTERBOX;
- _mediaPlayerSprite.width = stage.stageWidth;
- _mediaPlayerSprite.height = stage.stageHeight;
- // Add the media element
- _mediaPlayerSprite.media = element;
- }
-
- }
-}
-```
-
-## Strobe Media Playback with the SS OSMF Dynamic Plugin
-The Smooth Streaming for OSMF dynamic plugin is compatible with [Strobe Media Playback (SMP)](https://sourceforge.net/adobe/smp/home/Strobe%20Media%20Playback/). You can use the SS for OSMF plugin to add Smooth Streaming content playback to SMP. To do this, copy "MSAdaptiveStreamingPlugin-v1.0.3-osmf2.0.swf" to a web server for HTTP loading, and then follow these steps:
-
-1. Browse to the [Strobe Media Playback setup page](http://www.koopman.me/bob3/setup.html).
-2. Set the src to a Smooth Streaming source (for example, http:\//devplatem.vo.msecnd.net/Sintel/Sintel_H264.ism/manifest).
-3. Make the desired configuration changes and click Preview and Update.
-
-   **Note** Your content web server needs a valid crossdomain.xml file.
-4. Copy and paste the code to a simple HTML page using your favorite text editor, such as in the following example:
-
- ```html
- <html>
- <body>
- <object width="920" height="640">
- <param name="movie" value="http://osmf.org/dev/2.0gm/StrobeMediaPlayback.swf"></param>
- <param name="flashvars" value="src=http://devplatem.vo.msecnd.net/Sintel/Sintel_H264.ism/manifest &autoPlay=true"></param>
- <param name="allowFullScreen" value="true"></param>
- <param name="allowscriptaccess" value="always"></param>
- <param name="wmode" value="direct"></param>
- <embed src="http://osmf.org/dev/2.0gm/StrobeMediaPlayback.swf"
- type="application/x-shockwave-flash"
- allowscriptaccess="always"
- allowfullscreen="true"
- wmode="direct"
- width="920"
- height="640"
- flashvars=" src=http://devplatem.vo.msecnd.net/Sintel/Sintel_H264.ism/manifest&autoPlay=true">
- </embed>
- </object>
- </body>
- </html>
- ```
-
-5. Add the Smooth Streaming OSMF plugin to the embed code and save.
-
- ```html
- <html>
- <object width="920" height="640">
- <param name="movie" value="http://osmf.org/dev/2.0gm/StrobeMediaPlayback.swf"></param>
- <param name="flashvars" value="src=http://devplatem.vo.msecnd.net/Sintel/Sintel_H264.ism/manifest&autoPlay=true&plugin_AdaptiveStreamingPlugin=http://yourdomain/MSAdaptiveStreamingPlugin-v1.0.3-osmf2.0.swf&AdaptiveStreamingPlugin_retryLive=true&AdaptiveStreamingPlugin_retryInterval=10"></param>
- <param name="allowFullScreen" value="true"></param>
- <param name="allowscriptaccess" value="always"></param>
- <param name="wmode" value="direct"></param>
- <embed src="http://osmf.org/dev/2.0gm/StrobeMediaPlayback.swf"
- type="application/x-shockwave-flash"
- allowscriptaccess="always"
- allowfullscreen="true"
- wmode="direct"
- width="920"
- height="640"
- flashvars="src=http://devplatem.vo.msecnd.net/Sintel/Sintel_H264.ism/manifest&autoPlay=true&plugin_AdaptiveStreamingPlugin=http://yourdomain/MSAdaptiveStreamingPlugin-v1.0.3-osmf2.0.swf&AdaptiveStreamingPlugin_retryLive=true&AdaptiveStreamingPlugin_retryInterval=10">
- </embed>
- </object>
- </html>
- ```
-
-6. Save your HTML page and publish it to a web server. Browse to the published web page using your favorite Flash&reg; Player enabled Internet browser (Internet Explorer, Chrome, Firefox, and so on).
-7. Enjoy Smooth Streaming content inside Adobe&reg; Flash&reg; Player.
-
-## See Also
-[Microsoft Adaptive Streaming Plugin for OSMF Update](https://azure.microsoft.com/blog/2014/10/27/microsoft-adaptive-streaming-plugin-for-osmf-update/)
-
media-services Media Services Widevine License Template Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-widevine-license-template-overview.md
- Title: Widevine license template overview | Microsoft Docs
-description: This topic gives an overview of a Widevine license template that is used to configure Widevine licenses.
- Previously updated: 3/10/2021
-# Widevine license template overview
-
-You can use Azure Media Services to configure and request Google Widevine licenses. When the player tries to play your Widevine-protected content, a request is sent to the license delivery service to obtain a license. If the license service approves the request, the service issues the license. It's sent to the client and is used to decrypt and play the specified content.
-
-A Widevine license request is formatted as a JSON message.
-
->[!NOTE]
-> You can create an empty message with no values, just "{}". A license template is then created with defaults. The defaults work for most cases, and Microsoft-based license-delivery scenarios should always use them. If you set the "provider" and "content_id" values, the provider must match your Widevine credentials.
-
-```json
-{
- "payload": "<license challenge>",
- "content_id": "<content id>"
- "provider": "<provider>"
- "allowed_track_types": "<types>",
- "content_key_specs": [
- {
- "track_type": "<track type 1>"
- },
- {
- "track_type": "<track type 2>"
- },
- …
- ],
- "policy_overrides": {
- "can_play": <can play>,
- "can persist": <can persist>,
- "can_renew": <can renew>,
- "rental_duration_seconds": <rental duration>,
- "playback_duration_seconds": <playback duration>,
- "license_duration_seconds": <license duration>,
- "renewal_recovery_duration_seconds": <renewal recovery duration>,
- "renewal_server_url": "<renewal server url>",
- "renewal_delay_seconds": <renewal delay>,
- "renewal_retry_interval_seconds": <renewal retry interval>,
- "renew_with_usage": <renew with usage>
- }
-}
-```
-
-## JSON message
-| Name | Value | Description |
-| | | |
-| payload |Base64-encoded string |The license request sent by a client. |
-| content_id |Base64-encoded string |Identifier used to derive the key ID and content key for each content_key_specs.track_type. |
-| provider |string |Used to look up content keys and policies. If Microsoft key delivery is used for Widevine license delivery, this parameter is ignored. |
-| policy_name |string |Name of a previously registered policy. Optional. |
-| allowed_track_types |enum |SD_ONLY or SD_HD. Controls which content keys are included in a license. |
-| content_key_specs |Array of JSON structures, see the section "Content key specs." |A finer-grained control on which content keys to return. For more information, see the section "Content key specs." Only one of the allowed_track_types and content_key_specs values can be specified. |
-| use_policy_overrides_exclusively |Boolean, true or false |Use policy attributes specified by policy_overrides, and omit all previously stored policy. |
-| policy_overrides |JSON structure, see the section "Policy overrides." |Policy settings for this license. In the event this asset has a predefined policy, these specified values are used. |
-| session_init |JSON structure, see the section "Session initialization." |Optional data is passed to the license. |
-| parse_only |Boolean, true or false |The license request is parsed, but no license is issued. However, values from the license request are returned in the response. |
-
-## Content key specs
-If a preexisting policy exists, there is no need to specify any of the values in the content key spec. The preexisting policy associated with this content is used to determine the output protection, such as High-bandwidth Digital Content Protection (HDCP) and the Copy General Management System (CGMS). If a preexisting policy isn't registered with the Widevine license server, the content provider can inject the values into the license request.
-
-Each content_key_specs value must be specified for all tracks, regardless of the use_policy_overrides_exclusively option.
-
-| Name | Value | Description |
-| | | |
-| content_key_specs.track_type |string |A track type name. If content_key_specs is specified in the license request, make sure to specify all track types explicitly. Failure to do so results in failure to play back past 10 seconds. |
-| content_key_specs.<br/>security_level |uint32 |Defines client robustness requirements for playback. <br/> 1. Software-based white-box cryptography is required. <br/> 2. Software cryptography and an obfuscated decoder are required. <br/> 3. The key material and cryptography operations must be performed within a hardware-backed trusted execution environment. <br/> 4. The cryptography and decoding of content must be performed within a hardware-backed trusted execution environment. <br/> 5. The cryptography, decoding, and all handling of the media (compressed and uncompressed) must be handled within a hardware-backed trusted execution environment. |
-| content_key_specs.<br/>required_output_protection.hdcp |string, one of HDCP_NONE, HDCP_V1, HDCP_V2 |Indicates whether HDCP is required. |
-| content_key_specs.<br/>key |Base64-<br/>encoded string |Content key to use for this track. If specified, the track_type or key_id is required. The content provider can use this option to inject the content key for this track instead of letting the Widevine license server generate or look up a key. |
-| content_key_specs.key_id |Base64-encoded string binary, 16 bytes |Unique identifier for the key. |
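-
-For example, if content_key_specs is passed in the license request, every track type in the asset must be listed. A minimal sketch using the .NET types defined later in this article (the "SD", "HD", and "AUDIO" track type names are illustrative assumptions):
-
-```csharp
-// Sketch: list a ContentKeySpecs entry for every track type the asset contains.
-// Track type names are assumptions; use the names your packaging actually produces.
-var contentKeySpecs = new[]
-{
-    new ContentKeySpecs
-    {
-        track_type = "SD",
-        security_level = 1,
-        required_output_protection = new RequiredOutputProtection { hdcp = Hdcp.HDCP_NONE }
-    },
-    new ContentKeySpecs
-    {
-        track_type = "HD",
-        security_level = 1,
-        required_output_protection = new RequiredOutputProtection { hdcp = Hdcp.HDCP_V2 }
-    },
-    new ContentKeySpecs
-    {
-        track_type = "AUDIO",
-        security_level = 1,
-        required_output_protection = new RequiredOutputProtection { hdcp = Hdcp.HDCP_NONE }
-    }
-};
-```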
-
-## Policy overrides
-| Name | Value | Description |
-| | | |
-| policy_overrides.can_play |Boolean, true or false |Indicates that playback of the content is allowed. Default is false. |
-| policy_overrides.can_persist |Boolean, true or false |Indicates that the license might be persisted to nonvolatile storage for offline use. Default is false. |
-| policy_overrides.can_renew |Boolean, true or false |Indicates that renewal of this license is allowed. If true, the duration of the license can be extended by heartbeat. Default is false. |
-| policy_overrides.license_duration_seconds |int64 |Indicates the time window for this specific license. A value of 0 indicates that there is no limit to the duration. Default is 0. |
-| policy_overrides.rental_duration_seconds |int64 |Indicates the time window while playback is permitted. A value of 0 indicates that there is no limit to the duration. Default is 0. |
-| policy_overrides.playback_duration_seconds |int64 |The viewing window of time after playback starts within the license duration. A value of 0 indicates that there is no limit to the duration. Default is 0. |
-| policy_overrides.renewal_server_url |string |All heartbeat (renewal) requests for this license are directed to the specified URL. This field is used only if can_renew is true. |
-| policy_overrides.renewal_delay_seconds |int64 |How many seconds after license_start_time before renewal is first attempted. This field is used only if can_renew is true. Default is 0. |
-| policy_overrides.renewal_retry_interval_seconds |int64 |Specifies the delay in seconds between subsequent license renewal requests, in case of failure. This field is used only if can_renew is true. |
-| policy_overrides.renewal_recovery_duration_seconds |int64 |The window of time in which playback can continue while renewal is attempted, yet unsuccessful due to back-end problems with the license server. A value of 0 indicates that there is no limit to the duration. This field is used only if can_renew is true. |
-| policy_overrides.renew_with_usage |Boolean, true or false |Indicates that the license is sent for renewal when usage starts. This field is used only if can_renew is true. |
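-
-For example, a two-week rental could combine several of these settings. The following is a hedged sketch in the anonymous-object style of the .NET example later in this article; all duration values and the renewal URL are illustrative assumptions:
-
-```csharp
-// Sketch: policy_overrides for a 14-day rental with renewable licenses.
-// Durations and the renewal URL are illustrative assumptions.
-var policyOverrides = new
-{
-    can_play = true,
-    can_persist = false,
-    can_renew = true,
-    renewal_server_url = "https://yourdomain/renew",  // hypothetical renewal endpoint
-    rental_duration_seconds = 1209600,   // 14 days
-    playback_duration_seconds = 172800,  // 48 hours once playback starts
-    license_duration_seconds = 86400     // extended by heartbeat because can_renew is true
-};
-```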
-
-## Session initialization
-| Name | Value | Description |
-| | | |
-| provider_session_token |Base64-encoded string |This session token is passed back in the license and exists in subsequent renewals. The session token doesn't persist beyond sessions. |
-| provider_client_token |Base64-encoded string |Client token to send back in the license response. If the license request contains a client token, this value is ignored. The client token persists beyond license sessions. |
-| override_provider_client_token |Boolean, true or false |If false and the license request contains a client token, use the token from the request even if a client token was specified in this structure. If true, always use the token specified in this structure. |
-
-## Configure your Widevine licenses by using .NET types
-Media Services provides .NET APIs that you can use to configure your Widevine licenses.
-
-### Classes as defined in the Media Services .NET SDK
-The following classes define these types:
-
-```csharp
-public class WidevineMessage
-{
- public WidevineMessage();
-
- [JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
- public AllowedTrackTypes? allowed_track_types { get; set; }
- [JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
- public ContentKeySpecs[] content_key_specs { get; set; }
- [JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
- public object policy_overrides { get; set; }
-}
-
-[JsonConverter(typeof(StringEnumConverter))]
-public enum AllowedTrackTypes
-{
- SD_ONLY = 0,
- SD_HD = 1
-}
-public class ContentKeySpecs
-{
- public ContentKeySpecs();
-
- [JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
- public string key_id { get; set; }
- [JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
- public RequiredOutputProtection required_output_protection { get; set; }
- [JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
- public int? security_level { get; set; }
- [JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
- public string track_type { get; set; }
-}
-
-public class RequiredOutputProtection
-{
- public RequiredOutputProtection();
-
- public Hdcp hdcp { get; set; }
-}
-
-[JsonConverter(typeof(StringEnumConverter))]
-public enum Hdcp
-{
- HDCP_NONE = 0,
- HDCP_V1 = 1,
- HDCP_V2 = 2
-}
-```
-
-### Example
-The following example shows how to use .NET APIs to configure a simple Widevine license:
-
-```csharp
-private static string ConfigureWidevineLicenseTemplate()
-{
- var template = new WidevineMessage
- {
- allowed_track_types = AllowedTrackTypes.SD_HD,
- content_key_specs = new[]
- {
- new ContentKeySpecs
- {
- required_output_protection = new RequiredOutputProtection { hdcp = Hdcp.HDCP_NONE},
- security_level = 1,
- track_type = "SD"
- }
- },
- policy_overrides = new
- {
- can_play = true,
- can_persist = true,
- can_renew = false
- }
- };
-
- string configuration = JsonConvert.SerializeObject(template);
- return configuration;
-}
-```
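-
-The JSON string returned by ConfigureWidevineLicenseTemplate is typically passed as the key delivery configuration when creating a ContentKeyAuthorizationPolicyOption. The following is a minimal sketch that assumes an open restriction and a Media Services context named _context:
-
-```csharp
-// Sketch: use the serialized Widevine template as the key delivery configuration.
-List<ContentKeyAuthorizationPolicyRestriction> restrictions =
-    new List<ContentKeyAuthorizationPolicyRestriction>
-    {
-        new ContentKeyAuthorizationPolicyRestriction
-        {
-            Name = "Open",
-            KeyRestrictionType = (int)ContentKeyRestrictionType.Open,
-            Requirements = null
-        }
-    };
-
-IContentKeyAuthorizationPolicyOption policyOption =
-    _context.ContentKeyAuthorizationPolicyOptions.Create(
-        "Widevine option",
-        ContentKeyDeliveryType.Widevine,
-        restrictions,
-        ConfigureWidevineLicenseTemplate());
-```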
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## See also
-[Use PlayReady and/or Widevine dynamic common encryption](media-services-protect-with-playready-widevine.md)
-
media-services Media Services Workflow Designer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/media-services-workflow-designer.md
- Title: Create Advanced Encoding Workflows with Workflow Designer | Microsoft Docs
-description: Learn about how to create advanced encoding workflows with Workflow Designer.
- Previously updated: 3/10/2021
-# Create Advanced Encoding Workflows with Workflow Designer
-
-## Overview
-The **Workflow Designer** is a Windows desktop tool that is used to design and build custom workflows for encoding with **Media Encoder Premium Workflow**.
-By using the power of the workflow designer tool, you can design and create complex workflows that will run in **Media Encoder Premium**.
-
-Workflows can include custom decision logic and branching based on the input source file's properties.
-You can create workflows with overridable properties and dynamic values to make even the most complex encoding tasks easy to repeat and customize in the cloud.
-
-Example workflows that you can create include:
-
-* Decision-based workflows that inspect the source content for resolution and encode only the desired output tracks. This avoids the wasted tracks that would otherwise be generated by inadvertently upscaling the source content.
-* Workflows that use multiple input files to support captions, overlays, and stitching content together.
-
-This tool can also be used to modify any of our [published workflows](media-services-workflow-designer.md#existing_workflows).
-
-> [!NOTE]
-> To get your copy of the Workflow Designer tool, please contact mepd@microsoft.com.
-
-Once a workflow file is created, it can be uploaded as an Asset, and then be used for encoding media files. For information on how to encode with **Media Encoder Premium Workflow** using **.NET**, see [Advanced encoding with Media Encoder Premium Workflow](media-services-encode-with-premium-workflow.md).
-
-## <a id="existing_workflows"></a>Modify existing workflows
-The default [published workflows](media-services-workflow-designer.md#existing_workflows) can be modified using the designer tool. You can get the default workflow files [here](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/media-services/previous/media-services-encode-with-premium-workflow.md). The folder also contains the description of these files.
-
-The following videos demonstrate how to use the designer.
-
-### Day 1: Getting Started
-Day 1 video covers:
-
-* Designer Overview
-* Basic Workflows: "Hello World"
-* Creating multiple output MP4 files for use with Azure Media Services streaming
-
-### Day 2
-Day 2 video covers:
-
-* Varying source file scenarios: handling audio
-* Workflows with advanced logic
-* Graph stages
-
-### Day 3
-Day 3 video covers:
-
-* Scripting inside of Workflows/Blueprints
-* Restrictions with the current Encoder
-* Q&A
-
-## Need help?
-
-You can open a support ticket by navigating to [New support request](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest).
-
-## Next step
-Review Media Services learning paths.
-
-## See Also
-[Azure Premium Encoder Workflow Designer Training Videos](http://johndeutscher.com/2015/07/06/azure-premium-encoder-workflow-designer-training-videos/)
-
media-services Migrate Azure Media Encoder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/migrate-azure-media-encoder.md
- Title: Migrate from Azure Media Encoder to Media Encoder Standard | Microsoft Docs
-description: This topic discusses how to migrate from Azure Media Encoder to the Media Encoder Standard media processor.
- Previously updated: 3/10/2021
-# Migrate from Azure Media Encoder to Media Encoder Standard
-
-This article discusses the steps for migrating from the legacy Azure Media Encoder (AME) media processor (which is being retired) to the Media Encoder Standard media processor. For the retirement dates, see this [legacy components](legacy-components.md) topic.
-
-When encoding files with AME, customers typically used a named preset string such as `H264 Adaptive Bitrate MP4 Set 1080p`. In order to migrate, your code needs to be updated to use the **Media Encoder Standard** media processor instead of AME, and one of the equivalent [system presets](media-services-mes-presets-overview.md) like `H264 Multiple Bitrate 1080p`.
-
-## Migrating to Media Encoder Standard
-
-Here is a typical C# code sample that uses the legacy media processor.
-
-```csharp
-// Declare a new job.
-IJob job = _context.Jobs.Create("AME Job");
-// Get a media processor reference, and pass to it the name of the
-// processor to use for the specific task.
-IMediaProcessor processor = GetLatestMediaProcessorByName("Azure Media Encoder");
-
-// Create a task with the encoding details, using a string preset.
-// In this case " H264 Adaptive Bitrate MP4 Set 1080p" preset is used.
-ITask task = job.Tasks.AddNew("My encoding task",
- processor,
- " H264 Adaptive Bitrate MP4 Set 1080p",
- TaskOptions.None);
-```
-
-Here is the updated version that uses Media Encoder Standard.
-
-```csharp
-// Declare a new job.
-IJob job = _context.Jobs.Create("Media Encoder Standard Job");
-// Get a media processor reference, and pass to it the name of the
-// processor to use for the specific task.
-IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
-// Create a task with the encoding details, using a string preset.
-// In this case " H264 Multiple Bitrate 1080p" preset is used.
-ITask task = job.Tasks.AddNew("My encoding task",
- processor,
- "H264 Multiple Bitrate 1080p",
- TaskOptions.None);
-```
-
-### Advanced scenarios
-
-If you had created your own encoding preset for AME using its schema, there is an [equivalent schema for Media Encoder Standard](media-services-mes-schema.md). If you have questions about how to map the older settings to the new encoder, please reach out to us via mailto:amshelp@microsoft.com.
-
-## Known differences
-
-Media Encoder Standard is more robust and reliable, has better performance, and produces better quality output than the legacy AME encoder. In addition:
-
-* Media Encoder Standard produces output files with a different naming convention than AME.
-* Media Encoder Standard produces artifacts such as files containing the [input file metadata](media-services-input-metadata-schema.md) and the [output file(s) metadata](media-services-output-metadata-schema.md).
-
-## Next steps
-
-* [Legacy components](legacy-components.md)
-* [Pricing page](https://azure.microsoft.com/pricing/details/media-services/#encoding)
media-services Migrate Indexer V1 V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/migrate-indexer-v1-v2.md
- Title: Migrate from Indexer v1 and v2 to Azure Media Services Video Indexer | Microsoft Docs
-description: This topic discusses how to migrate from Azure Media Indexer v1 and v2 to Azure Media Services Video Indexer.
- Previously updated: 07/26/2021
-# Migrate from Media Indexer and Media Indexer 2 to Video Analyzer for Media
-
-> [!IMPORTANT]
-> It is recommended that customers migrate from Indexer v1 and Indexer v2 to using the [Media Services v3 AudioAnalyzerPreset Basic mode](../latest/analyze-video-audio-files-concept.md). The [Azure Media Indexer](media-services-index-content.md) media processor and [Azure Media Indexer 2 Preview](./legacy-components.md) media processors are being retired. For the retirement dates, see this [legacy components](legacy-components.md) topic.
-
-Azure Video Analyzer for Media is built on Azure Media Analytics, Azure Cognitive Search, and Cognitive Services (such as the Face API, Microsoft Translator, the Computer Vision API, and Custom Speech Service). It enables you to extract insights from your videos using Video Analyzer for Media video and audio models. To see which scenarios Video Analyzer for Media can be used in, what features it offers, and how to get started, see [Video Analyzer for Media video and audio models](../../azure-video-analyzer/video-analyzer-for-media-docs/video-indexer-overview.md).
-
-You can extract insights from your video and audio files by using the [Azure Media Services v3 analyzer presets](../latest/analyze-video-audio-files-concept.md) or directly by using the [Video Analyzer for Media APIs](https://api-portal.videoindexer.ai/). Currently, there is an overlap between features offered by the Video Analyzer for Media APIs and the Media Services v3 APIs.
-
-> [!NOTE]
-> To understand the differences between the Video Analyzer for Media vs. Media Services analyzer presets, check out the [comparison document](../../azure-video-analyzer/video-analyzer-for-media-docs/compare-video-indexer-with-media-services-presets.md).
-
-This article discusses the steps for migrating from the Azure Media Indexer and Azure Media Indexer 2 to Video Analyzer for Media.
-
-## Migration options
-
-|If you require |then |
-|||
-|a solution that provides speech-to-text transcription for any media file format in closed caption file formats (VTT, SRT, or TTML),<br/>as well as additional audio insights such as keywords, topic inferencing, acoustic events, speaker diarization, entity extraction, and translation| update your applications to use the Video Analyzer for Media capabilities through the Video Analyzer for Media v2 REST API or the Azure Media Services v3 Audio Analyzer preset.|
-|speech-to-text capabilities| use the Cognitive Services Speech API directly.|
-
-## Getting started with Video Analyzer for Media
-
-The following section points you to relevant links: [How can I get started with Video Analyzer for Media?](../../azure-video-analyzer/video-analyzer-for-media-docs/video-indexer-overview.md#how-can-i-get-started-with-video-analyzer-for-media)
-
-## Getting started with Media Services v3 APIs
-
-Azure Media Services v3 API enables you to extract insights from your video and audio files through the [Azure Media Services v3 analyzer presets](../latest/analyze-video-audio-files-concept.md).
-
-**AudioAnalyzerPreset** enables you to extract multiple audio insights from an audio or video file. The output includes a VTT or TTML file for the audio transcript and a JSON file (with all the additional audio insights). The audio insights include keywords, speaker indexing, and speech sentiment analysis. AudioAnalyzerPreset also supports language detection for specific languages. For detailed information, see [Transforms](/rest/api/media/transforms/createorupdate#audioanalyzerpreset).
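-
-For illustration, with the Media Services v3 .NET SDK (Microsoft.Azure.Management.Media), creating a Transform that uses AudioAnalyzerPreset in basic mode looks roughly like the following sketch; the resource group, account, and transform names are placeholders, and client is assumed to be an authenticated IAzureMediaServicesClient:
-
-```csharp
-// Sketch: create a Transform with AudioAnalyzerPreset (basic mode) in the v3 .NET SDK.
-// "myResourceGroup", "myAccount", and "AudioAnalyzerTransform" are placeholder names.
-TransformOutput[] outputs = new TransformOutput[]
-{
-    new TransformOutput(
-        new AudioAnalyzerPreset(
-            audioLanguage: "en-US",
-            mode: AudioAnalysisMode.Basic))
-};
-
-Transform transform = await client.Transforms.CreateOrUpdateAsync(
-    "myResourceGroup",
-    "myAccount",
-    "AudioAnalyzerTransform",
-    outputs);
-```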
-
-### Get started
-
-To get started see:
-
-* [Tutorial](../latest/analyze-videos-tutorial.md)
-* AudioAnalyzerPreset samples: [Java SDK](https://github.com/Azure-Samples/media-services-v3-java/tree/master/AudioAnalytics/AudioAnalyzer) or [.NET SDK](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/master/AudioAnalytics/AudioAnalyzer)
-* VideoAnalyzerPreset samples: [Java SDK](https://github.com/Azure-Samples/media-services-v3-java/tree/master/VideoAnalytics/VideoAnalyzer) or [.NET SDK](https://github.com/Azure-Samples/media-services-v3-dotnet/tree/master/VideoAnalytics/VideoAnalyzer)
-
-## Getting started with Cognitive Services Speech Services
-
-[Azure Cognitive Services](../../cognitive-services/index.yml) provides a speech-to-text service that transcribes audio streams to text in real time that your applications, tools, or devices can consume or display. You can use speech-to-text to [customize your own acoustic model, language model, or pronunciation model](../../cognitive-services/speech-service/how-to-custom-speech-train-model.md). For more information, see [Cognitive Services speech-to-text](../../cognitive-services/speech-service/speech-to-text.md).
-
-> [!NOTE]
-> The speech-to-text service does not take video file formats and only takes [certain audio formats](../../cognitive-services/speech-service/rest-speech-to-text.md#audio-formats).
-
-For more information about the speech-to-text service and how to get started, see [What is speech-to-text?](../../cognitive-services/speech-service/speech-to-text.md)
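-
-As an illustration, a one-shot transcription with the Cognitive Services Speech SDK for .NET takes only a few lines. This is a sketch; the subscription key, region, and file path are placeholders:
-
-```csharp
-// Sketch: one-shot speech-to-text with the Cognitive Services Speech SDK
-// (Microsoft.CognitiveServices.Speech NuGet package).
-using Microsoft.CognitiveServices.Speech;
-using Microsoft.CognitiveServices.Speech.Audio;
-
-var config = SpeechConfig.FromSubscription("<subscription-key>", "<region>");
-using var audioInput = AudioConfig.FromWavFileInput("audio.wav"); // must be a supported audio format
-using var recognizer = new SpeechRecognizer(config, audioInput);
-
-SpeechRecognitionResult result = await recognizer.RecognizeOnceAsync();
-if (result.Reason == ResultReason.RecognizedSpeech)
-{
-    Console.WriteLine($"Transcript: {result.Text}");
-}
-```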
-
-## Known differences from deprecated services
-
-You will find that Video Analyzer for Media, the Azure Media Services v3 AudioAnalyzerPreset, and Cognitive Services Speech Services are more reliable and produce better quality output than the retired Azure Media Indexer 1 and Azure Media Indexer 2 processors.
-
-Some known differences include:
-
-* Cognitive Services Speech Services does not support keyword extraction. However, Video Analyzer for Media and Media Services v3 AudioAnalyzerPreset both offer a more robust set of keywords in JSON file format.
-
-## Support
-
-You can open a support ticket by navigating to [New support request](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest).
-
-## Next steps
-
-* [Legacy components](legacy-components.md)
-* [Pricing page](https://azure.microsoft.com/pricing/details/media-services/#encoding)
media-services Migrate Windows Azure Media Encoder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/migrate-windows-azure-media-encoder.md
- Title: Migrate from Windows Azure Media Encoder to Media Encoder Standard | Microsoft Docs
-description: This topic discusses how to migrate from Windows Azure Media Encoder to the Media Encoder Standard media processor.
- Previously updated: 3/10/2021
-# Migrate from Windows Azure Media Encoder to Media Encoder Standard
-
-This article discusses the steps for migrating from the legacy Windows Azure Media Encoder (WAME) media processor (which is being retired) to the Media Encoder Standard media processor. For the retirement dates, see this [legacy components](legacy-components.md) topic.
-
-When encoding files with WAME, customers typically used a named preset string such as `H264 Adaptive Bitrate MP4 Set 1080p`. In order to migrate, your code needs to be updated to use the **Media Encoder Standard** media processor instead of WAME, and one of the equivalent [system presets](media-services-mes-presets-overview.md) like `H264 Multiple Bitrate 1080p`.
-
-## Migrating to Media Encoder Standard
-
-Here is a typical C# code sample that uses the legacy component.
-
-```csharp
-// Declare a new job.
-IJob job = _context.Jobs.Create("WAME Job");
-// Get a media processor reference, and pass to it the name of the
-// processor to use for the specific task.
-IMediaProcessor processor = GetLatestMediaProcessorByName("Windows Azure Media Encoder");
-
-// Create a task with the encoding details, using a string preset.
-// In this case " H264 Adaptive Bitrate MP4 Set 1080p" preset is used.
-ITask task = job.Tasks.AddNew("My encoding task",
- processor,
- "H264 Adaptive Bitrate MP4 Set 1080p",
- TaskOptions.None);
-```
-
-Here is the updated version that uses Media Encoder Standard.
-
-```csharp
-// Declare a new job.
-IJob job = _context.Jobs.Create("Media Encoder Standard Job");
-// Get a media processor reference, and pass to it the name of the
-// processor to use for the specific task.
-IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");
-
-// Create a task with the encoding details, using a string preset.
-// In this case " H264 Multiple Bitrate 1080p" preset is used.
-ITask task = job.Tasks.AddNew("My encoding task",
- processor,
- "H264 Multiple Bitrate 1080p",
- TaskOptions.None);
-```
-
-### Advanced scenarios
-
-If you had created your own encoding preset for WAME using its schema, there is an [equivalent schema for Media Encoder Standard](media-services-mes-schema.md).
-
-## Known differences
-
-Media Encoder Standard is more robust and reliable, has better performance, and produces better quality output than the legacy WAME encoder. In addition:
-
-* Media Encoder Standard produces output files with a different naming convention than WAME.
-* Media Encoder Standard produces artifacts such as files containing the [input file metadata](media-services-input-metadata-schema.md) and the [output file(s) metadata](media-services-output-metadata-schema.md).
-* As documented on the [pricing page](https://azure.microsoft.com/pricing/details/media-services/#encoding) (especially in the FAQ section), when you encode videos using Media Encoder Standard, you get billed based on the duration of the files produced as output. With WAME, you would be billed based on the sizes of the input video file(s) and output video file(s).
-
-## Need help?
-
-You can open a support ticket by navigating to [New support request](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest).
-
-## Next steps
-
-* [Legacy components](legacy-components.md)
-* [Pricing page](https://azure.microsoft.com/pricing/details/media-services/#encoding)
media-services Offline Playready Streaming Windows 10 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/offline-playready-streaming-windows-10.md
- Title: Configure your account for offline streaming of PlayReady protected content - Azure
-description: This article shows how to configure your Azure Media Services account for streaming PlayReady for Windows 10 offline.
-
-keywords: DASH, DRM, Widevine Offline Mode, ExoPlayer, Android
- Previously updated: 3/10/2021
-# Offline PlayReady Streaming for Windows 10
-
-> [!div class="op_single_selector" title1="Select the version of Media Services that you are using:"]
-> * [Version 3](../latest/drm-offline-playready-streaming-for-windows-10.md)
-> * [Version 2](offline-playready-streaming-windows-10.md)
-
-Azure Media Services supports offline download/playback with DRM protection. This article covers offline support in Azure Media Services for Windows 10/PlayReady clients. You can read about offline mode support for iOS/FairPlay and Android/Widevine devices in the following articles:
-- [Offline FairPlay Streaming for iOS](media-services-protect-hls-with-offline-fairplay.md)
-- [Offline Widevine Streaming for Android](offline-widevine-for-android.md)
-
-## Overview
-
-This section gives some background on offline mode playback, especially why it's needed:
-
-* In some countries/regions, Internet availability and/or bandwidth is still limited. Users may choose to download content first so they can watch it in a high enough resolution for a satisfactory viewing experience. In this case, the issue is often not network availability but limited network bandwidth. OTT/OVP providers are asking for offline mode support.
-* As disclosed at Netflix's 2016 Q3 shareholder conference, downloading content is an "oft-requested feature", and "we are open to it," said Reed Hastings, Netflix CEO.
-* Some content providers may disallow DRM license delivery beyond a country/region's border. If a user needs to travel abroad and still wants to watch content, offline download is needed.
-
-The challenges we face in implementing offline mode are the following:
-
-* MP4 is supported by many players and encoder tools, but there is no binding between the MP4 container and DRM;
-* In the long term, CFF with CENC is the way to go. However, today, the tools/player support ecosystem is not there yet. We need a solution today.
-
-The idea is this: the smooth streaming ([PIFF](/iis/media/smooth-streaming/protected-interoperable-file-format)) file format with H264/AAC has a binding with PlayReady (AES-128 CTR). An individual smooth streaming .ismv file (assuming audio is muxed in video) is itself an fMP4 and can be used for playback. If smooth streaming content goes through PlayReady encryption, each .ismv file becomes a PlayReady protected fragmented MP4. We can choose an .ismv file with the preferred bitrate and rename it as .mp4 for download.
-
-There are two options for hosting the PlayReady protected MP4 for progressive download (a .NET sketch of the second option follows this list):
-
-* One can put this MP4 in the same container/media service asset and leverage the Azure Media Services streaming endpoint for progressive download;
-* One can use a SAS locator for progressive download directly from Azure Storage, bypassing Azure Media Services.
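-
-The SAS locator option can be set up with the Media Services v2 .NET SDK. A minimal sketch, assuming _context is a CloudMediaContext and asset is the IAsset containing the renamed MP4 (the file name matches the test assets below):
-
-```csharp
-// Sketch: create a read-only SAS locator for progressive download from Azure Storage.
-IAccessPolicy policy = _context.AccessPolicies.Create(
-    "Download policy",
-    TimeSpan.FromDays(30),
-    AccessPermissions.Read);
-
-ILocator locator = _context.Locators.CreateSasLocator(asset, policy);
-
-// Compose the progressive download URL for a specific file in the asset.
-var uriBuilder = new UriBuilder(locator.Path);
-uriBuilder.Path += "/20150807-bridges-2500_H264_1644kbps_AAC_und_ch2_256kbps.mp4";
-Uri downloadUrl = uriBuilder.Uri;
-```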
-
-You can use two types of PlayReady license delivery:
-
-* PlayReady license delivery service in Azure Media Services;
-* PlayReady license servers hosted anywhere.
-
-Below are two sets of test assets: the first uses PlayReady license delivery in AMS, while the second uses my PlayReady license server hosted on an Azure VM:
-
-Asset #1:
-
-* Progressive download URL: [https://willzhanmswest.streaming.mediaservices.windows.net/8d078cf8-d621-406c-84ca-88e6b9454acc/20150807-bridges-2500_H264_1644kbps_AAC_und_ch2_256kbps.mp4](https://willzhanmswest.streaming.mediaservices.windows.net/8d078cf8-d621-406c-84ca-88e6b9454acc/20150807-bridges-2500_H264_1644kbps_AAC_und_ch2_256kbps.mp4)
-* PlayReady LA_URL (AMS): `https://willzhanmswest.keydelivery.mediaservices.windows.net/PlayReady/`
-
-Asset #2:
-
-* Progressive download URL: [https://willzhanmswest.streaming.mediaservices.windows.net/7c085a59-ae9a-411e-842c-ef10f96c3f89/20150807-bridges-2500_H264_1644kbps_AAC_und_ch2_256kbps.mp4](https://willzhanmswest.streaming.mediaservices.windows.net/7c085a59-ae9a-411e-842c-ef10f96c3f89/20150807-bridges-2500_H264_1644kbps_AAC_und_ch2_256kbps.mp4)
-* PlayReady LA_URL (on-prem): `https://willzhan12.cloudapp.net/playready/rightsmanager.asmx`
-
-For playback testing, I used a Universal Windows Application on Windows 10. In [Windows 10 Universal samples](https://github.com/Microsoft/Windows-universal-samples), there is a basic player sample called [Adaptive Streaming Sample](https://github.com/Microsoft/Windows-universal-samples/tree/master/Samples/AdaptiveStreaming). All we have to do is add code to pick the downloaded video and use it as the source instead of the adaptive streaming source. The changes are in the button click event handler:
-
-```csharp
-private async void LoadUri_Click(object sender, RoutedEventArgs e)
-{
- //Uri uri;
- //if (!Uri.TryCreate(UriBox.Text, UriKind.Absolute, out uri))
- //{
- // Log("Malformed Uri in Load text box.");
- // return;
- //}
- //LoadSourceFromUriTask = LoadSourceFromUriAsync(uri);
- //await LoadSourceFromUriTask;
-
- //willzhan change start
- // Create and open the file picker
- FileOpenPicker openPicker = new FileOpenPicker();
- openPicker.ViewMode = PickerViewMode.Thumbnail;
- openPicker.SuggestedStartLocation = PickerLocationId.ComputerFolder;
- openPicker.FileTypeFilter.Add(".mp4");
- openPicker.FileTypeFilter.Add(".ismv");
- //openPicker.FileTypeFilter.Add(".mkv");
- //openPicker.FileTypeFilter.Add(".avi");
-
- StorageFile file = await openPicker.PickSingleFileAsync();
-
- if (file != null)
- {
- //rootPage.NotifyUser("Picked video: " + file.Name, NotifyType.StatusMessage);
- this.mediaPlayerElement.MediaPlayer.Source = MediaSource.CreateFromStorageFile(file);
- this.mediaPlayerElement.MediaPlayer.Play();
- UriBox.Text = file.Path;
- }
- else
- {
- // rootPage.NotifyUser("Operation cancelled.", NotifyType.ErrorMessage);
- }
-
- // On small screens, hide the description text to make room for the video.
- DescriptionText.Visibility = (ActualHeight < 500) ? Visibility.Collapsed : Visibility.Visible;
-}
-```
-
- ![Offline mode playback of PlayReady protected fMP4](./media/offline-playready/offline-playready1.jpg)
-
-Since the video is under PlayReady protection, the screenshot can't show the video itself.
-
-In summary, we have achieved offline mode on Azure Media Services:
-
-* Content transcoding and PlayReady encryption can be done in Azure Media Services or other tools;
-* Content can be hosted in Azure Media Services or Azure Storage for progressive download;
-* PlayReady license delivery can be from Azure Media Services or elsewhere;
-* The prepared smooth streaming content can still be used for online streaming via DASH or smooth with PlayReady as the DRM.
-
-## Additional notes
-
-* Widevine is a service provided by Google Inc. and subject to the terms of service and Privacy Policy of Google, Inc.
-
-## Next steps
-
-[Hybrid DRM system design](hybrid-design-drm-sybsystem.md)
media-services Offline Widevine For Android https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/offline-widevine-for-android.md
- Title: Configure your account for offline streaming of Widevine protected content - Azure
-description: This topic shows how to configure your Azure Media Services account for offline streaming of Widevine protected content.
-
-keywords: DASH, DRM, Widevine Offline Mode, ExoPlayer, Android
- Previously updated: 3/10/2021
-# Offline Widevine streaming for Android
-
-> [!div class="op_single_selector" title1="Select the version of Media Services that you are using:"]
-> * [Version 3](../latest/drm-offline-widevine-for-android.md)
-> * [Version 2](offline-widevine-for-android.md)
-
-In addition to protecting content for online streaming, media content subscription and rental services offer downloadable content that works when you are not connected to the internet. You might need to download content onto your phone or tablet for playback in airplane mode when flying disconnected from the network. Additional scenarios in which you might want to download content:
-- Some content providers may disallow DRM license delivery beyond a country/region's border. If a user wants to watch content while traveling abroad, offline download is needed.
-- In some countries/regions, Internet availability and/or bandwidth is limited. Users may choose to download content to be able to watch it in high enough resolution for a satisfactory viewing experience.
-
-This article discusses how to implement offline mode playback for DASH content protected by Widevine on Android devices. Offline DRM allows you to provide subscription, rental, and purchase models for your content, enabling customers of your services to easily take content with them when disconnected from the internet.
-
-For building the Android player apps, we outline three options:
-
-> [!div class="checklist"]
-> * Build a player using the Java API of ExoPlayer SDK
-> * Build a player using Xamarin binding of ExoPlayer SDK
-> * Build a player using Encrypted Media Extensions (EME) and Media Source Extensions (MSE) in Chrome mobile browser v62 or later
-
-The article also answers some common questions related to offline streaming of Widevine protected content.
-
-## Requirements
-
-Before implementing offline DRM for Widevine on Android devices, you should first:
--- Become familiar with the concepts introduced for online content protection using Widevine DRM. This is covered in detail in the following documents/samples:
- - [Use Azure Media Services to deliver DRM licenses or AES keys](media-services-deliver-keys-and-licenses.md)
- - [CENC with Multi-DRM and Access Control: A Reference Design and Implementation on Azure and Azure Media Services](media-services-cenc-with-multidrm-access-control.md)
- - [Using PlayReady and/or Widevine Dynamic Common Encryption with .NET](https://azure.microsoft.com/resources/samples/media-services-dotnet-dynamic-encryption-with-drm/)
- - [Use Azure Media Services to deliver PlayReady and/or Widevine licenses with .NET](https://azure.microsoft.com/resources/samples/media-services-dotnet-deliver-playready-widevine-licenses/)
-- Become familiar with the Google ExoPlayer SDK for Android, an open-source video player SDK capable of supporting offline Widevine DRM playback.
- - [ExoPlayer SDK](https://github.com/google/ExoPlayer)
- - [ExoPlayer Developer Guide](https://google.github.io/ExoPlayer/guide.html)
- - [ExoPlayer Developer Blog](https://medium.com/google-exoplayer)
-
-## Content protection configuration in Azure Media Services
-
-When you configure Widevine protection of an asset in Media Services, you need to create a ContentKeyAuthorizationPolicyOption, which specifies the following three things:
-
-1. DRM system (Widevine)
-2. ContentKeyAuthorizationPolicyRestriction specifying how content key delivery is authorized in license delivery service (Open or token authorization)
-3. DRM (Widevine) license template
-
-To enable **offline** mode for Widevine licenses, you need to configure [Widevine license template](media-services-widevine-license-template-overview.md). In the **policy_overrides** object, set the **can_persist** property to **true** (default is false).
-
-The following code sample uses .NET to enable **offline** mode for Widevine licenses. The code is based on the [Using PlayReady and/or Widevine Dynamic Common Encryption with .NET](https://github.com/Azure-Samples/media-services-dotnet-dynamic-encryption-with-drm) sample.
-
-```csharp
-private static string ConfigureWidevineLicenseTemplateOffline(Uri keyDeliveryUrl)
-{
- var template = new WidevineMessage
- {
- allowed_track_types = AllowedTrackTypes.SD_HD,
- content_key_specs = new[]
- {
- new ContentKeySpecs
- {
- required_output_protection = new RequiredOutputProtection { hdcp = Hdcp.HDCP_NONE},
- security_level = 1,
- track_type = "SD"
- }
- },
- policy_overrides = new
- {
- can_play = true,
- can_persist = true,
- //can_renew = true, //if you set can_renew = false, you do not need renewal_server_url
- //renewal_server_url = keyDeliveryUrl.ToString(), //not mandatory, renewal_server_url is needed only if license_duration_seconds is set
- can_renew = false,
- //rental_duration_seconds = 1209600,
- //playback_duration_seconds = 1209600,
- //license_duration_seconds = 1209600
- }
- };
-
- string configuration = Newtonsoft.Json.JsonConvert.SerializeObject(template);
- return configuration;
-}
-```
-
-## Configuring the Android player for offline playback
-
-The easiest way to develop a native player app for Android devices is to use the [Google ExoPlayer SDK](https://github.com/google/ExoPlayer), an open-source video player SDK. ExoPlayer supports features not currently supported by Android's native MediaPlayer API, including MPEG-DASH and Microsoft Smooth Streaming delivery protocols.
-
-ExoPlayer version 2.6 and higher includes many classes that support offline Widevine DRM playback. In particular, the OfflineLicenseHelper class provides utility functions to facilitate the use of the DefaultDrmSessionManager for downloading, renewing, and releasing offline licenses. The classes provided in the SDK folder "library/core/src/main/java/com/google/android/exoplayer2/offline/" support offline video content downloading.
-
-The following list of classes facilitate offline mode in the ExoPlayer SDK for Android:
-- library/core/src/main/java/com/google/android/exoplayer2/drm/OfflineLicenseHelper.java
-- library/core/src/main/java/com/google/android/exoplayer2/drm/DefaultDrmSession.java
-- library/core/src/main/java/com/google/android/exoplayer2/drm/DefaultDrmSessionManager.java
-- library/core/src/main/java/com/google/android/exoplayer2/drm/DrmSession.java
-- library/core/src/main/java/com/google/android/exoplayer2/drm/ErrorStateDrmSession.java
-- library/core/src/main/java/com/google/android/exoplayer2/drm/ExoMediaDrm.java
-- library/core/src/main/java/com/google/android/exoplayer2/offline/SegmentDownloader.java
-- library/core/src/main/java/com/google/android/exoplayer2/offline/DownloaderConstructorHelper.java
-- library/core/src/main/java/com/google/android/exoplayer2/offline/Downloader.java
-- library/dash/src/main/java/com/google/android/exoplayer2/source/dash/offline/DashDownloader.java
-
-Developers should reference the [ExoPlayer Developer Guide](https://google.github.io/ExoPlayer/guide.html) and the corresponding [Developer Blog](https://medium.com/google-exoplayer) during development of an application. Google has not released a fully documented reference implementation or sample code for the ExoPlayer app supporting Widevine offline at this time, so the information is limited to the developers guide and blog.
-
-### Working with older Android devices
-
-For some older Android devices, you must set values for the following **policy_overrides** properties (defined in the [Widevine license template](media-services-widevine-license-template-overview.md)): **rental_duration_seconds**, **playback_duration_seconds**, and **license_duration_seconds**. Alternatively, you can set them to zero, which means infinite/unlimited duration.
-
-The values must be set to avoid an integer overflow bug. For more explanation about the issue, see https://github.com/google/ExoPlayer/issues/3150 and https://github.com/google/ExoPlayer/issues/3112. <br/>If you do not set the values explicitly, very large values for **PlaybackDurationRemaining** and **LicenseDurationRemaining** will be assigned (for example, 9223372036854775807, which is the maximum positive value for a 64-bit integer). As a result, the Widevine license appears expired and hence the decryption will not happen.
-
-This issue does not occur on Android 5.0 Lollipop or later, since Android 5.0 is the first Android version designed to fully support ARMv8 ([Advanced RISC Machine](https://en.wikipedia.org/wiki/ARM_architecture)) and 64-bit platforms, while Android 4.4 KitKat was originally designed to support ARMv7 and 32-bit platforms, as with other older Android versions.
-
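-Putting this together, a Widevine template for older devices might set explicit durations, as in the following sketch; the two-week values mirror the commented-out values in the earlier sample and are illustrative:
-
-```csharp
-// Sketch: explicit durations avoid the integer overflow bug on older 32-bit Android devices.
-var template = new WidevineMessage
-{
-    allowed_track_types = AllowedTrackTypes.SD_HD,
-    policy_overrides = new
-    {
-        can_play = true,
-        can_persist = true,
-        can_renew = false,
-        rental_duration_seconds = 1209600,    // 14 days (illustrative)
-        playback_duration_seconds = 1209600,  // 14 days (illustrative)
-        license_duration_seconds = 1209600    // 14 days (illustrative)
-    }
-};
-```
-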
-## Using Xamarin to build an Android playback app
-
-You can find Xamarin bindings for ExoPlayer using the following links:
-- [Xamarin bindings library for the Google ExoPlayer library](https://github.com/martijn00/ExoPlayerXamarin)
-- [Xamarin bindings for ExoPlayer NuGet](https://www.nuget.org/packages/Xam.Plugins.Android.ExoPlayer/)
-
-Also, see the following thread: [Xamarin binding](https://github.com/martijn00/ExoPlayerXamarin/pull/57).
-
-## Chrome player apps for Android
-
-Starting with the release of [Chrome for Android v. 62](https://developers.google.com/web/updates/2017/09/chrome-62-media-updates), persistent license in EME is supported. [Widevine L1](https://developers.google.com/web/updates/2017/09/chrome-62-media-updates#widevine_l1) is now also supported in Chrome for Android. This allows you to create offline playback applications in Chrome if your end users have this (or higher) version of Chrome.
-
-In addition, Google has produced a Progressive Web App (PWA) sample and open-sourced it:
-- [Source code](https://github.com/GoogleChromeLabs/sample-media-pwa)
-- [Google hosted version](https://biograf-155113.appspot.com/ttt/episode-2/) (only works in Chrome v62 and higher on Android devices)
-
-If you upgrade your mobile Chrome browser to v62 (or higher) on an Android phone and test the above hosted sample app, you will see that both online streaming and offline playback work.
-
-The above open-source PWA app is authored in Node.js. If you want to host your own version on an Ubuntu server, keep in mind the following commonly encountered issues that can prevent playback:
-
-1. CORS issue: The sample video in the sample app is hosted in https://storage.googleapis.com/biograf-video-files/videos/. Google has set up CORS for all their test samples hosted in this Google Cloud Storage bucket. They are served with CORS headers that explicitly specify the CORS entry `https://biograf-155113.appspot.com` (the domain where Google hosts their sample), preventing access by any other sites. If you try, you will see the following HTTP error: `Failed to load https://storage.googleapis.com/biograf-video-files/videos/poly-sizzle-2015/mp4/dash.mpd: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https:\//13.85.80.81:8080' is therefore not allowed access. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.`
-2. Certificate issue: Starting from Chrome v58, EME for Widevine requires HTTPS. Therefore, you need to host the sample app over HTTPS with an X509 certificate. A usual test certificate does not work. You need to obtain a certificate meeting the following minimum requirements:
-   - Chrome and Firefox require a Subject Alternative Name (SAN) setting to exist in the certificate
-   - The certificate must have a trusted CA; a self-signed development certificate does not work
-   - The certificate must have a CN matching the DNS name of the web server or gateway
-
-## Frequently asked questions
-
-### Question
-
-How can I deliver persistent licenses (offline-enabled) for some clients/users and non-persistent licenses (offline-disabled) for others? Do I have to duplicate the content and use a separate content key?
-
-### Answer
-You do not need to duplicate the content. You can simply use a single copy of the content and a single ContentKeyAuthorizationPolicy, but two separate ContentKeyAuthorizationPolicyOptions:
-
-1. IContentKeyAuthorizationPolicyOption 1: uses persistent license, and ContentKeyAuthorizationPolicyRestriction 1 which contains a claim such as license_type = "Persistent"
-2. IContentKeyAuthorizationPolicyOption 2: uses non-persistent license, and ContentKeyAuthorizationPolicyRestriction 2 which contains a claim such as license_type = "Nonpersistent"
-
-This way, license requests coming in from the client app look identical. However, for different end users/devices, the STS applies business logic to issue JWT tokens containing different claims (one of the two license_type values above). The license service uses the claim value in the JWT token to decide what type of license to issue: persistent or non-persistent.
-
-This means the secure token service (STS) needs to have the business logic and client/device info to add the corresponding claim value into a token, as the sketch below illustrates.
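-
-The following is a hedged sketch of the two options in the v2 .NET SDK. The restriction lists and Widevine template strings are assumed to exist: the token templates require a license_type claim of "Persistent" or "Nonpersistent", and the Widevine templates set can_persist to true or false, respectively:
-
-```csharp
-// Sketch: one content key, two policy options that differ only in persistence.
-IContentKeyAuthorizationPolicyOption persistentOption =
-    _context.ContentKeyAuthorizationPolicyOptions.Create(
-        "Widevine persistent option",
-        ContentKeyDeliveryType.Widevine,
-        persistentTokenRestrictions,       // token template requires license_type = "Persistent"
-        widevinePersistentTemplate);       // can_persist = true
-
-IContentKeyAuthorizationPolicyOption nonPersistentOption =
-    _context.ContentKeyAuthorizationPolicyOptions.Create(
-        "Widevine non-persistent option",
-        ContentKeyDeliveryType.Widevine,
-        nonPersistentTokenRestrictions,    // token template requires license_type = "Nonpersistent"
-        widevineNonPersistentTemplate);    // can_persist = false
-
-// Both options are added to the same policy, so a single content key serves both cases.
-IContentKeyAuthorizationPolicy policy =
-    await _context.ContentKeyAuthorizationPolicies.CreateAsync("Widevine policy");
-policy.Options.Add(persistentOption);
-policy.Options.Add(nonPersistentOption);
-```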
-
-### Question
-
-For Widevine security levels, Google's [Widevine DRM Architecture Overview](https://storage.googleapis.com/wvdocs/Widevine_DRM_Architecture_Overview.pdf) documentation
-defines three different security levels. However, the [Azure Media Services documentation on the Widevine license template](./media-services-widevine-license-template-overview.md)
-outlines five different security levels. What is the relationship or mapping between the two different sets of security levels?
-
-### Answer
-
-Google's [Widevine DRM Architecture Overview](https://storage.googleapis.com/wvdocs/Widevine_DRM_Architecture_Overview.pdf)
-defines the following three security levels:
-
-1. Security Level 1: All content processing, cryptography, and control are performed within the Trusted Execution Environment (TEE). In some implementation models, security processing may be performed in different chips.
-2. Security Level 2: Performs cryptography (but not video processing) within the TEE: decrypted buffers are returned to the application domain and processed through separate video hardware or software. At level 2, however, cryptographic information is still processed only within the TEE.
-3. Security Level 3: Does not have a TEE on the device. Appropriate measures may be taken to protect the cryptographic information and decrypted content on the host operating system. A Level 3 implementation may also include a hardware cryptographic engine, but that only enhances performance, not security.
-
-At the same time, in [Azure Media Services documentation on Widevine license template](./media-services-widevine-license-template-overview.md), the security_level property of content_key_specs can have the following five different values (client robustness requirements for playback):
-
-1. Software-based whitebox crypto is required.
-2. Software crypto and an obfuscated decoder is required.
-3. The key material and crypto operations must be performed within a hardware backed TEE.
-4. The crypto and decoding of content must be performed within a hardware backed TEE.
-5. The crypto, decoding, and all handling of the media (compressed and uncompressed) must be handled within a hardware backed TEE.
-
-Both sets of security levels are defined by Google Widevine. The difference is the level at which they are used: architecture level or API level. The five security levels are used in the Widevine API. The content_key_specs object, which contains security_level, is deserialized and passed to the Widevine global delivery service by the Azure Media Services Widevine license service. The table below shows the mapping between the two sets of security levels.
-
-| **Security Levels Defined in Widevine Architecture** |**Security Levels Used in Widevine API**|
-|---|---|
-| **Security Level 1**: All content processing, cryptography, and control are performed within the Trusted Execution Environment (TEE). In some implementation models, security processing may be performed in different chips.|**security_level=5**: The crypto, decoding, and all handling of the media (compressed and uncompressed) must be handled within a hardware backed TEE.<br/><br/>**security_level=4**: The crypto and decoding of content must be performed within a hardware backed TEE.|
-| **Security Level 2**: Performs cryptography (but not video processing) within the TEE: decrypted buffers are returned to the application domain and processed through separate video hardware or software. At level 2, however, cryptographic information is still processed only within the TEE. | **security_level=3**: The key material and crypto operations must be performed within a hardware backed TEE. |
-| **Security Level 3**: Does not have a TEE on the device. Appropriate measures may be taken to protect the cryptographic information and decrypted content on host operating system. A Level 3 implementation may also include a hardware cryptographic engine, but that only enhances performance, not security. | **security_level=2**: Software crypto and an obfuscated decoder is required.<br/><br/>**security_level=1**: Software-based whitebox crypto is required.|
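-
-To see where the API-level values go, here is a minimal sketch of a Widevine license template of the kind the Media Services Widevine license service deserializes; the field names follow the license template documentation linked above, while the key ID and the chosen values are placeholders.
-
-```javascript
-// Minimal Widevine license template sketch (placeholder values).
-const widevineTemplate = {
-  allowed_track_types: 'SD_HD',
-  content_key_specs: [
-    {
-      track_type: 'SD',
-      key_id: '00000000-0000-0000-0000-000000000000', // placeholder key ID
-      security_level: 1, // API-level value: software-based whitebox crypto
-      required_output_protection: { hdcp: 'HDCP_NONE' },
-    },
-  ],
-  policy_overrides: {
-    can_play: true,
-    can_persist: true, // persistent license, e.g. for offline playback
-    can_renew: false,
-  },
-};
-
-// The template travels as JSON inside the key delivery configuration.
-const templateJson = JSON.stringify(widevineTemplate);
-```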
-
-### Question
-
-Why does content download take so long?
-
-### Answer
-
-There are two ways to improve download speed:
-
-1. Enable a CDN so that end users are more likely to hit the CDN instead of the origin/streaming endpoint for content download. If a user hits the streaming endpoint, each HLS segment or DASH fragment is dynamically packaged and encrypted. Even though this latency is on the millisecond scale for each segment/fragment, over an hour-long video the accumulated latency can be large and cause a longer download.
-2. Provide end users the option to selectively download video quality layers and audio tracks instead of all of the content. For offline mode, there is no point in downloading all of the quality layers. There are two ways to achieve this:
-   1. Client controlled: the player app auto-selects, or the end user selects, the video quality layer and audio tracks to download.
-   2. Service controlled: use the Dynamic Manifest feature in Azure Media Services to create a (global) filter, which limits the HLS playlist or DASH MPD to a single video quality layer and selected audio tracks; the download URL presented to end users will then include this filter (see the sketch after this list).
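-
-As a sketch of the service-controlled approach, the Node.js snippet below creates such a filter against the Media Services v2 REST Filters endpoint. The endpoint URL, access token, filter name, and bitrate range are placeholders, not values from this article.
-
-```javascript
-// Hypothetical sketch: create a global filter that keeps a single video quality layer.
-async function createMobileFilter(restApiEndpoint, accessToken) {
-  const response = await fetch(`${restApiEndpoint}/Filters`, {
-    method: 'POST',
-    headers: {
-      Authorization: `Bearer ${accessToken}`,
-      'x-ms-version': '2.15',
-      'Content-Type': 'application/json',
-      Accept: 'application/json',
-      DataServiceVersion: '3.0',
-      MaxDataServiceVersion: '3.0',
-    },
-    body: JSON.stringify({
-      Name: 'MobileOnly', // placeholder filter name
-      Tracks: [
-        {
-          PropertyConditions: [
-            { Property: 'Type', Value: 'video', Operator: 'Equal' },
-            // Keep one quality layer so offline downloads stay small.
-            { Property: 'Bitrate', Value: '550000-1350000', Operator: 'Equal' },
-          ],
-        },
-      ],
-    }),
-  });
-  return response.json();
-}
-```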
-
-## Additional notes
-
-* Widevine is a service provided by Google, Inc. and is subject to the Terms of Service and Privacy Policy of Google, Inc.
-
-## Summary
-
-This article discussed how to implement offline mode playback for DASH content protected by Widevine on Android devices. It also answered some common questions related to offline streaming of Widevine protected content.
media-services Postman Collection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/postman-collection.md
- Title: Import the Postman collection with Azure On-Demand Streaming operations
-description: Learn about the Postman collection that contains grouped HTTP requests that call Azure Media Services REST APIs.
- Previously updated: 3/10/2021
-# Import a Postman collection with On-Demand Streaming operations
-This article contains a definition of the **Postman** collection of grouped HTTP requests that call Azure Media Services REST APIs. For information about how to configure **Postman** so it can be used to call Media Services REST APIs, see the [Configure Postman for Media Services REST API calls](media-rest-apis-with-postman.md) tutorial.
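-
-As a taste of what the collection automates, here is a minimal Node.js sketch of its first request, which obtains the AAD token that every later request reuses. The STS endpoint, client ID, and client secret correspond to the collection's environment variables and are placeholders here.
-
-```javascript
-// Hypothetical sketch of the collection's "Get Azure AD Token" request.
-async function getAccessToken(aadStsEndpoint, clientId, clientSecret) {
-  const response = await fetch(aadStsEndpoint, {
-    method: 'POST',
-    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
-    body: new URLSearchParams({
-      grant_type: 'client_credentials',
-      client_id: clientId,
-      client_secret: clientSecret,
-      resource: 'https://rest.media.azure.net',
-    }),
-  });
-  const json = await response.json();
-  return json.access_token; // the collection stores this as {{AccessToken}}
-}
-```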
-
-```json
-{
- "info": {
- "name": "Azure Media Services Operations",
- "_postman_id": "3a9a704f-ec11-3433-a0dc-54e4fe39e9d8",
- "description": "Azure Media Service REST API v 2.0 Collection\n\nSupports AD service principal authentication\nFor details see: https://docs.microsoft.com/azure/media-services/media-services-rest-connect-with-aad\n\n",
- "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
- },
- "item": [
- {
- "name": "1. Get AAD Auth Token",
- "description": "To get started making calls to Azure Media Services you have to first do the following:\n1) Get Token and cache it.\n2) Get the Closest API endpoint from http://media.windows.net",
- "item": [
- {
- "name": "Get Azure AD Token for Service Principal Authentication",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "var json = JSON.parse(responseBody);",
- "postman.setEnvironmentVariable(\"AccessToken\", json.access_token);"
- ]
- }
- }
- ],
- "request": {
- "method": "POST",
- "header": [
- {
- "key": "Content-Type",
- "value": "application/x-www-form-urlencoded"
- },
- {
- "key": "Keep-Alive",
- "value": "true"
- }
- ],
- "body": {
- "mode": "urlencoded",
- "urlencoded": [
- {
- "key": "grant_type",
- "value": "client_credentials",
- "description": "",
- "type": "text"
- },
- {
- "key": "client_id",
- "value": "{{ClientID}}",
- "description": "The Client ID for your AAD application",
- "type": "text"
- },
- {
- "key": "client_secret",
- "value": "{{ClientSecret}}",
- "description": "The Client Secret for your AAD application Service principal",
- "type": "text"
- },
- {
- "key": "resource",
- "value": "https://rest.media.azure.net",
- "description": "Normally this is https://rest.media.azure.net",
- "type": "text"
- }
- ]
- },
- "url": {
- "raw": "{{AzureADSTSEndpoint}}",
- "host": [
- "{{AzureADSTSEndpoint}}"
- ]
- },
- "description": ""
- },
- "response": []
- }
- ]
- },
- {
- "name": "AccessPolicy",
- "description": "https://docs.microsoft.com/rest/api/media/operations/accesspolicy\n\nAn AccessPolicy defines the permissions and duration of storage SAS or streaming access to an Asset.",
- "item": [
- {
- "name": "Create AccessPolicy for ReadOnly",
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
-                                    "raw": "{\n\t\"Name\": \"NewReadPolicy\", \n\t\"DurationInMinutes\" : \"100\", \n\t\"Permissions\" : 1 \n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/AccessPolicies",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "AccessPolicies"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/accesspolicy#create_an_accesspolicy\n\n\n## Permissions: \nspecifies the access rights the client has when interacting with the Asset. Valid values are:\n\n- None = 0 (default)\n- Read = 1\n- Write = 2\n- Delete = 4\n- List = 8"
- },
- "response": []
- },
- {
- "name": "Create AccessPolicy for Upload (Write)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "var json = JSON.parse(responseBody);",
- "postman.setEnvironmentVariable(\"LastAccessPolicyId\", json.Id);"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"Name\": \"NewUploadPolicy\", \n\t\"DurationInMinutes\" : \"100\", \n\t\"Permissions\" : 2 \n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/AccessPolicies",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "AccessPolicies"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/accesspolicy#create_an_accesspolicy\n\n\n## Permissions: \nspecifies the access rights the client has when interacting with the Asset. Valid values are:\n\n- None = 0 (default)\n- Read = 1\n- Write = 2\n- Delete = 4\n- List = 8"
- },
- "response": []
- },
- {
- "name": "List AccessPolicies",
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/AccessPolicies",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "AccessPolicies"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/accesspolicy#list_accesspolicies"
- },
- "response": []
- },
- {
- "name": "Delete AccessPolicy",
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "DELETE",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
-                                        "raw": "{{RESTAPIEndpoint}}/AccessPolicies('{{accessPolicyId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
-                                            "AccessPolicies('{{accessPolicyId}}')"
- ]
- },
-                                    "description": "https://docs.microsoft.com/rest/api/media/operations/accesspolicy#delete_an_accesspolicy"
- },
- "response": []
- }
- ]
- },
- {
- "name": "Assets",
- "description": "",
- "item": [
- {
- "name": "Get Assets",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets"
- ]
- },
- "description": "List Assets\nThe Asset entity contains digital files (including video, audio, images, thumbnail collections, text tracks and closed caption files) and the metadata about these files. After the digital files are uploaded into an asset, they could be used in the Media Services encoding and streaming workflows.\n\nAsset Entity REST API - https://msdn.microsoft.com/library/azure/hh974277.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Get Specific Asset ID",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets('{{LastAssetId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets('{{LastAssetId}}')"
- ]
- },
- "description": ""
- },
- "response": []
- },
- {
- "name": "List Locators on a Specific Asset",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets('{{LastAssetId}}')/Locators",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets('{{LastAssetId}}')",
- "Locators"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/locator#list_locators\n"
- },
- "response": []
- },
- {
- "name": "Get Assets ($top)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets?$top=1",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets"
- ],
- "query": [
- {
- "key": "$top",
- "value": "1",
- "equals": true
- }
- ]
- },
- "description": "List Assets $top = value"
- },
- "response": []
- },
- {
- "name": "Get Assets ($skip)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets?$skip=1",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets"
- ],
- "query": [
- {
- "key": "$skip",
- "value": "1",
- "equals": true
- }
- ]
- },
- "description": "List Assets"
- },
- "response": []
- },
- {
- "name": "Get Assets ($inlinecount=allpages)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;",
- "tests[\"Has count\"] = jsonData.odata_count !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets?$inlinecount=allpages",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets"
- ],
- "query": [
- {
- "key": "$inlinecount",
- "value": "allpages",
- "equals": true
- }
- ]
- },
- "description": "List Assets with page count"
- },
- "response": []
- },
- {
- "name": "Create Asset",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "var json = JSON.parse(responseBody);",
- "postman.setEnvironmentVariable(\"LastAssetId\", json.Id);"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"Name\": \"Hello from Postman\" \n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets"
- ]
- },
- "description": "Create Assets\nThe Asset entity contains digital files (including video, audio, images, thumbnail collections, text tracks and closed caption files) and the metadata about these files. After the digital files are uploaded into an asset, they could be used in the Media Services encoding and streaming workflows.\n\n[Asset Entity REST API](/rest/api/media/operations/asset)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Delete Asset",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 204 No Content\"] = responseCode.code === 204;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "DELETE",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets('{{LastAssetId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets('{{LastAssetId}}')"
- ]
- },
-                            "description": "Delete Asset\nThe Asset entity contains digital files (including video, audio, images, thumbnail collections, text tracks and closed caption files) and the metadata about these files. After the digital files are uploaded into an asset, they can be used in the Media Services encoding and streaming workflows.\n\nUse the global variable from the Create Asset POST to delete the asset. It is stored in the LastAssetId environment variable.\n\nAsset Entity REST API - https://msdn.microsoft.com/library/azure/hh974277.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Update Asset copy",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 204\"] = responseCode.code === 204;",
- "",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "PATCH",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
-                                    "raw": "{\n    \"Name\": \"Hello - this is my new name\"\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets('{{LastAssetId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets('{{LastAssetId}}')"
- ]
- },
- "description": "Update Assets\nThe Asset entity contains digital files (including video, audio, images, thumbnail collections, text tracks and closed caption files) and the metadata about these files. After the digital files are uploaded into an asset, they could be used in the Media Services encoding and streaming workflows.\n\nAsset Entity REST API - https://msdn.microsoft.com/library/azure/hh974277.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Get Assets ($filter=startswith)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets?$top=20&$filter=startswith(Name,'Holo')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets"
- ],
- "query": [
- {
- "key": "$top",
- "value": "20",
- "equals": true
- },
- {
- "key": "$filter",
- "value": "startswith(Name,'Holo')",
- "equals": true
- }
- ]
- },
- "description": "List Assets and filter by startswith"
- },
- "response": []
- },
- {
- "name": "Get Assets ($filter) indexof",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets?$top=20&$filter=indexof(Name,'Holo') gt 1",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets"
- ],
- "query": [
- {
- "key": "$top",
- "value": "20",
- "equals": true
- },
- {
- "key": "$filter",
- "value": "indexof(Name,'Holo') gt 1",
- "equals": true
- }
- ]
- },
- "description": "List Assets $filter and indexof"
- },
- "response": []
- }
- ]
- },
- {
- "name": "AssetFiles",
- "description": "",
- "item": [
- {
- "name": "CreateFileInfos",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 204 No Content\"] = responseCode.code === 204;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/CreateFileInfos?assetid='{{LastAssetId}}'",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "CreateFileInfos"
- ],
- "query": [
- {
- "key": "assetid",
- "value": "'{{LastAssetId}}'",
- "equals": true
- }
- ]
- },
- "description": "Create Asset Files\nTo create the asset files on an asset, you have to use the CreateFileInfos function.\nA File entity is created using the CreateFileInfos function and passing in the Asset Id that is associated with the media file you uploaded into blob storage. For more information, see Upload a file to blob storage.\nhttps://msdn.microsoft.com/library/azure/jj683097.aspx\n\nAssetFile Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974275.aspx \n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Get Asset Files",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Response has a value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Assets('{{LastAssetId}}')/Files",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Assets('{{LastAssetId}}')",
- "Files"
- ]
- },
- "description": "Get Asset Files\n\nAssetFile Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974275.aspx \n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Update Asset File",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 204 No Content\"] = responseCode.code === 204;",
- ""
- ]
- }
- }
- ],
- "request": {
- "method": "PATCH",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Authorization",
- "value": "Bearer {{AccessToken}}"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
-                                "raw": "{\n    \"MimeType\": \"video/mp4\"\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Files('nb:cid:UUID:5710445d-1500-80c4-bc75-f1e5c3a6141b')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Files('nb:cid:UUID:5710445d-1500-80c4-bc75-f1e5c3a6141b')"
- ]
- },
- "description": "Update an Asset Files\n\nAssetFile Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974275.aspx \n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- }
- ]
- },
- {
- "name": "Channels",
- "description": "",
- "item": [
- {
- "name": "Get Channels ",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels"
- ]
- },
- "description": "List Channels\n\nThe Channel entity represents a pipeline for processing live streaming content.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Create Channel (Pass-Through)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "postman.setEnvironmentVariable(\"LastChannelId\", jsonData.Id);"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"postman-test-channel-1\",\r\n \"Description\": \"My channel description for use in discovery\",\r\n \"Input\": {\r\n \"KeyFrameInterval\": null,\r\n \"StreamingProtocol\": \"FragmentedMP4\",\r\n \"AccessControl\": {\r\n \"IP\": {\r\n \"Allow\": [\r\n {\r\n \"Name\": \"Allow All\",\r\n \"Address\": \"0.0.0.0\",\r\n \"SubnetPrefixLength\": 0\r\n }\r\n ]\r\n }\r\n },\r\n \"Endpoints\": []\r\n },\r\n \"Preview\": {\r\n \"AccessControl\": {\r\n \"IP\": {\r\n \"Allow\": [\r\n {\r\n \"Name\": \"Allow All\",\r\n \"Address\": \"0.0.0.0\",\r\n \"SubnetPrefixLength\": 0\r\n }\r\n ]\r\n }\r\n },\r\n \"Endpoints\": []\r\n },\r\n \"Output\": {\r\n \"Hls\": {\r\n \"FragmentsPerSegment\": 1\r\n }\r\n },\r\n \"CrossSiteAccessPolicies\": {\r\n \"ClientAccessPolicy\": null,\r\n \"CrossDomainPolicy\": null\r\n }\r\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels"
- ]
- },
- "description": "Create Channel\n\nChannels can be created using a POST HTTP request and specifying property values.\n\nIf successful, a 202 Accepted status code is returned along with a representation of the created entity in the response body. \n\nThe 202 Accepted status code indicates an asynchronous operation, in which case the operation-id header value is also provided for use in polling and tracking the status of long-running operations, such as starting or stopping a Channel. Pass the operation-id header value into the Operation Entity to retrieve the status. For more information, see Manually Polling Long-Running Operations.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Create Channel (Encoding old)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"postman-encoding-channel\",\r\n \"Description\": \"My description of my channel for discovery\",\r\n \"EncodingType\": \"Standard\",\r\n \"Encoding\": {\r\n \"SystemPreset\": \"Default720p\",\r\n \"IgnoreCea708ClosedCaptions\": false,\r\n \"AdMarkerSource\": \"Api\",\r\n \"VideoStream\": {\r\n \"Index\": 1,\r\n \"Name\": \"Video stream\"\r\n },\r\n \"AudioStreams\": [\r\n {\r\n \"Index\": 0,\r\n \"Name\": \"English audio stream\",\r\n \"Language\": \"ENG\"\r\n }\r\n ]\r\n },\r\n \"Input\": {\r\n \"KeyFrameInterval\": null,\r\n \"StreamingProtocol\": \"FragmentedMP4\",\r\n \"AccessControl\": {\r\n \"IP\": {\r\n \"Allow\": [\r\n {\r\n \"Name\": \"testName1\",\r\n \"Address\": \"1.1.1.1\",\r\n \"SubnetPrefixLength\": 24\r\n }\r\n ]\r\n }\r\n },\r\n \"Endpoints\": []\r\n },\r\n \"Preview\": {\r\n \"AccessControl\": {\r\n \"IP\": {\r\n \"Allow\": [\r\n {\r\n \"Name\": \"testName1\",\r\n \"Address\": \"1.1.1.1\",\r\n \"SubnetPrefixLength\": 24\r\n }\r\n ]\r\n }\r\n },\r\n \"Endpoints\": []\r\n },\r\n \"Output\": {\r\n \"Hls\": {\r\n \"FragmentsPerSegment\": 1\r\n }\r\n }\r\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels"
- ]
- },
- "description": "Create Channel with Encoding\n\nChannels can be created using a POST HTTP request and specifying property values.\n\nIf successful, a 202 Accepted status code is returned along with a representation of the created entity in the response body. \n\nThe 202 Accepted status code indicates an asynchronous operation, in which case the operation-id header value is also provided for use in polling and tracking the status of long-running operations, such as starting or stopping a Channel. Pass the operation-id header value into the Operation Entity to retrieve the status. For more information, see Manually Polling Long-Running Operations.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Create Channel (New Encoder)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"CoolNewEncoderChannel\",\r\n \"Description\": \"My description of my channel\",\r\n \"EncodingType\": \"Basic\",\r\n \"Encoding\": null,\r\n \"Slate\": null,\r\n \"Input\": {\r\n \"KeyFrameInterval\": null,\r\n \"StreamingProtocol\": \"RTMP\",\r\n \"AccessControl\": {\r\n \"IP\": {\r\n \"Allow\": [\r\n {\r\n \"Name\": \"testName1\",\r\n \"Address\": \"1.1.1.1\",\r\n \"SubnetPrefixLength\": 24\r\n }\r\n ]\r\n }\r\n },\r\n \"Endpoints\": []\r\n },\r\n \"Preview\": {\r\n \"AccessControl\": {\r\n \"IP\": {\r\n \"Allow\": [\r\n {\r\n \"Name\": \"testName1\",\r\n \"Address\": \"1.1.1.1\",\r\n \"SubnetPrefixLength\": 24\r\n }\r\n ]\r\n }\r\n },\r\n \"Endpoints\": []\r\n },\r\n \"Output\": {\r\n \"Hls\": {\r\n \"FragmentsPerSegment\": null\r\n }\r\n },\r\n \"CrossSiteAccessPolicies\": {\r\n \"ClientAccessPolicy\": null,\r\n \"CrossDomainPolicy\": null\r\n }\r\n}\r\n"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels"
- ]
- },
- "description": "Create Channel with Encoding\n\nChannels can be created using a POST HTTP request and specifying property values.\n\nIf successful, a 202 Accepted status code is returned along with a representation of the created entity in the response body. \n\nThe 202 Accepted status code indicates an asynchronous operation, in which case the operation-id header value is also provided for use in polling and tracking the status of long-running operations, such as starting or stopping a Channel. Pass the operation-id header value into the Operation Entity to retrieve the status. For more information, see Manually Polling Long-Running Operations.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Delete Channel",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
-                                            "tests[\"Status code is 202\"] = responseCode.code === 202;",
-                                            ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "DELETE",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels('nb:chid:UUID:27ff0843-abae-4261-b46e-0558efc21f82')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels('nb:chid:UUID:27ff0843-abae-4261-b46e-0558efc21f82')"
- ]
- },
- "description": "Delete Channels\n\nDelete the Channel entity\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Start Channel",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels('nb:chid:UUID:27ff0843-abae-4261-b46e-0558efc21f82')/Start",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels('nb:chid:UUID:27ff0843-abae-4261-b46e-0558efc21f82')",
- "Start"
- ]
- },
- "description": "Start a Channel\n\nThe Channel entity represents a pipeline for processing live streaming content.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Stop Channel",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels('nb:chid:UUID:27ff0843-abae-4261-b46e-0558efc21f82')/Stop",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels('nb:chid:UUID:27ff0843-abae-4261-b46e-0558efc21f82')",
- "Stop"
- ]
- },
- "description": "Stop a Channel\n\nThe Channel entity represents a pipeline for processing live streaming content.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Reset Channel",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels('{{LastChannelId}}')/Reset",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels('{{LastChannelId}}')",
- "Reset"
- ]
- },
- "description": "Reset a Channel\n\nThe Channel entity represents a pipeline for processing live streaming content.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Update Channel",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
-                                            "tests[\"Status code is 202\"] = responseCode.code === 202;",
-                                            "",
-                                            "var jsonData = JSON.parse(responseBody);",
-                                            "postman.setEnvironmentVariable(\"LastChannelId\", jsonData.Id);"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "PATCH",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;odata=minimalmetadata"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=minimalmetadata"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
-                                    "raw": "{\"Encoding\":{\"IgnoreCea708ClosedCaptions\": true}}"
- },
- "url": {
-                                    "raw": "{{RESTAPIEndpoint}}/Channels('{{LastChannelId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
-                                        "Channels('{{LastChannelId}}')"
- ]
- },
- "description": "Update Channel\n\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Start Advertisement",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"duration\":\"PT45S\",\r\n \"cueId\":\"67520935\",\r\n \"showSlate\":\"true\"\r\n}\r\n"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels('{{LastChannelId}}')/StartAdvertisement",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels('{{LastChannelId}}')",
- "StartAdvertisement"
- ]
- },
-                            "description": "Start a Channel Ad Break\n\nThe live encoder can be signaled to start an advertisement or commercial break using a POST HTTP request, specifying property values of the StartAdvertisement entity in the body of the request.\n\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "End Advertisement",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels('{{LastChannelId}}')/EndAdvertisement",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels('{{LastChannelId}}')",
- "EndAdvertisement"
- ]
- },
-                            "description": "End a Channel Ad Break\n\nThe live encoder can be signaled to end an on-going advertisement or commercial break using a POST HTTP request.\n\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Show Slate (use Default)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels('{{LastChannelId}}')/ShowSlate",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels('{{LastChannelId}}')",
- "ShowSlate"
- ]
- },
- "description": "Show Slate\n\nIndicates to the live encoder within the Channel that it needs to switch to the default slate image during the commercial break (and mask the incoming video feed). Default is false. The image used will be the one specified via the default slate asset Id property at the time of the channel creation. \n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Show Slate (use Asset ID) ",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"duration\":\"PT45S\",\r\n \"assetId\":\"nb:cid:UUID:01234567-ABCD-ABCD-EFEF-01234567\"\r\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels('{{LastChannelId}}')/ShowSlate",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels('{{LastChannelId}}')",
- "ShowSlate"
- ]
- },
-                            "description": "Show Slate\n\nIndicates to the live encoder within the Channel that it needs to switch to the specified slate image during the commercial break (and mask the incoming video feed). The image used is the asset specified via the assetId in the body of the request, shown for the given duration.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Hide Slate",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Channels('{{LastChannelId}}')/HideSlate",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Channels('{{LastChannelId}}')",
- "HideSlate"
- ]
- },
- "description": "Hide Slate\n\nThe live encoder can be signaled to end an on-going slate using a POST HTTP request.\n\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- }
- ]
- },
- {
- "name": "Filter",
- "description": "https://docs.microsoft.com/rest/api/media/operations/filter#filter_properties\n",
- "item": [
- {
- "name": "List Filters",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Filters",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Filters"
- ]
- },
- "description": "List Filters\n\n[List Filters documentation](/rest/api/media/operations/filter#list_filters)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Delete Filter",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "DELETE",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Filters('{{filterId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Filters('{{filterId}}')"
- ]
- },
- "description": "Delete Filters\n\n[Delete Filters documentation](/rest/api/media/operations/filter#delete_a_filter)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Create Filter",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{ \r\n \"Name\":\"Mobile\", \r\n \"PresentationTimeRange\":{ \r\n \"StartTimestamp\":\"0\", \r\n \"EndTimestamp\":\"9223372036854775807\", \r\n \"PresentationWindowDuration\":\"12000000000\", \r\n \"LiveBackoffDuration\":\"0\", \r\n \"Timescale\":\"10000000\" \r\n }, \r\n \"Tracks\":[ \r\n { \r\n \"PropertyConditions\":[ \r\n { \r\n \"Property\":\"Type\", \r\n \"Value\":\"video\", \r\n \"Operator\":\"Equal\" \r\n }, \r\n { \r\n \"Property\":\"Bitrate\", \r\n \"Value\":\"550000-1350000\", \r\n \"Operator\":\"Equal\" \r\n } \r\n ] \r\n } \r\n ] \r\n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Filters",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Filters"
- ]
- },
- "description": "Create Filter\n\n[Create Filter documentation](/rest/api/media/operations/filter#create_a_filter)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Update Filter",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{ \r\n \"Tracks\":[ \r\n { \r\n \"PropertyConditions\": \r\n [ \r\n { \r\n \"Property\":\"Type\", \r\n \"Value\":\"audio\", \r\n \"Operator\":\"Equal\" \r\n }, \r\n { \r\n \"Property\":\"Bitrate\", \r\n \"Value\":\"0-2147483647\", \r\n \"Operator\":\"Equal\" \r\n } \r\n ] \r\n } \r\n ] \r\n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Filters",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Filters"
- ]
- },
- "description": "Create Filter\n\n[Create Filter documentation](/rest/api/media/operations/filter#create_a_filter)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- }
- ]
- },
- {
- "name": "AssetFilters",
- "description": "https://docs.microsoft.com/rest/api/media/operations/assetfilter",
- "item": [
- {
- "name": "List AssetFilters",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Filters",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Filters"
- ]
- },
- "description": "List Filters\n\n[List Filters documentation](/rest/api/media/operations/filter#list_filters)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Delete AssetFilter",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "DELETE",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Filters('{{filterId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Filters('{{filterId}}')"
- ]
- },
- "description": "Delete AssetFilter\n\n[Delete AssetFilter documentation](/rest/api/media/operations/assetfilter#delete_a_filter)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Create AssetFilter",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{ \r\n \"Name\":\"TestFilter\", \r\n \"ParentAssetId\":\"nb:cid:UUID:536e555d-1500-80c3-92dc-f1e4fdc6c592\", \r\n \"PresentationTimeRange\":{ \r\n \"StartTimestamp\":\"0\", \r\n \"EndTimestamp\":\"9223372036854775807\", \r\n \"PresentationWindowDuration\":\"12000000000\", \r\n \"LiveBackoffDuration\":\"0\", \r\n \"Timescale\":\"10000000\" \r\n }, \r\n \"Tracks\":[ \r\n { \r\n \"PropertyConditions\": \r\n [ \r\n { \r\n \"Property\":\"Type\", \r\n \"Value\":\"audio\", \r\n \"Operator\":\"Equal\" \r\n }, \r\n { \r\n \"Property\":\"Bitrate\", \r\n \"Value\":\"0-2147483647\", \r\n \"Operator\":\"Equal\" \r\n } \r\n ] \r\n } \r\n ] \r\n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Filters",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Filters"
- ]
- },
- "description": "Create AssetFilter\n\n[Create AssetFilter documentation](/rest/api/media/operations/assetfilter#create_a_filter)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Update AssetFilter",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{ \r\n \"Tracks\":[ \r\n { \r\n \"PropertyConditions\": \r\n [ \r\n { \r\n \"Property\":\"Type\", \r\n \"Value\":\"audio\", \r\n \"Operator\":\"Equal\" \r\n }, \r\n { \r\n \"Property\":\"Bitrate\", \r\n \"Value\":\"0-2147483647\", \r\n \"Operator\":\"Equal\" \r\n } \r\n ] \r\n } \r\n ] \r\n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Filters",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Filters"
- ]
- },
- "description": "Update AssetFilter\n\n[Update AssetFilter documentation](/rest/api/media/operations/assetfilter#update_a_filter)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- }
- ]
- },
- {
- "name": "Functions",
- "description": "REST API Functions\nhttps://msdn.microsoft.com/library/azure/jj683097.aspx\n",
- "item": [
- {
- "name": "CreateFileInfos Function",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 204 No Content\"] = responseCode.code === 204;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/CreateFileInfos?assetid='{{LastAssetId}}'",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "CreateFileInfos"
- ],
- "query": [
- {
- "key": "assetid",
- "value": "'{{LastAssetId}}'",
- "equals": true
- }
- ]
- },
- "description": "Create Asset Files\nTo create the asset files on an asset, you have to use the CreateFileInfos function.\nA File entity is created using the CreateFileInfos function and passing in the Asset Id that is associated with the media file you uploaded into blob storage. For more information, see Upload a file to blob storage.\nhttps://msdn.microsoft.com/library/azure/jj683097.aspx\n\nAssetFile Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974275.aspx \n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- }
- ]
- },
- {
- "name": "Jobs",
- "description": "",
- "item": [
- {
- "name": "Create Job",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 201 Created\"] = responseCode.code === 201;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"NewTestJob\",\r\n \"InputMediaAssets\": [{\r\n \"__metadata\": {\r\n \"uri\": \"{{RESTAPIEndpoint}}/Assets('nb:cid:UUID:5710445d-1500-80c4-1256-f1e5b3ff8536')\"\r\n }\r\n }],\r\n \"Tasks\": [{\r\n \"Configuration\": \"H264 Multiple Bitrate 720p\",\r\n \"MediaProcessorId\": \"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56\",\r\n \"TaskBody\": \"<?xml version=\\\"1.0\\\" encoding=\\\"utf-8\\\"?>\r\n <taskBody>\r\n <inputAsset>JobInputAsset(0)</inputAsset>\r\n <outputAsset assetName=\\\"foobar.mp4\\\">JobOutputAsset(0)</outputAsset>\r\n </taskBody>\"\r\n }]\r\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs"
- ]
- },
- "description": "Create Job\nA job is an entity that contains metadata about a set of tasks. Each task performs an atomic operation on the input asset(s). A job is typically used to process one audio/video presentation. If you are processing multiple videos, create a job for each video to be encoded. \n\n NOTE: It is very important to use the JSON Verbose Accept header for the Job to submit properly. Set the Accept header to application/json;odata=verbose\n \nThis sample creates a Job with Azure Media Encoder Standard - nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56\n\nJob Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974289.aspx\n\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Create Job (Encode to Multibitrate)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 201 Created\"] = responseCode.code === 201;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"NewTestJob\",\r\n \"InputMediaAssets\": [{\r\n \"__metadata\": {\r\n \"uri\": \"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Assets('nb:cid:UUID:847dcc53-6f4a-4bc1-925b-96538a11b8e3')\"\r\n }\r\n }],\r\n \"Tasks\": [{\r\n \"Configuration\": \"H264 Multiple Bitrate 720p\",\r\n \"MediaProcessorId\": \"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56\",\r\n \"TaskBody\": \"<?xml version=\\\"1.0\\\" encoding=\\\"utf-8\\\"?>\r\n <taskBody>\r\n <inputAsset>JobInputAsset(0)</inputAsset>\r\n <outputAsset assetName=\\\"foobar.mp4\\\">JobOutputAsset(0)</outputAsset>\r\n </taskBody>\"\r\n }]\r\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs"
- ]
- },
- "description": "Create Job\nA job is an entity that contains metadata about a set of tasks. Each task performs an atomic operation on the input asset(s). A job is typically used to process one audio/video presentation. If you are processing multiple videos, create a job for each video to be encoded. \n\n NOTE: It is very important to use the JSON Verbose Accept header for the Job to submit properly. Set the Accept header to application/json;odata=verbose\n \nThis sample creates a Job with Azure Media Encoder Standard - nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56\n\nJob Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974289.aspx\n\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": [
- {
- "id": "6d532578-c0aa-4dc5-aa06-6ed4573639d4",
- "name": "Create Job (Encode to Multibitrate)",
- "originalRequest": {
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"NewTestJob\",\r\n \"InputMediaAssets\": [{\r\n \"__metadata\": {\r\n \"uri\": \"{{RESTAPIEndpoint}}/Assets('nb:cid:UUID:2a34b997-c651-4b68-b6bc-533ee9f9501e')\"\r\n }\r\n }],\r\n \"Tasks\": [{\r\n \"Configuration\": \"Content Adaptive Multiple Bitrate MP4\",\r\n \"MediaProcessorId\": \"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56\",\r\n \"TaskBody\": \"<?xml version=\\\"1.0\\\" encoding=\\\"utf-8\\\"?>\r\n <taskBody>\r\n <inputAsset>JobInputAsset(0)</inputAsset>\r\n <outputAsset assetName=\\\"foobar.mp4\\\">JobOutputAsset(0)</outputAsset>\r\n </taskBody>\"\r\n }]\r\n}\r\n"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs"
- ]
- }
- },
- "status": "Created",
- "code": 201,
- "_postman_previewlanguage": "json",
- "_postman_previewtype": "text",
- "header": [
- {
- "key": "Cache-Control",
- "value": "no-cache",
- "name": "Cache-Control",
- "description": "Tells all caching mechanisms from server to client whether they may cache this object. It is measured in seconds"
- },
- {
- "key": "Content-Length",
- "value": "1278",
- "name": "Content-Length",
- "description": "The length of the response body in octets (8-bit bytes)"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=verbose;charset=utf-8",
- "name": "Content-Type",
- "description": "The mime type of this content"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0;",
- "name": "DataServiceVersion",
- "description": "Custom header"
- },
- {
- "key": "Date",
- "value": "Thu, 09 Nov 2017 16:34:02 GMT",
- "name": "Date",
- "description": "The date and time that the message was sent"
- },
- {
- "key": "Location",
- "value": "https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')",
- "name": "Location",
- "description": "Used in redirection, or when a new resource has been created."
- },
- {
- "key": "Server",
- "value": "Microsoft-IIS/8.5",
- "name": "Server",
- "description": "A name for the server"
- },
- {
- "key": "Strict-Transport-Security",
- "value": "max-age=31536000; includeSubDomains",
- "name": "Strict-Transport-Security",
- "description": "A HSTS Policy informing the HTTP client how long to cache the HTTPS only policy and whether this applies to subdomains."
- },
- {
- "key": "X-Content-Type-Options",
- "value": "nosniff",
- "name": "X-Content-Type-Options",
- "description": "The only defined value, \"nosniff\", prevents Internet Explorer from MIME-sniffing a response away from the declared content-type"
- },
- {
- "key": "X-Powered-By",
- "value": "ASP.NET",
- "name": "X-Powered-By",
- "description": "Specifies the technology (ASP.NET, PHP, JBoss, e.g.) supporting the web application (version details are often in X-Runtime, X-Version, or X-AspNet-Version)"
- },
- {
- "key": "access-control-expose-headers",
- "value": "request-id, x-ms-request-id",
- "name": "access-control-expose-headers",
- "description": "Lets a server allowlist headers that browsers are allowed to access."
- },
- {
- "key": "request-id",
- "value": "adfd5998-a8f8-44e2-ac00-4b8be807230a",
- "name": "request-id",
- "description": "Custom header"
- },
- {
- "key": "x-ms-request-id",
- "value": "adfd5998-a8f8-44e2-ac00-4b8be807230a",
- "name": "x-ms-request-id",
- "description": "Custom header"
- }
- ],
- "cookie": [],
- "responseTime": 895,
- "body": "{\"d\":{\"__metadata\":{\"id\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')\",\"uri\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')\",\"type\":\"Microsoft.Cloud.Media.Vod.Rest.Data.Models.Job\"},\"Tasks\":{\"__deferred\":{\"uri\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')/Tasks\"}},\"OutputMediaAssets\":{\"__deferred\":{\"uri\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')/OutputMediaAssets\"}},\"InputMediaAssets\":{\"__deferred\":{\"uri\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')/InputMediaAssets\"}},\"Id\":\"nb:jid:UUID:2470355d-1500-80c0-97f2-f1e7c56bca63\",\"Name\":\"NewTestJob\",\"Created\":\"2017-11-09T16:34:01.578661Z\",\"LastModified\":\"2017-11-09T16:34:01.578661Z\",\"EndTime\":null,\"Priority\":0,\"RunningDuration\":0,\"StartTime\":null,\"State\":0,\"TemplateId\":null,\"JobNotificationSubscriptions\":{\"__metadata\":{\"type\":\"Collection(Microsoft.Cloud.Media.Vod.Rest.Data.Models.JobNotificationSubscription)\"},\"results\":[]}}}"
- }
- ]
- },
- {
- "name": "Create Job (Encode with custom settings)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 201 Created\"] = responseCode.code === 201;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"Custom Encoding Job with Thumbnails\",\r\n \"InputMediaAssets\": [{\r\n \"__metadata\": {\r\n \"uri\": \"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Assets('nb:cid:UUID:847dcc53-6f4a-4bc1-925b-96538a11b8e3')\"\r\n }\r\n }],\r\n \"Tasks\": [{\r\n \"Configuration\": \"{\r\n 'Version': 1.0,\r\n 'Codecs': [\r\n {\r\n 'KeyFrameInterval': '00:00:02',\r\n 'SceneChangeDetection': 'true',\r\n 'H264Layers': [\r\n {\r\n 'Profile': 'Auto',\r\n 'Level': 'auto',\r\n 'Bitrate': 4500,\r\n 'MaxBitrate': 4500,\r\n 'BufferWindow': '00:00:05',\r\n 'Width': 1280,\r\n 'Height': 720,\r\n 'ReferenceFrames': 3,\r\n 'EntropyMode': 'Cabac',\r\n 'AdaptiveBFrame': true,\r\n 'Type': 'H264Layer',\r\n 'FrameRate': '0/1'\r\n\r\n }\r\n ],\r\n 'Type': 'H264Video'\r\n },\r\n {\r\n 'JpgLayers': [\r\n {\r\n 'Quality': 90,\r\n 'Type': 'JpgLayer',\r\n 'Width': '100%',\r\n 'Height': '100%'\r\n }\r\n ],\r\n 'Start': '{Best}',\r\n 'Type': 'JpgImage'\r\n },\r\n {\r\n 'Channels': 2,\r\n 'SamplingRate': 48000,\r\n 'Bitrate': 128,\r\n 'Type': 'AACAudio'\r\n }\r\n ],\r\n 'Outputs': [\r\n {\r\n 'FileName': '{Basename}_{Index}{Extension}',\r\n 'Format': {\r\n 'Type': 'JpgFormat'\r\n }\r\n },\r\n {\r\n 'FileName': '{Basename}_{Resolution}_{VideoBitrate}.mp4',\r\n 'Format': {\r\n 'Type': 'MP4Format'\r\n }\r\n }\r\n ]\r\n}\",\r\n \"MediaProcessorId\": \"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56\",\r\n \"TaskBody\": \"<?xml version=\\\"1.0\\\" encoding=\\\"utf-8\\\"?>\r\n <taskBody>\r\n <inputAsset>JobInputAsset(0)</inputAsset>\r\n <outputAsset assetName=\\\"foobar.mp4\\\">JobOutputAsset(0)</outputAsset>\r\n </taskBody>\"\r\n }]\r\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs"
- ]
- },
- "description": "Create Job\nA job is an entity that contains metadata about a set of tasks. Each task performs an atomic operation on the input asset(s). A job is typically used to process one audio/video presentation. If you are processing multiple videos, create a job for each video to be encoded. \n\n NOTE: It is very important to use the JSON Verbose Accept header for the Job to submit properly. Set the Accept header to application/json;odata=verbose\n \nThis sample creates a Job with a custom encoding profile. For details see [Customizing Media Encoder Standard presets](./media-services-custom-mes-presets-with-dotnet.md)\nOr for JSON samples of our system presets, see the [Sample Presets page](./media-services-mes-presets-overview.md)\n\n[Job Entity REST API](/rest/api/media/operations/job)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": [
- {
- "id": "a9740355-96df-4925-8d63-52735e5a6d37",
- "name": "Create Job (Encode to Multibitrate)",
- "originalRequest": {
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"NewTestJob\",\r\n \"InputMediaAssets\": [{\r\n \"__metadata\": {\r\n \"uri\": \"{{RESTAPIEndpoint}}/Assets('nb:cid:UUID:2a34b997-c651-4b68-b6bc-533ee9f9501e')\"\r\n }\r\n }],\r\n \"Tasks\": [{\r\n \"Configuration\": \"Content Adaptive Multiple Bitrate MP4\",\r\n \"MediaProcessorId\": \"nb:mpid:UUID:ff4df607-d419-42f0-bc17-a481b1331e56\",\r\n \"TaskBody\": \"<?xml version=\\\"1.0\\\" encoding=\\\"utf-8\\\"?>\r\n <taskBody>\r\n <inputAsset>JobInputAsset(0)</inputAsset>\r\n <outputAsset assetName=\\\"foobar.mp4\\\">JobOutputAsset(0)</outputAsset>\r\n </taskBody>\"\r\n }]\r\n}\r\n"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs"
- ]
- }
- },
- "status": "Created",
- "code": 201,
- "_postman_previewlanguage": "json",
- "_postman_previewtype": "text",
- "header": [
- {
- "key": "Cache-Control",
- "value": "no-cache",
- "name": "Cache-Control",
- "description": "Tells all caching mechanisms from server to client whether they may cache this object. It is measured in seconds"
- },
- {
- "key": "Content-Length",
- "value": "1278",
- "name": "Content-Length",
- "description": "The length of the response body in octets (8-bit bytes)"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=verbose;charset=utf-8",
- "name": "Content-Type",
- "description": "The mime type of this content"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0;",
- "name": "DataServiceVersion",
- "description": "Custom header"
- },
- {
- "key": "Date",
- "value": "Thu, 09 Nov 2017 16:34:02 GMT",
- "name": "Date",
- "description": "The date and time that the message was sent"
- },
- {
- "key": "Location",
- "value": "https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')",
- "name": "Location",
- "description": "Used in redirection, or when a new resource has been created."
- },
- {
- "key": "Server",
- "value": "Microsoft-IIS/8.5",
- "name": "Server",
- "description": "A name for the server"
- },
- {
- "key": "Strict-Transport-Security",
- "value": "max-age=31536000; includeSubDomains",
- "name": "Strict-Transport-Security",
- "description": "A HSTS Policy informing the HTTP client how long to cache the HTTPS only policy and whether this applies to subdomains."
- },
- {
- "key": "X-Content-Type-Options",
- "value": "nosniff",
- "name": "X-Content-Type-Options",
- "description": "The only defined value, \"nosniff\", prevents Internet Explorer from MIME-sniffing a response away from the declared content-type"
- },
- {
- "key": "X-Powered-By",
- "value": "ASP.NET",
- "name": "X-Powered-By",
- "description": "Specifies the technology (ASP.NET, PHP, JBoss, e.g.) supporting the web application (version details are often in X-Runtime, X-Version, or X-AspNet-Version)"
- },
- {
- "key": "access-control-expose-headers",
- "value": "request-id, x-ms-request-id",
- "name": "access-control-expose-headers",
- "description": "Lets a server allowlist headers that browsers are allowed to access."
- },
- {
- "key": "request-id",
- "value": "adfd5998-a8f8-44e2-ac00-4b8be807230a",
- "name": "request-id",
- "description": "Custom header"
- },
- {
- "key": "x-ms-request-id",
- "value": "adfd5998-a8f8-44e2-ac00-4b8be807230a",
- "name": "x-ms-request-id",
- "description": "Custom header"
- }
- ],
- "cookie": [],
- "responseTime": 895,
- "body": "{\"d\":{\"__metadata\":{\"id\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')\",\"uri\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')\",\"type\":\"Microsoft.Cloud.Media.Vod.Rest.Data.Models.Job\"},\"Tasks\":{\"__deferred\":{\"uri\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')/Tasks\"}},\"OutputMediaAssets\":{\"__deferred\":{\"uri\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')/OutputMediaAssets\"}},\"InputMediaAssets\":{\"__deferred\":{\"uri\":\"https://tvmewest.restv2.westcentralus-2.media.azure.net/api/Jobs('nb%3Ajid%3AUUID%3A2470355d-1500-80c0-97f2-f1e7c56bca63')/InputMediaAssets\"}},\"Id\":\"nb:jid:UUID:2470355d-1500-80c0-97f2-f1e7c56bca63\",\"Name\":\"NewTestJob\",\"Created\":\"2017-11-09T16:34:01.578661Z\",\"LastModified\":\"2017-11-09T16:34:01.578661Z\",\"EndTime\":null,\"Priority\":0,\"RunningDuration\":0,\"StartTime\":null,\"State\":0,\"TemplateId\":null,\"JobNotificationSubscriptions\":{\"__metadata\":{\"type\":\"Collection(Microsoft.Cloud.Media.Vod.Rest.Data.Models.JobNotificationSubscription)\"},\"results\":[]}}}"
- }
- ]
- },
- {
- "name": "Create Indexer Job ",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 201 Created\"] = responseCode.code === 201;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"Indexer v2 Test Job\",\r\n \"InputMediaAssets\": [{\r\n \"__metadata\": {\r\n \"uri\": \"{{RESTAPIEndpoint}}/Assets('nb:cid:UUID:733f8d88-f96b-496c-a46e-38c037b89d48')\"\r\n }\r\n }],\r\n \"Tasks\": [{\r\n \"Configuration\": \"{'Version':'1.0','Features':[{'Options':{'Formats':['WebVtt','TTML'],'Language':'EnUs','Type':'RecoOptions'},'Type':'SpReco'}]}\",\r\n \"MediaProcessorId\": \"nb:mpid:UUID:1927f26d-0aa5-4ca1-95a3-1a3f95b0f706\",\r\n \"TaskBody\": \"<?xml version=\\\"1.0\\\" encoding=\\\"utf-8\\\"?>\r\n <taskBody>\r\n <inputAsset>JobInputAsset(0)</inputAsset>\r\n <outputAsset assetName=\\\"foobar.mp4\\\">JobOutputAsset(0)</outputAsset>\r\n </taskBody>\"\r\n }]\r\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs"
- ]
- },
- "description": "Create Indexer Job\n\nThis job submits a Speech Analytics job with custom configuration settings. \nNote that it uses the Media Processor ID for Indexer: \n\t\"MediaProcessorId\": \"nb:mpid:UUID:1927f26d-0aa5-4ca1-95a3-1a3f95b0f706\"\n\nAnd it uses a custom configuration JSON:\n\t \"Configuration\": \"{'Version':'1.0','Features':[{'Options':{'Formats':['WebVtt','TTML'],'Language':'EnUs','Type':'RecoOptions'},'Type':'SpReco'}]}\",\n\t \nFor details on configuring the settings of the speech analyzer, read the [Indexing Media Files](./media-services-process-content-with-indexer2.md) article\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Create Redactor Job",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 201 Created\"] = responseCode.code === 201;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "Content-Type",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"Indexer v2 Test Job\",\r\n \"InputMediaAssets\": [{\r\n \"__metadata\": {\r\n \"uri\": \"{{RESTAPIEndpoint}}/Assets('nb:cid:UUID:733f8d88-f96b-496c-a46e-38c037b89d48')\"\r\n }\r\n }],\r\n \"Tasks\": [{\r\n \"Configuration\": '{\"Version\":\"1.0\",\"Features\":[{\"Options\":{\"Formats\":[\"WebVtt\",\"TTML\"],\"Language\":\"EnUs\",\"Type\":\"RecoOptions\"},\"Type\":\"SpReco\"}]}',\r\n \"MediaProcessorId\": \"nb:mpid:UUID:1927f26d-0aa5-4ca1-95a3-1a3f95b0f706\",\r\n \"TaskBody\": \"<?xml version=\\\"1.0\\\" encoding=\\\"utf-8\\\"?>\r\n <taskBody>\r\n <inputAsset>JobInputAsset(0)</inputAsset>\r\n <outputAsset assetName=\\\"foobar.mp4\\\">JobOutputAsset(0)</outputAsset>\r\n </taskBody>\"\r\n }]\r\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs"
- ]
- },
- "description": "This example submits a redaction job.\nFor details on configuration settings, refer to the [Redact faces with Azure Media Analytics](./media-services-face-redaction.md) article. \n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "List Jobs",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Response has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs"
- ]
- },
- "description": "List Jobs\nA job is an entity that contains metadata about a set of tasks. Each task performs an atomic operation on the input asset(s). A job is typically used to process one audio/video presentation. If you are processing multiple videos, create a job for each video to be encoded. \n\nJob Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974289.aspx\n\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Get Job",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 204 No Content\"] = responseCode.code === 204;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs('nb:jid:UUID:56debcff-0300-80c0-8bf6-f1e7c1785b5c')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs('nb:jid:UUID:56debcff-0300-80c0-8bf6-f1e7c1785b5c')"
- ]
- },
- "description": "Get Job\nA job is an entity that contains metadata about a set of tasks. Each task performs an atomic operation on the input asset(s). A job is typically used to process one audio/video presentation. If you are processing multiple videos, create a job for each video to be encoded. \n\nJob Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974289.aspx\n\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Get Job State",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 204 No Content\"] = responseCode.code === 204;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs('nb:jid:UUID:bc6e241f-efa9-8a49-acc6-39700110f8d4')?$select=State",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs('nb:jid:UUID:bc6e241f-efa9-8a49-acc6-39700110f8d4')"
- ],
- "query": [
- {
- "key": "$select",
- "value": "State",
- "equals": true
- }
- ]
- },
- "description": "Get Job State\n\nA job is an entity that contains metadata about a set of tasks. Each task performs an atomic operation on the input asset(s). A job is typically used to process one audio/video presentation. If you are processing multiple videos, create a job for each video to be encoded. \n\nJob Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974289.aspx\n\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Get Job State and RunningDuration",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 204 No Content\"] = responseCode.code === 204;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Jobs('nb:jid:UUID:bc6e241f-efa9-8a49-acc6-39700110f8d4')?$select=State, RunningDuration",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Jobs('nb:jid:UUID:bc6e241f-efa9-8a49-acc6-39700110f8d4')"
- ],
- "query": [
- {
- "key": "$select",
- "value": "State, RunningDuration",
- "equals": true
- }
- ]
- },
- "description": "Get Job State and RunningDuration\n\nA job is an entity that contains metadata about a set of tasks. Each task performs an atomic operation on the input asset(s). A job is typically used to process one audio/video presentation. If you are processing multiple videos, create a job for each video to be encoded. \n\nJob Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974289.aspx\n\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- }
- ]
- },
- {
- "name": "Locator",
- "description": "https://docs.microsoft.com/rest/api/media/operations/locator#create_a_locator\n\nLocator provides an entry point to access the files contained in an Asset. An AccessPolicy is used to define the permissions and duration that a client has access to a given Asset. Locators can have a many to one relationship with an AccessPolicy, such that different Locators can provide different start times and connection types to different clients while all using the same permission and duration settings; however, because of a shared access policy restriction set by Azure storage services, you cannot have more than five unique Locators associated with a given Asset at one time. For more information, see Using a Shared Access Signature (REST API).",
- "item": [
- {
- "name": "List Locators",
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Locators",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Locators"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/locator#list_locators"
- },
- "response": []
- },
- {
- "name": "Create SAS Locator",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "var json = JSON.parse(responseBody);",
- "var filename = postman.getEnvironmentVariable(\"MediaFileName\");",
- "postman.setEnvironmentVariable(\"UploadURL\", json.BaseUri + \"/\" + filename + json.ContentAccessComponent);"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"AccessPolicyId\": \"{{LastAccessPolicyId}}\", \n\t\"AssetId\" : \"{{LastAssetId}}\", \n\t\"Type\":1\n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Locators",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Locators"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/locator#list_locators"
- },
- "response": []
- },
- {
- "name": "Create Streaming Locator",
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"AccessPolicyId\": \"nb:pid:UUID:25544a8f-8ccf-43b1-a188-2a860b35bffa\", \n\t\"AssetId\" : \"nb:cid:UUID:d062e5ef-e496-4f21-87e7-17d210628b7c\", \n\t\"StartTime\" : \"2014-05-17T16:45:53\", \n\t\"Type\":2\n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Locators",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Locators"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/locator#list_locators"
- },
- "response": []
- },
- {
- "name": "Update Locator",
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "PATCH",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"StartTime\" : \"2017-10-10T00:00:00\"\n} \n"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Locators('{{locatorId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Locators('{{locatorId}}')"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/locator#list_locators"
- },
- "response": []
- },
- {
- "name": "List AccessPolicies",
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/AccessPolicies",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "AccessPolicies"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/locator#list_locators"
- },
- "response": []
- }
- ]
- },
- {
- "name": "MediaProcessors",
- "description": "",
- "item": [
- {
- "name": "List MediaProcessors",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Response has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/MediaProcessors",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "MediaProcessors"
- ]
- },
- "description": "List MediaProcessors\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- }
- ]
- },
- {
- "name": "NotificationEndPoints",
- "description": "",
- "item": [
- {
- "name": "Create NotificationEndPoint",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 201 Created\"] = responseCode.code === 201;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json;odata=verbose"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": " {\n \"Name\": \"FunctionWebHook\",\n \"EndPointAddress\": \"https://johdeufunctions.azurewebsites.net/api/Notification_Webhook_Function?code=j0txf1f8msjytzvpe40nxbpxdcxtqcgxy0nt\",\n \"EndPointType\": 3\n }"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/NotificationEndPoints",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "NotificationEndPoints"
- ]
- },
- "description": "Create NotificationEndpoint\n\nJob Entity REST API \nhttps://msdn.microsoft.com/library/azure/hh974289.aspx\n\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Get NotificationEndpoints",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200 Ok\"] = responseCode.code === 200;",
- "",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/NotificationEndPoints",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "NotificationEndPoints"
- ]
- },
- "description": "Get NotificationEndpoints\n\nThe endpoint to which the notifications about the job state are sent. Notifications can flow to an Azure Queue or a WebHook \n\nhttps://msdn.microsoft.com/library/azure/dn169055.aspx\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Delete NotificationEndpoint",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 204 No Content\"] = responseCode.code === 204;",
- "",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "DELETE",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/NotificationEndPoints('nb:nepid:UUID:1aa7afea-4bca-445b-82cd-ad6edeeea724')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "NotificationEndPoints('nb:nepid:UUID:1aa7afea-4bca-445b-82cd-ad6edeeea724')"
- ]
- },
- "description": "Delete NotificationEndpoint\n\n\nhttps://msdn.microsoft.com/library/azure/dn169055.aspx\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "List By Filter",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200 Ok\"] = responseCode.code === 200;",
- "",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/NotificationEndPoints?$top=5&$filter=startswith(Name,'WebHook')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "NotificationEndPoints"
- ],
- "query": [
- {
- "key": "$top",
- "value": "5",
- "equals": true
- },
- {
- "key": "$filter",
- "value": "startswith(Name,'WebHook')",
- "equals": true
- }
- ]
- },
- "description": "Get NotificationEndpoints with Filter\n\n\nThe endpoint to which the notifications about the job state are sent. Notifications can flow to an Azure Queue or a WebHook \n\nhttps://msdn.microsoft.com/library/azure/dn169055.aspx\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "List by Type",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200 Ok\"] = responseCode.code === 200;",
- "",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/NotificationEndPoints?$filter=EndPointType eq 3",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "NotificationEndPoints"
- ],
- "query": [
- {
- "key": "$filter",
- "value": "EndPointType eq 3",
- "equals": true
- }
- ]
- },
- "description": "Get NotificationEndpoints by Type\n\n1 = Queue\n2 = Reserved\n3 = WebHook\n\n\nThe endpoint to which the notifications about the job state are sent. Notifications can flow to an Azure Queue or a WebHook \n\nhttps://msdn.microsoft.com/library/azure/dn169055.aspx\n\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- }
- ]
- },
- {
- "name": "Programs",
- "description": "https://docs.microsoft.com/rest/api/media/operations/program#program_properties",
- "item": [
- {
- "name": "Get Programs",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Programs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Programs"
- ]
- },
- "description": "List Programs\n\n[List Programs documentation](/rest/api/media/operations/program#list_programs)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Start Program",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Programs('{{programId}}')/Start",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Programs('{{programId}}')",
- "Start"
- ]
- },
- "description": "Start Programs\n\n[Start a Program documentation](/rest/api/media/operations/program#start_programs)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Delete Program",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "DELETE",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Programs('{{programId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Programs('{{programId}}')"
- ]
- },
- "description": "Delete Programs\n\n[Delete a Program documentation](/rest/api/media/operations/program#delete_programs)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Update Program",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "PATCH",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"ArchiveWindowLength\":\"PT4H\"\n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Programs('{{programId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Programs('{{programId}}')"
- ]
- },
- "description": "Update Programs\n\n[Update a Program documentation](/rest/api/media/operations/program#update_programs)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Stop Program",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Programs('{{programId}}')/Stop",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Programs('{{programId}}')",
- "Stop"
- ]
- },
-                        "description": "Stop Programs\n\n[Stop a Program documentation](/rest/api/media/operations/program#stop_programs)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Create Program",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"Name\":\"testprogram001\",\n\t\"Description\":\"\",\n\t\"ChannelId\":\"nb:chid:UUID:83bb19de-7abf-4907-9578-abe90adfbabe\",\n\t\"AssetId\":\"nb:cid:UUID:bc495364-5357-42a1-9a9d-be54689cfae2\",\n\t\"ArchiveWindowLength\":\"PT1H\"\n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/Programs",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "Programs"
- ]
- },
- "description": "Create Program\n\n[Create Programs documentation](/rest/api/media/operations/program#create_programs)\n\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- }
- ]
- },
- {
- "name": "StreamingEndpoint",
- "description": "https://docs.microsoft.com/rest/api/media/operations/streamingendpoint#create_streaming_endpoints\n",
- "item": [
- {
- "name": "List StreamingEndpoints",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/StreamingEndpoints",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "StreamingEndpoints"
- ]
- },
- "description": "List StreamingEndpoints\n\n[List StreamingEndpoints documentation](/rest/api/media/operations/streamingendpoint#list_create_streaming_endpoints)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Delete StreamingEndpoint",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "DELETE",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/StreamingEndpoints('{{streamingEndpointId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "StreamingEndpoints('{{streamingEndpointId}}')"
- ]
- },
-                        "description": "Delete StreamingEndpoint\n\n[StreamingEndpoint documentation](/rest/api/media/operations/streamingendpoint)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Start StreamingEndpoint",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{RESTAPIEndpoint}}/StreamingEndpoints('{{streamingEndpointId}}')/Start",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "StreamingEndpoints('{{streamingEndpointId}}')",
- "Start"
- ]
- },
- "description": "Start StreamingEndpoints\n\n[Start StreamingEndpoint documentation](/rest/api/media/operations/streamingendpoint#start_create_streaming_endpoints)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Stop StreamingEndpoint",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
-                                    "raw": "{{RESTAPIEndpoint}}/StreamingEndpoints('{{streamingEndpointId}}')/Stop",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "StreamingEndpoints('{{streamingEndpointId}}')",
-                                        "Stop"
- ]
- },
-                        "description": "Stop StreamingEndpoints\n\n[Stop StreamingEndpoint documentation](/rest/api/media/operations/streamingendpoint#stop_create_streaming_endpoints)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Scale StreamingEndpoint",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"scaleUnits\" : 2\n} "
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/StreamingEndpoints('{{streamingEndpointId}}')/Scale",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "StreamingEndpoints('{{streamingEndpointId}}')",
- "Scale"
- ]
- },
- "description": "Scale StreamingEndpoints\n\n[Scale StreamingEndpoint documentation](/rest/api/media/operations/streamingendpoint#scale_create_streaming_endpoints)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Create StreamingEndpoints",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n \"Name\": \"test-streamingendpoint-1\",\n \"Description\": \"\",\n \"ScaleUnits\": 0,\n \"CustomHostNames\": [],\n \"AccessControl\": null,\n \"CdnEnabled\": true,\n \"CdnProfile\": \"AzureMediaStreamingPlatformCdnProfile-StandardVerizon\",\n \"CdnProvider\": \"StandardVerizon\",\n \"CacheControl\":{ \n \"MaxAge\":\"1800\" \n\t}, \n \"CrossSiteAccessPolicies\":{ \n \"ClientAccessPolicy\":\"<access-policy><cross-domain-access><policy><allow-from http-request-headers='*'><domain uri='http://*' /></allow-from><grant-to><resource path='/' include-subpaths='false' /></grant-to></policy></cross-domain-access></access-policy>\", \n \"CrossDomainPolicy\":\"<?xml version='1.0'?><!DOCTYPE cross-domain-policy SYSTEM 'https://www.macromedia.com/xml/dtds/cross-domain-policy.dtd'><cross-domain-policy><allow-access-from domain='*' /></cross-domain-policy>\" \n } \n}"
- },
- "url": {
- "raw": "{{RESTAPIEndpoint}}/StreamingEndpoints",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
- "StreamingEndpoints"
- ]
- },
- "description": "Create StreamingEndpoints\n\n[Create StreamingEndpoints documentation](/rest/api/media/operations/streamingendpoint#create_streaming_endpoints)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "Update StreamingEndpoints",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "PATCH",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"CacheControl\":{\n\t\t\"MaxAge\":\"2000\"\n\t}\n} "
- },
- "url": {
-                                    "raw": "{{RESTAPIEndpoint}}/StreamingEndpoints('{{streamingEndpointId}}')",
- "host": [
- "{{RESTAPIEndpoint}}"
- ],
- "path": [
-                                        "StreamingEndpoints('{{streamingEndpointId}}')"
- ]
- },
- "description": "Update StreamingEndpoints\n\n[Update StreamingEndpoints documentation](/rest/api/media/operations/streamingendpoint#update_streaming_endpoints)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- }
- ]
- }
- ]
-}
-```
media-services Postman Environment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/postman-environment.md
- Title: Import the Postman environment for Azure Media Services REST calls
-description: This article provides a definition of the Postman environment for Azure Media Services REST calls.
-Previously updated: 3/10/2021
-# Import the Postman environment
-
-This article contains a definition of the **Postman** environment variables that are used in the [Postman collection](postman-collection.md) of grouped HTTP requests that call the Media Services REST APIs. The environment and collection files are used by the [Configure Postman for Media Services REST API calls](media-rest-apis-with-postman.md) tutorial.
-
-> [!NOTE]
-> The value of `AzureADSTSEndpoint` is `https://login.microsoftonline.com/{{TenantId}}/oauth2/token`. To get your tenant ID, hover over your user name in the upper-right corner of the portal; the tooltip shows it as "Directory: Microsoft ({{TENANTID}})".
-
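-For illustration, the following is a minimal Python sketch (not part of the environment file; the placeholder values and the `requests` dependency are assumptions) showing how the `TenantID`, `ClientID`, and `ClientSecret` values combine with `AzureADSTSEndpoint` to produce the `AccessToken` value used by the collection:
-
-```python
-import requests  # third-party HTTP client (pip install requests)
-
-tenant_id = "<your-tenant-id>"          # assumption: your TenantID value
-client_id = "<your-client-id>"          # assumption: your AAD application's ClientID
-client_secret = "<your-client-secret>"  # assumption: the service principal's ClientSecret
-
-# AzureADSTSEndpoint resolves to this URL once TenantID is substituted.
-token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
-
-response = requests.post(
-    token_url,
-    data={
-        "grant_type": "client_credentials",
-        "client_id": client_id,
-        "client_secret": client_secret,
-        "resource": "https://rest.media.azure.net",  # Media Services REST resource
-    },
-)
-response.raise_for_status()
-access_token = response.json()["access_token"]  # corresponds to {{AccessToken}}
-```
-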
-```
-{
- "id": "2dbce3ce-74c2-2ceb-0461-c4c2323f5b09",
- "name": "AzureMedia",
- "values": [
- {
- "enabled": true,
- "key": "TenantID",
- "value": "",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "AzureADSTSEndpoint",
- "value": "",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "RESTAPIEndpoint",
- "value": "",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "ClientID",
- "value": "",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "ClientSecret",
- "value": "",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "AccessToken",
- "value": "",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "LastAssetId",
- "value": "",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "LastAccessPolicyId",
- "value": "",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "UploadURL",
- "value": "",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "MediaFileName",
- "value": "BigBuckBunny.mp4",
- "type": "text"
- },
- {
- "enabled": true,
- "key": "LastChannelId",
- "value": "",
- "type": "text"
- }
- ],
- "_postman_variable_scope": "environment",
- "_postman_exported_using": "Postman/5.5.0"
-}
-```
media-services Postman Live Streaming Collection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/postman-live-streaming-collection.md
- Title: Import the Postman collection for Azure Live Streaming REST calls
-description: This article provides a definition of the Postman collection for Azure Media Services REST calls.
-Previously updated: 3/10/2021
-# Import a Postman collection with Live Streaming operations
-
-This article contains a definition of the **Postman** collection of grouped HTTP requests that call the **Live Streaming** Azure Media Services REST APIs. For information about how to configure **Postman** to call the Media Services REST APIs, see the [Configure Postman for Media Services REST API calls](media-rest-apis-with-postman.md) tutorial.
-
-```json
-{
- "info": {
- "name": "Azure Media Live Streaming Quickstart",
- "_postman_id": "0dc5e4c6-4865-cbe9-250c-78e40b634256",
- "description": "Quickstart collection to use Live Streaming and Encoding on Azure Media Services\n",
- "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
- },
- "item": [
- {
-            "name": "1. Get AAD Auth Token",
-            "description": "To get started making calls to Azure Media Services, you have to first do the following:\n1) Get a token and cache it.\n2) Get the closest API endpoint from http://media.windows.net",
- "item": [
- {
- "name": "Get Azure AD Token for Auth (Expires every Hour!)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "var json = JSON.parse(responseBody);",
- "postman.setEnvironmentVariable(\"AccessToken\", json.access_token);"
- ]
- }
- }
- ],
- "request": {
- "method": "POST",
- "header": [
- {
- "key": "Content-Type",
- "value": "application/x-www-form-urlencoded"
- },
- {
- "key": "Keep-Alive",
- "value": "true"
- }
- ],
- "body": {
- "mode": "urlencoded",
- "urlencoded": [
- {
- "key": "grant_type",
- "value": "client_credentials",
- "description": "",
- "type": "text"
- },
- {
- "key": "client_id",
- "value": "{{ClientId}}",
- "description": "The Client ID for your AAD application",
- "type": "text"
- },
- {
- "key": "client_secret",
- "value": "{{ClientSecret}}",
- "description": "The Client Secret for your AAD application Service principal",
- "type": "text"
- },
- {
- "key": "resource",
- "value": "https://rest.media.azure.net",
- "description": "Normally this is https://rest.media.azure.net",
- "type": "text"
- }
- ]
- },
- "url": {
- "raw": "https://login.microsoftonline.com/{{TenantId}}/oauth2/token",
- "protocol": "https",
- "host": [
- "login",
- "microsoftonline",
- "com"
- ],
- "path": [
- "{{TenantId}}",
- "oauth2",
- "token"
- ]
- },
- "description": ""
- },
- "response": []
- }
- ]
- },
- {
- "name": "2. Create Channel",
- "description": "",
- "item": [
- {
- "name": "2.1 - Create Channel (New Encoder)",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "7ac7d788-f35e-420b-aca3-ffabc5d65ae6",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- "",
- "",
- "pm.test(\"Check for Id and store\", function () {",
- " var jsonData = pm.response.json();",
- " pm.expect(jsonData.Id.Value)",
- " ",
- " pm.environment.set(\"ChannelId\", jsonData.Id );",
- "",
- "});",
- "",
- "var jsonData = pm.response.json();",
- "tests[\"Has State\"] = jsonData.State !== null;",
- "tests[\"Has Encoding\"] = jsonData.EncodingType == \"Standard\";",
- "",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"Name\": \"MyPostmanChannel\",\r\n \"Description\": \"My Live Encoding channel from Postman\",\r\n \"EncodingType\": \"Standard\",\r\n \"Encoding\": null,\r\n \"Slate\": null,\r\n \"Input\": {\r\n \"KeyFrameInterval\": null,\r\n \"StreamingProtocol\": \"RTMP\",\r\n \"AccessControl\": {\r\n \"IP\": {\r\n \"Allow\": [\r\n {\r\n \"Name\": \"Allow All\",\r\n \"Address\": \"0.0.0.0\",\r\n \"SubnetPrefixLength\": 0\r\n }\r\n ]\r\n }\r\n },\r\n \"Endpoints\": []\r\n },\r\n \"Preview\": {\r\n \"AccessControl\": {\r\n \"IP\": {\r\n \"Allow\": [\r\n {\r\n \"Name\": \"Allow All\",\r\n \"Address\": \"0.0.0.0\",\r\n \"SubnetPrefixLength\": 0\r\n }\r\n ]\r\n }\r\n },\r\n \"Endpoints\": []\r\n },\r\n \"Output\": {\r\n \"Hls\": {\r\n \"FragmentsPerSegment\": \"1\"\r\n }\r\n },\r\n \"CrossSiteAccessPolicies\": {\r\n \"ClientAccessPolicy\": null,\r\n \"CrossDomainPolicy\": null\r\n }\r\n}\r\n"
- },
- "url": {
- "raw": "{{ApiEndpoint}}/Channels",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels"
- ]
- },
- "description": "Create Channel with Encoding\n\nChannels can be created using a POST HTTP request and specifying property values.\n\nIf successful, a 202 Accepted status code is returned along with a representation of the created entity in the response body. \n\nThe 202 Accepted status code indicates an asynchronous operation, in which case the operation-id header value is also provided for use in polling and tracking the status of long-running operations, such as starting or stopping a Channel. Pass the operation-id header value into the Operation Entity to retrieve the status. For more information, see Manually Polling Long-Running Operations.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "2.2 - Get Channel (to check that it is good!)",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "447b6b3e-6c43-437e-80ba-048bb1b55dc0",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Odata.metadata\"] = jsonData.odata_metadata !== null;",
- "tests[\"Has value\"] = jsonData.value !== null;",
- "tests[\"Channel has an ID\"] = jsonData.Id !== null;",
- "",
- "tests[\"Channel has an IngestUrl\"] = jsonData.Input.Endpoints[0].Url!== null;",
- "pm.environment.set(\"IngestUrl\", jsonData.Input.Endpoints[0].Url);",
- "",
- "pm.environment.set(\"variable_key\", \"variable_value\");"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')"
- ]
- },
- "description": "List Channels\n\nThe Channel entity represents a pipeline for processing live streaming content.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "2.3 - Start the Channel",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "ab712ce8-5023-4fa4-b0f8-90fa5ad7d56a",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202- Means it started!\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')/Start",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')",
- "Start"
- ]
- },
- "description": "Start a Channel\n\nThe Channel entity represents a pipeline for processing live streaming content.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
-                    "name": "2.4 - Poll Channel to see if it started (State == \"Running\")",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "6f2a03fc-dae1-4582-9a7a-690c4d4386f0",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Channel has value\"] = jsonData.value !== null;",
- "tests[\"Channel is Running\"] = jsonData.value == \"Running\";",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "GET",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.15"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')/State",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')",
- "State"
- ]
- },
- "description": "List Channels\n\nThe Channel entity represents a pipeline for processing live streaming content.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- }
- ],
- "event": [
- {
- "listen": "prerequest",
- "script": {
- "id": "66894a1f-eb6e-4755-925d-4e0c9715efcc",
- "type": "text/javascript",
- "exec": [
- ""
- ]
- }
- },
- {
- "listen": "test",
- "script": {
- "id": "01313c76-73d2-4218-a42f-594f8b2740b8",
- "type": "text/javascript",
- "exec": [
- ""
- ]
- }
- }
- ]
- },
- {
- "name": "3. Create a Program (Recording)",
- "description": "",
- "item": [
- {
- "name": "3.1 - Create Asset for the Program to Record to",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "2870b32d-e412-4783-b4fe-74d0a2e6ca66",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 201\"] = responseCode.code === 201;",
- "",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has an ID\"] = jsonData.Id !== null;",
- "tests[\"Has Created date\"] = jsonData.Created !== null;",
- "tests[\"has a URI\"] = jsonData.Uri !== null;",
- "",
- "",
- "pm.environment.set(\"AssetId\", jsonData.Id);",
- "",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
-                            "raw": "{\n    \"Name\":\"Asset for Recording\",\n    \"Options\":0\n}"
- },
- "url": {
- "raw": "{{ApiEndpoint}}/Assets",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Assets"
- ]
- },
- "description": "Create Assets\nThe Asset entity contains digital files (including video, audio, images, thumbnail collections, text tracks and closed caption files) and the metadata about these files. After the digital files are uploaded into an asset, they could be used in the Media Services encoding and streaming workflows.\n\n[Asset Entity REST API](/rest/api/media/operations/asset)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "3.2 - Create Program",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "f8ee90d1-5fad-448c-9686-4394ed5b094c",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 200\"] = responseCode.code === 200;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Id\"] = jsonData.Id !== null;",
- "tests[\"Has State\"] = jsonData.State !== null;",
- "",
- "",
- "pm.environment.set(\"ProgramId\", jsonData.Id);"
- ]
- }
- },
- {
- "listen": "prerequest",
- "script": {
- "id": "c87a3cf5-d004-4512-acb0-1878f5d7375f",
- "type": "text/javascript",
- "exec": [
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"Name\":\"testprogram004\",\n\t\"Description\":\"\",\n\t\"ChannelId\" : \"{{ChannelId}}\",\n\t\"AssetId\": \"{{AssetId}}\",\n\t\"ArchiveWindowLength\":\"PT1H\"\n}"
- },
- "url": {
- "raw": "{{ApiEndpoint}}/Programs",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Programs"
- ]
- },
- "description": "Create Program\n\n[Create Programs documentation](/rest/api/media/operations/program#create_programs)\n\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
- "name": "3.3 - Create AccessPolicy for Streaming",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "8a646a73-f26d-4493-a101-5c270f5b68de",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 201\"] = responseCode.code === 201;",
- "",
- "",
- "var jsonData = JSON.parse(responseBody);",
- "tests[\"Has Id\"] = jsonData.Id !== null;",
- "",
- "",
- "pm.environment.set(\"AccessPolicyId\", jsonData.Id);"
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"Name\": \"StreamingAccessPolicy-test001\", \n\t\"DurationInMinutes\" : \"525600\", \n\t\"Permissions\" : 1 \n} "
- },
- "url": {
- "raw": "{{ApiEndpoint}}/AccessPolicies",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "AccessPolicies"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/accesspolicy#create_an_accesspolicy\n\n\n## Permissions: \nspecifies the access rights the client has when interacting with the Asset. Valid values are:\n\n- None = 0 (default)\n- Read = 1\n- Write = 2\n- Delete = 4\n- List = 8"
- },
- "response": []
- },
- {
-                    "name": "3.4 - Start the Program",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "d626a1a9-b51f-4343-97ea-d981ba45041d",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{ApiEndpoint}}/Programs('{{ProgramId}}')/Start",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Programs('{{ProgramId}}')",
- "Start"
- ]
- },
- "description": "Start Programs\n\n[Start a Program documentation](/rest/api/media/operations/program#start_programs)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- },
- {
-                    "name": "3.5 - Create Streaming URL (Locator)",
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\n\t\"AccessPolicyId\": \"{{AccessPolicyId}}\", \n\t\"AssetId\" : \"{{AssetId}}\", \n\t\"StartTime\" : \"2018-02-09T17:55\", \n\t\"Type\":2\n} "
- },
- "url": {
- "raw": "{{ApiEndpoint}}/Locators",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Locators"
- ]
- },
- "description": "https://docs.microsoft.com/rest/api/media/operations/locator#list_locators"
- },
- "response": []
- },
- {
-                    "name": "3.6 - Stop the Program",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "9810e24f-4bc4-4048-a70f-9c5a9fba5bf7",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- "",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{ApiEndpoint}}/Programs('{{ProgramId}}')/Stop",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Programs('{{ProgramId}}')",
- "Stop"
- ]
- },
-                        "description": "Stop Programs\n\n[Stop a Program documentation](/rest/api/media/operations/program#stop_programs)\n\n[Full REST API documentation](/rest/api/media/operations/azure-media-services-rest-api-reference)"
- },
- "response": []
- }
- ]
- },
- {
- "name": "4 - Operate Live Stream",
- "description": "",
- "item": [
- {
- "name": "Reset Channel",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')/Reset",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')",
- "Reset"
- ]
- },
- "description": "Reset a Channel\n\nThe Channel entity represents a pipeline for processing live streaming content.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Start Advertisement",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"duration\":\"PT45S\",\r\n \"cueId\":\"67520935\",\r\n \"showSlate\":\"true\"\r\n}\r\n"
- },
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')/StartAdvertisement",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')",
- "StartAdvertisement"
- ]
- },
-                        "description": "Start a Channel Ad Break\n\nThe live encoder can be signaled to start an advertisement or commercial break using a POST HTTP request, specifying property values of the StartAdvertisement entity in the body of the request.\n\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "End Advertisement",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')/EndAdvertisement",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')",
- "EndAdvertisement"
- ]
- },
-                        "description": "End a Channel Ad Break\n\nThe live encoder can be signaled to end an advertisement or commercial break using a POST HTTP request.\n\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Show Slate (use Default)",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')/ShowSlate",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')",
- "ShowSlate"
- ]
- },
- "description": "Show Slate\n\nIndicates to the live encoder within the Channel that it needs to switch to the default slate image during the commercial break (and mask the incoming video feed). Default is false. The image used will be the one specified via the default slate asset Id property at the time of the channel creation. \n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Show Slate (use Asset ID) ",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": "{\r\n \"duration\":\"PT45S\",\r\n \"assetId\":\"nb:cid:UUID:01234567-ABCD-ABCD-EFEF-01234567\"\r\n}"
- },
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')/ShowSlate",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')",
- "ShowSlate"
- ]
- },
- "description": "Show Slate\n\nIndicates to the live encoder within the Channel that it needs to switch to the default slate image during the commercial break (and mask the incoming video feed). Default is false. The image used will be the one specified via the default slate asset Id property at the time of the channel creation. \n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Hide Slate",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {
- "mode": "raw",
- "raw": ""
- },
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')/HideSlate",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')",
- "HideSlate"
- ]
- },
- "description": "Hide Slate\n\nThe live encoder can be signaled to end an on-going slate using a POST HTTP request.\n\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Stop Channel",
- "event": [
- {
- "listen": "test",
- "script": {
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "POST",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('nb:chid:UUID:27ff0843-abae-4261-b46e-0558efc21f82')/Stop",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('nb:chid:UUID:27ff0843-abae-4261-b46e-0558efc21f82')",
- "Stop"
- ]
- },
- "description": "Stop a Channel\n\nThe Channel entity represents a pipeline for processing live streaming content.\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- },
- {
- "name": "Delete Channel",
- "event": [
- {
- "listen": "test",
- "script": {
- "id": "aba1c50b-39da-46f4-aa73-9b8ec84cd068",
- "type": "text/javascript",
- "exec": [
- "tests[\"Status code is 202\"] = responseCode.code === 202;",
- "",
- ""
- ]
- }
- }
- ],
- "request": {
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "method": "DELETE",
- "header": [
- {
- "key": "x-ms-version",
- "value": "2.19"
- },
- {
- "key": "Accept",
- "value": "application/json"
- },
- {
- "key": "Content-Type",
- "value": "application/json"
- },
- {
- "key": "DataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "MaxDataServiceVersion",
- "value": "3.0"
- },
- {
- "key": "User-Agent",
- "value": "azure media services postman collection"
- }
- ],
- "body": {},
- "url": {
- "raw": "{{ApiEndpoint}}/Channels('{{ChannelId}}')",
- "host": [
- "{{ApiEndpoint}}"
- ],
- "path": [
- "Channels('{{ChannelId}}')"
- ]
- },
- "description": "Delete Channels\n\nDelete the Channel entity\n\nChannel Entity REST API - https://msdn.microsoft.com/library/azure/dn783458.aspx\n\nFull REST API documentation\nhttps://msdn.microsoft.com/library/azure/hh973617.aspx"
- },
- "response": []
- }
- ]
- }
- ],
- "auth": {
- "type": "bearer",
- "bearer": [
- {
- "key": "token",
- "value": "{{AccessToken}}",
- "type": "string"
- }
- ]
- },
- "variable": [
- {
- "id": "f73392be-121b-418c-8489-8530323768b0",
- "key": "channelName",
- "value": "User001",
- "type": "text"
- },
- {
- "id": "ec9ba052-77ba-47e2-93c2-5aaed691c012",
- "key": "channelID",
- "value": "",
- "type": "text"
- },
- {
- "id": "0611b82b-6c00-498b-89ee-2b97f2a4dcd7",
- "key": "programName",
- "value": "User001Program",
- "type": "text"
- },
- {
- "id": "fdc71bce-8477-473b-aa89-eda66a61b776",
- "key": "programId",
- "value": "",
- "type": "text"
- }
- ]
-}
-```
media-services Scenarios And Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/media-services/previous/scenarios-and-availability.md
- Title: Microsoft Azure Media Services common scenarios | Microsoft Docs
-description: This article gives an overview of Microsoft Azure Media Services scenarios.
-Previously updated: 3/10/2021
-# Microsoft Azure Media Services common scenarios
-
-> [!NOTE]
-> No new features or functionality are being added to Media Services v2. Check out the latest version, [Media Services v3](../latest/media-services-overview.md). Also, see the [migration guidance from v2 to v3](../latest/migrate-v-2-v-3-migration-introduction.md).
-
-Microsoft Azure Media Services (AMS) enables you to securely upload, store, encode, and package video or audio content for both on-demand and live streaming delivery to various clients (for example, TV, PC, and mobile devices).
-
-This article shows common scenarios for delivering your content live or on-demand.
-
-## Overview
-
-### Prerequisites
-
-* An Azure account. If you don't have an account, you can create a free trial account in just a couple of minutes. For details, see [Azure Free Trial](https://azure.microsoft.com).
-* An Azure Media Services account. For more information, see [Create Account](media-services-portal-create-account.md).
-* The streaming endpoint from which you want to stream content has to be in the **Running** state.
-
-    When your AMS account is created, a **default** streaming endpoint is added to your account in the **Stopped** state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint has to be in the **Running** state. A hedged REST sketch for starting it appears below.
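-
-    As a hedged illustration only (placeholder values and the `requests` dependency are assumptions), the default streaming endpoint can be started with a REST call like the following, mirroring the "Start StreamingEndpoint" request in the Media Services Postman collection:
-
-    ```python
-    import requests
-
-    rest_api_endpoint = "<your-rest-api-endpoint>"     # assumption: your account's REST API endpoint
-    streaming_endpoint_id = "<streaming-endpoint-id>"  # assumption: ID of the default streaming endpoint
-    headers = {
-        "Authorization": "Bearer <access-token>",  # AAD token for https://rest.media.azure.net
-        "x-ms-version": "2.19",
-        "Accept": "application/json",
-        "Content-Type": "application/json",
-        "DataServiceVersion": "3.0",
-        "MaxDataServiceVersion": "3.0",
-    }
-
-    # Start is a long-running operation; a 202 Accepted response means it was queued.
-    resp = requests.post(
-        f"{rest_api_endpoint}/StreamingEndpoints('{streaming_endpoint_id}')/Start",
-        headers=headers,
-    )
-    print(resp.status_code)
-    ```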
-
-### Commonly used objects when developing against the AMS OData model
-
-The following image shows some of the most commonly used objects when developing against the Media Services OData model.
-
-Click the image to view it full size.
-
-[![Diagram showing some of the most commonly used objects when developing against the Azure Media Services object data model.](./media/media-services-overview/media-services-overview-object-model-small.png)](./media/media-services-overview/media-services-overview-object-model.png#lightbox)
-
-You can view the whole model in [this PDF](https://m.eet.com/media/1170326/ms-part1.pdf).
-
-## Protect content in storage and deliver streaming media in the clear (non-encrypted)
-
-![VoD workflow](./media/scenarios-and-availability/scenarios-and-availability01.png)
-
-1. Upload a high-quality media file into an asset.
-
-   We recommend applying the storage encryption option to your asset to protect your content during upload and while at rest in storage.
-
-1. Encode to a set of adaptive bitrate MP4 files.
-
-   We recommend applying the storage encryption option to the output asset to protect your content at rest.
-
-1. Configure an asset delivery policy (used by dynamic packaging).
-
-   If your asset is storage encrypted, you **must** configure an asset delivery policy.
-1. Publish the asset by creating an OnDemand locator (see the sketch after this list).
-1. Stream published content.
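-
-The publishing step can be illustrated with a hedged Python sketch (placeholder values and the `requests` dependency are assumptions), modeled on the Locators requests in the Postman collections above; locator `Type` 2 creates an OnDemand origin locator for streaming:
-
-```python
-import requests
-
-rest_api_endpoint = "<your-rest-api-endpoint>"  # assumption: your account's REST API endpoint
-headers = {
-    "Authorization": "Bearer <access-token>",
-    "x-ms-version": "2.19",
-    "Accept": "application/json",
-    "Content-Type": "application/json",
-    "DataServiceVersion": "3.0",
-    "MaxDataServiceVersion": "3.0",
-}
-
-locator = {
-    "AccessPolicyId": "<access-policy-id>",  # assumption: an access policy with read permission (Permissions = 1)
-    "AssetId": "<asset-id>",                 # the encoded asset to publish
-    "Type": 2,                               # 2 = OnDemand origin locator
-}
-resp = requests.post(f"{rest_api_endpoint}/Locators", headers=headers, json=locator)
-print(resp.json()["Path"])  # the locator's Path is the base streaming URL
-```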
-
-## Protect content in storage, deliver dynamically encrypted streaming media
-
-![Protect with PlayReady](./media/media-services-content-protection-overview/media-services-content-protection-with-multi-drm.png)
-
-1. Upload a high-quality media file into an asset. Apply storage encryption option to the asset.
-1. Encode to a set of adaptive bitrate MP4 files. Apply storage encryption option to the output asset.
-1. Create an encryption content key for the asset you want dynamically encrypted during playback.
-1. Configure content key authorization policy.
-1. Configure asset delivery policy (used by dynamic packaging and dynamic encryption).
-1. Publish the asset by creating an OnDemand locator.
-1. Stream published content.
-
-## Deliver progressive download
-
-1. Upload a high-quality media file into an asset.
-1. Encode to a single MP4 file.
-1. Publish the asset by creating an OnDemand or SAS locator. If using a SAS locator, the content is downloaded from Azure Blob storage. You don't need to have streaming endpoints in the started state.
-1. Progressively download content.
-
-## Delivering live-streaming events
-
-1. Ingest live content using various live streaming protocols (for example RTMP or Smooth Streaming).
-1. Optionally, encode your stream into an adaptive bitrate stream.
-1. Preview your live stream.
-1. Deliver the content through:
- 1. Common streaming protocols (for example, MPEG-DASH, Smooth Streaming, or HLS) directly to your customers.
- 1. A Content Delivery Network (CDN) for further distribution.
- 1. Recording and storing the ingested content to be streamed later (video on demand).
-
-When doing live streaming, you can choose one of the following routes:
-
-### Working with channels that receive multi-bitrate live stream from on-premises encoders (pass-through)
-
-The following diagram shows the major parts of the AMS platform that are involved in the **pass-through** workflow.
-
-![Diagram that shows the major parts of the A M S platform involved in the "pass-through" workflow.](./media/scenarios-and-availability/media-services-live-streaming-current.png)
-
-For more information, see [Working with Channels that Receive Multi-bitrate Live Stream from On-premises Encoders](media-services-live-streaming-with-onprem-encoders.md).
-
-### Working with channels that are enabled to perform live encoding with Azure Media Services
-
-The following diagram shows the major parts of the AMS platform that are involved in the live streaming workflow where a channel is enabled to do live encoding with Media Services.
-
-![Live workflow](./media/scenarios-and-availability/media-services-live-streaming-new.png)
-
-For more information, see [Working with Channels that are Enabled to Perform Live Encoding with Azure Media Services](media-services-manage-live-encoder-enabled-channels.md).
-
-## Consuming content
-
-Azure Media Services provides the tools you need to create rich, dynamic client player applications for most platforms, including iOS devices, Android devices, Windows, Windows Phone, Xbox, and set-top boxes.
-
-## Enabling Azure CDN
-
-Media Services supports integration with Azure CDN. For information on how to enable Azure CDN, see [How to Manage Streaming Endpoints in a Media Services Account](media-services-portal-manage-streaming-endpoints.md).
-
-## Scaling a Media Services account
-
-AMS customers can scale streaming endpoints, media processing, and storage in their AMS accounts.
-
-* Media Services customers can choose either a **Standard** streaming endpoint or a **Premium** streaming endpoint. A **Standard** streaming endpoint is suitable for most streaming workloads. It includes the same features as a **Premium** streaming endpoint and scales outbound bandwidth automatically.
-
- **Premium** streaming endpoints are suitable for advanced workloads, providing dedicated and scalable bandwidth capacity. Customers that have a **Premium** streaming endpoint, by default get one streaming unit (SU). The streaming endpoint can be scaled by adding SUs. Each SU provides additional bandwidth capacity to the application. For more information about scaling **Premium** streaming endpoints, see the [Scaling streaming endpoints](media-services-portal-scale-streaming-endpoints.md) topic.
-
-* A Media Services account is associated with a Reserved Unit Type, which determines the speed with which your media processing tasks are processed. You can pick between the following reserved unit types: **S1**, **S2**, or **S3**. For example, the same encoding job runs faster when you use the **S2** reserved unit type compared to the **S1** type.
-
- In addition to specifying the reserved unit type, you can provision your account with **Reserved Units** (RUs). The number of provisioned RUs determines the number of media tasks that can be processed concurrently in a given account.
-
- > [!NOTE]
- > RUs work for parallelizing all media processing, including indexing jobs using Azure Media Indexer. However, unlike encoding, indexing jobs do not get processed faster with faster reserved units.
-
- For more information, see [Scale media processing](media-services-portal-scale-media-processing.md) (a CLI sketch for adjusting reserved units follows this list).
-
-* You can also scale your Media Services account by adding storage accounts to it. Each storage account is limited to 500 TB. For more information, see [Manage storage accounts](./media-services-managing-multiple-storage-accounts.md).
-
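As referenced in the reserved-units bullet above, here is a hedged sketch of adjusting reserved units from the command line. The `az ams account mru` group belongs to the current (v3-era) CLI and may not apply to every older account; names are placeholders, and the accepted `--type` values should be verified with `az ams account mru set --help`.

```azurecli
# Set the reserved unit type (assumed here as S2) and provision two RUs
# so two media tasks can run concurrently. <rg> and <ams-account> are
# placeholders; verify allowed --type values before running.
az ams account mru set \
    --resource-group <rg> \
    --name <ams-account> \
    --type S2 \
    --count 2
```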
-## Next steps
-
-[Migrate to Media Services v3](../latest/media-services-overview.md)
-
-## Provide feedback
migrate Tutorial App Containerization Aspnet Kubernetes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/tutorial-app-containerization-aspnet-kubernetes.md
description: Tutorial:Containerize & migrate ASP.NET applications to Azure Kuber
+ Last updated 6/30/2021
If you just created a free Azure account, you're the owner of your subscription.
![Search box to search for the Azure subscription.](./media/tutorial-discover-vmware/search-subscription.png)
-2. In the **Subscriptions** page, select the subscription in which you want to create an Azure Migrate project.
-3. In the subscription, select **Access control (IAM)** > **Check access**.
-4. In **Check access**, search for the relevant user account.
-5. In **Add a role assignment**, click **Add**.
+1. In the **Subscriptions** page, select the subscription in which you want to create an Azure Migrate project.
- ![Search for a user account to check access and assign a role.](./media/tutorial-discover-vmware/azure-account-access.png)
+1. Select **Access control (IAM)**.
-6. In **Add role assignment**, select the Owner role, and select the account (azmigrateuser in our example). Then click **Save**.
+1. Select **Add** > **Add role assignment** to open the **Add role assignment** page.
- ![Opens the Add Role assignment page to assign a role to the account.](./media/tutorial-discover-vmware/assign-role.png)
+1. Assign the following role. For detailed steps, see [Assign Azure roles using the Azure portal](../role-based-access-control/role-assignments-portal.md).
-7. Your Azure account also needs **permissions to register Azure Active Directory apps.**
-8. In Azure portal, navigate to **Azure Active Directory** > **Users** > **User Settings**.
-9. In **User settings**, verify that Azure AD users can register applications (set to **Yes** by default).
+ | Setting | Value |
+ | | |
+ | Role | Owner |
+ | Assign access to | User |
+ | Members | azmigrateuser (in this example) |
+
+ ![Add role assignment page in Azure portal.](../../includes/role-based-access-control/media/add-role-assignment-page.png)
+
+1. Your Azure account also needs **permissions to register Azure Active Directory apps.**
+
+1. In Azure portal, navigate to **Azure Active Directory** > **Users** > **User Settings**.
+
+1. In **User settings**, verify that Azure AD users can register applications (set to **Yes** by default).
![Verify in User Settings that users can register Active Directory apps.](./media/tutorial-discover-vmware/register-apps.png)
-10. In case the 'App registrations' settings is set to 'No', request the tenant/global admin to assign the required permission. Alternately, the tenant/global admin can assign the **Application Developer** role to an account to allow the registration of Azure Active Directory App. [Learn more](../active-directory/fundamentals/active-directory-users-assign-role-azure-portal.md).
+1. If the **App registrations** setting is set to **No**, request the tenant/global admin to assign the required permission. Alternatively, the tenant/global admin can assign the **Application Developer** role to an account to allow the registration of Azure Active Directory apps. [Learn more](../active-directory/fundamentals/active-directory-users-assign-role-azure-portal.md). A CLI sketch of the Owner role assignment from the earlier steps follows this list.
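A minimal sketch of the Owner role assignment described above, using the Azure CLI; the user principal name and subscription ID are placeholders.

```azurecli
# Assign the Owner role to the user at subscription scope.
# The UPN and <subscription-id> below are placeholders.
az role assignment create \
    --assignee "azmigrateuser@contoso.com" \
    --role "Owner" \
    --scope "/subscriptions/<subscription-id>"
```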
## Download and install Azure Migrate: App Containerization tool
migrate Tutorial App Containerization Java Kubernetes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/tutorial-app-containerization-java-kubernetes.md
description: Tutorial:Containerize & migrate Java web applications to Azure Kube
+ Last updated 6/30/2021
If you just created a free Azure account, you're the owner of your subscription.
![Search box to search for the Azure subscription.](./media/tutorial-discover-vmware/search-subscription.png)
-2. In the **Subscriptions** page, select the subscription in which you want to create an Azure Migrate project.
-3. In the subscription, select **Access control (IAM)** > **Check access**.
-4. In **Check access**, search for the relevant user account.
-5. In **Add a role assignment**, click **Add**.
+1. In the **Subscriptions** page, select the subscription in which you want to create an Azure Migrate project.
- ![Search for a user account to check access and assign a role.](./media/tutorial-discover-vmware/azure-account-access.png)
+1. Select **Access control (IAM)**.
-6. In **Add role assignment**, select the Owner role, and select the account (azmigrateuser in our example). Then click **Save**.
+1. Select **Add** > **Add role assignment** to open the **Add role assignment** page.
- ![Opens the Add Role assignment page to assign a role to the account.](./media/tutorial-discover-vmware/assign-role.png)
+1. Assign the following role. For detailed steps, see [Assign Azure roles using the Azure portal](../role-based-access-control/role-assignments-portal.md).
-7. Your Azure account also needs **permissions to register Azure Active Directory apps.**
-8. In Azure portal, navigate to **Azure Active Directory** > **Users** > **User Settings**.
-9. In **User settings**, verify that Azure AD users can register applications (set to **Yes** by default).
+ | Setting | Value |
+ | | |
+ | Role | Owner |
+ | Assign access to | User |
+ | Members | azmigrateuser (in this example) |
+
+ ![Add role assignment page in Azure portal.](../../includes/role-based-access-control/media/add-role-assignment-page.png)
+
+1. Your Azure account also needs **permissions to register Azure Active Directory apps.**
+
+1. In Azure portal, navigate to **Azure Active Directory** > **Users** > **User Settings**.
+
+1. In **User settings**, verify that Azure AD users can register applications (set to **Yes** by default).
![Verify in User Settings that users can register Active Directory apps.](./media/tutorial-discover-vmware/register-apps.png)
-10. In case the 'App registrations' settings is set to 'No', request the tenant/global admin to assign the required permission. Alternately, the tenant/global admin can assign the **Application Developer** role to an account to allow the registration of Azure Active Directory App. [Learn more](../active-directory/fundamentals/active-directory-users-assign-role-azure-portal.md).
+1. If the **App registrations** setting is set to **No**, request the tenant/global admin to assign the required permission. Alternatively, the tenant/global admin can assign the **Application Developer** role to an account to allow the registration of Azure Active Directory apps. [Learn more](../active-directory/fundamentals/active-directory-users-assign-role-azure-portal.md).
## Download and install Azure Migrate: App Containerization tool
network-watcher Traffic Analytics Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/traffic-analytics-schema.md
Title: Azure traffic analytics schema | Microsoft Docs description: Understand schema of Traffic Analytics to analyze Azure network security group flow logs.--++
Below is the schema for public IP details:
| FlowIntervalEndTime_t | Date and Time in UTC | End time of the flow log processing interval |
| FlowType_s | * AzurePublic <br> * ExternalPublic <br> * MaliciousFlow | Definition in notes below the table |
| IP | Public IP | Public IP whose information is provided in the record |
-| Location | Location of the IP | - For Azure Public IP: Azure region of virtual network/network interface/virtual machine to which the IP belongs to <br> - For External Public IP and Malicious IP: 2-letter country code where IP is located (ISO 3166-1 alpha-2) |
-| PublicIPDetails | Information about IP | - For AzurePublic IP: Azure Service behind the IP <br> - ExternalPublic/Malicious IP: WhoIS information of the IP |
+| Location | Location of the IP | - For Azure Public IP: Azure region of virtual network/network interface/virtual machine to which the IP belongs OR Global for IP [168.63.129.16](../virtual-network/what-is-ip-address-168-63-129-16.md) <br> - For External Public IP and Malicious IP: 2-letter country code where IP is located (ISO 3166-1 alpha-2) |
+| PublicIPDetails | Information about IP | - For AzurePublic IP: Azure Service owning the IP OR "Microsoft Virtual Public IP" for IP [168.63.129.16](../virtual-network/what-is-ip-address-168-63-129-16.md) <br> - ExternalPublic/Malicious IP: WhoIS information of the IP |
| ThreatType | Threat posed by malicious IP | **For Malicious IPs only**: One of the threats from the list of currently allowed values (described below) |
| ThreatDescription | Description of the threat | **For Malicious IPs only**: Description of the threat posed by the malicious IP |
| DNSDomain | DNS domain | **For Malicious IPs only**: Domain name associated with this IP |
openshift Howto Secure Openshift With Front Door Feb 22 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/openshift/howto-secure-openshift-with-front-door-feb-22.md
Title: Secure access to Azure Red Hat OpenShift with Azure Front Door
-description: This article explains how to use Azure Front Door to secure access to Azure Red Hat OpenShift applications.
+ Title: Secure access to Azure Red Hat OpenShift with Front Door
+description: This article explains how to use Front Door to secure access to Azure Red Hat OpenShift applications.
keywords: azure, openshift, red hat, front, door
#Customer intent: I need to understand how to secure access to Azure Red Hat OpenShift applications with Azure Front Door.
-# Secure access to Azure Red Hat OpenShift with Azure Front Door
+# Secure access to Azure Red Hat OpenShift with Front Door
This article explains how to use Azure Front Door Premium to secure access to Azure Red Hat OpenShift.
This section explains how to register a domain in Azure DNS.
To create a new Azure Front Door Premium service:
-1. On [Microsoft Azure (PREVIEW) Compare offerings](https://ms.portal.azure.com/#create/Microsoft.AFDX) select **Azure Front Door**, and then select **Continue to create a Front Door**.
+1. On [Microsoft Azure Compare offerings](https://ms.portal.azure.com/#create/Microsoft.AFDX) select **Azure Front Door**, and then select **Continue to create a Front Door**.
-2. On the **Create a front door profile** page in the **Subscription** > **Resource group**, select the resource group in which your Azure Red Hat OpenShift cluster was deployed to house your Azure Front Door Premium (PREVIEW) resource.
+2. On the **Create a front door profile** page in the **Subscription** > **Resource group**, select the resource group in which your Azure Red Hat OpenShift cluster was deployed to house your Azure Front Door Premium resource.
3. Name your Azure Front Door Premium service appropriately. For example, in the **Name** field, enter the following name:
To create a new Azure Front Door Premium service:
At this stage, don't enable the Azure Private Link service, caching, or the Web Application Firewall (WAF) policy.
-9. Select **Review + create** to create the Azure Front Door Premium (PREVIEW) resource, and then wait for the process to complete.
+9. Select **Review + create** to create the Azure Front Door Premium resource, and then wait for the process to complete.
## Initial configuration of Azure Front Door Premium
openshift Howto Secure Openshift With Front Door https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/openshift/howto-secure-openshift-with-front-door.md
+
+ Title: Secure access to Azure Red Hat OpenShift with Azure Front Door
+description: This article explains how to use Azure Front Door to secure access to Azure Red Hat OpenShift applications.
++++ Last updated : 12/07/2021
+keywords: azure, openshift, red hat, front, door
+#Customer intent: I need to understand how to secure access to Azure Red Hat OpenShift applications with Azure Front Door.
++
+# Secure access to Azure Red Hat OpenShift with Azure Front Door
+
+This article explains how to use Azure Front Door Premium to secure access to Azure Red Hat OpenShift.
+
+## Prerequisites
+
+The following prerequisites are required:
+
+- You have an existing Azure Red Hat OpenShift cluster. For information on creating an Azure Red Hat OpenShift cluster, learn how to [create an AKS cluster](../aks/kubernetes-walkthrough-portal.md#create-an-aks-cluster).
+
+- The cluster is configured with private ingress visibility.
+
+- A custom domain name is used, for example:
+
+ `example.com`
+
+> [!NOTE]
+> The initial state doesn't have DNS configured.
+> No applications are exposed externally from the Azure Red Hat OpenShift cluster.
+
+## Create an Azure Private Link service
+
+This section explains how to create an Azure Private Link service. An Azure Private Link service is a reference to your own service that is powered by Azure Private Link.
+
+Your service, which is running behind the Azure Standard Load Balancer, can be enabled for Private Link access so that consumers of your service can access it privately from their own VNets. Your customers can create a private endpoint inside their VNet and map it to this service.
+
+For more information about the Azure Private Link service and how it's used, see [Azure Private Link service](../private-link/private-link-service-overview.md).
+
+Create an **AzurePrivateLinkSubnet**. This subnet includes a netmask that permits visibility of the subnet to the control plane and worker nodes of the Azure cluster. Don't delegate this new subnet to any services or configure any service endpoints.
+
+For example, if the virtual network is 10.10.0.0/16 and:
+
+ - Existing Azure Red Hat OpenShift control plane subnet = 10.10.0.0/24
+ - Existing Azure Red Hat OpenShift worker subnet = 10.10.1.0/24
+ - New AzurePrivateLinkSubnet = 10.10.2.0/24
+
+ Create a new Private Link at [Azure Private Link service](https://portal.azure.com/#create/Microsoft.PrivateLinkservice), as explained in the following steps:
+
+1. On the **Basics** tab, configure the following options:
+ - **Project Details**
+ * Select your Azure subscription.
+ * Select the resource group in which your Azure Red Hat OpenShift cluster was deployed.
+ - **Instance Details**
+ - Enter a **Name** for your Azure Private Link service, as in the following example: *example-com-private-link*.
+ - Select a **Region** for your Private Link.
+
+2. On the **Outbound Settings** tab:
+ - Set the **Load Balancer** to the **-internal** load balancer of the cluster for which you're enabling external access. The choices are populated in the drop-down list.
+ - Set the **Load Balancer frontend IP address** to the IP address of the Azure Red Hat OpenShift ingress controller, which typically ends in **.254**. If you're unsure, use the following command.
+
+ ```azurecli
+ az aro show -n <cluster-name> -g <resource-group> -o tsv --query ingressProfiles[].ip
+ ```
+
+ - The **Source NAT subnet** should be the **AzurePrivateLinkSubnet**, which you created.
+ - No items should be changed in **Outbound Settings**.
+
+3. On the **Access Security** tab, no changes are required.
+
+ - At the **Who can request access to your service?** prompt, select **Anyone with your alias**.
+ - Don't add any subscriptions for auto-approval.
+
+4. On the **Tags** tab, select **Review + create**.
+
+5. Select **Create** to create the Azure Private Link service, and then wait for the process to complete.
+
+6. When your deployment is complete, select **Go to resource group** under **Next steps**.
+
+In the Azure portal, enter the Azure Private Link service that was deployed. Retain the **Alias** that was generated for the Azure Private Link service. It will be used later.
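If you'd rather capture the alias programmatically than copy it from the portal, here is a minimal sketch with the Azure CLI, using the example service name from this article:

```azurecli
# Print the alias generated for the Azure Private Link service.
az network private-link-service show \
    --resource-group <rg> \
    --name example-com-private-link \
    --query alias \
    --output tsv
```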
+
+## Register domain in Azure DNS
+
+This section explains how to register a domain in Azure DNS.
+
+1. Create a global [Azure DNS](https://portal.azure.com/#create/Microsoft.DnsZone) zone for example.com.
+
+2. Create a global [Azure DNS](https://portal.azure.com/#create/Microsoft.DnsZone) zone for apps.example.com.
+
+3. Note the four nameservers that are present in Azure DNS for apps.example.com.
+
+4. Create a new **NS** record set in the example.com zone that points to **apps** and specify the four nameservers that were present when the **apps** zone was created. A CLI sketch of these steps follows this list.
+
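As referenced in step 4, the zones and delegation can also be scripted. A minimal sketch with the Azure CLI, using the example.com names from this article; `<rg>` and the nameserver value are placeholders.

```azurecli
# Create the parent and child DNS zones.
az network dns zone create --resource-group <rg> --name example.com
az network dns zone create --resource-group <rg> --name apps.example.com

# List the four nameservers assigned to the child zone.
az network dns zone show --resource-group <rg> --name apps.example.com \
    --query nameServers --output tsv

# Delegate the child zone: add each nameserver to an NS record set
# named "apps" in the parent zone (run once per nameserver).
az network dns record-set ns add-record --resource-group <rg> \
    --zone-name example.com --record-set-name apps \
    --nsdname <nameserver>
```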
+## Create a new Azure Front Door Premium service
+
+To create a new Azure Front Door Premium service:
+
+1. On [Microsoft Azure (PREVIEW) Compare offerings](https://ms.portal.azure.com/#create/Microsoft.AFDX) select **Azure Front Door**, and then select **Continue to create a Front Door**.
+
+2. On the **Create a front door profile** page in the **Subscription** > **Resource group**, select the resource group in which your Azure Red Hat OpenShift cluster was deployed to house your Azure Front Door Premium (PREVIEW) resource.
+
+3. Name your Azure Front Door Premium service appropriately. For example, in the **Name** field, enter the following name:
+
+ `example-com-frontdoor`
+
+4. Select the **Premium** tier. The Premium tier is the only choice that supports Azure Private Link.
+
+5. For **Endpoint name**, choose an endpoint name that is appropriate for Azure Front Door.
+
+ For each application deployed, a CNAME will be created in the Azure DNS to point to this hostname. Therefore, it's important to choose a name that is agnostic to applications. For security, the name shouldn't suggest the applications or architecture that you've deployed, such as **example01**.
+
+ The name you choose will be prepended to the **.z01.azurefd.net** domain.
+
+6. For **Origin type**, select **Custom**.
+
+7. For **Origin Host Name**, enter the following placeholder:
+
+ `changeme.com`
+
+ This placeholder will be deleted later.
+
+ At this stage, don't enable the Azure Private Link service, caching, or the Web Application Firewall (WAF) policy.
+
+8. Select **Review + create** to create the Azure Front Door Premium (PREVIEW) resource, and then wait for the process to complete.
+
+## Initial configuration of Azure Front Door Premium
+
+To configure Azure Front Door Premium:
+
+1. In the Azure portal, enter the Azure Front Door Premium service that was deployed.
+
+2. In the **Endpoint Manager** window, modify the endpoint by selecting **Edit endpoint**.
+
+3. Delete the default route, which was created as **default-route**.
+
+4. Close the **Endpoint Manager** window.
+
+5. In the **Origin Groups** window, delete the default origin group that was named **default-origin-group**.
+
+## Exposing an application route in Azure Red Hat OpenShift
+
+Azure Red Hat OpenShift must be configured to serve the application with the same hostname that Azure Front Door will be exposing externally (\*.apps.example.com). In our example, we'll expose the Reservations application with the following hostname:
+
+`reservations.apps.example.com`
+
+Also, create a secure route in Azure Red Hat OpenShift that exposes the hostname.
+
+## Configure Azure DNS
+
+To configure the Azure DNS:
+
+1. Enter the public **apps** DNS zone previously created.
+
+2. Create a new CNAME record set named **reservation**. This CNAME record set is an alias for our example Azure Front Door endpoint (a CLI sketch follows this list):
+
+ `example01.z01.azurefd.net`
+
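A minimal sketch of the same CNAME creation with the Azure CLI, using the example names from this article:

```azurecli
# Alias reservation.apps.example.com to the Front Door endpoint.
az network dns record-set cname set-record \
    --resource-group <rg> \
    --zone-name apps.example.com \
    --record-set-name reservation \
    --cname example01.z01.azurefd.net
```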
+## Configure Azure Front Door Premium
+
+The following steps explain how to configure Azure Front Door Premium.
+
+1. In the Azure portal, enter the Azure Front Door Premium service you created previously:
+
+ `example-com-frontdoor`
+
+**In the Domains window**:
+
+2. Because all DNS servers are hosted on Azure, leave **DNS Management** set to **Azure managed DNS**.
+
+3. Select the example domain:
+
+ `apps.example.com`
+
+4. Select the CNAME in our example:
+
+ `reservations.apps.example.com`
+
+5. Use the default values for **HTTPS** and **Minimum TLS version**.
+
+6. Select **Add**.
+
+7. When the **Validation state** changes to **Pending**, select **Pending**.
+
+8. To authenticate ownership of the DNS zone, for **DNS record status**, select **Add**.
+
+9. Select **Close**.
+
+10. Continue to select **Refresh** until the **Validation state** of the domain changes to **Approved** and the **Endpoint association** changes to **Unassociated**.
+
+**In the Origin Groups window**:
+
+1. Select **Add**.
+
+2. Give your **Origin Group** an appropriate name, such as **Reservations-App**.
+
+3. Select **Add an origin**.
+
+4. Enter the name of the origin, such as **ARO-Cluster-1**.
+
+5. Choose an **Origin type** of **Custom**.
+
+6. Enter the fully qualified domain name (FQDN) hostname that was exposed in your Azure Red Hat OpenShift cluster, such as:
+
+ `reservations.apps.example.com`
+
+7. Enable the **Private Link** service.
+
+8. Enter the **Alias** that was obtained from the Azure Private Link service.
+
+9. Select **Add** to return to the origin group creation window.
+
+10. Select **Add** to add the origin group and return to the Azure portal.
+
+## Grant approval in Azure Private Link
+
+To grant approval to the **example-com-private-link**, which is the **Azure Private Link** service you created previously, complete the following steps.
+
+1. On the **Private endpoint connections** tab, select the checkbox for the pending connection request from the resource described as **do from AFD**.
+
+2. Select **Approve**, and then select **Yes** to verify the approval.
+
+## Complete Azure Front Door Premium configuration
+
+The following steps explain how to complete the configuration of Azure Front Door Premium.
+
+1. In the Azure portal, enter the Azure Front Door Premium service you previously created:
+
+ `example-com-frontdoor`
+
+2. In the **Endpoint Manager** window, select **Edit endpoint** to modify the endpoint.
+
+3. Select **+Add** under **Routes**.
+
+4. Give your route an appropriate name, such as **Reservations-App-Route-Config**.
+
+5. Under **Domains**, then under **Available validated domains**, select the fully qualified domain name, for example:
+
+ `reservations.apps.example.com`
++
+6. To redirect HTTP traffic to use HTTPS, leave the **Redirect** checkbox selected.
+
+7. Under **Origin group**, select **Reservations-App**, the origin group you previously created.
+
+8. You can enable caching, if appropriate.
+
+9. Select **Add** to create the route.
+After the route is configured, the **Endpoint manager** populates the **Domains** and **Origin groups** panes with the other elements created for this application.
+
+Because Azure Front Door is a global service, the application can take up to 30 minutes to deploy. During this time, you may choose to create a WAF for your application. When your application goes live, it can be accessed using the URL used in this example:
+
+`https://reservations.apps.example.com`
+
+## Next steps
+
+Create an Azure Web Application Firewall on Azure Front Door using the Azure portal:
+> [!div class="nextstepaction"]
+> [Tutorial: Create a Web Application Firewall policy on Azure Front Door using the Azure portal](../web-application-firewall/afds/waf-front-door-create-portal.md)
purview Azure Purview Connector Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/azure-purview-connector-overview.md
The table below shows the supported capabilities for each data source. Select th
|| [Azure Database for PostgreSQL](register-scan-azure-postgresql.md) | [Yes](register-scan-azure-postgresql.md#register) | [Yes](register-scan-azure-postgresql.md#scan) | No* | No |
|| [Azure Dedicated SQL pool (formerly SQL DW)](register-scan-azure-synapse-analytics.md)| [Yes](register-scan-azure-synapse-analytics.md#register) | [Yes](register-scan-azure-synapse-analytics.md#scan)| No* | No |
|| [Azure Files](register-scan-azure-files-storage-source.md)|[Yes](register-scan-azure-files-storage-source.md#register) | [Yes](register-scan-azure-files-storage-source.md#scan) | Limited* | No |
-|| [Azure SQL Database](register-scan-azure-sql-database.md)| [Yes](register-scan-azure-sql-database.md#register) |[Yes](register-scan-azure-sql-database.md#scan)| No* | No |
+|| [Azure SQL Database](register-scan-azure-sql-database.md)| [Yes](register-scan-azure-sql-database.md#register) |[Yes](register-scan-azure-sql-database.md#scan)| [Yes (Preview)](register-scan-azure-sql-database.md#lineagepreview) | No |
|| [Azure SQL Database Managed Instance](register-scan-azure-sql-database-managed-instance.md)| [Yes](register-scan-azure-sql-database-managed-instance.md#scan) | [Yes](register-scan-azure-sql-database-managed-instance.md#scan) | No* | No |
|| [Azure Synapse Analytics (Workspace)](register-scan-synapse-workspace.md)| [Yes](register-scan-synapse-workspace.md#register) | [Yes](register-scan-synapse-workspace.md#scan)| [Yes - Synapse pipelines](how-to-lineage-azure-synapse-analytics.md)| No|
|Database| [Amazon RDS](register-scan-amazon-rds.md) | [Yes](register-scan-amazon-rds.md#register-an-amazon-rds-data-source) | [Yes](register-scan-amazon-rds.md#scan-an-amazon-rds-database) | No | No |
The table below shows the supported capabilities for each data source. Select th
|| [Db2](register-scan-db2.md) | [Yes](register-scan-db2.md#register) | No | [Yes](register-scan-db2.md#lineage) | No |
|| [Google BigQuery](register-scan-google-bigquery-source.md)| [Yes](register-scan-google-bigquery-source.md#register)| No | [Yes](register-scan-google-bigquery-source.md#lineage)| No|
|| [Hive Metastore Database](register-scan-hive-metastore-source.md) | [Yes](register-scan-hive-metastore-source.md#register) | No | [Yes*](register-scan-hive-metastore-source.md#lineage) | No|
-|| [MySQL](register-scan-mysql.md) | [Yes](register-scan-mysql.md#register) | No | [Yes](register-scan-mysql.md#scan) | No |
+|| [MySQL](register-scan-mysql.md) | [Yes](register-scan-mysql.md#register) | No | [Yes](register-scan-mysql.md#lineage) | No |
|| [Oracle](register-scan-oracle-source.md) | [Yes](register-scan-oracle-source.md#register)| No | [Yes*](register-scan-oracle-source.md#lineage) | No|
|| [PostgreSQL](register-scan-postgresql.md) | [Yes](register-scan-postgresql.md#register) | No | [Yes](register-scan-postgresql.md#lineage) | No |
|| [SAP Business Warehouse](register-scan-sap-bw.md) | [Yes](register-scan-sap-bw.md#register) | No | No | No |
purview Tutorial Using Rest Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/tutorial-using-rest-apis.md
+ Last updated 09/17/2021
Once service principal is created, you need to assign Data plane roles of your p
>[!NOTE]
>You can also assign your service principal permission to any sub-collections, instead of the root collection. However, all APIs will be scoped to that collection (and sub-collections that inherit permissions), and users trying to call the API for another collection will get errors.
-
-1. Select the Role assignments tab.
-1. Assign the following roles to service principal created above to access various data planes in Azure Purview.
- 1. 'Data Curator' role to access Catalog Data plane.
- 1. 'Data Source Administrator' role to access Scanning Data plane.
- 1. 'Collection Admin' role to access Account Data Plane.
- 1. 'Collection Admin' role to access Metadata policy Data Plane.
-> [!Note]
-> Only 'Collection Admin' can assign data plane roles in Azure Purview [Access Control in Azure Purview](./catalog-permissions.md).
+1. Select **Access control (IAM)**.
+
+1. Select **Add** > **Add role assignment** to open the **Add role assignment** page.
+
+ ![Screenshot that shows Add role assignment page in Azure portal.](../../includes/role-based-access-control/media/add-role-assignment-page.png)
+
+1. Select the **Role** tab.
+
+1. Assign the following roles to the service principal created previously to access various data planes in Azure Purview. For detailed steps, see [Assign Azure roles using the Azure portal](../role-based-access-control/role-assignments-portal.md).
+
+ * Data Curator role to access Catalog Data plane.
+ * Data Source Administrator role to access Scanning Data plane.
+ * Collection Admin role to access Account Data Plane and Metadata policy Data Plane.
+
+ > [!Note]
+ > Only members of the Collection Admin role can assign data plane roles in Azure Purview. For more information about Azure Purview roles, see [Access Control in Azure Purview](./catalog-permissions.md). A hedged CLI sketch of these role assignments follows this note.
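If you prefer scripting, the same assignments can be sketched with the Azure CLI. This is a sketch under assumptions: the role names below are assumed to match what your portal's role list shows, and the assignee and scope values are placeholders.

```azurecli
# Assign Azure Purview data plane roles to the service principal at the
# scope of the Purview account. Role names are assumptions; confirm the
# exact names shown in your portal's role list before running.
az role assignment create --assignee <service-principal-app-id> \
    --role "Purview Data Curator" \
    --scope <purview-account-resource-id>

az role assignment create --assignee <service-principal-app-id> \
    --role "Purview Data Source Administrator" \
    --scope <purview-account-resource-id>
```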
## Get token

You can send a POST request to the following URL to get an access token. (A CLI alternative is sketched after the parameter list below.)

`https://login.microsoftonline.com/{your-tenant-id}/oauth2/token`
-The following parameters needs to be passed to the above URL.
+The following parameters need to be passed to the above URL.
- **client_id**: client ID of the application registered in Azure Active Directory and assigned to a data plane role for the Azure Purview account.
- **client_secret**: client secret created for the above application.
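As an alternative for quick experiments, the Azure CLI can mint a token for your own signed-in identity (not the service principal client-credentials flow described above). A minimal sketch, where `https://purview.azure.net` is the standard Azure Purview resource URI:

```azurecli
# Print a bearer token for the Azure Purview data plane, issued to the
# identity currently signed in to the Azure CLI.
az account get-access-token \
    --resource https://purview.azure.net \
    --query accessToken \
    --output tsv
```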
role-based-access-control Resource Provider Operations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/role-based-access-control/resource-provider-operations.md
Azure service: [Azure Maps](../azure-maps/index.yml)
### Microsoft.Media
-Azure service: [Media Services](../media-services/index.yml)
+Azure service: [Media Services](/media-services/)
> [!div class="mx-tableFixed"]
> | Action | Description |
Azure service: [Azure Chaos Studio](../chaos-studio/index.yml)
> | | |
> | Microsoft.Chaos/register/action | Registers the subscription for the Chaos Resource Provider and enables the creation of Chaos resources. |
> | Microsoft.Chaos/unregister/action | Unregisters the subscription for the Chaos Resource Provider and enables the creation of Chaos resources. |
-> | Microsoft.Chaos/artifactSetDefinitions/write | Creates an Artifact Set Definition which describes the set of artifact to capture for a given Chaos Experiment. |
-> | Microsoft.Chaos/artifactSetDefinitions/read | Gets all Artifact Set Definitions that extend a Chaos Experiment resource. |
-> | Microsoft.Chaos/artifactSetDefinitions/delete | Deletes all Artifact Set Definitions that extend a Chaos Experiment resource. |
-> | Microsoft.Chaos/artifactSetSnapshots/read | Gets all Artifact Set Snapshots that extend a Chaos Experiment resource. |
-> | Microsoft.Chaos/artifactSetSnapshots/artifactSnapshots/read | Gets all Artifact Snapshots that extend a Artifact Set Snapshot. |
> | Microsoft.Chaos/experiments/write | Creates or updates a Chaos Experiment resource in a resource group. |
> | Microsoft.Chaos/experiments/delete | Deletes a Chaos Experiment resource in a resource group. |
> | Microsoft.Chaos/experiments/read | Gets all Chaos Experiments in a resource group. |
search Search Howto Managed Identities Data Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-howto-managed-identities-data-sources.md
Cognitive Search can use a system-assigned or user-assigned managed identity on
| [Knowledge Store (hosted in Azure Storage)](knowledge-store-create-rest.md) | Yes <sup>2</sup>| Yes |
| [Custom skills (hosted in Azure Functions or equivalent)](cognitive-search-custom-skill-interface.md) | Yes | Yes |
-<sup>1</sup> The Import data wizard doesn't currently accept a managed identity connection string for incremental enrichment, but after the wizard completes, you can update the connection string in indexer JSON definition to specify the managed identity, and then rerun the indexer.
+<sup>1</sup> The Import data wizard doesn't currently accept a managed identity connection string for the enrichment cache, but after the wizard completes, you can update the connection string in the indexer JSON definition to specify the managed identity, and then rerun the indexer.
<sup>2</sup> If your indexer has an attached skillset that writes back to Azure Storage (for example, it creates a knowledge store or caches enriched content), a managed identity won't work if the storage account is behind a firewall or has IP restrictions. This is a known limitation that will be lifted when managed identity support for skillset scenarios becomes generally available. The solution is to use a full access connection string instead of a managed identity if Azure Storage is behind a firewall.
search Search Security Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-security-overview.md
Independent of network security, all inbound requests must be authenticated. Key
### Outbound traffic
-Outbound requests from a search service to other applications are typically made by indexers for text-based indexing and some aspects of AI enrichment. Outbound requests include both read and write operations:
+Outbound requests from a search service to other applications are typically made by indexers for text-based indexing and some aspects of AI enrichment. Outbound requests include both read and write operations. Outbound requests are made by the search service on its own behalf, and on behalf of an indexer or skillset.
-+ Search, on behalf of an indexer, connects to external data sources to read in data for indexing.
-+ Search, on behalf of an indexer, writes to Azure Storage when creating knowledge stores, persisting cached enrichments, and persisting debug sessions.
++ Indexer connects to external data sources to read in data for indexing.
++ Indexer writes to Azure Storage when creating knowledge stores, persisting cached enrichments, and persisting debug sessions.
+ A custom skill connects to an Azure function or app to run external code that's hosted off-service. The request for external processing is sent during skillset execution.
+ Search connects to Azure Key Vault for a customer-managed key used to encrypt and decrypt sensitive data.
-Outbound connections can be made using a resource's full access connection string that includes a key or a database login, or a managed identity if you're using Azure Active Directory.
+Outbound connections can be made using a resource's full access connection string that includes a key or a database login, or an Azure AD login ([a managed identity](search-howto-managed-identities-data-sources.md)) if you're using Azure Active Directory.
-If your Azure resource is behind a firewall, you'll need to create rules that admit search service requests. For resources protected by Azure Private Link, you can create a shared private link that an indexer uses to make its connection.
+If your Azure resource is behind a firewall, you'll need to [create rules that admit search service requests](search-indexer-howto-access-ip-restricted.md). For resources protected by Azure Private Link, you can [create a shared private link](search-indexer-howto-access-private.md) that an indexer uses to make its connection.
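For instance, a shared private link from the search service to a storage account can be sketched with the Azure CLI. Treat this as a sketch under assumptions: resource names are placeholders, and `blob` is the group ID for the Blob Storage sub-resource.

```azurecli
# Create a shared private link so the indexer can reach the storage
# account over Azure Private Link. All names and IDs are placeholders.
az search shared-private-link-resource create \
    --resource-group <rg> \
    --service-name <search-service> \
    --name <shared-link-name> \
    --group-id blob \
    --resource-id "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>" \
    --request-message "Requested by search indexer"
```

The target resource owner still has to approve the connection before the indexer can use it.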
### Internal traffic
Once a request is admitted, it must still undergo authentication and authorizati
+ [Key-based authentication](search-security-api-keys.md) is performed on the request (not the calling app or user) through an API key, where the key is a string composed of randomly generated numbers and letters that prove the request is from a trustworthy source. Keys are required on every request. Submission of a valid key is considered proof the request originates from a trusted entity.
-+ [Azure AD authentication (preview)](search-security-rbac.md) establishes the caller (and not the request) as the authenticated identity. An Azure role assignment determines the allowed operation.
++ [Azure AD authentication (preview)](search-security-rbac.md) establishes the caller (and not the request) as the authenticated identity. An Azure role assignment determines the allowed operation.

Outbound requests made by an indexer are subject to the authentication protocols supported by the external service. A search service can be made a trusted service on Azure, connecting to other services using a system or user managed identity. For more information, see [Set up an indexer connection to a data source using a managed identity](search-howto-managed-identities-data-sources.md).
service-fabric Service Fabric Content Roadmap https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-fabric/service-fabric-content-roadmap.md
Azure Service Fabric is a distributed systems platform that makes it easy to pac
## Core concepts

[Service Fabric terminology](service-fabric-technical-overview.md), [Application model](service-fabric-application-model.md), and [Supported programming models](service-fabric-choose-framework.md) provide more concepts and descriptions, but here are the basics.
+* <b>Service Fabric Cluster:</b> [Check this link for a training video to get an introduction to Service Fabric architecture and its core concepts and explore many Service Fabric features.](/shows/building-microservices-applications-on-azure-service-fabric/what-is-a-service-fabric-cluster)
+* <b>Runtime Concepts:</b> [Check this link for a training video to understand runtime concepts and best practices of Service Fabric.](/shows/building-microservices-applications-on-azure-service-fabric/run-time-concepts)
+* <b>Design Type Concepts:</b> [Check this link for a training video to understand application, packaging, and deployment; key Service Fabric terminology, abstractions, and concepts.](/shows/building-microservices-applications-on-azure-service-fabric/design-time-concepts)
++
### Design time: service type, service package and manifest, application type, application package and manifest

A service type is the name/version assigned to a service's code packages, data packages, and configuration packages. This is defined in a ServiceManifest.xml file. The service type is composed of executable code and service configuration settings, which are loaded at run time, and static data that is consumed by the service.
As with other platforms, an application on Service Fabric usually goes through t
The entire app lifecycle can be managed using [PowerShell cmdlets](/powershell/module/servicefabric/?preserve-view=true&view=azureservicefabricps), [CLI commands](service-fabric-sfctl.md), [C# APIs](/dotnet/api/system.fabric.fabricclient.applicationmanagementclient), [Java APIs](/jav) or [Jenkins](/azure/developer/jenkins/deploy-to-service-fabric-cluster).
+[Check this link for a training video that describes how to manage your application lifecycle:](/shows/building-microservices-applications-on-azure-service-fabric/application-lifetime-management-in-action)
## Test applications and services

To create truly cloud-scale services, it is critical to verify that your applications and services can withstand real-world failures. The Fault Analysis Service is designed for testing services that are built on Service Fabric. With the [Fault Analysis Service](service-fabric-testability-overview.md), you can induce meaningful faults and run complete test scenarios against your applications. These faults and scenarios exercise and validate the numerous states and transitions that a service will experience throughout its lifetime, all in a controlled, safe, and consistent manner.
A [Service Fabric cluster](service-fabric-deploy-anywhere.md) is a network-conne
Service Fabric clusters can be created on virtual or physical machines running Windows Server or Linux. You are able to deploy and run Service Fabric applications in any environment where you have a set of Windows Server or Linux computers that are interconnected: on-premises, on Microsoft Azure, or on any cloud provider.
+[Check this link for a training video that describes Service Fabric architecture, its core concepts, and many other Service Fabric features](/shows/building-microservices-applications-on-azure-service-fabric/what-is-a-service-fabric-cluster)
### Clusters on Azure

Running Service Fabric clusters on Azure provides integration with other Azure features and services, which makes operations and management of the cluster easier and more reliable. A cluster is an Azure Resource Manager resource, so you can model clusters like any other resources in Azure. Resource Manager also provides easy management of all resources used by the cluster as a single unit. Clusters on Azure are integrated with Azure diagnostics and Azure Monitor logs. Cluster node types are [virtual machine scale sets](../virtual-machine-scale-sets/index.yml), so autoscaling functionality is built in.
Service Fabric provides multiple ways to [view health reports](service-fabric-vi
* Health queries (through [PowerShell](/powershell/module/servicefabric/?preserve-view=true&view=azureservicefabricps), [CLI](service-fabric-sfctl.md), the [C# FabricClient APIs](/dotnet/api/system.fabric.fabricclient.healthclient) and [Java FabricClient APIs](/java/api/system.fabric), or [REST APIs](/rest/api/servicefabric)).
* General queries that return a list of entities that have health as one of the properties (through PowerShell, CLI, the APIs, or REST).
+[Check this page for a training video that describes the Service Fabric health model and how it's used:](/shows/building-microservices-applications-on-azure-service-fabric/service-fabric-health-system)
## Monitoring and diagnostics

[Monitoring and diagnostics](service-fabric-diagnostics-overview.md) are critical to developing, testing, and deploying applications and services in any environment. Service Fabric solutions work best when you plan and implement monitoring and diagnostics that help ensure applications and services are working as expected in a local development environment or in production.
site-recovery Hyper V Azure Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/site-recovery/hyper-v-azure-tutorial.md
In this tutorial, you learn how to:
This is the third tutorial in a series. It assumes that you have already completed the tasks in the previous tutorials:
-1. [Prepare Azure](tutorial-prepare-azure.md)
+1. [Prepare Azure](https://docs.microsoft.com/azure/site-recovery/tutorial-prepare-azure-for-hyperv)
2. [Prepare on-premises Hyper-V](./hyper-v-prepare-on-premises-tutorial.md)

## Select a replication goal
spring-cloud Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spring-cloud/quickstart.md
In order to deploy to Azure, you must sign in with your Azure account, then choo
1. Start the deployment by selecting the **Run** button at the bottom of the **Deploy Azure Spring Cloud app** dialog. The plug-in will run the command `mvn package -DskipTests` on the `hellospring` app and deploy the jar generated by the `package` command.
+#### [Visual Studio Code](#tab/VS-Code)
+To deploy a simple Spring Boot web app to Azure Spring Cloud, follow the steps in [Build and Deploy Java Spring Boot Apps to Azure Spring Cloud with Visual Studio Code](https://code.visualstudio.com/docs/java/java-spring-cloud#_download-and-test-the-spring-boot-app).
+ Once deployment has completed, you can access the app at `https://<service instance name>-hellospring.azuremicroservices.io/`.
Logs appear in the results:
[![Streaming log output](media/spring-cloud-quickstart-java/intellij-streaming-logs-output.png)](media/spring-cloud-quickstart-java/intellij-streaming-logs-output.png)
+#### [Visual Studio Code](#tab/VS-Code)
+
+To get real-time application logs with Visual Studio Code, follow the steps in [Stream your application logs](https://code.visualstudio.com/docs/java/java-spring-cloud#_stream-your-application-logs).
+ For advanced logs analytics features, visit the **Logs** tab in the menu on the [Azure portal](https://portal.azure.com/). Logs here have a latency of a few minutes.
static-web-apps Private Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/static-web-apps/private-endpoint.md
Last updated 7/28/2021
You can use a private endpoint (also called private link) to restrict access to your static web app so that it is only accessible from your private network.
-> [!NOTE]
-> Private endpoints support in Static Web Apps is currently in preview.
- ## How it works An Azure Virtual Network (VNet) is a network just like you might have in a traditional data center, but resources within the VNet talk to each other securely on the Microsoft backbone network.
Configuring Static Web Apps with a private endpoint allows you to use a private
> [!NOTE] > Placing your application behind a private endpoint means your app is only available in the region where your VNet is located. As a result, your application is no longer available across multiple points of presence.
-> [!WARNING]
-> Currently, private endpoints only secure your production environment. Support for staging environments will be added in an upcoming service update.
+If your app has a private endpoint enabled, the server will respond with a `403` status code if the request comes from a public IP address. This behavior applies to both the production environment and any staging environments. The only way to reach the app is to use the private endpoint deployed within your VNet.
+
+The default DNS resolution of the static web app still exists and routes to a public IP address. The private endpoint will expose two IP addresses within your VNet, one for the production environment and one for any staging environments. To ensure your client is able to reach the app correctly, make sure your client resolves the hostname of the app to the appropriate IP address of the private endpoint. This is required for the default hostname as well as any custom domains configured for the static web app. This resolution is done automatically if you select a private DNS zone when creating the private endpoint (see example below) and is the recommended solution.
+
+If you are connecting from on-premises or do not wish to use a private DNS zone, manually configure the DNS records for your application so that requests are routed to the appropriate IP address of the private endpoint. For more information on private endpoint DNS resolution, see [Azure Private Endpoint DNS configuration](https://docs.microsoft.com/azure/private-link/private-endpoint-dns).
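For reference, creating the private endpoint itself can be sketched with the Azure CLI. This is a sketch under assumptions: all names are placeholders, and `staticSites` is assumed as the group ID based on the Static Web Apps resource type (`Microsoft.Web/staticSites`).

```azurecli
# Create a private endpoint for the static web app inside your VNet.
# All resource names and IDs below are placeholders.
az network private-endpoint create \
    --resource-group <rg> \
    --name <private-endpoint-name> \
    --vnet-name <vnet-name> \
    --subnet <subnet-name> \
    --private-connection-resource-id "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Web/staticSites/<static-web-app>" \
    --group-id staticSites \
    --connection-name <connection-name>
```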
## Prerequisites
static-web-apps Publish Hugo https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/static-web-apps/publish-hugo.md
The following steps show you how to create a new static site app and deploy it t
1. Select the **Review + Create** button to verify the details are all correct.
-1. Select **Create** to start the creation of the App Service Static Web App and provision a GitHub Action for deployment.
+1. Select **Create** to start the creation of the App Service Static Web App and provision a GitHub Actions workflow for deployment.
1. Once the deployment completes click, **Go to resource**.
-1. On the resource screen, click the _URL_ link to open your deployed application. You may need to wait a minute or two for the GitHub Action to complete.
+1. On the resource screen, click the _URL_ link to open your deployed application. You may need to wait a minute or two for the GitHub Actions workflow to complete.
:::image type="content" source="./media/publish-hugo/deployed-app.png" alt-text="Deployed application":::
storage Secure File Transfer Protocol Known Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/secure-file-transfer-protocol-known-issues.md
For performance issues and considerations, see [SSH File Transfer Protocol (SFTP
- Special containers such as $logs, $blobchangefeed, $root, $web are not accessible via the SFTP endpoint.

-- When using custom domains the connection string is `<accountName>.<userName>@customdomain.com`. If home directory has not been specified for the user, it is `<accountName>.<containerName>.<userName>@customdomain.com`.
-
- Symbolic links are not supported.

- `ssh-keyscan` is not supported.
storage Secure File Transfer Protocol Support How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/secure-file-transfer-protocol-support-how-to.md
After the transfer is complete, you can view and manage the file in the Azure po
See the documentation of your SFTP client for guidance about how to connect and transfer files.
+## Connect using a custom domain
+
+When using custom domains, the connection string is `myaccount.myuser@customdomain.com`. If a home directory has not been specified for the user, it is `myaccount.mycontainer.myuser@customdomain.com`.
+
+> [!IMPORTANT]
+> Ensure your DNS provider does not proxy requests. Proxying may cause the connection attempt to time out.
+ ## See also - [SSH File Transfer Protocol (SFTP) support for Azure Blob Storage](secure-file-transfer-protocol-support.md)
storage Storage Blob Change Feed https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-change-feed.md
description: Learn about change feed logs in Azure Blob Storage and how to use t
Previously updated : 03/10/2022 Last updated : 03/29/2022
The following example shows a change event record in JSON format that uses event
}, "asyncOperationInfo": { "DestinationTier": "Hot",
- "WasAsyncOperation": true,
+ "WasAsyncOperation": "true",
"CopyId": "copyId" }, "storageDiagnostics": {
The following example shows a change event record in JSON format that uses event
}, "asyncOperationInfo": { "DestinationTier": "Hot",
- "WasAsyncOperation": true,
+ "WasAsyncOperation": "true",
"CopyId": "copyId" }, "blobTagsUpdated": {
storage Storage Blob Dotnet Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-dotnet-get-started.md
# Get started with Azure Blob Storage and .NET
-This article shows you to connect to Azure Blob Storage by using the Azure Blob Storage client library v12 for .NET. Once connected, your code can operate on containers, blobs, and features of the Blob Storage service.
+This article shows you how to connect to Azure Blob Storage by using the Azure Blob Storage client library v12 for .NET. Once connected, your code can operate on containers, blobs, and features of the Blob Storage service.
[Package (NuGet)](https://www.nuget.org/packages/Azure.Storage.Blobs) | [Samples](../common/storage-samples-dotnet.md?toc=%2fazure%2fstorage%2fblobs%2ftoc.json#blob-samples) | [API reference](/dotnet/api/azure.storage.blobs) | [Library source code](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/storage/Azure.Storage.Blobs) | [Give Feedback](https://github.com/Azure/azure-sdk-for-net/issues)
storage File Sync How To Manage Tiered Files https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/file-sync/file-sync-how-to-manage-tiered-files.md
The easiest way to recall a file to disk is to open the file. The Azure File Syn
> [!NOTE]
> If a shortcut file is brought down to the server as a tiered file, there may be an issue when accessing the file over SMB. To mitigate this, there is a task that runs every three days that will recall any shortcut files. However, if you would like shortcut files that are tiered to be recalled more frequently, create a scheduled task that runs this at the desired frequency:
> ```powershell
-> Import-Module "C:\Program Files\Azure\StorageSyncAgent\StorageSync.Management.PowerShell.Cmdlets.dll"
+> Import-Module "C:\Program Files\Azure\StorageSyncAgent\StorageSync.Management.ServerCmdlets.dll"
> Invoke-StorageSyncFileRecall -Path <path-to-your-server-endpoint> -Pattern *.lnk
> ```
synapse-analytics Connectivity Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/security/connectivity-settings.md
description: An article that teaches you to configure connectivity settings in A
Previously updated : 12/03/2021 -- Last updated : 03/28/2022 ++
Selecting the **Disable** option will not apply any firewall rules that you may
The connection policy for Synapse SQL in Azure Synapse Analytics is set to *Default*. You cannot change this in Azure Synapse Analytics. You can learn more about how that affects connections to Synapse SQL in Azure Synapse Analytics [here](../../azure-sql/database/connectivity-architecture.md#connection-policy).

## Minimal TLS version
-Synapse SQL in Azure Synapse Analytics allows connections using all TLS versions. You cannot set the minimal TLS version for Synapse SQL in Azure Synapse Analytics.
+While the serverless SQL endpoint and development endpoint accept only TLS 1.2 and above, the workspace managed SQL server dedicated SQL endpoint allows connections using all TLS versions. Starting in December 2021, a TLS 1.2 requirement has been enforced for new Synapse workspaces only. Login attempts to a newly created Synapse workspace from connections using a TLS version lower than 1.2 will fail.
-Starting in December 2021, a requirement for TLS 1.2 has been implemented for new Synapse Workspaces only. Login attempts to the newly created Synapse workspace from connections using a TLS version lower than 1.2 will fail.
+Customers can now raise or lower the minimal TLS version of the workspace managed SQL server dedicated SQL endpoint using the API, for both new and existing Synapse workspaces. Users whose clients require a lower TLS version can now lower the setting on new workspaces (which otherwise enforce a minimum of TLS 1.2), and customers with existing workspaces can raise the minimal TLS version to meet their security needs. Learn more by reading the [minTLS REST API](/rest/api/synapse/sqlserver/workspace-managed-sql-server-dedicated-sql-minimal-tls-settings/update).
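As a sketch of what such a call might look like with `Invoke-AzRestMethod`, assuming the resource path and `api-version` from the linked REST reference, with placeholder subscription, resource group, and workspace names:

```powershell
# Hedged sketch: set the dedicated SQL minimal TLS version for a Synapse workspace.
# The path segments and api-version are assumptions based on the linked REST reference.
$path = "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
        "/providers/Microsoft.Synapse/workspaces/<workspace-name>" +
        "/dedicatedSQLminimalTlsSettings/default?api-version=2021-06-01"
Invoke-AzRestMethod -Method PUT -Path $path -Payload '{"properties":{"minimalTlsVersion":"1.2"}}'
```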
## Next steps
synapse-analytics Synapse Workspace Understand What Role You Need https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/security/synapse-workspace-understand-what-role-you-need.md
Previously updated : 3/07/2022 Last updated : 03/23/2022
This article will help you understand which Synapse RBAC (role-based access cont
### Access Synapse Studio
-You can open Synapse Studio and view details of the workspace and list any of its Azure resources (SQL pools, Spark pools, or Integration runtimes) if you've been assigned any Synapse RBAC role or have the Azure Owner, Contributor, or Reader role on the workspace.
+You can open Synapse Studio and view details of the workspace and list any of its Azure resources, such as SQL pools, Spark pools, or Integration runtimes, if you've been assigned any Synapse RBAC role or have the Azure Owner, Contributor, or Reader role on the workspace.
### Resource management

You can create SQL pools, Data Explorer pools, Apache Spark pools, and Integration runtimes if you're an Azure Owner or Contributor on the workspace. When using ARM templates for automated deployment, you need to be an Azure Contributor on the resource group.
-You can pause or scale a dedicated SQL pool, configure a Spark pool or an integration runtime if you're an Azure Owner or Contributor on the workspace or that resource.
+You can pause or scale a dedicated SQL pool, or configure a Spark pool or an integration runtime, if you're an Azure Owner or Contributor on the workspace or that resource.
### View and edit code artifacts
-With access to Synapse Studio, you can create new code artifacts, such as SQL scripts, KQL scripts, notebooks, spark jobs, linked services, pipelines, dataflows, triggers, and credentials. (These artifacts can be published or saved with additional permissions.)
+With access to Synapse Studio, you can create new code artifacts, such as SQL scripts, KQL scripts, notebooks, spark jobs, linked services, pipelines, dataflows, triggers, and credentials. These artifacts can be published or saved with additional permissions.
If you're a Synapse Artifact User, Synapse Artifact Publisher, Synapse Contributor, or Synapse Administrator you can list, open, and edit already published code artifacts.
You can commit code artifacts to a working branch of a Git repository if the wor
If you close Synapse Studio without publishing or committing changes to code artifacts, then those changes will be lost.

## Tasks and required roles

The table below lists common tasks and, for each task, the Synapse RBAC or Azure RBAC roles required.
All Synapse RBAC permissions/actions shown in the table are prefixed `Microsoft/
Task (I want to...) |Role (I need to be...)|Synapse RBAC permission/action
--|--|--
-|Open Synapse Studio on a workspace|Synapse User, or|read
+|Open Synapse Studio on a workspace|Synapse User, or |read
| |Azure Owner, Contributor, or Reader on the workspace|none
|List SQL pools, Data Explorer pools, Apache Spark pools, Integration runtimes and access their configuration details|Synapse User, or|read|
||Azure Owner, Contributor, or Reader on the workspace|none
virtual-desktop Create Host Pools Azure Marketplace https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/create-host-pools-azure-marketplace.md
After that, you're all done!
If you'd rather use an automated process, [download our Azure Resource Manager template](https://github.com/Azure/RDS-Templates/tree/master/ARM-wvd-templates) to provision your new host pool instead.

>[!NOTE]
->If you're using an automated process to build your environment, you'll need the latest version of the configuration JSON file. You can find the JSON file [here](https://wvdportalstorageblob.blob.core.windows.net/galleryartifacts?restype=container&comp=list).
+>If you're using an automated process to build your environment, you'll need the latest version of the configuration JSON file.
## Next steps
virtual-desktop Multimedia Redirection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/multimedia-redirection.md
Title: Multimedia redirection on Azure Virtual Desktop - Azure
description: How to use multimedia redirection for Azure Virtual Desktop (preview). Previously updated : 03/28/2022 Last updated : 03/29/2022
The following list shows websites that are known to work with MMR. MMR is suppos
- Sites with embedded YouTube videos, such as Medium, Udacity, Los Angeles Times, and so on.
- Teams Live Events (on web)
  - Currently, Teams live events aren't media-optimized for Azure Virtual Desktop and Windows 365. MMR is a short-term workaround for a smoother Teams live events playback on Azure Virtual Desktop.
+ - MMR supports Enterprise Content Delivery Network (ECDN) for Teams live events.
### How to use MMR for Teams live events
virtual-desktop Whats New Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/whats-new-agent.md
+
+ Title: What's new in the Azure Virtual Desktop Agent? - Azure
+description: New features and product updates for the Azure Virtual Desktop Agent.
++ Last updated : 03/28/2022++++
+# What's new in the Azure Virtual Desktop Agent?
+
+The Azure Virtual Desktop Agent updates regularly. This article is where you'll find out about:
+
+- The latest updates
+- New features
+- Improvements to existing features
+- Bug fixes
+
+Make sure to check back here often to keep up with new updates.
+
+## Version 1.0.4119.1500
+
+This update was released in February 2022 and includes the following changes:
+
+- Fixes an issue with arithmetic overflow casting exceptions.
+- Updated the agent to now start the Azure Instance Metadata Service (IMDS) when the agent starts.
+- Fixes an issue that caused Sandero named pipe service startups to be slow when the VM has no registration information.
+- General bug fixes and agent improvements.
+
+## Version 1.0.4009.1500
+
+This update was released in January 2022 and includes the following changes:
+
+- Added logging to better capture agent update telemetry.
+- Updated the agent's Azure Instance Metadata Service health check to be Azure Stack HCI-friendly.
+
+## Version 1.0.3855.1400
+
+This update was released December 2021 and has the following changes:
+
+- Fixes an issue that caused an unhandled exception.
+- This version now supports Azure Stack HCI by retrieving VM metadata from the Azure Arc service.
+- This version now allows a built-in stack to be automatically updated if its version number is beneath a certain threshold.
+- The UrlsAccessibleCheck health check now only gets the URL up to the path delimiter to prevent 404 errors.
+
+## Version 1.0.3719.1700
+
+This update was released November 2021 and has the following changes:
+
+- Updated agent error messages.
+- Fixes an issue with the agent restarting every time the side-by-side stack was updated.
+- General agent improvements.
+
+## Version 1.0.3583.2600
+
+This update was released October 2021 and it fixes an issue where upgrading from Windows 10 to Windows 11 disabled the side-by-side stack.
+
+## Version 1.0.3373.2605
+
+This update was released September 2021 and it fixes an issue with package deregistration getting stuck when using MSIX App Attach.
+
+## Version 1.0.3373.2600
+
+This update was released September 2021 and has the following changes:
+
+- General agent improvements.
+- Fixes issues with restarting the agent on Windows 7 VMs.
+- Fixes an issue with fields in the WVDAgentHealthStatus table not showing up correctly.
+
+## Version 1.0.3130.2900
+
+This update was released July 2021 and has the following changes:
+
+- General improvements and bug fixes.
+- Fixes an issue with getting the host pool path for Intune registration.
+- Added logging to better diagnose agent issues.
+- Fixes an issue with orchestration timeouts.
+
+## Version 1.0.3050.2500
+
+This update was released July 2021 and has the following changes:
+
+- Updated internal monitors for agent health.
+- Updated retry logic for stack health.
+
+## Version 1.0.2990.1500
+
+This update was released April 2021 and has the following changes:
+
+- Updated agent error messages.
+- Added an exception that prevents you from installing non-Windows 7 agents on Windows 7 VMs.
+- Updated heartbeat service logic.
+
+## Version 1.0.2944.1400
+
+This update was released April 2021 and has the following changes:
+
+- Placed links to the Azure Virtual Desktop Agent troubleshooting guide in the event viewer logs for agent errors.
+- Added an additional exception for better error handling.
+- Added the WVDAgentUrlTool.exe that allows customers to check which required URLs they can access.
+
+## Version 1.0.2866.1500
+
+This update was released March 2021 and it fixes an issue with the stack health check.
+
+## Version 1.0.2800.2802
+
+This update was released March 2021 and it has general improvements and bug fixes.
+
+## Version 1.0.2800.2800
+
+This update was released March 2021 and it fixes a reverse connection issue.
+
+## Version 1.0.2800.2700
+
+This update was released February 2021 and it fixes an access denied orchestration issue.
virtual-desktop Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/whats-new.md
Azure Virtual Desktop updates regularly. This article is where you'll find out a
- Improvements to existing features - Bug fixes
-This article is updated monthly. Make sure to check back here often to keep up with new updates.
-
-## Client updates
-
-Check out these articles to learn about updates for our clients for Azure Virtual Desktop and Remote Desktop
--- [Windows](/windows-server/remote/remote-desktop-services/clients/windowsdesktop-whatsnew)-- [macOS](/windows-server/remote/remote-desktop-services/clients/mac-whatsnew)-- [iOS](/windows-server/remote/remote-desktop-services/clients/ios-whatsnew)-- [Android](/windows-server/remote/remote-desktop-services/clients/android-whatsnew)-- [Web](/windows-server/remote/remote-desktop-services/clients/web-client-whatsnew)-
-## Azure Virtual Desktop Agent updates
-
-The Azure Virtual Desktop agent updates at least once per month.
-
-Here's what's changed in the Azure Virtual Desktop Agent:
--- Version 1.0.4119.1500: This update was released in February 2022 and includes the following changes:
- - Fixes an issue with arithmetic overflow casting exceptions.
- - Updated the agent to now start the Azure Instance Metadata Service (IMDS) when the agent starts.
- - Fixes an issue that caused Sandero name pipe service start ups to be slow when the VM has no registration information.
- - General bug fixes and agent improvements.
-- Version 1.0.4009.1500: This update was released in January 2022 and includes the following changes:
- - Added logging to better capture agent update telemetry.
- - Updated the agent's Azure Instance Metadata Service health check to be Azure Stack HCI-friendly
-- Version 1.0.3855.1400: This update was released December 2021 and has the following changes:
- - Fixes an issue that caused an unhandled exception.
- - This version now supports Azure Stack HCI by retrieving VM metadata from the Azure Arc service.
- - This version now allows built-in stacks to be automatically updated if its version number is beneath a certain threshold.
- - The UrlsAccessibleCheck health check now only gets the URL until the path delimiter to prevent 404 errors.
-- Version 1.0.3719.1700: This update was released November 2021 and has the following changes:
- - Updated agent error messages.
- - Fixes an issue with the agent restarting every time the side-by-side stack was updated.
- - General agent improvements.
-- Version 1.0.3583.2600: This update was released October 2021 and it fixes an issue where upgrading from Windows 10 to Windows 11 disabled the side-by-side stack.-- Version 1.0.3373.2605: This update was released September 2021 and it fixes an issue with package deregistration getting stuck when using MSIX App Attach.-- Version 1.0.3373.2600: This update was released September 2021 and has the following changes:
- - General agent improvements.
- - Fixes issues with restarting the agent on Windows 7 VMs.
- - Fixes an issue with fields in the WVDAgentHealthStatus table not showing up correctly.
-- Version 1.0.3130.2900: This update was released July 2021 and has the following changes:
- - General improvements and bug fixes.
- - Fixes an issue with getting the host pool path for Intune registration.
- - Added logging to better diagnose agent issues.
- - Fixes an issue with orchestration timeouts.
-- Version 1.0.3050.2500: This update was released July 2021 and has the following changes:
- - Updated internal monitors for agent health.
- - Updated retry logic for stack health.
-- Version 1.0.2990.1500: This update was released April 2021 and has the following changes:
- - Updated agent error messages.
- - Added an exception that prevents you from installing non-Windows 7 agents on Windows 7 VMs.
- - Has updated heartbeat service logic.
-- Version 1.0.2944.1400: This update was released April 2021 and has the following changes:
- - Placed links to the Azure Virtual Desktop Agent troubleshooting guide in the event viewer logs for agent errors.
- - Added an additional exception for better error handling.
- - Added the WVDAgentUrlTool.exe that allows customers to check which required URLs they can access.
-- Version 1.0.2866.1500: This update was released March 2021 and it fixes an issue with the stack health check.-- Version 1.0.2800.2802: This update was released March 2021 and it has general improvements and bug fixes.-- Version 1.0.2800.2800: This update was released March 2021 and it fixes a reverse connection issue.-- Version 1.0.2800.2700: This update was released February 2021 and it fixes an access denied orchestration issue.-
-## FSLogix updates
-
-Curious about the latest updates for FSLogix? Check out [What's new at FSLogix](/fslogix/whats-new).
+Make sure to check back here often to keep up with new updates.
## February 2022
On September 30th, 2021, the Azure Virtual Desktop web client will no longer sup
We've started the public preview for Microsoft Endpoint Manager support in Windows 10 Enterprise multi-session. This new feature will let you manage your Windows 10 VMs with the same tools as your local devices. Learn more at our [Microsoft Endpoint Manager documentation](/mem/intune/fundamentals/windows-virtual-desktop-multi-session).
-### FSLogix agent public preview
+### FSLogix version 2105 public preview
We have released a public preview of the latest version of the FSLogix agent. Check out our [blog post](https://techcommunity.microsoft.com/t5/windows-virtual-desktop/public-preview-fslogix-release-2105-is-now-available-in-public/m-p/2380996/thread-id/7105) for more information and to submit the form you'll need to access the preview.
The Azure Marketplace now has Generation 2 images for Windows 10 Enterprise and
### FSLogix is now preinstalled on Windows 10 Enterprise multi-session images
-Based on customer feedback, we've set up a new version of the Windows 10 Enterprise multi-session image that has an unconfigured version of FSLogix already installed. We hope this makes your Azure Virtual Desktop deployment easier.
+Based on customer feedback, we've released a new version of the Windows 10 Enterprise multi-session image that has an unconfigured version of FSLogix already installed. We hope this makes your Azure Virtual Desktop deployment easier.
### Azure Monitor for Azure Virtual Desktop is now in General Availability
We've added new built-in roles for Azure Virtual Desktop for admin permissions.
We've increased the default application group limit per Azure Active Directory tenant to 200 groups.
-### Client updates for December 2020
-
-We've released new versions of the following clients:
--- Android-- macOS-- Windows-
-For more information about client updates, see [Client updates](whats-new.md#client-updates).
-
## November 2020

### Azure portal experience
Here's what changed in October 2020:
### Improved performance

-- We've optimized performance by reducing connection latency in the following Azure geographies:
- - Switzerland
- - Canada
+We've optimized performance by reducing connection latency in the following Azure geographies:
+
+- Switzerland
+- Canada
You can now use the [Experience Estimator](https://azure.microsoft.com/services/virtual-desktop/assessment/) to estimate the user experience quality in these areas.
We've made some updates to the Azure Virtual Desktop Azure portal:
- Fixed an issue where the "requires command line" text didn't display correctly in the "Application list" tab.
- Fixed an issue when the portal couldn't deploy host pools or virtual machines while using the German-language version of the Shared Image Gallery.
-### Client updates for October 2020
-
-We've released new versions of the clients. See these articles to learn more:
--- [Windows](/windows-server/remote/remote-desktop-services/clients/windowsdesktop-whatsnew)-- [iOS](/windows-server/remote/remote-desktop-services/clients/ios-whatsnew)-
-For more information about the other clients, see [Client updates](#client-updates).
-
## September 2020

Here's what changed in September 2020:
July was when Azure Virtual Desktop with Azure Resource Management integration b
Here's what changed with this new release: -- The "Fall 2019 release" is now known as "Azure Virtual Desktop (Classic)," while the "Spring 2020 release" is now just "Azure Virtual Desktop." For more information, check out [this blog post](https://azure.microsoft.com/blog/new-windows-virtual-desktop-capabilities-now-generally-available/).
+- The "Fall 2019 release" is now known as "Azure Virtual Desktop (classic)," while the "Spring 2020 release" is now just "Azure Virtual Desktop." For more information, check out [this blog post](https://azure.microsoft.com/blog/new-windows-virtual-desktop-capabilities-now-generally-available/).
To learn more about new features, check out [this blog post](https://techcommunity.microsoft.com/t5/itops-talk-blog/windows-virtual-desktop-spring-update-enters-public-preview/ba-p/1340245).
We've added a new gateway cluster in South Africa to reduce connection latency.
We've made some improvements to Microsoft Teams for Azure Virtual Desktop. Most importantly, Azure Virtual Desktop now supports audio and visual redirection for calls. Redirection improves latency by creating direct paths between users when they call using audio or video. Less distance means fewer hops, which makes calls look and sound smoother. To learn more, see [our blog post](https://azure.microsoft.com/updates/windows-virtual-desktop-media-optimization-for-microsoft-teams-is-now-available-in-public-preview/).-
-## Next steps
-
-Learn about future plans at the [Microsoft 365 Azure Virtual Desktop roadmap](https://www.microsoft.com/microsoft-365/roadmap?filters=Windows%20Virtual%20Desktop).
virtual-machines Capacity Reservation Associate Virtual Machine Scale Set Flex https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/capacity-reservation-associate-virtual-machine-scale-set-flex.md
+
+ Title: Associate a virtual machine scale set with flexible orchestration to a Capacity Reservation group (preview)
+description: Learn how to associate a new virtual machine scale set with flexible orchestration mode to a Capacity Reservation group.
++++ Last updated : 03/28/2022++++
+# Associate a virtual machine scale set with flexible orchestration to a Capacity Reservation group
+
+**Applies to:** :heavy_check_mark: Flexible scale sets
+
+Virtual Machine Scale Sets have two modes:
+
+- **Uniform Orchestration Mode:** In this mode, virtual machine scale sets use a VM profile or a template to scale up to the desired capacity. While there is some ability to manage or customize individual VM instances, Uniform uses identical VM instances. These instances are exposed through the virtual machine scale sets VM APIs and are not compatible with the standard Azure IaaS VM API commands. Since the scale set performs all the actual VM operations, reservations are associated with the virtual machine scale set directly. Once the scale set is associated with the reservation, all the subsequent VM allocations will be done against the reservation.
+- **Flexible Orchestration Mode:** In this mode, you get more flexibility managing the individual virtual machine scale set VM instances as they can use the standard Azure IaaS VM APIs instead of using the scale set interface. To use reservations with flexible orchestration mode, define both the virtual machine scale set property and the capacity reservation property on each virtual machine.
+
+To learn more about these modes, go to [Virtual Machine Scale Sets Orchestration Modes](../virtual-machine-scale-sets/virtual-machine-scale-sets-orchestration-modes.md).
+
+This content applies to the flexible orchestration mode. For uniform orchestration mode, go to [Associate a virtual machine scale set with uniform orchestration to a Capacity Reservation group](capacity-reservation-associate-virtual-machine-scale-set.md).
++
+> [!IMPORTANT]
+> Capacity Reservations with virtual machine scale sets using flexible orchestration are currently in public preview. This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+> During the preview, always attach reserved capacity during creation of new scale sets using flexible orchestration mode. There are known issues attaching capacity reservations to existing scale sets using flexible orchestration. Microsoft will update this page as more options become enabled during preview.
+
+## Associate a new virtual machine scale set to a Capacity Reservation group
+
+**Option 1: Add to Virtual Machine profile** - If the Scale Set with flexible orchestration includes a VM profile, add the Capacity Reservation group property to the profile during Scale Set creation. Follow the same process used for a Scale Set using uniform orchestration. For sample code, see [Associate a virtual machine scale set with uniform orchestration to a Capacity Reservation group](capacity-reservation-associate-virtual-machine-scale-set.md).
+
+**Option 2: Add to the first Virtual Machine deployed** - If the Scale Set omits a VM profile, then you must add the Capacity Reservation group to the first Virtual Machine deployed using the Scale Set. Follow the same process used to associate a VM. For sample code, see [Associate a virtual machine to a Capacity Reservation group](capacity-reservation-associate-vm.md).
+
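To make option 2 concrete, here is a hedged PowerShell sketch of deploying the first VM into an existing flexible scale set while referencing a Capacity Reservation group. All resource names are assumptions, and the `-VmssId` and `-CapacityReservationGroupId` parameters assume a recent Az.Compute module:

```powershell
# Hedged sketch (option 2): first VM in a flexible scale set, attached to a
# Capacity Reservation group. Names, location, and size are illustrative assumptions.
$crgId  = (Get-AzCapacityReservationGroup -ResourceGroupName "myResourceGroup" `
            -Name "myCapacityReservationGroup").Id
$vmssId = (Get-AzVmss -ResourceGroupName "myResourceGroup" `
            -VMScaleSetName "myFlexScaleSet").Id
New-AzVM -ResourceGroupName "myResourceGroup" -Name "myVM" -Location "eastus" `
    -Size "Standard_D2s_v3" -VmssId $vmssId -CapacityReservationGroupId $crgId
```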
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Learn how to remove a scale set association from a Capacity Reservation](capacity-reservation-remove-virtual-machine-scale-set.md)
virtual-machines Capacity Reservation Associate Virtual Machine Scale Set https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/capacity-reservation-associate-virtual-machine-scale-set.md
Title: Associate a virtual machine scale set to a Capacity Reservation group (preview)
-description: Learn how to associate a new or existing virtual machine scale set to a Capacity Reservation group.
+ Title: Associate a virtual machine scale set with uniform orchestration to a Capacity Reservation group
+description: Learn how to associate a new or existing virtual machine scale set with uniform orchestration to a Capacity Reservation group.
-# Associate a virtual machine scale set to a Capacity Reservation group (preview)
+# Associate a virtual machine scale set with uniform orchestration to a Capacity Reservation group
+
+**Applies to:** :heavy_check_mark: Uniform scale set
Virtual Machine Scale Sets have two modes: - **Uniform Orchestration Mode:** In this mode, virtual machine scale sets use a VM profile or a template to scale up to the desired capacity. While there is some ability to manage or customize individual VM instances, Uniform uses identical VM instances. These instances are exposed through the virtual machine scale sets VM APIs and are not compatible with the standard Azure IaaS VM API commands. Since the scale set performs all the actual VM operations, reservations are associated with the virtual machine scale set directly. Once the scale set is associated with the reservation, all the subsequent VM allocations will be done against the reservation. -- **Flexible Orchestration Mode:** In this mode, you get more flexibility managing the individual virtual machine scale set VM instances as they can use the standard Azure IaaS VM APIs instead of using the scale set interface. This mode will not work with Capacity Reservation during public preview.
+- **Flexible Orchestration Mode:** In this mode, you get more flexibility managing the individual virtual machine scale set VM instances as they can use the standard Azure IaaS VM APIs instead of using the scale set interface. To use reservations with flexible orchestration mode, define both the virtual machine scale set property and the capacity reservation property on each virtual machine.
-To learn more about these modes, go to [Virtual Machine Scale Sets Orchestration Modes](../virtual-machine-scale-sets/virtual-machine-scale-sets-orchestration-modes.md). The rest of this article will cover how to associate a Uniform virtual machine scale set to a Capacity Reservation group.
+To learn more about these modes, go to [Virtual Machine Scale Sets Orchestration Modes](../virtual-machine-scale-sets/virtual-machine-scale-sets-orchestration-modes.md).
-> [!IMPORTANT]
-> Capacity Reservation is currently in public preview.
-> This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+This content applies to the uniform orchestration mode. For flexible orchestration mode, go to [Associate a virtual machine scale set with flexible orchestration to a Capacity Reservation group](capacity-reservation-associate-virtual-machine-scale-set-flex.md)
## Limitations of scale sets in Uniform Orchestration
If your environment meets the prerequisites and you are familiar with using ARM
## Associate an existing virtual machine scale set to Capacity Reservation group
-During public preview, you are first required to deallocate your scale set. Then you can associate the existing Uniform virtual machine scale set to the Capacity Reservation group at the time of reallocation. This ensures that all the scale set VMs consume Capacity Reservation at the time of reallocation.
+To add an existing Capacity Reservation Group to an existing Uniform Scale Set:
+
+- Stop the Scale Set to deallocate the VM instances
+- Update the Scale Set to use a matching Capacity Reservation Group
+- Start the Scale Set
+
+This process ensures that the placement of the Capacity Reservations and the Scale Set in the region is compatible. The sketch below illustrates these steps.
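The sketch assumes an existing uniform scale set and Capacity Reservation group; all names are placeholders, and the `-CapacityReservationGroupId` parameter on `Update-AzVmss` assumes a recent Az.Compute module:

```powershell
# Hedged sketch: attach an existing uniform scale set to a Capacity Reservation group.
# Stop (deallocate), update the association, then start the scale set.
$crgId = (Get-AzCapacityReservationGroup -ResourceGroupName "myResourceGroup" `
           -Name "myCapacityReservationGroup").Id
Stop-AzVmss   -ResourceGroupName "myResourceGroup" -VMScaleSetName "myScaleSet"
Update-AzVmss -ResourceGroupName "myResourceGroup" -VMScaleSetName "myScaleSet" `
    -CapacityReservationGroupId $crgId
Start-AzVmss  -ResourceGroupName "myResourceGroup" -VMScaleSetName "myScaleSet"
```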
+
### Important notes on Upgrade Policies

- **Automatic Upgrade** – In this mode, the scale set VM instances are automatically associated with the Capacity Reservation group without any further action from you. When the scale set VMs are reallocated, they start consuming the reserved capacity.
- **Rolling Upgrade** – In this mode, scale set VM instances are associated with the Capacity Reservation group without any further action from you. However, they are updated in batches with an optional pause time between them. When the scale set VMs are reallocated, they start consuming the reserved capacity.
-- **Manual Upgrade** – In this mode, nothing happens to the scale set VM instances when the virtual machine scale set is attached to a Capacity Reservation group. You will need to do individual updates to each scale set VM by [upgrading it with the latest Scale Set model](../virtual-machine-scale-sets/virtual-machine-scale-sets-upgrade-scale-set.md#how-to-bring-vms-up-to-date-with-the-latest-scale-set-model).
+- **Manual Upgrade** – In this mode, nothing happens to the scale set VM instances when the virtual machine scale set is attached to a Capacity Reservation group. You will need to update each scale set VM by [upgrading it with the latest Scale Set model](../virtual-machine-scale-sets/virtual-machine-scale-sets-upgrade-scale-set.md#how-to-bring-vms-up-to-date-with-the-latest-scale-set-model).
### [API](#tab/api2)
virtual-machines Capacity Reservation Associate Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/capacity-reservation-associate-vm.md
Title: Associate a virtual machine to a Capacity Reservation group (preview)
+ Title: Associate a virtual machine to a Capacity Reservation group
description: Learn how to associate a new or existing virtual machine to a Capacity Reservation group.
-# Associate a VM to a Capacity Reservation group (preview)
+# Associate a VM to a Capacity Reservation group
-This article walks through the steps of associating a new or existing virtual machine to a Capacity Reservation group. To learn more about Capacity Reservations, see the [overview article](capacity-reservation-overview.md).
+**Applies to:** :heavy_check_mark: Windows Virtual Machines :heavy_check_mark: Linux Virtual Machines
-> [!IMPORTANT]
-> Capacity Reservation is currently in public preview.
-> This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+Capacity reservation groups can be used with new or existing virtual machines. To learn more about Capacity Reservations, see the [overview article](capacity-reservation-overview.md).
## Associate a new VM
-To associate a new VM to the Capacity Reservation group, the group must be explicitly referenced as a property of the virtual machine. This reference protects the matching reservation in the group from accidental consumption by less critical applications and workloads that are not intended to use it.
+To associate a new VM to the Capacity Reservation group, the group must be explicitly referenced as a property of the virtual machine. This reference protects the matching reservation in the group for applications and workloads intended to use it.
### [API](#tab/api1)
In the request body, include the `capacityReservationGroup` property:
### [CLI](#tab/cli1)
-Use `az vm create` to create a new VM and add the `capacity-reservation-group` property to associate it to an existing Capacity Reservation group. The following example creates a Standard_D2s_v3 VM in the East US location and associate the VM to a Capacity Reservation group.
+Use `az vm create` to create a new VM and add the `capacity-reservation-group` property to associate it to an existing Capacity Reservation group. The following example creates a Standard_D2s_v3 VM in the East US location and associates the VM to a Capacity Reservation group.
```azurecli-interactive
az vm create
An [ARM template](../azure-resource-manager/templates/overview.md) is a Java
ARM templates let you deploy groups of related resources. In a single template, you can create Capacity Reservation group and capacity reservations. You can deploy templates through the Azure portal, Azure CLI, or Azure PowerShell, or from continuous integration/continuous delivery (CI/CD) pipelines.
-If your environment meets the prerequisites and you are familiar with using ARM templates, use this [Create VM with Capacity Reservation](https://github.com/Azure/on-demand-capacity-reservation/blob/main/VirtualMachineWithReservation.json) template.
+If your environment meets the prerequisites and you're familiar with using ARM templates, use this [Create VM with Capacity Reservation](https://github.com/Azure/on-demand-capacity-reservation/blob/main/VirtualMachineWithReservation.json) template.
If your environment meets the prerequisites and you are familiar with using ARM
## Associate an existing VM
-During public preview, you are first required to deallocate your scale set. Then you can associate the existing Uniform virtual machine scale set to the Capacity Reservation group at the time of reallocation. This ensures that all the scale set VMs consume Capacity Reservation at the time of reallocation.
+For the initial release of Capacity Reservation, a virtual machine must be deallocated before it can be associated with a capacity reservation. The steps, illustrated in the sketch after this list, are as follows:
+
+- If not already complete, follow guidance to create a capacity reservation group and capacity reservation. Or increment the quantity of an existing capacity reservation so there's unused reserved capacity.
+- Deallocate the VM.
+- Update the capacity reservation group property on the VM.
+- Restart the VM.
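Here is a hedged sketch of those steps in PowerShell. The resource names are placeholders, and the `-CapacityReservationGroupId` parameter on `Update-AzVM` assumes a recent Az.Compute module:

```powershell
# Hedged sketch: deallocate, associate with the reservation group, then restart.
$crgId = (Get-AzCapacityReservationGroup -ResourceGroupName "myResourceGroup" `
           -Name "myCapacityReservationGroup").Id
Stop-AzVM -ResourceGroupName "myResourceGroup" -Name "myVM" -Force
$vm = Get-AzVM -ResourceGroupName "myResourceGroup" -Name "myVM"
Update-AzVM -ResourceGroupName "myResourceGroup" -VM $vm -CapacityReservationGroupId $crgId
Start-AzVM -ResourceGroupName "myResourceGroup" -Name "myVM"
```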
### [API](#tab/api2)
virtual-machines Capacity Reservation Create https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/capacity-reservation-create.md
Title: Create a Capacity Reservation in Azure (preview)
+ Title: Create a Capacity Reservation in Azure
description: Learn how to reserve Compute capacity in an Azure region or an Availability Zone by creating a Capacity Reservation.
-# Create a Capacity Reservation (preview)
+# Create a Capacity Reservation
Capacity Reservation is always created as part of a Capacity Reservation group. The first step is to create a group if a suitable one doesn't exist already, then create reservations. Once successfully created, reservations are immediately available for use with virtual machines. The capacity is reserved for your use as long as the reservation is not deleted.
A well-formed request for Capacity Reservation group should always succeed as it
A Capacity Reservation creation succeeds or fails in its entirety. For a request to reserve 10 instances, success is returned only if all 10 could be allocated. Otherwise, the Capacity Reservation creation will fail.
-> [!IMPORTANT]
-> Capacity Reservation is currently in public preview.
-> This preview version is provided without a service-level agreement, and we do not recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
## Considerations

The Capacity Reservation must meet the following rules:

- The location parameter must match the location property for the parent Capacity Reservation group. A mismatch will result in an error.
- The VM size must be available in the target region. Otherwise, the reservation creation will fail.
-- The subscription must have sufficient approved quota equal to or more than the quantity of VMs being reserved for the VM series and for the region overall. If needed, [request more quota](../azure-portal/supportability/per-vm-quota-requests.md).
+- The subscription must have available quota equal to or more than the quantity of VMs being reserved for the VM series and for the region overall. If needed, [request more quota](../azure-portal/supportability/per-vm-quota-requests.md).
+ - To work within existing quota limits, single-VM deployments can be done in stages: create a capacity reservation with a smaller quantity and reallocate that quantity of virtual machines, which frees up quota to increase the reserved quantity and add more virtual machines. Alternatively, if the subscription uses different VM sizes in the same series, reserve and redeploy VMs for the first size, then add a reservation to the group for another size and redeploy the VMs for the new size to the reservation group. Repeat until complete.
+ - For Scale Sets, available quota will be required unless the Scale Set or its VM instances are deleted, capacity is reserved, and the Scale Set instances are added using reserved capacity. If the Scale Set is updated using blue-green deployment, reserve the capacity and deploy the new Scale Set to the reserved capacity at the next update.
- Each Capacity Reservation group can have exactly one reservation for a given VM size. For example, only one Capacity Reservation can be created for the VM size `Standard_D2s_v3`. Attempting to create a second reservation for `Standard_D2s_v3` in the same Capacity Reservation group will result in an error. However, another reservation can be created in the same group for other VM sizes, such as `Standard_D4s_v3`, `Standard_D8s_v3`, and so on.
- For a Capacity Reservation group that supports zones, each reservation type is defined by the combination of **VM size** and **zone**. For example, one Capacity Reservation for `Standard_D2s_v3` in `Zone 1`, another Capacity Reservation for `Standard_D2s_v3` in `Zone 2`, and a third Capacity Reservation for `Standard_D2s_v3` in `Zone 3` is supported.
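As an illustration of these rules, here is a hedged PowerShell sketch that creates a group and a single reservation in it. Resource names, location, SKU, and quantity are assumptions, and the cmdlet names assume a recent Az.Compute module:

```powershell
# Hedged sketch: create a Capacity Reservation group, then one reservation
# for one VM size in that group. All names and values are placeholders.
New-AzCapacityReservationGroup -ResourceGroupName "myResourceGroup" `
    -Name "myCapacityReservationGroup" -Location "eastus"
New-AzCapacityReservation -ResourceGroupName "myResourceGroup" `
    -ReservationGroupName "myCapacityReservationGroup" -Name "myCapacityReservation" `
    -Location "eastus" -Sku "Standard_D2s_v3" -CapacityToReserve 5
```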
virtual-machines Capacity Reservation Modify https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/capacity-reservation-modify.md
Title: Modify a Capacity Reservation in Azure (preview)
+ Title: Modify a Capacity Reservation in Azure
description: Learn how to modify a Capacity Reservation.
After creating a Capacity Reservation group and Capacity Reservation, you may wa
> * Resize VMs associated with a Capacity Reservation group
> * Delete the Capacity Reservation group and Capacity Reservation
-> [!IMPORTANT]
-> Capacity Reservation is currently in public preview.
-> This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
## Update the number of instances reserved

Update the number of virtual machine instances reserved in a Capacity Reservation.
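For instance, a hedged sketch of adjusting only the reserved quantity; the cmdlet and parameter names assume a recent Az.Compute module, and the resource names are placeholders:

```powershell
# Hedged sketch: raise the reserved quantity from its current value to 10.
# Only the quantity can change; size and location are immutable.
Update-AzCapacityReservation -ResourceGroupName "myResourceGroup" `
    -ReservationGroupName "myCapacityReservationGroup" `
    -Name "myCapacityReservation" -CapacityToReserve 10
```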
virtual-machines Capacity Reservation Overallocate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/capacity-reservation-overallocate.md
Title: Overallocating Capacity Reservation in Azure (preview)
+ Title: Overallocating Capacity Reservation in Azure
description: Learn how overallocation works when it comes to Capacity Reservation.
-# Overallocating Capacity Reservation (preview)
+# Overallocating Capacity Reservation
Azure permits association of extra VMs beyond the reserved count of a Capacity Reservation to facilitate burst and other scale-out scenarios, without the overhead of managing around the limits of reserved capacity. The only difference is that the count of VMs beyond the quantity reserved does not receive the capacity availability SLA benefit. As long as Azure has available capacity that meets the virtual machine requirements, the extra allocations will succeed.
The Instance View of a Capacity Reservation group provides a snapshot of usage f
This article assumes you have created a Capacity Reservation group (`myCapacityReservationGroup`), a member Capacity Reservation (`myCapacityReservation`), and a virtual machine (*myVM1*) that is associated to the group. Go to [Create a Capacity Reservation](capacity-reservation-create.md) and [Associate a VM to a Capacity Reservation](capacity-reservation-associate-vm.md) for more details.
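To retrieve the Instance View shown in the next section, a hedged sketch; the `-InstanceView` switch assumes a recent Az.Compute module, and the names are placeholders:

```powershell
# Hedged sketch: fetch the group's instance view, which reports per-reservation
# usage including virtualMachinesAllocated.
Get-AzCapacityReservationGroup -ResourceGroupName "myResourceGroup" `
    -Name "myCapacityReservationGroup" -InstanceView
```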
-> [!IMPORTANT]
-> Capacity Reservation is currently in public preview.
-> This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
## Instance View for Capacity Reservation group

The Instance View for a Capacity Reservation group will look like this:
The Instance View for the Capacity Reservation group will now look like this:
Notice that the length of `virtualMachinesAllocated` (2) is greater than `capacity` (1). This valid state is referred to as *overallocated*. > [!IMPORTANT]
-> Azure will not stop allocations just because a Capacity Reservation is fully consumed. Auto-scale rules, temporary scale-out, and related requirements will work beyond the quantity of reserved capacity as long as Azure has available capacity.
+> Azure will not stop allocations just because a Capacity Reservation is fully consumed. Auto-scale rules, temporary scale-out, and related requirements will work beyond the quantity of reserved capacity as long as Azure has available capacity and other constraints such as available quota are met.
## States and considerations
virtual-machines Capacity Reservation Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/capacity-reservation-overview.md
Title: On-demand Capacity Reservation in Azure (preview)
+ Title: On-demand Capacity Reservation in Azure
description: Learn how to reserve compute capacity in an Azure region or an Availability Zone with Capacity Reservation.
-# On-demand Capacity Reservation (preview)
+# On-demand Capacity Reservation
On-demand Capacity Reservation enables you to reserve Compute capacity in an Azure region or an Availability Zone for any duration of time. Unlike [Reserved Instances](https://azure.microsoft.com/pricing/reserved-vm-instances/), you do not have to sign up for a 1-year or a 3-year term commitment. Create and delete reservations at any time and have full control over how you want to manage your reservations. Once the Capacity Reservation is created, the capacity is available immediately and is exclusively reserved for your use until the reservation is deleted. -
-> [!IMPORTANT]
-> Capacity Reservation is currently in public preview.
-> This preview version is provided without a service-level agreement, and we do not recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
Capacity Reservation has some basic properties that are always defined at the time of creation: -- **VM size** - Each reservation is for one VM size. For example, `Standard_D2s_v3`. +
+- **VM size** - Each reservation is for one VM size. For example, `Standard_D2s_v3`.
- **Location** - Each reservation is for one location (region). If that location has availability zones, then the reservation can also specify one of the zones.
- **Quantity** - Each reservation has a quantity of instances to be reserved.

To create a Capacity Reservation, these parameters are passed to Azure as a capacity request. If the subscription lacks the required quota or Azure does not have capacity available that meets the specification, the reservation will fail to deploy. To avoid deployment failure, request more quota or try a different VM size, location, or zone combination.
-Once Azure accepts a reservation request, it is available to be consumed by VMs of matching configurations. To consume Capacity Reservation, the VM will have to specify the reservation as one of its properties. Otherwise, the Capacity Reservation will remain unused. One benefit of this design is that you can target only critical workloads to reservations and other non-critical workloads can run without reserved capacity.
-
-> [!NOTE]
-> Capacity Reservation also comes with Azure availability SLA for use with virtual machines. The SLA won't be enforced during public preview and will be defined when Capacity Reservation is generally available.
+Once Azure accepts a reservation request, it is available to be consumed by VMs of matching configurations. To consume Capacity Reservation, the VM will have to specify the reservation as one of its properties. Otherwise, the Capacity Reservation will remain unused. One benefit of this design is that you can target only critical workloads to reservations and other non-critical workloads can run without reserved capacity.
## Benefits of Capacity Reservation

-- Once deployed, capacity is reserved for your use and always available within the scope of applicable SLAs
+- Once deployed, capacity is reserved for your use and always available within the scope of applicable SLAs
- Can be deployed and deleted at any time with no term commitment -- Can be combined automatically with Reserved Instances to avail term commitments discounts
+- Can be combined automatically with Reserved Instances to use term commitment discounts
## SLA for Capacity Reservation
-The SLA for Capacity Reservation will be defined later when the feature is generally available.
+Please read the Service Level Agreement details in the [SLA for Capacity Reservation](https://aka.ms/CapacityReservationSLAForVM).
+
+Any claim against the SLA requires calculating the Minutes Not Available for the reserved capacity. Here is an example of how to calculate Minutes Not Available.
+- An On Demand Capacity Reservation has a total Capacity of 5 Reserved Units. The On Demand Capacity Reservation starts in the Unused Capacity state with 0 Virtual Machines Allocated.
+- A Supported Deployment of quantity 5 is allocated to the On Demand Capacity Reservation. 3 Virtual Machines succeed and 2 fail with a Virtual Machine capacity error. Result: 2 Reserved Units begin to accumulate Minutes Not Available.
+- No action is taken for 20 minutes. Result: two Reserved Units each accumulate 15 Minutes Not Available.
+- At 20 minutes, a Supported Deployment of quantity 2 is attempted. One Virtual Machine succeeds; the other Virtual Machine fails with a Virtual Machine capacity error. Result: One Reserved Unit stays at 15 accumulated Minutes Not Available. The other Reserved Unit resumes accumulating Minutes Not Available.
+- Four additional Supported Deployments of quantity 1 are made at 10-minute intervals. On the fourth attempt (60 minutes after the first capacity error), the Virtual Machine is deployed. Result: The last Reserved Unit adds 40 minutes of Minutes Not Available (4 attempts x 10 minutes between attempts) for a total of 55 Minutes Not Available.
+
+From this example's accumulation of Minutes Not Available, here is the calculation of the Service Credit.
+
+- One Reserved Unit accumulated 15 minutes of Downtime. The Percentage Uptime is 99.97%. This Reserved Unit does not qualify for Service Credit.
+- Another Reserved Unit accumulated 55 minutes of Downtime. The Percentage Uptime is 99.87%. This Reserved Unit qualifies for a Service Credit of 10%.
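For reference, these percentages follow from the uptime formula below, assuming Maximum Available Minutes of 43,200 (a 30-day month); the authoritative definitions are in the linked SLA:

```latex
% Assumed: Maximum Available Minutes = 43,200 (30-day month); see the linked SLA for exact terms.
\text{Percentage Uptime} = \frac{\text{Maximum Available Minutes} - \text{Minutes Not Available}}{\text{Maximum Available Minutes}} \times 100\%
% 15 minutes: (43200 - 15)/43200 \approx 99.97\%
% 55 minutes: (43200 - 55)/43200 \approx 99.87\%
```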
## Limitations and restrictions

- Creating capacity reservations requires quota in the same manner as creating virtual machines.
-- Spot VMs and Azure Dedicated Host Nodes are not supported with Capacity Reservation.
-- Some deployment constraints are not supported:
+- Creating capacity reservations is currently limited to certain VM Series and Sizes. The Compute [Resource SKUs list](https://docs.microsoft.com/rest/api/compute/resource-skus/list) advertises the set of supported VM Sizes.
+- The following VM Series support creation of capacity reservations:
+ - Av2
+ - B
+ - D series, v2 and newer; AMD and Intel
+ - E series, all versions; AMD and Intel
+ - F series, all versions
+ - At VM deployment, a Fault Domain (FD) count of up to 3 may be set as desired using Virtual Machine Scale Sets. A deployment with more than 3 FDs will fail to deploy against a Capacity Reservation.
+- Support for additional VM Series isn't currently available:
+ - L series
+ - M series, any version
+ - NC-series, v3 and newer
+ - NV-series, v2 and newer
+ - ND-series
+ - HB-series
+ - HC-series
+- The following deployment types are supported:
+ - Single VM
+ - Virtual Machine Scale Sets with Uniform Orchestration
+ - Virtual Machine Scale Sets with Flexible Orchestration (preview)
+- The following deployment types are not supported:
+ - Spot VMs
+ - Azure Dedicated Host Nodes or VMs deployed to Dedicated Hosts
+ - Availability Sets
+- Other deployment constraints are not supported. For example:
- Proximity Placement Group
- Update domains
- - UltraSSD storage
-- Only Av2, B, D, E, & F VM series are supported during public preview. -- For the supported VM series during public preview, up to 3 Fault Domains (FDs) will be supported. A deployment with more than 3 FDs will fail to deploy against Capacity Reservation. -- Availability Sets are not supported with Capacity Reservation. -- During this preview, only the subscription that created the reservation can use it.
+ - Virtual Machine Scale Sets with single placement group set to 'true'
+ - UltraSSD storage
+ - VMs resuming from hibernation
+ - VMs requiring vnet encryption
+- Only the subscription that created the reservation can use it.
- Reservations are only available to paid Azure customers. Sponsored accounts such as Free Trial and Azure for Students are not eligible to use this feature.

## Pricing and billing
-Capacity Reservations are priced at the same rate as the underlying VM size. For example, if you create a reservation for ten quantities of D2s_v3 VM, as soon as the reservation is created, you will start getting billed for ten D2s_v3 VMs, even if the reservation is not being used.
+Capacity Reservations are priced at the same rate as the underlying VM size. For example, if you create a reservation for a quantity of 10 D2s_v3 VMs, then you will start getting billed for ten D2s_v3 VMs, even if the reservation is not being used.
-If you then deploy a D2s_v3 VM and specify reservation as its property, the Capacity Reservation gets used. Once in use, you will only pay for the VM and nothing extra for the Capacity Reservation. Let's say you deploy five D2s_v3 VMs against the previously mentioned Capacity Reservation. You will see a bill for five D2s_v3 VMs and five unused Capacity Reservation, both charged at the same rate as a D2s_v3 VM.
+If you then deploy a D2s_v3 VM and specify the reservation as one of its properties, the Capacity Reservation gets used. Once in use, you will only pay for the VM and nothing extra for the Capacity Reservation. Let's say you deploy six D2s_v3 VMs against the previously mentioned Capacity Reservation. You will see a bill for six D2s_v3 VMs and four unused Capacity Reservations, both charged at the same rate as a D2s_v3 VM.
-Both used and unused Capacity Reservation are eligible for Reserved Instances term commitment discounts. In the previous example, if you have Reserved Instances for two D2s_v3 VMs in the same Azure region, the billing for two resources (either VM or unused Capacity Reservation) will be zeroed out and you will only pay for the rest of the eight resources. Those eight resources are the five unused capacity reservations and three D2s_v3 VMs. In this case, the term commitment discounts could be applied on either the VM or the unused Capacity Reservation, both of which are charged at the same PAYG rate.
+Both used and unused Capacity Reservations are eligible for Reserved Instances term commitment discounts. In the previous example, if you have Reserved Instances for two D2s_v3 VMs in the same Azure region, the billing for two resources (either VM or unused Capacity Reservation) will be zeroed out. The remaining eight D2s_v3 resources will be billed normally. The term commitment discounts could be applied on either the VMs or the unused Capacity Reservations.
## Difference between On-demand Capacity Reservation and Reserved Instances
Both used and unused Capacity Reservation are eligible for Reserved Instances te
## Work with Capacity Reservation
-Capacity Reservation can be created for a specific VM size in an Azure region or an Availability Zone. All reservations are created and managed as part of a Capacity Reservation group, which allows creation of a group to manage different VM sizes in a single multi-tier application. Each reservation is for one VM size and a group can have only one reservation per VM size.
+Capacity Reservation is created for a specific VM size in an Azure region or an Availability Zone. All reservations are created and managed as part of a Capacity Reservation Group.
+
+The group specifies the Azure location:
+
+- The group sets the region in which all reservations will be created. For example, East US, North Europe, or Southeast Asia.
+- The group sets the eligible zones. For example, AZ1, AZ2, AZ3 in any combination.
+- If no zones are specified, Azure will select the placement for the group somewhere in the region. Each reservation will specify the region and may not set a zone.
+
+Each reservation in a group is for one VM size. If eligible zones were selected for the group, the reservation must be for one of the supported zones.
+
+A group can have only one reservation per VM size per zone, or just one reservation per VM size if no zones are selected.
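For example, a hedged sketch of a zonal group with one reservation per zone; the `-Zone` parameters and all resource names are assumptions based on a recent Az.Compute module:

```powershell
# Hedged sketch: a group eligible for zones 1-3, with one Standard_D2s_v3
# reservation per zone. Names, location, and quantities are placeholders.
New-AzCapacityReservationGroup -ResourceGroupName "myResourceGroup" `
    -Name "myZonalGroup" -Location "eastus" -Zone "1","2","3"
foreach ($zone in "1","2","3") {
    New-AzCapacityReservation -ResourceGroupName "myResourceGroup" `
        -ReservationGroupName "myZonalGroup" -Name "reservation-zone$zone" `
        -Location "eastus" -Zone $zone -Sku "Standard_D2s_v3" -CapacityToReserve 2
}
```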
-To consume Capacity Reservation, specify Capacity Reservation group as one of the VM properties. If the group doesn't have a matching reservation, Azure will return an error message.
+To consume Capacity Reservation, specify Capacity Reservation Group as one of the VM properties. If the group doesn't have a reservation matching the size and location, Azure will return an error message.
-The quantity reserved for reservation can be adjusted after initial deployment by changing the capacity property. Other changes to Capacity Reservation, such as VM size or location, are not permitted. The recommended approach is to delete the existing reservation and create a new one with the new requirements.
+The quantity reserved for a reservation can be adjusted after initial deployment by changing the capacity property. Other changes to Capacity Reservation, such as VM size or location, are not permitted. The recommended approach is to create a new reservation, migrate any existing VMs, and then delete the old reservation if it's no longer needed.
-Capacity Reservation doesn't create limits on the number of VM deployments. Azure supports allocating as many VMs as desired against the reservation. As the reservation itself requires quota, the quota checks are omitted for VM deployment up to the reserved quantity. Allocating more VMs against the reservation is subject to quota checks and Azure fulfilling the extra capacity. Once deployed, these extra VM instances can cause the quantity of VMs allocated against the reservation to exceed the reserved quantity. This state is called overallocating. To learn more, go to [Overallocating Capacity Reservation](capacity-reservation-overallocate.md).
+Capacity Reservation doesn't create limits on the number of VM deployments. Azure supports allocating as many VMs as desired against the reservation. As the reservation itself requires quota, the quota checks are omitted for VM deployment up to the reserved quantity. Allocating VMs beyond the reserved quantity is called overallocating the reservation. Overallocated VMs are not covered by the SLA and are subject to quota checks and Azure fulfilling the extra capacity. Once deployed, these extra VM instances can cause the quantity of VMs allocated against the reservation to exceed the reserved quantity. To learn more, go to [Overallocating Capacity Reservation](capacity-reservation-overallocate.md).
## Capacity Reservation lifecycle
Now suppose the application scales down to the minimum of two VMs. Since VM 0 ne
![Capacity Reservation image 4.](./media/capacity-reservation-overview/capacity-reservation-4.jpg)
-The `capacity` and the length of `virtualMachinesAllocated` are both 2. However, the length for `virtualMachinesAssociated` is still 3 as VM 0, though deallocated, is still associated with the Capacity Reservation.
+The `capacity` and the length of `virtualMachinesAllocated` are both 2. However, the length for `virtualMachinesAssociated` is still 3 as VM 0, though deallocated, is still associated with the Capacity Reservation. To prevent quota overrun, the deallocated VM 0 still counts against the quota allocated to the reservation. As long as you have enough unused quota, you can deploy new VMs to the Capacity Reservation and receive the SLA from any unused reserved capacity. Or you can delete VM 0 to remove its use of quota.
The Capacity Reservation will exist until explicitly deleted. To delete a Capacity Reservation, the first step is to disassociate all the VMs in the `virtualMachinesAssociated` property. Once disassociation is complete, the Capacity Reservation should look like this:
In the previous image, the VM Reserved Instance discount is applied to VM 0, whi
- **What's the price of on-demand Capacity Reservation?**
- The price of your on-demand Capacity Reservation is same as the price of underlying VM size associated with the reservation. When using Capacity Reservation, you will be charged for the VM size you selected at pay-as-you-go rates, whether the VM has been provisioned or not. Visit the [Windows](https://azure.microsoft.com/pricing/details/virtual-machines/windows/) and [Linux](https://azure.microsoft.com/pricing/details/virtual-machines/linux/) VM pricing pages for more details.
+ The price of your on-demand Capacity Reservation is the same as the price of the underlying VM size associated with the reservation. When using Capacity Reservation, you will be charged for the VM size you selected at pay-as-you-go rates, whether the VM has been provisioned or not. Visit the [Windows](https://azure.microsoft.com/pricing/details/virtual-machines/windows/) and [Linux](https://azure.microsoft.com/pricing/details/virtual-machines/linux/) VM pricing pages for more details.
- **Will I get charged twice, for the cost of on-demand Capacity Reservation and for the actual VM when I finally provision it?**
- No, you will only get charged once for on-demand Capacity Reservation.
+ No, you will only get charged once for on-demand Capacity Reservation.
- **Can I apply Reserved Virtual Machine Instance (RI) to on-demand Capacity Reservation to lower my costs?**
virtual-machines Capacity Reservation Remove Virtual Machine Scale Set https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/capacity-reservation-remove-virtual-machine-scale-set.md
Title: Remove a virtual machine scale set association from a Capacity Reservation group (preview)
+ Title: Remove a virtual machine scale set association from a Capacity Reservation group
description: Learn how to remove a virtual machine scale set from a Capacity Reservation group.
# Remove a virtual machine scale set association from a Capacity Reservation group
+**Applies to:** :heavy_check_mark: Uniform scale set
+ This article walks you through removing a virtual machine scale set association from a Capacity Reservation group. To learn more about capacity reservations, see the [overview article](capacity-reservation-overview.md). Because both the VM and the underlying Capacity Reservation logically occupy capacity, Azure imposes some constraints on this process to avoid ambiguous allocation states and unexpected errors.
There are two ways to change an association:
- Option 1: Deallocate the virtual machine scale set, change the Capacity Reservation group property at the scale set level, and then update the underlying VMs
- Option 2: Update the reserved quantity to zero and then change the Capacity Reservation group property
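As a sketch of Option 1 for a Uniform scale set named myScaleSet (a placeholder, and assuming your CLI version supports clearing the property with `None`):

```azurecli
# Option 1: deallocate the scale set, clear the Capacity Reservation group
# property at the scale set level, then roll the change to the instances
az vmss deallocate --resource-group myResourceGroup --name myScaleSet

az vmss update \
  --resource-group myResourceGroup \
  --name myScaleSet \
  --capacity-reservation-group None

az vmss update-instances \
  --resource-group myResourceGroup \
  --name myScaleSet \
  --instance-ids "*"
```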
-> [!IMPORTANT]
-> Capacity Reservation is currently in public preview.
-> This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
## Deallocate the virtual machine scale set
virtual-machines Capacity Reservation Remove Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/capacity-reservation-remove-vm.md
Title: Remove a virtual machine association from a Capacity Reservation group (preview)
+ Title: Remove a virtual machine association from a Capacity Reservation group
description: Learn how to remove a virtual machine from a Capacity Reservation group.
-# Remove a VM association from a Capacity Reservation group (preview)
+# Remove a VM association from a Capacity Reservation group
This article walks you through the steps of removing a VM association to a Capacity Reservation group. To learn more about capacity reservations, see the [overview article](capacity-reservation-overview.md).
There are two ways to change an association:
- Option 1: Deallocate the virtual machine, change the Capacity Reservation group property, and optionally restart the virtual machine
- Option 2: Update the reserved quantity to zero and then change the Capacity Reservation group property
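As a sketch of Option 1 with placeholder names (again assuming your CLI version supports clearing the property with `None`):

```azurecli
# Option 1: deallocate the VM, clear the Capacity Reservation group property,
# and optionally restart the VM
az vm deallocate --resource-group myResourceGroup --name myVM

az vm update \
  --resource-group myResourceGroup \
  --name myVM \
  --capacity-reservation-group None

az vm start --resource-group myResourceGroup --name myVM
```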
-> [!IMPORTANT]
-> Capacity Reservation is currently in public preview.
-> This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
## Deallocate the VM

The first option is to deallocate the VM, change the Capacity Reservation group property, and optionally restart the VM.
virtual-machines Oracle Database Backup Azure Backup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/oracle/oracle-database-backup-azure-backup.md
Title: Back up and recover an Oracle Database 19c database on an Azure Linux VM using Azure Backup
-description: Learn how to back up and recover an Oracle Database 19c database using the Azure Backup service.
+ Title: Back up and recover an Oracle Database on an Azure Linux VM using Azure Backup
+description: Learn how to back up and recover an Oracle Database using the Azure Backup service.
-# Back up and recover an Oracle Database 19c database on an Azure Linux VM using Azure Backup
+# Back up and recover an Oracle Database on an Azure Linux VM using Azure Backup
**Applies to:** :heavy_check_mark: Linux VMs
This article demonstrates the use of Azure Backup to take disk snapshots of the
[!INCLUDE [azure-cli-prepare-your-environment.md](../../../../includes/azure-cli-prepare-your-environment.md)]
-To perform the backup and recovery process, you must first create a Linux VM that has an installed instance of Oracle Database 19c. The Marketplace image currently used to create the VM is **Oracle:oracle-database-19-3:oracle-database-19-0904:latest**. Follow the steps in the [Oracle create database quickstart](./oracle-database-quick-create.md) to create an Oracle database to complete this tutorial.
+- To perform the backup and recovery process, you must first create a Linux VM that has an installed instance of Oracle Database 12.1 or higher.
+
+- Follow the steps in the [Oracle create database quickstart](./oracle-database-quick-create.md) to create an Oracle database to complete this tutorial.
## Prepare the environment
To perform the backup and recovery process, you must first create a Linux VM tha
To prepare the environment, complete these steps:

1. [Connect to the VM](#connect-to-the-vm).
-1. [Setup Azure Files Storage](#setup-azure-files-storage-for-the-oracle-archived-redo-log-files)
+1. [Set up Azure Files Storage](#set-up-azure-files-storage-for-the-oracle-archived-redo-log-files)
1. [Prepare the database](#prepare-the-databases).

### Connect to the VM
To prepare the environment, complete these steps:
echo "oracle ALL=(ALL) NOPASSWD: ALL" >> /etc/sudoers ```
-### Setup Azure Files Storage for the Oracle archived redo log files
+### Set up Azure Files Storage for the Oracle archived redo log files
The Oracle database archive redo logfiles play a crucial role in database recovery, as they store the committed transactions needed to roll forward from a database snapshot taken in the past. When in archivelog mode, the database archives the contents of online redo logfiles when they become full and a log switch occurs. Together with a backup, they are required to achieve point-in-time recovery when the database has been lost. Oracle provides the capability to archive redo logfiles to different locations, with industry best practice recommending that at least one of those destinations be on remote storage, so it is separate from the host storage and protected with independent snapshots. Azure Files is a great fit for those requirements.
-An Azure Files fileshare is storage which can be attached to a Linux or Windows VM as a regular filesystem component, using SMB or NFS (Preview) protocols.
+An Azure Files fileshare is storage which can be attached to a Linux or Windows VM as a regular filesystem component, using SMB or NFS protocols.
-To setup an Azure Files fileshare on Linux, using SMB 3.0 protocol (recommended), for use as archive log storage, please follow the [Use Azure Files with Linux how-to guide](../../../storage/files/storage-how-to-use-files-linux.md). When you have completed the setup, return to this guide and complete all remaining steps.
+To set up an Azure Files fileshare on Linux, using SMB 3.0 protocol, for use as archive log storage, please follow the [Use Azure Files with Linux how-to guide](../../../storage/files/storage-how-to-use-files-linux.md). When you have completed the setup, return to this guide and complete all remaining steps.
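Once the share exists, the mounted end state might look like this sketch (storage account name, share name, and credentials file are placeholders taken from that guide):

```bash
# Mount an Azure Files share over SMB 3.0 as an archive log destination;
# the credentials file holds the storage account name and key
sudo mkdir -p /mnt/orabackup
sudo mount -t cifs //mystorageacct.file.core.windows.net/orabackup /mnt/orabackup \
    -o vers=3.0,credentials=/etc/smbcredentials/mystorageacct.cred,serverino,uid=oracle,gid=oinstall
```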
### Prepare the databases
The Azure Backup service provides simple, secure, and cost-effective solutions t
Azure Backup service provides a [framework](../../../backup/backup-azure-linux-app-consistent.md) to achieve application consistency during backups of Windows and Linux VMs for various applications like Oracle and MySQL. This involves invoking a pre-script (to quiesce the applications) before taking a snapshot of disks and calling a post-script (to unfreeze the applications) after the snapshot is completed.
-The framework has now been enhanced so that packaged pre-scripts and post-scripts for selected applications like Oracle are provided by the Azure Backup service and are pre-loaded on the Linux image, so there is nothing you need to install. Azure Backup users just need to name the application and then Azure VM backup will automatically invoke the relevant pre and post scripts. The packaged pre-scripts and post-scripts will be maintained by the Azure Backup team and so users can be assured of the support, ownership, and validity of these scripts. Currently, the supported applications for the enhanced framework are *Oracle* and *MySQL*.
+The framework has now been enhanced so that packaged pre-scripts and post-scripts for selected applications like Oracle are provided by the Azure Backup service and are pre-loaded on the Linux image, so there is nothing you need to install. Azure Backup users just need to name the application and then Azure VM backup will automatically invoke the relevant pre and post scripts. The packaged pre-scripts and post-scripts will be maintained by the Azure Backup team and so users can be assured of the support, ownership, and validity of these scripts.
+
+Currently, the supported applications for the enhanced framework are *Oracle 12.x or higher* and *MySQL*.
+Please see the [Support matrix for managed pre-post scripts for Linux databases](../../../backup/backup-support-matrix-iaas.md) for details.
+Customers can author their own scripts for Azure Backup to use with pre-12.x databases. Example scripts can be found [here](https://github.com/Azure/azure-linux-extensions/tree/master/VMBackup/main/workloadPatch/DefaultScripts).
+ > [!Note] > The enhanced framework will run the pre and post scripts on all Oracle databases installed on the VM each time a backup is executed. >
-> The parameter `configuration_path` in the **workload.conf** file points to the location of the Oracle /etc/oratab file (or a user defined file that follows the oratab syntax). See [Set up application-consistent backups](#set-up-application-consistent-backups) for details.
+> The parameter `configuration_path` in the **workload.conf** file points to the location of the Oracle /etc/oratab file (or a user defined file that follows the oratab syntax). See [Set up application-consistent backups](#set-up-application-consistent-backups) for details.
> > Azure Backup will run the pre and post backup scripts for each database listed in the file pointed to by configuration_path, except those lines that begin with # (treated as comment) or +ASM (Oracle Automatic Storage Management instance). >
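For reference, the relevant portion of **workload.conf** might look like this sketch (the `configuration_path` value assumes the default /etc/oratab; the other values are illustrative):

```
[workload]
workload_name = oracle
configuration_path = /etc/oratab
timeout = 90
linux_user = azbackup
```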
To use Azure Backup to back up the database, complete these steps:
```

> [!IMPORTANT]
- > If you the output does not match the Oracle operating system group value retrieved in Step 3 you will need to create the operating system group representing the Oracle SYSBACKUP role. Please substitute `<group name>` for the group name retrieved in step 3 :
+ > If the output does not match the Oracle operating system group value retrieved in Step 3, you will need to create the operating system group representing the Oracle SYSBACKUP role. Please substitute `<group name>` with the group name retrieved in Step 3:
> ```bash
> sudo groupadd <group name>
> ```
To use Azure Backup to back up the database, complete these steps:
1. Set up external authentication for the new backup user. The backup user `azbackup` needs to be able to access the database using external authentication, so as not to be challenged by a password. To do this, you must create a database user that authenticates externally through `azbackup`. The database uses a prefix for the user name, which you need to find.
- On each database installed on the VM perform the following steps:
+
+ > [!IMPORTANT]
+ > Perform the following steps for ***each*** database installed on the VM:
Log in to the database using sqlplus and check the default settings for external authentication:
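That check, and the externally identified user it leads to, might look like this sketch (run as the oracle user; if `os_authent_prefix` is the default `ops$`, the database user for OS user azbackup must be named ops$azbackup, so adjust to your prefix):

```bash
sqlplus / as sysdba <<EOF
-- Find the prefix the database prepends to externally authenticated user names
SHOW PARAMETER os_authent_prefix

-- Create a database user that maps to the OS user azbackup and can perform
-- backups without a password prompt (name assumes the ops\$ prefix)
CREATE USER ops\$azbackup IDENTIFIED EXTERNALLY;
GRANT CREATE SESSION TO ops\$azbackup;
GRANT SYSBACKUP TO ops\$azbackup;
EOF
```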
To use Azure Backup to back up the database, complete these steps:
1. Create a stored procedure to log backup messages to the database alert log:
- Perform the following for each database installed on the VM:
+ > [!IMPORTANT]
+ > Perform the following steps for ***each*** database installed on the VM:
```bash
sqlplus / as sysdba
To use Azure Backup to back up the database, complete these steps:
-## Recovery
-To recover a database, complete these steps:
+## Restore the VM
-1. [Remove the database files](#remove-the-database-files).
-1. [Generate a restore script from the Recovery Services vault](#generate-a-restore-script-from-the-recovery-services-vault).
-1. [Mount the restore point](#mount-the-restore-point).
-1. [Perform recovery](#perform-recovery).
+Restoring the entire VM allows you to restore the VM and its attached disks to a new VM from a selected restore point. This will restore all databases that run on the VM and each database will need to be recovered afterwards.
-### Remove the database files
+To restore the entire VM, complete these steps:
-Later in this article, you'll learn how to test the recovery process. Before you can test the recovery process, you have to remove the database files.
+1. [Stop and delete the VM](#stop-and-delete-the-vm).
+1. [Recover the VM](#recover-the-vm).
+1. [Set the public IP address](#set-the-public-ip-address).
+1. [Perform database recovery](#recovery-after-complete-vm-restore).
-1. Switch back to the oracle user:
- ```bash
- su - oracle
- ```
+### Stop and delete the VM
-1. Shut down the Oracle instance:
+# [Portal](#tab/azure-portal)
- ```bash
- sqlplus / as sysdba
- SQL> shutdown abort
- ORACLE instance shut down.
+1. In the Azure portal, go to the **vmoracle19c** Virtual Machine, and then select **Stop**.
+
+1. When the Virtual Machine is no longer running, select **Delete** and then **Yes**.
+
+ ![Vault delete command](./media/oracle-backup-recovery/recover-vm-01.png)
+
+# [Azure CLI](#tab/azure-cli)
+
+1. Stop and deallocate the VM:
+
+ ```azurecli
+ az vm deallocate --resource-group rg-oracle --name vmoracle19c
```
-1. Remove the database datafiles and contolfiles to simulate a failure:
+1. Delete the VM. Enter 'y' when prompted:
- ```bash
- cd /u02/oradata/ORATEST1
- rm -f *.dbf *.ctl
+ ```azurecli
+ az vm delete --resource-group rg-oracle --name vmoracle19c
```
-### Generate a restore script from the Recovery Services vault
++
+### Recover the VM
# [Portal](#tab/azure-portal)
-1. In the Azure portal, search for the *myVault* Recovery Services vaults item and select it.
+1. Create a storage account for staging in the Azure portal.
- ![Recovery Services vaults myVault backup items](./media/oracle-backup-recovery/recovery-service-06.png)
+ 1. In the Azure portal, select **+ Create a resource** and search for and select **Storage Account**.
+
+ ![Screenshot that shows where to create a resource.](./media/oracle-backup-recovery/storage-1.png)
+
+
+ 1. In the Create storage account page, choose your existing resource group **rg-oracle**, name your storage account **oracrestore**, and choose **Storage V2 (general-purpose v2)** for Account Kind. Change Replication to **Locally-redundant storage (LRS)** and set Performance to **Standard**. Ensure that Location is set to the same region as all your other resources in the resource group.
+
+ ![Storage Account add page](./media/oracle-backup-recovery/recovery-storage-1.png)
+
+ 1. Click on Review + Create and then click Create.
-1. On the **Overview** blade, select **Backup items** and the select **Azure Virtual Machine**, which should have anon-zero Backup Item Count listed.
+1. In the Azure portal, search for the *myVault* Recovery Services vaults item and click on it.
+
+ ![Recovery Services vaults myVault backup items](./media/oracle-backup-recovery/recovery-service-06.png)
+
+1. On the **Overview** blade, select **Backup items** and then select **Azure Virtual Machine**, which should have a non-zero Backup Item Count listed.
![Recovery Services vaults Azure Virtual Machine backup item count](./media/oracle-backup-recovery/recovery-service-07.png)
-1. On the Backups Items (Azure Virtual Machines) page, your VM **vmoracle19c** is listed. Click the ellipsis on the right to bring up the menu and select **File Recovery**.
+1. On the Backup Items (Azure Virtual Machines) page, your VM **vmoracle19c** is listed. Click on the VM name.
- ![Screenshot of the Recovery Services vaults file recovery page](./media/oracle-backup-recovery/recovery-service-08.png)
+ ![Recovery VM page](./media/oracle-backup-recovery/recover-vm-02.png)
-1. On the **File Recovery (Preview)** pane, click **Download Script**. Then, save the download (.py) file to a folder on the client computer. A password is generated to the run the script. Copy the password to a file for use later.
+1. On the **vmoracle19c** blade, choose a restore point that has a consistency type of **Application Consistent** and click the ellipsis (**...**) on the right to bring up the menu. From the menu click **Restore VM**.
- ![Download script file saves options](./media/oracle-backup-recovery/recovery-service-09.png)
+ ![Restore VM command](./media/oracle-backup-recovery/recover-vm-03.png)
-1. Copy the .py file to the VM.
+1. On the **Restore Virtual Machine** blade, choose **Create New** and **Create New Virtual Machine**. Enter the virtual machine name **vmoracle19c** and choose the VNet **vmoracle19cVNET**; the subnet will be automatically populated for you based on your VNet selection. The restore VM process requires an Azure storage account in the same resource group and region. You can choose the storage account **orarestore** you set up earlier.
- The following example shows how you to use a secure copy (scp) command to move the file to the VM. You also can copy the contents to the clipboard, and then paste the contents in a new file that is set up on the VM.
+ ![Restore configuration values](./media/oracle-backup-recovery/recover-vm-04.png)
- > [!IMPORTANT]
- > In the following example, ensure that you update the IP address and folder values. The values must map to the folder where the file is saved.
- >
+1. To restore the VM, click the **Restore** button.
- ```bash
- $ scp vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py azureuser@<publicIpAddress>:/tmp
- ```
+1. To view the status of the restore process, click **Jobs**, and then click **Backup Jobs**.
+
+ ![Backup jobs status command](./media/oracle-backup-recovery/recover-vm-05.png)
+
+ Click on the **In Progress** restore operation to show the status of the restore process:
+
+ ![Status of the restore process](./media/oracle-backup-recovery/recover-vm-06.png)
# [Azure CLI](#tab/azure-cli)
-To list recovery points for your VM, use az backup recovery point list. In this example, we select the most recent recovery point for the VM named vmoracle19c that's protected in the Recovery Services Vault called myVault:
+To set up your storage account and file share, run the following commands in Azure CLI.
-```azurecli
+1. Create the storage account in the same resource group and location as your VM:
+
+ ```azurecli
+ az storage account create -n orarestore -g rg-oracle -l eastus --sku Standard_LRS
+ ```
+
+1. Retrieve the list of recovery points available.
+
+ ```azurecli
az backup recoverypoint list \
    --resource-group rg-oracle \
    --vault-name myVault \
To list recovery points for your VM, use az backup recovery point list. In this
    --item-name vmoracle19c \
    --query [0].name \
    --output tsv
-```
-
-To obtain the script that connects, or mounts, the recovery point to your VM, use az backup restore files mount-rp. The following example obtains the script for the VM named vmoracle19c that's protected in the Recovery Services Vault called myVault.
+ ```
-Replace myRecoveryPointName with the name of the recovery point that you obtained in the preceding command:
+1. Restore the recovery point to the storage account. Substitute `<myRecoveryPointName>` with a recovery point from the list generated in the previous step:
-```azurecli
- az backup restore files mount-rp \
+ ```azurecli
+ az backup restore restore-disks \
    --resource-group rg-oracle \
    --vault-name myVault \
    --container-name vmoracle19c \
    --item-name vmoracle19c \
- --rp-name myRecoveryPointName
-```
-
-The script is downloaded and a password is displayed, as in the following example:
+ --storage-account orarestore \
+ --rp-name <myRecoveryPointName> \
+ --target-resource-group rg-oracle
+ ```
-```bash
- File downloaded: vmoracle19c_eus_4598131610710119312_456133188157_6931c635931f402eb543ee554e1cf06f102c6fc513d933.py. Use password c4487e40c760d29
-```
+1. Retrieve the restore job details. The following command gets more details for the triggered restore job, including its name, which is needed to retrieve the template URI.
-Copy the .py file to the VM.
+ ```azurecli
+ az backup job list \
+ --resource-group rg-oracle \
+ --vault-name myVault \
+ --output table
+ ```
-The following example shows how you to use a secure copy (scp) command to move the file to the VM. You also can copy the contents to the clipboard, and then paste the contents in a new file that is set up on the VM.
+ The output will look similar to this (note down the name of the restore job):
-> [!IMPORTANT]
-> In the following example, ensure that you update the IP address and folder values. The values must map to the folder where the file is saved.
->
+ ```output
+ Name Operation Status Item Name Start Time UTC Duration
+ ------------------------------------ --------------- --------- ----------- -------------------------------- --------------
+ c009747a-0d2e-4ac9-9632-f695bf874693 Restore Completed vmoracle19c 2021-01-10T21:46:07.506223+00:00 0:03:06.634177
+ 6b779c98-f57a-4db1-b829-9e8eab454a52 Backup Completed vmoracle19c 2021-01-07T10:11:15.784531+00:00 0:21:13.220616
+ 502bc7ae-d429-4f0f-b78e-51d41b7582fc ConfigureBackup Completed vmoracle19c 2021-01-07T09:43:55.298755+00:00 0:00:30.839674
+ ```
-```bash
-$ scp vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py azureuser@<publicIpAddress>:/tmp
-```
-
+1. Retrieve the details of the URI to use for recreating the VM. Substitute the restore job name from the previous step for `<RestoreJobName>`.
-### Mount the restore point
+ ```azurecli
+ az backup job show \
+ -v myVault \
+ -g rg-oracle \
+ -n <RestoreJobName> \
+ --query properties.extendedInfo.propertyBag
+ ```
-1. Switch to the root user:
- ```bash
- sudo su -
- ``````
-1. Create a restore mount point and copy the script to it.
+ Output is similar to this:
- In the following example, create a */restore* directory for the snapshot to mount to, move the file to the directory, and change the file so that it's owned by the root user and made executable.
+ ```output
+ {
+ "Config Blob Container Name": "vmoracle19c-75aefd4b34c64dd39fdcd3db579783f2",
+ "Config Blob Name": "config-vmoracle19c-c009747a-0d2e-4ac9-9632-f695bf874693.json",
+ "Config Blob Uri": "https://orarestore.blob.core.windows.net/vmoracle19c-75aefd4b34c64dd39fdcd3db579783f2/config-vmoracle19c-c009747a-0d2e-4ac9-9632-f695bf874693.json",
+ "Job Type": "Recover disks",
+ "Recovery point time ": "1/7/2021 10:11:19 AM",
+ "Target Storage Account Name": "orarestore",
+ "Target resource group": "rg-oracle",
+ "Template Blob Uri": "https://orarestore.blob.core.windows.net/vmoracle19c-75aefd4b34c64dd39fdcd3db579783f2/azuredeployc009747a-0d2e-4ac9-9632-f695bf874693.json"
+ }
+ ```
- ```bash
- mkdir /restore
- chmod 777 /restore
- cd /restore
- cp /tmp/vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py /restore
- chmod 755 /restore/vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py
- ```
-
- Now execute the script to restore the backup. You will be asked to supply the password generated in Azure portal.
-
- ```bash
- ./vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py
- ```
+ The template name, which appears at the end of the Template Blob Uri (in this example `azuredeployc009747a-0d2e-4ac9-9632-f695bf874693.json`), and the Blob container name (`vmoracle19c-75aefd4b34c64dd39fdcd3db579783f2`) are listed.
- The following example shows what you should see after you run the preceding script. When you're prompted to continue, enter **Y**.
+ Use these values in the following command to assign variables in preparation for creating the VM. A SAS key is generated for the storage container with a 30-minute duration.
- ```output
- Microsoft Azure VM Backup - File Recovery
- ______________________________________________
- Please enter the password as shown on the portal to securely connect to the recovery point. : b1ad68e16dfafc6
- Connecting to recovery point using ISCSI service...
+ ```azurecli
+ expiretime=$(date -u -d "30 minutes" '+%Y-%m-%dT%H:%MZ')
+ connection=$(az storage account show-connection-string \
+ --resource-group rg-oracle \
+ --name orarestore \
+ --query connectionString)
+ token=$(az storage blob generate-sas \
+ --container-name <ContainerName> \
+ --name <TemplateName> \
+ --expiry $expiretime \
+ --permissions r \
+ --output tsv \
+ --connection-string $connection)
+ url=$(az storage blob url \
+ --container-name <ContainerName> \
+ --name <TemplateName> \
+ --connection-string $connection \
+ --output tsv)
+ ```
- Connection succeeded!
+ Now deploy the template to create the VM.
- Please wait while we attach volumes of the recovery point to this machine...
+ ```azurecli
+ az deployment group create \
+ --resource-group rg-oracle \
+ --template-uri $url?$token
+ ```
- ************ Volumes of the recovery point and their mount paths on this machine ************
+ You will be prompted to provide a name for the VM.
- Sr.No. | Disk | Volume | MountPath
+
- 1) | /dev/sdc | /dev/sdc1 | /restore/vmoracle19c-20201215123912/Volume1
+### Set the public IP address
- 2) | /dev/sdd | /dev/sdd1 | /restore/vmoracle19c-20201215123912/Volume2
+After the VM is restored, you should reassign the original IP address to the new VM.
- 3) | /dev/sdd | /dev/sdd2 | /restore/vmoracle19c-20201215123912/Volume3
+# [Portal](#tab/azure-portal)
- 4) | /dev/sdd | /dev/sdd15 | /restore/vmoracle19c-20201215123912/Volume5
+1. In the Azure portal, go to your Virtual Machine **vmoracle19c**. You will notice it has been assigned a new public IP and NIC similar to vmoracle19c-nic-XXXXXXXXXXXX, but does not have a DNS address. When the original VM was deleted, its public IP and NIC were retained; the next steps will reattach them to the new VM.
- The following partitions failed to mount since the OS couldn't identify the filesystem.
+ ![List of public IP addresses](./media/oracle-backup-recovery/create-ip-01.png)
- ************ Volumes from unknown filesystem ************
+1. Stop the VM
- Sr.No. | Disk | Volume | Partition Type
+ ![Create IP address](./media/oracle-backup-recovery/create-ip-02.png)
- 1) | /dev/sdb | /dev/sdb14 | BIOS Boot partition
+1. Go to **Networking**
- Please refer to '/restore/vmoracle19c-2020XXXXXXXXXX/Scripts/MicrosoftAzureBackupILRLogFile.log' for more details.
+ ![Associate IP address](./media/oracle-backup-recovery/create-ip-03.png)
- ************ Open File Explorer to browse for files. ************
+1. Click on **Attach network interface**, choose the original NIC **vmoracle19cVMNic**, which the original public IP address is still associated with, and click **OK**
- After recovery, remove the disks and close the connection to the recovery point by clicking the 'Unmount Disks' button from the portal or by using the relevant unmount command in case of powershell or CLI.
+ ![Select resource type and NIC values](./media/oracle-backup-recovery/create-ip-04.png)
- After unmounting disks, run the script with the parameter 'clean' to remove the mount paths of the recovery point from this machine.
+1. Now you must detach the NIC that was created with the VM restore operation as it is configured as the primary interface. Click on **Detach network interface** and choose the new NIC similar to **vmoracle19c-nic-XXXXXXXXXXXX**, then click **OK**
- Please enter 'q/Q' to exit...
- ```
-
-1. Access to the mounted volumes is confirmed.
-
- To exit, enter **q**, and then search for the mounted volumes. To create a list of the added volumes, at a command prompt, enter **df -h**.
+ ![Screenshot that shows where to select Detach network interface.](./media/oracle-backup-recovery/create-ip-05.png)
- ```
- [root@vmoracle19c restore]# df -h
- Filesystem Size Used Avail Use% Mounted on
- devtmpfs 3.8G 0 3.8G 0% /dev
- tmpfs 3.8G 0 3.8G 0% /dev/shm
- tmpfs 3.8G 17M 3.8G 1% /run
- tmpfs 3.8G 0 3.8G 0% /sys/fs/cgroup
- /dev/sdd2 30G 9.6G 18G 36% /
- /dev/sdb1 126G 736M 119G 1% /u02
- /dev/sda1 497M 199M 298M 41% /boot
- /dev/sda15 495M 9.7M 486M 2% /boot/efi
- tmpfs 771M 0 771M 0% /run/user/54322
- /dev/sdc1 126G 2.9G 117G 3% /restore/vmoracle19c-20201215123912/Volume1
- /dev/sdd1 497M 199M 298M 41% /restore/vmoracle19c-20201215123912/Volume2
- /dev/sdd2 30G 9.6G 18G 36% /restore/vmoracle19c-20201215123912/Volume3
- /dev/sdd15 495M 9.7M 486M 2% /restore/vmoracle19c-20201215123912/Volume5
- ```
-
-### Perform recovery
-Perform the following steps for each database on the VM:
-
-1. Restore the missing database files back to their location:
+ Your recreated VM will now have the original NIC, which is associated with the original IP address and Network Security Group rules.
+
+ ![IP address value](./media/oracle-backup-recovery/create-ip-06.png)
+
+1. Go back to the **Overview** and click **Start**
- ```bash
- cd /restore/vmoracle19c-2020XXXXXXXXXX/Volume1/oradata/ORATEST1
- cp * /u02/oradata/ORATEST1
- cd /u02/oradata/ORATEST1
- chown -R oracle:oinstall *
- ```
-1. Switch back to the oracle user
- ```bash
- sudo su - oracle
- ```
-1. Start the database instance and mount the controlfile for reading:
- ```bash
- sqlplus / as sysdba
- SQL> startup mount
- SQL> quit
- ```
+# [Azure CLI](#tab/azure-cli)
-1. Connect to the database with sysbackup:
- ```bash
- sqlplus / as sysbackup
- ```
-1. Initiate automatic database recovery:
+1. Stop and deallocate the VM:
- ```bash
- SQL> recover automatic database until cancel using backup controlfile;
+ ```azurecli
+ az vm deallocate --resource-group rg-oracle --name vmoracle19c
```
- > [!IMPORTANT]
- > Please note that it is important to specify the USING BACKUP CONTROLFILE syntax to inform the RECOVER AUTOMATIC DATABASE command that recovery should not stop at the Oracle system change number (SCN) recorded in the restored database control file. The restored database control file was a snapshot, along with the rest of the database, and the SCN stored within it is from the point-in-time of the snapshot. There may be transactions recorded after this point and we want to recover to the point-in-time of the last transaction committed to the database.
- When recovery completes successfully you will see the message `Media recovery complete`. However, when using the BACKUP CONTROLFILE clause the recover command will ignore online log files and it is possible there are changes in the current online redo log required to complete point in time recovery. In this situation you may see messages similar to these:
-
- ```output
- SQL> recover automatic database until cancel using backup controlfile;
- ORA-00279: change 2172930 generated at 04/08/2021 12:27:06 needed for thread 1
- ORA-00289: suggestion :
- /u02/fast_recovery_area/ORATEST1/archivelog/2021_04_08/o1_mf_1_13_%u_.arc
- ORA-00280: change 2172930 for thread 1 is in sequence #13
- ORA-00278: log file
- '/u02/fast_recovery_area/ORATEST1/archivelog/2021_04_08/o1_mf_1_13_%u_.arc' no
- longer needed for this recovery
- ORA-00308: cannot open archived log
- '/u02/fast_recovery_area/ORATEST1/archivelog/2021_04_08/o1_mf_1_13_%u_.arc'
- ORA-27037: unable to obtain file status
- Linux-x86_64 Error: 2: No such file or directory
- Additional information: 7
+1. List the current, restore-generated VM NIC:
- Specify log: {<RET>=suggested | filename | AUTO | CANCEL}
+ ```azurecli
+ az vm nic list --resource-group rg-oracle --vm-name vmoracle19c
```
-
- > [!IMPORTANT]
- > Note that if the current online redo log has been lost or corrupted and cannot be used, you may cancel recovery at this point.
- To correct this you can identify which is the current online log that has not been archived, and supply the fully qualified filename to the prompt.
--
- Open a new ssh connection
- ```bash
- ssh azureuser@<IP Address>
- ```
- Switch to the oracle user and set the Oracle SID
- ```bash
- sudo su - oracle
- export ORACLE_SID=oratest1
- ```
-
- Connect to the database and run the following query to find the online logfile
- ```bash
- sqlplus / as sysdba
- SQL> column member format a45
- SQL> set linesize 500
- SQL> select l.SEQUENCE#, to_char(l.FIRST_CHANGE#,'999999999999999') as CHK_CHANGE, l.group#, l.archived, l.status, f.member
- from v$log l, v$logfile f
- where l.group# = f.group#;
- ```
+ The output will look similar to this, listing the restore-generated NIC name as `vmoracle19cRestoredNICc2e8a8a4fc3f47259719d5523cd32dcf`:
- The output will look similar to this.
```output
- SEQUENCE# CHK_CHANGE GROUP# ARC STATUS MEMBER
- - - - -
- 13 2172929 1 NO CURRENT /u02/oradata/ORATEST1/redo01.log
- 12 2151934 3 YES INACTIVE /u02/oradata/ORATEST1/redo03.log
- 11 2071784 2 YES INACTIVE /u02/oradata/ORATEST1/redo02.log
+ {
+ "id": "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx/resourceGroups/rg-oracle/providers/Microsoft.Network/networkInterfaces/vmoracle19cRestoredNICc2e8a8a4fc3f47259719d5523cd32dcf",
+ "primary": true,
+ "resourceGroup": "rg-oracle"
+ }
```
- Copy the logfile path and file name for the CURRENT online log, in this example it is `/u02/oradata/ORATEST1/redo01.log`. Switch back to the ssh session running the recover command, input the logfile information and press return:
- ```bash
- Specify log: {<RET>=suggested | filename | AUTO | CANCEL}
- /u02/oradata/ORATEST1/redo01.log
- ```
+1. Attach the original NIC, which should have a name of `<VMName>VMNic`, in this case `vmoracle19cVMNic`. The original public IP address is still attached to this NIC and will be restored to the VM when the NIC is reattached.
- You should see the logfile is applied and recovery completes. Enter CANCEL to exit the recover command:
- ```output
- Specify log: {<RET>=suggested | filename | AUTO | CANCEL}
- /u02/oradata/ORATEST1/redo01.log
- Log applied.
- Media recovery complete.
+ ```azurecli
+ az vm nic add --nics vmoracle19cVMNic --resource-group rg-oracle --vm-name vmoracle19c
```
-1. Open the database
-
- > [!IMPORTANT]
- > The RESETLOGS option is required when the RECOVER command uses the USING BACKUP CONTROLFILE option. RESETLOGS creates a new incarnation of the database by resetting the redo history back to the beginning, because there is no way to determine how much of the previous database incarnation was skipped in the recovery.
+1. Detach the restore-generated NIC:
- ```bash
- SQL> alter database open resetlogs;
+ ```azurecli
+ az vm nic remove --nics vmoracle19cRestoredNICc2e8a8a4fc3f47259719d5523cd32dcf --resource-group rg-oracle --vm-name vmoracle19c
```
+
+1. Start the VM:
-
-1. Check the database content has been fully recovered:
-
- ```bash
- RMAN> SELECT * FROM scott.scott_table;
- ```
-
-1. Unmount the restore point.
-
- When all databases on the VM have been successfully recovered you may unmount the restore point. This can be done on the VM using the `unmount` command or in Azure portal from the File Recovery blade. You can also unmount the recovery volumes by running the Python script again with the **-clean** option.
-
- In the VM using unmount:
- ```bash
- sudo umount /restore/vmoracle19c-20210107110037/Volume*
+ ```azurecli
+ az vm start --resource-group rg-oracle --name vmoracle19c
```
- In the Azure portal, on the **File Recovery (Preview)** blade, click **Unmount Disks**.
-
- ![Unmount disks command](./media/oracle-backup-recovery/recovery-service-10.png)
-
--
-## Restore the entire VM
-
-Instead of restoring the deleted files from the Recovery Services vaults, you can restore the entire VM.
-
-To restore the entire VM, complete these steps:
-
-1. [Stop and delete the VM](#stop-and-delete-the-vm).
-1. [Recover the VM](#recover-the-vm).
-1. [Set the public IP address](#set-the-public-ip-address).
-1. [Perform database recovery](#perform-database-recovery).
+
-### Stop and delete the VM
+## Restore an individual database
+As multiple Oracle databases can be run on an Azure VM, there may be times when you want to restore and recover an individual database without disrupting the other databases running on the VM.
-# [Portal](#tab/azure-portal)
+To restore an individual database, complete these steps:
-1. In the Azure portal, go to the **vmoracle19c** Virtual Machine, and then select **Stop**.
+1. [Remove the database files](#remove-the-database-files).
+1. [Generate a restore script from the Recovery Services vault](#generate-a-restore-script-from-the-recovery-services-vault).
+1. [Mount the restore point](#mount-the-restore-point).
+1. [Restore the database files](#restore-the-database-files).
-1. When the Virtual Machine is no longer running, select **Delete** and then **Yes**.
+### Remove the database files
- ![Vault delete command](./media/oracle-backup-recovery/recover-vm-01.png)
+Later in this article, you'll learn how to test the recovery process. Before you can test it, you have to remove the database files.
-# [Azure CLI](#tab/azure-cli)
+1. Switch back to the oracle user:
+ ```bash
+ su - oracle
+ ```
-1. Stop and deallocate the VM:
+1. Shut down the Oracle instance:
- ```azurecli
- az vm deallocate --resource-group rg-oracle --name vmoracle19c
+ ```bash
+ sqlplus / as sysdba
+ SQL> shutdown abort
+ ORACLE instance shut down.
```
-1. Delete the VM. Enter 'y' when prompted:
+1. Remove the database datafiles and controlfiles to simulate a failure:
- ```azurecli
- az vm delete --resource-group rg-oracle --name vmoracle19c
+ ```bash
+ cd /u02/oradata/ORATEST1
+ rm -f *.dbf *.ctl
```

---
-### Recover the VM
+### Generate a restore script from the Recovery Services vault
# [Portal](#tab/azure-portal)
-1. Create a storage account for staging in the Azure portal.
-
- 1. In the Azure portal, select **+ Create a resource** and search for and select **Storage Account**.
-
- ![Screenshot that shows where to create a resource.](./media/oracle-backup-recovery/storage-1.png)
-
-
- 1. In the Create storage account page, choose your existing resource group **rg-oracle**, name your storage account **oracrestore** and choose **Storage V2 (generalpurpose v2)** for Account Kind. Change Replication to **Locally-redundant storage (LRS)** and set Performance to **Standard**. Ensure that Location is set to the same region as all your other resources in the resource group.
-
- ![Storage Account add page](./media/oracle-backup-recovery/recovery-storage-1.png)
-
- 1. Click on Review + Create and then click Create.
-
-1. In the Azure portal, search for the *myVault* Recovery Services vaults item and click on it.
+1. In the Azure portal, search for the *myVault* Recovery Services vaults item and select it.
![Recovery Services vaults myVault backup items](./media/oracle-backup-recovery/recovery-service-06.png)
-
-1. On the **Overview** blade, select **Backup items** and the select **Azure Virtual Machine**, which should have anon-zero Backup Item Count listed.
- ![Recovery Services vaults Azure Virtual Machine backup item count](./media/oracle-backup-recovery/recovery-service-07.png)
-
-1. On the Backups Items (Azure Virtual Machines), page your VM **vmoracle19c** is listed. Click on the VM name.
-
- ![Recovery VM page](./media/oracle-backup-recovery/recover-vm-02.png)
+1. On the **Overview** blade, select **Backup items** and then select **Azure Virtual Machine**, which should have a non-zero Backup Item Count listed.
-1. On the **vmoracle19c** blade, choose a restore point that has a consistency type of **Application Consistent** and click the ellipsis (**...**) on the right to bring up the menu. From the menu click **Restore VM**.
+ ![Recovery Services vaults Azure Virtual Machine backup item count](./media/oracle-backup-recovery/recovery-service-07.png)
- ![Restore VM command](./media/oracle-backup-recovery/recover-vm-03.png)
+1. On the Backup Items (Azure Virtual Machines) page, your VM **vmoracle19c** is listed. Click the ellipsis on the right to bring up the menu and select **File Recovery**.
-1. On the **Restore Virtual Machine** blade, choose **Create New** and **Create New Virtual Machine**. Enter the virtual machine name **vmoracle19c** and choose the VNet **vmoracle19cVNET**, the subnet will be automatically populated for you based on your VNet selection. The restore VM process requires an Azure storage account in the same resource group and region. You can choose the storage account **orarestore** you setup earlier.
+ ![Screenshot of the Recovery Services vaults file recovery page](./media/oracle-backup-recovery/recovery-service-08.png)
- ![Restore configuration values](./media/oracle-backup-recovery/recover-vm-04.png)
+1. On the **File Recovery (Preview)** pane, click **Download Script**. Then, save the download (.py) file to a folder on the client computer. A password is generated to run the script. Copy the password to a file for use later.
-1. To restore the VM, click the **Restore** button.
+ ![Download script file saves options](./media/oracle-backup-recovery/recovery-service-09.png)
-1. To view the status of the restore process, click **Jobs**, and then click **Backup Jobs**.
+1. Copy the .py file to the VM.
- ![Backup jobs status command](./media/oracle-backup-recovery/recover-vm-05.png)
+ The following example shows how to use a secure copy (scp) command to move the file to the VM. You also can copy the contents to the clipboard, and then paste the contents in a new file that is set up on the VM.
- Click on the **In Progress** restore operation to show the status of the restore process:
+ > [!IMPORTANT]
+ > In the following example, ensure that you update the IP address and folder values. The values must map to the folder where the file is saved.
+ >
- ![Status of the restore process](./media/oracle-backup-recovery/recover-vm-06.png)
+ ```bash
+ $ scp vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py azureuser@<publicIpAddress>:/tmp
+ ```
# [Azure CLI](#tab/azure-cli)
-To set up your storage account and file share, run the following commands in Azure CLI.
-
-1. Create the storage account in the same resource group and location as your VM:
-
- ```azurecli
- az storage account create -n orarestore -g rg-oracle -l eastus --sku Standard_LRS
- ```
-
-1. Retrieve the list of recovery points available.
+To list recovery points for your VM, use az backup recovery point list. In this example, we select the most recent recovery point for the VM named vmoracle19c that's protected in the Recovery Services Vault called myVault:
- ```azurecli
+```azurecli
az backup recoverypoint list \
    --resource-group rg-oracle \
    --vault-name myVault \
To set up your storage account and file share, run the following commands in Azu
    --item-name vmoracle19c \
    --query [0].name \
    --output tsv
- ```
+```
-1. Restore the recovery point to the storage account. Substitute `<myRecoveryPointName>` with a recovery point from the list generated in the previous step:
+To obtain the script that connects, or mounts, the recovery point to your VM, use az backup restore files mount-rp. The following example obtains the script for the VM named vmoracle19c that's protected in the Recovery Services Vault called myVault.
- ```azurecli
- az backup restore restore-disks \
+Replace myRecoveryPointName with the name of the recovery point that you obtained in the preceding command:
+
+```azurecli
+ az backup restore files mount-rp \
    --resource-group rg-oracle \
    --vault-name myVault \
    --container-name vmoracle19c \
    --item-name vmoracle19c \
- --storage-account orarestore \
- --rp-name <myRecoveryPointName> \
- --target-resource-group rg-oracle
- ```
-
-1. Retrieve the restore job details. The following command gets more details for the triggered restored job, including its name, which is needed to retrieve the template URI.
-
- ```azurecli
- az backup job list \
- --resource-group rg-oracle \
- --vault-name myVault \
- --output table
- ```
-
- The output will look similar to this `(Note down the name of the restore job)`:
-
- ```output
- Name Operation Status Item Name Start Time UTC Duration
- -- -- --
- c009747a-0d2e-4ac9-9632-f695bf874693 Restore Completed vmoracle19c 2021-01-10T21:46:07.506223+00:00 0:03:06.634177
- 6b779c98-f57a-4db1-b829-9e8eab454a52 Backup Completed vmoracle19c 2021-01-07T10:11:15.784531+00:00 0:21:13.220616
- 502bc7ae-d429-4f0f-b78e-51d41b7582fc ConfigureBackup Completed vmoracle19c 2021-01-07T09:43:55.298755+00:00 0:00:30.839674
- ```
-
-1. Retrieve the details of the URI to use for recreating the VM. Substitute the restore job name from the previous step for `<RestoreJobName>`.
-
- ```azurecli
- az backup job show \
- -v myVault \
- -g rg-oracle \
- -n <RestoreJobName> \
- --query properties.extendedInfo.propertyBag
- ```
-
- Output is similar to this:
+ --rp-name myRecoveryPointName
+```
- ```output
- {
- "Config Blob Container Name": "vmoracle19c-75aefd4b34c64dd39fdcd3db579783f2",
- "Config Blob Name": "config-vmoracle19c-c009747a-0d2e-4ac9-9632-f695bf874693.json",
- "Config Blob Uri": "https://orarestore.blob.core.windows.net/vmoracle19c-75aefd4b34c64dd39fdcd3db579783f2/config-vmoracle19c-c009747a-0d2e-4ac9-9632-f695bf874693.json",
- "Job Type": "Recover disks",
- "Recovery point time ": "1/7/2021 10:11:19 AM",
- "Target Storage Account Name": "orarestore",
- "Target resource group": "rg-oracle",
- "Template Blob Uri": "https://orarestore.blob.core.windows.net/vmoracle19c-75aefd4b34c64dd39fdcd3db579783f2/azuredeployc009747a-0d2e-4ac9-9632-f695bf874693.json"
- }
- ```
+The script is downloaded and a password is displayed, as in the following example:
- The template name, which is at the end of Template Blob Uri, which in this example is `azuredeployc009747a-0d2e-4ac9-9632-f695bf874693.json`, and the Blob container name, which is `vmoracle19c-75aefd4b34c64dd39fdcd3db579783f2` are listed.
+```bash
+ File downloaded: vmoracle19c_eus_4598131610710119312_456133188157_6931c635931f402eb543ee554e1cf06f102c6fc513d933.py. Use password c4487e40c760d29
+```
- Use these values in the following command to assign variables in preparation for creating the VM. A SAS key is generated for the storage container with 30-minutes duration.
+Copy the .py file to the VM.
+The following example shows how to use a secure copy (scp) command to move the file to the VM. You also can copy the contents to the clipboard, and then paste the contents in a new file that is set up on the VM.
- ```azurecli
- expiretime=$(date -u -d "30 minutes" '+%Y-%m-%dT%H:%MZ')
- connection=$(az storage account show-connection-string \
- --resource-group rg-oracle \
- --name orarestore \
- --query connectionString)
- token=$(az storage blob generate-sas \
- --container-name <ContainerName> \
- --name <TemplateName> \
- --expiry $expiretime \
- --permissions r \
- --output tsv \
- --connection-string $connection)
- url=$(az storage blob url \
- --container-name <ContainerName> \
- --name <TemplateName> \
- --connection-string $connection \
- --output tsv)
- ```
+> [!IMPORTANT]
+> In the following example, ensure that you update the IP address and folder values. The values must map to the folder where the file is saved.
+>
- Now deploy the template to create the VM.
+```bash
+$ scp vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py azureuser@<publicIpAddress>:/tmp
+```
+
- ```azurecli
- az deployment group create \
- --resource-group rg-oracle \
- --template-uri $url?$token
- ```
+### Mount the restore point
- You will be prompted to provide a name for the VM.
+1. Switch to the root user:
+ ```bash
+ sudo su -
+ ```
+1. Create a restore mount point and copy the script to it.
-
+ In the following example, create a */restore* directory for the snapshot to mount to, move the file to the directory, and change the file so that it's owned by the root user and made executable.
-### Set the public IP address
+ ```bash
+ mkdir /restore
+ chmod 777 /restore
+ cd /restore
+ cp /tmp/vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py /restore
+ chmod 755 /restore/vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py
+ ```
+
+ Now execute the script to restore the backup. You will be asked to supply the password generated in Azure portal.
+
+ ```bash
+ ./vmoracle19c_xxxxxx_xxxxxx_xxxxxx.py
+ ```
-After the VM is restored, you should reassign the original IP address to the new VM.
+ The following example shows what you should see after you run the preceding script. When you're prompted to continue, enter **Y**.
-# [Portal](#tab/azure-portal)
+ ```output
+ Microsoft Azure VM Backup - File Recovery
+ ______________________________________________
+ Please enter the password as shown on the portal to securely connect to the recovery point. : b1ad68e16dfafc6
-1. In the Azure portal, go to your Virtual Machine **vmoracle19c**. You will notice it has been assigned a new public IP and NIC similar to vmoracle19c-nic-XXXXXXXXXXXX, but does not have a DNS address. When the original VM was deleted its public IP and NIC are retained and the next steps will reattach them to the new VM.
+ Connecting to recovery point using ISCSI service...
- ![List of public IP addresses](./media/oracle-backup-recovery/create-ip-01.png)
+ Connection succeeded!
-1. Stop the VM
+ Please wait while we attach volumes of the recovery point to this machine...
- ![Create IP address](./media/oracle-backup-recovery/create-ip-02.png)
+ ************ Volumes of the recovery point and their mount paths on this machine ************
-1. Go to **Networking**
+ Sr.No. | Disk | Volume | MountPath
- ![Associate IP address](./media/oracle-backup-recovery/create-ip-03.png)
+ 1) | /dev/sdc | /dev/sdc1 | /restore/vmoracle19c-20201215123912/Volume1
-1. Click on **Attach network interface**, choose the original NIC **vmoracle19cVMNic, which the original public IP address is still associated to, and click **OK**
+ 2) | /dev/sdd | /dev/sdd1 | /restore/vmoracle19c-20201215123912/Volume2
- ![Select resource type and NIC values](./media/oracle-backup-recovery/create-ip-04.png)
+ 3) | /dev/sdd | /dev/sdd2 | /restore/vmoracle19c-20201215123912/Volume3
-1. Now you must detach the NIC that was created with the VM restore operation as it is configured as the primary interface. Click on **Detach network interface** and choose the new NIC similar to **vmoracle19c-nic-XXXXXXXXXXXX**, then click **OK**
+ 4) | /dev/sdd | /dev/sdd15 | /restore/vmoracle19c-20201215123912/Volume5
- ![Screenshot that shows where to select Detach network interface.](./media/oracle-backup-recovery/create-ip-05.png)
-
- Your recreated VM will now have the original NIC, which is associated with the original IP address and Network Security Group rules
-
- ![IP address value](./media/oracle-backup-recovery/create-ip-06.png)
-
-1. Go back to the **Overview** and click **Start**
+ The following partitions failed to mount since the OS couldn't identify the filesystem.
-# [Azure CLI](#tab/azure-cli)
+ ************ Volumes from unknown filesystem ************
-1. Stop and deallocate the VM:
+ Sr.No. | Disk | Volume | Partition Type
- ```azurecli
- az vm deallocate --resource-group rg-oracle --name vmoracle19c
- ```
+ 1) | /dev/sdb | /dev/sdb14 | BIOS Boot partition
-1. List the current, restore generated VM NIC
+ Please refer to '/restore/vmoracle19c-2020XXXXXXXXXX/Scripts/MicrosoftAzureBackupILRLogFile.log' for more details.
- ```azurecli
- az vm nic list --resource-group rg-oracle --vm-name vmoracle19c
- ```
+ ************ Open File Explorer to browse for files. ************
- The output will look similar to this, which lists the restore generated NIC name as `vmoracle19cRestoredNICc2e8a8a4fc3f47259719d5523cd32dcf`
+ After recovery, remove the disks and close the connection to the recovery point by clicking the 'Unmount Disks' button from the portal or by using the relevant unmount command in case of powershell or CLI.
- ```output
- {
- "id": "/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx/resourceGroups/rg-oracle/providers/Microsoft.Network/networkInterfaces/vmoracle19cRestoredNICc2e8a8a4fc3f47259719d5523cd32dcf",
- "primary": true,
- "resourceGroup": "rg-oracle"
- }
- ```
+ After unmounting disks, run the script with the parameter 'clean' to remove the mount paths of the recovery point from this machine.
-1. Attach original NIC, which should have a name of `<VMName>VMNic`, in this case `vmoracle19cVMNic`. The original Public IP address is still attached to this NIC and will be restored to the VM when the NIC is reattached.
+ Please enter 'q/Q' to exit...
+ ```
- ```azurecli
- az vm nic add --nics vmoracle19cVMNic --resource-group rg-oracle --vm-name vmoracle19c
- ```
+1. Access to the mounted volumes is confirmed.
-1. Detach the restore generated NIC
+ To exit, enter **q**, and then search for the mounted volumes. To create a list of the added volumes, at a command prompt, enter **df -h**.
+
+ ```
+ [root@vmoracle19c restore]# df -h
+ Filesystem Size Used Avail Use% Mounted on
+ devtmpfs 3.8G 0 3.8G 0% /dev
+ tmpfs 3.8G 0 3.8G 0% /dev/shm
+ tmpfs 3.8G 17M 3.8G 1% /run
+ tmpfs 3.8G 0 3.8G 0% /sys/fs/cgroup
+ /dev/sdd2 30G 9.6G 18G 36% /
+ /dev/sdb1 126G 736M 119G 1% /u02
+ /dev/sda1 497M 199M 298M 41% /boot
+ /dev/sda15 495M 9.7M 486M 2% /boot/efi
+ tmpfs 771M 0 771M 0% /run/user/54322
+ /dev/sdc1 126G 2.9G 117G 3% /restore/vmoracle19c-20201215123912/Volume1
+ /dev/sdd1 497M 199M 298M 41% /restore/vmoracle19c-20201215123912/Volume2
+ /dev/sdd2 30G 9.6G 18G 36% /restore/vmoracle19c-20201215123912/Volume3
+ /dev/sdd15 495M 9.7M 486M 2% /restore/vmoracle19c-20201215123912/Volume5
+ ```
- ```azurecli
- az vm nic remove --nics vmoracle19cRestoredNICc2e8a8a4fc3f47259719d5523cd32dcf --resource-group rg-oracle --vm-name vmoracle19c
- ```
-1. Start the VM:
+### Restore the database files
+Perform the following steps for the database on the VM you want to restore:
- ```azurecli
- az vm start --resource-group rg-oracle --name vmoracle19c
- ```
+1. Restore the missing database files back to their location:
-
+ ```bash
+ cd /restore/vmoracle19c-2020XXXXXXXXXX/Volume1/oradata/ORATEST1
+ cp * /u02/oradata/ORATEST1
+ cd /u02/oradata/ORATEST1
+ chown -R oracle:oinstall *
+ ```
+Now that the database files have been restored, you must recover the database. Follow the steps in [Database Recovery](#recovery-after-an-individual-database-restore) to complete the recovery.
-### Perform database recovery
-First reconnect to the VM:
+## Database Recovery
-```azurecli
-ssh azureuser@<publicIpAddress>
-```
+### Recovery after complete VM restore
-When the whole VM has been restored, it is important to recover each database on the VM by performing the following steps on each:
+1. First reconnect to the VM:
+ ```bash
+ ssh azureuser@<publicIpAddress>
+ ```
+ > [!IMPORTANT]
+ > When the whole VM has been restored, it is important to recover each database on the VM by performing the following steps on each:
1. You may find that the instance is already running, because autostart attempted to start the database when the VM booted. However, the database requires recovery and is likely to be at mount stage only, so run a preparatory shutdown first, followed by a startup to mount stage.
When the whole VM has been restored, it is important to recover each database on
```bash
SQL> shutdown immediate
SQL> startup mount
```
-
+
1. Perform database recovery

   > [!IMPORTANT]
   > It is important to specify the USING BACKUP CONTROLFILE syntax to inform the RECOVER AUTOMATIC DATABASE command that recovery should not stop at the Oracle system change number (SCN) recorded in the restored database control file. The restored database control file is a snapshot, along with the rest of the database, and the SCN stored within it is from the point in time of the snapshot. There may be transactions recorded after this point, and we want to recover to the point in time of the last transaction committed to the database.
When the whole VM has been restored, it is important to recover each database on
```bash
SQL> select * from scott.scott_table;
```
+### Recovery after an individual database restore
+
+1. Switch back to the oracle user:
+ ```bash
+ sudo su - oracle
+ ```
+1. Start the database instance and mount the controlfile for reading:
+ ```bash
+ sqlplus / as sysdba
+ SQL> startup mount
+ SQL> quit
+ ```
+
+1. Connect to the database as sysbackup:
+ ```bash
+ sqlplus / as sysbackup
+ ```
+1. Initiate automatic database recovery:
+
+ ```bash
+ SQL> recover automatic database until cancel using backup controlfile;
+ ```
+ > [!IMPORTANT]
+ > It is important to specify the USING BACKUP CONTROLFILE syntax to inform the RECOVER AUTOMATIC DATABASE command that recovery should not stop at the Oracle system change number (SCN) recorded in the restored database control file. The restored database control file is a snapshot, along with the rest of the database, and the SCN stored within it is from the point in time of the snapshot. There may be transactions recorded after this point, and we want to recover to the point in time of the last transaction committed to the database.
+
+ When recovery completes successfully, you will see the message `Media recovery complete`. However, when the BACKUP CONTROLFILE clause is used, the recover command ignores online log files, and changes in the current online redo log may be required to complete point-in-time recovery. In this situation, you may see messages similar to these:
+
+ ```output
+ SQL> recover automatic database until cancel using backup controlfile;
+ ORA-00279: change 2172930 generated at 04/08/2021 12:27:06 needed for thread 1
+ ORA-00289: suggestion :
+ /u02/fast_recovery_area/ORATEST1/archivelog/2021_04_08/o1_mf_1_13_%u_.arc
+ ORA-00280: change 2172930 for thread 1 is in sequence #13
+ ORA-00278: log file
+ '/u02/fast_recovery_area/ORATEST1/archivelog/2021_04_08/o1_mf_1_13_%u_.arc' no
+ longer needed for this recovery
+ ORA-00308: cannot open archived log
+ '/u02/fast_recovery_area/ORATEST1/archivelog/2021_04_08/o1_mf_1_13_%u_.arc'
+ ORA-27037: unable to obtain file status
+ Linux-x86_64 Error: 2: No such file or directory
+ Additional information: 7
+
+ Specify log: {<RET>=suggested | filename | AUTO | CANCEL}
+ ```
+
+ > [!IMPORTANT]
+ > If the current online redo log has been lost or corrupted and cannot be used, you may cancel recovery at this point.
+
+ To correct this, identify the current online log that has not been archived, and supply its fully qualified filename at the prompt.
+
+ Open a new SSH connection:
+ ```bash
+ ssh azureuser@<IP Address>
+ ```
+ Switch to the oracle user and set the Oracle SID:
+ ```bash
+ sudo su - oracle
+ export ORACLE_SID=oratest1
+ ```
+
+ Connect to the database and run the following query to find the current online logfile:
+ ```bash
+ sqlplus / as sysdba
+ SQL> column member format a45
+ SQL> set linesize 500
+ SQL> select l.SEQUENCE#, to_char(l.FIRST_CHANGE#,'999999999999999') as CHK_CHANGE, l.group#, l.archived, l.status, f.member
+ from v$log l, v$logfile f
+ where l.group# = f.group#;
+ ```
+
+ The output will look similar to this:
+ ```output
+ SEQUENCE# CHK_CHANGE GROUP# ARC STATUS MEMBER
+ ---------- ---------------- ---------- --- ---------------- ---------------------------------------------
+ 13 2172929 1 NO CURRENT /u02/oradata/ORATEST1/redo01.log
+ 12 2151934 3 YES INACTIVE /u02/oradata/ORATEST1/redo03.log
+ 11 2071784 2 YES INACTIVE /u02/oradata/ORATEST1/redo02.log
+ ```
+ Copy the logfile path and file name for the CURRENT online log; in this example, it is `/u02/oradata/ORATEST1/redo01.log`. Switch back to the SSH session running the recover command, enter the logfile information, and then press return:
+
+ ```bash
+ Specify log: {<RET>=suggested | filename | AUTO | CANCEL}
+ /u02/oradata/ORATEST1/redo01.log
+ ```
+
+ You should see that the logfile is applied and recovery completes. Enter CANCEL to exit the recover command:
+ ```output
+ Specify log: {<RET>=suggested | filename | AUTO | CANCEL}
+ /u02/oradata/ORATEST1/redo01.log
+ Log applied.
+ Media recovery complete.
+ ```
+
+1. Open the database
+
+ > [!IMPORTANT]
+ > The RESETLOGS option is required when the RECOVER command uses the USING BACKUP CONTROLFILE option. RESETLOGS creates a new incarnation of the database by resetting the redo history back to the beginning, because there is no way to determine how much of the previous database incarnation was skipped in the recovery.
+
+ ```bash
+ SQL> alter database open resetlogs;
+ ```
+
+
+1. Check that the database content has been fully recovered:
+
+ ```bash
+ SQL> select * from scott.scott_table;
+ ```
+
+1. Unmount the restore point.
+
+ When all databases on the VM have been successfully recovered, you may unmount the restore point. You can do this on the VM with the `umount` command, or in the Azure portal from the File Recovery blade. You can also unmount the recovery volumes by running the Python script again with the **-clean** option.
+
+ On the VM, using `umount`:
+ ```bash
+ sudo umount /restore/vmoracle19c-20210107110037/Volume*
+ ```
+
+ In the Azure portal, on the **File Recovery (Preview)** blade, click **Unmount Disks**.
-The backup and recovery of the Oracle Database 19c database on an Azure Linux VM is now finished.
+ ![Unmount disks command](./media/oracle-backup-recovery/recovery-service-10.png)
+
+The backup and recovery of the Oracle Database on an Azure Linux VM is now finished.
More information about Oracle commands and concepts can be found in the Oracle documentation, including:
More information about Oracle commands and concepts can be found in the Oracle d
* [Oracle ARCHIVE_LAG_TARGET parameter](https://docs.oracle.com/en/database/oracle/oracle-database/19/refrn/ARCHIVE_LAG_TARGET.html#GUID-405D335F-5549-4E02-AFB9-434A24465F0B)

## Delete the VM

When you no longer need the VM, you can use the following commands to remove the resource group, the VM, and all related resources:
virtual-machines Oracle Database Backup Azure Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/oracle/oracle-database-backup-azure-storage.md
Title: Back up an Oracle Database 19c database on an Azure Linux VM with RMAN and Azure Files
-description: Learn how to back up an Oracle Database 19c database to Azure Files.
+ Title: Back up an Oracle Database on an Azure Linux VM with RMAN and Azure Files
+description: Learn how to back up an Oracle Database to Azure Files.
-# Back up and recover an Oracle Database 19c database on an Azure Linux VM using Azure Files
+# Back up and recover an Oracle Database on an Azure Linux VM using Azure Files
**Applies to:** :heavy_check_mark: Linux VMs
-This article demonstrates the use of Azure Files as a media to back up and restore an Oracle database running on an Azure VM. You will back up the database using Oracle RMAN to an Azure file share mounted to the VM using the SMB protocol. Using Azure Files for backup media is extremely cost effective and performant. However, for very large databases, Azure Backup provides a better solution.
+This article demonstrates the use of Azure Files as a medium to back up and restore an Oracle database running on an Azure VM. The steps in this article have been tested against Oracle 12.1 and higher. You will back up the database using Oracle RMAN to an Azure file share mounted to the VM using the SMB protocol. Using Azure Files as backup media is extremely cost-effective and performant. However, for very large databases, Azure Backup provides a better solution.
[!INCLUDE [azure-cli-prepare-your-environment.md](../../../../includes/azure-cli-prepare-your-environment.md)]

-- To perform the backup and recovery process, you must first create a Linux VM that has an installed instance of Oracle Database 19c. The Marketplace image currently used to create the VM is **Oracle:oracle-database-19-3:oracle-database-19-0904:latest**. Follow the steps in the [Oracle create database quickstart](./oracle-database-quick-create.md) to create an Oracle database to complete this tutorial.
+- To perform the backup and recovery process, you must first create a Linux VM that has an installed instance of Oracle Database. We recommend using Oracle 12.x or higher.
+
+- Follow the steps in the [Oracle create database quickstart](./oracle-database-quick-create.md) to create an Oracle database to complete this tutorial.
## Prepare the database environment
This article demonstrates the use of Azure Files as a media to back up and resto
To back up to Azure Files, complete these steps:
-1. Set up Azure Files.
-1. Mount the Azure file share to your VM.
-1. Back up the database.
-1. Restore and recover the database.
+1. [Set up Azure Files](#set-up-azure-files).
+1. [Mount the Azure file share to your VM](#mount-the-azure-storage-file-share-to-your-vm).
+1. [Back up the database](#backup-the-database).
+1. [Restore and recover the database](#restore-and-recover-the-database).
### Set up Azure Files
To set up your storage account and file share run the following commands in Azur
//orabackup1.file.core.windows.net/orabackup 10T 0 10T 0% /mnt/orabackup
```
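For reference, a minimal sketch of how such a share is mounted over SMB, consistent with the output above (the storage account key placeholder and mount options are illustrative assumptions; follow the setup steps in this article for the exact commands):

```bash
# Mount the Azure file share over SMB 3.0 (requires cifs-utils; key is a placeholder)
sudo mkdir -p /mnt/orabackup
sudo mount -t cifs //orabackup1.file.core.windows.net/orabackup /mnt/orabackup \
  -o vers=3.0,username=orabackup1,password='<storage-account-key>',dir_mode=0777,file_mode=0777,serverino
```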
-### Back up the database
+### Backup the database
In this section, we will be using Oracle Recovery Manager (RMAN) to take a full backup of the database and archive logs and write the backup as a backup set to the Azure File share mounted earlier.
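As a hedged sketch of the general shape of such a backup (the exact RMAN commands and format string used in this article may differ):

```bash
rman target /
RMAN> configure channel device type disk format '/mnt/orabackup/%d_%T_%U';
RMAN> backup as backupset database plus archivelog;
```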
virtual-machines Oracle Database Backup Strategies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/oracle/oracle-database-backup-strategies.md
A vault created with GRS redundancy includes the option to configure the [Cross
Azure Backup service provides a [framework](../../../backup/backup-azure-linux-app-consistent.md) to achieve application consistency during backups of Windows and Linux VMs for various applications like Oracle, MySQL, MongoDB, SAP HANA, and PostgreSQL, called application consistent snapshots. This involves invoking a pre-script (to quiesce the applications) before taking a snapshot of disks and calling a post-script (commands to unfreeze the applications) after the snapshot is completed, to return the applications to normal mode. While sample pre-scripts and post-scripts are provided on GitHub, the creation and maintenance of these scripts is your responsibility. In the case of Oracle, the database must be in archivelog mode to allow online backups, and the appropriate database begin and end backup commands are run in the pre-scripts and post-scripts, which you must create and maintain.
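As an illustration of the begin/end backup pattern described above, a hedged sketch of what a custom pre-script might run (the surrounding script framework, paths, and error handling are your responsibility and are omitted here):

```bash
#!/bin/bash
# Hypothetical pre-script body: quiesce Oracle before the disk snapshot.
# A matching post-script would run 'alter database end backup;'.
sudo su - oracle -c "sqlplus -s / as sysdba <<EOF
alter database begin backup;
EOF"
```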
-Azure Backup is now providing an [enhanced pre-script and post-script framework](../../../backup/backup-azure-linux-database-consistent-enhanced-pre-post.md), now generally available, where the Azure Backup service will provide packaged pre-scripts and post-scripts for selected applications. Azure Backup users just need to name the application and then Azure VM backup will automatically invoke the relevant pre-scripts and post-scripts. The packaged pre-scripts and post-scripts will be maintained by the Azure Backup team so users can be assured of the support, ownership, and validity of these scripts. Currently, the supported applications for the enhanced framework are Oracle and MySQL, with more application types expected in the future. The snapshot will be a full copy of the storage and not an incremental or Copy on Write snapshot, so it is an effective medium to restore your database from.
+Azure Backup now provides an [enhanced pre-script and post-script framework](../../../backup/backup-azure-linux-database-consistent-enhanced-pre-post.md), generally available, in which the Azure Backup service supplies packaged pre-scripts and post-scripts for selected applications. Azure Backup users just need to name the application, and Azure VM backup automatically invokes the relevant pre-scripts and post-scripts. The packaged pre-scripts and post-scripts are maintained by the Azure Backup team, so users can be assured of the support, ownership, and validity of these scripts. Currently, the supported applications for the enhanced framework are Oracle 12.1 or higher and MySQL, with more application types expected in the future. The snapshot is a full copy of the storage, not an incremental or copy-on-write snapshot, so it is an effective medium from which to restore your database.
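As an illustration, the enhanced framework is driven by a workload configuration file on the VM; a hedged sketch of its general shape follows (the key names here are assumptions; see the linked framework article for the authoritative format):

```
# /etc/azure/workload.conf (illustrative; consult the enhanced framework article)
[workload]
workload_name = oracle
configuration_path = /etc/oratab
timeout = 90
linux_user = azbackup
```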
+
+## VLDB considerations
+
+Backup strategies for very large databases (VLDBs) require careful consideration due to their size. Using RMAN to back up to Azure Blob or Azure Files storage may not provide the throughput required to back up a VLDB in the target time frame. RMAN incremental backups can reduce backup sizes, which may allow Azure Storage to be used as the backup medium for VLDBs; however, for VLDBs with high volumes of changes this might not be effective.
+
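An illustrative sketch of the incremental approach mentioned above (not a sizing recommendation; schedules and destinations are placeholders):

```bash
rman target /
RMAN> backup incremental level 0 database;   # periodic full (level 0) backup
RMAN> backup incremental level 1 database;   # frequent backups of changed blocks only
```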
+Using Azure services that provide snapshot capabilities, such as the Azure Backup service or Azure NetApp Files (ANF), is the recommended approach for VLDBs. Application-consistent snapshots, where the databases are automatically placed in and out of backup mode, take only seconds to create regardless of the size of the database.
+
+Your backup strategy may also be tied to the overall storage solution used for the Oracle database. Database workloads with extreme I/O throughput requirements often use Azure NetApp Files or third-party Azure Marketplace solutions, such as Silk, to meet the storage throughput and IOPS requirements of the database. These solutions also provide application-consistent snapshots that can be used for fast database backup and restore operations.
## Next steps

- [Create Oracle Database quickstart](oracle-database-quick-create.md)
- [Back up Oracle Database to Azure Files](oracle-database-backup-azure-storage.md)
-- [Back up Oracle Database using Azure Backup service](oracle-database-backup-azure-backup.md)
+- [Back up Oracle Database using Azure Backup service](oracle-database-backup-azure-backup.md)
virtual-machines Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/get-started.md
ms.assetid: ad8e5c75-0cf6-4564-ae62-ea1246b4e5f2
vm-linux Previously updated : 03/15/2022 Last updated : 03/28/2022
In this section, you find documents about Microsoft Power BI integration into SA
## Change Log
+- March 28, 2022: Formatting changes and reorganizing ILB configuration instructions in: [HA for SAP HANA on Azure VMs on SLES](./sap-hana-high-availability.md), [HA for SAP HANA Scale-up with Azure NetApp Files on SLES](./sap-hana-high-availability-netapp-files-suse.md), [HA for SAP HANA on Azure VMs on RHEL](./sap-hana-high-availability-rhel.md), [HA for SAP HANA scale-up with ANF on RHEL](./sap-hana-high-availability-netapp-files-red-hat.md), [HA for SAP NW on SLES with NFS on Azure Files](./high-availability-guide-suse-nfs-azure-files.md), [HA for SAP NW on Azure VMs on SLES with ANF](./high-availability-guide-suse-netapp-files.md), [HA for SAP NW on Azure VMs on SLES for SAP applications](./high-availability-guide-suse.md), [HA for NFS on Azure VMs on SLES](./high-availability-guide-suse-nfs.md), [HA for SAP NW on Azure VMs on SLES multi-SID guide](./high-availability-guide-suse-multi-sid.md), [HA for SAP NW on RHEL with NFS on Azure Files](./high-availability-guide-rhel-nfs-azure-files.md), [HA for SAP NW on Azure VMs on RHEL with ANF](./high-availability-guide-rhel-netapp-files.md), [HA for SAP NW on Azure VMs on RHEL for SAP applications](./high-availability-guide-rhel.md) and [HA for SAP NW on Azure VMs on RHEL multi-SID guide](./high-availability-guide-rhel-multi-sid.md)
- March 15, 2022: Corrected rsize and wsize mount option settings for ANF in [IBM Db2 Azure Virtual Machines DBMS deployment for SAP workload](./dbms_guide_ibm.md)
- March 1, 2022: Corrected note about database snapshots with multiple database containers in [SAP HANA Large Instances high availability and disaster recovery on Azure](./hana-overview-high-availability-disaster-recovery.md)
- February 28, 2022: Added E(d)sv5 VM storage configurations to [SAP HANA Azure virtual machine storage configurations](./hana-vm-operations-storage.md)
virtual-machines High Availability Guide Rhel Multi Sid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/high-availability-guide-rhel-multi-sid.md
vm-windows Previously updated : 01/24/2022 Last updated : 03/28/2022
To achieve high availability, SAP NetWeaver requires highly available shares. In
SAP NetWeaver ASCS, SAP NetWeaver SCS and SAP NetWeaver ERS use virtual hostname and virtual IP addresses. On Azure, a load balancer is required to use a virtual IP address. We recommend using [Standard load balancer](../../../load-balancer/quickstart-load-balancer-standard-public-portal.md).
-The following list shows the configuration of the (A)SCS and ERS load balancer for this multi-SID cluster example with three SAP systems. You will need separate frontend IP, health probes, and load-balancing rules for each ASCS and ERS instance for each of the SIDs. Assign all VMs, that are part of the ASCS/ASCS cluster to one backend pool of a single ILB.
-
-### (A)SCS
-
-* Frontend configuration
- * IP address for NW1: 10.3.1.50
- * IP address for NW2: 10.3.1.52
- * IP address for NW3: 10.3.1.54
-
-* Probe Ports
- * Port 620<strong>&lt;nr&gt;</strong>, therefore for NW1, NW2, and NW3 probe ports 620**00**, 620**10** and 620**20**
-* Load-balancing rules - create one for each instance, that is, NW1/ASCS, NW2/ASCS and NW3/ASCS.
- * If using Standard Load Balancer, select **HA ports**
- * If using Basic Load Balancer, create Load balancing rules for the following ports
- * 32<strong>&lt;nr&gt;</strong> TCP
- * 36<strong>&lt;nr&gt;</strong> TCP
- * 39<strong>&lt;nr&gt;</strong> TCP
- * 81<strong>&lt;nr&gt;</strong> TCP
- * 5<strong>&lt;nr&gt;</strong>13 TCP
- * 5<strong>&lt;nr&gt;</strong>14 TCP
- * 5<strong>&lt;nr&gt;</strong>16 TCP
-
-### ERS
-
-* Frontend configuration
- * IP address for NW1 10.3.1.51
- * IP address for NW2 10.3.1.53
- * IP address for NW3 10.3.1.55
-
-* Probe Port
- * Port 621<strong>&lt;nr&gt;</strong>, therefore for NW1, NW2, and N3 probe ports 621**02**, 621**12** and 621**22**
-* Load-balancing rules - create one for each instance, that is, NW1/ERS, NW2/ERS and NW3/ERS.
- * If using Standard Load Balancer, select **HA ports**
- * If using Basic Load Balancer, create Load balancing rules for the following ports
- * 32<strong>&lt;nr&gt;</strong> TCP
- * 33<strong>&lt;nr&gt;</strong> TCP
- * 5<strong>&lt;nr&gt;</strong>13 TCP
- * 5<strong>&lt;nr&gt;</strong>14 TCP
- * 5<strong>&lt;nr&gt;</strong>16 TCP
-
-* Backend configuration
- * Connected to primary network interfaces of all virtual machines that should be part of the (A)SCS/ERS cluster
+* Frontend IP addresses for ASCS: 10.3.1.50 (NW1), 10.3.1.52 (NW2) and 10.3.1.54 (NW3)
+* Frontend IP addresses for ERS: 10.3.1.51 (NW1), 10.3.1.53 (NW2) and 10.3.1.55 (NW3)
+* Probe port 62000 for NW1 ASCS, 62010 for NW2 ASCS and 62020 for NW3 ASCS
+* Probe port 62102 for NW1 ERS, 62112 for NW2 ERS and 62122 for NW3 ERS
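As a hedged Azure CLI sketch of creating one such frontend IP and health probe per SID (resource and network names are placeholders):

```azurecli
az network lb frontend-ip create --resource-group MyResourceGroup --lb-name MySapILB \
    --name NW1-ascs-frontend --vnet-name MyVNet --subnet MySubnet --private-ip-address 10.3.1.50

az network lb probe create --resource-group MyResourceGroup --lb-name MySapILB \
    --name NW1-ascs-probe --protocol Tcp --port 62000
```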
> [!IMPORTANT]
> Floating IP is not supported on a NIC secondary IP configuration in load-balancing scenarios. For details, see [Azure Load Balancer limitations](../../../load-balancer/load-balancer-multivip-overview.md#limitations). If you need an additional IP address for the VM, deploy a second NIC.
virtual-machines High Availability Guide Rhel Netapp Files https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/high-availability-guide-rhel-netapp-files.md
vm-windows Previously updated : 02/13/2022 Last updated : 03/25/2022
Now it is possible to achieve SAP Netweaver HA by using shared storage, deployed
![SAP NetWeaver High Availability overview](./media/high-availability-guide-rhel/high-availability-guide-rhel-anf.png)
-SAP NetWeaver ASCS, SAP NetWeaver SCS, SAP NetWeaver ERS, and the SAP HANA database use virtual hostname and virtual IP addresses. On Azure, a load balancer is required to use a virtual IP address. We recommend using [Standard load balancer](../../../load-balancer/quickstart-load-balancer-standard-public-portal.md). The following list shows the configuration of the load balancer with separate front-end IPs for (A)SCS and ERS.
-
-### (A)SCS
-
-* Frontend configuration
- * IP address 192.168.14.9
-* Probe Port
- * Port 620<strong>&lt;nr&gt;</strong>
-* Load-balancing rules
- * If using Standard Load Balancer, select **HA ports**
- * 32<strong>&lt;nr&gt;</strong> TCP
- * 36<strong>&lt;nr&gt;</strong> TCP
- * 39<strong>&lt;nr&gt;</strong> TCP
- * 81<strong>&lt;nr&gt;</strong> TCP
- * 5<strong>&lt;nr&gt;</strong>13 TCP
- * 5<strong>&lt;nr&gt;</strong>14 TCP
- * 5<strong>&lt;nr&gt;</strong>16 TCP
-
-### ERS
-
-* Frontend configuration
- * IP address 192.168.14.10
-* Probe Port
- * Port 621<strong>&lt;nr&gt;</strong>
-* Load-balancing rules
- * If using Standard Load Balancer, select **HA ports**
- * 32<strong>&lt;nr&gt;</strong> TCP
- * 33<strong>&lt;nr&gt;</strong> TCP
- * 5<strong>&lt;nr&gt;</strong>13 TCP
- * 5<strong>&lt;nr&gt;</strong>14 TCP
- * 5<strong>&lt;nr&gt;</strong>16 TCP
-
-* Backend configuration
- * Connected to primary network interfaces of all virtual machines that should be part of the (A)SCS/ERS cluster
+SAP NetWeaver ASCS, SAP NetWeaver SCS, SAP NetWeaver ERS, and the SAP HANA database use virtual hostname and virtual IP addresses. On Azure, a load balancer is required to use a virtual IP address. We recommend using [Standard load balancer](../../../load-balancer/quickstart-load-balancer-standard-public-portal.md). The presented configuration shows a load balancer with:
+
+* Frontend IP address 192.168.14.9 for ASCS
+* Frontend IP address 192.168.14.10 for ERS
+* Probe port 62000 for ASCS
+* Probe port 62101 for ERS
## Setting up the Azure NetApp Files infrastructure
First you need to create the Azure NetApp Files volumes. Deploy the VMs. Afterwa
1. **Make sure to enable Floating IP**
1. Click OK
* Repeat the steps above to create load balancing rules for ERS (for example **lb.QAS.ERS**)
-1. Alternatively, if your scenario requires basic load balancer (internal), follow these steps:
+1. Alternatively, ***only if*** your scenario requires a basic load balancer (internal), follow these configuration steps instead to create a basic load balancer:
1. Create the frontend IP addresses
   1. IP address 192.168.14.9 for the ASCS
   1. Open the load balancer, select frontend IP pool, and click Add
virtual-machines High Availability Guide Rhel Nfs Azure Files https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/high-availability-guide-rhel-nfs-azure-files.md
vm-windows Previously updated : 01/24/2022 Last updated : 03/28/2022
After you deploy the VMs for your SAP system, create a load balancer. Then, use
1. **Make sure to enable Floating IP**
1. Click OK
* Repeat the steps above to create load balancing rules for ERS (for example **lb.NW1.ERS**)
-1. Alternatively, if your scenario requires basic load balancer (internal), follow these steps:
+1. Alternatively, ***only if*** your scenario requires a basic load balancer (internal), follow these steps instead to create a basic load balancer:
1. Create the frontend IP addresses
   1. IP address 10.90.90.10 for the ASCS
   1. Open the load balancer, select frontend IP pool, and click Add
virtual-machines High Availability Guide Rhel https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/high-availability-guide-rhel.md
vm-windows Previously updated : 01/24/2022 Last updated : 03/28/2022
To achieve high availability, SAP NetWeaver requires shared storage. GlusterFS i
![SAP NetWeaver High Availability overview](./media/high-availability-guide-rhel/ha-rhel.png)
-SAP NetWeaver ASCS, SAP NetWeaver SCS, SAP NetWeaver ERS, and the SAP HANA database use virtual hostname and virtual IP addresses. On Azure, a load balancer is required to use a virtual IP address. We recommend using [Standard load balancer](../../../load-balancer/quickstart-load-balancer-standard-public-portal.md). The following list shows the configuration of the (A)SCS and ERS load balancer.
-
-### (A)SCS
-
-* Frontend configuration
- * IP address 10.0.0.7
-* Probe Port
- * Port 620<strong>&lt;nr&gt;</strong>
-* Load-balancing rules
- * If using Standard Load Balancer, select **HA ports**
- * If using Basic Load Balancer, create Load balancing rules for the following ports
- * 32<strong>&lt;nr&gt;</strong> TCP
- * 36<strong>&lt;nr&gt;</strong> TCP
- * 39<strong>&lt;nr&gt;</strong> TCP
- * 81<strong>&lt;nr&gt;</strong> TCP
- * 5<strong>&lt;nr&gt;</strong>13 TCP
- * 5<strong>&lt;nr&gt;</strong>14 TCP
- * 5<strong>&lt;nr&gt;</strong>16 TCP
-
-### ERS
-
-* Frontend configuration
- * IP address 10.0.0.8
-* Probe Port
- * Port 621<strong>&lt;nr&gt;</strong>
-* Load-balancing rules
- * If using Standard Load Balancer, select **HA ports**
- * If using Basic Load Balancer, create Load balancing rules for the following ports
- * 32<strong>&lt;nr&gt;</strong> TCP
- * 33<strong>&lt;nr&gt;</strong> TCP
- * 5<strong>&lt;nr&gt;</strong>13 TCP
- * 5<strong>&lt;nr&gt;</strong>14 TCP
- * 5<strong>&lt;nr&gt;</strong>16 TCP
-
-* Backend configuration
- * Connected to primary network interfaces of all virtual machines that should be part of the (A)SCS/ERS cluster
+SAP NetWeaver ASCS, SAP NetWeaver SCS, SAP NetWeaver ERS, and the SAP HANA database use virtual hostname and virtual IP addresses. On Azure, a load balancer is required to use a virtual IP address. We recommend using [Standard load balancer](../../../load-balancer/quickstart-load-balancer-standard-public-portal.md). The presented configuration shows a load balancer with:
+
+* Frontend IP address 10.0.0.7 for ASCS
+* Frontend IP address 10.0.0.8 for ERS
+* Probe port 62000 for ASCS
+* Probe port 62101 for ERS
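A hedged Azure CLI sketch of the corresponding HA-ports load-balancing rule with floating IP enabled (resource names are placeholders):

```azurecli
az network lb rule create --resource-group MyResourceGroup --lb-name MySapILB \
    --name ascs-ha-rule --protocol All --frontend-port 0 --backend-port 0 \
    --frontend-ip-name ascs-frontend --backend-pool-name MyBackendPool \
    --probe-name ascs-probe --floating-ip true --idle-timeout 30
```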
## Setting up GlusterFS
You first need to create the virtual machines for this cluster. Afterwards, you
1. **Make sure to enable Floating IP**
1. Click OK
* Repeat the steps above to create load balancing rules for ERS (for example **nw1-lb-ers**)
-1. Alternatively, if your scenario requires basic load balancer (internal), follow these steps:
+1. Alternatively, ***only if*** your scenario requires a basic load balancer (internal), follow these configuration steps instead to create a basic load balancer:
1. Create the frontend IP addresses
   1. IP address 10.0.0.7 for the ASCS
   1. Open the load balancer, select frontend IP pool, and click Add
virtual-network Virtual Network Public Ip Address https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/virtual-network-public-ip-address.md
Learn how to assign a public IP address to the following resources:
- [Azure Firewall](../../firewall/tutorial-firewall-deploy-portal-policy.md?toc=%2fazure%2fvirtual-network%2ftoc.json) - [Cross-region load balancer](../../load-balancer/tutorial-cross-region-portal.md?toc=%2fazure%2fvirtual-network%2ftoc.json)
+## Region availability
+
+Azure Public IP is available in all regions for both Public and US Gov clouds. Azure Public IP doesn't move or store customer data out of the region it's deployed in.
+
## Permissions

To manage public IP addresses, your account must be assigned to the [network contributor](../../role-based-access-control/built-in-roles.md?toc=%2fazure%2fvirtual-network%2ftoc.json#network-contributor) role. A [custom](../../role-based-access-control/custom-roles.md?toc=%2fazure%2fvirtual-network%2ftoc.json) role is also supported. The custom role must be assigned the appropriate actions listed in the following table: