Updates from: 11/18/2023 02:19:57
Service | Microsoft Docs article | Related commit history on GitHub | Change details
ai-services Batch Inference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/How-to/batch-inference.md
Title: Trigger batch inference with trained model description: Trigger batch inference with trained model-
+#
ai-services Create Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/How-to/create-resource.md
Title: Create an Anomaly Detector resource description: Create an Anomaly Detector resource-
+#
ai-services Deploy Anomaly Detection On Container Instances https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/How-to/deploy-anomaly-detection-on-container-instances.md
Title: Run Anomaly Detector Container in Azure Container Instances description: Deploy the Anomaly Detector container to an Azure Container Instance, and test it in a web browser.-
+#
ai-services Deploy Anomaly Detection On Iot Edge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/How-to/deploy-anomaly-detection-on-iot-edge.md
Title: Run Anomaly Detector on IoT Edge description: Deploy the Anomaly Detector module to IoT Edge. -
+#
ai-services Identify Anomalies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/How-to/identify-anomalies.md
Title: How to use the Anomaly Detector API on your time series data description: Learn how to detect anomalies in your data either as a batch, or on streaming data.-
+#
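The entry above covers both batch ("entire series") and streaming ("last point") detection. As a hedged sketch, the request body for a univariate batch call has this shape — the field names follow the public Anomaly Detector API schema, while the timestamps and values are made-up sample data:

```python
import json

# Made-up daily series; the spike on day 3 is the kind of point the API flags.
series = [
    {"timestamp": "2023-01-01T00:00:00Z", "value": 32.1},
    {"timestamp": "2023-01-02T00:00:00Z", "value": 33.8},
    {"timestamp": "2023-01-03T00:00:00Z", "value": 120.0},
]

# "granularity" must match the spacing of the timestamps; "sensitivity"
# (0-99) controls how aggressively points are flagged.
body = {"series": series, "granularity": "daily", "sensitivity": 95}
payload = json.dumps(body)  # POST this to the timeseries/entire/detect route
```

Streaming detection posts the same body shape to the last-point route and inspects only the newest value.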
ai-services Postman https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/How-to/postman.md
Title: How to run Multivariate Anomaly Detector API (GA version) in Postman? description: Learn how to detect anomalies in your data either as a batch, or on streaming data with Postman.-
+#
ai-services Prepare Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/How-to/prepare-data.md
Title: Prepare your data and upload to Storage Account description: Prepare your data and upload to Storage Account-
+#
ai-services Streaming Inference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/How-to/streaming-inference.md
Title: Streaming inference with trained model description: Streaming inference with trained model-
+#
ai-services Train Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/How-to/train-model.md
Title: Train a Multivariate Anomaly Detection model description: Train a Multivariate Anomaly Detection model-
+#
ai-services Anomaly Detector Container Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/anomaly-detector-container-configuration.md
Title: How to configure a container for Anomaly Detector API description: The Anomaly Detector API container runtime environment is configured using the `docker run` command arguments. This container has several required settings, along with a few optional settings. -
+#
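As the description notes, the container is configured through `docker run` arguments. A minimal sketch of the three required settings (`Eula`, `Billing`, `ApiKey`) follows — the endpoint and key are placeholders to replace with your own resource's values, and the memory/CPU limits are illustrative:

```shell
# Placeholders: substitute your resource's endpoint URI and key.
docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
  mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector \
  Eula=accept \
  Billing=https://<your-resource-name>.cognitiveservices.azure.com/ \
  ApiKey=<your-resource-key>
```

The container refuses to start unless all three settings are supplied; `Billing` and `ApiKey` let it meter usage against your Azure resource.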
ai-services Anomaly Detector Container Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/anomaly-detector-container-howto.md
Title: Install and run Docker containers for the Anomaly Detector API description: Use the Anomaly Detector API's algorithms to find anomalies in your data, on-premises using a Docker container.-
+#
ai-services Anomaly Detection Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/concepts/anomaly-detection-best-practices.md
Title: Best practices when using the Anomaly Detector univariate API description: Learn about best practices when detecting anomalies with the Anomaly Detector API.-
+#
ai-services Best Practices Multivariate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/concepts/best-practices-multivariate.md
Title: Best practices for using the Multivariate Anomaly Detector API description: Best practices for using the Anomaly Detector Multivariate APIs to apply anomaly detection to your time series data.-
+#
ai-services Multivariate Architecture https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/concepts/multivariate-architecture.md
Title: Predictive maintenance architecture for using the Anomaly Detector Multivariate API description: Reference architecture for using the Anomaly Detector Multivariate APIs to apply anomaly detection to your time series data for predictive maintenance.-
+#
ai-services Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/concepts/troubleshoot.md
Title: Troubleshoot the Anomaly Detector multivariate API description: Learn how to remediate common error codes when you use the Azure AI Anomaly Detector multivariate API.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/overview.md
Title: What is Anomaly Detector? description: Use the Anomaly Detector API's algorithms to apply anomaly detection on your time series data.-
+#
ai-services Client Libraries Multivariate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/quickstarts/client-libraries-multivariate.md
Title: 'Quickstart: Anomaly detection using the Anomaly Detector client library for multivariate anomaly detection' description: The Anomaly Detector multivariate offers client libraries to detect abnormalities in your data series either as a batch or on streaming data.-
+#
zone_pivot_groups: anomaly-detector-quickstart-multivariate
ai-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/quickstarts/client-libraries.md
Title: 'Quickstart: Anomaly detection using the Anomaly Detector client library' description: The Anomaly Detector API offers client libraries to detect abnormalities in your data series either as a batch or on streaming data.-
+#
zone_pivot_groups: anomaly-detector-quickstart
ai-services Regions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/regions.md
Title: Regions - Anomaly Detector service description: A list of available regions and endpoints for the Anomaly Detector service, including Univariate Anomaly Detection and Multivariate Anomaly Detection.-
+#
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/service-limits.md
Title: Service limits - Anomaly Detector service description: Service limits for Anomaly Detector service, including Univariate Anomaly Detection and Multivariate Anomaly Detection.-
+#
ai-services Azure Data Explorer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/tutorials/azure-data-explorer.md
Title: "Tutorial: Use Univariate Anomaly Detector in Azure Data Explorer" description: Learn how to use the Univariate Anomaly Detector with Azure Data Explorer.-
+#
ai-services Batch Anomaly Detection Powerbi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/tutorials/batch-anomaly-detection-powerbi.md
Title: "Tutorial: Visualize anomalies using batch detection and Power BI" description: Learn how to use the Anomaly Detector API and Power BI to visualize anomalies throughout your time series data.-
+#
ai-services Multivariate Anomaly Detection Synapse https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/Anomaly-Detector/tutorials/multivariate-anomaly-detection-synapse.md
Title: "Tutorial: Use Multivariate Anomaly Detector in Azure Synapse Analytics" description: Learn how to use the Multivariate Anomaly Detector with Azure Synapse Analytics.-
+#
ai-services Application Design https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/concepts/application-design.md
Title: Application Design description: Application design concepts-
+#
ai-services Entities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/concepts/entities.md
Title: Entities description: Entities concepts-
+#
ai-services Intents https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/concepts/intents.md
Title: What are intents in LUIS description: Learn about intents and how they're used in LUIS-
+#
ai-services Patterns Features https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/concepts/patterns-features.md
Title: Patterns and features description: Use this article to learn about patterns and features in LUIS-
+#
ai-services Get Started Get Model Rest Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/get-started-get-model-rest-apis.md
Title: "How to update your LUIS model using the REST API" description: In this article, add example utterances to change a model and train the app.-
+#
ai-services Howto Add Prebuilt Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/howto-add-prebuilt-models.md
Title: Prebuilt models for Language Understanding description: LUIS includes a set of prebuilt models for quickly adding common, conversational user scenarios.-
+#
ai-services Luis Concept Data Conversion https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-concept-data-conversion.md
Title: Data conversion - LUIS description: Learn how utterances can be changed before predictions in Language Understanding (LUIS)-
+#
ai-services Luis Concept Data Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-concept-data-storage.md
Title: Data storage - LUIS description: LUIS stores data encrypted in an Azure data store corresponding to the region specified by the key. -
+#
ai-services Luis Concept Prebuilt Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-concept-prebuilt-model.md
Title: Prebuilt models - LUIS description: Prebuilt models provide domains, intents, utterances, and entities. You can start your app with a prebuilt domain or add a relevant domain to your app later. -
+#
ai-services Luis Container Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-container-configuration.md
Title: Docker container settings - LUIS description: The LUIS container runtime environment is configured using the `docker run` command arguments. LUIS has several required settings, along with a few optional settings.-
+#
ai-services Luis Container Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-container-howto.md
Title: Install and run Docker containers for LUIS description: Use the LUIS container to load your trained or published app, and gain access to its predictions on-premises.-
+#
ai-services Luis Container Limitations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-container-limitations.md
Title: Container limitations - LUIS description: The LUIS container languages that are supported.-
+#
ai-services Luis How To Azure Subscription https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-how-to-azure-subscription.md
Title: How to create and manage LUIS resources description: Learn how to use and manage Azure resources for LUIS.-
+#
ai-services Luis How To Batch Test https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-how-to-batch-test.md
Title: How to perform a batch test - LUIS description: Use Language Understanding (LUIS) batch testing sets to find utterances with incorrect intents and entities.-
+#
ai-services Luis How To Collaborate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-how-to-collaborate.md
Title: Collaborate with others - LUIS description: An app owner can add contributors to the authoring resource. These contributors can modify the model, train, and publish the app.-
+#
ai-services Luis How To Manage Versions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-how-to-manage-versions.md
Title: Manage versions - LUIS description: Versions allow you to build and publish different models. A good practice is to clone the current active model to a different version of the app before making changes to the model.-
+#
ai-services Luis How To Model Intent Pattern https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-how-to-model-intent-pattern.md
Title: Patterns add accuracy - LUIS description: Add pattern templates to improve prediction accuracy in Language Understanding (LUIS) applications.-
+#
ai-services Luis How To Use Dashboard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-how-to-use-dashboard.md
Title: Dashboard - Language Understanding - LUIS description: Fix intents and entities with your trained app's dashboard. The dashboard displays overall app information, with highlights of intents that should be fixed.-
+#
ai-services Luis Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-language-support.md
Title: Language support - LUIS description: LUIS has a variety of features within the service. Not all features are at the same language parity. Make sure the features you are interested in are supported in the language culture you are targeting. A LUIS app is culture-specific and cannot be changed once it is set.-
+#
ai-services Luis Migration Api V1 To V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-migration-api-v1-to-v2.md
Title: v1 to v2 API Migration description: The version 1 endpoint and authoring Language Understanding APIs are deprecated. Use this guide to understand how to migrate to version 2 endpoint and authoring APIs.-
+#
ai-services Luis Migration Authoring https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-migration-authoring.md
Title: Migrate to an Azure resource authoring key description: This article describes how to migrate Language Understanding (LUIS) authoring authentication from an email account to an Azure resource.-
+#
ai-services Luis Reference Prebuilt Age https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-age.md
Title: Age Prebuilt entity - LUIS description: This article contains age prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Currency https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-currency.md
Title: Currency Prebuilt entity - LUIS description: This article contains currency prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Datetimev2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-datetimev2.md
Title: DatetimeV2 Prebuilt entities - LUIS description: This article has datetimeV2 prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Deprecated https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-deprecated.md
Title: Deprecated Prebuilt entities - LUIS description: This article contains deprecated prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Dimension https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-dimension.md
Title: Dimension Prebuilt entities - LUIS description: This article contains dimension prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Domains https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-domains.md
Title: Prebuilt domain reference - LUIS description: Reference for the prebuilt domains, which are prebuilt collections of intents and entities from Language Understanding Intelligent Services (LUIS).-
+#
ai-services Luis Reference Prebuilt Email https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-email.md
Title: LUIS Prebuilt entities email reference description: This article contains email prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Entities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-entities.md
Title: All Prebuilt entities - LUIS description: This article contains lists of the prebuilt entities that are included in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Geographyv2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-geographyV2.md
Title: Geography V2 prebuilt entity - LUIS description: This article contains geographyV2 prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Keyphrase https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-keyphrase.md
Title: Keyphrase prebuilt entity - LUIS description: This article contains keyphrase prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Number https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-number.md
Title: Number Prebuilt entity - LUIS description: This article contains number prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Ordinal V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-ordinal-v2.md
Title: Ordinal V2 prebuilt entity - LUIS description: This article contains ordinal V2 prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Ordinal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-ordinal.md
Title: Ordinal Prebuilt entity - LUIS description: This article contains ordinal prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Percentage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-percentage.md
Title: Percentage Prebuilt entity - LUIS description: This article contains percentage prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Person https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-person.md
Title: PersonName prebuilt entity - LUIS description: This article contains personName prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Phonenumber https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-phonenumber.md
Title: Phone number Prebuilt entities - LUIS description: This article contains phone number prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Sentiment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-sentiment.md
Title: Sentiment analysis - LUIS description: If Sentiment analysis is configured, the LUIS json response includes sentiment analysis.-
+#
ai-services Luis Reference Prebuilt Temperature https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-temperature.md
Title: Temperature Prebuilt entity - LUIS description: This article contains temperature prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Prebuilt Url https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-prebuilt-url.md
Title: URL Prebuilt entities - LUIS description: This article contains url prebuilt entity information in Language Understanding (LUIS).-
+#
ai-services Luis Reference Response Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-reference-response-codes.md
Title: API HTTP response codes - LUIS description: Understand what HTTP response codes are returned from the LUIS Authoring and Endpoint APIs-
+#
ai-services Luis Traffic Manager https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-traffic-manager.md
ms.devlang: javascript -
+#
ai-services Luis Tutorial Bing Spellcheck https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-tutorial-bing-spellcheck.md
Title: Correct misspelled words - LUIS description: Correct misspelled words in utterances by adding Bing Spell Check API V7 to LUIS endpoint queries.-
+#
ai-services Luis Tutorial Node Import Utterances Csv https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-tutorial-node-import-utterances-csv.md
Title: Import utterances using Node.js - LUIS description: Learn how to build a LUIS app programmatically from preexisting data in CSV format using the LUIS Authoring API.-
+#
ai-services Luis User Privacy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/luis-user-privacy.md
Title: Export & delete data - LUIS description: You have full control over viewing, exporting, and deleting your data. Delete customer data to ensure privacy and compliance.-
+#
ai-services Reference Entity Machine Learned Entity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/reference-entity-machine-learned-entity.md
Title: Machine-learning entity type - LUIS description: The machine-learning entity is the preferred entity for building LUIS applications.-
+#
ai-services Reference Entity Pattern Any https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/reference-entity-pattern-any.md
Title: Pattern.any entity type - LUIS description: Pattern.any is a variable-length placeholder used only in a pattern's template utterance to mark where the entity begins and ends.-
+#
ai-services Reference Entity Simple https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/LUIS/reference-entity-simple.md
Title: Simple entity type - LUIS description: A simple entity describes a single concept from the machine-learning context. Add a phrase list when using a simple entity to improve results.-
+#
ai-services Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/authentication.md
Title: Authentication in Azure AI services description: "There are three ways to authenticate a request to an Azure AI services resource: a resource key, a bearer token, or a multi-service subscription. In this article, you'll learn about each method, and how to make a request."-
+#
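Of the authentication methods that entry describes, the two header-based ones can be sketched as below. The header names are the documented ones; the key and token values are placeholders:

```python
def key_auth_headers(resource_key: str) -> dict:
    """Single-service or multi-service resource key, sent on every request."""
    return {"Ocp-Apim-Subscription-Key": resource_key}

def bearer_auth_headers(access_token: str) -> dict:
    """Short-lived token previously exchanged for a key at the issueToken endpoint."""
    return {"Authorization": f"Bearer {access_token}"}

# Placeholder usage: attach one of these dicts as HTTP headers on the request.
headers = key_auth_headers("<your-resource-key>")
```

A bearer token avoids sending the long-lived key on every call, at the cost of the extra token-exchange round trip.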
ai-services Cognitive Services Container Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/cognitive-services-container-support.md
Title: Use Azure AI containers on-premises description: Learn how to use Docker containers to use Azure AI services on-premises.-
+#
Azure AI containers provide the following set of Docker containers, each of which…
| [Language service][ta-containers-sentiment] | **Sentiment Analysis** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/textanalytics/sentiment/about)) | Analyzes raw text for clues about positive or negative sentiment. This version of sentiment analysis returns sentiment labels (for example *positive* or *negative*) for each document and sentence within it. | Generally available. <br> This container can also [run in disconnected environments](containers/disconnected-containers.md). |
| [Language service][ta-containers-health] | **Text Analytics for health** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/textanalytics/healthcare/about))| Extract and label medical information from unstructured clinical text. | Generally available |
| [Language service][ta-containers-ner] | **Named Entity Recognition** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/textanalytics/ner/about))| Extract named entities from text. | Generally available. <br>This container can also [run in disconnected environments](containers/disconnected-containers.md). |
-| [Language service][ta-containers-cner] | **Custom Named Entity Recognition** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/textanalytics/customner/about))| Extract named entities from text, using a custom model you create using your data. | Preview |
+| [Language service][ta-containers-cner] | **Custom Named Entity Recognition** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/textanalytics/customner/about))| Extract named entities from text, using a custom model you create using your data. | Generally available |
| [Language service][ta-containers-summarization] | **Summarization** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/textanalytics/summarization/about))| Summarize text from various sources. | Public preview. <br>This container can also [run in disconnected environments](containers/disconnected-containers.md). |
| [Translator][tr-containers] | **Translator** ([image](https://mcr.microsoft.com/product/azure-cognitive-services/translator/text-translation/about))| Translate text in several languages and dialects. | Generally available. Gated - [request access](https://aka.ms/csgate-translator). <br>This container can also [run in disconnected environments](containers/disconnected-containers.md). |
ai-services Cognitive Services Custom Subdomains https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/cognitive-services-custom-subdomains.md
Title: Custom subdomains description: Custom subdomain names for each Azure AI services resource are created through the Azure portal, Azure Cloud Shell, or Azure CLI.-
+#
ai-services Cognitive Services Environment Variables https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/cognitive-services-environment-variables.md
Title: Use environment variables with Azure AI services description: "This guide shows you how to set and retrieve environment variables to handle your Azure AI services subscription credentials in a more secure way when you test out applications."-
+#
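A minimal sketch of the pattern that article covers: keep the key and endpoint in environment variables rather than in source code. The variable names here are illustrative, not required by the service:

```python
import os

# In practice you would set these once with `export` (bash) or `setx` (Windows);
# setdefault here just makes the sketch self-contained.
os.environ.setdefault("AZURE_AI_KEY", "<your-resource-key>")
os.environ.setdefault("AZURE_AI_ENDPOINT", "https://<your-resource-name>.cognitiveservices.azure.com/")

key = os.environ["AZURE_AI_KEY"]        # read back wherever the app needs them
endpoint = os.environ["AZURE_AI_ENDPOINT"]
```

Because the values live in the process environment, the same code runs unchanged across dev, CI, and production configurations.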
ai-services Cognitive Services Limited Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/cognitive-services-limited-access.md
Title: Limited Access features for Azure AI services description: Azure AI services that are available with Limited Access are described below.-
+#
ai-services Cognitive Services Support Options https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/cognitive-services-support-options.md
Title: Azure AI services support and help options description: How to obtain help and support for questions and problems when you create applications that integrate with Azure AI services.-
+#
ai-services Cognitive Services Virtual Networks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/cognitive-services-virtual-networks.md
Title: Configure Virtual Networks for Azure AI services description: Configure layered network security for your Azure AI services resources.-
+#
ai-services Category Taxonomy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/Category-Taxonomy.md
Title: Taxonomy of image categories - Azure AI Vision description: Get the 86 categories of taxonomy for the Azure AI Vision API in Azure AI services.-
+#
ai-services Storage Lab Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/Tutorials/storage-lab-tutorial.md
Title: "Tutorial: Generate metadata for Azure images" description: In this tutorial, you'll learn how to integrate the Azure AI Vision service into a web app to generate metadata for images.-
+#
ai-services Computer Vision How To Install Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/computer-vision-how-to-install-containers.md
Title: Azure AI Vision 3.2 GA Read OCR container description: Use the Read 3.2 OCR containers from Azure AI Vision to extract text from images and documents, on-premises.-
+#
ai-services Computer Vision Resource Container Config https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/computer-vision-resource-container-config.md
Title: Configure Read OCR containers - Azure AI Vision description: This article shows you how to configure both required and optional settings for Read OCR containers in Azure AI Vision.-
+#
ai-services Concept Background Removal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-background-removal.md
Title: Background removal - Image Analysis description: Learn about background removal, an operation of Image Analysis-
+#
ai-services Concept Brand Detection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-brand-detection.md
Title: Brand detection - Azure AI Vision description: Learn about brand and logo detection, a specialized mode of object detection, using the Azure AI Vision API.-
+#
ai-services Concept Categorizing Images https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-categorizing-images.md
Title: Image categorization - Azure AI Vision description: Learn concepts related to the image categorization feature of the Image Analysis API.
ai-services Concept Describe Images 40 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-describe-images-40.md
Title: Image captions - Image Analysis 4.0 description: Concepts related to the image captioning feature of the Image Analysis 4.0 API.
ai-services Concept Describing Images https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-describing-images.md
Title: Image descriptions - Azure AI Vision description: Concepts related to the image description feature of the Azure AI Vision API.
ai-services Concept Detecting Adult Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-detecting-adult-content.md
Title: Adult, racy, gory content - Azure AI Vision description: Concepts related to detecting adult content in images using the Azure AI Vision API.
ai-services Concept Detecting Color Schemes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-detecting-color-schemes.md
Title: Color scheme detection - Azure AI Vision description: Concepts related to detecting the color scheme in images using the Azure AI Vision API.
ai-services Concept Detecting Domain Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-detecting-domain-content.md
Title: Domain-specific content - Azure AI Vision description: Learn how to specify an image categorization domain to return more detailed information about an image.
ai-services Concept Detecting Faces https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-detecting-faces.md
Title: Face detection - Azure AI Vision description: Learn concepts related to the face detection feature of the Azure AI Vision API.
ai-services Concept Detecting Image Types https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-detecting-image-types.md
Title: Image type detection - Azure AI Vision description: Concepts related to the image type detection feature of the Azure AI Vision API.
ai-services Concept Face Detection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-face-detection.md
Title: "Face detection, attributes, and input data - Face" description: Learn more about face detection; face detection is the action of locating human faces in an image and optionally returning different kinds of face-related data.
ai-services Concept Face Recognition Data Structures https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-face-recognition-data-structures.md
Title: "Face recognition data structures - Face" description: Learn about the Face recognition data structures, which hold data on faces and persons.
ai-services Concept Face Recognition https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-face-recognition.md
Title: "Face recognition - Face" description: Learn the concept of Face recognition, its related operations, and the underlying data structures.
ai-services Concept Generate Thumbnails 40 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-generate-thumbnails-40.md
Title: Smart-cropped thumbnails - Image Analysis 4.0 description: Concepts related to generating thumbnails for images using the Image Analysis 4.0 API.
ai-services Concept Generating Thumbnails https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-generating-thumbnails.md
Title: Smart-cropped thumbnails - Azure AI Vision description: Concepts related to generating thumbnails for images using the Azure AI Vision API.
ai-services Concept Image Retrieval https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-image-retrieval.md
Title: Multi-modal embeddings concepts - Image Analysis 4.0 description: Concepts related to image vectorization using the Image Analysis 4.0 API.
ai-services Concept Liveness Abuse Monitoring https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-liveness-abuse-monitoring.md
Title: Abuse monitoring in Face liveness detection - Face description: Learn about abuse-monitoring methods in Azure Face service.
ai-services Concept Model Customization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-model-customization.md
Title: Model customization concepts - Image Analysis 4.0 description: Concepts related to the custom model feature of the Image Analysis 4.0 API.
ai-services Concept Object Detection 40 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-object-detection-40.md
Title: Object detection - Image Analysis 4.0 description: Learn concepts related to the object detection feature of the Image Analysis 4.0 API - usage and limits.
ai-services Concept Object Detection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-object-detection.md
Title: Object detection - Azure AI Vision description: Learn concepts related to the object detection feature of the Azure AI Vision API - usage and limits.
ai-services Concept Ocr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-ocr.md
Title: OCR for images - Azure AI Vision description: Extract text from in-the-wild and non-document images with a fast and synchronous Azure AI Vision Image Analysis 4.0 API.
ai-services Concept People Detection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-people-detection.md
Title: People detection - Azure AI Vision description: Learn concepts related to the people detection feature of the Azure AI Vision API - usage and limits.
ai-services Concept Shelf Analysis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-shelf-analysis.md
Title: Product Recognition - Image Analysis 4.0 description: Learn concepts related to the Product Recognition feature set of Image Analysis 4.0 - usage and limits.
ai-services Concept Tag Images 40 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-tag-images-40.md
Title: Content tags - Image Analysis 4.0 description: Learn concepts related to the image tagging feature of the Image Analysis 4.0 API.
ai-services Concept Tagging Images https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/concept-tagging-images.md
Title: Content tags - Azure AI Vision description: Learn concepts related to the image tagging feature of the Azure AI Vision API.
ai-services Deploy Computer Vision On Premises https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/deploy-computer-vision-on-premises.md
Title: Use Azure AI Vision container with Kubernetes and Helm description: Learn how to deploy the Azure AI Vision container using Kubernetes and Helm.
ai-services Add Faces https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/add-faces.md
Title: "Example: Add faces to a PersonGroup - Face" description: This guide demonstrates how to add a large number of persons and faces to a PersonGroup object with the Azure AI Face service.
ai-services Analyze Video https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/analyze-video.md
Title: Analyze videos in near real time - Azure AI Vision description: Learn how to perform near real-time analysis on frames that are taken from a live video stream by using the Azure AI Vision API.
ai-services Background Removal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/background-removal.md
Title: Remove the background in images description: Learn how to call the Segment API to isolate and remove the background from images.
ai-services Blob Storage Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/blob-storage-search.md
Title: Configure your blob storage for image retrieval and video search in Vision Studio description: To get started with the **Search photos with natural language** or with **Video summary and frame locator** in Vision Studio, you will need to select or create a new storage account.
ai-services Call Analyze Image 40 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/call-analyze-image-40.md
Title: Call the Image Analysis 4.0 Analyze API description: Learn how to call the Image Analysis 4.0 API and configure its behavior.
ai-services Call Analyze Image https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/call-analyze-image.md
Title: Call the Image Analysis API description: Learn how to call the Image Analysis API and configure its behavior.
ai-services Call Read Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/call-read-api.md
Title: How to call the Read API description: Learn how to call the Read API and configure its behavior in detail.
ai-services Coco Verification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/coco-verification.md
Title: Verify a COCO annotation file description: Use a Python script to verify your COCO file for custom model training.
ai-services Find Similar Faces https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/find-similar-faces.md
Title: "Find similar faces" description: Use the Face service to find similar faces (face search by image).
ai-services Generate Thumbnail https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/generate-thumbnail.md
Title: "Generate a smart-cropped thumbnail - Image Analysis" description: Use the Image Analysis REST API to generate a thumbnail with smart cropping.
ai-services Identity Access Token https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/identity-access-token.md
Title: "Use limited access tokens - Face" description: Learn how ISVs can manage the Face API usage of their clients by issuing access tokens that grant access to Face features which are normally gated.
ai-services Identity Detect Faces https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/identity-detect-faces.md
Title: "Call the Detect API - Face" description: This guide demonstrates how to use face detection to extract attributes like age, emotion, or head pose from a given image.
ai-services Image Retrieval https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/image-retrieval.md
Title: Do image retrieval using multi-modal embeddings - Image Analysis 4.0 description: Learn how to call the image retrieval API to vectorize image and search terms.
ai-services Migrate From Custom Vision https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/migrate-from-custom-vision.md
Title: "Migrate a Custom Vision project to Image Analysis 4.0" description: Learn how to generate an annotation file from an old Custom Vision project, so you can train a custom Image Analysis model on previous training data.
ai-services Mitigate Latency https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/mitigate-latency.md
Title: How to mitigate latency and improve performance when using the Face service description: Learn how to mitigate network latency and improve service performance when using the Face service.
ai-services Model Customization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/model-customization.md
Title: Create a custom Image Analysis model description: Learn how to create and train a custom model to do image classification and object detection that's specific to your use case.
ai-services Specify Detection Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/specify-detection-model.md
Title: How to specify a detection model - Face description: This article will show you how to choose which face detection model to use with your Azure AI Face application.
ai-services Specify Recognition Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/specify-recognition-model.md
Title: How to specify a recognition model - Face description: This article will show you how to choose which recognition model to use with your Azure AI Face application.
ai-services Use Large Scale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/use-large-scale.md
Title: "Scale to handle more enrolled users - Face" description: This guide shows how to scale up from existing PersonGroup and FaceList objects to LargePersonGroup and LargeFaceList objects.
ai-services Use Persondirectory https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/use-persondirectory.md
Title: "Example: Use the PersonDirectory data structure - Face" description: Learn how to use the PersonDirectory data structure to store face and person data at greater capacity and with other new features.
ai-services Video Retrieval https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/how-to/video-retrieval.md
Title: Do video retrieval using vectorization - Image Analysis 4.0 description: Learn how to call the Spatial Analysis Video Retrieval APIs to vectorize video frames and search terms.
ai-services Identity Api Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/identity-api-reference.md
Title: API Reference - Face description: API reference provides information about the Person, LargePersonGroup/PersonGroup, LargeFaceList/FaceList, and Face Algorithms APIs.
ai-services Intro To Spatial Analysis Public Preview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/intro-to-spatial-analysis-public-preview.md
Title: What is Spatial Analysis? description: This document explains the basic concepts and features of the Azure Spatial Analysis container.
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/language-support.md
Title: Language support - Azure AI Vision description: This article provides a list of natural languages supported by Azure AI Vision features: OCR, Image Analysis.
ai-services Overview Image Analysis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/overview-image-analysis.md
Title: What is Image Analysis? description: The Image Analysis service uses pretrained AI models to extract many different visual features from images.
ai-services Overview Ocr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/overview-ocr.md
Title: OCR - Optical Character Recognition description: Learn how the optical character recognition (OCR) services extract print and handwritten text from images and documents in global languages.
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/overview.md
Title: What is Azure AI Vision? description: The Azure AI Vision service provides you with access to advanced algorithms for processing images and returning information.
ai-services Client Library https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/quickstarts-sdk/client-library.md
Title: "Quickstart: Optical character recognition (OCR)" description: Learn how to use optical character recognition (OCR) in your application through a native client library in the language of your choice.
ai-services Identity Client Library https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/quickstarts-sdk/identity-client-library.md
Title: 'Quickstart: Use the Face service' description: The Face API offers client libraries that make it easy to detect, find similar, identify, verify, and more.
zone_pivot_groups: programming-languages-set-face
ai-services Image Analysis Client Library 40 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/quickstarts-sdk/image-analysis-client-library-40.md
Title: "Quickstart: Image Analysis 4.0" description: Learn how to tag images in your application using Image Analysis 4.0 through a native client SDK in the language of your choice.
ai-services Image Analysis Client Library https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/quickstarts-sdk/image-analysis-client-library.md
Title: "Quickstart: Image Analysis" description: Learn how to tag images in your application using Image Analysis through a native client library in the language of your choice.
ai-services Read Container Migration Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/read-container-migration-guide.md
Title: Migrating to the Read v3.x containers description: Learn how to migrate to the v3 Read OCR containers.
ai-services Reference Video Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/reference-video-search.md
Title: Video Retrieval API reference - Image Analysis 4.0 description: Learn how to call the Video Retrieval APIs.
ai-services Install Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/sdk/install-sdk.md
Title: Install the Vision SDK description: In this guide, you learn how to install the Vision SDK for your preferred programming language.
ai-services Overview Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/sdk/overview-sdk.md
Title: Vision SDK Overview description: This page gives you an overview of the Azure AI Vision SDK for Image Analysis.
ai-services Spatial Analysis Camera Placement https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/spatial-analysis-camera-placement.md
Title: Spatial Analysis camera placement description: Learn how to set up a camera for use with Spatial Analysis.
ai-services Spatial Analysis Container https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/spatial-analysis-container.md
Title: How to install and run the Spatial Analysis container - Azure AI Vision description: The Spatial Analysis container lets you detect people and distances.
ai-services Spatial Analysis Local https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/spatial-analysis-local.md
Title: Run Spatial Analysis on a local video file description: Use this guide to learn how to run Spatial Analysis on a recorded local video.
ai-services Spatial Analysis Logging https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/spatial-analysis-logging.md
Title: Telemetry and logging for Spatial Analysis containers description: Spatial Analysis provides each container with a common configuration framework for insights, logging, and security settings.
ai-services Spatial Analysis Operations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/spatial-analysis-operations.md
Title: Spatial Analysis operations description: Learn about the Spatial Analysis operations.
ai-services Spatial Analysis Web App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/spatial-analysis-web-app.md
Title: Deploy a Spatial Analysis web app description: Learn how to use Spatial Analysis in a web application.
ai-services Spatial Analysis Zone Line Placement https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/spatial-analysis-zone-line-placement.md
Title: Spatial Analysis zone and line placement description: Learn how to set up zones and lines with Spatial Analysis.
ai-services Upgrade Api Versions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/upgrade-api-versions.md
Title: Upgrade to Read v3.0 of the Azure AI Vision API description: Learn how to upgrade to Azure AI Vision v3.0 Read API from v2.0/v2.1.
ai-services Use Case Alt Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/use-case-alt-text.md
Title: "Overview: Generate alt text of images with Image Analysis" description: Grow your customer base by making your products and services more accessible. Generate a description of an image in human-readable language, using complete sentences.
ai-services Use Case Dwell Time https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/use-case-dwell-time.md
Title: "Overview: Monitor dwell time in front of displays" description: Spatial Analysis can provide real-time information about how long customers spend in front of a display in a retail store.
ai-services Use Case Identity Verification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/use-case-identity-verification.md
Title: "Overview: Verification with Face" description: Provide the best-in-class face verification experience in your business solution using Azure AI Face service. You can verify a user's face against a government-issued ID card like a passport or driver's license.
ai-services Use Case Queue Time https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/use-case-queue-time.md
Title: "Overview: Measure retail queue wait time with Spatial Analysis" description: Spatial Analysis can generate real-time information about how long users are waiting in a queue in a retail store. A store manager can use this information to manage open checkout stations and employee shifts.
ai-services Vehicle Analysis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/vehicle-analysis.md
Title: Configure vehicle analysis containers description: Vehicle analysis provides each container with a common configuration framework, so that you can easily configure and manage compute, AI insight egress, logging, and security settings.
ai-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/computer-vision/whats-new.md
Title: What's new in Azure AI Vision? description: Stay up to date on recent releases and updates to Azure AI Vision.
ai-services Azure Container Instance Recipe https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/containers/azure-container-instance-recipe.md
Title: Azure Container Instance recipe description: Learn how to deploy Azure AI containers on Azure Container Instances.
ai-services Azure Kubernetes Recipe https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/containers/azure-kubernetes-recipe.md
Title: Run Language Detection container in Kubernetes Service description: Deploy the language detection container, with a running sample, to the Azure Kubernetes Service, and test it in a web browser.
ai-services Container Reuse Recipe https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/containers/container-reuse-recipe.md
Title: Recipes for Docker containers description: Learn how to build, test, and store containers with some or all of your configuration settings for deployment and reuse.
ai-services Disconnected Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/containers/disconnected-containers.md
Title: Use Docker containers in disconnected environments description: Learn how to run Azure AI services Docker containers disconnected from the internet.
ai-services Docker Compose Recipe https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/containers/docker-compose-recipe.md
Title: Use Docker Compose to deploy multiple containers description: Learn how to deploy multiple Azure AI containers. This article shows you how to orchestrate multiple Docker container images by using Docker Compose.
ai-services Api Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/api-reference.md
Title: API reference - Content Moderator description: Learn about the content moderation APIs for Content Moderator.
ai-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/client-libraries.md
Title: 'Quickstart: Use the Content Moderator client library' description: The Content Moderator API offers client libraries that make it easy to integrate Content Moderator into your applications.
zone_pivot_groups: programming-languages-set-conmod
ai-services Export Delete Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/export-delete-data.md
Title: Export or delete user data - Content Moderator description: You have full control over your data. Learn how to view, export, or delete your data in Content Moderator.
ai-services Image Lists Quickstart Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/image-lists-quickstart-dotnet.md
Title: "Check images against custom lists in C# - Content Moderator" description: How to moderate images with custom image lists using the Content Moderator SDK for C#.
ai-services Image Moderation Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/image-moderation-api.md
Title: Image Moderation - Content Moderator description: Use Content Moderator's machine-assisted image moderation to moderate images for adult and racy content.
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/language-support.md
Title: Language support - Content Moderator API description: This is a list of natural languages that the Content Moderator API supports.
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/overview.md
Title: What is Azure AI Content Moderator? description: Learn how to use Content Moderator to track, flag, assess, and filter inappropriate material in user-generated content.
ai-services Samples Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/samples-dotnet.md
Title: Code samples - Content Moderator, .NET description: Learn how to use Content Moderator in your .NET applications through the SDK.
ai-services Samples Rest https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/samples-rest.md
Title: Code samples - Content Moderator, C# description: Use Content Moderator feature-based samples in your applications through REST API calls.
ai-services Term Lists Quickstart Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/term-lists-quickstart-dotnet.md
Title: "Check text against a custom term list in C# - Content Moderator" description: How to moderate text with custom term lists using the Content Moderator SDK for C#.
ai-services Text Moderation Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/text-moderation-api.md
Title: Text Moderation - Content Moderator description: Use text moderation for possible unwanted text, personal data, and custom lists of terms.
ai-services Try Image Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/try-image-api.md
Title: Moderate images with the API Console - Content Moderator description: Use the Image Moderation API in Azure AI Content Moderator to scan image content.
ai-services Try Image List Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/try-image-list-api.md
Title: Moderate images with custom lists and the API console - Content Moderator description: You use the List Management API in Azure AI Content Moderator to create custom lists of images.
ai-services Try Terms List Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/try-terms-list-api.md
Title: Moderate text with custom term lists - Content Moderator description: Use the List Management API to create custom lists of terms to use with the Text Moderation API.
ai-services Try Text Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/try-text-api.md
Title: Moderate text by using the Text Moderation API - Content Moderator description: Test-drive text moderation by using the Text Moderation API in the online console.
ai-services Video Moderation Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-moderator/video-moderation-api.md
Title: "Analyze video content for objectionable material in C# - Content Moderator" description: How to analyze video content for various objectionable material using the Content Moderator SDK for .NET.
ai-services Harm Categories https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/concepts/harm-categories.md
Title: "Harm categories in Azure AI Content Safety" description: Learn about the different content moderation flags and severity levels that the Azure AI Content Safety service returns.-
+#
ai-services Jailbreak Detection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/concepts/jailbreak-detection.md
Title: "Jailbreak risk detection in Azure AI Content Safety" description: Learn about jailbreak risk detection and the related flags that the Azure AI Content Safety service returns.-
+#
ai-services Response Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/concepts/response-codes.md
Title: "Content Safety error codes" description: See the possible error codes for the Azure AI Content Safety APIs.-
+#
ai-services Use Blocklist https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/how-to/use-blocklist.md
Title: "Use blocklists for text moderation" description: Learn how to customize text moderation in Azure AI Content Safety by using your own list of blocklistItems.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/language-support.md
Title: Language support - Azure AI Content Safety description: This is a list of natural languages that the Azure AI Content Safety API supports.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/overview.md
Title: What is Azure AI Content Safety? description: Learn how to use Content Safety to track, flag, assess, and filter inappropriate material in user-generated content.-
+#
ai-services Quickstart Image https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/quickstart-image.md
Title: "Quickstart: Analyze image content" description: Get started using Azure AI Content Safety to analyze image content for objectionable material.-
+#
ai-services Quickstart Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/quickstart-text.md
Title: "Quickstart: Analyze image and text content" description: Get started using Azure AI Content Safety to analyze image and text content for objectionable material.-
+#
ai-services Studio Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/studio-quickstart.md
Title: "Quickstart: Content Safety Studio" description: In this quickstart, get started with the Content Safety service using Content Safety Studio in your browser.-
+#
ai-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/content-safety/whats-new.md
Title: What's new in Azure AI Content Safety? description: Stay up to date on recent releases and updates to Azure AI Content Safety.-
+#
ai-services Create Account Bicep https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/create-account-bicep.md
Title: Create an Azure AI services resource using Bicep | Microsoft Docs description: Create an Azure AI service resource with Bicep. keywords: Azure AI services, cognitive solutions, cognitive intelligence, cognitive artificial intelligence-
+#
ai-services Create Account Resource Manager Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/create-account-resource-manager-template.md
Title: Create an Azure AI services resource using ARM templates | Microsoft Docs description: Create an Azure AI services resource with an ARM template. keywords: Azure AI services, cognitive solutions, cognitive intelligence, cognitive artificial intelligence-
+#
ai-services Create Account Terraform https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/create-account-terraform.md
Title: 'Quickstart: Create an Azure AI services resource using Terraform' description: 'In this article, you create an Azure AI services resource using Terraform' keywords: Azure AI services, cognitive solutions, cognitive intelligence, cognitive artificial intelligence-
+#
Last updated 4/14/2023
ai-services Custom Vision Onnx Windows Ml https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/custom-vision-onnx-windows-ml.md
Title: "Use an ONNX model with Windows ML - Custom Vision Service" description: Learn how to create a Windows UWP app that uses an ONNX model exported from Azure AI services.-
+#
ai-services Export Delete Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/export-delete-data.md
Title: View or delete your data - Custom Vision Service description: You maintain full control over your data. This article explains how you can view, export or delete your data in the Custom Vision Service.-
+#
ai-services Export Model Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/export-model-python.md
Title: "Tutorial: Run TensorFlow model in Python - Custom Vision Service" description: Run a TensorFlow model in Python. This article only applies to models exported from image classification projects in the Custom Vision service.-
+#
ai-services Export Programmatically https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/export-programmatically.md
Title: "Export a model programmatically" description: Use the Custom Vision client library to export a trained model.-
+#
ai-services Export Your Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/export-your-model.md
Title: Export your model to mobile - Custom Vision Service description: This article shows you how to export your model for use in mobile applications, or to run it locally for real-time classification.-
+#
ai-services Get Started Build Detector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/get-started-build-detector.md
Title: "Quickstart: Build an object detector with the Custom Vision website" description: In this quickstart, you'll learn how to use the Custom Vision website to create, train, and test an object detector model.-
+#
ai-services Getting Started Build A Classifier https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/getting-started-build-a-classifier.md
Title: "Quickstart: Build an image classification model with the Custom Vision portal" description: In this quickstart, you'll learn how to use the Custom Vision web portal to create, train, and test an image classification model.-
+#
ai-services Getting Started Improving Your Classifier https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/getting-started-improving-your-classifier.md
Title: Improving your model - Custom Vision Service description: In this article, you'll learn how the amount, quality, and variety of data can improve the quality of your model in the Custom Vision service.-
+#
ai-services Iot Visual Alerts Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/iot-visual-alerts-tutorial.md
Title: "Tutorial: IoT Visual Alerts sample" description: In this tutorial, you use Custom Vision with an IoT device to recognize and report visual states from a camera's video feed.-
+#
ai-services Limits And Quotas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/limits-and-quotas.md
Title: Limits and quotas - Custom Vision Service description: This article explains the different types of licensing keys and the limits and quotas for the Custom Vision Service.-
+#
ai-services Logo Detector Mobile https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/logo-detector-mobile.md
Title: "Tutorial: Use custom logo detector to recognize Azure services - Custom Vision" description: In this tutorial, you will step through a sample app that uses Custom Vision as part of a logo detection scenario. Learn how Custom Vision is used with other components to deliver an end-to-end application.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/overview.md
Title: What is Custom Vision? description: Learn how to use the Azure AI Custom Vision service to build custom AI models to detect objects or classify images.-
+#
ai-services Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/release-notes.md
Title: Release Notes - Custom Vision Service description: Get the latest information on new releases from the Custom Vision team.-
+#
ai-services Select Domain https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/select-domain.md
Title: "Select a domain for a Custom Vision project - Azure AI Vision" description: This article will show you how to select a domain for your project in the Custom Vision Service.-
+#
ai-services Suggested Tags https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/suggested-tags.md
Title: "Label images faster with Smart Labeler" description: In this guide, you'll learn how to use Smart Labeler to generate suggested tags for images. This lets you label a large number of images more quickly when training a Custom Vision model.-
+#
ai-services Test Your Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/test-your-model.md
Title: Test and retrain a model - Custom Vision Service description: Learn how to test an image and then use it to retrain your model in the Custom Vision service.-
+#
ai-services Update Application To 3.0 Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/update-application-to-3.0-sdk.md
Title: How to update your project to the 3.0 API description: Learn how to update Custom Vision projects from the previous version of the API to the 3.0 API.-
+#
ai-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/custom-vision-service/whats-new.md
Title: What's new in Custom Vision? description: This article contains news about Custom Vision.-
+#
ai-services Diagnostic Logging https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/diagnostic-logging.md
Title: Diagnostic logging description: This guide provides step-by-step instructions to enable diagnostic logging for an Azure AI service. These logs provide rich, frequent data about the operation of a resource that are used for issue identification and debugging.-
+#
ai-services Disable Local Auth https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/disable-local-auth.md
Title: Disable local authentication in Azure AI Services description: "This article describes disabling local authentication in Azure AI Services."-
+#
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/document-intelligence/service-limits.md
Title: Service quotas and limits - Document Intelligence (formerly Form Recognizer) description: Quick reference, detailed description, and best practices for working within Azure AI Document Intelligence service Quotas and Limits-
+#
ai-services How To Create Immersive Reader https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/how-to-create-immersive-reader.md
Title: "Create an Immersive Reader Resource" description: This article shows you how to create a new Immersive Reader resource with a custom subdomain and then configure Microsoft Entra ID in your Azure tenant.-
+#
ai-services How To Customize Launch Button https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/how-to-customize-launch-button.md
Title: "Edit the Immersive Reader launch button" description: This article will show you how to customize the button that launches the Immersive Reader.-
+#
ai-services Set Cookie Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/how-to/set-cookie-policy.md
Title: "Set Immersive Reader Cookie Policy" description: This article will show you how to set the cookie policy for the Immersive Reader.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/language-support.md
Title: Language support - Immersive Reader description: Learn more about the human languages that are available with Immersive Reader.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/overview.md
Title: What is Azure AI Immersive Reader? description: Immersive Reader is a tool that is designed to help people with learning differences or help new readers and language learners with reading comprehension.-
+#
ai-services Client Libraries https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/quickstarts/client-libraries.md
Title: "Quickstart: Immersive Reader client library" description: "The Immersive Reader client library makes it easy to integrate the Immersive Reader service into your web applications to improve reading comprehension. In this quickstart, you'll learn how to use Immersive Reader for text selection, recognizing parts of speech, reading selected text out loud, translation, and more."-
+#
zone_pivot_groups: programming-languages-set-twenty
ai-services Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/reference.md
Title: "Immersive Reader SDK Reference" description: The Immersive Reader SDK contains a JavaScript library that allows you to integrate the Immersive Reader into your application.-
+#
ai-services Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/release-notes.md
Title: "Immersive Reader SDK Release Notes" description: Learn more about what's new in the Immersive Reader JavaScript SDK.-
+#
ai-services Security How To Update Role Assignment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/security-how-to-update-role-assignment.md
Title: "Security Advisory: Update Role Assignment for Microsoft Entra authentication permissions" description: This article will show you how to update the role assignment on existing Immersive Reader resources due to a security bug discovered in November 2021-
+#
ai-services Tutorial Ios Picture Immersive Reader https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/immersive-reader/tutorial-ios-picture-immersive-reader.md
Title: "Tutorial: Create an iOS app that takes a photo and launches it in the Immersive Reader (Swift)" description: In this tutorial, you will build an iOS app from scratch and add the Picture to Immersive Reader functionality.-
+#
ai-services Configure Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/configure-containers.md
Title: Configure containers - Language service description: Language service provides each container with a common configuration framework, so that you can easily configure and manage storage, logging and telemetry, and security settings for your containers.-
+#
ai-services Multi Region Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/custom-features/multi-region-deployment.md
Title: Deploy custom language projects to multiple regions in Azure AI Language description: Learn about deploying your language projects to multiple regions.-
+#
ai-services Project Versioning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/custom-features/project-versioning.md
Title: Conversational Language Understanding Project Versioning description: Learn how versioning works in conversational language understanding-
+#
ai-services Data Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/data-limits.md
Title: Data limits for Language service features description: Data and service limitations for Azure AI Language features.-
+#
ai-services Developer Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/developer-guide.md
Title: Use the Language SDK and REST API description: Learn how to integrate the Language service SDK and REST API into your applications.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/language-support.md
Title: Language support for language features description: This article explains which natural languages are supported by the different features of Azure AI Language.-
+#
ai-services Migrate Language Service Latest https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/migrate-language-service-latest.md
Title: Migrate to the latest version of Azure AI Language description: Learn how to move your Text Analytics applications to use the latest version of the Language service.-
+#
ai-services Model Lifecycle https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/model-lifecycle.md
Title: Model Lifecycle of Language service models description: This article describes the timelines for models and model versions used by Language service features.-
+#
ai-services Multilingual Emoji Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/multilingual-emoji-support.md
Title: Multilingual and emoji support in Azure AI Language description: Learn about offsets caused by multilingual and emoji encodings in Language service features.-
+#
ai-services Previous Updates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/previous-updates.md
Title: Previous language service updates description: An archive of previous Azure AI Language updates.-
+#
ai-services Regional Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/regional-support.md
Title: Regional support for Azure AI Language description: Learn which Azure regions are supported by the Language service.-
+#
ai-services Role Based Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/role-based-access-control.md
Title: Role-based access control for the Language service description: Learn how to use Azure RBAC for managing individual access to Azure resources.-
+#
ai-services Use Asynchronously https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/concepts/use-asynchronously.md
Title: "How to: Use Language service features asynchronously" description: Learn how to send Language service API requests asynchronously.-
+#
ai-services App Architecture https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/concepts/app-architecture.md
Title: When to choose conversational language understanding or orchestration workflow description: Learn when to choose conversational language understanding or orchestration workflow-
+#
ai-services Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/concepts/best-practices.md
Title: Conversational language understanding best practices description: Apply best practices when using conversational language understanding-
+#
ai-services Data Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/concepts/data-formats.md
Title: Conversational language understanding data formats description: Learn about the data formats accepted by conversational language understanding.-
+#
ai-services Entity Components https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/concepts/entity-components.md
Title: Entity components in Conversational Language Understanding description: Learn how Conversational Language Understanding extracts entities from text-
+#
ai-services Evaluation Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/concepts/evaluation-metrics.md
Title: Conversational Language Understanding evaluation metrics description: Learn about evaluation metrics in Conversational Language Understanding-
+#
ai-services Multiple Languages https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/concepts/multiple-languages.md
Title: Multilingual projects description: Learn how to make use of multilingual projects in conversational language understanding-
+#
ai-services None Intent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/concepts/none-intent.md
Title: Conversational Language Understanding None Intent description: Learn about the default None intent in conversational language understanding-
+#
ai-services Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/faq.md
Title: Frequently Asked Questions description: Use this article to quickly find answers to frequently asked questions about conversational language understanding-
+#
ai-services Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/glossary.md
Title: Definitions used in conversational language understanding description: Learn about definitions used in conversational language understanding.-
+#
ai-services Build Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/build-schema.md
Title: How to build a Conversational Language Understanding project schema description: Use this article to start building a Conversational Language Understanding project schema-
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/call-api.md
Title: Send prediction requests to a conversational language understanding deployment description: Learn about sending prediction requests for conversational language understanding.-
+#
ai-services Create Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/create-project.md
Title: How to create projects in Conversational Language Understanding description: Use this article to learn how to create projects in Conversational Language Understanding.-
+#
ai-services Deploy Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/deploy-model.md
Title: How to deploy a model for conversational language understanding description: Use this article to learn how to deploy models for conversational language understanding.-
+#
ai-services Fail Over https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/fail-over.md
Title: Back up and recover your conversational language understanding models description: Learn how to save and recover your conversational language understanding models.-
+#
ai-services Migrate From Luis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/migrate-from-luis.md
Title: Conversational Language Understanding backwards compatibility description: Learn about backwards compatibility between LUIS and Conversational Language Understanding-
+#
ai-services Tag Utterances https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/tag-utterances.md
Title: How to tag utterances in Conversational Language Understanding description: Use this article to tag your utterances in Conversational Language Understanding projects-
+#
ai-services Train Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/train-model.md
Title: How to train and evaluate models in Conversational Language Understanding description: Use this article to train a model and view its evaluation details to make improvements.-
+#
ai-services View Model Evaluation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/how-to/view-model-evaluation.md
Title: How to view conversational language understanding model details description: Use this article to learn about viewing the details for a conversational language understanding model. -
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/language-support.md
Title: Conversational language understanding language support description: This article explains which natural languages are supported by the conversational language understanding feature of Azure AI Language.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/overview.md
Title: Conversational Language Understanding - Azure AI services description: Customize an AI model to predict the intentions of utterances, and extract important information from them.-
+#
ai-services Prebuilt Component Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/prebuilt-component-reference.md
Title: Supported prebuilt entity components description: Learn about which entities can be detected automatically in Conversational Language Understanding-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/quickstart.md
Title: Quickstart - create a conversational language understanding project description: Quickly start building an AI model to extract information and predict the intentions of text-based utterances.-
+#
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/conversational-language-understanding/service-limits.md
Title: Conversational Language Understanding limits description: Learn about the data, region, and throughput limits for Conversational Language Understanding-
+#
ai-services Data Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/concepts/data-formats.md
Title: Custom NER data formats description: Learn about the data formats accepted by custom NER.-
+#
ai-services Evaluation Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/concepts/evaluation-metrics.md
Title: Custom NER evaluation metrics description: Learn about evaluation metrics in Custom Named Entity Recognition (NER)-
+#
ai-services Fail Over https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/fail-over.md
Title: Back up and recover your custom Named Entity Recognition (NER) models description: Learn how to save and recover your custom NER models.-
+#
ai-services Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/faq.md
Title: Custom Named Entity Recognition (NER) FAQ description: Find answers to frequently asked questions about using custom Named Entity Recognition.-
+#
ai-services Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/glossary.md
Title: Definitions and terms used for Custom Named Entity Recognition (NER) description: Definitions and terms you may encounter when building AI models using Custom Named Entity Recognition-
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/how-to/call-api.md
Title: Send a Named Entity Recognition (NER) request to your custom model description: Learn how to send requests for custom NER. -
+#
ai-services Create Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/how-to/create-project.md
Title: Create custom NER projects and use Azure resources description: Learn how to create and manage projects and Azure resources for custom NER.-
+#
ai-services Deploy Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/how-to/deploy-model.md
Title: How to deploy a custom NER model description: Learn how to deploy a model for custom NER.-
+#
ai-services Design Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/how-to/design-schema.md
Title: Preparing data and designing a schema for custom NER description: Learn how to select and prepare data to be successful in creating custom NER projects.-
+#
ai-services Tag Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/how-to/tag-data.md
Title: How to label your data for Custom Named Entity Recognition (NER) description: Learn how to label your data for use with Custom Named Entity Recognition (NER).-
+#
ai-services Train Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/how-to/train-model.md
Title: How to train your Custom Named Entity Recognition (NER) model description: Learn how to train your model for Custom Named Entity Recognition (NER).-
+#
ai-services Use Autolabeling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/how-to/use-autolabeling.md
Title: How to use autolabeling in custom named entity recognition description: Learn how to use autolabeling in custom named entity recognition.-
+#
ai-services Use Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/how-to/use-containers.md
Title: Use Docker containers for Custom Named Entity Recognition on-premises description: Learn how to use Docker containers for Custom Named Entity Recognition on-premises.-
+#
ai-services View Model Evaluation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/how-to/view-model-evaluation.md
Title: Evaluate a Custom Named Entity Recognition (NER) model description: Learn how to evaluate and score your Custom Named Entity Recognition (NER) model-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/language-support.md
Title: Language and region support for custom named entity recognition description: Learn about the languages and regions supported by custom named entity recognition.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/overview.md
Title: Custom named entity recognition - Azure AI services description: Customize an AI model to label and extract information from documents using Azure AI services.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/quickstart.md
Title: Quickstart - Custom named entity recognition (NER) description: Quickly start building an AI model to categorize and extract information from unstructured text.-
+#
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-named-entity-recognition/service-limits.md
Title: Custom Named Entity Recognition (NER) service limits description: Learn about the data and service limits when using Custom Named Entity Recognition (NER).-
+#
ai-services Data Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/concepts/data-formats.md
Title: Custom Text Analytics for health data formats description: Learn about the data formats accepted by custom text analytics for health.-
+#
ai-services Entity Components https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/concepts/entity-components.md
Title: Entity components in custom Text Analytics for health description: Learn how custom Text Analytics for health extracts entities from text-
+#
ai-services Evaluation Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/concepts/evaluation-metrics.md
Title: Custom text analytics for health evaluation metrics description: Learn about evaluation metrics in custom Text Analytics for health-
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/how-to/call-api.md
Title: Send a custom Text Analytics for health request to your custom model description: Learn how to send a request for custom text analytics for health. -
+#
ai-services Create Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/how-to/create-project.md
Title: Using Azure resources in custom Text Analytics for health description: Learn about the steps for using Azure resources with custom text analytics for health.-
+#
ai-services Deploy Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/how-to/deploy-model.md
Title: Deploy a custom Text Analytics for health model description: Learn about deploying a model for custom Text Analytics for health.-
+#
ai-services Design Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/how-to/design-schema.md
Title: Preparing data and designing a schema for custom Text Analytics for health description: Learn about how to select and prepare data, to be successful in creating custom TA4H projects.-
+#
ai-services Fail Over https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/how-to/fail-over.md
Title: Back up and recover your custom Text Analytics for health models description: Learn how to save and recover your custom Text Analytics for health models.-
+#
ai-services Label Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/how-to/label-data.md
Title: How to label your data for custom Text Analytics for health description: Learn how to label your data for use with custom Text Analytics for health.-
+#
ai-services Train Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/how-to/train-model.md
Title: How to train your custom Text Analytics for health model description: Learn about how to train your model for custom Text Analytics for health.-
+#
ai-services View Model Evaluation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/how-to/view-model-evaluation.md
Title: Evaluate a Custom Text Analytics for health model description: Learn how to evaluate and score your Custom Text Analytics for health model-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/language-support.md
Title: Language and region support for custom Text Analytics for health description: Learn about the languages and regions supported by custom Text Analytics for health-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/overview.md
Title: Custom Text Analytics for health - Azure AI services description: Customize an AI model to label and extract healthcare information from documents using Azure AI services.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/quickstart.md
Title: Quickstart - Custom Text Analytics for health (Custom TA4H) description: Quickly start building an AI model to categorize and extract information from healthcare unstructured text.-
+#
ai-services Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/reference/glossary.md
Title: Definitions used in custom Text Analytics for health description: Learn about definitions used in custom Text Analytics for health-
+#
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-analytics-for-health/reference/service-limits.md
Title: Custom Text Analytics for health service limits description: Learn about the data and service limits when using Custom Text Analytics for health.-
+#
ai-services Data Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/concepts/data-formats.md
Title: Custom text classification data formats description: Learn about the data formats accepted by custom text classification.-
+#
ai-services Evaluation Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/concepts/evaluation-metrics.md
Title: Custom text classification evaluation metrics description: Learn about evaluation metrics in custom text classification.-
+#
ai-services Fail Over https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/fail-over.md
Title: Back up and recover your custom text classification models description: Learn how to save and recover your custom text classification models.-
+#
ai-services Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/faq.md
Title: Custom text classification FAQ description: Learn the answers to frequently asked questions about the custom text classification API.-
+#
ai-services Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/glossary.md
Title: Definitions used in custom text classification description: Learn about definitions used in custom text classification.-
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/how-to/call-api.md
Title: Send a text classification request to your custom model description: Learn how to send requests for custom text classification. -
+#
ai-services Create Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/how-to/create-project.md
Title: How to create custom text classification projects description: Learn about the steps for using Azure resources with custom text classification.-
+#
ai-services Deploy Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/how-to/deploy-model.md
Title: How to deploy a custom text classification model description: Learn how to deploy a model for custom text classification.-
+#
ai-services Design Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/how-to/design-schema.md
Title: How to prepare data and define a custom classification schema description: Learn about data selection, preparation, and creating a schema for custom text classification projects.-
+#
ai-services Tag Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/how-to/tag-data.md
Title: How to label your data for custom classification - Azure AI services description: Learn how to label your data for use with custom text classification.-
+#
ai-services Train Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/how-to/train-model.md
Title: How to train your custom text classification model - Azure AI services description: Learn about how to train your model for custom text classification.-
+#
ai-services Use Autolabeling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/how-to/use-autolabeling.md
Title: How to use autolabeling in custom text classification description: Learn how to use autolabeling in custom text classification.-
+#
ai-services View Model Evaluation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/how-to/view-model-evaluation.md
Title: View a custom text classification model evaluation - Azure AI services description: Learn how to view the evaluation scores for a custom text classification model-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/language-support.md
Title: Language support in custom text classification description: Learn about which languages are supported by custom text classification.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/overview.md
Title: Custom text classification - Azure AI services description: Customize an AI model to classify documents and other content using Azure AI services.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/quickstart.md
Title: Quickstart - Custom text classification description: Quickly start building an AI model to identify and apply labels (classify) unstructured text.-
+#
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/service-limits.md
Title: Custom text classification limits description: Learn about the data and rate limits when using custom text classification.-
+#
Last updated 08/23/2023
ai-services Triage Email https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom-text-classification/tutorials/triage-email.md
Title: Triage incoming emails with Power Automate description: Learn how to use custom text classification to categorize and triage incoming emails with Power Automate-
+#
ai-services Azure Machine Learning Labeling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/custom/azure-machine-learning-labeling.md
Title: Use Azure Machine Learning labeling in Language Studio description: Learn how to label your data in Azure Machine Learning, and import it for use in the Language service. -
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/entity-linking/how-to/call-api.md
Title: How to call the entity linking API description: Learn how to identify and link entities found in text with the entity linking API.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/entity-linking/language-support.md
Title: Language support for entity linking description: A list of natural languages supported by the entity linking API-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/entity-linking/overview.md
Title: What is entity linking in Azure AI Language? description: An overview of entity linking in Azure AI services, which helps you extract entities from text, and provides links to an online knowledge base.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/entity-linking/quickstart.md
Title: "Quickstart: Entity linking using the client library and REST API" description: 'Use this quickstart to perform Entity Linking, using C#, Python, Java, JavaScript, and the REST API.'-
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/key-phrase-extraction/how-to/call-api.md
Title: How to call the Key Phrase Extraction API description: How to extract key phrases by using the Key Phrase Extraction API.-
+#
ai-services Use Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/key-phrase-extraction/how-to/use-containers.md
Title: Use Docker containers for Key Phrase Extraction on-premises description: Learn how to use Docker containers for Key Phrase Extraction on-premises.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/key-phrase-extraction/language-support.md
Title: Language support for Key Phrase Extraction description: Use this article to find the natural languages supported by Key Phrase Extraction.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/key-phrase-extraction/overview.md
Title: What is key phrase extraction in Azure AI Language? description: An overview of key phrase extraction in Azure AI services, which helps you identify main concepts in unstructured text-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/key-phrase-extraction/quickstart.md
Title: "Quickstart: Use the Key Phrase Extraction client library" description: Use this quickstart to start using the Key Phrase Extraction API.-
+#
ai-services Integrate Power Bi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/key-phrase-extraction/tutorials/integrate-power-bi.md
Title: 'Tutorial: Integrate Power BI with key phrase extraction' description: Learn how to use the key phrase extraction feature to get text stored in Power BI.-
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/language-detection/how-to/call-api.md
Title: How to perform language detection description: This article will show you how to detect the language of written text using language detection.-
+#
ai-services Use Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/language-detection/how-to/use-containers.md
Title: Use language detection Docker containers on-premises description: Use Docker containers for the Language Detection API to determine the language of written text, on-premises.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/language-detection/language-support.md
Title: Language Detection language support description: This article explains which natural languages are supported by the Language Detection API.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/language-detection/overview.md
Title: What is language detection in Azure AI Language? description: An overview of language detection in Azure AI services, which helps you detect the language that text is written in by returning language codes.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/language-detection/quickstart.md
Title: "Quickstart: Use the Language Detection client library" description: Use this quickstart to start using Language Detection.-
+#
ai-services Entity Metadata https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/concepts/entity-metadata.md
Title: Entity Metadata provided by Named Entity Recognition description: Learn about entity metadata in the NER feature.-
+#
ai-services Entity Resolutions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/concepts/entity-resolutions.md
Title: Entity resolutions provided by Named Entity Recognition description: Learn about entity resolutions in the NER feature.-
+#
ai-services Ga Preview Mapping https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/concepts/ga-preview-mapping.md
Title: Preview API overview description: Learn about the NER preview API.-
+#
ai-services Named Entity Categories https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/concepts/named-entity-categories.md
Title: Entity categories recognized by Named Entity Recognition in Azure AI Language description: Learn about the entities the NER feature can recognize from unstructured text.-
+#
ai-services How To Call https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/how-to-call.md
Title: How to perform Named Entity Recognition (NER) description: This article will show you how to extract named entities from text.-
+#
ai-services Use Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/how-to/use-containers.md
Title: Use named entity recognition Docker containers on-premises description: Use Docker containers for the Named Entity Recognition API to extract named entities from written text, on-premises.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/language-support.md
Title: Named Entity Recognition (NER) language support description: This article explains which natural languages are supported by the NER feature of Azure AI Language.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/overview.md
Title: What is the Named Entity Recognition (NER) feature in Azure AI Language? description: An overview of the Named Entity Recognition feature in Azure AI services, which helps you extract categories of entities in text.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/quickstart.md
Title: "Quickstart: Use the NER client library" description: Use this quickstart to start using the Named Entity Recognition (NER) API.-
+#
ai-services Extract Excel Information https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/named-entity-recognition/tutorials/extract-excel-information.md
Title: Extract information in Excel using Power Automate description: Learn how to Extract Excel text without having to write code, using Named Entity Recognition and Power Automate.-
+#
ai-services Data Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/concepts/data-formats.md
Title: Orchestration workflow data formats description: Learn about the data formats accepted by orchestration workflow.-
+#
ai-services Evaluation Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/concepts/evaluation-metrics.md
Title: Orchestration workflow model evaluation metrics description: Learn about evaluation metrics in orchestration workflow-
+#
ai-services Fail Over https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/concepts/fail-over.md
Title: Save and recover orchestration workflow models description: Learn how to save and recover your orchestration workflow models.-
+#
ai-services None Intent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/concepts/none-intent.md
Title: Orchestration workflow none intent description: Learn about the default None intent in orchestration workflow.-
+#
ai-services Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/faq.md
Title: Frequently Asked Questions for orchestration projects description: Use this article to quickly get the answers to FAQ about orchestration projects-
+#
ai-services Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/glossary.md
Title: Definitions used in orchestration workflow description: Learn about definitions used in orchestration workflow.-
+#
ai-services Build Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/how-to/build-schema.md
Title: How to build an orchestration project schema description: Learn how to define intents for your orchestration workflow project. -
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/how-to/call-api.md
Title: How to send requests to orchestration workflow description: Learn about sending requests for orchestration workflow.-
+#
ai-services Create Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/how-to/create-project.md
Title: Create orchestration workflow projects and use Azure resources description: Use this article to learn how to create projects in orchestration workflow-
+#
ai-services Deploy Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/how-to/deploy-model.md
Title: How to deploy an orchestration workflow project description: Learn about deploying orchestration workflow projects.-
+#
ai-services Tag Utterances https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/how-to/tag-utterances.md
Title: How to tag utterances in an orchestration workflow project description: Use this article to tag utterances-
+#
ai-services Train Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/how-to/train-model.md
Title: How to train and evaluate models in orchestration workflow description: Learn how to train a model for orchestration workflow projects. -
+#
ai-services View Model Evaluation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/how-to/view-model-evaluation.md
Title: How to view orchestration workflow models details description: Learn how to view details for your model and evaluate its performance. -
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/language-support.md
Title: Language support for orchestration workflow description: Learn about the languages supported by orchestration workflow.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/overview.md
Title: Orchestration workflows - Azure AI services description: Customize an AI model to connect your Conversational Language Understanding, question answering and LUIS applications.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/quickstart.md
Title: Quickstart - Orchestration workflow description: Quickly start creating an AI model to connect your Conversational Language Understanding, question answering and LUIS applications.-
+#
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/orchestration-workflow/service-limits.md
Title: Orchestration workflow limits description: Learn about the data, region, and throughput limits for Orchestration workflow-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/overview.md
Title: What is Azure AI Language description: Learn how to integrate AI into your applications that can extract information and understand written language.-
+#
ai-services Conversations Entity Categories https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/personally-identifiable-information/concepts/conversations-entity-categories.md
Title: Entity categories recognized by Conversational Personally Identifiable Information (detection) in Azure AI Language description: Learn about the entities the Conversational PII feature (preview) can recognize from conversation inputs.-
+#
ai-services Entity Categories https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/personally-identifiable-information/concepts/entity-categories.md
Title: Entity categories recognized by Personally Identifiable Information (detection) in Azure AI Language description: Learn about the entities the PII feature can recognize from unstructured text.-
+#
ai-services How To Call For Conversations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/personally-identifiable-information/how-to-call-for-conversations.md
Title: How to detect Personally Identifiable Information (PII) in conversations description: This article will show you how to extract PII from chat and spoken transcripts and redact identifiable information.-
+#
ai-services How To Call https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/personally-identifiable-information/how-to-call.md
Title: How to detect Personally Identifiable Information (PII) description: This article will show you how to extract PII and health information (PHI) from text and detect identifiable information.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/personally-identifiable-information/language-support.md
Title: Personally Identifiable Information (PII) detection language support description: This article explains which natural languages are supported by the PII detection feature of Azure AI Language.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/personally-identifiable-information/overview.md
Title: What is the Personally Identifying Information (PII) detection feature in Azure AI Language? description: An overview of the PII detection feature in Azure AI services, which helps you extract entities and sensitive information (PII) in text.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/personally-identifiable-information/quickstart.md
Title: "Quickstart: Detect Personally Identifying Information (PII) in text" description: Use this quickstart to start using the PII detection API.-
+#
ai-services Confidence Score https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/question-answering/concepts/confidence-score.md
Title: Confidence score - question answering description: When a user query is matched against a knowledge base, question answering returns relevant answers, along with a confidence score.-
+#
ai-services Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/question-answering/how-to/analytics.md
Title: Analytics on projects - custom question answering description: Custom question answering uses Azure diagnostic logging to store the telemetry data and chat logs-
+#
ai-services Chit Chat https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/question-answering/how-to/chit-chat.md
Title: Adding chitchat to a custom question answering project description: Adding personal chitchat to your bot makes it more conversational and engaging when you create a project. Custom question answering lets you easily add a prepopulated set of the top chitchat to your projects.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/question-answering/language-support.md
Title: Language support - custom question answering description: A list of cultures and natural languages supported by custom question answering for your project. Don't mix languages in the same project.-
+#
ai-services Data Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/custom/concepts/data-formats.md
Title: Custom sentiment analysis data formats description: Learn about the data formats accepted by custom sentiment analysis.-
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/custom/how-to/call-api.md
Title: Send a Custom sentiment analysis request to your custom model description: Learn how to send requests for Custom sentiment analysis. -
+#
ai-services Create Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/custom/how-to/create-project.md
Title: How to create Custom sentiment analysis projects description: Learn about the steps for using Azure resources with Custom sentiment analysis.-
+#
ai-services Deploy Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/custom/how-to/deploy-model.md
Title: Deploy a Custom sentiment analysis model description: Learn about deploying a model for Custom sentiment analysis.-
+#
ai-services Design Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/custom/how-to/design-schema.md
Title: How to prepare data and define a custom sentiment analysis schema description: Learn about data selection and preparation for custom sentiment analysis projects.-
+#
ai-services Label Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/custom/how-to/label-data.md
Title: How to label your data for Custom sentiment analysis - Azure AI services description: Learn how to label your data for use with Custom sentiment analysis.-
+#
ai-services Train Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/custom/how-to/train-model.md
Title: How to train your Custom sentiment analysis model - Azure AI services description: Learn about how to train your model for Custom sentiment analysis.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/custom/quickstart.md
Title: Quickstart - Custom sentiment analysis description: Quickly start building an AI model to identify the sentiment of text.-
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/how-to/call-api.md
Title: How to perform sentiment analysis and opinion mining description: This article will show you how to detect sentiment, and mine for opinions in text.-
+#
ai-services Use Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/how-to/use-containers.md
Title: Install and run Docker containers for Sentiment Analysis description: Use the Docker containers for the Sentiment Analysis API to perform natural language processing such as sentiment analysis, on-premises.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/language-support.md
Title: Sentiment Analysis and Opinion Mining language support description: This article explains which languages are supported by the Sentiment Analysis and Opinion Mining features of the Language service.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/overview.md
Title: What is sentiment analysis and opinion mining in the Language service? description: An overview of the sentiment analysis feature in Azure AI services, which helps you find out what people think of a topic by mining text for clues.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/sentiment-opinion-mining/quickstart.md
Title: "Quickstart: Use the Sentiment Analysis client library and REST API" description: Use this quickstart to start using the Sentiment Analysis API.-
+#
ai-services Data Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/custom/how-to/data-formats.md
Title: Prepare data for custom summarization description: Learn about how to select and prepare data, to be successful in creating custom summarization projects.-
+#
ai-services Deploy Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/custom/how-to/deploy-model.md
Title: Deploy a custom summarization model description: Learn about deploying a model for Custom summarization.-
+#
ai-services Test Evaluate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/custom/how-to/test-evaluate.md
Title: Test and evaluate models in custom summarization description: Learn about how to test and evaluate custom summarization models.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/custom/quickstart.md
Title: Quickstart - Custom summarization (preview) description: Quickly start building an AI model to summarize text.-
+#
ai-services Conversation Summarization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/how-to/conversation-summarization.md
Title: Summarize text with the conversation summarization API description: This article shows you how to summarize chat logs with the conversation summarization API.-
+#
ai-services Document Summarization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/how-to/document-summarization.md
Title: Summarize text with the extractive summarization API description: This article shows you how to summarize text with the extractive summarization API.-
+#
curl -i -X POST https://<your-language-resource-endpoint>/language/analyze-text/
"kind": "AbstractiveSummarization", "taskName": "Document Abstractive Summarization Task 1", "parameters": {
- "sentenceCount": 1
+ "summaryLength": short
} } ] } ' ```
-If you don't specify `sentenceCount`, the model determines the summary length. Note that `sentenceCount` is the approximation of the sentence count of the output summary, range 1 to 20.
+If you don't specify `sentenceCount`, the model determines the summary length. Note that `sentenceCount` is an approximation of the sentence count of the output summary, in the range of 1 to 20. Using `sentenceCount` is not recommended for abstractive summarization.
2. Make the following changes in the command where needed: - Replace the value `your-language-resource-key` with your key.
curl -X GET https://<your-language-resource-endpoint>/language/analyze-text/jobs
The following cURL commands are executed from a BASH shell. Edit these commands with your own resource name, resource key, and JSON values.
-## Query based extractive summarization
+## Query-based summarization
-The query-based extractive summarization API is an extension to the existing document summarization API.
+The query-based document summarization API is an extension to the existing document summarization API.
-The biggest difference is a new `query` field in the request body (under `tasks` > `parameters` > `query`). Additionally, there's a new way to specify the preferred `summaryLength` in "buckets" of short/medium/long, which we recommend using instead of `sentenceCount`. Below is an example request:
+The biggest difference is a new `query` field in the request body (under `tasks` > `parameters` > `query`). Additionally, there's a new way to specify the preferred `summaryLength` in "buckets" of short/medium/long, which we recommend using instead of `sentenceCount`, especially when using abstractive summarization. Below is an example request:
```bash curl -i -X POST https://<your-language-resource-endpoint>/language/analyze-text/jobs?api-version=2023-11-15-preview \
curl -i -X POST https://<your-language-resource-endpoint>/language/analyze-text/
"taskName": "Document Extractive Summarization Task 1", "parameters": { "query": "XYZ-code",
- "sentenceCount": 1
+ "summaryLength": short
} } ]
curl -i -X POST https://<your-language-resource-endpoint>/language/analyze-text/
' ```
-## Query based abstractive summarization
-
-The query-based abstractive summarization API is an extension to the existing document summarization API.
-
-The biggest difference is a new `query` field in the request body (under `tasks` > `parameters` > `query`). Additionally, there's a new way to specify the preferred `summaryLength` in "buckets" of short/medium/long, which we recommend using instead of `sentenceCount`. Below is an example request:
-
-```bash
-curl -i -X POST https://<your-language-resource-endpoint>/language/analyze-text/jobs?api-version=2023-11-15-preview \
--H "Content-Type: application/json" \--H "Ocp-Apim-Subscription-Key: <your-language-resource-key>" \--d \
-'
-{
- "displayName": "Document Abstractive Summarization Task Example",
- "analysisInput": {
- "documents": [
- {
- "id": "1",
- "language": "en",
- "text": "At Microsoft, we have been on a quest to advance AI beyond existing techniques, by taking a more holistic, human-centric approach to learning and understanding. As Chief Technology Officer of Azure AI services, I have been working with a team of amazing scientists and engineers to turn this quest into a reality. In my role, I enjoy a unique perspective in viewing the relationship among three attributes of human cognition: monolingual text (X), audio or visual sensory signals, (Y) and multilingual (Z). At the intersection of all three, thereΓÇÖs magicΓÇöwhat we call XYZ-code as illustrated in Figure 1ΓÇöa joint representation to create more powerful AI that can speak, hear, see, and understand humans better. We believe XYZ-code enables us to fulfill our long-term vision: cross-domain transfer learning, spanning modalities and languages. The goal is to have pretrained models that can jointly learn representations to support a broad range of downstream AI tasks, much in the way humans do today. Over the past five years, we have achieved human performance on benchmarks in conversational speech recognition, machine translation, conversational question answering, machine reading comprehension, and image captioning. These five breakthroughs provided us with strong signals toward our more ambitious aspiration to produce a leap in AI capabilities, achieving multi-sensory and multilingual learning that is closer in line with how humans learn and understand. I believe the joint XYZ-code is a foundational component of this aspiration, if grounded with external knowledge sources in the downstream AI tasks."
- }
- ]
- },
- "tasks": [
- {
- "kind": "AbstractiveSummarization",
- "taskName": "Document Abstractive Summarization Task 1",
- "parameters": {
- "query": "XYZ-code",
- "summaryLength": "short"
- }
- }
- ]
-}
-'
-```
### Using the `summaryLength` parameter

For the `summaryLength` parameter, three values are accepted:
* short: Generates a summary of mostly 2-3 sentences, with around 120 tokens.
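As a sketch of how the request bodies above are assembled, the following Python helper (hypothetical, not part of any Azure SDK) builds the `tasks` entry using the preferred `summaryLength` buckets instead of `sentenceCount`:

```python
import json

# Hypothetical helper: builds the "tasks" entry for an abstractive
# summarization request, using the summaryLength buckets.
def build_abstractive_task(summary_length="short", query=None):
    params = {"summaryLength": summary_length}  # "short", "medium", or "long"
    if query is not None:
        params["query"] = query  # enables query-based summarization
    return {
        "kind": "AbstractiveSummarization",
        "taskName": "Document Abstractive Summarization Task 1",
        "parameters": params,
    }

print(json.dumps(build_abstractive_task("short", query="XYZ-code"), indent=2))
```

The resulting object is what goes into the `tasks` array of the request body shown earlier.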
ai-services Use Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/how-to/use-containers.md
Title: Use summarization Docker containers on-premises description: Use Docker containers for the summarization API to summarize text, on-premises.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/language-support.md
Title: Summarization language support description: Learn about which languages are supported by document summarization.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/overview.md
Title: What is document and conversation summarization? description: Learn about summarizing text.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/quickstart.md
Title: "Quickstart: Use Document Summarization" description: Use this quickstart to start using Document Summarization.-
+#
ai-services Region Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/summarization/region-support.md
Title: Summarization region support description: Learn about which regions are supported by document summarization.-
+#
Some summarization features are only available in limited regions. More regions
## Regional availability table
-|Region |Document abstractive summarization|Conversation issue and resolution summarization|Conversation narrative summarization with chapters|Custom summarization|
-||-|--|--|--|
-|Azure Gov Virginia|&#9989; |&#9989; |&#9989; |&#10060; |
-|North Europe |&#9989; |&#9989; |&#9989; |&#10060; |
-|East US |&#9989; |&#9989; |&#9989; |&#9989; |
-|UK South |&#9989; |&#9989; |&#9989; |&#10060; |
-|Southeast Asia |&#9989; |&#9989; |&#9989; |&#10060; |
-|Central Sweden |&#9989; |&#10060; |&#10060; |&#10060; |
+|Region |Document abstractive summarization|Conversation summarization |Custom summarization|
+||-|--|--|
+|Azure Gov Virginia|&#9989; |&#9989; |&#10060; |
+|North Europe |&#9989; |&#9989; |&#10060; |
+|East US |&#9989; |&#9989; |&#9989; |
+|UK South |&#9989; |&#9989; |&#10060; |
+|Southeast Asia |&#9989; |&#9989; |&#10060; |
+|Central Sweden |&#9989; |&#9989; |&#10060; |
## Next steps
ai-services Assertion Detection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/concepts/assertion-detection.md
Title: Assertion detection in Text Analytics for health description: Learn about assertion detection.-
+#
ai-services Health Entity Categories https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/concepts/health-entity-categories.md
Title: Entity categories recognized by Text Analytics for health description: Learn about categories recognized by Text Analytics for health-
+#
ai-services Relation Extraction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/concepts/relation-extraction.md
Title: Relation extraction in Text Analytics for health description: Learn about relation extraction-
+#
ai-services Call Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/how-to/call-api.md
Title: How to call Text Analytics for health description: Learn how to extract and label medical information from unstructured clinical text with Text Analytics for health.-
+#
ai-services Configure Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/how-to/configure-containers.md
Title: Configure Text Analytics for health containers description: Text Analytics for health containers uses a common configuration framework, so that you can easily configure and manage storage, logging and telemetry, and security settings for your containers.-
+#
ai-services Use Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/how-to/use-containers.md
Title: How to use Text Analytics for health containers description: Learn how to extract and label medical information on premises using Text Analytics for health Docker container.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/language-support.md
Title: Text Analytics for health language support description: "This article explains which natural languages are supported by the Text Analytics for health."-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/overview.md
Title: What is the Text Analytics for health in Azure AI Language? description: An overview of Text Analytics for health in Azure AI services, which helps you extract medical information from unstructured text, like clinical documents.-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/text-analytics-for-health/quickstart.md
Title: "Quickstart: Use the Text Analytics for health REST API and client library" description: Use this quickstart to start using Text Analytics for health.-
+#
ai-services Power Automate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/tutorials/power-automate.md
Title: Use Language service in power automate description: Learn how to use Azure AI Language in power automate, without writing code.-
+#
ai-services Use Kubernetes Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/tutorials/use-kubernetes-service.md
Title: Deploy a key phrase extraction container to Azure Kubernetes Service description: Deploy a key phrase extraction container image to Azure Kubernetes Service, and test it in a web browser.-
+#
ai-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-service/whats-new.md
Title: What's new in Azure AI Language? description: Find out about new releases and features for the Azure AI Language.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/language-support.md
Title: Language support description: Azure AI services enable you to build applications that see, hear, speak with, and understand your users.-
+#
ai-services Multi Service Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/multi-service-resource.md
Title: Create a multi-service resource for Azure AI services description: Create and manage a multi-service resource for Azure AI services-
+#
keywords: Azure AI services, cognitive
ai-services Chatgpt Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/chatgpt-quickstart.md
Title: 'Quickstart - Get started using GPT-35-Turbo and GPT-4 with Azure OpenAI Service' description: Walkthrough on how to get started with GPT-35-Turbo and GPT-4 on Azure OpenAI Service.-
+#
ai-services Fine Tuning Considerations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/concepts/fine-tuning-considerations.md
Title: Azure OpenAI Service fine-tuning considerations description: Learn more about what you should take into consideration before fine-tuning with Azure OpenAI Service -
+#
ai-services Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/concepts/models.md
description: Learn about the different model capabilities that are available with Azure OpenAI. Previously updated : 10/04/2023 Last updated : 11/17/2023
Azure OpenAI Service is powered by a diverse set of models with different capabi
| Models | Description | |--|--|
-| [GPT-4](#gpt-4) | A set of models that improve on GPT-3.5 and can understand and generate natural language and code. |
+| [GPT-4](#gpt-4-and-gpt-4-turbo-preview) | A set of models that improve on GPT-3.5 and can understand and generate natural language and code. |
| [GPT-3.5](#gpt-35) | A set of models that improve on GPT-3 and can understand and generate natural language and code. | | [Embeddings](#embeddings-models) | A set of models that can convert text into numerical vector form to facilitate text similarity. | | [DALL-E](#dall-e-models-preview) (Preview) | A series of models in preview that can generate original images from natural language. | | [Whisper](#whisper-models-preview) (Preview) | A series of models in preview that can transcribe and translate speech to text. |
-## GPT-4
+## GPT-4 and GPT-4 Turbo Preview
GPT-4 can solve difficult problems with greater accuracy than any of OpenAI's previous models. Like GPT-3.5 Turbo, GPT-4 is optimized for chat and works well for traditional completions tasks. Use the Chat Completions API to use GPT-4. To learn more about how to interact with GPT-4 and the Chat Completions API check out our [in-depth how-to](../how-to/chatgpt.md).
You can also use the Whisper model via Azure AI Speech [batch transcription](../
> > - South Central US is temporarily unavailable for creating new resources and deployments.
-### GPT-4 models
+### GPT-4 and GPT-4 Turbo Preview models
GPT-4 and GPT-4-32k models are now available to all Azure OpenAI Service customers. Availability varies by region. If you don't see GPT-4 in your region, please check back later.
See [model versions](../concepts/model-versions.md) to learn about how Azure Ope
> Version `0314` of `gpt-4` and `gpt-4-32k` will be retired no earlier than July 5, 2024. See [model updates](../how-to/working-with-models.md#model-updates) for model upgrade behavior. | Model ID | Max Request (tokens) | Training Data (up to) |
-| | :: | :: |
+| | : | :: |
| `gpt-4` (0314) | 8,192 | Sep 2021 | | `gpt-4-32k`(0314) | 32,768 | Sep 2021 | | `gpt-4` (0613) | 8,192 | Sep 2021 | | `gpt-4-32k` (0613) | 32,768 | Sep 2021 |
+| `gpt-4` (1106-preview)**<sup>1</sup>** | Input: 128,000 <br> Output: 4096 | Apr 2023 |
+
+**<sup>1</sup>** We don't recommend using this model in production. We will upgrade all deployments of this model to a future stable version. Models designated preview do not follow the standard Azure OpenAI model lifecycle.
> [!NOTE]
-> Regions where GPT-4 is listed as available have access to both the 8K and 32K versions of the model
+> Regions where GPT-4 (0314) and (0613) are listed as available have access to both the 8K and 32K versions of the model.
-### GPT-4 model availability
+### GPT-4 and GPT-4 Turbo Preview model availability
-| Model Availability | gpt-4 (0314) | gpt-4 (0613) |
-||:|:|
-| Available to all subscriptions with Azure OpenAI access | | Australia East <br> Canada East <br> France Central <br> Sweden Central <br> Switzerland North |
+| Model Availability | gpt-4 (0314) | gpt-4 (0613) | gpt-4 (1106-preview) |
+||:|:|:|
+| Available to all subscriptions with Azure OpenAI access | | Australia East <br> Canada East <br> France Central <br> Sweden Central <br> Switzerland North | Australia East <br> Canada East <br> East US 2 <br> France Central <br> Norway East <br> South India <br> Sweden Central <br> UK South <br> West US |
| Available to subscriptions with current access to the model version in the region | East US <br> France Central <br> South Central US <br> UK South | East US <br> East US 2 <br> Japan East <br> UK South | ### GPT-3.5 models
See [model versions](../concepts/model-versions.md) to learn about how Azure Ope
| Model ID | Model Availability | Max Request (tokens) | Training Data (up to) | | | -- |::|:-:|
-| `gpt-35-turbo`<sup>1</sup> (0301) | East US <br> France Central <br> South Central US <br> UK South <br> West Europe | 4096 | Sep 2021 |
+| `gpt-35-turbo`**<sup>1</sup>** (0301) | East US <br> France Central <br> South Central US <br> UK South <br> West Europe | 4096 | Sep 2021 |
| `gpt-35-turbo` (0613) | Australia East <br> Canada East <br> East US <br> East US 2 <br> France Central <br> Japan East <br> North Central US <br> Sweden Central <br> Switzerland North <br> UK South | 4096 | Sep 2021 | | `gpt-35-turbo-16k` (0613) | Australia East <br> Canada East <br> East US <br> East US 2 <br> France Central <br> Japan East <br> North Central US <br> Sweden Central <br> Switzerland North<br> UK South | 16,384 | Sep 2021 | | `gpt-35-turbo-instruct` (0914) | East US <br> Sweden Central | 4097 |Sep 2021 |
+| `gpt-35-turbo` (1106) | Australia East <br> Canada East <br> France Central <br> South India <br> Sweden Central<br> UK South <br> West US | Input: 16,385<br> Output: 4,096 | Sep 2021|
-<sup>1</sup> This model will accept requests > 4096 tokens. It is not recommended to exceed the 4096 input token limit as the newer version of the model are capped at 4096 tokens. If you encounter issues when exceeding 4096 input tokens with this model this configuration is not officially supported.
+**<sup>1</sup>** This model accepts requests larger than 4,096 tokens. However, exceeding the 4,096 input token limit is not recommended, because newer versions of the model are capped at 4,096 tokens. If you encounter issues when exceeding 4,096 input tokens with this model, this configuration is not officially supported.
### Embeddings models
ai-services Understand Embeddings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/concepts/understand-embeddings.md
Title: Azure OpenAI Service embeddings description: Learn more about how the Azure OpenAI embeddings API uses cosine similarity for document search and to measure similarity between texts.-
+#
ai-services Use Your Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/concepts/use-your-data.md
Title: 'Using your data with Azure OpenAI Service' description: Use this article to learn about using your data for better text generation in Azure OpenAI.-
+#
ai-services Dall E Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/dall-e-quickstart.md
Title: 'Quickstart: Generate images with Azure OpenAI Service' description: Learn how to get started generating images with Azure OpenAI Service by using the Python SDK, the REST APIs, or Azure OpenAI Studio.-
+#
ai-services Business Continuity Disaster Recovery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/business-continuity-disaster-recovery.md
Title: 'Business Continuity and Disaster Recovery (BCDR) with Azure OpenAI Service' description: Considerations for implementing Business Continuity and Disaster Recovery (BCDR) with Azure OpenAI -
+#
ai-services Completions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/completions.md
Title: 'How to generate text with Azure OpenAI Service' description: Learn how to generate or manipulate text, including code by using a completion endpoint in Azure OpenAI Service.-
+#
ai-services Content Filters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/content-filters.md
Title: 'How to use content filters (preview) with Azure OpenAI Service' description: Learn how to use content filters (preview) with Azure OpenAI Service-
+#
ai-services Create Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/create-resource.md
Title: 'How-to: Create and deploy an Azure OpenAI Service resource' description: Learn how to get started with Azure OpenAI Service and create your first resource and deploy your first model in the Azure CLI or the Azure portal.-
+#
ai-services Embeddings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/embeddings.md
Title: 'How to generate embeddings with Azure OpenAI Service' description: Learn how to generate embeddings with Azure OpenAI-
+#
ai-services Fine Tuning https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/fine-tuning.md
Title: 'Customize a model with Azure OpenAI Service' description: Learn how to create your own customized model with Azure OpenAI Service by using Python, the REST APIs, or Azure OpenAI Studio.-
+#
ai-services Integrate Synapseml https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/integrate-synapseml.md
Title: 'Use Azure OpenAI Service with large datasets' description: Learn how to integrate Azure OpenAI Service with SynapseML and Apache Spark to apply large language models at a distributed scale.-
+#
ai-services Json Mode https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/json-mode.md
+
+ Title: 'How to use JSON mode with Azure OpenAI Service'
+
+description: Learn how to improve your chat completions with Azure OpenAI JSON mode
++++ Last updated : 11/17/2023++
+recommendations: false
+keywords:
+++
+# Learn how to use JSON mode
+
+JSON mode allows you to set the model's response format to return a valid JSON object as part of a chat completion. While generating valid JSON was possible previously, there could be issues with response consistency that would lead to invalid JSON objects being generated.
+
+## JSON mode support
+
+JSON mode is only currently supported with the following:
+
+### Supported models
+
+- `gpt-4-1106-preview`
+- `gpt-35-turbo-1106`
+
+### API version
+
+- `2023-12-01-preview`
+
+## Example
+
+```python
+import os
+from openai import AzureOpenAI
+
+client = AzureOpenAI(
+ azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT"),
+ api_key=os.getenv("AZURE_OPENAI_KEY"),
+ api_version="2023-12-01-preview"
+)
+
+response = client.chat.completions.create(
+ model="gpt-4-1106-preview", # Model = should match the deployment name you chose for your 1106-preview model deployment
+ response_format={ "type": "json_object" },
+ messages=[
+ {"role": "system", "content": "You are a helpful assistant designed to output JSON."},
+ {"role": "user", "content": "Who won the world series in 2020?"}
+ ]
+)
+print(response.choices[0].message.content)
+```
+
+### Output
+
+```output
+{
+ "winner": "Los Angeles Dodgers",
+ "event": "World Series",
+ "year": 2020
+}
+```
+
+There are two key factors that need to be present to successfully use JSON mode:
+
+- `response_format={ "type": "json_object" }`
+- We have told the model to output JSON as part of the system message.
+
+Including guidance to the model that it should produce JSON as part of the messages conversation is **required**. We recommend adding this instruction as part of the system message. According to OpenAI, failure to add this instruction can cause the model to *"generate an unending stream of whitespace and the request may run continually until it reaches the token limit."*
+
+When using the [OpenAI Python API library](https://github.com/openai/openai-python) failure to include "JSON" within the messages will return:
+
+### Output
+
+```output
+BadRequestError: Error code: 400 - {'error': {'message': "'messages' must contain the word 'json' in some form, to use 'response_format' of type 'json_object'.", 'type': 'invalid_request_error', 'param': 'messages', 'code': None}}
+```
+
+## Additional considerations
+
+You should check `finish_reason` for the value `length` before parsing the response. When it's present, you might have generated partial JSON. This means that the output from the model was larger than the `max_tokens` value set as part of the request, or that the conversation itself exceeded the token limit.
+
+JSON mode will produce JSON that is valid and will parse without errors. However, this doesn't mean that output will match a specific schema.
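The two checks above can be combined into a small defensive parser. This is a hedged sketch: the stubbed response object only mimics the shape returned by the client in the earlier example, and the required keys are assumptions for illustration:

```python
import json
from types import SimpleNamespace

def parse_json_mode_response(response, required_keys=()):
    """Parse a JSON-mode chat completion defensively."""
    choice = response.choices[0]
    # finish_reason == "length" means max_tokens was hit, so the JSON
    # returned by the model may be truncated mid-object.
    if choice.finish_reason == "length":
        raise ValueError("Response truncated; JSON may be partial.")
    data = json.loads(choice.message.content)
    # JSON mode guarantees valid JSON, not a specific schema - check the
    # keys your application depends on explicitly.
    missing = [k for k in required_keys if k not in data]
    if missing:
        raise KeyError(f"Missing expected keys: {missing}")
    return data

# Stub standing in for the response from client.chat.completions.create(...)
stub = SimpleNamespace(choices=[SimpleNamespace(
    finish_reason="stop",
    message=SimpleNamespace(content='{"winner": "Los Angeles Dodgers", "year": 2020}'),
)])
print(parse_json_mode_response(stub, required_keys=("winner", "year")))
```

In real code, pass the actual response object from the client in place of the stub.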
ai-services Quota https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/quota.md
Title: Manage Azure OpenAI Service quota description: Learn how to use Azure OpenAI to control your deployments rate limits.-
+#
ai-services Reproducible Output https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/reproducible-output.md
+
+ Title: 'How to generate reproducible output with Azure OpenAI Service'
+
+description: Learn how to generate reproducible output (preview) with Azure OpenAI Service
++++ Last updated : 11/17/2023++
+recommendations: false
+keywords:
+++
+# Learn how to use reproducible output (preview)
+
+By default, if you ask an Azure OpenAI Chat Completion model the same question multiple times, you are likely to get a different response. The responses are therefore considered to be non-deterministic. Reproducible output is a new preview feature that allows you to selectively change the default behavior to produce more deterministic outputs.
+
+## Reproducible output support
+
+Reproducible output is only currently supported with the following:
+
+### Supported models
+
+- `gpt-4-1106-preview`
+- `gpt-35-turbo-1106`
+
+### API Version
+
+- `2023-12-01-preview`
+
+## Example
+
+First we'll generate three responses to the same question to demonstrate the variability that is common to Chat Completion responses even when other parameters are the same:
+
+```python
+import os
+from openai import AzureOpenAI
+
+client = AzureOpenAI(
+ azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT"),
+ api_key=os.getenv("AZURE_OPENAI_KEY"),
+ api_version="2023-12-01-preview"
+)
+
+for i in range(3):
+ print(f'Story Version {i + 1}\n')
+
+ response = client.chat.completions.create(
+ model="gpt-4-1106-preview", # Model = should match the deployment name you chose for your 1106-preview model deployment
+ #seed=42,
+ temperature=0.7,
+ max_tokens =200,
+ messages=[
+ {"role": "system", "content": "You are a helpful assistant."},
+ {"role": "user", "content": "Tell me a story about how the universe began?"}
+ ]
+ )
+
+ print(response.choices[0].message.content)
+ print("\n")
+
+ del response
+```
+
+### Output
+
+```output
+Story Version 1
+
+In the beginning, there was nothingness, a vast expanse of empty space, a blank canvas waiting to be painted with the wonders of existence. Then, approximately 13.8 billion years ago, something extraordinary happened, an event that would mark the birth of the universe - the Big Bang.
+
+The Big Bang was not an explosion in the conventional sense but rather an expansion, an incredibly rapid stretching of space that took place everywhere in the universe at once. In just a fraction of a second, the universe grew from smaller than a single atom to an incomprehensibly large expanse.
+
+In these first moments, the universe was unimaginably hot and dense, filled with a seething soup of subatomic particles and radiant energy. As the universe expanded, it began to cool, allowing the first particles to form. Protons and neutrons came together to create the first simple atomic nuclei in a process known as nucleosynthesis.
+
+For hundreds of thousands of years, the universe continued to cool and expand
++
+Story Version 2
+
+Once upon a time, in the vast expanse of nothingness, there was a moment that would come to define everything. This moment, a tiny fraction of a second that would be forever known as the Big Bang, marked the birth of the universe as we know it.
+
+Before this moment, there was no space, no time, just an infinitesimally small point of pure energy, a singularity where all the laws of physics as we understand them did not apply. Then, suddenly, this singular point began to expand at an incredible rate. In a cosmic symphony of creation, matter, energy, space, and time all burst forth into existence.
+
+The universe was a hot, dense soup of particles, a place of unimaginable heat and pressure. It was in this crucible of creation that the simplest elements were formed. Hydrogen and helium, the building blocks of the cosmos, came into being.
+
+As the universe continued to expand and cool, these primordial elements began to co
++
+Story Version 3
+
+Once upon a time, in the vast expanse of nothingness, there was a singularity, an infinitely small and infinitely dense point where all the mass and energy of what would become the universe were concentrated. This singularity was like a tightly wound cosmic spring holding within it the potential of everything that would ever exist.
+
+Then, approximately 13.8 billion years ago, something extraordinary happened. This singularity began to expand in an event we now call the Big Bang. In just a fraction of a second, the universe grew exponentially during a period known as cosmic inflation. It was like a symphony's first resounding chord, setting the stage for a cosmic performance that would unfold over billions of years.
+
+As the universe expanded and cooled, the fundamental forces of nature that we know today - gravity, electromagnetism, and the strong and weak nuclear forces - began to take shape. Particles of matter were created and began to clump together under the force of gravity, forming the first atoms
+
+```
+
+Notice that while each story might have similar elements and some verbatim repetition, the longer the response goes on, the more the stories tend to diverge.
+
+Now we'll run the same code as before, but this time uncomment the `seed=42` parameter line:
+
+```python
+import os
+from openai import AzureOpenAI
+
+client = AzureOpenAI(
+    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
+    api_key=os.getenv("AZURE_OPENAI_KEY"),
+    api_version="2023-12-01-preview"
+)
+
+for i in range(3):
+    print(f'Story Version {i + 1}\n')
+
+    response = client.chat.completions.create(
+        model="gpt-4-1106-preview",  # model should match the deployment name you chose for your 1106-preview model deployment
+        seed=42,
+        temperature=0.7,
+        max_tokens=200,
+        messages=[
+            {"role": "system", "content": "You are a helpful assistant."},
+            {"role": "user", "content": "Tell me a story about how the universe began?"}
+        ]
+    )
+
+    print(response.choices[0].message.content)
+    print("\n")
+
+    del response
+```
+
+### Output
+
+```
+Story Version 1
+
+In the beginning, there was nothing but a vast emptiness, a void without form or substance. Then, from this nothingness, a singular event occurred that would change the course of existence forever—The Big Bang.
+
+Around 13.8 billion years ago, an infinitely hot and dense point, no larger than a single atom, began to expand at an inconceivable speed. This was the birth of our universe, a moment where time and space came into being. As this primordial fireball grew, it cooled, and the fundamental forces that govern the cosmos—gravity, electromagnetism, and the strong and weak nuclear forces—began to take shape.
+
+Matter coalesced into the simplest elements, hydrogen and helium, which later formed vast clouds in the expanding universe. These clouds, driven by the force of gravity, began to collapse in on themselves, creating the first stars. The stars were crucibles of nuclear fusion, forging heavier elements like carbon, nitrogen, and oxygen
+---
+Story Version 2
+
+In the beginning, there was nothing but a vast emptiness, a void without form or substance. Then, from this nothingness, a singular event occurred that would change the course of existence forever—The Big Bang.
+
+Around 13.8 billion years ago, an infinitely hot and dense point, no larger than a single atom, began to expand at an inconceivable speed. This was the birth of our universe, a moment where time and space came into being. As this primordial fireball grew, it cooled, and the fundamental forces that govern the cosmos—gravity, electromagnetism, and the strong and weak nuclear forces—began to take shape.
+
+Matter coalesced into the simplest elements, hydrogen and helium, which later formed vast clouds in the expanding universe. These clouds, driven by the force of gravity, began to collapse in on themselves, creating the first stars. The stars were crucibles of nuclear fusion, forging heavier elements like carbon, nitrogen, and oxygen
+---
+Story Version 3
+
+In the beginning, there was nothing but a vast emptiness, a void without form or substance. Then, from this nothingness, a singular event occurred that would change the course of existence forever—The Big Bang.
+
+Around 13.8 billion years ago, an infinitely hot and dense point, no larger than a single atom, began to expand at an inconceivable speed. This was the birth of our universe, a moment where time and space came into being. As this primordial fireball grew, it cooled, and the fundamental forces that govern the cosmos—gravity, electromagnetism, and the strong and weak nuclear forces—began to take shape.
+
+Matter coalesced into the simplest elements, hydrogen and helium, which later formed vast clouds in the expanding universe. These clouds, driven by the force of gravity, began to collapse in on themselves, creating the first stars. The stars were crucibles of nuclear fusion, forging heavier elements like carbon, nitrogen, and oxygen
+
+```
+
+By using the same `seed` parameter of 42 for each of our three requests, we're able to produce much more consistent (in this case identical) results.
+
+## Parameter details
+
+`seed` is an optional parameter, which can be set to an integer or null.
+
+This feature is in Preview. If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result. Determinism isn't guaranteed, and you should refer to the `system_fingerprint` response parameter to monitor changes in the backend.
+
+`system_fingerprint` is a string and is part of the chat completion object.
+
+This fingerprint represents the backend configuration that the model runs with.
+
+It can be used in conjunction with the `seed` request parameter to understand when backend changes have been made that might affect determinism.
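As an illustrative sketch (the `detect_fingerprint_change` helper below is hypothetical, not part of the client library; it only wraps the `client` object from the earlier example), you could compare fingerprints across two otherwise identical requests to notice backend changes:

```python
# Sketch: detect backend configuration changes by comparing
# system_fingerprint across two identical requests. Assumes an
# AzureOpenAI `client` and deployment name like those shown earlier.

def detect_fingerprint_change(client, deployment, messages, seed=42):
    """Return (contents, changed), where `changed` is True if the
    backend fingerprint differed between the two requests."""
    fingerprints = []
    contents = []
    for _ in range(2):
        response = client.chat.completions.create(
            model=deployment,
            seed=seed,
            temperature=0.7,
            max_tokens=200,
            messages=messages,
        )
        fingerprints.append(response.system_fingerprint)
        contents.append(response.choices[0].message.content)
    return contents, fingerprints[0] != fingerprints[1]
```

If the two fingerprints differ, the backend configuration changed between the calls, and identical seeds may no longer yield identical output.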
+
+To view the full chat completion object with `system_fingerprint`, you can add `print(response.model_dump_json(indent=2))` to the previous code next to the existing `print` statement. This change results in the following additional information being part of the output:
+
+### Output
+
+```JSON
+{
+ "id": "chatcmpl-8LmLRatZxp8wsx07KGLKQF0b8Zez3",
+ "choices": [
+ {
+ "finish_reason": "length",
+ "index": 0,
+ "message": {
+      "content": "In the beginning, there was nothing but a vast emptiness, a void without form or substance. Then, from this nothingness, a singular event occurred that would change the course of existence forever—The Big Bang.\n\nAround 13.8 billion years ago, an infinitely hot and dense point, no larger than a single atom, began to expand at an inconceivable speed. This was the birth of our universe, a moment where time and space came into being. As this primordial fireball grew, it cooled, and the fundamental forces that govern the cosmos—gravity, electromagnetism, and the strong and weak nuclear forces—began to take shape.\n\nMatter coalesced into the simplest elements, hydrogen and helium, which later formed vast clouds in the expanding universe. These clouds, driven by the force of gravity, began to collapse in on themselves, creating the first stars. The stars were crucibles of nuclear fusion, forging heavier elements like carbon, nitrogen, and oxygen",
+ "role": "assistant",
+ "function_call": null,
+ "tool_calls": null
+ },
+ "content_filter_results": {
+ "hate": {
+ "filtered": false,
+ "severity": "safe"
+ },
+ "self_harm": {
+ "filtered": false,
+ "severity": "safe"
+ },
+ "sexual": {
+ "filtered": false,
+ "severity": "safe"
+ },
+ "violence": {
+ "filtered": false,
+ "severity": "safe"
+ }
+ }
+ }
+ ],
+ "created": 1700201417,
+ "model": "gpt-4",
+ "object": "chat.completion",
+ "system_fingerprint": "fp_50a4261de5",
+ "usage": {
+ "completion_tokens": 200,
+ "prompt_tokens": 27,
+ "total_tokens": 227
+ },
+ "prompt_filter_results": [
+ {
+ "prompt_index": 0,
+ "content_filter_results": {
+ "hate": {
+ "filtered": false,
+ "severity": "safe"
+ },
+ "self_harm": {
+ "filtered": false,
+ "severity": "safe"
+ },
+ "sexual": {
+ "filtered": false,
+ "severity": "safe"
+ },
+ "violence": {
+ "filtered": false,
+ "severity": "safe"
+ }
+ }
+ }
+ ]
+}
+```
+
+## Additional considerations
+
+When you want to use reproducible outputs, set the `seed` parameter to the same integer across chat completions calls. You should also match all other request parameters, such as `temperature` and `max_tokens`.
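One way to keep those parameters consistent is to centralize them in a single place. The following sketch is illustrative (the `REPRO_PARAMS` dict and `reproducible_chat` helper are hypothetical names, not part of the client library):

```python
# Sketch: centralize the request parameters that must match across
# calls for reproducible output. Only `seed` and the other request
# fields passed through to `chat.completions.create` are real API
# parameters; the helper itself is illustrative.

REPRO_PARAMS = {
    "seed": 42,
    "temperature": 0.7,
    "max_tokens": 200,
}

def reproducible_chat(client, deployment, messages, **overrides):
    """Issue a chat completion with one fixed set of sampling
    parameters so repeated calls are as deterministic as possible."""
    params = {**REPRO_PARAMS, **overrides}
    return client.chat.completions.create(
        model=deployment,
        messages=messages,
        **params,
    )
```

Routing every call through one helper makes it harder to accidentally vary `temperature` or `max_tokens` between requests that are supposed to be reproducible.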
ai-services Role Based Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/role-based-access-control.md
Title: Role-based access control for Azure OpenAI description: Learn how to use Azure RBAC for managing individual access to Azure OpenAI resources.-
+#
ai-services Use Blocklists https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/use-blocklists.md
Title: 'How to use blocklists with Azure OpenAI Service' description: Learn how to use blocklists with Azure OpenAI Service-
+#
ai-services Work With Code https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/how-to/work-with-code.md
Title: 'How to use the Codex models to work with code' description: Learn how to use the Codex models on Azure OpenAI to handle a variety of coding tasks-
+#
ai-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/quickstart.md
Title: 'Quickstart - Deploy a model and generate text using Azure OpenAI Service' description: Walkthrough on how to get started with Azure OpenAI and make your first completions call.-
+#
ai-services Quotas Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/quotas-limits.md
Title: Azure OpenAI Service quotas and limits description: Quick reference, detailed description, and best practices on the quotas and limits for the OpenAI service in Azure AI services.-
+#
- ignite-2023 Previously updated : 11/15/2023 Last updated : 11/17/2023
The default quota for models varies by model and region. Default quota limits ar
<td>East US, Sweden Central</td> <td>240 K</td> </tr>
+ <tr>
+ <td>gpt-35-turbo (1106)</td>
+ <td>Australia East, Canada East, France Central, South India, Sweden Central, UK South, West US</td>
+ <td>120 K</td>
+ </tr>
<tr> <td rowspan="2">gpt-4</td> <td>East US, South Central US, West Europe, France Central</td>
The default quota for models varies by model and region. Default quota limits ar
<tr> <td>North Central US, Australia East, East US 2, Canada East, Japan East, UK South, Sweden Central, Switzerland North</td> <td>80 K</td>
+ </tr>
+<tr>
+ <td rowspan="2">gpt-4 (1106-preview)<br>GPT-4 Turbo </td>
+ <td>Australia East, Canada East, East US 2, France Central, UK South, West US</td>
+ <td>80 K</td>
</tr>
+ <tr>
+ <td>South India, Norway East, Sweden Central</td>
+ <td>150 K</td>
+ </tr>
<tr> <td rowspan="2">text-embedding-ada-002</td> <td>East US, South Central US, West Europe, France Central</td>
ai-services Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/reference.md
Title: Azure OpenAI Service REST API reference description: Learn how to use Azure OpenAI's REST API. In this article, you'll learn about authorization options, how to structure a request and receive a response.-
+#
ai-services Embeddings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/tutorials/embeddings.md
Title: Azure OpenAI Service embeddings tutorial description: Learn how to use Azure OpenAI's embeddings API for document search with the BillSum dataset-
+#
ai-services Fine Tune https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/tutorials/fine-tune.md
Title: Azure OpenAI Service fine-tuning gpt-3.5-turbo description: Learn how to use Azure OpenAI's latest fine-tuning capabilities with gpt-3.5-turbo-
+#
ai-services Use Your Data Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/use-your-data-quickstart.md
Title: 'Use your own data with Azure OpenAI Service' description: Use this article to import and use your data in Azure OpenAI.-
+#
ai-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/whats-new.md
- ignite-2023 Previously updated : 10/16/2023 Last updated : 10/17/2023 recommendations: false keywords:
keywords:
## November 2023
+### GPT-4 Turbo Preview & GPT-3.5-Turbo-1106 released
+
+Both models are the latest releases from OpenAI, with improved instruction following, [JSON mode](./how-to/json-mode.md), [reproducible output](./how-to/reproducible-output.md), and parallel function calling.
+
+- **GPT-4 Turbo Preview** has a max context window of 128,000 tokens and can generate 4,096 output tokens. It has the latest training data with knowledge up to April 2023. This model is in preview and is not recommended for production use. All deployments of this preview model will be automatically updated in place once the stable release becomes available.
+
+- **GPT-3.5-Turbo-1106** has a max context window of 16,385 tokens and can generate 4,096 output tokens.
+
+For information on model regional availability, consult the [models page](./concepts/models.md).
+
+The models have their own unique per-region [quota allocations](./quotas-limits.md).
+ ### DALL-E 3 public preview DALL-E 3 is the latest image generation model from OpenAI. It features enhanced image quality, more complex scenes, and improved performance when rendering text in images. It also comes with more aspect ratio options. DALL-E 3 is available through OpenAI Studio and through the REST API. Your OpenAI resource must be in the `SwedenCentral` Azure region.
ai-services Whisper Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/openai/whisper-quickstart.md
Title: 'Speech to text with Azure OpenAI Service' description: Use the Azure OpenAI Whisper model for speech to text.-
+#
ai-services Concept Multi Slot Personalization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/personalizer/concept-multi-slot-personalization.md
Title: Multi-slot personalization description: Learn where and when to use single-slot and multi-slot personalization with the Personalizer Rank and Reward APIs.-
+#
ai-services How To Multi Slot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/personalizer/how-to-multi-slot.md
Title: How to use multi-slot with Personalizer description: Learn how to use multi-slot with Personalizer to improve content recommendations provided by the service.-
+#
ai-services How To Thick Client https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/personalizer/how-to-thick-client.md
Title: How to use local inference with the Personalizer SDK description: Learn how to use local inference to improve latency.-
+#
ai-services Confidence Score https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/Concepts/confidence-score.md
Title: Confidence score - QnA Maker description: When a user query is matched against a knowledge base, QnA Maker returns relevant answers, along with a confidence score.-
+#
ai-services Chit Chat Knowledge Base https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/How-To/chit-chat-knowledge-base.md
Title: Adding chit-chat to a QnA Maker knowledge base description: Adding personal chit-chat to your bot makes it more conversational and engaging when you create a KB. QnA Maker allows you to easily add a pre-populated set of the top chit-chat, into your KB.-
+#
ai-services Get Analytics Knowledge Base https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/How-To/get-analytics-knowledge-base.md
Title: Analytics on knowledgebase - QnA Maker description: QnA Maker stores all chat logs and other telemetry, if you have enabled App Insights during the creation of your QnA Maker service. Run the sample queries to get your chat logs from App Insights.-
+#
ai-services Metadata Generateanswer Usage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/How-To/metadata-generateanswer-usage.md
Title: Metadata with GenerateAnswer API - QnA Maker description: QnA Maker lets you add metadata, in the form of key/value pairs, to your question/answer pairs. You can filter results to user queries, and store additional information that can be used in follow-up conversations.-
+#
ai-services Query Knowledge Base With Metadata https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/How-To/query-knowledge-base-with-metadata.md
Title: Contextually filter by using metadata description: QnA Maker filters QnA pairs by metadata.-
+#
ai-services Using Prebuilt Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/How-To/using-prebuilt-api.md
Title: QnA Maker Prebuilt API description: Use the Prebuilt API to get answers over text-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/Overview/language-support.md
Title: Language support - QnA Maker description: A list of culture, natural languages supported by QnA Maker for your knowledge base. Do not mix languages in the same knowledge base.-
+#
ai-services Get Answer From Knowledge Base Using Url Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/Quickstarts/get-answer-from-knowledge-base-using-url-tool.md
Title: Use URL tool to get answer from knowledge base - QnA Maker description: This article walks you through getting an answer from your knowledge base using a URL test tool such as cURL or Postman.-
+#
ai-services Reference Tsv Format Batch Testing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/reference-tsv-format-batch-testing.md
Title: Batch test TSV format - QnA Maker description: Understand the TSV format for batch testing-
+#
ai-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/qnamaker/whats-new.md
Title: What's new in QnA Maker service? description: This article contains news about QnA Maker.-
+#
ai-services Responsible Use Of Ai Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/responsible-use-of-ai-overview.md
Title: Overview of Responsible use of AI description: Azure AI services provides information and guidelines on how to responsibly use our AI services in applications. Below are the links to articles that provide this guidance for the different services within the Azure AI services suite.-
+#
ai-services Rotate Keys https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/rotate-keys.md
Title: Rotate keys in Azure AI services description: "Learn how to rotate API keys for better security, without interrupting service"-
+#
ai-services Security Features https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/security-features.md
Title: Azure AI services security description: Learn about the security considerations for Azure AI services usage.-
+#
ai-services Audio Processing Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/audio-processing-overview.md
Title: Audio processing - Speech service description: An overview of audio processing and capabilities of the Microsoft Audio Stack.-
+#
ai-services Audio Processing Speech Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/audio-processing-speech-sdk.md
Title: Use the Microsoft Audio Stack (MAS) - Speech service description: An overview of the features, capabilities, and restrictions for audio processing using the Speech Software Development Kit (SDK).-
+#
ai-services Batch Synthesis Properties https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/batch-synthesis-properties.md
Title: Batch synthesis properties for text to speech - Speech service description: Learn about the batch synthesis properties for text to speech.-
+#
ai-services Batch Synthesis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/batch-synthesis.md
Title: Batch synthesis API (Preview) for text to speech - Speech service description: Learn how to use the batch synthesis API for asynchronous synthesis of long-form text to speech.-
+#
ai-services Batch Transcription Audio Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/batch-transcription-audio-data.md
Title: Locate audio files for batch transcription - Speech service description: Batch transcription is used to transcribe a large amount of audio in storage. You should provide multiple files per request or point to an Azure Blob Storage container with the audio files to transcribe.-
+#
The batch transcription API supports a number of different formats and codecs, s
- MULAW in WAV container - AMR - WebM-- MP4 - M4A - SPEEX
ai-services Batch Transcription Create https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/batch-transcription-create.md
Title: Create a batch transcription - Speech service description: With batch transcriptions, you submit the audio, and then retrieve transcription results asynchronously.-
+#
ai-services Batch Transcription Get https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/batch-transcription-get.md
Title: Get batch transcription results - Speech service description: With batch transcription, the Speech service transcribes the audio data and stores the results in a storage container. You can then retrieve the results from the storage container.-
+#
ai-services Batch Transcription https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/batch-transcription.md
Title: Batch transcription overview - Speech service description: Batch transcription is ideal if you want to transcribe a large quantity of audio in storage, such as Azure blobs. Then you can asynchronously retrieve transcriptions.-
+#
To use the batch transcription REST API:
1. [Create a batch transcription](batch-transcription-create.md) - Submit the transcription job with parameters such as the audio files, the transcription language, and the transcription model. 1. [Get batch transcription results](batch-transcription-get.md) - Check transcription status and retrieve transcription results asynchronously.
-Batch transcription jobs are scheduled on a best-effort basis. You can't estimate when a job will change into the running state, but it should happen within minutes under normal system load. When the job is in the running state, the transcription occurs faster than the audio runtime playback speed.
+> [!IMPORTANT]
+> Batch transcription jobs are scheduled on a best-effort basis. At peak hours, it can take 30 minutes or longer for a transcription job to start processing. To check the current status of a batch transcription job, see [this section](batch-transcription-get.md#get-transcription-status).
## Next steps
ai-services Bring Your Own Storage Speech Resource Speech To Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/bring-your-own-storage-speech-resource-speech-to-text.md
Title: Use Bring your own storage (BYOS) Speech resource for Speech to text description: Learn how to use Bring your own storage (BYOS) Speech resource with Speech to text.-
+#
ai-services Bring Your Own Storage Speech Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/bring-your-own-storage-speech-resource.md
Title: Set up the Bring your own storage (BYOS) Speech resource description: Learn how to set up Bring your own storage (BYOS) Speech resource.-
+#
ai-services Call Center Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/call-center-overview.md
Title: Azure AI services for Call Center Overview description: Azure AI services for Language and Speech can help you realize partial or full automation of telephony-based customer interactions, and provide accessibility across multiple channels.-
+#
ai-services Call Center Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/call-center-quickstart.md
Title: "Post-call transcription and analytics quickstart - Speech service" description: In this quickstart, you perform sentiment analysis and conversation summarization of call center transcriptions.-
+#
ai-services Call Center Telephony Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/call-center-telephony-integration.md
Title: Call Center Telephony Integration - Speech service description: A common scenario for speech to text is transcribing large volumes of telephony data that come from various systems, such as interactive voice response (IVR) in real-time. This requires an integration with the Telephony System used.-
+#
ai-services Captioning Concepts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/captioning-concepts.md
Title: Captioning with speech to text - Speech service description: An overview of key concepts for captioning with speech to text.-
+#
ai-services Captioning Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/captioning-quickstart.md
Title: "Create captions with speech to text quickstart - Speech service" description: In this quickstart, you convert speech to text as captions.-
+#
ai-services Custom Commands Encryption Of Data At Rest https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/custom-commands-encryption-of-data-at-rest.md
Title: Custom Commands service encryption of data at rest description: Custom Commands encryption of data at rest.-
+#
ai-services Custom Commands References https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/custom-commands-references.md
Title: 'Custom Commands concepts and definitions - Speech service' description: In this article, you learn about concepts and definitions for Custom Commands applications.-
+#
ai-services Custom Commands https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/custom-commands.md
Title: Custom Commands overview - Speech service description: An overview of the features, capabilities, and restrictions for Custom Commands, a solution for creating voice applications.-
+#
ai-services Custom Keyword Basics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/custom-keyword-basics.md
Title: Create a custom keyword quickstart - Speech service description: When a user speaks the keyword, your device sends their dictation to the cloud, until the user stops speaking. Customizing your keyword is an effective way to differentiate your device and strengthen your branding.-
+#
ai-services Custom Neural Voice Lite https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/custom-neural-voice-lite.md
Title: Custom Neural Voice Lite - Speech service description: Use Custom Neural Voice Lite to demo and evaluate Custom Neural Voice before investing in professional recordings to create a higher-quality voice.-
+#
ai-services Custom Neural Voice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/custom-neural-voice.md
Title: Custom Neural Voice overview - Speech service description: Custom Neural Voice is a text to speech feature that allows you to create a one-of-a-kind, customized, synthetic voice for your applications. You provide your own audio data as a sample.-
+#
ai-services Custom Speech Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/custom-speech-overview.md
Title: Custom Speech overview - Speech service description: Custom Speech is a set of online tools that allows you to evaluate and improve the speech to text accuracy for your applications, tools, and products. -
+#
ai-services Devices Sdk Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/devices-sdk-release-notes.md
Title: Speech Devices SDK release notes description: The release notes provide a log of updates, enhancements, bug fixes, and changes to the Speech Devices SDK. This article is updated with each release of the Speech Devices SDK.-
+#
ai-services Direct Line Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/direct-line-speech.md
Title: Direct Line Speech - Speech service description: An overview of the features, capabilities, and restrictions for Voice assistants using Direct Line Speech with the Speech Software Development Kit (SDK).-
+#
ai-services Display Text Format https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/display-text-format.md
Title: Display text formatting with speech to text - Speech service description: An overview of key concepts for display text formatting with speech to text.-
+#
ai-services Embedded Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/embedded-speech.md
Title: Embedded Speech - Speech service description: Embedded Speech is designed for on-device scenarios where cloud connectivity is intermittent or unavailable.-
+#
dependencies {
For embedded speech, you need to download the speech recognition models for [speech to text](speech-to-text.md) and voices for [text to speech](text-to-speech.md). Instructions are provided upon successful completion of the [limited access review](https://aka.ms/csgate-embedded-speech) process.
-The following [speech to text](speech-to-text.md) models are available: de-DE, en-AU, en-CA, en-GB, en-IE, en-IN, en-NZ, en-US, es-ES, es-MX, fr-CA, fr-FR, hi-IN, it-IT, ja-JP, ko-KR, nl-NL, pt-BR, ru-RU, sv-SE, tr-TR, zh-CN, zh-HK, and zh-TW.
+The following [speech to text](speech-to-text.md) models are available: da-DK, de-DE, en-AU, en-CA, en-GB, en-IE, en-IN, en-NZ, en-US, es-ES, es-MX, fr-CA, fr-FR, it-IT, ja-JP, ko-KR, pt-BR, pt-PT, zh-CN, zh-HK, and zh-TW.
All text to speech locales [here](language-support.md?tabs=tts) (except fa-IR, Persian (Iran)) are available out of box with either 1 selected female and/or 1 selected male voices. We welcome your input to help us gauge demand for more languages and voices.
ai-services Gaming Concepts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/gaming-concepts.md
Title: Game development with Azure AI Speech - Speech service description: Concepts for game development with Azure AI Speech.-
+#
ai-services Get Speech Recognition Results https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/get-speech-recognition-results.md
Title: "Get speech recognition results - Speech service" description: Learn how to get speech recognition results.-
+#
ai-services Get Started Intent Recognition Clu https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/get-started-intent-recognition-clu.md
Title: "Intent recognition with CLU quickstart - Speech service" description: In this quickstart, you recognize intents from audio data with the Speech service and Language service.-
+#
ai-services Get Started Intent Recognition https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/get-started-intent-recognition.md
Title: "Intent recognition quickstart - Speech service" description: In this quickstart, you recognize intents from audio data with the Speech service and LUIS.-
+#
ai-services Get Started Speaker Recognition https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/get-started-speaker-recognition.md
Title: "Speaker Recognition quickstart - Speech service" description: In this quickstart, you use speaker recognition to confirm who is speaking. Learn about common design patterns for working with speaker verification and identification. -
+#
ai-services Get Started Speech To Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/get-started-speech-to-text.md
Title: "Speech to text quickstart - Speech service" description: In this quickstart, learn how to convert speech to text with recognition from a microphone or .wav file.-
+#
ai-services Get Started Speech Translation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/get-started-speech-translation.md
Title: Speech translation quickstart - Speech service description: In this quickstart, you translate speech from one language to text in another language. -
+#
ai-services Get Started Stt Diarization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/get-started-stt-diarization.md
Title: "Real-time diarization quickstart - Speech service" description: In this quickstart, you convert speech to text continuously from a file. The service transcribes the speech and identifies one or more speakers.-
+#
ai-services Get Started Text To Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/get-started-text-to-speech.md
Title: "Text to speech quickstart - Speech service" description: In this quickstart, you convert text to speech. Learn about object construction and design patterns, supported audio formats, and custom configuration options.-
+#
ai-services How To Async Meeting Transcription https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-async-meeting-transcription.md
Title: Asynchronous meeting transcription - Speech service description: Learn how to use asynchronous meeting transcription using the Speech service. Available for Java and C# only.-
+#
ai-services How To Audio Content Creation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-audio-content-creation.md
Title: Audio Content Creation - Speech service description: Audio Content Creation is an online tool that allows you to run Text to speech synthesis without writing any code.-
+#
ai-services How To Configure Azure Ad Auth https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-configure-azure-ad-auth.md
Title: How to configure Microsoft Entra authentication description: Learn how to authenticate using Microsoft Entra authentication-
+#
ai-services How To Configure Openssl Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-configure-openssl-linux.md
Title: How to configure OpenSSL for Linux description: Learn how to configure OpenSSL for Linux.-
+#
ai-services How To Configure Rhel Centos 7 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-configure-rhel-centos-7.md
Title: How to configure RHEL/CentOS 7 - Speech service description: Learn how to configure RHEL/CentOS 7 so that the Speech SDK can be used.-
+#
ai-services How To Control Connections https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-control-connections.md
Title: Service connectivity how-to - Speech SDK description: Learn how to monitor for connection status and manually connect or disconnect from the Speech service.-
+#
ai-services How To Custom Commands Debug Build Time https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-commands-debug-build-time.md
Title: 'Debug errors when authoring a Custom Commands application (Preview)' description: In this article, you learn how to debug errors when authoring Custom Commands application.-
+#
ai-services How To Custom Commands Debug Runtime https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-commands-debug-runtime.md
Title: 'Troubleshooting guide for a Custom Commands application at runtime' description: In this article, you learn how to debug runtime errors in a Custom Commands application.-
+#
ai-services How To Custom Commands Deploy Cicd https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-commands-deploy-cicd.md
Title: 'Continuous Deployment with Azure DevOps (Preview)' description: In this article, you learn how to set up continuous deployment for your Custom Commands applications. You create the scripts to support the continuous deployment workflows.-
+#
ai-services How To Custom Commands Developer Flow Test https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-commands-developer-flow-test.md
Title: 'Test your Custom Commands app' description: In this article, you learn different approaches to testing a custom commands application. -
+#
ai-services How To Custom Commands Send Activity To Client https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-commands-send-activity-to-client.md
Title: 'Send Custom Commands activity to client application' description: In this article, you learn how to send activity from a Custom Commands application to a client application running the Speech SDK.-
+#
ai-services How To Custom Commands Setup Speech Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-commands-setup-speech-sdk.md
Title: 'Integrate with a client app using Speech SDK' description: how to make requests to a published Custom Commands application from the Speech SDK running in a UWP application.-
+#
ai-services How To Custom Commands Setup Web Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-commands-setup-web-endpoints.md
Title: 'Set up web endpoints' description: set up web endpoints for Custom Commands-
+#
ai-services How To Custom Commands Update Command From Client https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-commands-update-command-from-client.md
Title: 'Update a command parameter from a client app' description: Learn how to update a command from a client application.-
+#
ai-services How To Custom Commands Update Command From Web Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-commands-update-command-from-web-endpoint.md
Title: 'Update a command from a web endpoint' description: Learn how to update the state of a command by using a call to a web endpoint.-
+#
ai-services How To Custom Speech Continuous Integration Continuous Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-continuous-integration-continuous-deployment.md
Title: CI/CD for Custom Speech - Speech service description: Apply DevOps with Custom Speech and CI/CD workflows. Implement an existing DevOps solution for your own project.-
+#
ai-services How To Custom Speech Create Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-create-project.md
Title: Create a Custom Speech project - Speech service description: Learn about how to create a project for Custom Speech. -
+#
ai-services How To Custom Speech Deploy Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-deploy-model.md
Title: Deploy a Custom Speech model - Speech service description: Learn how to deploy Custom Speech models. -
+#
ai-services How To Custom Speech Evaluate Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-evaluate-data.md
Title: Test accuracy of a Custom Speech model - Speech service description: In this article, you learn how to quantitatively measure and improve the quality of our speech to text model or your custom model.-
+#
ai-services How To Custom Speech Human Labeled Transcriptions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-human-labeled-transcriptions.md
Title: Human-labeled transcriptions guidelines - Speech service description: You use human-labeled transcriptions with your audio data to improve speech recognition accuracy. This is especially helpful when words are deleted or incorrectly replaced. -
+#
ai-services How To Custom Speech Inspect Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-inspect-data.md
Title: Test recognition quality of a Custom Speech model - Speech service description: Custom Speech lets you qualitatively inspect the recognition quality of a model. You can play back uploaded audio and determine if the provided recognition result is correct.-
+#
ai-services How To Custom Speech Model And Endpoint Lifecycle https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-model-and-endpoint-lifecycle.md
Title: Model lifecycle of Custom Speech - Speech service description: Custom Speech provides base models for training and lets you create custom models from your data. This article describes the timelines for models and for endpoints that use these models.-
+#
ai-services How To Custom Speech Test And Train https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-test-and-train.md
Title: "Training and testing datasets - Speech service" description: Learn about types of training and testing data for a Custom Speech project, along with how to use and manage that data.-
+#
ai-services How To Custom Speech Train Model https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-train-model.md
Title: Train a Custom Speech model - Speech service description: Learn how to train Custom Speech models. Training a speech to text model can improve recognition accuracy for the Microsoft base model or a custom model.-
+#
ai-services How To Custom Speech Transcription Editor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-transcription-editor.md
Title: How to use the online transcription editor for Custom Speech - Speech service description: The online transcription editor allows you to create or edit audio + human-labeled transcriptions for Custom Speech.-
+#
ai-services How To Custom Speech Upload Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-speech-upload-data.md
Title: "Upload training and testing datasets for Custom Speech - Speech service" description: Learn about how to upload data to test or train a Custom Speech model.-
+#
ai-services How To Custom Voice Create Voice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-voice-create-voice.md
Title: Train your custom voice model - Speech service description: Learn how to train a custom neural voice through the Speech Studio portal. Training duration varies depending on how much data you're training.-
+#
ai-services How To Custom Voice Prepare Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-voice-prepare-data.md
Title: "How to prepare data for Custom Voice - Speech service" description: "Learn how to provide studio recordings and the associated scripts that will be used to train your Custom Neural Voice."-
+#
ai-services How To Custom Voice Talent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-voice-talent.md
Title: "Set up voice talent for custom neural voice - Speech service" description: Create a voice talent profile with an audio file recorded by the voice talent, consenting to the usage of their speech data to train a custom voice model.-
+#
ai-services How To Custom Voice Training Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-voice-training-data.md
Title: "Training data for Custom Neural Voice - Speech service" description: "Learn about the data types that you can use to train a Custom Neural Voice."-
+#
ai-services How To Custom Voice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-custom-voice.md
Title: Create a project for Custom Neural Voice - Speech service description: Learn how to create a Custom Neural Voice project that contains data, models, tests, and endpoints in Speech Studio.-
+#
ai-services How To Deploy And Use Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-deploy-and-use-endpoint.md
Title: How to deploy and use voice model - Speech service description: Learn about how to deploy and use a custom neural voice model.-
+#
ai-services How To Develop Custom Commands Application https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-develop-custom-commands-application.md
Title: 'How-to: Develop Custom Commands applications - Speech service' description: Learn how to develop and customize Custom Commands applications. These voice-command apps are best suited for task completion or command-and-control scenarios.-
+#
ai-services How To Get Speech Session Id https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-get-speech-session-id.md
Title: How to get Speech to text Session ID and Transcription ID description: Learn how to get Speech service Speech to text Session ID and Transcription ID-
+#
ai-services How To Lower Speech Synthesis Latency https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-lower-speech-synthesis-latency.md
Title: How to lower speech synthesis latency using Speech SDK description: How to lower speech synthesis latency using Speech SDK, including streaming, pre-connection, and so on.-
+#
ai-services How To Migrate To Custom Neural Voice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-migrate-to-custom-neural-voice.md
Title: Migrate from custom voice to custom neural voice - Speech service description: This document helps users migrate from custom voice to custom neural voice.-
+#
ai-services How To Migrate To Prebuilt Neural Voice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-migrate-to-prebuilt-neural-voice.md
Title: Migrate from prebuilt standard voice to prebuilt neural voice - Speech service description: This document helps users migrate from prebuilt standard voice to prebuilt neural voice.-
+#
ai-services How To Pronunciation Assessment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-pronunciation-assessment.md
Title: Use pronunciation assessment description: Learn about pronunciation assessment features that are currently publicly available.-
+#
ai-services How To Recognize Intents From Speech Csharp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-recognize-intents-from-speech-csharp.md
Title: How to recognize intents from speech using the Speech SDK C# description: In this guide, you learn how to recognize intents from speech using the Speech SDK for C#.-
+#
ai-services How To Recognize Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-recognize-speech.md
Title: "How to recognize speech - Speech service" description: Learn how to convert speech to text, including object construction, supported audio input formats, and configuration options for speech recognition.-
+#
ai-services How To Select Audio Input Devices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-select-audio-input-devices.md
Title: Select an audio input device with the Speech SDK description: 'Learn about selecting audio input devices in the Speech SDK (C++, C#, Python, Objective-C, Java, and JavaScript) by obtaining the IDs of the audio devices connected to a system.'-
+#
ai-services How To Speech Synthesis Viseme https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-speech-synthesis-viseme.md
Title: Get facial position with viseme description: Speech SDK supports viseme events during speech synthesis, which represent key poses in observed speech, such as the position of the lips, jaw, and tongue when producing a particular phoneme.-
+#
ai-services How To Speech Synthesis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-speech-synthesis.md
Title: "How to synthesize speech from text - Speech service" description: Learn how to convert text to speech, including object construction and design patterns, supported audio output formats, and custom configuration options.-
+#
ai-services How To Track Speech Sdk Memory Usage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-track-speech-sdk-memory-usage.md
Title: How to track Speech SDK memory usage - Speech service description: The Speech SDK supports numerous programming languages for speech to text and text to speech conversion, along with speech translation. This article discusses memory management tooling built into the SDK.-
+#
ai-services How To Translate Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-translate-speech.md
Title: "How to translate speech - Speech service" description: Learn how to translate speech from one language to text in another language, including object construction and supported audio input formats.-
+#
ai-services How To Use Audio Input Streams https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-use-audio-input-streams.md
Title: Speech SDK audio input stream concepts description: An overview of the capabilities of the Speech SDK audio input stream.-
+#
ai-services How To Use Codec Compressed Audio Input Streams https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-use-codec-compressed-audio-input-streams.md
Title: How to use compressed input audio - Speech service description: Learn how to use compressed input audio with the Speech SDK and CLI. -
+#
ai-services How To Use Custom Entity Pattern Matching https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-use-custom-entity-pattern-matching.md
Title: How to recognize intents with custom entity pattern matching description: In this guide, you learn how to recognize intents and custom entities from simple patterns.-
+#
ai-services How To Use Logging https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-use-logging.md
Title: Speech SDK logging - Speech service description: Learn about how to enable logging in the Speech SDK (C++, C#, Python, Objective-C, Java).-
+#
ai-services How To Use Meeting Transcription https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-use-meeting-transcription.md
Title: Real-time meeting transcription quickstart - Speech service description: In this quickstart, learn how to transcribe meetings. You can add, remove, and identify multiple participants by streaming audio to the Speech service.-
+#
ai-services How To Use Simple Language Pattern Matching https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-use-simple-language-pattern-matching.md
Title: How to recognize intents with simple language pattern matching description: In this guide, you learn how to recognize intents and entities from simple patterns.-
+#
ai-services How To Windows Voice Assistants Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/how-to-windows-voice-assistants-get-started.md
Title: Voice Assistants on Windows - Get Started description: The steps to begin developing a Windows voice agent, including a reference to the sample code quickstart.-
+#
ai-services Ingestion Client https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/ingestion-client.md
Title: Ingestion Client - Speech service description: In this article, we describe a tool released on GitHub that enables customers to push audio files to the Speech service easily and quickly. -
+#
ai-services Intent Recognition https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/intent-recognition.md
Title: Intent recognition overview - Speech service description: Intent recognition allows you to recognize user objectives that you have predefined. This article is an overview of the benefits and capabilities of the intent recognition service.-
+#
ai-services Keyword Recognition Guidelines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/keyword-recognition-guidelines.md
Title: Keyword recognition recommendations and guidelines - Speech service description: An overview of recommendations and guidelines when using keyword recognition.-
+#
ai-services Keyword Recognition Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/keyword-recognition-overview.md
Title: Keyword recognition overview - Speech service description: An overview of the features, capabilities, and restrictions for keyword recognition by using the Speech Software Development Kit (SDK).-
+#
ai-services Language Identification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/language-identification.md
Title: Language identification - Speech service description: Language identification is used to determine the language being spoken in audio when compared against a list of provided languages.-
+#
ai-services Language Learning Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/language-learning-overview.md
Title: Language learning with Azure AI Speech description: Azure AI services for Speech can be used to learn languages.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/language-support.md
Title: Language support - Speech service description: The Speech service supports numerous languages for speech to text and text to speech conversion, along with speech translation. This article provides a comprehensive list of language support by service feature.-
+#
ai-services Logging Audio Transcription https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/logging-audio-transcription.md
Title: How to log audio and transcriptions for speech recognition description: Learn how to use audio and transcription logging for speech to text and speech translation.-
+#
ai-services Meeting Transcription https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/meeting-transcription.md
Title: Meeting transcription overview - Speech service description: You use the meeting transcription feature for meetings. It combines recognition, speaker ID, and diarization to provide transcription of any meeting.-
+#
ai-services Migrate To Batch Synthesis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/migrate-to-batch-synthesis.md
Title: Migrate to Batch synthesis API - Speech service description: This document helps developers migrate code from Long Audio REST API to Batch synthesis REST API.-
+#
ai-services Migrate V2 To V3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/migrate-v2-to-v3.md
Title: Migrate from v2 to v3 REST API - Speech service description: This document helps developers migrate code from v2 to v3 of the Speech to text REST API.-
+#
ai-services Migrate V3 0 To V3 1 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/migrate-v3-0-to-v3-1.md
Title: Migrate from v3.0 to v3.1 REST API - Speech service description: This document helps developers migrate code from v3.0 to v3.1 of the Speech to text REST API.-
+#
ai-services Migrate V3 1 To V3 2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/migrate-v3-1-to-v3-2.md
Title: Migrate from v3.1 to v3.2 REST API - Speech service description: This document helps developers migrate code from v3.1 to v3.2 of the Speech to text REST API.-
+#
ai-services Migration Overview Neural Voice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/migration-overview-neural-voice.md
Title: Migration to neural voice - Speech service description: This document summarizes the benefits of migration from non-neural voice to neural voice.-
+#
ai-services Multi Device Conversation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/multi-device-conversation.md
Title: Multi-device Conversation overview - Speech service description: Multi-device conversation makes it easy to create a speech or text conversation between multiple clients and coordinate the messages that are sent between them.-
+#
ai-services Openai Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/openai-speech.md
Title: "Azure OpenAI speech to speech chat - Speech service" description: In this how-to guide, you can use Speech to converse with Azure OpenAI. The text recognized by the Speech service is sent to Azure OpenAI. The text response from Azure OpenAI is then synthesized by the Speech service.-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/overview.md
Title: What is the Speech service? description: The Speech service provides speech to text, text to speech, and speech translation capabilities with an Azure resource. Add speech to your applications, tools, and devices with the Speech SDK, Speech Studio, or REST APIs.-
+#
ai-services Power Automate Batch Transcription https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/power-automate-batch-transcription.md
Title: Power Automate batch transcription - Speech service description: Transcribe audio files from an Azure Storage container using the Power Automate batch transcription connector.-
+#
ai-services Pronunciation Assessment Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/pronunciation-assessment-tool.md
Title: How to use pronunciation assessment in Speech Studio description: The pronunciation assessment tool in Speech Studio gives you feedback on the accuracy and fluency of your speech, no coding required.-
+#
At the bottom of the Assessment result, two overall scores are displayed: Pronunciation score and Content score.
**Content Score**: This score provides an aggregated assessment of the content of the speech and includes three sub-aspects. This score is only available in the speaking tab for an unscripted assessment. > [!NOTE]
-> Content score is currently available on the following regions: `westcentralus`, `eastasia`, `eastus`, `northeurope`, `westeurope`, and `westus2`. All other regions will have Content score available starting from Nov 30, 2023.
+> Content score is currently available on the following regions in Speech Studio: `westcentralus`, `eastasia`, `eastus`, `northeurope`, `westeurope`, and `westus2`. All other regions will have Content score available starting from Nov 30, 2023.
- **Vocabulary score**: Evaluates the speaker's effective usage of words and their appropriateness within the given context to express ideas accurately, as well as the level of lexical complexity.
- **Grammar score**: Evaluates the correctness of grammar usage and variety of sentence patterns. It considers lexical accuracy, grammatical accuracy, and diversity of sentence structures, providing a more comprehensive evaluation of language proficiency.
ai-services Quickstart Custom Commands Application https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/quickstart-custom-commands-application.md
Title: 'Quickstart: Create a voice assistant using Custom Commands - Speech service' description: In this quickstart, you create and test a basic Custom Commands application in Speech Studio. -
+#
ai-services Multi Device Conversation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/quickstarts/multi-device-conversation.md
Title: 'Quickstart: Multi-device Conversation - Speech service' description: In this quickstart, you'll learn how to create and join clients to a multi-device conversation by using the Speech SDK.-
+#
ai-services Setup Platform https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/quickstarts/setup-platform.md
Title: Install the Speech SDK description: In this quickstart, you learn how to install the Speech SDK for your preferred programming language.-
+#
ai-services Voice Assistants https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/quickstarts/voice-assistants.md
Title: 'Quickstart: Create a custom voice assistant - Speech service' description: In this quickstart, you use the Speech SDK to create a custom voice assistant.-
+#
ai-services Record Custom Voice Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/record-custom-voice-samples.md
Title: "Recording custom voice samples - Speech service" description: Make a production-quality custom voice by preparing a robust script, hiring good voice talent, and recording professionally.-
+#
ai-services Releasenotes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/releasenotes.md
Title: What's new - Speech service description: Find out about new releases and features for Azure AI Speech.-
+#
ai-services Resiliency And Recovery Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/resiliency-and-recovery-plan.md
Title: How to back up and recover speech customer resources description: Learn how to prepare for service outages with Custom Speech and Custom Voice.-
+#
ai-services Rest Speech To Text Short https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/rest-speech-to-text-short.md
Title: Speech to text REST API for short audio - Speech service description: Learn how to use Speech to text REST API for short audio to convert speech to text.-
+#
ai-services Rest Speech To Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/rest-speech-to-text.md
Title: Speech to text REST API - Speech service description: Get reference documentation for Speech to text REST API.-
+#
ai-services Rest Text To Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/rest-text-to-speech.md
Title: Text to speech API reference (REST) - Speech service description: Learn how to use the REST API to convert text into synthesized speech.-
+#
ai-services Role Based Access Control https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/role-based-access-control.md
Title: Role-based access control for Speech resources - Speech service description: Learn how to assign access roles for a Speech resource.-
+#
ai-services Sovereign Clouds https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/sovereign-clouds.md
Title: Sovereign Clouds - Speech service description: Learn how to use Sovereign Clouds-
+#
Previously updated : 05/10/2022 Last updated : 11/17/2023
Available to organizations with a business presence in China. See more information
- Neural voice
- Speech translator
- **Unsupported features:**
- - Custom Voice
- - Custom Commands
+ - Custom commands
+ - Custom neural voice
+ - Personal voice
+ - Text to speech avatar
- **Supported languages:**
  - See the list of supported languages [here](language-support.md)
ai-services Speaker Recognition Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speaker-recognition-overview.md
Title: Speaker recognition overview - Speech service description: Speaker recognition provides algorithms that verify and identify speakers by their unique voice characteristics, by using voice biometry. Speaker recognition is used to answer the question "who is speaking?". This article is an overview of the benefits and capabilities of the speaker recognition feature.-
+#
ai-services Speech Container Batch Processing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-container-batch-processing.md
Title: Batch processing kit for Speech containers description: Use the Batch processing kit to scale Speech container requests. -
+#
ai-services Speech Container Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-container-configuration.md
Title: Configure Speech containers description: Speech service provides each container with a common configuration framework, so that you can easily configure and manage storage, logging and telemetry, and security settings for your containers.-
+#
ai-services Speech Container Cstt https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-container-cstt.md
Title: Custom speech to text containers - Speech service description: Install and run custom speech to text containers with Docker to perform speech recognition, transcription, generation, and more on-premises.-
+#
ai-services Speech Container Howto On Premises https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-container-howto-on-premises.md
Title: Use Speech service containers with Kubernetes and Helm description: Using Kubernetes and Helm to define the speech to text and text to speech container images, we'll create a Kubernetes package. This package will be deployed to a Kubernetes cluster on-premises.-
+#
ai-services Speech Container Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-container-howto.md
Title: Install and run Speech containers with Docker - Speech service description: Use the Speech containers with Docker to perform speech recognition, transcription, generation, and more on-premises.-
+#
ai-services Speech Container Lid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-container-lid.md
Title: Language identification containers - Speech service description: Install and run language identification containers with Docker to perform speech recognition, transcription, generation, and more on-premises.-
+#
ai-services Speech Container Ntts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-container-ntts.md
Title: Neural text to speech containers - Speech service description: Install and run neural text to speech containers with Docker to perform speech synthesis and more on-premises.-
+#
ai-services Speech Container Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-container-overview.md
Title: Speech containers overview - Speech service description: Use the Docker containers for the Speech service to perform speech recognition, transcription, generation, and more on-premises.-
+#
ai-services Speech Container Stt https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-container-stt.md
Title: Speech to text containers - Speech service description: Install and run speech to text containers with Docker to perform speech recognition, transcription, generation, and more on-premises.-
+#
ai-services Speech Devices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-devices.md
Title: Speech devices overview - Speech service description: Get started with the Speech devices. The Speech service works with a wide variety of devices and audio sources. -
+#
ai-services Speech Sdk Microphone https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-sdk-microphone.md
Title: Microphone array recommendations - Speech service description: Speech SDK microphone array recommendations. These array geometries are recommended for use with the Microsoft Audio Stack.-
+#
ai-services Speech Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-sdk.md
Title: About the Speech SDK - Speech service description: The Speech software development kit (SDK) exposes many of the Speech service capabilities, making it easier to develop speech-enabled applications.-
+#
ai-services Speech Service Vnet Service Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-service-vnet-service-endpoint.md
Title: Use Virtual Network service endpoints with Speech service description: This article describes how to use Speech service with an Azure Virtual Network service endpoint.-
+#
ai-services Speech Services Private Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-services-private-link.md
Title: How to use private endpoints with Speech service description: Learn how to use Speech service with private endpoints provided by Azure Private Link-
+#
ai-services Speech Services Quotas And Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-services-quotas-and-limits.md
Title: Speech service quotas and limits description: Quick reference, detailed description, and best practices on the quotas and limits for the Speech service in Azure AI services.-
+#
ai-services Speech Ssml Phonetic Sets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-ssml-phonetic-sets.md
Title: Speech phonetic alphabets - Speech service description: This article presents Speech service phonetic alphabet and International Phonetic Alphabet (IPA) examples.-
+#
ai-services Speech Studio Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-studio-overview.md
Title: "Speech Studio overview - Speech service" description: Speech Studio is a set of UI-based tools for building and integrating features from Speech service in your applications.-
+#
ai-services Speech Synthesis Markup Pronunciation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-synthesis-markup-pronunciation.md
Title: Pronunciation with Speech Synthesis Markup Language (SSML) - Speech service description: Learn about Speech Synthesis Markup Language (SSML) elements to improve pronunciation.-
+#
ai-services Speech Synthesis Markup Structure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-synthesis-markup-structure.md
Title: Speech Synthesis Markup Language (SSML) document structure and events - Speech service description: Learn about the Speech Synthesis Markup Language (SSML) document structure.-
+#
ai-services Speech Synthesis Markup Voice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-synthesis-markup-voice.md
Title: Voice and sound with Speech Synthesis Markup Language (SSML) - Speech service description: Learn about how you can use Speech Synthesis Markup Language (SSML) elements to customize what your Speech service voice sounds like.-
+#
ai-services Speech Synthesis Markup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-synthesis-markup.md
Title: Speech Synthesis Markup Language (SSML) overview - Speech service description: Learn how to use the Speech Synthesis Markup Language to control pronunciation and prosody in text to speech.-
+#
ai-services Speech To Text https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-to-text.md
Title: Speech to text overview - Speech service description: Get an overview of the benefits and capabilities of the speech to text feature of the Speech service.-
+#
ai-services Speech Translation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/speech-translation.md
Title: Speech translation overview - Speech service description: With speech translation, you can add end-to-end, real-time, multi-language translation of speech to your applications, tools, and devices.-
+#
ai-services Spx Basics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/spx-basics.md
Title: "Quickstart: The Speech CLI - Speech service" description: In this Azure AI Speech CLI quickstart, you interact with speech to text, text to speech, and speech translation without having to write code.-
+#
ai-services Spx Batch Operations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/spx-batch-operations.md
Title: "Run batch operations with the Speech CLI - Speech service" description: Learn how to do batch speech to text (speech recognition), batch text to speech (speech synthesis) with the Speech CLI.-
+#
ai-services Spx Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/spx-overview.md
Title: The Azure AI Speech CLI description: In this article, you learn about the Speech CLI, a command-line tool for using Speech service without having to write any code.-
+#
ai-services Swagger Documentation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/swagger-documentation.md
Title: Generate a REST API client library - Speech service description: The Swagger documentation can be used to auto-generate SDKs for a number of programming languages. -
+#
ai-services Text To Speech https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/text-to-speech.md
Title: Text to speech overview - Speech service description: Get an overview of the benefits and capabilities of the text to speech feature of the Speech service.-
+#
ai-services Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/troubleshooting.md
Title: Troubleshoot the Speech SDK - Speech service description: This article provides information to help you solve issues you might encounter when you use the Speech SDK.-
+#
ai-services Tutorial Voice Enable Your Bot Speech Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/tutorial-voice-enable-your-bot-speech-sdk.md
Title: "Tutorial: Voice-enable your bot - Speech service" description: In this tutorial, you'll create an echo bot and configure a client app that lets you speak to your bot and hear it respond back to you.-
+#
ai-services Voice Assistants https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/voice-assistants.md
Title: Voice assistants overview - Speech service description: An overview of the features, capabilities, and restrictions for voice assistants with the Speech SDK.-
+#
ai-services Whisper Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/whisper-overview.md
Title: The Whisper model from OpenAI description: In this article, you learn about the Whisper model from OpenAI that you can use for speech to text and speech translation.-
+#
ai-services Windows Voice Assistants Automatic Enablement Guidelines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/windows-voice-assistants-automatic-enablement-guidelines.md
Title: Privacy guidelines for voice assistants on Windows description: The instructions to enable voice activation for a voice agent by default.-
+#
ai-services Windows Voice Assistants Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/windows-voice-assistants-best-practices.md
Title: Voice Assistants on Windows - Design Guidelines description: Guidelines for best practices when designing a voice agent experience.-
+#
ai-services Windows Voice Assistants Implementation Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/windows-voice-assistants-implementation-guide.md
Title: Voice Assistants on Windows - Above Lock Implementation Guidelines description: The instructions to implement voice activation and above-lock capabilities for a voice agent application.-
+#
ai-services Windows Voice Assistants Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/speech-service/windows-voice-assistants-overview.md
Title: Voice Assistants on Windows overview - Speech service description: An overview of the voice assistants on Windows, including capabilities and development resources available.-
+#
ai-services Deploy User Managed Glossary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/containers/deploy-user-managed-glossary.md
Title: Deploy a user-managed glossary in Translator container description: How to deploy a user-managed glossary in the Translator container environment.-
+#
ai-services Translator Container Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/containers/translator-container-configuration.md
Title: Configure containers - Translator description: The Translator container runtime environment is configured using the `docker run` command arguments. There are both required and optional settings.-
+#
ai-services Translator Container Supported Parameters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/containers/translator-container-supported-parameters.md
Title: "Container: Translate method" description: Understand the parameters, headers, and body messages for the container Translate method of Azure AI Translator to translate text.-
+#
ai-services Translator Disconnected Containers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/containers/translator-disconnected-containers.md
Title: Use Translator Docker containers in disconnected environments description: Learn how to run Azure AI Translator containers in disconnected environments.-
+#
ai-services Translator How To Install Container https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/containers/translator-how-to-install-container.md
Title: Install and run Docker containers for Translator API description: Use the Docker container for Translator API to translate text.-
+#
ai-services Create Translator Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/create-translator-resource.md
Title: Create a Translator resource description: Learn how to create an Azure AI Translator resource and retrieve your API key and endpoint URL in the Azure portal.-
+#
ai-services Customization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/custom-translator/concepts/customization.md
Title: Translation Customization - Translator description: Use the Microsoft Translator Hub to build your own machine translation system using your preferred terminology and style.-
+#
ai-services Workspace And Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/custom-translator/concepts/workspace-and-project.md
Title: "What is a workspace and project? - Custom Translator" description: This article will explain the differences between a workspace and a project as well as project categories and labels for the Custom Translator service.-
+#
ai-services Enable Vnet Service Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/custom-translator/how-to/enable-vnet-service-endpoint.md
Title: Enable Virtual Network service endpoints with Custom Translator service description: This article describes how to use Custom Translator service with an Azure Virtual Network service endpoint.-
+#
ai-services Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/faq.md
Title: Frequently asked questions - Document Translation description: Get answers to frequently asked questions about Document Translation.-
+#
ai-services Use Rest Api Programmatically https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/how-to-guides/use-rest-api-programmatically.md
Title: Use Document Translation APIs programmatically description: "How to create a Document Translation service using C#, Go, Java, Node.js, or Python and the REST API"-
+#
ai-services Language Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/language-studio.md
Title: Try Document Translation in Language Studio description: "Document Translation in Azure AI Language Studio."-
+#
ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/overview.md
Title: What is Document Translation? description: An overview of the cloud-based batch Document Translation service and process.-
+#
ai-services Document Translation Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/quickstarts/document-translation-rest-api.md
Title: Get started with Document Translation description: "How to create a Document Translation service using C#, Go, Java, Node.js, or Python programming languages and the REST API"-
+#
ai-services Document Translation Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/quickstarts/document-translation-sdk.md
Title: "Document Translation C#/.NET or Python client library" description: Use the Document Translation C#/.NET or Python client library (SDK) for the cloud-based batch document translation service and process-
+#
ai-services Cancel Translation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/cancel-translation.md
Title: Cancel translation method description: The cancel translation method cancels a current processing or queued operation.-
+#
ai-services Get Document Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/get-document-status.md
Title: Get document status method description: The get document status method returns the status for a specific document.-
+#
ai-services Get Documents Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/get-documents-status.md
Title: Get documents status description: The get documents status method returns the status for all documents in a batch document translation request.-
+#
ai-services Get Supported Document Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/get-supported-document-formats.md
Title: Get supported document formats method description: The get supported document formats method returns a list of supported document formats.-
+#
ai-services Get Supported Glossary Formats https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/get-supported-glossary-formats.md
Title: Get supported glossary formats method description: The get supported glossary formats method returns the list of supported glossary formats.-
+#
ai-services Get Supported Storage Sources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/get-supported-storage-sources.md
Title: Get supported storage sources method description: The get supported storage sources method returns a list of supported storage sources.-
+#
ai-services Get Translation Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/get-translation-status.md
Title: Get translation status description: The get translation status method returns the status for a document translation request.-
+#
ai-services Get Translations Status https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/get-translations-status.md
Title: Get translations status description: The get translations status method returns a list of batch requests submitted and the status for each request.-
+#
ai-services Rest Api Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/rest-api-guide.md
Title: "Document Translation REST API reference guide" description: View a list of links to the Document Translation REST APIs.-
+#
ai-services Start Translation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/document-translation/reference/start-translation.md
Title: Start translation description: Start a document translation request with the Document Translation service.-
+#
ai-services Dynamic Dictionary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/dynamic-dictionary.md
Title: Dynamic Dictionary - Translator description: This article explains how to use the dynamic dictionary feature of the Azure AI Translator.-
+#
ai-services Firewalls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/firewalls.md
Title: Translate behind firewalls - Translator description: Azure AI Translator can translate behind firewalls using either domain-name or IP filtering.-
+#
ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/language-support.md
Title: Language support - Translator description: Azure AI Translator supports the following languages for text to text translation using Neural Machine Translation (NMT).-
+#
ai-services Migrate To V3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/migrate-to-v3.md
Title: Migrate to V3 - Translator description: This article provides the steps to help you migrate from V2 to V3 of the Azure AI Translator.-
+#
ai-services Modifications Deprecations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/modifications-deprecations.md
Title: Modifications to Translator Service description: Translator Service changes, modifications, and deprecations-
+#
ai-services Prevent Translation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/prevent-translation.md
Title: Prevent content translation - Translator description: Prevent translation of content with the Translator. The Translator allows you to tag content so that it isn't translated.-
+#
ai-services Profanity Filtering https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/profanity-filtering.md
Title: Profanity filtering - Translator description: Use Translator profanity filtering to determine the level of profanity translated in your text.-
+#
ai-services Quickstart Text Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/quickstart-text-rest-api.md
Title: "Quickstart: Azure AI Translator REST APIs" description: "Learn to translate text with the Translator service REST APIs. Examples are provided in C#, Go, Java, JavaScript and Python."-
+#
ai-services Quickstart Text Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/quickstart-text-sdk.md
Title: "Quickstart: Azure AI Translator SDKs" description: "Learn to translate text with the Translator service SDKs in a programming language of your choice: C#, Java, JavaScript, or Python."-
+#
ai-services Rest Api Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/reference/rest-api-guide.md
Title: "Text Translation REST API reference guide" description: View a list of links to the Text Translation REST APIs.-
+#
ai-services V3 0 Break Sentence https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/reference/v3-0-break-sentence.md
Title: Translator BreakSentence Method description: The Translator BreakSentence method identifies the positioning of sentence boundaries in a piece of text.-
+#
ai-services V3 0 Detect https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/reference/v3-0-detect.md
Title: Translator Detect Method description: Identify the language of a piece of text with the Azure AI Translator Detect method.-
+#
ai-services V3 0 Dictionary Examples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/reference/v3-0-dictionary-examples.md
Title: Translator Dictionary Examples Method description: The Translator Dictionary Examples method provides examples that show how terms in the dictionary are used in context.-
+#
ai-services V3 0 Dictionary Lookup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/reference/v3-0-dictionary-lookup.md
Title: Translator Dictionary Lookup Method description: The Dictionary Lookup method provides alternative translations for a word and a few idiomatic phrases.-
+#
ai-services V3 0 Languages https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/reference/v3-0-languages.md
Title: Translator Languages Method description: The Languages method gets the set of languages currently supported by other operations of the Translator.-
+#
ai-services V3 0 Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/reference/v3-0-reference.md
Title: Translator V3.0 Reference description: Reference documentation for the Translator V3.0. Version 3.0 of the Translator provides a modern JSON-based Web API.-
+#
ai-services V3 0 Translate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/reference/v3-0-translate.md
Title: Translator Translate Method description: Understand the parameters, headers, and body messages for the Translate method of Azure AI Translator to translate text.-
+#
ai-services V3 0 Transliterate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/reference/v3-0-transliterate.md
Title: Translator Transliterate Method description: Convert text in one language from one script to another script with the Translator Transliterate method.-
+#
ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/service-limits.md
Title: Service limits - Translator Service description: This article lists service limits for Translator text and document translation. Charges are incurred based on character count, not request frequency, with a limit of 50,000 characters per request. Character limits are subscription-based, with F0 limited to 2 million characters per hour.-
+#
ai-services Sovereign Clouds https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/sovereign-clouds.md
Title: "Translator: sovereign clouds" description: Using Translator in sovereign clouds-
+#
ai-services Text Translation Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/text-translation-overview.md
Title: What is Azure Text Translation? description: Integrate the Text Translation API into your applications, websites, tools, and other solutions to provide multi-language user experiences.-
+#
ai-services Translator Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/translator-faq.md
Title: Frequently asked questions - Translator description: Get answers to frequently asked questions about the Translator API in Azure AI services.-
+#
ai-services Translator Text Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/translator-text-apis.md
Title: "Use Azure AI Translator APIs" description: "Learn to translate text, transliterate text, detect language and more with the Translator service. Examples are provided in C#, Java, JavaScript and Python."-
+#
ai-services Word Alignment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-services/translator/word-alignment.md
Title: Word alignment - Translator description: To receive alignment information, use the Translate method and include the optional includeAlignment parameter.-
+#
ai-studio Configure Managed Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-studio/how-to/configure-managed-network.md
There are three different configuration modes for outbound traffic from the managed VNet:
| Outbound mode | Description | Scenarios |
| -- | -- | -- |
| Allow internet outbound | Allow all internet outbound traffic from the managed VNet. | You want unrestricted access to machine learning resources on the internet, such as Python packages or pretrained models.<sup>1</sup> |
| Allow only approved outbound | Outbound traffic is allowed by specifying service tags. | * You want to minimize the risk of data exfiltration, but you need to prepare all required machine learning artifacts in your private environment.</br>* You want to configure outbound access to an approved list of services, service tags, or FQDNs. |
-| Disabled | Inbound and outbound traffic isn't restricted or you're using your own Azure Virtual Network to protect resources. | You want public inbound and outbound from the Azure AI, or you're handling network isolation with your own Azure VNet. |
+| Disabled | Inbound and outbound traffic isn't restricted. | You want public inbound and outbound from the Azure AI. |
<sup>1</sup> You can use outbound rules with _allow only approved outbound_ mode to achieve the same result as using allow internet outbound. The differences are:
ai-studio Index Add https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-studio/how-to/index-add.md
You must have:
1. Select **Next** after configuring index settings 1. Review the details you entered and select **Create**
+
+ > [!NOTE]
+ > If you see a **DeploymentNotFound** error, you need to assign more permissions. See [mitigate DeploymentNotFound error](#mitigate-deploymentnotfound-error) for more details.
+ 1. You're taken to the index details page where you can see the status of your index creation.
+### Mitigate DeploymentNotFound error
+
+When you try to create a vector index, you might see the following error at the **Review + Finish** step:
+
+**Failed to create vector index. DeploymentNotFound: A valid deployment for the model=text-embedding-ada-002 was not found in the workspace connection=Default_AzureOpenAI provided.**
+
+This can happen if you are trying to create an index using an **Owner**, **Contributor**, or **Azure AI Developer** role at the project level. To mitigate this error, you might need to assign more permissions using either of the following methods.
+
+> [!NOTE]
+> You need to be assigned the **Owner** role of the resource group or higher scope (like Subscription) to perform the operation in the next steps. This is because only the Owner role can assign roles to others. See details [here](/azure/role-based-access-control/built-in-roles).
+
+#### Method 1: Assign more permissions to the user on the Azure AI resource
+
+If the Azure AI resource the project uses was created through Azure AI Studio:
+1. Sign in to [Azure AI Studio](https://aka.ms/azureaistudio) and select your project via **Build** > **Projects**.
+1. Select **Settings** from the collapsible left menu.
+1. From the **Resource Configuration** section, select the link for your resource group name that takes you to the Azure portal.
+1. In the Azure portal under **Overview** > **Resources**, select the Azure AI service type. It's named similarly to "YourAzureAIResourceName-aiservices."
+
+ :::image type="content" source="../media/roles-access/resource-group-azure-ai-service.png" alt-text="Screenshot of Azure AI service in a resource group." lightbox="../media/roles-access/resource-group-azure-ai-service.png":::
+
+1. Select **Access control (IAM)** > **+ Add** to add a role assignment.
+1. Add the **Cognitive Services OpenAI User** role to the user who wants to make an index. `Cognitive Services OpenAI Contributor` and `Cognitive Services Contributor` also work, but they assign more permissions than needed for creating an index in Azure AI Studio.
+
+> [!NOTE]
+> You can also opt to assign more permissions [on the resource group](#method-2-assign-more-permissions-on-the-resource-group). However, that method assigns more permissions than needed to mitigate the **DeploymentNotFound** error.
+
+#### Method 2: Assign more permissions on the resource group
+
+If the Azure AI resource the project uses was created through the Azure portal:
+1. Sign in to [Azure AI Studio](https://aka.ms/azureaistudio) and select your project via **Build** > **Projects**.
+1. Select **Settings** from the collapsible left menu.
+1. From the **Resource Configuration** section, select the link for your resource group name that takes you to the Azure portal.
+1. Select **Access control (IAM)** > **+ Add** to add a role assignment.
+1. Add the **Cognitive Services OpenAI User** role to the user who wants to make an index. `Cognitive Services OpenAI Contributor` and `Cognitive Services Contributor` also work, but they assign more permissions than needed for creating an index in Azure AI Studio.
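+
+The portal steps above map to a single role assignment. As a sketch (the subscription ID, resource group, and user principal name are placeholders you must replace with your own values), the same assignment can be made with the Azure CLI:
+
+```azurecli-interactive
+# Assign Cognitive Services OpenAI User at the resource group scope.
+# Requires the Owner role on the resource group or a higher scope.
+az role assignment create \
+    --assignee "<UserPrincipalName>" \
+    --role "Cognitive Services OpenAI User" \
+    --scope "/subscriptions/<SubscriptionId>/resourceGroups/<ResourceGroupName>"
+```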
++ ## Use an index in prompt flow 1. Open your AI Studio project
You must have:
## Next steps - [Learn more about RAG](../concepts/retrieval-augmented-generation.md)-
ai-studio Sdk Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/ai-studio/how-to/sdk-install.md
The Azure AI code samples in GitHub Codespaces help you quickly get started with
- [Get started building a sample copilot application](https://github.com/azure/aistudio-copilot-sample) - [Try the Azure AI CLI from Azure AI Studio in a browser](vscode-web.md)-- [Azure SDK for Python reference documentation](/python/api/overview/azure)
+- [Azure SDK for Python reference documentation](/python/api/overview/azure/ai)
aks App Routing Dns Ssl https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/app-routing-dns-ssl.md
+
+ Title: Use an Azure DNS zone with SSL/TLS certificates from Azure Key Vault
+description: Understand what Azure DNS zone and Azure Key Vault configuration options are supported with the application routing add-on for Azure Kubernetes Service.
+Last updated: 11/03/2023
+# Use an Azure DNS zone with SSL/TLS certificates from Azure Key Vault with the application routing add-on
+
+An Ingress is an API object that defines rules that allow external access to services in an Azure Kubernetes Service (AKS) cluster. When you create an Ingress object that uses the application routing add-on nginx Ingress classes, the add-on creates, configures, and manages one or more Ingress controllers in your AKS cluster.
+
+This article shows you how to set up an advanced Ingress configuration to encrypt the traffic and use Azure DNS to manage DNS zones.
+
+## Application routing add-on with nginx features
+
+The application routing add-on with nginx delivers the following:
+
+* Easy configuration of managed nginx Ingress controllers.
+* Integration with an external DNS such as [Azure DNS][azure-dns-overview] for public and private zone management.
+* SSL termination with certificates stored in a key vault, such as [Azure Key Vault][azure-key-vault-overview].
+
+## Prerequisites
+
+- An AKS cluster with the [application routing add-on][app-routing-add-on-basic-configuration].
+- Azure Key Vault if you want to configure SSL termination and store certificates in the vault hosted in Azure.
+- Azure DNS if you want to configure public and private zone management and host them in Azure.
+- To attach an Azure Key Vault or Azure DNS Zone, you need the [Owner][rbac-owner], [Azure account administrator][rbac-classic], or [Azure co-administrator][rbac-classic] role on your Azure subscription.
+
+## Connect to your AKS cluster
+
+To connect to the Kubernetes cluster from your local computer, you use `kubectl`, the Kubernetes command-line client. You can install it locally using the [az aks install-cli][az-aks-install-cli] command. If you use the Azure Cloud Shell, `kubectl` is already installed.
+
+Configure kubectl to connect to your Kubernetes cluster using the [`az aks get-credentials`][az-aks-get-credentials] command.
+
+```azurecli-interactive
+az aks get-credentials -g <ResourceGroupName> -n <ClusterName>
+```
+
+## Terminate HTTPS traffic with certificates from Azure Key Vault
+
+To enable support for HTTPS traffic, you need the following prerequisite:
+
+* An SSL certificate. If you don't have one, you can [create a certificate][create-and-export-a-self-signed-ssl-certificate].
+
+### Create an Azure Key Vault to store the certificate
+
+> [!NOTE]
+> If you already have an Azure Key Vault, you can skip this step.
+
+Create an Azure Key Vault using the [`az keyvault create`][az-keyvault-create] command.
+
+```azurecli-interactive
+az keyvault create -g <ResourceGroupName> -l <Location> -n <KeyVaultName> --enable-rbac-authorization true
+```
+
+### Create and export a self-signed SSL certificate
+
+> [!NOTE]
+> If you already have a certificate, you can skip this step.
+>
+1. Create a self-signed SSL certificate to use with the Ingress using the `openssl req` command. Make sure you replace *`<Hostname>`* with the DNS name you're using.
+
+ ```bash
+ openssl req -new -x509 -nodes -out aks-ingress-tls.crt -keyout aks-ingress-tls.key -subj "/CN=<Hostname>" -addext "subjectAltName=DNS:<Hostname>"
+ ```
+
+2. Export the SSL certificate and skip the password prompt using the `openssl pkcs12 -export` command.
+
+ ```bash
+ openssl pkcs12 -export -in aks-ingress-tls.crt -inkey aks-ingress-tls.key -out aks-ingress-tls.pfx
+ ```
+
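+You can optionally sanity-check the exported files before importing them into the key vault. A quick verification sketch (assumes the file names from the previous steps; the `-ext` option requires OpenSSL 1.1.1 or later):
+
+```bash
+# Confirm the subject and SAN match the hostname you intend to use.
+openssl x509 -in aks-ingress-tls.crt -noout -subject -ext subjectAltName
+```
+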
+### Import certificate into Azure Key Vault
+
+Import the SSL certificate into Azure Key Vault using the [`az keyvault certificate import`][az-keyvault-certificate-import] command. If your certificate is password protected, you can pass the password through the `--password` flag.
+
+```azurecli-interactive
+az keyvault certificate import --vault-name <KeyVaultName> -n <KeyVaultCertificateName> -f aks-ingress-tls.pfx [--password <certificate password if specified>]
+```
+
+> [!IMPORTANT]
+> To enable the add-on to reload certificates from Azure Key Vault when they change, you should enable the [secret autorotation feature][csi-secrets-store-autorotation] of the Secret Store CSI driver with the `--enable-secret-rotation` argument. When autorotation is enabled, the driver updates the pod mount and the Kubernetes secret by polling for changes periodically, based on the rotation poll interval you define. The default rotation poll interval is two minutes.
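+
+With the managed add-on, autorotation is toggled through the cluster's Key Vault secrets provider add-on. A sketch of enabling it (assumes the add-on name used elsewhere in the AKS docs):
+
+```azurecli-interactive
+az aks addon update -g <ResourceGroupName> -n <ClusterName> \
+    --addon azure-keyvault-secrets-provider --enable-secret-rotation
+```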
++
+### Enable Azure Key Vault integration
+
+On a cluster with the application routing add-on enabled, use the [`az aks approuting update`][az-aks-approuting-update] command using the `--enable-kv` and `--attach-kv` arguments to enable the Azure Key Vault provider for Secrets Store CSI Driver and apply the required role assignments.
+
+Azure Key Vault offers [two authorization systems][authorization-systems]: **Azure role-based access control (Azure RBAC)**, which operates on the management plane, and the **access policy model**, which operates on both the management plane and the data plane. The `--attach-kv` operation will choose the appropriate access model to use.
+
+> [!NOTE]
+> The `az aks approuting update --attach-kv` command uses the permissions of the user running the command to create the Azure Key Vault role assignment. This role is assigned to the add-on's managed identity. For more information on AKS managed identities, see [Summary of managed identities][summary-msi].
+
+Retrieve the Azure Key Vault resource ID.
+
+```azurecli-interactive
+KEYVAULTID=$(az keyvault show --name <KeyVaultName> --query "id" --output tsv)
+```
+
+Then update the app routing add-on to enable the Azure Key Vault secret store CSI driver and apply the role assignment.
+
+```azurecli-interactive
+az aks approuting update -g <ResourceGroupName> -n <ClusterName> --enable-kv --attach-kv ${KEYVAULTID}
+```
+
+## Enable Azure DNS integration
+
+To enable support for DNS zones, you need the following prerequisite:
+
+* The app routing add-on can be configured to automatically create records on one or more Azure public and private DNS zones for hosts defined on Ingress resources. All global Azure DNS zones need to be in the same resource group, and all private Azure DNS zones need to be in the same resource group. If you don't have an Azure DNS zone, you can [create one][create-an-azure-dns-zone].
+
+### Create a global Azure DNS zone
+
+> [!NOTE]
+> If you already have an Azure DNS Zone, you can skip this step.
+>
+1. Create an Azure DNS zone using the [`az network dns zone create`][az-network-dns-zone-create] command.
+
+ ```azurecli-interactive
+ az network dns zone create -g <ResourceGroupName> -n <ZoneName>
+ ```
+
+### Attach Azure DNS zone to the application routing add-on
+
+> [!NOTE]
+> The `az aks approuting zone add` command uses the permissions of the user running the command to create the Azure DNS Zone role assignment. This role is assigned to the add-on's managed identity. For more information on AKS managed identities, see [Summary of managed identities][summary-msi].
+
+1. Retrieve the resource ID for the DNS zone using the [`az network dns zone show`][az-network-dns-zone-show] command and set the output to a variable named *ZONEID*.
+
+ ```azurecli-interactive
+ ZONEID=$(az network dns zone show -g <ResourceGroupName> -n <ZoneName> --query "id" --output tsv)
+ ```
+
+1. Update the add-on to enable the integration with Azure DNS using the [`az aks approuting zone`][az-aks-approuting-zone] command. You can pass a comma-separated list of DNS zone resource IDs.
+
+ ```azurecli-interactive
+ az aks approuting zone add -g <ResourceGroupName> -n <ClusterName> --ids=${ZONEID} --attach-zones
+ ```
+
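+If you have multiple zones, the same command accepts their resource IDs as a comma-separated list. A sketch with two hypothetical zone ID variables:
+
+```azurecli-interactive
+az aks approuting zone add -g <ResourceGroupName> -n <ClusterName> --ids="${ZONEID1},${ZONEID2}" --attach-zones
+```
+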
+## Create the Ingress that uses a host name and a certificate from Azure Key Vault
+
+The application routing add-on creates an Ingress class on the cluster named *webapprouting.kubernetes.azure.com*. When you create an Ingress object with this class, it activates the add-on.
+
+1. Get the certificate URI to use in the Ingress from Azure Key Vault using the [`az keyvault certificate show`][az-keyvault-certificate-show] command.
+
+ ```azurecli-interactive
+ az keyvault certificate show --vault-name <KeyVaultName> -n <KeyVaultCertificateName> --query "id" --output tsv
+ ```
+
+2. Copy the following YAML manifest into a new file named **ingress.yaml** and save the file to your local computer.
+
+ > [!NOTE]
+ > Update *`<Hostname>`* with your DNS host name and *`<KeyVaultCertificateUri>`* with the ID returned from Azure Key Vault.
+ > The *`secretName`* key in the `tls` section defines the name of the secret that contains the certificate for this Ingress resource. This certificate will be presented in the browser when a client browses to the URL defined in the `<Hostname>` key. Make sure that the value of `secretName` is equal to `keyvault-` followed by the value of the Ingress resource name (from `metadata.name`). In the example YAML, secretName will need to be equal to `keyvault-<your Ingress name>`.
+
+ ```yml
+ apiVersion: networking.k8s.io/v1
+ kind: Ingress
+ metadata:
+ annotations:
+ kubernetes.azure.com/tls-cert-keyvault-uri: <KeyVaultCertificateUri>
+ name: aks-helloworld
+ namespace: hello-web-app-routing
+ spec:
+ ingressClassName: webapprouting.kubernetes.azure.com
+ rules:
+ - host: <Hostname>
+ http:
+ paths:
+ - backend:
+ service:
+ name: aks-helloworld
+ port:
+ number: 80
+ path: /
+ pathType: Prefix
+ tls:
+ - hosts:
+ - <Hostname>
+ secretName: keyvault-<your ingress name>
+ ```
+
+3. Create the cluster resources using the [`kubectl apply`][kubectl-apply] command.
+
+ ```bash
+ kubectl apply -f ingress.yaml -n hello-web-app-routing
+ ```
+
+ The following example output shows the created resource:
+
+ ```output
+ Ingress.networking.k8s.io/aks-helloworld created
+ ```
+
+## Verify the managed Ingress was created
+
+You can verify the managed Ingress was created using the [`kubectl get ingress`][kubectl-get] command.
+
+```bash
+kubectl get ingress -n hello-web-app-routing
+```
+
+The following example output shows the created managed Ingress:
+
+```output
+NAME CLASS HOSTS ADDRESS PORTS AGE
+aks-helloworld webapprouting.kubernetes.azure.com myapp.contoso.com 20.51.92.19 80, 443 4m
+```
+
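+Once the ADDRESS column is populated and your DNS record resolves, you can check the certificate end to end. A quick test sketch (`<Hostname>` and `<IngressIPAddress>` are placeholders; `--resolve` lets you test before DNS propagates):
+
+```bash
+curl -v --resolve <Hostname>:443:<IngressIPAddress> https://<Hostname> --insecure
+```
+
+With a self-signed certificate, `--insecure` skips chain validation; drop it once you use a CA-issued certificate.
+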
+## Next steps
+
+Learn about monitoring the Ingress-nginx controller metrics included with the application routing add-on [with Prometheus in Grafana][prometheus-in-grafana] as part of analyzing the performance and usage of your application.
+
+<!-- LINKS - external -->
+[kubectl-apply]: https://kubernetes.io/docs/reference/generated/kubectl/kubectl-commands#apply
+[kubectl-get]: https://kubernetes.io/docs/reference/generated/kubectl/kubectl-commands#get
+
+<!-- LINKS - internal -->
+[summary-msi]: use-managed-identity.md#summary-of-managed-identities
+[rbac-owner]: ../role-based-access-control/built-in-roles.md#owner
+[rbac-classic]: ../role-based-access-control/rbac-and-directory-admin-roles.md#classic-subscription-administrator-roles
+[app-routing-add-on-basic-configuration]: app-routing.md
+[secret-store-csi-provider]: csi-secrets-store-driver.md
+[csi-secrets-store-autorotation]: csi-secrets-store-configuration-options.md#enable-and-disable-auto-rotation
+[az-keyvault-set-policy]: /cli/azure/keyvault#az-keyvault-set-policy
+[azure-key-vault-overview]: ../key-vault/general/overview.md
+[az-aks-addon-update]: /cli/azure/aks/addon#az-aks-addon-update
+[az-aks-approuting-update]: /cli/azure/aks/approuting#az-aks-approuting-update
+[az-aks-approuting-zone]: /cli/azure/aks/approuting/zone
+[az-network-dns-zone-show]: /cli/azure/network/dns/zone#az-network-dns-zone-show
+[az-role-assignment-create]: /cli/azure/role/assignment#az-role-assignment-create
+[az-network-dns-zone-create]: /cli/azure/network/dns/zone#az-network-dns-zone-create
+[az-keyvault-certificate-import]: /cli/azure/keyvault/certificate#az-keyvault-certificate-import
+[az-keyvault-create]: /cli/azure/keyvault#az-keyvault-create
+[authorization-systems]: ../key-vault/general/rbac-access-policy.md
+[az-aks-install-cli]: /cli/azure/aks#az-aks-install-cli
+[az-aks-get-credentials]: /cli/azure/aks#az-aks-get-credentials
+[create-and-export-a-self-signed-ssl-certificate]: #create-and-export-a-self-signed-ssl-certificate
+[create-an-azure-dns-zone]: #create-a-global-azure-dns-zone
+[azure-dns-overview]: ../dns/dns-overview.md
+[az-keyvault-certificate-show]: /cli/azure/keyvault/certificate#az-keyvault-certificate-show
+[az-aks-enable-addons]: /cli/azure/aks/addon#az-aks-enable-addon
+[az-aks-show]: /cli/azure/aks/addon#az-aks-show
+[prometheus-in-grafana]: app-routing-nginx-prometheus.md
aks App Routing Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/app-routing-migration.md
In this article, you learn how to migrate your Azure Kubernetes Service (AKS) cl
## Prerequisites
-Azure CLI version `2.49.0` or later. If you haven't yet, follow the instructions to [Install Azure CLI][install-azure-cli]. Run `az --version` to find the version, and run `az upgrade` to upgrade the version if not already on the latest.
+- Azure CLI version 2.54.0 or later installed and configured. Run `az --version` to find the version. If you need to install or upgrade, see [Install Azure CLI][install-azure-cli].
+- Version 0.5.171 or later of the `aks-preview` Azure CLI extension installed.
> [!NOTE] > These steps detail migrating from an unsupported configuration. As such, AKS cannot offer support for issues that arise during the migration process.
Azure CLI version `2.49.0` or later. If you haven't yet, follow the instructions
1. Enable the application routing add-on. ```azurecli-interactive
- az aks enable-addons -g <ResourceGroupName> -n <ClusterName> --addons web_application_routing
+ az aks approuting enable -g <ResourceGroupName> -n <ClusterName>
``` 2. Update your Ingresses, setting `ingressClassName` to `webapprouting.kubernetes.azure.com`. Remove the `kubernetes.io/ingress.class` annotation. You also need to update the host to one that you own, as the application routing add-on doesn't have a managed cluster DNS zone. If you don't have a DNS zone, follow instructions to [create][app-routing-dns-create] and [configure][app-routing-dns-configure] one.
After migrating to the application routing add-on, learn how to [monitor Ingress
<!-- INTERNAL LINKS --> [install-azure-cli]: /cli/azure/install-azure-cli
-[app-routing-dns-create]: ./app-routing-configuration.md#create-a-global-azure-dns-zone
-[app-routing-dns-configure]: ./app-routing-configuration.md#configure-the-add-on-to-use-azure-dns-to-manage-dns-zones
+[app-routing-dns-create]: ./app-routing-dns-ssl.md#create-a-global-azure-dns-zone
+[app-routing-dns-configure]: ./app-routing-dns-ssl.md#attach-azure-dns-zone-to-the-application-routing-add-on
<!-- EXTERNAL LINKS --> [kubectl-get]: https://kubernetes.io/docs/reference/generated/kubectl/kubectl-commands#get
aks App Routing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/app-routing.md
With the retirement of [Open Service Mesh][open-service-mesh-docs] (OSM) by the
## Prerequisites - An Azure subscription. If you don't have an Azure subscription, you can create a [free account](https://azure.microsoft.com/free).-- Azure CLI version 2.47.0 or later installed and configured. Run `az --version` to find the version. If you need to install or upgrade, see [Install Azure CLI][install-azure-cli].
+- Azure CLI version 2.54.0 or later installed and configured. Run `az --version` to find the version. If you need to install or upgrade, see [Install Azure CLI][install-azure-cli].
+- Version 0.5.171 or later of the `aks-preview` Azure CLI extension installed.
## Limitations
With the retirement of [Open Service Mesh][open-service-mesh-docs] (OSM) by the
### Enable on a new cluster
-To enable application routing on a new cluster, use the [`az aks create`][az-aks-create] command, specifying `web_application_routing` with the `enable-addons` argument.
+To enable application routing on a new cluster, use the [`az aks create`][az-aks-create] command, specifying the `--enable-app-routing` flag.
```azurecli-interactive
-az aks create -g <ResourceGroupName> -n <ClusterName> -l <Location> --enable-addons web_application_routing --generate-ssh-keys
+az aks create -g <ResourceGroupName> -n <ClusterName> -l <Location> --enable-app-routing
``` ### Enable on an existing cluster
-To enable application routing on an existing cluster, use the [`az aks enable-addons`][az-aks-enable-addons] command specifying `web_application_routing` with the `--addons` argument.
+To enable application routing on an existing cluster, use the [`az aks approuting enable`][az-aks-approuting-enable] command.
```azurecli-interactive
-az aks enable-addons -g <ResourceGroupName> -n <ClusterName> --addons web_application_routing
+az aks approuting enable -g <ResourceGroupName> -n <ClusterName>
```
-# [Open Service Mesh (OSM)](#tab/with-osm)
+# [Open Service Mesh (OSM) (retired)](#tab/with-osm)
>[!NOTE] >Open Service Mesh (OSM) has been retired by the CNCF. Creating Ingresses using the application routing add-on with OSM integration is not recommended and will be retired.
The following add-ons are required to support this configuration:
### Enable on a new cluster
-Enable application routing on a new AKS cluster using the [`az aks create`][az-aks-create] command and the `--enable-addons` parameter with the following add-ons:
+Enable application routing on a new AKS cluster using the [`az aks create`][az-aks-create] command specifying the `--enable-app-routing` flag and the `--enable-addons` parameter with the `open-service-mesh` add-on:
```azurecli-interactive
-az aks create -g <ResourceGroupName> -n <ClusterName> -l <Location> --enable-addons open-service-mesh,web_application_routing --generate-ssh-keys
+az aks create -g <ResourceGroupName> -n <ClusterName> -l <Location> --enable-app-routing --enable-addons open-service-mesh
``` ### Enable on an existing cluster
-Enable application routing on an existing cluster using the [`az aks enable-addons`][az-aks-enable-addons] command and the `--addons` parameter with the following add-ons:
+To enable application routing on an existing cluster, use the [`az aks approuting enable`][az-aks-approuting-enable] command and the [`az aks enable-addons`][az-aks-enable-addons] command with the `--addons` parameter set to `open-service-mesh`:
```azurecli-interactive
-az aks enable-addons -g <ResourceGroupName> -n <ClusterName> --addons open-service-mesh,web_application_routing
+az aks approuting enable -g <ResourceGroupName> -n <ClusterName>
+az aks enable-addons -g <ResourceGroupName> -n <ClusterName> --addons open-service-mesh
``` > [!NOTE]
az aks enable-addons -g <ResourceGroupName> -n <ClusterName> --addons open-servi
### Enable on a new cluster
-Enable application routing on a new AKS cluster using the [`az aks create`][az-aks-create] command and the `--enable-addons` parameter with the following add-ons:
+To enable application routing on a new cluster, use the [`az aks create`][az-aks-create] command, specifying the `--enable-app-routing` flag.
```azurecli-interactive
-az aks create -g <ResourceGroupName> -n <ClusterName> -l <Location> --enable-addons web_application_routing --generate-ssh-keys
+az aks create -g <ResourceGroupName> -n <ClusterName> -l <Location> --enable-app-routing
``` ### Enable on an existing cluster
-Enable application routing on an existing cluster using the [`az aks enable-addons`][az-aks-enable-addons] command and the `--addons` parameter with the following add-ons:
+To enable application routing on an existing cluster, use the [`az aks approuting enable`][az-aks-approuting-enable] command:
```azurecli-interactive
-az aks enable-addons -g <ResourceGroupName> -n <ClusterName> --addons web_application_routing --enable-secret-rotation
+az aks approuting enable -g <ResourceGroupName> -n <ClusterName>
```
The application routing add-on creates an Ingress class on the cluster named *we
spec: ingressClassName: webapprouting.kubernetes.azure.com rules:
- - host: <Hostname>
+ - host: <Hostname>
http: paths: - backend:
The application routing add-on creates an Ingress class on the cluster called *w
spec: ingressClassName: webapprouting.kubernetes.azure.com rules:
- - host: <Hostname>
+ - host: <Hostname>
http: paths: - backend:
To remove the associated namespace, use the `kubectl delete namespace` command.
kubectl delete namespace hello-web-app-routing ```
-To remove the application routing add-on from your cluster, use the [`az aks disable-addons`][az-aks-disable-addons] command.
+To remove the application routing add-on from your cluster, use the [`az aks approuting disable`][az-aks-approuting-disable] command.
```azurecli-interactive
-az aks disable-addons --addons web_application_routing --name myAKSCluster --resource-group myResourceGroup
+az aks approuting disable --name myAKSCluster --resource-group myResourceGroup
``` When the application routing add-on is disabled, some Kubernetes resources might remain in the cluster. These resources include *configMaps* and *secrets* and are created in the *app-routing-system* namespace. You can remove these resources if you want.
When the application routing add-on is disabled, some Kubernetes resources might
<!-- LINKS - internal --> [azure-dns-overview]: ../dns/dns-overview.md
+[az-aks-approuting-enable]: /cli/azure/aks/approuting#az-aks-approuting-enable
+[az-aks-approuting-disable]: /cli/azure/aks/approuting#az-aks-approuting-disable
[az-aks-enable-addons]: /cli/azure/aks#az-aks-enable-addons [az-aks-disable-addons]: /cli/azure/aks#az-aks-disable-addons [az-aks-install-cli]: /cli/azure/aks#az-aks-install-cli [az-aks-get-credentials]: /cli/azure/aks#az-aks-get-credentials [install-azure-cli]: /cli/azure/install-azure-cli
-[custom-ingress-configurations]: app-routing-configuration.md
+[custom-ingress-configurations]: app-routing-dns-ssl.md
[az-aks-create]: /cli/azure/aks#az-aks-create [prometheus-in-grafana]: app-routing-nginx-prometheus.md
aks Artifact Streaming https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/artifact-streaming.md
Enablement on ACR is a prerequisite for Artifact Streaming on AKS. For more info
--enable-artifact-streaming true ```
-### Enable Artifact Streaming on an existing node pool
-
-* Enable Artifact Streaming on an existing node pool using the [`az aks nodepool update`][az-aks-nodepool-update] command with the `--enable-artifact-streaming` flag.
-
- ```azurecli-interactive
- az aks nodepool update \
- --resource-group myResourceGroup \
- --cluster-name myAKSCluster \
- --name myNodePool \
- --enable-artifact-streaming
- ```
- ## Check if Artifact Streaming is enabled Now that you enabled Artifact Streaming on a premium ACR and connected that to an AKS node pool with Artifact Streaming enabled, any new pod deployments on this cluster with an image pull from the ACR with Artifact Streaming enabled will see reductions in image pull times.
Now that you enabled Artifact Streaming on a premium ACR and connected that to a
In the output, check that the `Enabled` field is set to `true`.
-## Disable Artifact Streaming on AKS
-
-You can disable Artifact Streaming at the node pool level. The change takes effect on the next node pool upgrade.
-
-> [!NOTE]
-> Artifact Streaming requires connection to and enablement on an ACR. If you disconnect or disable from ACR, Artifact Streaming is automatically disabled on the node pool. If you don't disable Artifact Streaming at the node pool level, it begins working immediately once you resume the connection to and enablement on ACR.
-
-### Disable Artifact Streaming on an existing node pool
-
-* Disable Artifact Streaming on an existing node pool using the [`az aks nodepool update`][az-aks-nodepool-update] command with the `--disable-artifact-streaming` flag.
-
- ```azurecli-interactive
- az aks nodepool update \
- --resource-group myResourceGroup \
- --cluster-name myAKSCluster \
- --name myNodePool \
- --disable-artifact-streaming
- ```
- ## Next steps This article described how to enable Artifact Streaming on your AKS node pools to stream artifacts from ACR and reduce image pull time. To learn more about working with container images in AKS, see [Best practices for container image management and security in AKS][aks-image-management].
aks Csi Secrets Store Driver https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/csi-secrets-store-driver.md
A container using *subPath volume mount* doesn't receive secret updates when it'
## Create or use an existing Azure Key Vault
-1. Create a key vault using the [`az keyvault create`][az-keyvault-create] command. The name of the key vault must be globally unique.
+1. Create or update a key vault with Azure role-based access control (Azure RBAC) enabled using the [`az keyvault create`][az-keyvault-create] command or the [`az keyvault update`][az-keyvault-update] command with the `--enable-rbac-authorization` flag. The name of the key vault must be globally unique. For more information on key vault permission models and Azure RBAC, see [Provide access to Key Vault keys, certificates, and secrets with an Azure role-based access control](/azure/key-vault/general/rbac-guide).
+ ```azurecli-interactive
- az keyvault create -n <keyvault-name> -g myResourceGroup -l eastus2
+ ## Create a new Azure key vault
+ az keyvault create -n <keyvault-name> -g myResourceGroup -l eastus2 --enable-rbac-authorization
+
+ ## Update an existing Azure key vault
+ az keyvault update -n <keyvault-name> -g myResourceGroup -l eastus2 --enable-rbac-authorization
``` 2. Your key vault can store keys, secrets, and certificates. In this example, use the [`az keyvault secret set`][az-keyvault-secret-set] command to set a plain-text secret called `ExampleSecret`.
In this article, you learned how to use the Azure Key Vault provider for Secrets
[az-aks-enable-addons]: /cli/azure/aks#az-aks-enable-addons [identity-access-methods]: ./csi-secrets-store-identity-access.md [az-keyvault-create]: /cli/azure/keyvault#az-keyvault-create.md
+[az-keyvault-update]: /cli/azure/keyvault#az-keyvault-update.md
[az-keyvault-secret-set]: /cli/azure/keyvault#az-keyvault-secret-set.md [az-group-create]: /cli/azure/group#az-group-create
aks Long Term Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/long-term-support.md
+
+ Title: Long term support for Azure Kubernetes Service (AKS)
+description: Learn about Azure Kubernetes Service (AKS) Long term support for Kubernetes
+Last updated: 08/16/2023
+#Customer intent: As a cluster operator or developer, I want to understand how Long Term Support for Kubernetes on AKS works.
++
+# Long term support
+The Kubernetes community releases a new minor version approximately every four months, with a one-year support window for each version. In Azure Kubernetes Service (AKS), this support window is called "Community Support."
+
+AKS supports versions of Kubernetes that are within this Community Support window, to push bug fixes and security updates from community releases.
+
+While innovation delivered with this release cadence provides huge benefits to you, it challenges you to keep up to date with Kubernetes releases, which can be made more difficult based on the number of AKS clusters you have to maintain.
++
+## AKS support types
+After approximately one year, the Kubernetes version exits Community Support, and your AKS clusters are at risk because bug fixes and security updates become unavailable.
+
+AKS provides one year of Community Support and one year of Long Term Support (LTS), backporting security fixes from the community upstream in our public repository. Our upstream LTS working group contributes efforts back to the community to provide our customers with a longer support window.
+
+LTS intends to give you an extended period of time to plan and test for upgrades over a two-year period from the General Availability of the designated Kubernetes version.
+
+| | Community Support |Long Term Support |
+|---|---|---|
+| **When to use** | When you can keep up with upstream Kubernetes releases | When you need control over when to migrate from one version to another |
+| **Support versions** | Three GA minor versions | One Kubernetes version (currently *1.27*) for two years |
++
+## Enable Long Term Support
+
+Enabling and disabling Long Term Support is a combination of moving your cluster to the Premium tier and explicitly selecting the LTS support plan.
+
+> [!NOTE]
+> While it's possible to enable LTS when the cluster is in Community Support, you'll be charged once you enable the Premium tier.
+
+### Create a cluster with LTS enabled
+```azurecli-interactive
+az aks create --resource-group myResourceGroup --name myAKSCluster --tier premium --k8s-support-plan AKSLongTermSupport --kubernetes-version 1.27
+```
+
+> [!NOTE]
+> Enabling and disabling LTS requires both changing the pricing tier and selecting the support plan; the two settings must be changed together.
+
+### Enable LTS on an existing cluster
+```
+az aks update --resource-group myResourceGroup --name myAKSCluster --tier premium --k8s-support-plan AKSLongTermSupport
+```
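To confirm the change took effect, you can inspect the cluster's `supportPlan` property. The sketch below parses a saved copy of the `az aks show` output; `cluster.json` here is a local stub standing in for the live response:

```shell
# Stub for: az aks show -g myResourceGroup -n myAKSCluster -o json > cluster.json
# With a live cluster, run the az command above instead of creating this file.
cat > cluster.json <<'EOF'
{"name": "myAKSCluster", "sku": {"tier": "Premium"}, "supportPlan": "AKSLongTermSupport"}
EOF

# Extract the support plan with grep/cut (no jq dependency).
plan=$(grep -o '"supportPlan": *"[^"]*"' cluster.json | cut -d'"' -f4)
echo "Support plan: $plan"
```

With a live cluster, `az aks show --resource-group myResourceGroup --name myAKSCluster --query supportPlan --output tsv` returns the value directly, assuming the property name matches the support plan set by the commands above.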
+
+### Disable LTS on an existing cluster
+```
+az aks update --resource-group myResourceGroup --name myAKSCluster --tier [free|standard] --k8s-support-plan KubernetesOfficial
+```
+
+## Long Term Support, add-ons, and features
+The AKS team currently tracks add-on versions where Kubernetes community support exists. Once a version leaves Community Support, we rely on Open Source projects for managed add-ons to continue that support. Due to various external factors, some add-ons and features may not support Kubernetes versions outside these upstream Community Support windows.
+
+See the following table for a list of add-ons and features that aren't supported and the reason why.
+
+| Add-on / Feature | Reason it's unsupported |
+|---|---|
+| Istio | The Istio support cycle is short (six months), and there will not be maintenance releases for Kubernetes 1.27 |
+| Keda | Unable to guarantee future version compatibility with Kubernetes 1.27 |
+| Calico | Requires Calico Enterprise agreement past Community Support |
+| Cilium | Requires Cilium Enterprise agreement past Community Support |
+| Azure Linux | Support timeframe for Azure Linux 2 ends during this LTS cycle |
+| Key Management Service (KMS) | KMSv2 replaces KMS during this LTS cycle |
+| Dapr | AKS extensions are not supported |
+| Application Gateway Ingress Controller | Migration to App Gateway for Containers happens during LTS period |
+| Open Service Mesh | OSM will be deprecated|
+| AAD Pod Identity | Deprecated in place of Workload Identity |
++
+> [!NOTE]
+> You can't move your cluster to Long Term Support if any of these add-ons or features are enabled.
+> Whilst these AKS managed add-ons aren't supported by Microsoft, you can install the open-source versions of them on your cluster if you wish to use them past Community Support.
+
+## How we decide the next LTS version
+LTS versions of Kubernetes are available for two years from General Availability. We mark a later version of Kubernetes as LTS based on the following criteria:
+* Sufficient time has passed for customers to migrate from the prior LTS version to the current one
+* The previous version has had a two-year support window
+
+Read the AKS release notes to stay informed about upcoming LTS versions so you can plan your migration.
+
+### Migrate from LTS to Community support
+Using LTS is a way to extend your window to plan a Kubernetes version upgrade. You may want to migrate to a version of Kubernetes that is within the [standard support window](supported-kubernetes-versions.md#kubernetes-version-support-policy).
+
+To move from an LTS enabled cluster to a version of Kubernetes that is within the standard support window, you need to disable LTS on the cluster:
+
+```
+az aks update --resource-group myResourceGroup --name myAKSCluster --tier [free|standard] --k8s-support-plan KubernetesOfficial
+```
+
+And then upgrade the cluster to a later supported version:
+
+```
+az aks upgrade --resource-group myResourceGroup --name myAKSCluster --kubernetes-version 1.28.3
+```
+> [!NOTE]
+> Kubernetes 1.28.3 is used as an example here. Check the [AKS release tracker](release-tracker.md) for available Kubernetes releases.
+
+There are approximately two years between one LTS version and the next. Because upstream Kubernetes doesn't support migrating across more than two minor versions, there's a high likelihood your application depends on Kubernetes APIs that have been deprecated. We recommend you thoroughly test your application on the target LTS Kubernetes version and carry out a blue/green deployment from one version to another.
+
+### Migrate from LTS to the next LTS release
+The upstream Kubernetes community supports a two minor version upgrade path. The upgrade migrates the objects in your Kubernetes cluster, providing a tested and accredited migration path.
+
+For customers that wish to carry out an in-place migration, the AKS service will migrate your control plane from the previous LTS version to the latest, and then migrate your data plane.
+
+To carry out an in-place upgrade to the latest LTS version, you need to specify an LTS enabled Kubernetes version as the upgrade target.
+
+```
+az aks upgrade --resource-group myResourceGroup --name myAKSCluster --kubernetes-version 1.30.2
+```
+
+> [!NOTE]
+> Kubernetes 1.30.2 is used as an example here. Check the [AKS release tracker](release-tracker.md) for available Kubernetes releases.
aks Stop Cluster Upgrade Api Breaking Changes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/stop-cluster-upgrade-api-breaking-changes.md
You can also check past API usage by enabling [Container Insights][container-ins
### Bypass validation to ignore API changes > [!NOTE]
-> This method requires you to use the Azure CLI version 2.53 or later. This method isn't recommended, as deprecated APIs in the targeted Kubernetes version may not work long term. We recommend removing them as soon as possible after the upgrade completes.
+> This method requires you to use the Azure CLI version 2.53 or later. If you have the `aks-preview` CLI extension installed, you'll need to update to version `0.5.154` or later. This method isn't recommended, as deprecated APIs in the targeted Kubernetes version may not work long term. We recommend removing them as soon as possible after the upgrade completes.
* Bypass validation to ignore API breaking changes using the [`az aks update`][az-aks-update] command. Specify the `enable-force-upgrade` flag and set the `upgrade-override-until` property to define the end of the window during which validation is bypassed. If no value is set, it defaults the window to three days from the current time. The date and time you specify must be in the future.
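Since the date passed to `upgrade-override-until` must be in the future, a quick local sanity check before running `az aks update` can save a failed round trip. ISO-8601 UTC timestamps compare correctly as plain strings; the timestamp below is only an example:

```shell
# Hypothetical example value; substitute the end of your own override window.
override_until="2030-01-31T00:00:00Z"

# ISO-8601 UTC timestamps sort lexicographically, so a string
# comparison is enough to confirm the window ends in the future.
now=$(date -u +%Y-%m-%dT%H:%M:%SZ)
if [[ "$override_until" > "$now" ]]; then
  status="ok"
  echo "Override window ends $override_until (in the future)"
else
  status="past"
  echo "error: --upgrade-override-until must be a future date/time" >&2
fi
```

The actual bypass would then use the flags described above, for example (hypothetical resource names): `az aks update --resource-group myResourceGroup --name myAKSCluster --enable-force-upgrade --upgrade-override-until 2030-01-31T00:00:00Z`.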
aks Supported Kubernetes Versions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/supported-kubernetes-versions.md
Install-AzAksKubectl -Version latest
-## Long Term Support (LTS)
-
-AKS provides a Long Term Support (LTS) version of Kubernetes for a two-year period. There's only a single minor version of Kubernetes deemed LTS at any one time.
-
-| | Community Support |Long Term Support |
-||||
-| **When to use** | When you can keep up with upstream Kubernetes releases | When you need control over when to migrate from one version to another |
-| **Support versions** | Three GA minor versions | One Kubernetes version (currently *1.27*) for two years |
-| **Pricing** | Included | Per hour cluster cost |
-
-The upstream community maintains a minor release of Kubernetes for one year from release. After this period, Microsoft creates and applies security updates to the LTS version of Kubernetes to provide a total of two years of support on AKS.
-
-> [!IMPORTANT]
-> Kubernetes version 1.27 will be the first supported LTS version of Kubernetes on AKS. It is not yet available.
- ## Release and deprecation process You can reference upcoming version releases and deprecations on the [AKS Kubernetes release calendar](#aks-kubernetes-release-calendar).
aks Use Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/use-managed-identity.md
AKS uses several managed identities for built-in services and add-ons.
| Add-on | azure-policy | No identity required. | N/A | No | Add-on | Calico | No identity required. | N/A | No | Add-on | Dashboard | No identity required. | N/A | No
+| Add-on | application-routing | Manages Azure DNS and Azure Key Vault certificates | Key Vault Secrets User role for Key Vault, DNS Zone Contributor role for DNS zone | No
| Add-on | HTTPApplicationRouting | Manages required network resources. | Reader role for node resource group, contributor role for DNS zone | No | Add-on | Ingress application gateway | Manages required network resources. | Contributor role for node resource group | No | Add-on | omsagent | Used to send AKS metrics to Azure Monitor. | Monitoring Metrics Publisher role | No
AKS uses several managed identities for built-in services and add-ons.
## Enable managed identities on a new AKS cluster > [!NOTE]
-> AKS creates a user-assigned kubelet identity in the node resource group if you don't [specify your own kubelet managed identity][Use a pre-created kubelet managed identity].
+> AKS creates a user-assigned kubelet identity in the node resource group if you don't [specify your own kubelet managed identity][use-a-pre-created-kubelet-managed-identity].
1. Create an Azure resource group using the [`az group create`][az-group-create] command.
A custom user-assigned managed identity for the control plane enables access to
> > USDOD Central, USDOD East, and USGov Iowa regions in Azure US Government cloud aren't supported. >
-> AKS creates a user-assigned kubelet identity in the node resource group if you don't [specify your own kubelet managed identity][Use a pre-created kubelet managed identity].
+> AKS creates a user-assigned kubelet identity in the node resource group if you don't [specify your own kubelet managed identity][use-a-pre-created-kubelet-managed-identity].
* If you don't have a managed identity, create one using the [`az identity create`][az-identity-create] command.
aks Workload Identity Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/workload-identity-overview.md
Title: Use a Microsoft Entra Workload ID on Azure Kubernetes Service (AKS)
description: Learn about Microsoft Entra Workload ID for Azure Kubernetes Service (AKS) and how to migrate your application to authenticate using this identity. Previously updated : 09/13/2023 Last updated : 11/17/2023 # Use Microsoft Entra Workload ID with Azure Kubernetes Service (AKS)
The following table provides the **minimum** package version required for each l
| Ecosystem | Library | Minimum version | |--||--| | .NET | [Azure.Identity](/dotnet/api/overview/azure/identity-readme) | 1.9.0 |
-| C++ | [azure-identity-cpp](https://github.com/Azure/azure-sdk-for-cpp/blob/main/sdk/identity/azure-identity/README.md) | 1.6.0-beta.2 |
+| C++ | [azure-identity-cpp](https://github.com/Azure/azure-sdk-for-cpp/blob/main/sdk/identity/azure-identity/README.md) | 1.6.0 |
| Go | [azidentity](https://pkg.go.dev/github.com/Azure/azure-sdk-for-go/sdk/azidentity) | 1.3.0 | | Java | [azure-identity](/java/api/overview/azure/identity-readme) | 1.9.0 | | Node.js | [@azure/identity](/javascript/api/overview/azure/identity-readme) | 3.2.0 |
automation Automation Managing Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-managing-data.md
Title: Azure Automation data security
description: This article helps you learn how Azure Automation protects your privacy and secures your data. Previously updated : 08/01/2023 Last updated : 11/17/2023
For information about TLS 1.2 support with the Log Analytics agent for Windows a
### Upgrade TLS protocol for Hybrid Workers and Webhook calls
-From **30 October 2023**, all agent-based and extension-based User Hybrid Runbook Workers using Transport Layer Security (TLS) 1.0 and 1.1 protocols would no longer be able to connect to Azure Automation and all jobs running or scheduled on these machines would fail.
+From **1 October 2024**, all agent-based and extension-based User Hybrid Runbook Workers using Transport Layer Security (TLS) 1.0 and 1.1 protocols would no longer be able to connect to Azure Automation and all jobs running or scheduled on these machines would fail.
Ensure that the Webhook calls that trigger runbooks navigate on TLS 1.2 or higher. Ensure to make registry changes so that Agent and Extension based workers negotiate only on TLS 1.2 and higher protocols. Learn how to [disable TLS 1.0/1.1 protocols on Windows Hybrid Worker and enable TLS 1.2 or above](/system-center/scom/plan-security-tls12-config#configure-windows-operating-system-to-only-use-tls-12-protocol) on Windows machine.
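On a Windows Hybrid Worker, disabling the legacy protocols comes down to SCHANNEL registry values such as the following sketch (client side only; the linked guidance also covers server-side keys and .NET strong-crypto settings, so verify the full set of keys there before applying):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client]
"Enabled"=dword:00000000
"DisabledByDefault"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client]
"Enabled"=dword:00000000
"DisabledByDefault"=dword:00000001
```

After importing these values, the worker can only negotiate TLS 1.2 or higher; a reboot is typically required for SCHANNEL changes to take effect.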
azure-app-configuration Enable Dynamic Configuration Azure Kubernetes Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/enable-dynamic-configuration-azure-kubernetes-service.md
+
+ Title: "Tutorial: Use dynamic configuration in Azure App Configuration Kubernetes Provider | Microsoft Docs"
+description: "In this quickstart, use the Azure App Configuration Kubernetes Provider to dynamically load updated key-values from App Configuration store."
+++
+ms.devlang: csharp
++ Last updated : 11/14/2023+
+#Customer intent: As an Azure Kubernetes Service user, I want to manage all my app settings in one place using Azure App Configuration.
++
+# Tutorial: Use dynamic configuration in Azure Kubernetes Service
+
+This tutorial shows you how to enable dynamic configuration for your workloads in Azure Kubernetes Service (AKS) using Azure App Configuration and its Kubernetes Provider. It builds on the [Use Azure App Configuration in Azure Kubernetes Service](./quickstart-azure-kubernetes-service.md) quickstart, so make sure you complete it and have an App Configuration Kubernetes Provider set up before proceeding.
++
+## Prerequisites
+
+Finish the quickstart: [Use Azure App Configuration in Azure Kubernetes Service](./quickstart-azure-kubernetes-service.md).
+
+> [!TIP]
+> The Azure Cloud Shell is a free, interactive shell that you can use to run the command line instructions in this article. It has common Azure tools preinstalled, including the .NET Core SDK. If you're logged in to your Azure subscription, launch your [Azure Cloud Shell](https://shell.azure.com) from shell.azure.com. You can learn more about Azure Cloud Shell by [reading our documentation](../cloud-shell/overview.md).
+>
+## Add a sentinel key
+
+A *sentinel key* is a key that you update after you complete the change of all other keys. Your app monitors the sentinel key. When a change is detected, your app refreshes all configuration values. This approach helps to ensure the consistency of configuration in your app and reduces the overall number of requests made to your App Configuration store, compared to monitoring all keys for changes.
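The pattern can be sketched locally without any App Configuration calls (`sentinel.txt` below is a stand-in for the `Settings:Sentinel` key): the app caches the last sentinel value it saw and refreshes everything only when that single value changes.

```shell
get_sentinel() { cat sentinel.txt; }     # stand-in for reading Settings:Sentinel

echo "1" > sentinel.txt
last="$(get_sentinel)"                   # value cached at startup

echo "2" > sentinel.txt                  # updated last, after all other keys
current="$(get_sentinel)"

if [ "$current" != "$last" ]; then
  action="refresh-all"
  echo "Sentinel changed ($last -> $current): refresh all configuration values"
else
  action="no-op"
fi
```

Because only one key is polled, the store sees one request per refresh interval regardless of how many configuration keys the app consumes.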
+
+Add the following key-value to your App Configuration store. For more information about how to add key-values to a store using the Azure portal or the CLI, go to [Create a key-value](./quickstart-azure-app-configuration-create.md#create-a-key-value).
+
+| Key | Value |
+|---|---|
+| Settings:Sentinel | 1 |
+
+## Reload data from App Configuration
+
+1. Open the *appConfigurationProvider.yaml* file located in the *Deployment* directory. Then, add the `refresh` section under the `configuration` property as shown below. It enables configuration refresh by monitoring the sentinel key.
+
+ ```yaml
+ apiVersion: azconfig.io/v1
+ kind: AzureAppConfigurationProvider
+ metadata:
+ name: appconfigurationprovider-sample
+ spec:
+ endpoint: <your-app-configuration-store-endpoint>
+ target:
+ configMapName: configmap-created-by-appconfig-provider
+ configMapData:
+ type: json
+ key: mysettings.json
+ auth:
+ workloadIdentity:
+ managedIdentityClientId: <your-managed-identity-client-id>
+ configuration:
+ refresh:
+ enabled: true
+ monitoring:
+ keyValues:
+ - key: Settings:Sentinel
+ ```
+
+ > [!TIP]
+    > By default, the Kubernetes provider polls the monitoring key-values every 30 seconds for change detection. You can change this behavior by setting the `interval` property of the `refresh` section. To reduce the number of requests to your App Configuration store, set it to a higher value.
+
+1. Open the *deployment.yaml* file in the *Deployment* directory and add the following content to the `spec.containers` section. Your application will load configuration from a volume-mounted file generated by the App Configuration Kubernetes provider. By setting this environment variable, your application can [use polling to monitor changes in mounted files](/dotnet/api/microsoft.extensions.fileproviders.physicalfileprovider.usepollingfilewatcher).
+
+ ```yaml
+ env:
+ - name: DOTNET_USE_POLLING_FILE_WATCHER
+ value: "true"
+ ```
+
+1. Run the following command to deploy the change. Replace the namespace if you are using your existing AKS application.
+
+ ```console
+ kubectl apply -f ./Deployment -n appconfig-demo
+ ```
+
+1. Open a browser window, and navigate to the IP address obtained in the [previous step](./quickstart-azure-kubernetes-service.md#deploy-the-application). The web page looks like this:
+
+ ![Screenshot of the web app with old values.](./media/quickstarts/kubernetes-provider-app-launch-after.png)
++
+1. Update the following key-values in your App Configuration store, making sure you update the sentinel key last.
+
+ | Key | Value |
+    |---|---|
+ | Settings:Message | Hello from Azure App Configuration - now with live updates! |
+ | Settings:Sentinel | 2 |
+
+1. Refresh the browser a few times. Once the ConfigMap is updated (within 30 seconds), you'll see the updated content.
+
+ ![Screenshot of the web app with updated values.](./media/quickstarts/kubernetes-provider-app-launch-dynamic-after.png)
+
+## Reload ConfigMap and Secret
+
+App Configuration Kubernetes provider generates ConfigMaps or Secrets that can be used as environment variables or volume-mounted files. This tutorial demonstrated how to load configuration from a JSON file using the [.NET JSON configuration provider](/dotnet/core/extensions/configuration-providers#json-configuration-provider), which automatically reloads the configuration whenever a change is detected in the mounted file. As a result, your application gets the updated configuration automatically whenever the App Configuration Kubernetes provider updates the ConfigMap.
+
+If your application is dependent on environment variables for configuration, it may require a restart to pick up any updated values. In Kubernetes, the application restart can be orchestrated using rolling updates on the corresponding pods or containers. To automate configuration updates, you may leverage third-party tools like [stakater/Reloader](https://github.com/stakater/Reloader), which can automatically trigger rolling updates upon any changes made to ConfigMaps or Secrets.
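For example, opting a workload into stakater/Reloader is a single annotation on the Deployment (annotation name per the Reloader project's documentation; confirm it against the version you install, and note the Deployment name here is hypothetical):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: aspnetapp-demo        # hypothetical name for illustration
  annotations:
    # Ask Reloader to roll this Deployment's pods whenever a ConfigMap
    # or Secret it references changes.
    reloader.stakater.com/auto: "true"
```

When the App Configuration Kubernetes provider rewrites the ConfigMap, Reloader triggers a rolling update so containers that read configuration from environment variables pick up the new values.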
+
+## Next steps
+
+To learn more about the Azure App Configuration Kubernetes Provider, see [Azure App Configuration Kubernetes Provider reference](./reference-kubernetes-provider.md).
azure-arc Create Postgresql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/create-postgresql-server.md
There are important topics you may want read before you proceed with creation:
- [Storage configuration and Kubernetes storage concepts](storage-configuration.md) - [Kubernetes resource model](https://github.com/kubernetes/design-proposals-archive/blob/main/scheduling/resources.md#resource-quantities)
-If you prefer to try out things without provisioning a full environment yourself, get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_data/) on Azure Kubernetes Service (AKS), AWS Elastic Kubernetes Service (EKS), Google Cloud Kubernetes Engine (GKE) or in an Azure VM.
+If you prefer to try out things without provisioning a full environment yourself, get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_data) on Azure Kubernetes Service (AKS), AWS Elastic Kubernetes Service (EKS), Google Cloud Kubernetes Engine (GKE) or in an Azure VM.
## Preliminary and temporary step for OpenShift users only
azure-arc Least Privilege https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/least-privilege.md
kubectl logs <pod name> --namespace arc
You have several additional options for creating the Azure Arc data controller: > **Just want to try things out?**
-> Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_data/) on AKS, Amazon EKS, or GKE, or in an Azure VM.
+> Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_data) on AKS, Amazon EKS, or GKE, or in an Azure VM.
> - [Create a data controller in direct connectivity mode with the Azure portal](create-data-controller-direct-prerequisites.md)
azure-arc Limitations Managed Instance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/limitations-managed-instance.md
__Why doesn't Microsoft provide SLAs on Azure Arc hybrid services?__ Customers a
## Next steps -- **Try it out.** Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_data/) on Azure Kubernetes Service (AKS), AWS Elastic Kubernetes Service (EKS), Google Cloud Kubernetes Engine (GKE) or in an Azure VM.
+- **Try it out.** Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_data) on Azure Kubernetes Service (AKS), AWS Elastic Kubernetes Service (EKS), Google Cloud Kubernetes Engine (GKE) or in an Azure VM.
- **Create your own.** Follow these steps to create on your own Kubernetes cluster: 1. [Install the client tools](install-client-tools.md)
azure-arc Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/overview.md
To see the regions that currently support Azure Arc-enabled data services, go to
## Next steps > **Just want to try things out?**
-> Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_data/) on Azure Kubernetes Service (AKS), AWS Elastic Kubernetes Service (EKS), Google Cloud Kubernetes Engine (GKE) or in an Azure VM.
+> Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_data) on Azure Kubernetes Service (AKS), AWS Elastic Kubernetes Service (EKS), Google Cloud Kubernetes Engine (GKE) or in an Azure VM.
>
->In addition, deploy [Jumpstart ArcBox for DataOps](https://aka.ms/ArcBoxDataOps), an easy to deploy sandbox for all things Azure Arc-enabled SQL Managed Instance. ArcBox is designed to be completely self-contained within a single Azure subscription and resource group, which will make it easy for you to get hands-on with all available Azure Arc-enabled technology with nothing more than an available Azure subscription.
+>In addition, deploy [Jumpstart ArcBox for DataOps](https://azurearcjumpstart.com/azure_jumpstart_arcbox/DataOps), an easy to deploy sandbox for all things Azure Arc-enabled SQL Managed Instance. ArcBox is designed to be completely self-contained within a single Azure subscription and resource group, which will make it easy for you to get hands-on with all available Azure Arc-enabled technology with nothing more than an available Azure subscription.
[Install the client tools](install-client-tools.md)
azure-arc Plan Azure Arc Data Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/plan-azure-arc-data-services.md
Currently, only one Azure Arc data controller per Kubernetes cluster is supporte
You have several additional options for creating the Azure Arc data controller: > **Just want to try things out?**
-> Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_data/) on AKS, Amazon EKS, or GKE, or in an Azure VM.
+> Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_data) on AKS, Amazon EKS, or GKE, or in an Azure VM.
> - [Create a data controller in direct connectivity mode with the Azure portal](create-data-controller-direct-prerequisites.md)
azure-arc Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/release-notes.md
This section describes the new features introduced or enabled for this release.
## Next steps > **Just want to try things out?**
-> Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_data/) on AKS, AWS Elastic Kubernetes Service (EKS), Google Cloud Kubernetes Engine (GKE) or in an Azure VM.
+> Get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_data) on AKS, AWS Elastic Kubernetes Service (EKS), Google Cloud Kubernetes Engine (GKE) or in an Azure VM.
- [Install the client tools](install-client-tools.md) - [Plan an Azure Arc-enabled data services deployment](plan-azure-arc-data-services.md) (requires installing the client tools first)
azure-arc Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/kubernetes/faq.md
The feature to enable storing customer data in a single region is currently only
* Already have an AKS cluster or an Azure Arc-enabled Kubernetes cluster? [Create GitOps configurations on your Azure Arc-enabled Kubernetes cluster](./tutorial-use-gitops-flux2.md). * Learn how to [setup a CI/CD pipeline with GitOps](./tutorial-gitops-flux2-ci-cd.md). * Learn how to [use Azure Policy to apply configurations at scale](./use-azure-policy.md).
-* Experience Azure Arc-enabled Kubernetes automated scenarios with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_k8s/).
+* Experience Azure Arc-enabled Kubernetes automated scenarios with [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_k8s).
azure-arc Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/kubernetes/overview.md
Once your Kubernetes clusters are connected to Azure, at scale you can:
## Next steps * Learn about best practices and design patterns through the [Cloud Adoption Framework for hybrid and multicloud](/azure/cloud-adoption-framework/scenarios/hybrid/arc-enabled-kubernetes/eslz-arc-kubernetes-identity-access-management).
-* Try out Arc-enabled Kubernetes without provisioning a full environment by using the [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_k8s/).
+* Try out Arc-enabled Kubernetes without provisioning a full environment by using the [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_k8s).
* [Connect an existing Kubernetes cluster to Azure Arc](quickstart-connect-cluster.md).
azure-arc Quickstart Connect Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/kubernetes/quickstart-connect-cluster.md
ms.devlang: azurecli
Get started with Azure Arc-enabled Kubernetes by using Azure CLI or Azure PowerShell to connect an existing Kubernetes cluster to Azure Arc.
-For a conceptual look at connecting clusters to Azure Arc, see [Azure Arc-enabled Kubernetes agent overview](./conceptual-agent-overview.md). To try things out in a sample/practice experience, visit the [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_k8s/).
+For a conceptual look at connecting clusters to Azure Arc, see [Azure Arc-enabled Kubernetes agent overview](./conceptual-agent-overview.md). To try things out in a sample/practice experience, visit the [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_k8s).
## Prerequisites
Remove-AzConnectedKubernetes -ClusterName AzureArcTest1 -ResourceGroupName Azure
* Learn how to [deploy configurations using GitOps with Flux v2](tutorial-use-gitops-flux2.md). * [Troubleshoot common Azure Arc-enabled Kubernetes issues](troubleshooting.md).
-* Experience Azure Arc-enabled Kubernetes automated scenarios with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_k8s/).
+* Experience Azure Arc-enabled Kubernetes automated scenarios with [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_k8s).
azure-arc Deployment Options https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/deployment-options.md
Be sure to review the basic [prerequisites](prerequisites.md) and [network confi
* Learn about the Azure Connected Machine agent [prerequisites](prerequisites.md) and [network requirements](network-requirements.md). * Review the [Planning and deployment guide for Azure Arc-enabled servers](plan-at-scale-deployment.md) * Learn about [reconfiguring, upgrading, and removing the Connected Machine agent](manage-agent.md).
-* Try out Arc-enabled servers by using the [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_servers/).
+* Try out Arc-enabled servers by using the [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_servers).
azure-arc Quick Enable Hybrid Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/learn/quick-enable-hybrid-vm.md
Get started with [Azure Arc-enabled servers](../overview.md) to manage and gover
In this quickstart, you'll deploy and configure the Azure Connected Machine agent on a Windows or Linux machine hosted outside of Azure, so that it can be managed through Azure Arc-enabled servers. > [!TIP]
-> If you prefer to try out things in a sample/practice experience, get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_servers/).
+> If you prefer to try out things in a sample/practice experience, get started quickly with [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_servers).
## Prerequisites
azure-arc Onboard Dsc https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/onboard-dsc.md
Using [Windows PowerShell Desired State Configuration](/powershell/dsc/getting-started/winGettingStarted) (DSC), you can automate software installation and configuration for a Windows computer. This article describes how to use DSC to install the Azure Connected Machine agent on hybrid Windows machines.
+>[!NOTE]
+> The PowerShell module described in this article is not currently supported by Microsoft. Any changes or improvements are handled on a best-effort basis by the community.
+>
+ ## Requirements - Windows PowerShell version 4.0 or higher
Using [Windows PowerShell Desired State Configuration](/powershell/dsc/getting-s
## Install the ConnectedMachine DSC module
-1. To manually install the module, download the source code and unzip the contents of the project directory to the
-`$env:ProgramFiles\WindowsPowerShell\Modules folder`. Or, run the following command to install from the PowerShell gallery using PowerShellGet (in PowerShell 5.0):
+1. To manually install the module, download the source code from GitHub and save the contents to the
+`$env:ProgramFiles\WindowsPowerShell\Modules` folder.
```powershell
- Find-Module -Name AzureConnectedMachineDsc -Repository PSGallery | Install-Module
+ git clone https://github.com/azure/AzureConnectedMachineDsc
``` 2. To confirm installation, run the following command and ensure you see the Azure Connected Machine DSC resources available.
azure-arc Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/overview.md
An outage of Azure Arc won't affect the customer workload itself; only managemen
## Next steps * Before evaluating or enabling Azure Arc-enabled servers across multiple hybrid machines, review the [Connected Machine agent overview](agent-overview.md) to understand requirements, technical details about the agent, and deployment methods.
-* Try out Arc-enabled servers by using the [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_servers/).
+* Try out Arc-enabled servers by using the [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_servers).
* Review the [Planning and deployment guide](plan-at-scale-deployment.md) to plan for deploying Azure Arc-enabled servers at any scale and implement centralized management and monitoring. * Explore the [Azure Arc landing zone accelerator for hybrid and multicloud](/azure/cloud-adoption-framework/scenarios/hybrid/arc-enabled-servers/eslz-identity-and-access-management).
azure-arc Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/vmware-vsphere/overview.md
Azure Arc-enabled VMware vSphere doesn't store/process customer data outside the
- Plan your resource bridge deployment by reviewing the [support matrix for Arc-enabled VMware vSphere](support-matrix-for-arc-enabled-vmware-vsphere.md). - Once ready, [connect VMware vCenter to Azure Arc using the helper script](quick-start-connect-vcenter-to-arc-using-script.md).-- Try out Arc-enabled VMware vSphere by using the [Azure Arc Jumpstart](https://azurearcjumpstart.io/azure_arc_jumpstart/azure_arc_vsphere/).
+- Try out Arc-enabled VMware vSphere by using the [Azure Arc Jumpstart](https://azurearcjumpstart.com/azure_arc_jumpstart/azure_arc_vsphere).
azure-maps Open Source Projects https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-maps/open-source-projects.md
Find more open-source Azure Maps projects.
> [!div class="nextstepaction"] > [Code samples]
-[Azure Maps & Azure Active Directory Samples]: https://github.com/Azure-Samples/Azure-Maps-AzureAD-Samples
+[Azure Maps & Microsoft Entra ID Samples]: https://github.com/Azure-Samples/Azure-Maps-AzureAD-Samples
[Azure Maps .NET UWP IoT Remote Control]: https://github.com/Azure-Samples/azure-maps-dotnet-webgl-uwp-iot-remote-control [Azure Maps Animation module]: https://github.com/Azure-Samples/azure-maps-animations [Azure Maps Bring Data Into View Control module]: https://github.com/Azure-Samples/azure-maps-bring-data-into-view-control
azure-monitor Change Analysis Enable https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/change/change-analysis-enable.md
description: Use Change Analysis in Azure Monitor to track and troubleshoot issu
-ms.contributor: cawa
Previously updated : 08/23/2022 Last updated : 11/17/2023 -+ # Enable Change Analysis
Register the `Microsoft.ChangeAnalysis` resource provider with an Azure Resource
- Enter any UI entry point, like the Web App **Diagnose and Solve Problems** tool, or - Bring up the Change Analysis standalone tab.
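The registration mentioned above can be sketched with the Azure CLI (the resource provider name comes from this article; running it requires an Azure subscription, once per subscription):

```shell
# Register the Change Analysis resource provider, then confirm its state.
az provider register --namespace Microsoft.ChangeAnalysis
az provider show --namespace Microsoft.ChangeAnalysis --query registrationState --output tsv
```

Registration is asynchronous, so the second command may report `Registering` for a few minutes before settling on `Registered`.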
-In this guide, you'll learn the two ways to enable Change Analysis for Azure Functions and web app in-guest changes:
-- For one or a few Azure Functions or web apps, enable Change Analysis via the UI.-- For a large number of web apps (for example, 50+ web apps), enable Change Analysis using the provided PowerShell script.
+In this guide, you learn two ways to enable Change Analysis for Azure Functions and web app in-guest changes:
+- For one or a few Azure Functions or web apps, [enable Change Analysis via the UI](#enable-azure-functions-and-web-app-in-guest-change-collection-via-the-change-analysis-portal).
+- For a large number of web apps (for example, 50+ web apps), [enable Change Analysis using the provided PowerShell script](#enable-change-analysis-at-scale-using-powershell).
> [!NOTE] > Slot-level enablement for Azure Functions or web app is not supported at the moment. ## Enable Azure Functions and web app in-guest change collection via the Change Analysis portal
-For web app in-guest changes, separate enablement is required for scanning code files within a web app. For more information, see [Change Analysis in the Diagnose and solve problems tool](change-analysis-visualizations.md#diagnose-and-solve-problems-tool) section.
+For web app in-guest changes, separate enablement is required for scanning code files within a web app. For more information, see [Change Analysis in the Diagnose and solve problems tool](change-analysis-visualizations.md#view-changes-using-the-diagnose-and-solve-problems-tool) section.
> [!NOTE] > You may not immediately see web app in-guest file changes and configuration changes. Prepare for downtime and restart your web app to view changes within 30 minutes. If you still can't see changes, refer to [the troubleshooting guide](./change-analysis-troubleshoot.md#cannot-see-in-guest-changes-for-newly-enabled-web-app).
For web app in-guest changes, separate enablement is required for scanning code
If your subscription includes several web apps, run the following script to enable *all web apps* in your subscription.
-### Pre-requisites
+### Prerequisites
PowerShell Az Module. Follow instructions at [Install the Azure PowerShell module](/powershell/azure/install-azure-powershell)
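A minimal sketch of installing that prerequisite from a shell, assuming PowerShell 7 (`pwsh`) is already installed:

```shell
# Install the Az module for the current user from the PowerShell Gallery.
pwsh -Command "Install-Module -Name Az -Repository PSGallery -Scope CurrentUser -Force"
```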
This section provides answers to common questions.
### How can I enable Change Analysis for a web application?
-Enable Change Analysis for web application in guest changes by using the [Diagnose and solve problems tool](./change-analysis-visualizations.md#diagnose-and-solve-problems-tool).
+Enable Change Analysis for web application in guest changes by using the [Diagnose and solve problems tool](./change-analysis-visualizations.md#view-changes-using-the-diagnose-and-solve-problems-tool).
## Next steps
azure-monitor Change Analysis Track Outages https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/change/change-analysis-track-outages.md
Title: Track a web app outage using Change Analysis
+ Title: "Tutorial: Track a web app outage using Change Analysis"
description: Describes how to identify the root cause of a web app outage using Azure Monitor Change Analysis.
-ms.contributor: cawa
- Previously updated : 01/11/2023 Last updated : 11/17/2023
-# Track a web app outage using Change Analysis
+# Tutorial: Track a web app outage using Change Analysis
-When issues happen, one of the first things to check is what changed in application, configuration and resources to triage and root cause issues. Change Analysis provides a centralized view of the changes in your subscriptions for up to the past 14 days to provide the history of changes for troubleshooting issues.
+When your application runs into an issue, you need to check what changed in your application, configuration, and resources to triage and find the root cause. Change Analysis provides a centralized view of the changes in your subscriptions for up to the past 14 days to provide the history of changes for troubleshooting issues.
To track an outage, we will:
> - Enable Change Analysis to track changes for Azure resources and for Azure Web App configurations > - Troubleshoot a Web App issue using Change Analysis
-## Pre-requisites
+## Prerequisites
- Install [.NET 7.0 or above](https://dotnet.microsoft.com/download). - Install [the Azure CLI](/cli/azure/install-azure-cli).
To track an outage, we will:
1. In your preferred terminal, log in to your Azure subscription.
-```bash
-az login
-az account set --s {azure-subscription-id}
-```
+ ```bash
+ az login
+ az account set -s {azure-subscription-id}
+ ```
1. Clone the [sample web application with storage to test Change Analysis](https://github.com/Azure-Samples/changeanalysis-webapp-storage-sample).
-```bash
-git clone https://github.com/Azure-Samples/changeanalysis-webapp-storage-sample.git
-```
+ ```bash
+ git clone https://github.com/Azure-Samples/changeanalysis-webapp-storage-sample.git
+ ```
1. Change the working directory to the project folder.
-```bash
-cd changeanalysis-webapp-storage-sample
-```
+ ```bash
+ cd changeanalysis-webapp-storage-sample
+ ```
### Run the PowerShell script
-1. Open `Publish-WebApp.ps1`.
+1. In the project folder, open `Publish-WebApp.ps1`.
1. Edit the `SUBSCRIPTION_ID` and `LOCATION` environment variables.
cd changeanalysis-webapp-storage-sample
| `SUBSCRIPTION_ID` | Your Azure subscription ID. | | `LOCATION` | The location of the resource group where you'd like to deploy the sample application. |
+1. Save your changes.
+ 1. Run the script from the `./changeanalysis-webapp-storage-sample` directory. ```bash
cd changeanalysis-webapp-storage-sample
## Enable Change Analysis
-In the Azure portal, [navigate to the Change Analysis standalone UI](./change-analysis-visualizations.md). Page loading may take a few minutes as the `Microsoft.ChangeAnalysis` resource provider is registered.
+In the Azure portal, [navigate to the Change Analysis standalone UI](./change-analysis-visualizations.md#access-change-analysis-screens). Page loading may take a few minutes as the `Microsoft.ChangeAnalysis` resource provider is registered.
:::image type="content" source="./media/change-analysis/change-analysis-blade.png" alt-text="Screenshot of Change Analysis in Azure portal.":::
Visit the web app URL to view the following error:
## Troubleshoot the outage using Change Analysis
-In the Azure portal, navigate to the Change Analysis overview page. Since you've triggered a web app outage, you'll see an entry of change for `AzureStorageConnection`:
+In the Azure portal, navigate to the Change Analysis overview page. Since you triggered a web app outage, you can see a change entry for `AzureStorageConnection`:
:::image type="content" source="./media/change-analysis/entry-of-outage.png" alt-text="Screenshot of outage entry on the Change Analysis pane.":::
Since the connection string is a secret value, we hide it on the overview page f
The change details pane also shows important information, including who made the change.
-Now that you've discovered the web app in-guest change and understand next steps, you can proceed with troubleshooting the issue.
+Now that you discovered the web app in-guest change, you can proceed with troubleshooting the issue.
## Virtual network changes
azure-monitor Change Analysis Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/change/change-analysis-troubleshoot.md
description: Learn how to troubleshoot problems in Azure Monitor's Change Analys
-ms.contributor: cawa
Previously updated : 09/21/2022 Last updated : 11/17/2023 - # Troubleshoot Azure Monitor's Change Analysis
azure-monitor Change Analysis Visualizations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/change/change-analysis-visualizations.md
Title: Scenarios for using Change Analysis in Azure Monitor
+ Title: View and use Change Analysis in Azure Monitor
description: Learn the various scenarios in which you can use Azure Monitor's Change Analysis.
-ms.contributor: cawa
Previously updated : 01/12/2023 Last updated : 11/17/2023 --+
-# Scenarios for using Change Analysis in Azure Monitor
+# View and use Change Analysis in Azure Monitor
-Change Analysis provides data for various management and troubleshooting scenarios to help you understand what changes to your application might have caused the issues.
+Change Analysis provides data for various management and troubleshooting scenarios to help you understand what changes to your application caused breaking issues.
## View Change Analysis data
-You can access the Change Analysis overview portal under Azure Monitor, where you can view all changes and application dependency/resource insights. You can access Change Analysis through a couple of entry points:
+### Access Change Analysis screens
-### Monitor home page
+You can access the Change Analysis overview portal under Azure Monitor, where you can view all changes and application dependency/resource insights. You can access Change Analysis through two entry points:
+
+#### Via the Azure Monitor home page
From the Azure portal home page, select **Monitor** from the menu.
In the Monitor overview page, select the **Change Analysis** card.
:::image type="content" source="./media/change-analysis/change-analysis-monitor-overview.png" alt-text="Screenshot of selecting the Change Analysis card on the Monitor overview page.":::
-### Search
+#### Via search
In the Azure portal, search for Change Analysis to launch the experience.
The UI supports selecting multiple subscriptions to view resource changes. Use t
:::image type="content" source="./media/change-analysis/multiple-subscriptions-support.png" alt-text="Screenshot of subscription filter that supports selecting multiple subscriptions":::
-## Diagnose and solve problems tool
+### View the Activity Log change history
+
+Use the [View change history](../essentials/activity-log.md#view-change-history) feature to call the Azure Monitor Change Analysis service backend to view changes associated with an operation. Changes returned include:
+
+- Resource level changes from [Azure Resource Graph](../../governance/resource-graph/overview.md).
+- Resource properties from [Azure Resource Manager](../../azure-resource-manager/management/overview.md).
+- In-guest changes from PaaS services, such as a web app.
+
+1. From within your resource, select **Activity Log** from the side menu.
+1. Select a change from the list.
+1. Select the **Change history** tab.
+1. For the Azure Monitor Change Analysis service to scan for changes in users' subscriptions, a resource provider needs to be registered. When you select the **Change history** tab, the tool automatically registers the **Microsoft.ChangeAnalysis** resource provider.
+1. Once registered, you can immediately view changes from **Azure Resource Graph** for the past 14 days.
+ - Changes from other sources are available about 4 hours after the subscription is onboarded.
+
+ :::image type="content" source="./media/change-analysis/activity-log-change-history.png" alt-text="Screenshot showing Activity Log change history integration.":::
+
+### View changes using the Diagnose and Solve Problems tool
-From your resource's overview page in Azure portal, you can view change data by selecting **Diagnose and solve problems** the left menu. As you enter the Diagnose and Solve Problems tool, the **Microsoft.ChangeAnalysis** resource provider will automatically be registered.
+From your resource's overview page in Azure portal, you can view change data by selecting **Diagnose and solve problems** in the left menu. As you enter the Diagnose and Solve Problems tool, the **Microsoft.ChangeAnalysis** resource provider is automatically registered.
-### Diagnose and solve problems tool for Web App
+Learn how to use the Diagnose and Solve Problems tool for:
+- [Web App](#diagnose-and-solve-problems-tool-for-web-app)
+- [Virtual Machines](#diagnose-and-solve-problems-tool-for-virtual-machines)
+- [Azure SQL Database and other resources](#diagnose-and-solve-problems-tool-for-azure-sql-database-and-other-resources)
+
+#### Diagnose and solve problems tool for Web App
Azure Monitor's Change Analysis is: - A standalone detector in the Web App **Diagnose and solve problems** tool.
You can view change data via the **Web App Down** and **Application Crashes** de
By default, the graph displays changes from within the past 24 hours to help with immediate problems.
-### Diagnose and solve problems tool for Virtual Machines
+#### Diagnose and solve problems tool for Virtual Machines
Change Analysis displays as an insight card in your virtual machine's **Diagnose and solve problems** tool. The insight card displays the number of changes or issues a resource experiences within the past 72 hours.
Change Analysis displays as an insight card in your virtual machine's **Diagnose
:::image type="content" source="./media/change-analysis/analyze-recent-changes.png" alt-text="Change analyzer in troubleshooting tools":::
-### Diagnose and solve problems tool for Azure SQL Database and other resources
+#### Diagnose and solve problems tool for Azure SQL Database and other resources
-You can view Change Analysis data for [multiple Azure resources](./change-analysis.md#supported-resource-types), but we highlight Azure SQL Database below.
+You can view Change Analysis data for [multiple Azure resources](./change-analysis.md#supported-resource-types), but we highlight Azure SQL Database in these steps.
1. Within your resource, select **Diagnose and solve problems** from the left menu. 1. Under **Common problems**, select **View change details** to view the filtered view from Change Analysis standalone UI. :::image type="content" source="./media/change-analysis/change-insight-diagnose-and-solve.png" alt-text="Screenshot of viewing common problems in Diagnose and Solve Problems tool.":::
-## Activity Log change history
+## Activities using Change Analysis
-Use the [View change history](../essentials/activity-log.md#view-change-history) feature to call the Azure Monitor Change Analysis service backend to view changes associated with an operation. Changes returned include:
+### Integrate with VM Insights
-- Resource level changes from [Azure Resource Graph](../../governance/resource-graph/overview.md).-- Resource properties from [Azure Resource Manager](../../azure-resource-manager/management/overview.md).-- In-guest changes from PaaS services, such as a web app.-
-1. From within your resource, select **Activity Log** from the side menu.
-1. Select a change from the list.
-1. Select the **Change history** tab.
-1. For the Azure Monitor Change Analysis service to scan for changes in users' subscriptions, a resource provider needs to be registered. Upon selecting the **Change history** tab, the tool will automatically register **Microsoft.ChangeAnalysis** resource provider.
-1. Once registered, you can view changes from **Azure Resource Graph** immediately from the past 14 days.
- - Changes from other sources will be available after ~4 hours after subscription is onboard.
-
- :::image type="content" source="./media/change-analysis/activity-log-change-history.png" alt-text="Activity Log change history integration":::
-
-## VM Insights integration
-
-If you've enabled [VM Insights](../vm/vminsights-overview.md), you can view changes in your virtual machines that may have caused any spikes in a metric chart, such as CPU or Memory.
+If you enabled [VM Insights](../vm/vminsights-overview.md), you can view changes in your virtual machines that might have caused spikes in a metric chart, such as CPU or memory.
1. Within your virtual machine, select **Insights** from under **Monitoring** in the left menu. 1. Select the **Performance** tab.
If you've enabled [VM Insights](../vm/vminsights-overview.md), you can view chan
:::image type="content" source="./media/change-analysis/vm-insights-2.png" alt-text="View of the property panel, selecting Investigate Changes button.":::
-## Drill to Change Analysis logs
+### Drill to Change Analysis logs
You can also drill to Change Analysis logs via a chart you've created or pinned to your resource's **Monitoring** dashboard.
You can also drill to Change Analysis logs via a chart you've created or pinned
:::image type="content" source="./media/change-analysis/view-change-analysis-2.png" alt-text="Drill into logs and select to view Change Analysis.":::
-## Browse using custom filters and search bar
+### Browse using custom filters and search bar
Browsing through a long list of changes in the entire subscription is time consuming. With Change Analysis custom filters and search capability, you can efficiently navigate to changes relevant to issues for troubleshooting. :::image type="content" source="./media/change-analysis/filters-search-bar.png" alt-text="Screenshot showing that filters and search bar are available at the top of Change Analysis homepage, right above the changes section.":::
-### Filters
+#### Filters
| Filter | Description | | | -- |
Browsing through a long list of changes in the entire subscription is time consu
| Resource | Select **Add filter** to use this filter. </br> Filter the changes to specific resources. Helpful if you already know which resources to look at for changes. [If the filter is only returning 1,000 resources, see the corresponding solution in troubleshooting guide](./change-analysis-troubleshoot.md#cant-filter-to-your-resource-to-view-changes). | | Resource type | Select **Add filter** to use this filter. </br> Filter the changes to specific resource types. |
-### Search bar
+#### Search bar
The search bar filters the changes according to the input keywords. Search results apply only to the changes already loaded by the page and don't pull in results from the server side.
-## Pin and share a Change Analysis query to the Azure dashboard
+### Pin and share a Change Analysis query to the Azure dashboard
Let's say you want to curate a change view on specific resources, like all Virtual Machine changes in your subscription, and include it in a report sent periodically. You can pin the view to an Azure dashboard for monitoring or sharing scenarios. If you'd like to share a specific change with your team members, you can use the share feature in the Change Details page.
-## Pin to the Azure dashboard
+### Pin to the Azure dashboard
-Once you have applied filters to the Change Analysis homepage:
+After you apply filters to the Change Analysis homepage:
1. Select **Pin current filters** from the top menu. 1. Enter a name for the pin.
Once you have applied filters to the Change Analysis homepage:
:::image type="content" source="./media/change-analysis/click-pin-menu.png" alt-text="Screenshot of selecting Pin current filters button in Change Analysis.":::
-A side pane will open to configure the dashboard where you'll place your pin. You can select one of two dashboard types:
+A side pane opens to configure the dashboard where you place your pin. You can select one of two dashboard types:
| Dashboard type | Description | | -- | -- | | Private | Only you can access a private dashboard. Choose this option if you're creating the pin for your own easy access to the changes. | | Shared | A shared dashboard supports role-based access control for view/read access. Shared dashboards are created as a resource in your subscription with a region and resource group to host it. Choose this option if you're creating the pin to share with your team. |
-### Select an existing dashboard
+#### Select an existing dashboard
If you already have a dashboard to place the pin: 1. Select the **Existing** tab. 1. Select either **Private** or **Shared**. 1. Select the dashboard you'd like to use.
-1. If you've selected **Shared**, select the subscription in which you'd like to place the dashboard.
+1. If you selected **Shared**, select the subscription in which you'd like to place the dashboard.
1. Select **Pin**. :::image type="content" source="./media/change-analysis/existing-dashboard-small.png" alt-text="Screenshot of selecting an existing dashboard to pin your changes to. ":::
-### Create a new dashboard
+#### Create a new dashboard
You can create a new dashboard for this pin.
You can create a new dashboard for this pin.
Once the dashboard and pin are created, navigate to the Azure dashboard to view them.
-1. From the Azure portal home menu, select **Dashboard**. Use the **Manage Sharing** button in the top menu to handle access or "unshare". Click on the pin to navigate to the curated view of changes.
+1. From the Azure portal home menu, select **Dashboard**.
+1. Use the **Manage Sharing** button in the top menu to handle access or "unshare".
+1. Click on the pin to navigate to the curated view of changes.
:::image type="content" source="./media/change-analysis/azure-dashboard.png" alt-text="Screenshot of selecting the Dashboard in the Azure portal home menu."::: :::image type="content" source="./media/change-analysis/view-share-dashboard.png" alt-text="Screenshot of the pin in the dashboard.":::
-## Share a single change with your team
+### Share a single change with your team
In the Change Analysis homepage, select a line of change to view details on the change.
In the Change Analysis homepage, select a line of change to view details on the
:::image type="content" source="./media/change-analysis/share-single-change.png" alt-text="Screenshot of selecting the share button on the dashboard and copying link."::: - ## Next steps - Learn how to [troubleshoot problems in Change Analysis](change-analysis-troubleshoot.md)
azure-monitor Change Analysis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/change/change-analysis.md
description: Use Change Analysis in Azure Monitor to troubleshoot issues on live
-ms.contributor: cawa
Previously updated : 11/15/2022 Last updated : 11/17/2023 # Use Change Analysis in Azure Monitor
-While standard monitoring solutions might alert you to a live site issue, outage, or component failure, they often don't explain the cause. For example, your site worked five minutes ago, and now it's broken. What changed in the last five minutes?
+While standard monitoring solutions might alert you to a live site issue, outage, or component failure, they often don't explain the cause. Let's say your site worked five minutes ago, and now it's broken. What changed in the last five minutes?
-We've designed Change Analysis to answer that question in Azure Monitor.
+Change Analysis is designed to answer that question in Azure Monitor.
Building on the power of [Azure Resource Graph](../../governance/resource-graph/overview.md), Change Analysis: - Provides insights into your Azure application changes.
Building on the power of [Azure Resource Graph](../../governance/resource-graph/
Change Analysis detects various types of changes, from the infrastructure layer through application deployment. Change Analysis is a subscription-level Azure resource provider that: - Checks resource changes in the subscription. -- Provides data for various diagnostic tools to help users understand what changes might have caused issues.
+- Provides data for various diagnostic tools to help users understand what changes caused issues.
The following diagram illustrates the architecture of Change Analysis:
Change Analysis also tracks [resource dependency changes](#dependency-changes) t
### Azure Resource Manager resource properties changes
-Using [Azure Resource Graph](../../governance/resource-graph/overview.md), Change Analysis provides a historical record of how the Azure resources that host your application have changed over time. The following basic configuration settings are set using Azure Resource Manager and tracked by Azure Resource Graph:
+Using [Azure Resource Graph](../../governance/resource-graph/overview.md), Change Analysis provides a historical record of how the Azure resources that host your application changed over time. The following basic configuration settings are set using Azure Resource Manager and tracked by Azure Resource Graph:
- Managed identities - Platform OS upgrade - Hostnames
In addition to the settings set via Azure Resource Manager, you can set configur
- TLS settings - Extension versions
-These setting changes are not captured by Azure Resource Graph. Change Analysis fills this gap by capturing snapshots of changes in those main configuration properties, like changes to the connection string, etc. Snapshots are taken of configuration changes and change details every up to 6 hours. [See known limitations.](#limitations)
+Azure Resource Graph doesn't capture these setting changes. Change Analysis fills this gap by capturing snapshots of changes in those main configuration properties, like changes to the connection string. Snapshots of configuration changes and change details are taken up to every 6 hours.
+
+[See known limitations regarding resource configuration change analysis.](#limitations)
### Changes in Azure Function and Web Apps (in-guest changes)
Every 30 minutes, Change Analysis captures the configuration state of a web appl
:::image type="content" source="./media/change-analysis/scan-changes.png" alt-text="Screenshot of the selecting the Refresh button to view latest changes.":::
-If you don't see file changes within 30 minutes or configuration changes within 6 hours, refer to [our troubleshooting guide](./change-analysis-troubleshoot.md#cannot-see-in-guest-changes-for-newly-enabled-web-app).
+Refer to [our troubleshooting guide](./change-analysis-troubleshoot.md#cannot-see-in-guest-changes-for-newly-enabled-web-app) if you don't see:
+- File changes within 30 minutes
+- Configuration changes within 6 hours
-[See known limitations.](#limitations)
+[See known limitations regarding in-guest change analysis.](#limitations)
Currently, all text-based files under site root **wwwroot** with the following extensions are supported:
Currently, all text-based files under site root **wwwroot** with the following e
Changes to resource dependencies can also cause issues in a resource. For example, if a web app calls into a Redis cache, the Redis cache SKU could affect the web app performance.
-As another example, if port 22 was closed in a virtual machine's Network Security Group, it will cause connectivity errors.
+As another example, if port 22 is closed in a virtual machine's Network Security Group, it causes connectivity errors.
#### Web App diagnose and solve problems navigator (preview)
-To detect changes in dependencies, Change Analysis checks the web app's DNS record. In this way, it identifies changes in all app components that could cause issues.
+Change Analysis checks the web app's DNS record to detect changes in dependencies and app components that could cause issues.
-Currently the following dependencies are supported in **Web App Diagnose and solve problems | Navigator**:
+Currently, the following dependencies are supported in **Web App Diagnose and solve problems | Navigator**:
- Web Apps - Azure Storage
Currently the following dependencies are supported in **Web App Diagnose and sol
- **Web app deployment changes**: Code deployment change information might not be available immediately in the Change Analysis tool. To view the latest changes in Change Analysis, select **Refresh**. - **Function and Web App file changes**: File changes take up to 30 minutes to display. - **Function and Web App configuration changes**: Due to the snapshot approach to configuration changes, timestamps of configuration changes could take up to 6 hours to display from when the change actually happened.-- **Web app deployment and configuration changes**: Since these changes are collected by a site extension and stored on disk space owned by your application, data collection and storage is subject to your application's behavior. Check to see if a misbehaving application is affecting the results.-- **Snapshot retention for all changes**: The Change Analysis data for resources is tracked by Azure Resource Graphs (ARG). ARG keeps snapshot history of tracked resources only for 14 days.
+- **Web app deployment and configuration changes**: A site extension collects these changes and stores them on disk space owned by your application. Thus, data collection and storage are subject to your application's behavior. Check to see if a misbehaving application is affecting the results.
+- **Snapshot retention for all changes**: Azure Resource Graph (ARG) tracks the Change Analysis data for resources. ARG keeps snapshot history of tracked resources for only _14 days_.
## Frequently asked questions
azure-monitor Container Insights Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-authentication.md
Title: Configure agent authentication for the Container Insights agent | Microsoft Docs
+ Title: Configure agent authentication for the Container Insights agent
description: This article describes how to configure authentication for the containerized agent used by Container insights. Previously updated : 07/31/2023 Last updated : 10/18/2023
-# Authentication for Container Insights
+# Legacy authentication for Container Insights
-Container Insights now defaults to managed identity authentication. This secure and simplified authentication model has a monitoring agent that uses the cluster's managed identity to send data to Azure Monitor. It replaces the existing legacy certificate-based local authentication and removes the requirement of adding a Monitoring Metrics Publisher role to the cluster.
+Container Insights defaults to managed identity authentication, in which the monitoring agent uses the [cluster's managed identity](../../aks/use-managed-identity.md) to send data to Azure Monitor. It replaced the legacy certificate-based local authentication and removed the requirement of adding a Monitoring Metrics Publisher role to the cluster.
-> [!Note]
-> [ContainerLogV2](container-insights-logging-v2.md) will be default schema for customers who will be onboarding container insights with Managed Identity Auth using ARM, Bicep, Terraform, Policy and Portal onboarding. ContainerLogV2 can be explicitly enabled through CLI version 2.51.0 or higher using Data collection settings.
+This article describes how to migrate to managed identity authentication if you enabled Container insights with the legacy authentication method, and how to enable legacy authentication if you require it.
-## How to enable
+## Migrate to managed identity authentication
-Click on the relevant tab for instructions to enable Managed identity authentication on your clusters.
+If you enabled Container insights before managed identity authentication was available, you can use the following methods to migrate your clusters.
## [Azure portal](#tab/portal-azure-monitor)
-When creating a new cluster from the Azure portal: On the **Integrations** tab, first check the box for *Enable Container Logs*, then check the box for *Use managed identity*.
--
-For existing clusters, you can switch to Managed Identity authentication from the *Monitor settings* panel: Navigate to your AKS cluster, scroll through the menu on the left till you see the **Monitoring** section, there click on the **Insights** tab. In the Insights tab, click on the *Monitor Settings* option and check the box for *Use managed identity*
+You can migrate to managed identity authentication from the *Monitor settings* panel for your AKS cluster. In the **Monitoring** section of the menu, select the **Insights** tab, then select *Monitor Settings* and check the box for *Use managed identity*.
:::image type="content" source="./media/container-insights-authentication/monitor-settings.png" alt-text="Screenshot that shows the settings panel." lightbox="media/container-insights-authentication/monitor-settings.png":::
If you don't see the *Use managed identity* option, you're using an SPN cluster.
## [Azure CLI](#tab/cli)
-See [Migrate to managed identity authentication](container-insights-enable-aks.md?tabs=azure-cli#migrate-to-managed-identity-authentication)
-
-## [Resource Manager template](#tab/arm)
-
-See instructions for migrating
-
-* [AKS clusters](container-insights-enable-aks.md?tabs=arm#existing-aks-cluster)
-* [Arc-enabled clusters](container-insights-enable-arc-enabled-clusters.md?tabs=create-cli%2Cverify-portal%2Cmigrate-arm)
-
-## [Bicep](#tab/bicep)
-
-**Enable Monitoring with MSI without syslog**
-
-1. Download Bicep templates and Parameter files
-
-```
-curl -L https://aka.ms/enable-monitoring-msi-bicep-template -o existingClusterOnboarding.bicep
-curl -L https://aka.ms/enable-monitoring-msi-bicep-parameters -o existingClusterParam.json
-```
-
-2. Edit the values in the parameter file
-
-
-3. Onboard with the following commands:
-
-```
-az login
-az account set --subscription "Subscription Name"
-az deployment group create --resource-group <ClusterResourceGroupName> --template-file ./existingClusterOnboarding.bicep --parameters ./existingClusterParam.json
-```
-
-**Enable Monitoring with MSI with syslog**
-
-1. Download Bicep templates and Parameter files
-
-```
- curl -L https://aka.ms/enable-monitoring-msi-syslog-bicep-template -o existingClusterOnboarding.bicep
- curl -L https://aka.ms/enable-monitoring-msi-syslog-bicep-parameters -o existingClusterParam.json
-```
-
-2. Edit the values in the parameter file
--- **aksResourceId**: Use the values on the AKS Overview page for the AKS cluster.-- **aksResourceLocation**: Use the values on the AKS Overview page for the AKS cluster.-- **workspaceResourceId**: Use the resource ID of your Log Analytics workspace.-- **workspaceRegion**: Use the location of your Log Analytics workspace.-- **resourceTagValues**: Match the existing tag values specified for the existing Container insights extension data collection rule (DCR) of the cluster and the name of the DCR. The name match `MSCI-<clusterName>-<clusterRegion>` and this resource is created in the same resource group as the AKS clusters. For first time onboarding, you can set the arbitrary tag values.--
-3. Onboarding with the following commands:
-
-```
-az login
-az account set --subscription "Subscription Name"
-az deployment group create --resource-group <ClusterResourceGroupName> --template-file ./existingClusterOnboarding.bicep --parameters ./existingClusterParam.json
-```
-
-For new AKS cluster:
-Replace and use the managed cluster resources in this [guide](../../aks/learn/quick-kubernetes-deploy-bicep.md?tabs=azure-cli)
--
-## [Terraform](#tab/terraform)
-
-**Enable Monitoring with MSI without syslog for new AKS cluster**
-
-1. Download Terraform template for enable monitoring msi with syslog enabled:
-https://aka.ms/enable-monitoring-msi-terraform
-2. Adjust the azurerm_kubernetes_cluster resource in main.tf based on what cluster settings you're going to have
-3. Update parameters in variables.tf to replace values in "<>"
-4. Run `terraform init -upgrade` to initialize the Terraform deployment.
-5. Run `terraform plan -out main.tfplan` to initialize the Terraform deployment.
-6. Run `terraform apply main.tfplan` to apply the execution plan to your cloud infrastructure.
-
-**Enable Monitoring with MSI with syslog for new AKS cluster**
-1. Download Terraform template for enable monitoring msi with syslog enabled:
-https://aka.ms/enable-monitoring-msi-syslog-terraform
-2. Adjust the azurerm_kubernetes_cluster resource in main.tf based on what cluster settings you're going to have
-3. Update parameters in variables.tf to replace values in "<>"
-4. Run `terraform init -upgrade` to initialize the Terraform deployment.
-5. Run `terraform plan -out main.tfplan` to initialize the Terraform deployment.
-6. Run `terraform apply main.tfplan` to apply the execution plan to your cloud infrastructure.
-
-**Enable Monitoring with MSI for existing AKS cluster:**
-1. Import the existing cluster resource first with this command: ` terraform import azurerm_kubernetes_cluster.k8s <aksResourceId>`
-2. Add the oms_agent add-on profile to the existing azurerm_kubernetes_cluster resource.
-```
-oms_agent {
- log_analytics_workspace_id = var.workspace_resource_id
- msi_auth_for_monitoring_enabled = true
- }
-```
-3. Copy the dcr and dcra resources from the Terraform templates
-4. Run `terraform plan -out main.tfplan` and make sure the change is adding the oms_agent property. Note: If the azurerm_kubernetes_cluster resource defined is different during terraform plan, the existing cluster will get destroyed and recreated.
-5. Run `terraform apply main.tfplan` to apply the execution plan to your cloud infrastructure.
-
-> [!TIP]
-> - Edit the `main.tf` file appropriately before running the terraform template
-> - Data will start flowing after 10 minutes since the cluster needs to be ready first
-> - WorkspaceID needs to match the format `/subscriptions/12345678-1234-9876-4563-123456789012/resourceGroups/example-resource-group/providers/Microsoft.OperationalInsights/workspaces/workspaceValue`
-> - If resource group already exists, run `terraform import azurerm_resource_group.rg /subscriptions/<Subscription_ID>/resourceGroups/<Resource_Group_Name>` before terraform plan
-
-## [Azure Policy](#tab/policy)
-
-1. Download Azure Policy templates and parameter files using the following commands:
-
-```
-curl -L https://aka.ms/enable-monitoring-msi-azure-policy-template -o azure-policy.rules.json
-curl -L https://aka.ms/enable-monitoring-msi-azure-policy-parameters -o azure-policy.parameters.json
-```
--
-2. Activate the policies:
-
-You can create the policy definition using a command:
-```
-az policy definition create --name "AKS-Monitoring-Addon-MSI" --display-name "AKS-Monitoring-Addon-MSI" --mode Indexed --metadata version=1.0.0 category=Kubernetes --rules azure-policy.rules.json --params azure-policy.parameters.json
-```
-You can create the policy assignment with the following command like:
-```
-az policy assignment create --name aks-monitoring-addon --policy "AKS-Monitoring-Addon-MSI" --assign-identity --identity-scope /subscriptions/<subscriptionId> --role Contributor --scope /subscriptions/<subscriptionId> --location <location> --role Contributor --scope /subscriptions/<subscriptionId> -p "{ \"workspaceResourceId\": { \"value\": \"/subscriptions/<subscriptionId>/resourcegroups/<resourceGroupName>/providers/microsoft.operationalinsights/workspaces/<workspaceName>\" } }"
-```
-
-> [!TIP]
-> - Make sure when performing remediation task, the policy assignment has access to workspace you specified.
-> - Download all files under AddonPolicyTemplate folder before running the policy template.
-> - For assign policy, parameters and remediation task from portal, use the following guides:
-> o After creating the policy definition through the above command, go to Azure portal -> Policy -> Definitions and select the definition you created.
-> o Click on 'Assign' and then go to the 'Parameters' tab and fill in the details. Then click 'Review + Create'.
-> o Once the policy is assigned to the subscription, whenever you create a new cluster, the policy will run and check if Container Insights is enabled. If not, it will deploy the resource. If you want to apply the policy to existing AKS cluster, create a 'Remediation task' for that resource after going to the 'Policy Assignment'.
+### AKS
+For AKS clusters, you must first disable monitoring and then upgrade to managed identity. Only Azure public cloud, Microsoft Azure operated by 21Vianet cloud, and Azure Government cloud are currently supported for this migration. For clusters with a user-assigned identity, only Azure public cloud is supported.
+
+> [!NOTE]
+> Azure CLI version 2.49.0 or higher is required.
+
+1. Get the configured Log Analytics workspace resource ID:
+
+ ```cli
+ az aks show -g <resource-group-name> -n <cluster-name> | grep -i "logAnalyticsWorkspaceResourceID"
+ ```
+
+2. Disable monitoring with the following command:
+
+ ```cli
+ az aks disable-addons -a monitoring -g <resource-group-name> -n <cluster-name>
+ ```
+
+3. If the cluster is using a service principal, upgrade it to system managed identity with the following command:
+
+ ```cli
+ az aks update -g <resource-group-name> -n <cluster-name> --enable-managed-identity
+ ```
+
+4. Enable the monitoring add-on with the managed identity authentication option by using the Log Analytics workspace resource ID obtained in step 1:
+
+ ```cli
+ az aks enable-addons -a monitoring -g <resource-group-name> -n <cluster-name> --workspace-resource-id <workspace-resource-id>
+ ```
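The workspace resource ID captured in step 1 must be a full Azure resource ID; a truncated or malformed value makes step 4 fail. As a hypothetical safeguard (not part of the documented flow), a small shell check can validate the value before re-enabling the add-on. The `is_workspace_id` helper name is illustrative:

```shell
# Illustrative helper: validate a Log Analytics workspace resource ID before
# passing it to `az aks enable-addons`. A valid ID follows the pattern
# /subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<name>
is_workspace_id() {
  case "$1" in
    /subscriptions/*/resourceGroups/*/providers/Microsoft.OperationalInsights/workspaces/?*) return 0 ;;
    *) return 1 ;;
  esac
}

# Example check against a sample ID:
if is_workspace_id "/subscriptions/12345678-1234-9876-4563-123456789012/resourceGroups/example-resource-group/providers/Microsoft.OperationalInsights/workspaces/workspaceValue"; then
  echo "valid workspace ID"
fi
```

If the check fails, re-run step 1 and confirm the add-on was configured with a workspace before migrating.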
++
+### Arc-enabled Kubernetes
+
+>[!NOTE]
+> Managed identity authentication is not supported for Arc-enabled Kubernetes clusters with **ARO**.
+
+1. Retrieve the Log Analytics workspace configured for the Container insights extension.
+
+ ```cli
+    az k8s-extension show --name azuremonitor-containers --cluster-name <cluster-name> --resource-group <resource-group> --cluster-type connectedClusters
+ ```
+
+2. Enable the Container insights extension with the managed identity authentication option, using the workspace returned in the first step.
+
+ ```cli
+    az k8s-extension create --name azuremonitor-containers --cluster-name <cluster-name> --resource-group <resource-group> --cluster-type connectedClusters --extension-type Microsoft.AzureMonitor.Containers --configuration-settings amalogs.useAADAuth=true logAnalyticsWorkspaceResourceID=<workspace-resource-id>
+ ```
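The `--configuration-settings` value in step 2 is a space-separated list of `key=value` pairs. A minimal sketch of composing it from the workspace ID retrieved in step 1 (the variable names are illustrative):

```shell
# Illustrative: build the --configuration-settings value for
# `az k8s-extension create` from the workspace resource ID.
WORKSPACE_ID="/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<name>"
SETTINGS="amalogs.useAADAuth=true logAnalyticsWorkspaceResourceID=${WORKSPACE_ID}"
echo "$SETTINGS"
```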
-## Limitations
-1. Ingestion Transformations are not supported: See [Data collection transformation](../essentials/data-collection-transformations.md) to read more.
-2. Dependency on DCR/DCRA for region availability - For new AKS region, there might be chances that DCR is still not supported in the new region. In that case, onboarding Container Insights with MSI will fail. One workaround is to onboard to Container Insights through CLI with the old way (with the use of Container Insights solution)
## Timeline

Any new clusters being created or being onboarded now default to Managed Identity authentication. However, existing clusters with legacy solution-based authentication are still supported.
azure-monitor Container Insights Enable Aks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-enable-aks.md
# Enable Container insights for Azure Kubernetes Service (AKS) cluster
-This article describes how to set up Container insights to monitor a managed Kubernetes cluster hosted on an [Azure Kubernetes Service (AKS)](../../aks/index.yml) cluster.
+This article describes how to enable Container insights on a managed Kubernetes cluster hosted on an [Azure Kubernetes Service (AKS)](../../aks/index.yml) cluster.
## Prerequisites
-If you're connecting an existing AKS cluster to a Log Analytics workspace in another subscription, the *Microsoft.ContainerService* resource provider must be registered in the subscription with the Log Analytics workspace. For more information, see [Register resource provider](../../azure-resource-manager/management/resource-providers-and-types.md#register-resource-provider).
+- See [Prerequisites](./container-insights-onboard.md) for Container insights.
+- You can attach an AKS cluster to a Log Analytics workspace in a different Azure subscription in the same Microsoft Entra tenant, but you must use the Azure CLI or an Azure Resource Manager template. You can't currently perform this configuration with the Azure portal.
+- If you're connecting an existing AKS cluster to a Log Analytics workspace in another subscription, the *Microsoft.ContainerService* resource provider must be registered in the subscription with the Log Analytics workspace. For more information, see [Register resource provider](../../azure-resource-manager/management/resource-providers-and-types.md#register-resource-provider).
++
+## Enable monitoring
+
+### [Azure portal](#tab/azure-portal)
+
+There are multiple options to enable Prometheus metrics on your cluster from the Azure portal.
++
+### New cluster
+When you create a new AKS cluster in the Azure portal, you can enable Prometheus, Container insights, and Grafana from the **Integrations** tab. In the Azure Monitor section, select either **Default configuration** or **Custom configuration** if you want to specify which workspaces to use. You can perform additional configuration once the cluster is created.
++
+### From existing cluster
+
+This option enables Container insights on a cluster and gives you the option of also enabling [Managed Prometheus and Managed Grafana](./prometheus-metrics-enable.md) for the cluster.
> [!NOTE]
-> When you enable Container Insights on legacy auth clusters, a managed identity is automatically created. This identity will not be available in case the cluster migrates to MSI Auth or if the Container Insights is disabled and hence this managed identity should not be used for anything else.
+> If you want to enable Managed Prometheus without Container insights, then [enable it from the Azure Monitor workspace](./prometheus-metrics-enable.md).
+
+1. Open the cluster's menu in the Azure portal and select **Insights**.
+ 1. If Container insights isn't enabled for the cluster, then you're presented with a screen identifying which of the features have been enabled. Click **Configure monitoring**.
+
+ :::image type="content" source="media/aks-onboard/configure-monitoring-screen.png" lightbox="media/aks-onboard/configure-monitoring-screen.png" alt-text="Screenshot that shows the configuration screen for a cluster.":::
+
+ 2. If Container insights has already been enabled on the cluster, select the **Monitoring Settings** button to modify the configuration.
+
+ :::image type="content" source="media/aks-onboard/monitor-settings-button.png" lightbox="media/aks-onboard/monitor-settings-button.png" alt-text="Screenshot that shows the monitoring settings button for a cluster.":::
+
+2. **Container insights** will be enabled. Select the checkboxes for **Enable Prometheus metrics** and **Enable Grafana** if you also want to enable them for the cluster. If you have an existing Azure Monitor workspace and Grafana workspace, they're selected for you.
+
+ :::image type="content" source="media/prometheus-metrics-enable/configure-container-insights.png" lightbox="media/prometheus-metrics-enable/configure-container-insights.png" alt-text="Screenshot that shows the dialog box to configure Container insights with Prometheus and Grafana.":::
-## New AKS cluster
+3. Click **Advanced settings** to select alternate workspaces or create new ones. The **Cost presets** setting allows you to modify the default collection details to reduce your monitoring costs. See [Enable cost optimization settings in Container insights](./container-insights-cost-config.md) for details.
-You can enable monitoring for an AKS cluster when it's created by using any of the following methods:
+ :::image type="content" source="media/aks-onboard/advanced-settings.png" lightbox="media/aks-onboard/advanced-settings.png" alt-text="Screenshot that shows the advanced settings dialog box.":::
-- **Azure CLI**: Follow the steps in [Create AKS cluster](../../aks/learn/quick-kubernetes-deploy-cli.md).
-- **Azure Policy**: Follow the steps in [Enable AKS monitoring add-on by using Azure Policy](container-insights-enable-aks-policy.md).
-- **Terraform**: If you're [deploying a new AKS cluster by using Terraform](/azure/developer/terraform/create-k8s-cluster-with-tf-and-aks), specify the arguments required in the profile [to create a Log Analytics workspace](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/log_analytics_workspace) if you don't choose to specify an existing one. To add Container insights to the workspace, see [azurerm_log_analytics_solution](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/log_analytics_solution). Complete the profile by including **oms_agent** profile.
+4. Click **Configure** to save the configuration.
-## Existing AKS cluster
+### From Container insights
+From the Container insights menu, you can view all of your clusters, quickly identify which aren't monitored, and launch the same configuration experience as described in [From existing cluster](#from-existing-cluster).
+
+1. Open the **Monitor** menu in the Azure portal and select **Insights**.
+2. The **Unmonitored clusters** tab lists clusters that don't have Container insights enabled. Click **Enable** next to a cluster and follow the guidance in [From existing cluster](#from-existing-cluster).
-Use any of the following methods to enable monitoring for an existing AKS cluster.
## [CLI](#tab/azure-cli)

> [!NOTE]
-> Managed identity authentication will be default in CLI version 2.49.0 or higher. If you need to use legacy/non-managed identity authentication, use CLI version < 2.49.0. For CLI version 2.54.0 or higher the logging schema will be configured to [ContainerLogV2](./container-insights-logging-v2.md) via the ConfigMap
+> Managed identity authentication will be default in CLI version 2.49.0 or higher. If you need to use legacy/non-managed identity authentication, use CLI version < 2.49.0. For CLI version 2.54.0 or higher the logging schema will be configured to [ContainerLogV2](container-insights-logging-v2.md) via the ConfigMap
### Use a default Log Analytics workspace
Use the following command to enable monitoring of your AKS cluster on a specific Log Analytics workspace:

```azurecli
az aks enable-addons -a monitoring -n <cluster-name> -g <cluster-resource-group-name> --workspace-resource-id <workspace-resource-id>
```
-The output will resemble the following example:
+**Example**
-```output
-provisioningState : Succeeded
+```azurecli
+az aks enable-addons -a monitoring -n <cluster-name> -g <cluster-resource-group-name> --workspace-resource-id "/subscriptions/my-subscription/resourceGroups/my-resource-group/providers/Microsoft.OperationalInsights/workspaces/my-workspace"
```
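The workspace resource ID in the example follows a fixed pattern, so it can be composed from the subscription, resource group, and workspace names. A hypothetical helper (not an Azure CLI feature) that builds the string:

```shell
# Illustrative: compose a Log Analytics workspace resource ID from its parts.
workspace_resource_id() {
  printf '/subscriptions/%s/resourceGroups/%s/providers/Microsoft.OperationalInsights/workspaces/%s' "$1" "$2" "$3"
}

workspace_resource_id "my-subscription" "my-resource-group" "my-workspace"
# prints /subscriptions/my-subscription/resourceGroups/my-resource-group/providers/Microsoft.OperationalInsights/workspaces/my-workspace
```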
-## [Terraform](#tab/terraform)
-1. Add the **oms_agent** add-on profile to the existing [azurerm_kubernetes_cluster resource](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/data-sources/kubernetes_cluster) depending on the version of the [Terraform AzureRM provider version](/azure/developer/terraform/provider-version-history-azurerm).
+## [Resource Manager template](#tab/arm)
- * If the Terraform AzureRM provider version is 3.0 or higher, add the following:
+>[!NOTE]
+>The template must be deployed in the same resource group as the cluster.
- ```
- oms_agent {
- log_analytics_workspace_id = "${azurerm_log_analytics_workspace.test.id}"
- }
- ```
+1. Download the template and parameter file.
+ - Template file: [https://aka.ms/aks-enable-monitoring-msi-onboarding-template-file](https://aka.ms/aks-enable-monitoring-msi-onboarding-template-file)
+ - Parameter file: [https://aka.ms/aks-enable-monitoring-msi-onboarding-template-parameter-file](https://aka.ms/aks-enable-monitoring-msi-onboarding-template-parameter-file)
- * If the Terraform AzureRM provider is less than version 3.0, add the following:
+2. Edit the following values in the parameter file:
- ```
- addon_profile {
- oms_agent {
- enabled = true
- log_analytics_workspace_id = "${azurerm_log_analytics_workspace.test.id}"
- }
- }
- ```
+ | Parameter | Description |
+ |:|:|
+ | `aksResourceId` | Use the values on the **AKS Overview** page for the AKS cluster. |
+ | `aksResourceLocation` | Use the values on the **AKS Overview** page for the AKS cluster. |
+ | `workspaceResourceId` | Use the resource ID of your Log Analytics workspace. |
+ | `resourceTagValues` | Match the existing tag values specified for the existing Container insights extension data collection rule (DCR) of the cluster and the name of the DCR. The name will be *MSCI-\<clusterName\>-\<clusterRegion\>*, and this resource is created in the AKS cluster's resource group. If this is the first time onboarding, you can set arbitrary tag values. |
-2. Add the [azurerm_log_analytics_solution](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/log_analytics_solution) by following the steps in the Terraform documentation.
+3. Deploy the template with the parameter file by using any valid method for deploying Resource Manager templates. For examples of different methods, see [Deploy the sample templates](../resource-manager-samples.md#deploy-the-sample-templates).
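The parameter table above states that the DCR name follows the pattern *MSCI-\<clusterName\>-\<clusterRegion\>*. A tiny sketch for composing that name when filling in `resourceTagValues` (the `dcr_name` helper is illustrative; some older templates use the reversed *MSCI-\<clusterRegion\>-\<clusterName\>* order, so verify against your actual DCR):

```shell
# Illustrative: compose the Container insights DCR name per the table above.
dcr_name() { printf 'MSCI-%s-%s' "$1" "$2"; }

dcr_name "my-cluster" "eastus"
# prints MSCI-my-cluster-eastus
```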
-3. Enable collection of custom metrics by following the guidance at [Enable custom metrics](container-insights-custom-metrics.md).
+## [Bicep](#tab/bicep)
-## [Azure portal](#tab/portal-azure-monitor)
+### Existing cluster
-> [!NOTE]
-> You can initiate this same process from the **Insights** option in the AKS menu for your cluster in the Azure portal.
+1. Download Bicep templates and parameter files depending on whether you want to enable Syslog collection.
-To enable monitoring of your AKS cluster in the Azure portal from Azure Monitor:
+ **Syslog**
+ - Template file: [Template with Syslog](https://aka.ms/enable-monitoring-msi-syslog-bicep-template)
+ - Parameter file: [Parameter with Syslog](https://aka.ms/enable-monitoring-msi-syslog-bicep-parameters)
-1. In the Azure portal, select **Monitor**.
-1. Select **Containers** from the list.
-1. On the **Monitor - containers** page, select **Unmonitored clusters**.
-1. From the list of unmonitored clusters, find the cluster in the list and select **Enable**.
-1. On the **Configure Container insights** page, select **Configure**.
+ **No Syslog**
+ - Template file: [Template without Syslog](https://aka.ms/enable-monitoring-msi-bicep-template)
+ - Parameter file: [Parameter without Syslog](https://aka.ms/enable-monitoring-msi-bicep-parameters)
- :::image type="content" source="media/container-insights-enable-aks/container-insights-configure.png" lightbox="media/container-insights-enable-aks/container-insights-configure.png" alt-text="Screenshot that shows the configuration screen for an AKS cluster.":::
+2. Edit the following values in the parameter file:
+
+ | Parameter | Description |
+ |:|:|
+ | `aksResourceId` | Use the values on the AKS Overview page for the AKS cluster. |
+ | `aksResourceLocation` | Use the values on the AKS Overview page for the AKS cluster. |
+ | `workspaceResourceId` | Use the resource ID of your Log Analytics workspace. |
+ | `workspaceRegion` | Use the location of your Log Analytics workspace. |
+ | `resourceTagValues` | Match the existing tag values specified for the existing Container insights extension data collection rule (DCR) of the cluster and the name of the DCR. The name will match `MSCI-<clusterName>-<clusterRegion>`, and this resource is created in the same resource group as the AKS cluster. For first-time onboarding, you can set arbitrary tag values. |
+ | `enabledContainerLogV2` | Set this parameter value to *true* to use the default recommended ContainerLogV2 schema. |
+ | Cost optimization parameters | Refer to [Data collection parameters](container-insights-cost-config.md#data-collection-parameters) |
-1. On the **Configure Container insights** page, fill in the following information:
+3. Deploy the template with the parameter file by using any valid method for deploying Resource Manager templates. For examples of different methods, see [Deploy the sample templates](../resource-manager-samples.md#deploy-the-sample-templates).
- | Option | Description |
- |:|:|
- | Log Analytics workspace | Select a [Log Analytics workspace](../logs/log-analytics-workspace-overview.md) from the dropdown list or select **Create new** to create a default Log Analytics workspace. The Log Analytics workspace must be in the same subscription as the AKS container. |
- | Enable Prometheus metrics | Select this option to collect Prometheus metrics for the cluster in [Azure Monitor managed service for Prometheus](../essentials/prometheus-metrics-overview.md). |
- | Azure Monitor workspace | If you select **Enable Prometheus metrics**, you must select an [Azure Monitor workspace](../essentials/azure-monitor-workspace-overview.md). The Azure Monitor workspace must be in the same subscription as the AKS container and the Log Analytics workspace. |
- | Grafana workspace | To use the collected Prometheus metrics with dashboards in [Azure-managed Grafana](../../managed-grafan#enable-prometheus-metric-collection) to the Azure Monitor workspace if it isn't already. |
-
-1. Select **Use managed identity** if you want to use [managed identity authentication with Azure Monitor Agent](container-insights-onboard.md#authentication).
-After you've enabled monitoring, it might take about 15 minutes before you can view health metrics for the cluster.
+### New cluster
+Replace and use the managed cluster resources in [Deploy an Azure Kubernetes Service (AKS) cluster using Bicep](../../aks/learn/quick-kubernetes-deploy-bicep.md).
-## [Resource Manager template](#tab/arm)
->[!NOTE]
->The template must be deployed in the same resource group as the cluster.
-### Create or download templates
-
-You'll either download template and parameter files or create your own depending on the authentication mode you're using.
-
-To enable [managed identity authentication](container-insights-onboard.md#authentication):
-
-1. Download the template in the [GitHub content file](https://aka.ms/aks-enable-monitoring-msi-onboarding-template-file) and save it as **existingClusterOnboarding.json**.
-
-1. Download the parameter file in the [GitHub content file](https://aka.ms/aks-enable-monitoring-msi-onboarding-template-parameter-file) and save it as **existingClusterParam.json**.
-
-1. Edit the values in the parameter file:
-
- - `aksResourceId`: Use the values on the **AKS Overview** page for the AKS cluster.
- - `aksResourceLocation`: Use the values on the **AKS Overview** page for the AKS cluster.
- - `workspaceResourceId`: Use the resource ID of your Log Analytics workspace.
- - `resourceTagValues`: Match the existing tag values specified for the existing Container insights extension data collection rule (DCR) of the cluster and the name of the DCR. The name will be *MSCI-\<clusterRegion\>-\<clusterName\>* and this resource created in an AKS clusters resource group. If this is the first time onboarding, you can set the arbitrary tag values.
-
-To enable [managed identity authentication](container-insights-onboard.md#authentication):
-
-1. Save the following JSON as **existingClusterOnboarding.json**.
-
- ```json
- {
- "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "aksResourceId": {
- "type": "string",
- "metadata": {
- "description": "AKS Cluster Resource ID"
- }
- },
- "aksResourceLocation": {
- "type": "string",
- "metadata": {
- "description": "Location of the AKS resource e.g. \"East US\""
- }
- },
- "aksResourceTagValues": {
- "type": "object",
- "metadata": {
- "description": "Existing all tags on AKS Cluster Resource"
- }
- },
- "workspaceResourceId": {
- "type": "string",
- "metadata": {
- "description": "Azure Monitor Log Analytics Resource ID"
- }
- }
- },
- "resources": [
- {
- "name": "[split(parameters('aksResourceId'),'/')[8]]",
- "type": "Microsoft.ContainerService/managedClusters",
- "location": "[parameters('aksResourceLocation')]",
- "tags": "[parameters('aksResourceTagValues')]",
- "apiVersion": "2018-03-31",
- "properties": {
- "mode": "Incremental",
- "id": "[parameters('aksResourceId')]",
- "addonProfiles": {
- "omsagent": {
- "enabled": true,
- "config": {
- "logAnalyticsWorkspaceResourceID": "[parameters('workspaceResourceId')]"
- }
- }
- }
- }
- }
- ]
- }
- ```
+## [Terraform](#tab/terraform)
-1. Save the following JSON as **existingClusterParam.json**.
-
- ```json
- {
- "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "aksResourceId": {
- "value": "/subscriptions/<SubscriptionId>/resourcegroups/<ResourceGroup>/providers/Microsoft.ContainerService/managedClusters/<ResourceName>"
- },
- "aksResourceLocation": {
- "value": "<aksClusterLocation>"
- },
- "workspaceResourceId": {
- "value": "/subscriptions/<SubscriptionId>/resourceGroups/<ResourceGroup>/providers/Microsoft.OperationalInsights/workspaces/<workspaceName>"
- },
- "aksResourceTagValues": {
- "value": {
- "<existing-tag-name1>": "<existing-tag-value1>",
- "<existing-tag-name2>": "<existing-tag-value2>",
- "<existing-tag-nameN>": "<existing-tag-valueN>"
- }
- }
- }
- }
- ```
+### New AKS cluster
-1. Download the parameter file in the [GitHub content file](https://aka.ms/aks-enable-monitoring-msi-onboarding-template-parameter-file) and save as **existingClusterParam.json**.
+1. Download Terraform template file depending on whether you want to enable Syslog collection.
-1. Edit the values in the parameter file:
+ **Syslog**
+ - [https://aka.ms/enable-monitoring-msi-syslog-terraform](https://aka.ms/enable-monitoring-msi-syslog-terraform)
- - `aksResourceId`: Use the values on the **AKS Overview** page for the AKS cluster.
- - `aksResourceLocation`: Use the values on the **AKS Overview** page for the AKS cluster.
- - `workspaceResourceId`: Use the resource ID of your Log Analytics workspace.
- - `resourceTagValues`: Use the existing tag values specified for the AKS cluster.
+ **No Syslog**
+ - [https://aka.ms/enable-monitoring-msi-terraform](https://aka.ms/enable-monitoring-msi-terraform)
-### Deploy the template
+2. Adjust the `azurerm_kubernetes_cluster` resource in *main.tf* based on what cluster settings you're going to have.
+3. Update parameters in *variables.tf* to replace values in "<>".
-Deploy the template with the parameter file by using any valid method for deploying Resource Manager templates. For examples of different methods, see [Deploy the sample templates](../resource-manager-samples.md#deploy-the-sample-templates).
+ | Parameter | Description |
+ |:---|:---|
+ | `aks_resource_group_name` | Use the values on the AKS Overview page for the resource group. |
+ | `resource_group_location` | Use the values on the AKS Overview page for the resource group. |
+ | `cluster_name` | Define the cluster name that you would like to create. |
+ | `workspace_resource_id` | Use the resource ID of your Log Analytics workspace. |
+ | `workspace_region` | Use the location of your Log Analytics workspace. |
+ | `resource_tag_values` | Match the existing tag values specified for the existing Container insights extension data collection rule (DCR) of the cluster and the name of the DCR. The name will match `MSCI-<clusterName>-<clusterRegion>`, and this resource is created in the same resource group as the AKS cluster. For first-time onboarding, you can set arbitrary tag values. |
+ | `enabledContainerLogV2` | Set this parameter value to `true` to use the recommended default, ContainerLogV2. |
+ | Cost optimization parameters | Refer to [Data collection parameters](container-insights-cost-config.md#data-collection-parameters) |
-#### Deploy with Azure PowerShell
-```powershell
-New-AzResourceGroupDeployment -Name OnboardCluster -ResourceGroupName <ResourceGroupName> -TemplateFile .\existingClusterOnboarding.json -TemplateParameterFile .\existingClusterParam.json
-```
+4. Run `terraform init -upgrade` to initialize the Terraform deployment.
+5. Run `terraform plan -out main.tfplan` to create an execution plan for the Terraform deployment.
+6. Run `terraform apply main.tfplan` to apply the execution plan to your cloud infrastructure.
-The configuration change can take a few minutes to complete. When it's finished, a message similar to the following example includes this result:
-```output
-provisioningState : Succeeded
-```
+### Existing AKS cluster
+1. Import the existing cluster resource first with the command: `terraform import azurerm_kubernetes_cluster.k8s <aksResourceId>`
+2. Add the `oms_agent` add-on profile to the existing `azurerm_kubernetes_cluster` resource.
+ ```
+ oms_agent {
+ log_analytics_workspace_id = var.workspace_resource_id
+ msi_auth_for_monitoring_enabled = true
+ }
+ ```
+3. Copy the DCR and DCRA resources from the Terraform templates.
+4. Run `terraform plan -out main.tfplan` and make sure the change is adding the `oms_agent` property. Note: If the `azurerm_kubernetes_cluster` resource definition differs from the existing cluster, the cluster will be destroyed and recreated during apply.
+5. Run `terraform apply main.tfplan` to apply the execution plan to your cloud infrastructure.
-#### Deploy with Azure CLI
+> [!TIP]
+> - Edit the `main.tf` file appropriately before running the Terraform template.
+> - Data starts flowing about 10 minutes after onboarding because the cluster needs to be ready first.
+> - WorkspaceID needs to match the format `/subscriptions/12345678-1234-9876-4563-123456789012/resourceGroups/example-resource-group/providers/Microsoft.OperationalInsights/workspaces/workspaceValue`
+> - If the resource group already exists, run `terraform import azurerm_resource_group.rg /subscriptions/<Subscription_ID>/resourceGroups/<Resource_Group_Name>` before `terraform plan`.
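The workspace ID format called out in the tip above can be sanity-checked before running `terraform plan`. The following is a hypothetical pre-flight helper, not part of the official template; the sample ID is a placeholder.

```shell
# Hypothetical pre-flight check: verify that a Log Analytics workspace resource ID
# matches the expected ARM format before plugging it into variables.tf.
workspace_id="/subscriptions/12345678-1234-9876-4563-123456789012/resourceGroups/example-resource-group/providers/Microsoft.OperationalInsights/workspaces/workspaceValue"

# GUID subscription, a resource group segment, then the workspaces provider path.
pattern='^/subscriptions/[0-9a-fA-F-]{36}/resourceGroups/[^/]+/providers/Microsoft\.OperationalInsights/workspaces/[^/]+$'

if echo "$workspace_id" | grep -Eq "$pattern"; then
  echo "workspace ID format OK"
else
  echo "workspace ID format invalid" >&2
  exit 1
fi
```

A malformed ID caught here fails fast instead of surfacing later as a confusing provider error during `terraform apply`.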
-```azurecli
-az login
-az account set --subscription "Subscription Name"
-az deployment group create --resource-group <ResourceGroupName> --template-file ./existingClusterOnboarding.json --parameters @./existingClusterParam.json
-```
+### [Azure Policy](#tab/policy)
-The configuration change can take a few minutes to complete. When it's finished, a message similar to the following example includes this result:
+1. Download the Azure Policy template and parameter files, depending on whether you want to enable Syslog collection.
-```output
-provisioningState : Succeeded
-```
+ - Template file: [https://aka.ms/enable-monitoring-msi-azure-policy-template](https://aka.ms/enable-monitoring-msi-azure-policy-template)
+ - Parameter file: [https://aka.ms/enable-monitoring-msi-azure-policy-parameters](https://aka.ms/enable-monitoring-msi-azure-policy-parameters)
+
+2. Create the policy definition using the following command:
+
+    ```azurecli
+ az policy definition create --name "AKS-Monitoring-Addon-MSI" --display-name "AKS-Monitoring-Addon-MSI" --mode Indexed --metadata version=1.0.0 category=Kubernetes --rules azure-policy.rules.json --params azure-policy.parameters.json
+ ```
+
+3. Create the policy assignment using the following CLI command or any [other available method](../../governance/policy/assign-policy-portal.md).
+
+    ```azurecli
+    az policy assignment create --name aks-monitoring-addon --policy "AKS-Monitoring-Addon-MSI" --assign-identity --identity-scope /subscriptions/<subscriptionId> --role Contributor --scope /subscriptions/<subscriptionId> --location <location> -p "{ \"workspaceResourceId\": { \"value\": \"/subscriptions/<subscriptionId>/resourcegroups/<resourceGroupName>/providers/microsoft.operationalinsights/workspaces/<workspaceName>\" } }"
+ ```
-After you've enabled monitoring, it might take about 15 minutes before you can view health metrics for the cluster.
+> [!TIP]
+> - Make sure that, when performing a remediation task, the policy assignment has access to the workspace you specified.
+> - Download all files under the *AddonPolicyTemplate* folder before running the policy template.
## Verify agent and solution deployment-
-Run the following command to verify that the agent is deployed successfully.
+You can verify that the agent is deployed properly using the [kubectl command line tool](../../aks/learn/quick-kubernetes-deploy-cli.md#connect-to-the-cluster).
``` kubectl get ds ama-logs --namespace=kube-system
Use the `aks show` command to find out whether the solution is enabled or not, w
az aks show -g <resourceGroupofAKSCluster> -n <nameofAksCluster> ```
-After a few minutes, the command completes and returns JSON-formatted information about the solution. The results of the command should show the monitoring add-on profile and resemble the following example output:
+The command returns JSON-formatted information about the solution. The `addonProfiles` section should include information on the `omsagent`, as in the following example:
```output "addonProfiles": {
After a few minutes, the command completes and returns JSON-formatted informatio
} ```
-## Migrate to managed identity authentication
-
-This section explains two methods for migrating to managed identity authentication.
-
-### Existing clusters with a service principal
-
-AKS clusters with a service principal must first disable monitoring and then upgrade to managed identity. Only Azure public cloud, Microsoft Azure operated by 21Vianet cloud, and Azure Government cloud are currently supported for this migration.
-
-> [!NOTE]
-> Minimum Azure CLI version 2.49.0 or higher.
-
-1. Get the configured Log Analytics workspace resource ID:
-
- ```cli
- az aks show -g <resource-group-name> -n <cluster-name> | grep -i "logAnalyticsWorkspaceResourceID"
- ```
-
-1. Disable monitoring with the following command:
-
- ```cli
- az aks disable-addons -a monitoring -g <resource-group-name> -n <cluster-name>
- ```
-
-1. Upgrade cluster to system managed identity with the following command:
-
- ```cli
- az aks update -g <resource-group-name> -n <cluster-name> --enable-managed-identity
- ```
-
-1. Enable the monitoring add-on with the managed identity authentication option by using the Log Analytics workspace resource ID obtained in step 1:
-
- ```cli
- az aks enable-addons -a monitoring -g <resource-group-name> -n <cluster-name> --workspace-resource-id <workspace-resource-id>
- ```
-
-### Existing clusters with system or user-assigned identity
-
-AKS clusters with system-assigned identity must first disable monitoring and then upgrade to managed identity. Only Azure public cloud, Azure operated by 21Vianet cloud, and Azure Government cloud are currently supported for clusters with system identity. For clusters with user-assigned identity, only Azure public cloud is supported.
-
-> [!NOTE]
-> Minimum Azure CLI version 2.49.0 or higher.
-
-1. Get the configured Log Analytics workspace resource ID:
-
- ```cli
- az aks show -g <resource-group-name> -n <cluster-name> | grep -i "logAnalyticsWorkspaceResourceID"
- ```
-
-1. Disable monitoring with the following command:
-
- ```cli
- az aks disable-addons -a monitoring -g <resource-group-name> -n <cluster-name>
- ```
-
-1. Enable the monitoring add-on with the managed identity authentication option by using the Log Analytics workspace resource ID obtained in step 1:
-
- ```cli
- az aks enable-addons -a monitoring -g <resource-group-name> -n <cluster-name> --workspace-resource-id <workspace-resource-id>
- ```
-
-## Private link
-Use one of the following procedures to enable network isolation by connecting your cluster to the Log Analytics workspace by using [Azure Private Link](../logs/private-link-security.md).
-
-### Managed identity authentication
-Use the following procedure if your cluster is using managed identity authentication with Azure Monitor Agent.
-
-1. Follow the steps in [Enable network isolation for the Azure Monitor agent](../agents/azure-monitor-agent-data-collection-endpoint.md) to create a data collection endpoint and add it to your Azure Monitor private link service.
-
-1. Create an association between the cluster and the data collection endpoint by using the following API call. For information on this call, see [Data collection rule associations - Create](/rest/api/monitor/data-collection-rule-associations/create). The DCR association name must be **configurationAccessEndpoint**, and `resourceUri` is the resource ID of the AKS cluster.
-
- ```rest
- PUT https://management.azure.com/{cluster-resource-id}/providers/Microsoft.Insights/dataCollectionRuleAssociations/configurationAccessEndpoint?api-version=2021-04-01
- {
- "properties": {
- "dataCollectionEndpointId": "{data-collection-endpoint-resource-id}"
- }
- }
- ```
-
- The following snippet is an example of this API call:
-
- ```rest
- PUT https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/myResourceGroup/providers/Microsoft.ContainerService/managedClusters/my-aks-cluster/providers/Microsoft.Insights/dataCollectionRuleAssociations/configurationAccessEndpoint?api-version=2021-04-01
-
- {
- "properties": {
- "dataCollectionEndpointId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/myResourceGroup/providers/Microsoft.Insights/dataCollectionEndpoints/myDataCollectionEndpoint"
- }
- }
- ```
-
-1. Enable monitoring with the managed identity authentication option by using the steps in [Migrate to managed identity authentication](#migrate-to-managed-identity-authentication).
-
-### Without managed identity authentication
-Use the following procedure if you're not using managed identity authentication. This requires a [private AKS cluster](../../aks/private-clusters.md).
-
-1. Create a private AKS cluster following the guidance in [Create a private Azure Kubernetes Service cluster](../../aks/private-clusters.md).
-
-2. Disable public Ingestion on your Log Analytics workspace.
-
- Use the following command to disable public ingestion on an existing workspace.
-
- ```cli
- az monitor log-analytics workspace update --resource-group <azureLogAnalyticsWorkspaceResourceGroup> --workspace-name <azureLogAnalyticsWorkspaceName> --ingestion-access Disabled
- ```
-
- Use the following command to create a new workspace with public ingestion disabled.
-
- ```cli
- az monitor log-analytics workspace create --resource-group <azureLogAnalyticsWorkspaceResourceGroup> --workspace-name <azureLogAnalyticsWorkspaceName> --ingestion-access Disabled
- ```
-
-3. Configure private link by following the instructions at [Configure your private link](../logs/private-link-configure.md). Set ingestion access to public and then set to private after the private endpoint is created but before monitoring is enabled. The private link resource region must be same as AKS cluster region.
-
-4. Enable monitoring for the AKS cluster.
-
- ```cli
- az aks enable-addons -a monitoring --resource-group <AKSClusterResourceGorup> --name <AKSClusterName> --workspace-resource-id <workspace-resource-id>
- ```
## Limitations -- When you enable managed identity authentication, a data collection rule is created with the name *MSCI-\<cluster-region\>-<\cluster-name\>*. Currently, this name can't be modified.-
+- Dependency on DCR/DCRA for region availability. For a new AKS region, the DCR might not yet be supported in that region. In that case, onboarding Container insights with MSI fails. As a workaround, onboard Container insights through the CLI the old way (by using the Container insights solution).
- You must be on a machine on the same private network to access live logs from a private cluster. ## Next steps
Use the following procedure if you're not using managed identity authentication.
* If you experience issues while you attempt to onboard the solution, review the [Troubleshooting guide](container-insights-troubleshoot.md). * With monitoring enabled to collect health and resource utilization of your AKS cluster and workloads running on them, learn [how to use](container-insights-analyze.md) Container insights. -
azure-monitor Container Insights Enable Arc Enabled Clusters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-enable-arc-enabled-clusters.md
- Log Analytics workspace. Azure Monitor Container Insights supports a Log Analytics workspace in the regions listed under Azure [products by region page](https://azure.microsoft.com/global-infrastructure/services/?regions=all&products=monitor). You can create your own workspace using [Azure Resource Manager](../logs/resource-manager-workspace.md), [PowerShell](../logs/powershell-workspace-configuration.md), or [Azure portal](../logs/quick-create-workspace.md). - [Contributor](../../role-based-access-control/built-in-roles.md#contributor) role assignment on the Azure subscription containing the Azure Arc-enabled Kubernetes resource. If the Log Analytics workspace is in a different subscription, then [Log Analytics Contributor](../logs/manage-access.md#azure-rbac) role assignment is needed on the resource group containing the Log Analytics Workspace - To view the monitoring data, you need to have [Monitoring Reader](../roles-permissions-security.md#monitoring-reader) or [Monitoring Contributor](../roles-permissions-security.md#monitoring-contributor) role.-- The following endpoints need to be enabled for outbound access in addition to the [Azure Arc-enabled Kubernetes network requirements](../../azure-arc/kubernetes/network-requirements.md).-
- **Azure public cloud**
-
- | Endpoint | Port |
- |-||
- | `*.ods.opinsights.azure.com` | 443 |
- | `*.oms.opinsights.azure.com` | 443 |
- | `dc.services.visualstudio.com` | 443 |
- | `*.monitoring.azure.com` | 443 |
- | `login.microsoftonline.com` | 443 |
-
- The following table lists the additional firewall configuration required for managed identity authentication.
-
- |Agent resource| Purpose | Port |
- |--|||
- | `global.handler.control.monitor.azure.com` | Access control service | 443 |
- | `<cluster-region-name>.handler.control.monitor.azure.com` | Fetch data collection rules for specific AKS cluster | 443 |
-
- **Azure Government cloud**
-
- If your Azure Arc-enabled Kubernetes resource is in Azure US Government environment, following endpoints need to be enabled for outbound access:
-
- | Endpoint | Port |
- |-||
- | `*.ods.opinsights.azure.us` | 443 |
- | `*.oms.opinsights.azure.us` | 443 |
- | `dc.services.visualstudio.com` | 443 |
-
- The following table lists the additional firewall configuration required for managed identity authentication.
-
- |Agent resource| Purpose | Port |
- |--|||
- | `global.handler.control.monitor.azure.cn` | Access control service | 443 |
- | `<cluster-region-name>.handler.control.monitor.azure.cn` | Fetch data collection rules for specific AKS cluster | 443 |
---- If you are using an Arc enabled cluster on AKS, and previously installed [monitoring for AKS](./container-insights-enable-existing-clusters.md), please ensure that you have [disabled monitoring](./container-insights-optout.md) before proceeding to avoid issues during the extension install-
+- Verify the [firewall requirements for Container insights](./container-insights-onboard.md#network-firewall-requirements) in addition to the [Azure Arc-enabled Kubernetes network requirements](../../azure-arc/kubernetes/network-requirements.md).
+- If you're using an Arc-enabled cluster on AKS and previously installed [monitoring for AKS](./container-insights-enable-existing-clusters.md), ensure that you've [disabled monitoring](./container-insights-optout.md) before proceeding, to avoid issues during the extension install.
- If you had previously deployed Azure Monitor Container Insights on this cluster using script without cluster extensions, follow the instructions listed [here](container-insights-optout-hybrid.md) to delete this Helm chart. You can then continue to creating a cluster extension instance for Azure Monitor Container Insights.
If you want to tweak the default resource requests and limits, you can use the a
az k8s-extension create --name azuremonitor-containers --cluster-name <cluster-name> --resource-group <resource-group> --cluster-type connectedClusters --extension-type Microsoft.AzureMonitor.Containers --configuration-settings amalogs.resources.daemonset.limits.cpu=150m amalogs.resources.daemonset.limits.memory=600Mi amalogs.resources.deployment.limits.cpu=1 amalogs.resources.deployment.limits.memory=750Mi ```
-Checkout the [resource requests and limits section of Helm chart](https://github.com/microsoft/Docker-Provider/blob/ci_prod/charts/azuremonitor-containers/values.yaml) for the available configuration settings.
+Check out the [resource requests and limits section of Helm chart](https://github.com/microsoft/Docker-Provider/blob/ci_prod/charts/azuremonitor-containers/values.yaml) for the available configuration settings.
### Option 4 - On Azure Stack Edge
az k8s-extension show --name azuremonitor-containers --cluster-name <cluster-nam
``` --
-## Migrate to managed identity authentication
-Use the flowing guidance to migrate an existing extension instance to managed identity authentication.
-
->[!NOTE]
-> Managed identity authentication is not supported for Arc-enabled Kubernetes clusters with **ARO**.
-
-## [CLI](#tab/migrate-cli)
-First retrieve the Log Analytics workspace configured for Container insights extension.
-
-```cli
-az k8s-extension show --name azuremonitor-containers --cluster-name \<cluster-name\> --resource-group \<resource-group\> --cluster-type connectedClusters -n azuremonitor-containers
-```
-
-Enable Container insights extension with managed identity authentication option using the workspace returned in the first step.
-
-```cli
-az k8s-extension create --name azuremonitor-containers --cluster-name \<cluster-name\> --resource-group \<resource-group\> --cluster-type connectedClusters --extension-type Microsoft.AzureMonitor.Containers --configuration-settings amalogs.useAADAuth=true logAnalyticsWorkspaceResourceID=\<workspace-resource-id\>
-```
-
-## [Resource Manager](#tab/migrate-arm)
---
-1. Download the template at [https://aka.ms/arc-k8s-azmon-extension-msi-arm-template](https://aka.ms/arc-k8s-azmon-extension-msi-arm-template) and save it as **arc-k8s-azmon-extension-msi-arm-template.json**.
-
-2. Download the parameter file at [https://aka.ms/arc-k8s-azmon-extension-msi-arm-template-params](https://aka.ms/arc-k8s-azmon-extension-msi-arm-template) and save it as **arc-k8s-azmon-extension-msi-arm-template-params.json**.
-
-3. Edit the values in the parameter file.
-
- - For **workspaceDomain**, use *opinsights.azure.com* for Azure public cloud and *opinsights.azure.us* for Azure Government cloud.
- - Specify the tags in the **resourceTagValues** parameter if you want to use any Azure tags on the Azure resources that will be created as part of the Container insights extension.
-
-4. Deploy the template to create Container Insights extension.
-
-```cli
-az login
-az account set --subscription "Subscription Name"
-az deployment group create --resource-group <resource-group> --template-file ./arc-k8s-azmon-extension-msi-arm-template.json --parameters @./arc-k8s-azmon-extension-msi-arm-template-params.json
-```
-- ## Delete extension instance
azure-monitor Container Insights Logging V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-logging-v2.md
Follow the instructions to configure an existing ConfigMap or to use a new one.
## [Azure portal](#tab/configure-portal) >[!NOTE]
-> DCR based configuration is not supported for service principal based clusters. Please [migrate your clusters with service principal to managed identity](./container-insights-enable-aks.md#migrate-to-managed-identity-authentication) to use this experience.
+> DCR based configuration is not supported for service principal based clusters. Please [migrate your clusters with service principal to managed identity](./container-insights-authentication.md) to use this experience.
1. In the Insights section of your Kubernetes cluster, select the **Monitoring Settings** button from the top toolbar
azure-monitor Container Insights Onboard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-onboard.md
Title: Enable Container insights
description: This article describes how to enable and configure Container insights so that you can understand how your container is performing and what performance-related issues have been identified. Previously updated : 08/29/2022 Last updated : 10/18/2023 # Enable Container insights
-This article provides an overview of the requirements and options that are available for configuring Container insights to monitor the performance of workloads that are deployed to Kubernetes environments. You can enable Container insights for a new deployment or for one or more existing deployments of Kubernetes by using several supported methods.
+This article provides an overview of the requirements and options that are available for enabling [Container insights](../containers/container-insights-overview.md) on your Kubernetes clusters. You can enable Container insights for a new deployment or for one or more existing deployments of Kubernetes by using several supported methods.
## Supported configurations Container insights supports the following environments:- - [Azure Kubernetes Service (AKS)](../../aks/index.yml) - Following [Azure Arc-enabled Kubernetes cluster distributions](../../azure-arc/kubernetes/validation-program.md): - AKS on Azure Stack HCI
Container insights supports the following environments:
> [!NOTE] > Container insights supports ARM64 nodes on AKS. See [Cluster requirements](../../azure-arc/kubernetes/system-requirements.md#cluster-requirements) for the details of Azure Arc-enabled clusters that support ARM64 nodes.
-The versions of Kubernetes and support policy are the same as those versions [supported in AKS](../../aks/supported-kubernetes-versions.md).
-
-### Differences between Windows and Linux clusters
-
-The main differences in monitoring a Windows Server cluster compared to a Linux cluster include:
--- Windows doesn't have a Memory RSS metric. As a result, it isn't available for Windows nodes and containers. The [Working Set](/windows/win32/memory/working-set) metric is available.-- Disk storage capacity information isn't available for Windows nodes.-- Only pod environments are monitored, not Docker environments.-- With the preview release, a maximum of 30 Windows Server containers are supported. This limitation doesn't apply to Linux containers.-
->[!NOTE]
-> Container insights support for the Windows Server 2022 operating system is in preview.
-
-## Installation options
--- [AKS cluster](container-insights-enable-aks.md)-- [AKS cluster with Azure Policy](container-insights-enable-aks-policy.md)-- [Azure Arc-enabled cluster](container-insights-enable-arc-enabled-clusters.md)-- [Hybrid Kubernetes clusters](container-insights-hybrid-setup.md) ## Prerequisites
-Before you start, make sure that you've met the following requirements:
-
-### Log Analytics workspace
+- Container insights stores its data in a [Log Analytics workspace](../logs/log-analytics-workspace-overview.md). It supports workspaces in the regions that are listed in [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?regions=all&products=monitor). For a list of the supported mapping pairs to use for the default workspace, see [Region mappings supported by Container insights](container-insights-region-mapping.md). You can let the onboarding experience create a Log Analytics workspace in the default resource group of the AKS cluster subscription. If you already have a workspace, you'll probably want to use that one. For more information, see [Designing your Azure Monitor Logs deployment](../logs/design-logs-deployment.md).
+- Permissions
+ - To enable Container insights, you must have at least [Contributor](../../role-based-access-control/built-in-roles.md#contributor) access to the AKS cluster.
+ - To view data after container monitoring is enabled, you must have the [Monitoring Reader](../roles-permissions-security.md#monitoring-reader) or [Monitoring Contributor](../roles-permissions-security.md#monitoring-contributor) role.
-Container insights stores its data in a [Log Analytics workspace](../logs/log-analytics-workspace-overview.md). It supports workspaces in the regions that are listed in [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?regions=all&products=monitor). For a list of the supported mapping pairs to use for the default workspace, see [Region mappings supported by Container insights](container-insights-region-mapping.md).
-
-You can let the onboarding experience create a Log Analytics workspace in the default resource group of the AKS cluster subscription. If you already have a workspace, you'll probably want to use that one. For more information, see [Designing your Azure Monitor Logs deployment](../logs/design-logs-deployment.md).
+## Authentication
- You can attach an AKS cluster to a Log Analytics workspace in a different Azure subscription in the same Microsoft Entra tenant. Currently, you can't do it with the Azure portal, but you can use the Azure CLI or an Azure Resource Manager template.
+Container insights uses managed identity authentication. This authentication model has a monitoring agent that uses the cluster's managed identity to send data to Azure Monitor. Read more in [Authentication for Container Insights](container-insights-authentication.md), including guidance on migrating from legacy authentication models.
-### Azure Monitor workspace (preview)
+> [!Note]
+> [ContainerLogV2](container-insights-logging-v2.md) is the default schema when you onboard Container insights by using ARM, Bicep, Terraform, Azure Policy, or the Azure portal. ContainerLogV2 can be explicitly enabled through CLI version 2.51.0 or higher by using data collection settings.
-If you're going to configure the cluster to [collect Prometheus metrics](container-insights-prometheus.md) with [Azure Monitor managed service for Prometheus](../essentials/prometheus-metrics-overview.md), you must have an Azure Monitor workspace where Prometheus metrics are stored. You can let the onboarding experience create an Azure Monitor workspace in the default resource group of the AKS cluster subscription or use an existing Azure Monitor workspace.
-### Permissions
+## Agent
-To enable Container insights, you require the following permissions:
+Container insights relies on a containerized [Azure Monitor agent](../agents/agents-overview.md) for Linux. This specialized agent collects performance and event data from all nodes in the cluster and sends it to a Log Analytics workspace. The agent is automatically deployed and registered with the specified Log Analytics workspace during deployment.
-- You must have at least [Contributor](../../role-based-access-control/built-in-roles.md#contributor) access to the AKS cluster.
+### Data collection rule
+[Data collection rules (DCR)](../essentials/data-collection-rule-overview.md) contain the definition of data that should be collected by Azure Monitor agent. When you enable Container insights on a cluster, a DCR is created with the name *MSCI-\<cluster-region\>-\<cluster-name\>*. Currently, this name can't be modified.
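As an illustration of the naming convention above, with hypothetical placeholder values for the cluster region and name:

```shell
# Illustration of the Container insights DCR naming convention; the region
# and cluster name below are placeholder values, not real resources.
cluster_region="eastus"
cluster_name="my-aks-cluster"
dcr_name="MSCI-${cluster_region}-${cluster_name}"
echo "$dcr_name"   # prints MSCI-eastus-my-aks-cluster
```

Knowing this derived name is useful when locating the DCR in the cluster's resource group, for example to match its tags for `resource_tag_values`.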
-To view data after container monitoring is enabled, you require the following permissions:
+Since March 1, 2023, Container insights has used a semver-compliant agent version. The agent version is *mcr.microsoft.com/azuremonitor/containerinsights/ciprod:3.1.4* or later. It's represented by the format *mcr.microsoft.com/azuremonitor/containerinsights/ciprod:\<semver compatible version\>*. When a new version of the agent is released, it's automatically upgraded on your managed Kubernetes clusters that are hosted on AKS. To track which versions are released, see [Agent release announcements](https://github.com/microsoft/Docker-Provider/blob/ci_prod/ReleaseNotes.md).
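For example, the semver portion can be pulled out of an image reference with shell parameter expansion; this is a hypothetical helper, not an official tool, and the image tag shown is only an example.

```shell
# Hypothetical helper: extract the semver tag from the agent image reference.
image="mcr.microsoft.com/azuremonitor/containerinsights/ciprod:3.1.4"
agent_version="${image##*:}"   # strip everything up to and including the last ':'
echo "$agent_version"          # prints 3.1.4
```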
-- You must have [Monitoring Reader](../roles-permissions-security.md#monitoring-reader) or [Monitoring Contributor](../roles-permissions-security.md#monitoring-contributor) role.
+> [!NOTE]
+> [Ingestion transformations](../essentials/data-collection-transformations.md) aren't currently supported with the Container insights DCR.
-### Kubelet secure port
-The containerized Linux agent (replicaset pod) makes API calls to all the Windows nodes on Kubelet secure port (10250) within the cluster to collect node and container performance-related metrics. Kubelet secure port (:10250) should be opened in the cluster's virtual network for both inbound and outbound for Windows node and container performance-related metrics collection to work.
+### Log Analytics agent
-If you have a Kubernetes cluster with Windows nodes, review and configure the network security group and network policies to make sure the Kubelet secure port (:10250) is opened for both inbound and outbound in the cluster's virtual network.
+When Container insights doesn't use managed identity authentication, it relies on a containerized [Log Analytics agent for Linux](../agents/log-analytics-agent.md). The agent version is *microsoft/oms:ciprod04202018* or later. It's represented by a date in the following format: *mmddyyyy*. When a new version of the agent is released, it's automatically upgraded on your managed Kubernetes clusters that are hosted on AKS. To track which versions are released, see [Agent release announcements](https://github.com/microsoft/docker-provider/tree/ci_feature_prod).
-## Authentication
+With the general availability of Windows Server support for AKS, an AKS cluster with Windows Server nodes has a preview agent installed as a daemon set pod on each individual Windows Server node to collect logs and forward them to Log Analytics. For performance metrics, a Linux node that's automatically deployed in the cluster as part of the standard deployment collects and forwards the data to Azure Monitor for all Windows nodes in the cluster.
-Container insights defaults to managed identity authentication. This secure and simplified authentication model has a monitoring agent that uses the cluster's managed identity to send data to Azure Monitor. It replaces the existing legacy certificate-based local authentication and removes the requirement of adding a *Monitoring Metrics Publisher* role to the cluster. Read more in [Authentication for Container Insights](container-insights-authentication.md)
+> [!NOTE]
+> If you've already deployed an AKS cluster and enabled monitoring by using either the Azure CLI or a Resource Manager template, you can't use `kubectl` to upgrade, delete, redeploy, or deploy the agent. The template needs to be deployed in the same resource group as the cluster.
-## Agent
-This section reviews the agents used by Container insights.
+## Differences between Windows and Linux clusters
-### Azure Monitor agent
+The main differences in monitoring a Windows Server cluster compared to a Linux cluster include:
-When Container insights uses managed identity authentication (in preview), it relies on a containerized Azure Monitor agent for Linux. This specialized agent collects performance and event data from all nodes in the cluster. The agent is automatically deployed and registered with the specified Log Analytics workspace during deployment.
+- Windows doesn't have a Memory RSS metric. As a result, it isn't available for Windows nodes and containers. The [Working Set](/windows/win32/memory/working-set) metric is available.
+- Disk storage capacity information isn't available for Windows nodes.
+- Only pod environments are monitored, not Docker environments.
+- With the preview release, a maximum of 30 Windows Server containers are supported. This limitation doesn't apply to Linux containers.
-### Log Analytics agent
+>[!NOTE]
+> Container insights support for the Windows Server 2022 operating system is in preview.
-When Container insights doesn't use managed identity authentication, it relies on a containerized Log Analytics agent for Linux. This specialized agent collects performance and event data from all nodes in the cluster. The agent is automatically deployed and registered with the specified Log Analytics workspace during deployment.
-The agent version is *microsoft/oms:ciprod04202018* or later. It's represented by a date in the following format: *mmddyyyy*. When a new version of the agent is released, it's automatically upgraded on your managed Kubernetes clusters that are hosted on AKS. To track which versions are released, see [Agent release announcements](https://github.com/microsoft/docker-provider/tree/ci_feature_prod).
+The containerized Linux agent (replicaset pod) makes API calls to all the Windows nodes on the Kubelet secure port (10250) within the cluster to collect node and container performance-related metrics. The Kubelet secure port (:10250) must be open for both inbound and outbound traffic in the cluster's virtual network so that Windows node and container performance metrics can be collected.
-With the general availability of Windows Server support for AKS, an AKS cluster with Windows Server nodes has a preview agent installed as a daemonset pod on each individual Windows Server node to collect logs and forward them to Log Analytics. For performance metrics, a Linux node that's automatically deployed in the cluster as part of the standard deployment collects and forwards the data to Azure Monitor for all Windows nodes in the cluster.
+If you have a Kubernetes cluster with Windows nodes, review and configure the network security group and network policies to make sure the Kubelet secure port (:10250) is open for both inbound and outbound in the cluster's virtual network.
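As a quick, hypothetical way to verify that the Kubelet secure port is reachable from a given host, a short TCP check can be sketched as follows (the use of Python and the example address are illustrative assumptions, not part of the product):

```python
import socket

def kubelet_port_open(host: str, port: int = 10250, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the Kubelet secure port succeeds."""
    try:
        # create_connection attempts a full TCP handshake to host:port.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical node address):
# kubelet_port_open("10.240.0.4")
```

A `False` result from inside the cluster's virtual network suggests a network security group rule or network policy is blocking the port.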
-> [!NOTE]
-> If you've already deployed an AKS cluster and enabled monitoring by using either the Azure CLI or a Resource Manager template, you can't use `kubectl` to upgrade, delete, redeploy, or deploy the agent. The template needs to be deployed in the same resource group as the cluster.
## Network firewall requirements
The following table lists the proxy and firewall configuration information requi
**Azure public cloud**
-|Agent resource|Port |
+| Endpoint |Port |
|--|--|
| `*.ods.opinsights.azure.com` | 443 |
| `*.oms.opinsights.azure.com` | 443 |
The following table lists the extra firewall configuration required for managed
The following table lists the proxy and firewall configuration information for Azure US Government.
-|Agent resource| Purpose | Port |
+| Endpoint | Purpose | Port |
|--|--|-|
| `*.ods.opinsights.azure.us` | Data ingestion | 443 |
| `*.oms.opinsights.azure.us` | OMS onboarding | 443 |
The following table lists the extra firewall configuration required for managed
| `global.handler.control.monitor.azure.us` | Access control service | 443 |
| `<cluster-region-name>.handler.control.monitor.azure.us` | Fetch data collection rules for specific AKS cluster | 443 |
+## Troubleshooting
-If you have registered your cluster and/or configured HCI Insights before November, 2023, features that use the AMA agent on HCI, such as Arc for Servers Insights, VM Insights, Container Insights, Defender for Cloud or Sentinel may not be collecting logs and event data properly. See [Repair AMA agent for HCI](/azure-stack/hci/manage/monitor-hci-single?tabs=22h2-and-later) for steps to reconfigure the AMA agent and HCI Insights.
+If you registered your cluster and/or configured HCI Insights before November 2023, features that use the AMA agent on HCI, such as Arc for Servers Insights, VM Insights, Container Insights, Defender for Cloud or Sentinel might not be collecting logs and event data properly. See [Repair AMA agent for HCI](/azure-stack/hci/manage/monitor-hci-single?tabs=22h2-and-later) for steps to reconfigure the AMA agent and HCI Insights.
## Next steps
azure-monitor Container Insights Private Link https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-private-link.md
+
+ Title: Enable private link with Container insights
+description: Learn how to enable private link on an Azure Kubernetes Service (AKS) cluster.
+ Last updated : 10/18/2023
+# Enable private link with Container insights
+This article describes how to configure Container insights to use Azure Private Link for your AKS cluster.
++
+## Cluster using managed identity authentication
+Use the following procedures to enable network isolation by connecting your cluster to the Log Analytics workspace using [Azure Private Link](../logs/private-link-security.md) if your cluster is using managed identity authentication.
+
+1. Follow the steps in [Enable network isolation for the Azure Monitor agent](../agents/azure-monitor-agent-data-collection-endpoint.md#enable-network-isolation-for-azure-monitor-agent) to create a data collection endpoint (DCE) and add it to your Azure Monitor private link service (AMPLS).
+
+1. Create an association between the cluster and the DCE by using the following API call. For information on this call, see [Data collection rule associations - Create](/rest/api/monitor/data-collection-rule-associations/create). The DCR association name must be **configurationAccessEndpoint**, and `resourceUri` is the resource ID of the AKS cluster.
+
+ ```rest
+ PUT https://management.azure.com/{cluster-resource-id}/providers/Microsoft.Insights/dataCollectionRuleAssociations/configurationAccessEndpoint?api-version=2021-04-01
+ {
+ "properties": {
+ "dataCollectionEndpointId": "{data-collection-endpoint-resource-id}"
+ }
+ }
+ ```
+
+ For example, with sample resource IDs:
+
+ ```rest
+ PUT https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/myResourceGroup/providers/Microsoft.ContainerService/managedClusters/my-aks-cluster/providers/Microsoft.Insights/dataCollectionRuleAssociations/configurationAccessEndpoint?api-version=2021-04-01
+
+ {
+ "properties": {
+ "dataCollectionEndpointId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/myResourceGroup/providers/Microsoft.Insights/dataCollectionEndpoints/myDataCollectionEndpoint"
+ }
+ }
+ ```
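To make the pieces of this call easier to script, here's a minimal helper sketch that assembles the PUT URL and JSON body from the two resource IDs (the helper name is hypothetical; it performs no authentication and doesn't send the request):

```python
def build_dcr_association_request(cluster_resource_id: str,
                                  dce_resource_id: str,
                                  api_version: str = "2021-04-01"):
    """Assemble the PUT URL and body for the configurationAccessEndpoint association."""
    url = (
        "https://management.azure.com"
        f"{cluster_resource_id}"
        "/providers/Microsoft.Insights/dataCollectionRuleAssociations"
        f"/configurationAccessEndpoint?api-version={api_version}"
    )
    # The body carries only the DCE resource ID, as in the REST example above.
    body = {"properties": {"dataCollectionEndpointId": dce_resource_id}}
    return url, body
```

The returned URL and body can then be sent with whatever authenticated HTTP client you already use against Azure Resource Manager.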
++
+## Cluster using legacy authentication
+Use the following procedures to enable network isolation by connecting your cluster to the Log Analytics workspace using [Azure Private Link](../logs/private-link-security.md) if your cluster is not using managed identity authentication. This requires a [private AKS cluster](../../aks/private-clusters.md).
+
+1. Create a private AKS cluster following the guidance in [Create a private Azure Kubernetes Service cluster](../../aks/private-clusters.md).
+
+2. Disable public Ingestion on your Log Analytics workspace.
+
+ Use the following command to disable public ingestion on an existing workspace.
+
+ ```cli
+ az monitor log-analytics workspace update --resource-group <azureLogAnalyticsWorkspaceResourceGroup> --workspace-name <azureLogAnalyticsWorkspaceName> --ingestion-access Disabled
+ ```
+
+ Use the following command to create a new workspace with public ingestion disabled.
+
+ ```cli
+ az monitor log-analytics workspace create --resource-group <azureLogAnalyticsWorkspaceResourceGroup> --workspace-name <azureLogAnalyticsWorkspaceName> --ingestion-access Disabled
+ ```
+
+3. Configure private link by following the instructions at [Configure your private link](../logs/private-link-configure.md). Set ingestion access to public, and then set it to private after the private endpoint is created but before monitoring is enabled. The private link resource region must be the same as the AKS cluster region.
+
+4. Enable monitoring for the AKS cluster.
+
+ ```cli
+ az aks enable-addons -a monitoring --resource-group <AKSClusterResourceGroup> --name <AKSClusterName> --workspace-resource-id <workspace-resource-id>
+ ```
+
+## Next steps
+
+* If you experience issues while you attempt to onboard the solution, review the [Troubleshooting guide](container-insights-troubleshoot.md).
+* With monitoring enabled to collect health and resource utilization of your AKS cluster and the workloads running on it, learn [how to use](container-insights-analyze.md) Container insights.
+
azure-monitor Container Insights V2 Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-v2-migration.md
# Migrate from ContainerLog to ContainerLogV2
-With the upgraded offering of ContainerLogV2 becoming generally available, on 30th September 2026, the ContainerLog table will be retired. If you currently ingest container insights data to the ContainerLog table, please transition to using ContainerLogV2 prior to that date.
+With the upgraded ContainerLogV2 offering now generally available, the ContainerLog table will be retired on 30th September 2026. If you currently ingest container insights data to the ContainerLog table, transition to using ContainerLogV2 prior to that date.
>[!NOTE]
> Support for ingesting the ContainerLog table will be **retired on 30th September 2026**.
With the upgraded offering of ContainerLogV2 becoming generally available, on 30
To transition to ContainerLogV2, we recommend the following approach.
1. Learn about the feature differences between ContainerLog and ContainerLogV2
-2. Assess the impact migrating to ContainerLogV2 may have on your existing queries, alerts, or dashboards
+2. Assess the impact migrating to ContainerLogV2 might have on your existing queries, alerts, or dashboards
3. [Enable the ContainerLogV2 schema](container-insights-logging-v2.md) through either the container insights data collection rules (DCRs) or ConfigMap
-4. Validate that you are now ingesting ContainerLogV2 to your Log Analytics workspace.
+4. Validate that you're now ingesting ContainerLogV2 to your Log Analytics workspace.
## ContainerLog vs ContainerLogV2 schema
The following table highlights the key differences between using ContainerLog and ContainerLogV2 schema.
>[!NOTE]
-> DCR based configuration is not supported for service principal based clusters. Please [migrate your clusters with service principal to managed identity](./container-insights-enable-aks.md#migrate-to-managed-identity-authentication) to use this experience.
+> DCR based configuration is not supported for service principal based clusters. [Migrate your clusters with service principal to managed identity](./container-insights-authentication.md) to use this experience.
| Feature differences | ContainerLog | ContainerLogV2 |
| - | -- | - |
The following table highlights the key differences between using ContainerLog an
## Assess the impact on existing alerts
-If you are currently using ContainerLog in your alerts, then migrating to ContainerLogV2 requires updates to your alert queries for them to continue functioning as expected.
+If you're currently using ContainerLog in your alerts, then migrating to ContainerLogV2 requires updates to your alert queries for them to continue functioning as expected.
-To scan for alerts that may be referencing the ContainerLog table, run the following Azure Resource Graph query:
+To scan for alerts that might be referencing the ContainerLog table, run the following Azure Resource Graph query:
```Kusto
resources
azure-monitor Monitor Kubernetes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/monitor-kubernetes.md
See [Enable Container insights](../containers/container-insights-onboard.md) for
Once Container insights is enabled for a cluster, perform the following actions to optimize your installation.
-- Container insights collects many of the same metric values as [Prometheus](#enable-scraping-of-prometheus-metrics). You can disable collection of these metrics by configuring Container insights to only collect **Logs and events** as described in [Enable cost optimization settings in Container insights](../containers/container-insights-cost-config.md?tabs=portal#enable-cost-settings). This configuration disables the Container insights experience in the Azure portal, but you can use Grafana to visualize Prometheus metrics and Log Analytics to analyze log data collected by Container insights.
+- Container insights collects many of the same metric values as [Prometheus](#enable-scraping-of-prometheus-metrics). You can disable collection of these metrics by configuring Container insights to only collect **Logs and events** as described in [Enable cost optimization settings in Container insights](../containers/container-insights-cost-config.md#enable-cost-settings). This configuration disables the Container insights experience in the Azure portal, but you can use Grafana to visualize Prometheus metrics and Log Analytics to analyze log data collected by Container insights.
- Reduce your cost for Container insights data ingestion by reducing the amount of data that's collected.
- To improve your query experience with data collected by Container insights and to reduce collection costs, [enable the ContainerLogV2 schema](container-insights-logging-v2.md) for each cluster. If you only use logs for occasional troubleshooting, then consider configuring this table as [basic logs](../logs/basic-logs-configure.md).
azure-monitor Prometheus Metrics Enable https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/prometheus-metrics-enable.md
This option enables Prometheus, Grafana, and Container insights on a cluster.
1. Open the clusters menu in the Azure portal and select **Insights**.
3. Select **Configure monitoring**.
-4. Container insights is already enabled. Select the checkboxes for **Enable Prometheus metrics** and **Enable Grafana**. If you have existing Azure Monitor workspace and Garafana workspace, then they're selected for you. Click **Advanced settings** to select alternate workspaces or create new ones.
+4. Container insights is already enabled. Select the checkboxes for **Enable Prometheus metrics** and **Enable Grafana**. If you have an existing Azure Monitor workspace and Grafana workspace, they're selected for you. Click **Advanced settings** to select alternate workspaces or create new ones.
:::image type="content" source="media/prometheus-metrics-enable/configure-container-insights.png" lightbox="media/prometheus-metrics-enable/configure-container-insights.png" alt-text="Screenshot that shows the dialog box to configure Container insights with Prometheus and Grafana.":::
The output for each command looks similar to the following example:
#### Optional parameters
You can use the following optional parameters with the previous commands:
-- `--ksm-metric-annotations-allow-list` is a comma-separated list of Kubernetes annotations keys used in the resource's kube_resource_annotations metric(For ex- kube_pod_annotations is the annotations metric for the pods resource). By default, the kube_resource_annotations(ex - kube_pod_annotations) metric contains only name and namespace labels. To include more annotations, provide a list of resource names in their plural form and Kubernetes annotation keys that you want to allow for them (Example: 'pods=[kubernetes.io/team,...],namespaces=[kubernetes.io/team],...)'. A single `*` can be provided per resource instead to allow any annotations, but it has severe performance implications.
-- `--ksm-metric-labels-allow-list` is a comma-separated list of more Kubernetes label keys that is used in the resource's kube_resource_labels metric(For ex- kube_pod_labels is the labels metric for the pods resource). By default the kube_resource_labels(ex - kube_pod_labels) metric contains only name and namespace labels. To include more labels, provide a list of resource names in their plural form and Kubernetes label keys that you want to allow for them (Example: 'pods=[app],namespaces=[k8s-label-1,k8s-label-n,...],...)'. A single asterisk (`*`) can be provided per resource instead to allow any labels, but it has severe performance implications.
-- `--enable-windows-recording-rules` lets you enable the recording rule groups required for proper functioning of the Windows dashboards.
+| Parameter | Description |
+|:|:|
+| `--ksm-metric-annotations-allow-list` | Comma-separated list of Kubernetes annotations keys used in the resource's kube_resource_annotations metric. For example, kube_pod_annotations is the annotations metric for the pods resource. By default, this metric contains only name and namespace labels. To include more annotations, provide a list of resource names in their plural form and Kubernetes annotation keys that you want to allow for them. A single `*` can be provided for each resource to allow any annotations, but this has severe performance implications. For example, `pods=[kubernetes.io/team,...],namespaces=[kubernetes.io/team],...`. |
+| `--ksm-metric-labels-allow-list` | Comma-separated list of more Kubernetes label keys that is used in the resource's kube_resource_labels metric. For example, kube_pod_labels is the labels metric for the pods resource. By default, this metric contains only name and namespace labels. To include more labels, provide a list of resource names in their plural form and Kubernetes label keys that you want to allow for them. A single `*` can be provided for each resource to allow any labels, but this has severe performance implications. For example, `pods=[app],namespaces=[k8s-label-1,k8s-label-n,...],...`. |
+| `--enable-windows-recording-rules` | Lets you enable the recording rule groups required for proper functioning of the Windows dashboards. |
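As an illustration of how the allow-list format in the table is structured, the hypothetical parser below splits a value such as `pods=[app],namespaces=[k8s-label-1,k8s-label-n]` into a resource-to-keys map (a sketch only, not the agent's actual parsing code):

```python
import re

def parse_allow_list(value: str) -> dict:
    """Split 'pods=[a,b],namespaces=[c]' into {'pods': ['a', 'b'], 'namespaces': ['c']}."""
    result = {}
    # Each entry is a plural resource name followed by a bracketed key list,
    # where a single '*' allows any key for that resource.
    for match in re.finditer(r"(\w+)=\[([^\]]*)\]", value):
        resource, keys = match.group(1), match.group(2)
        result[resource] = [k.strip() for k in keys.split(",") if k.strip()]
    return result
```

For example, `parse_allow_list("pods=[kubernetes.io/team]")` maps the `pods` resource to the single annotation key `kubernetes.io/team`.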
**Use annotations and labels.**
azure-monitor Data Platform https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/data-platform.md
Read more about distributed tracing at [What is distributed tracing?](app/distri
Once [Change Analysis is enabled](./change/change-analysis-enable.md), the `Microsoft.ChangeAnalysis` resource provider is registered with an Azure Resource Manager subscription to make the resource properties and configuration change data available. Change Analysis provides data for various management and troubleshooting scenarios to help users understand what changes might have caused the issues:
- Troubleshoot your application via the [Diagnose & solve problems tool](./change/change-analysis-enable.md).
-- Perform general management and monitoring via the [Change Analysis overview portal](./change/change-analysis-visualizations.md#view-change-analysis-data) and [the activity log](./change/change-analysis-visualizations.md#activity-log-change-history).
+- Perform general management and monitoring via the [Change Analysis overview portal](./change/change-analysis-visualizations.md#view-change-analysis-data) and [the activity log](./change/change-analysis-visualizations.md#view-the-activity-log-change-history).
- [Learn more about how to view data results for other scenarios](./change/change-analysis-visualizations.md).
Read more about Change Analysis, including data sources, in [Use Change Analysis in Azure Monitor](./change/change-analysis.md).
azure-monitor Snapshot Debugger App Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger-app-service.md
reviewer: cweining Previously updated : 04/24/2023 Last updated : 11/17/2023
azure-monitor Snapshot Debugger Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger-data.md
reviewer: cweining Previously updated : 04/14/2023 Last updated : 11/17/2023 # View Application Insights Snapshot Debugger data
azure-monitor Snapshot Debugger Function App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger-function-app.md
reviewer: cweining Previously updated : 08/18/2022 Last updated : 11/17/2023
azure-monitor Snapshot Debugger Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger-troubleshoot.md
reviewer: cweining Previously updated : 03/20/2023 Last updated : 11/17/2023
azure-monitor Snapshot Debugger Upgrade https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger-upgrade.md
reviewer: cweining Previously updated : 07/10/2023 Last updated : 11/17/2023
azure-monitor Snapshot Debugger Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger-vm.md
reviewer: cweining Previously updated : 03/21/2023 Last updated : 11/17/2023
azure-monitor Snapshot Debugger https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/snapshot-debugger/snapshot-debugger.md
reviewer: cweining Previously updated : 07/10/2023 Last updated : 11/17/2023 # Debug exceptions in .NET applications using Snapshot Debugger
-With Snapshot Debugger, you can automatically collect a debug snapshot when an exception occurs in your live .NET application. The debug snapshot shows the state of source code and variables at the moment the exception was thrown.
+With Snapshot Debugger, you can automatically collect a debug snapshot when an exception occurs in your live .NET application. The collected snapshots show the state of source code and variables at the moment the exception was thrown.
The Snapshot Debugger in [Application Insights](../app/app-insights-overview.md):
The Snapshot Debugger in [Application Insights](../app/app-insights-overview.md)
- Collects snapshots on your top-throwing exceptions.
- Provides information you need to diagnose issues in production.
-## How Snapshot Debugger works
-
-The Snapshot Debugger is implemented as an [Application Insights telemetry processor](../app/configuration-with-applicationinsights-config.md#telemetry-processors-aspnet). When your application runs, the Snapshot Debugger telemetry processor is added to your application's system-generated logs pipeline. The Snapshot Debugger process is as follows:
-
-1. Each time your application calls [`TrackException`](../app/asp-net-exceptions.md#exceptions):
- 1. The Snapshot Debugger computes a problem ID from the type of exception being thrown and the throwing method.
- 1. A counter is incremented for the appropriate problem ID.
- 1. When the counter reaches the `ThresholdForSnapshotting` value, the problem ID is added to a collection plan.
-1. The Snapshot Debugger also monitors exceptions as they're thrown by subscribing to the [`AppDomain.CurrentDomain.FirstChanceException`](/dotnet/api/system.appdomain.firstchanceexception) event.
- 1. When this event fires, the problem ID of the exception is computed and compared against the problem IDs in the collection plan.
-1. If there's a match between problem IDs, a snapshot of the running process is created.
-1. The snapshot is assigned a unique identifier and the exception is stamped with that identifier.
-1. After the `FirstChanceException` handler returns, the thrown exception is processed as normal.
-1. Eventually, the exception reaches the `TrackException` method again. It's reported to Application Insights, along with the snapshot identifier.
-
-The main process continues to run and serve traffic to users with little interruption. Meanwhile, the snapshot is handed off to the Snapshot Uploader process. The Snapshot Uploader creates a minidump and uploads it to Application Insights along with any relevant symbol (*.pdb*) files.
-
-> [!TIP]
-> Snapshot creation tips:
-> - A process snapshot is a suspended clone of the running process.
-> - Creating the snapshot takes about 10 milliseconds to 20 milliseconds.
-> - The default value for `ThresholdForSnapshotting` is 1. This value is also the minimum. Your app has to trigger the same exception *twice* before a snapshot is created.
-> - Set `IsEnabledInDeveloperMode` to `true` if you want to generate snapshots while you debug in Visual Studio.
-> - The snapshot creation rate is limited by the `SnapshotsPerTenMinutesLimit` setting. By default, the limit is one snapshot every 10 minutes.
-> - No more than 50 snapshots per day can be uploaded.
+[Learn more about the Snapshot Debugger and Snapshot Uploader processes.](#how-snapshot-debugger-works)
## Supported applications and environments
The following environments are supported:
> [!NOTE]
> Client applications (for example, WPF, Windows Forms, or UWP) aren't supported.
-If you enabled the Snapshot Debugger but you aren't seeing snapshots, see the [Troubleshooting guide](snapshot-debugger-troubleshoot.md).
-
-## Requirements
+## Prerequisites for using Snapshot Debugger
### Packages and configurations
If you enabled the Snapshot Debugger but you aren't seeing snapshots, see the [T
### Permissions
-Since access to snapshots is protected by Azure role-based access control, you must be added to the [Application Insights Snapshot Debugger](../../role-based-access-control/role-assignments-portal.md) role. Subscription owners can assign this role to individual users or groups for the target **Application Insights Snapshot**.
+- Verify you're added to the [Application Insights Snapshot Debugger](../../role-based-access-control/role-assignments-portal.md) role for the target **Application Insights Snapshot**.
-For more information, see [Assign Azure roles by using the Azure portal](../../role-based-access-control/role-assignments-portal.md).
+## How Snapshot Debugger works
+
+The Snapshot Debugger is implemented as an [Application Insights telemetry processor](../app/configuration-with-applicationinsights-config.md#telemetry-processors-aspnet). When your application runs, the Snapshot Debugger telemetry processor is added to your application's system-generated logs pipeline.
> [!IMPORTANT]
> Snapshots might contain personal data or other sensitive information in variable and parameter values. Snapshot data is stored in the same region as your Application Insights resource.
+### Snapshot Debugger process
+
+The Snapshot Debugger process starts and ends with the `TrackException` method. A process snapshot is a suspended clone of the running process, so that your users experience little to no interruption.
+
+1. Your application calls [`TrackException`](../app/asp-net-exceptions.md#exceptions).
+
+1. The Snapshot Debugger monitors exceptions as they're thrown by subscribing to the [`AppDomain.CurrentDomain.FirstChanceException`](/dotnet/api/system.appdomain.firstchanceexception) event.
+
+1. The Snapshot Debugger computes a problem ID from the exception type and throwing method, and a counter is incremented for that problem ID.
+ - When the counter reaches the `ThresholdForSnapshotting` value, the problem ID is added to a collection plan.
+
+ > [!NOTE]
+ > The default value for `ThresholdForSnapshotting` is 1, which is also the minimum. With this value, your app has to trigger the same exception *twice* before a snapshot is created.
+
+1. The exception event's problem ID is computed and compared against the problem IDs in the collection plan.
+
+1. If there's a match between problem IDs, a **snapshot** of the running process is created.
+ - The snapshot is assigned a unique identifier and the exception is stamped with that identifier.
+
+ > [!NOTE]
+ > The snapshot creation rate is limited by the `SnapshotsPerTenMinutesLimit` setting. By default, the limit is one snapshot every 10 minutes.
+
+1. After the `FirstChanceException` handler returns, the thrown exception is processed as normal.
+
+1. The exception reaches the `TrackException` method again and is reported to Application Insights, along with the snapshot identifier.
+
+> [!NOTE]
+> Set `IsEnabledInDeveloperMode` to `true` if you want to generate snapshots while you debug in Visual Studio.
+
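The counting, threshold, and rate-limit behavior described in the steps above can be sketched as follows (a simplified illustration of the documented defaults, not the actual SDK implementation):

```python
from collections import defaultdict

THRESHOLD_FOR_SNAPSHOTTING = 1      # default and minimum value
SNAPSHOT_INTERVAL_SECONDS = 600     # default rate limit: one snapshot every 10 minutes

exception_counters = defaultdict(int)   # occurrences per problem ID
collection_plan = set()                 # problem IDs eligible for snapshots
last_snapshot_time = -SNAPSHOT_INTERVAL_SECONDS

def should_snapshot(problem_id: str, now: float) -> bool:
    """Return True if this exception occurrence should trigger a snapshot."""
    global last_snapshot_time
    exception_counters[problem_id] += 1
    # The problem ID joins the collection plan once the counter passes the threshold.
    if exception_counters[problem_id] > THRESHOLD_FOR_SNAPSHOTTING:
        collection_plan.add(problem_id)
    # Snapshot only for planned problem IDs, subject to the rate limit.
    if problem_id in collection_plan and now - last_snapshot_time >= SNAPSHOT_INTERVAL_SECONDS:
        last_snapshot_time = now
        return True
    return False
```

With the default threshold of 1, the first occurrence only increments the counter; the second occurrence of the same problem ID produces a snapshot, and later occurrences are rate limited.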
+### Snapshot Uploader process
+
+While the Snapshot Debugger process continues to run and serve traffic to users with little interruption, the snapshot is handed off to the Snapshot Uploader process. The Snapshot Uploader:
+
+1. Creates a minidump.
+
+1. Uploads the minidump to Application Insights, along with any relevant symbol (*.pdb*) files.
+
+> [!NOTE]
+> No more than 50 snapshots per day can be uploaded.
+
+If you enabled the Snapshot Debugger but you aren't seeing snapshots, see the [Troubleshooting guide](snapshot-debugger-troubleshoot.md).
+## Limitations
This section discusses limitations for the Snapshot Debugger.
-### Data retention
+- **Data retention**
+
+ Debug snapshots are stored for 15 days. The default data retention policy is set on a per-application basis. If you need to increase this value, you can request an increase by opening a support case in the Azure portal. For each Application Insights instance, a maximum of 50 snapshots is allowed per day.
+
+- **Publish symbols**
-Debug snapshots are stored for 15 days. The default data retention policy is set on a per-application basis. If you need to increase this value, you can request an increase by opening a support case in the Azure portal. For each Application Insights instance, a maximum number of 50 snapshots are allowed per day.
+ The Snapshot Debugger requires symbol files on the production server to:
+ - Decode variables
+ - Provide a debugging experience in Visual Studio
-### Publish symbols
+ By default, Visual Studio 2017 versions 15.2+ publishes symbols for release builds when it publishes to App Service.
-The Snapshot Debugger requires symbol files on the production server to:
-- Decode variables
-- Provide a debugging experience in Visual Studio
+ In prior versions, you must add the following line to your publish profile `.pubxml` file so that symbols are published in release mode:
-By default, Visual Studio 2017 versions 15.2+ publishes symbols for release builds when it publishes to App Service.
+ ```xml
+ <ExcludeGeneratedDebugSymbol>False</ExcludeGeneratedDebugSymbol>
+ ```
-In prior versions, you must add the following line to your publish profile `.pubxml` file so that symbols are published in release mode:
+ For Azure Compute and other types, make sure that the symbol files are either:
+ - In the same folder of the main application `.dll` (typically, `wwwroot/bin`), or
+ - Available on the current path.
-```xml
- <ExcludeGeneratedDebugSymbol>False</ExcludeGeneratedDebugSymbol>
-```
+ For more information on the different symbol options that are available, see the [Visual Studio documentation](/visualstudio/ide/reference/advanced-build-settings-dialog-box-csharp). For best results, we recommend that you use *Full*, *Portable*, or *Embedded*.
-For Azure Compute and other types, make sure that the symbol files are either:
-- In the same folder of the main application `.dll` (typically, `wwwroot/bin`), or
-- Available on the current path.
+- **Optimized builds**
-For more information on the different symbol options that are available, see the [Visual Studio documentation](/visualstudio/ide/reference/advanced-build-settings-dialog-box-csharp). For best results, we recommend that you use *Full*, *Portable*, or *Embedded*.
+ In some cases, local variables can't be viewed in release builds because of optimizations applied by the JIT compiler.
-### Optimized builds
+ However, in App Service, the Snapshot Debugger can deoptimize throwing methods that are part of its collection plan.
-In some cases, local variables can't be viewed in release builds because of optimizations applied by the JIT compiler.
+ > [!TIP]
+ > Install the Application Insights Site extension in your instance of App Service to get deoptimization support.
-However, in App Service, the Snapshot Debugger can deoptimize throwing methods that are part of its collection plan.
+## Next steps
+
+Enable the Application Insights Snapshot Debugger for your application:
-> [!TIP]
-> Install the Application Insights Site extension in your instance of App Service to get deoptimization support.
+- [Azure App Service](snapshot-debugger-app-service.md?toc=/azure/azure-monitor/toc.json)
+- [Azure Functions](snapshot-debugger-function-app.md?toc=/azure/azure-monitor/toc.json)
+- [Azure Cloud Services](snapshot-debugger-vm.md?toc=/azure/azure-monitor/toc.json)
+- [Azure Service Fabric](snapshot-debugger-vm.md?toc=/azure/azure-monitor/toc.json)
+- [Azure Virtual Machines and Virtual Machine Scale Sets](snapshot-debugger-vm.md?toc=/azure/azure-monitor/toc.json)
+- [On-premises virtual or physical machines](snapshot-debugger-vm.md?toc=/azure/azure-monitor/toc.json)
-## Release notes for Microsoft.ApplicationInsights.SnapshotCollector
+## Release notes for `Microsoft.ApplicationInsights.SnapshotCollector`
This section contains the release notes for the `Microsoft.ApplicationInsights.SnapshotCollector` NuGet package for .NET applications, which is used by the Application Insights Snapshot Debugger.
For this first version using the new pipeline, we haven't strayed far from the o
- Added host memory protection. This feature reduces the impact on the host machine's memory. - Improved the Azure portal snapshot viewing experience.
-## Next steps
-
-Enable the Application Insights Snapshot Debugger for your application:
-
-* [Azure App Service](snapshot-debugger-app-service.md?toc=/azure/azure-monitor/toc.json)
-* [Azure Functions](snapshot-debugger-function-app.md?toc=/azure/azure-monitor/toc.json)
-* [Azure Cloud Services](snapshot-debugger-vm.md?toc=/azure/azure-monitor/toc.json)
-* [Azure Service Fabric](snapshot-debugger-vm.md?toc=/azure/azure-monitor/toc.json)
-* [Azure Virtual Machines and Virtual Machine Scale Sets](snapshot-debugger-vm.md?toc=/azure/azure-monitor/toc.json)
-* [On-premises virtual or physical machines](snapshot-debugger-vm.md?toc=/azure/azure-monitor/toc.json)
-
-Beyond Application Insights Snapshot Debugger:
-* [Set snappoints in your code](/visualstudio/debugger/debug-live-azure-applications) to get snapshots without waiting for an exception.
-* [Diagnose exceptions in your web apps](../app/asp-net-exceptions.md) explains how to make more exceptions visible to Application Insights.
-* [Smart detection](../alerts/proactive-diagnostics.md) automatically discovers performance anomalies.
azure-netapp-files Manage Manual Qos Capacity Pool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/manage-manual-qos-capacity-pool.md
You can change a capacity pool that currently uses the auto QoS type to use the
> Setting the capacity type to manual QoS is a permanent change. You cannot convert a manual QoS type capacity pool to an auto QoS capacity pool. > At conversion time, throughput levels might be capped to conform to the throughput limits for volumes of the manual QoS type. See [Resource limits for Azure NetApp Files](azure-netapp-files-resource-limits.md#resource-limits).
-1. From the management blade for your NetApp account, click **Capacity pools** to display existing capacity pools.
+>[!NOTE]
+>An auto QoS capacity pool enabled for [standard storage with cool access](cool-access-introduction.md) cannot be converted to a capacity pool using manual QoS.
+
+1. From the management blade for your NetApp account, select **Capacity pools** to display existing capacity pools.
-2. Click the capacity pool that you want to change to using manual QoS.
+2. Select the capacity pool that you want to change to using manual QoS.
-3. Click **Change QoS type**. Then set **New QoS Type** to **Manual**. Click **OK**.
+3. Select **Change QoS type**. Then set **New QoS Type** to **Manual**. Select **OK**.
![Change QoS type](../media/azure-netapp-files/change-qos-type.png) - ## Monitor the throughput of a manual QoS capacity pool Metrics are available to help you monitor the read and write throughput of a volume. See [Metrics for Azure NetApp Files](azure-netapp-files-metrics.md).
If a volume is contained in a manual QoS capacity pool, you can modify the allot
1. From the **Volumes** page, select the volume whose throughput you want to modify.
-2. Click **Change throughput**. Specify the **Throughput (MiB/S)** that you want. Click **OK**.
+2. Select **Change throughput**. Specify the **Throughput (MiB/S)** that you want. Select **OK**.
![Change QoS throughput](../media/azure-netapp-files/change-qos-throughput.png)
azure-resource-manager Tag Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/tag-support.md
To get the same data as a file of comma-separated values, download [tag-support.
> | servers / usages | No | No | > | servers / virtualNetworkRules | No | No | > | servers / vulnerabilityAssessments | No | No |
-> | virtualClusters | No | No |
+> | virtualClusters | Yes | No |
<a id="sqlnote"></a> > [!NOTE] > The Master database doesn't support tags, but other databases, including Azure Synapse Analytics databases, support tags. Azure Synapse Analytics databases must be in Active (not Paused) state.
+> [!NOTE]
+> Only Virtual Clusters with version 2.0 support tags. The minimum required API version for configuring tags is 2022-05-01.
## Microsoft.SqlVirtualMachine
azure-vmware Configure Windows Server Failover Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/configure-windows-server-failover-cluster.md
Title: Configure Windows Server Failover Cluster on Azure VMware Solution vSAN
description: Learn how to configure Windows Server Failover Cluster (WSFC) on Azure VMware Solution vSAN with native shared disks. Previously updated : 10/07/2022 Last updated : 11/17/2023 # Configure Windows Server Failover Cluster on Azure VMware Solution vSAN
In this article, you'll learn how to configure [Failover Clustering in Windows S
Windows Server Failover Cluster, previously known as Microsoft Service Cluster Service (MSCS), is a Windows Server Operating System (OS) feature. WSFC is a business-critical feature, and for many applications is required. For example, WSFC is required for the following configurations: -- SQL server configured as:
+- SQL Server configured as:
- Always On Failover Cluster Instance (FCI), for instance-level high availability. - Always On Availability Group (AG), for database-level high availability. - Windows File
communication-services Exception Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/router/exception-policy.md
await administrationClient.path("/routing/exceptionPolicies/{exceptionPolicyId}"
exceptionRules: [ { id: "cancelJob",
- trigger: { kind: "queue-length", threshold: 100 },
+ trigger: { kind: "queueLength", threshold: 100 },
actions: [{ kind: "cancel" }] } ]
await administrationClient.path("/routing/exceptionPolicies/{exceptionPolicyId}"
exceptionRules: [ { id: "increasePriority",
- trigger: { kind: "wait-time", thresholdSeconds: "60" },
+ trigger: { kind: "waitTime", thresholdSeconds: "60" },
actions: [{ "manual-reclassify", priority: 10 }] }, { id: "changeQueue",
- trigger: { kind: "wait-time", thresholdSeconds: "300" },
+ trigger: { kind: "waitTime", thresholdSeconds: "300" },
actions: [{ kind: "manual-reclassify", queueId: "queue2" }] }] },
communication-services Matching Concepts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/router/matching-concepts.md
In the following example, we register a worker to:
::: zone pivot="programming-language-csharp" ```csharp
-await client.CreateWorkerAsync(new CreateWorkerOptions(workerId: "worker-1", capacity: 2)
+var worker = await client.CreateWorkerAsync(new CreateWorkerOptions(workerId: "worker-1", capacity: 2)
{ AvailableForOffers = true, Queues = { "queue1", "queue2" },
await client.CreateWorkerAsync(new CreateWorkerOptions(workerId: "worker-1", cap
::: zone pivot="programming-language-javascript" ```typescript
-await client.path("/routing/workers/{workerId}", "worker-1").patch({
+const worker = await client.path("/routing/workers/{workerId}", "worker-1").patch({
body: { availableForOffers: true, capacity: 2,
await client.path("/routing/workers/{workerId}", "worker-1").patch({
::: zone pivot="programming-language-python" ```python
-client.upsert_worker(
+worker = client.upsert_worker(
worker_id = "worker-1", available_for_offers = True, capacity = 2,
client.upsert_worker(
::: zone pivot="programming-language-java" ```java
-client.createWorker(new CreateWorkerOptions("worker-1", 2)
+RouterWorker worker = client.createWorker(new CreateWorkerOptions("worker-1", 2)
.setAvailableForOffers(true) .setQueues(List.of("queue1", "queue2")) .setChannels(List.of(
If a worker would like to stop receiving offers, it can be deregistered by setti
::: zone pivot="programming-language-csharp" ```csharp
-await client.UpdateWorkerAsync(new RouterWorker(workerId: "worker-1") { AvailableForOffers = false });
+worker.AvailableForOffers = false;
+await client.UpdateWorkerAsync(worker);
``` ::: zone-end
client.upsert_worker(worker_id = "worker-1", available_for_offers = False)
::: zone pivot="programming-language-java" ```java
-client.updateWorker(new RouterWorker("worker-1").setAvailableForOffers(false));
+client.updateWorkerWithResponse("worker-1", worker.setAvailableForOffers(false));
``` ::: zone-end
communication-services Router Rule Concepts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/router/router-rule-concepts.md
The following rule engine types exist in Job Router to provide flexibility in ho
**Azure Function rule -** Allows the Job Router to pass the input labels as a payload to an Azure Function and respond back with an output value.
+**Webhook rule -** Allows the Job Router to pass the input labels as a payload to a Webhook and respond back with an output value.
+
+**Direct map rule -** Takes the input labels on a job and outputs a set of worker or queue selectors with the same key and values. This should only be used in the `ConditionalQueueSelectorAttachment` or `ConditionalWorkerSelectorAttachment`.
+ ### Example: Use a static rule to set the priority of a job In this example a `StaticRouterRule`, which is a subtype of `RouterRule` can be used to set the priority of all Jobs, which use this classification policy.
await administrationClient.CreateClassificationPolicyAsync(
```typescript await administrationClient.path("/routing/classificationPolicies/{classificationPolicyId}", "my-policy-id").patch({ body: {
- prioritizationRule: { kind: "static-rule", value: 5 }
+ prioritizationRule: { kind: "static", value: 5 }
}, contentType: "application/merge-patch+json" });
await administrationClient.CreateClassificationPolicyAsync(
await administrationClient.path("/routing/classificationPolicies/{classificationPolicyId}", "my-policy-id").patch({ body: { prioritizationRule: {
- kind: "expression-rule",
+ kind: "expression",
expression: "If(job.Escalated = true, 10, 5)" } },
communication-services Calling Sdk Features https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/voice-video-calling/calling-sdk-features.md
The following list presents the set of features that are currently available in
| | Invite another VoIP participant to join an ongoing group call | ✔️ | ✔️ | ✔️ | ✔️ | | Mid call control | Turn your video on/off | ✔️ | ✔️ | ✔️ | ✔️ | | | Mute/Unmute mic | ✔️ | ✔️ | ✔️ | ✔️ |
-| | Mute other participants |✔️<sup>1</sup> | ❌ | ❌ | ❌ |
+| | Mute other participants |✔️<sup>1</sup> | ✔️<sup>1</sup> | ❌ | ❌ |
| | Switch between cameras | ✔️ | ✔️ | ✔️ | ✔️ | | | Local hold/un-hold | ✔️ | ✔️ | ✔️ | ✔️ | | | Active speaker | ✔️ | ✔️ | ✔️ | ✔️ |
communication-services Accept Decline Offer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/how-tos/router-sdk/accept-decline-offer.md
client.decline_job_offer(
client.declineJobOffer( offerIssuedEvent.getData().getWorkerId(), offerIssuedEvent.getData().getOfferId(),
- new DeclineJobOfferOptions().setRetryOfferAt(OffsetDateTime.now().plusMinutes(5)));
+ new RequestOptions().setBody(BinaryData.fromObject(
+ new DeclineJobOfferOptions().setRetryOfferAt(OffsetDateTime.now().plusMinutes(5)))));
``` ::: zone-end
await routerClient.CompleteJobAsync(new CompleteJobOptions(jobId: accept.Value.J
::: zone pivot="programming-language-javascript" ```typescript
-await client.path("/routing/jobs/{jobId}:complete", accept.body.jobId, accept.body.assignmentId).post();
+await client.path("/routing/jobs/{jobId}/assignments/{assignmentId}:complete", accept.body.jobId, accept.body.assignmentId).post();
``` ::: zone-end
router_client.complete_job(job_id = job.id, assignment_id = accept.assignment_id
::: zone pivot="programming-language-java" ```java
-routerClient.completeJob(accept.getJobId(), accept.getAssignmentId());
+routerClient.completeJobWithResponse(accept.getJobId(), accept.getAssignmentId(), null);
``` ::: zone-end
await routerClient.CloseJobAsync(new CloseJobOptions(jobId: accept.Value.JobId,
::: zone pivot="programming-language-javascript" ```typescript
-await client.path("/routing/jobs/{jobId}:close", accept.body.jobId, accept.body.assignmentId).post({
+await client.path("/routing/jobs/{jobId}/assignments/{assignmentId}:close", accept.body.jobId, accept.body.assignmentId).post({
body: { dispositionCode: "Resolved" }
router_client.close_job(job_id = job.id, assignment_id = accept.assignment_id, d
::: zone pivot="programming-language-java" ```java
-routerClient.closeJob(accept.getJobId(), accept.getAssignmentId(), new CloseJobOptions()
- .setDispositionCode("Resolved"));
+routerClient.closeJobWithResponse(accept.getJobId(), accept.getAssignmentId(),
+ new RequestOptions().setBody(BinaryData.fromObject(new CloseJobOptions().setDispositionCode("Resolved"))));
``` ::: zone-end
communication-services Escalate Job https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/how-tos/router-sdk/escalate-job.md
var classificationPolicy = await client.path("/routing/classificationPolicies/{c
queueSelectorAttachments: [{ kind: "conditional", condition: {
- kind: "expression-rule",
+ kind: "expression",
expression: 'job.Escalated = true' }, queueSelectors: [{
var classificationPolicy = await client.path("/routing/classificationPolicies/{c
}] }], prioritizationRule: {
- kind: "expression-rule",
+ kind: "expression",
expression: "If(job.Escalated = true, 10, 1)" } },
await client.path("/routing/exceptionPolicies/{exceptionPolicyId}", "Escalate_XB
exceptionRules: [ { id: "Escalated_Rule",
- trigger: { kind: "wait-time", thresholdSeconds: 5 * 60 },
+ trigger: { kind: "waitTime", thresholdSeconds: 5 * 60 },
actions: [{ kind: "reclassify", classificationPolicyId: classificationPolicy.body.id, labelsToUpsert: { Escalated: true }}] }] },
communication-services Job Classification https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/how-tos/router-sdk/job-classification.md
var classificationPolicy = await client.path("/routing/classificationPolicies/{c
queueSelectorAttachments: [{ kind: "conditional", condition: {
- kind: "expression-rule",
+ kind: "expression",
expression: 'job.Region = "NA"' }, queueSelectors: [{
var classificationPolicy = await client.path("/routing/classificationPolicies/{c
}], fallbackQueueId: "XBOX_DEFAULT_QUEUE", prioritizationRule: {
- kind: "expression-rule",
+ kind: "expression",
expression: "If(job.Hardware_VIP = true, 10, 1)" } },
await client.path("/routing/classificationPolicies/{classificationPolicyId}", "p
body: { workerSelectorAttachments: [{ kind: "conditional",
- condition: { kind: "expression-rule", expression: "job.Urgent = true" },
+ condition: { kind: "expression", expression: "job.Urgent = true" },
workerSelectors: [{ key: "Foo", labelOperator: "equal", value: "Bar" }] }] },
await administrationClient.CreateClassificationPolicyAsync(
```typescript await client.path("/routing/classificationPolicies/{classificationPolicyId}", "policy-1").patch({ body: {
- workerSelectorAttachments: [{ kind: "pass-through", key: "Foo", labelOperator: "equal" }]
+ workerSelectorAttachments: [{ kind: "passThrough", key: "Foo", labelOperator: "equal" }]
}, contentType: "application/merge-patch+json" });
await administrationClient.CreateClassificationPolicyAsync(new CreateClassificat
await client.path("/routing/classificationPolicies/{classificationPolicyId}", "policy-1").patch({ body: { workerSelectorAttachments: [{
- kind: "weighted-allocation-worker-selector",
+ kind: "weightedAllocation",
allocations: [ { weight: 0.3,
communication-services Manage Queue https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/how-tos/router-sdk/manage-queue.md
const distributionPolicy = await client.path("/routing/distributionPolicies/{dis
body: { offerExpiresAfterSeconds: 45, mode: {
- kind: "longest-idle",
+ kind: "longestIdle",
minConcurrentOffers: 1, maxConcurrentOffers: 10 },
administration_client.upsert_queue(queue.id, queue)
```java queue.setName("XBOX Updated Queue"); queue.setLabels(Map.of("Additional-Queue-Label", new RouterValue("ChatQueue")));
-administrationClient.updateQueue(queue);
+administrationClient.updateQueue(queue.getId(), BinaryData.fromObject(queue));
``` ::: zone-end
communication-services Scheduled Jobs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/how-tos/router-sdk/scheduled-jobs.md
await client.path("/routing/jobs/{jobId}", "job1").patch({
channelId: "Voice", queueId: "Callback", matchingMode: {
- scheduleAndSuspendMode: {
- scheduleAt: new Date(Date.now() + 3 * 60000)
- }
+ kind: "scheduleAndSuspend",
+ scheduleAt: new Date(Date.now() + 3 * 60000)
} }, contentType: "application/merge-patch+json"
if (eventGridEvent.EventType == "Microsoft.Communication.RouterJobWaitingForActi
await client.path("/routing/jobs/{jobId}", eventGridEvent.data.jobId).patch({ body: {
- matchingMode: { kind: "queue-and-match" },
+ matchingMode: { kind: "queueAndMatch" },
priority: 100 }, contentType: "application/merge-patch+json"
container-apps Dapr Component Resiliency https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/dapr-component-resiliency.md
az extension update --name containerapp
Create resiliency policies by targeting an individual policy. For example, to create the `Outbound Timeout` policy, run the following command. ```azurecli
-az containerapp env dapr-component resiliency create -g MyResourceGroup -n MyDaprResiliency --environment MyEnvironment --dapr-component-name MyDaprComponentName --out-timeout 20
+az containerapp env dapr-component resiliency create --resource-group MyResourceGroup --name MyDaprResiliency --environment MyEnvironment --dapr-component-name MyDaprComponentName --out-timeout 20
``` [For a full list of parameters, see the CLI reference guide.](/cli/azure/containerapp/resiliency#az-containerapp-resiliency-create-optional-parameters)
az containerapp env dapr-component resiliency create -g MyResourceGroup -n MyDap
To apply the resiliency policies from a YAML file, run the following command: ```azurecli
-az containerapp env dapr-component resiliency create -g MyResourceGroup -n MyDaprResiliency --environment MyEnvironment --dapr-component-name MyDaprComponentName --yaml <MY_YAML_FILE>
+az containerapp env dapr-component resiliency create --resource-group MyResourceGroup --name MyDaprResiliency --environment MyEnvironment --dapr-component-name MyDaprComponentName --yaml <MY_YAML_FILE>
``` This command passes the resiliency policy YAML file, which might look similar to the following example:
inboundPolicy:
Update your resiliency policies by targeting an individual policy. For example, to update the response timeout of the `Outbound Timeout` policy, run the following command. ```azurecli
-az containerapp env dapr-component resiliency update -g MyResourceGroup -n MyDaprResiliency --environment MyEnvironment --dapr-component-name MyDaprComponentName --out-timeout 20
+az containerapp env dapr-component resiliency update --resource-group MyResourceGroup --name MyDaprResiliency --environment MyEnvironment --dapr-component-name MyDaprComponentName --out-timeout 20
``` ### Update policies with resiliency YAML
container-apps Service Discovery Resiliency https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/service-discovery-resiliency.md
az extension update --name containerapp
To create a resiliency policy with recommended settings for timeouts, retries, and circuit breakers, run the `resiliency create` command with the `--recommended` flag: ```azurecli
-az containerapp resiliency create -g MyResourceGroup -n MyResiliencyName --container-app-name MyContainerApp --recommended
+az containerapp resiliency create --resource-group MyResourceGroup --name MyResiliencyName --container-app-name MyContainerApp --recommended
```
-This command passes the recommeded resiliency policy configurations, as shown in the following example:
+This command passes the recommended resiliency policy configurations, as shown in the following example:
```yaml httpRetryPolicy:
tcpRetryPolicy:
timeoutPolicy: connectionTimeoutInSeconds: 5 responseTimeoutInSeconds: 60
-ircuitBreakerPolicy:
+circuitBreakerPolicy:
consecutiveErrors: 5 intervalInSeconds: 10 maxEjectionPercent: 100
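Assembled from the fragments above, a resiliency YAML might look like the following sketch. It includes only the fields shown in this section, with illustrative values:

```yaml
# Sketch of a resiliency policy file; field names are those shown above.
timeoutPolicy:
  connectionTimeoutInSeconds: 5
  responseTimeoutInSeconds: 60
circuitBreakerPolicy:
  consecutiveErrors: 5
  intervalInSeconds: 10
  maxEjectionPercent: 100
```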
ircuitBreakerPolicy:
Create resiliency policies by targeting an individual policy. For example, to create the `Timeout` policy, run the following command. ```azurecli
-az containerapp resiliency update -g MyResourceGroup -n MyResiliency --container-app-name MyContainerApp --timeout 20 --timeout-connect 5
+az containerapp resiliency update --resource-group MyResourceGroup --name MyResiliency --container-app-name MyContainerApp --timeout 20 --timeout-connect 5
``` [For a full list of parameters, see the CLI reference guide.](/cli/azure/containerapp/resiliency#az-containerapp-resiliency-create-optional-parameters)
az containerapp resiliency update -g MyResourceGroup -n MyResiliency --container
To apply the resiliency policies from a YAML file, run the following command: ```azurecli
-az containerapp resiliency create -g MyResourceGroup ΓÇôn MyResiliency --container-app-name MyContainerApp ΓÇôyaml <MY_YAML_FILE>
+az containerapp resiliency create --resource-group MyResourceGroup --name MyResiliency --container-app-name MyContainerApp --yaml <MY_YAML_FILE>
``` This command passes the resiliency policy YAML file, which might look similar to the following example:
httpConnectionPool:
Update your resiliency policies by targeting an individual policy. For example, to update the response timeout of the `Timeout` policy, run the following command. ```azurecli
-az containerapp resiliency update -g MyResourceGroup -n MyResiliency --container-app-name MyContainerApp --timeout 20
+az containerapp resiliency update --resource-group MyResourceGroup --name MyResiliency --container-app-name MyContainerApp --timeout 20
``` ### Update policies with resiliency YAML
az containerapp resiliency update -g MyResourceGroup -n MyResiliency --container
You can also update existing resiliency policies by updating the resiliency YAML you created earlier. ```azurecli
-az containerapp resiliency update --name MyResiliency -g MyResourceGroup --container-app-name MyContainerApp --yaml <MY_YAML_FILE>
+az containerapp resiliency update --name MyResiliency --resource-group MyResourceGroup --container-app-name MyContainerApp --yaml <MY_YAML_FILE>
``` ### View policies
az containerapp resiliency update --name MyResiliency -g MyResourceGroup --conta
Use the `resiliency list` command to list all the resiliency policies attached to a container app. ```azurecli
-az containerapp resiliency list -g MyResourceGroup --container-app-name MyContainerAppΓÇï
+az containerapp resiliency list --resource-group MyResourceGroup --container-app-name MyContainerApp
``` Use `resiliency show` command to show a single policy by name. ```azurecli
-az containerapp resiliency show -g MyResourceGroup -n MyResiliency --container-app-name ΓÇïMyContainerApp
+az containerapp resiliency show --resource-group MyResourceGroup --name MyResiliency --container-app-name MyContainerApp
``` ### Delete policies
az containerapp resiliency show -g MyResourceGroup -n MyResiliency --container-a
To delete resiliency policies, run the following command. ```azurecli
-az containerapp resiliency delete -g MyResourceGroup -n MyResiliency --container-app-name ΓÇïMyContainerApp
+az containerapp resiliency delete --resource-group MyResourceGroup --name MyResiliency --container-app-name MyContainerApp
``` # [Azure portal](#tab/portal)
container-registry Container Registry Tutorial Sign Trusted Ca https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-registry/container-registry-tutorial-sign-trusted-ca.md
+
+ Title: Sign container images with Notation and Azure Key Vault using a CA-issued certificate (Preview)
+description: In this tutorial learn to create a CA-issued certificate in Azure Key Vault, build and sign a container image stored in Azure Container Registry (ACR) with notation and AKV, and then verify the container image using notation.
+++++ Last updated : 6/9/2023++
+# Sign container images with Notation and Azure Key Vault using a CA-issued certificate (Preview)
+
+Signing and verifying container images with a certificate issued by a trusted Certificate Authority (CA) is a valuable security practice. It helps you identify, authorize, and validate both the publisher of a container image and the image itself. Trusted CAs such as GlobalSign and DigiCert play a crucial role in validating a user's or organization's identity, maintaining the security of digital certificates, and revoking a certificate immediately upon any risk or misuse.
+
+Here are some essential components that help you to sign and verify container images with a certificate issued by a trusted CA:
+
+* [Notation](https://github.com/notaryproject/notation) is an open-source supply chain tool developed by the [Notary Project](https://notaryproject.dev/) that supports signing and verifying container images and other artifacts.
+* Azure Key Vault (AKV), a cloud-based service for managing cryptographic keys, secrets, and certificates, helps you securely store and manage a certificate with a signing key.
+* The [Notation AKV plugin azure-kv](https://github.com/Azure/notation-azure-kv) is a Notation extension that uses keys stored in Azure Key Vault to sign and verify the digital signatures of container images and artifacts.
+* Azure Container Registry (ACR) lets you attach these signatures to the signed image, and store and manage the container images.
+
+When you verify the image, the signature is used to validate the image's integrity and the signer's identity. This helps ensure that container images aren't tampered with and come from a trusted source.
+
+> [!IMPORTANT]
+> This feature is currently in preview. Previews are made available to you on the condition that you agree to the [supplemental terms of use][terms-of-use]. Some aspects of this feature may change prior to general availability (GA).
+
+In this article:
+
+> [!div class="checklist"]
+> * Install the notation CLI and AKV plugin
+> * Create or import a certificate issued by a CA in AKV
+> * Build and push a container image with ACR task
+> * Sign a container image with Notation CLI and AKV plugin
+> * Verify a container image signature with Notation CLI
+
+## Prerequisites
+
+* Create or use an [Azure Container Registry](../container-registry/container-registry-get-started-azure-cli.md) for storing container images and signatures
+* Create or use an [Azure Key Vault](../key-vault/general/quick-create-cli.md).
+* Install and configure the latest [Azure CLI](/cli/azure/install-azure-cli), or run commands in the [Azure Cloud Shell](https://portal.azure.com/#cloudshell/)
+
+> [!NOTE]
+> We recommend creating a new Azure Key Vault for storing certificates only.
+
+## Install the notation CLI and AKV plugin
+
+1. Install `Notation v1.0.0` on a Linux amd64 environment. Follow the [Notation installation guide](https://notaryproject.dev/docs/user-guides/installation/cli/) to download the package for other environments.
+
+ ```bash
+ # Download, extract and install
+ curl -Lo notation.tar.gz https://github.com/notaryproject/notation/releases/download/v1.0.0/notation_1.0.0_linux_amd64.tar.gz
+ tar xvzf notation.tar.gz
+
+ # Copy the notation cli to the desired bin directory in your PATH, for example
+ cp ./notation /usr/local/bin
+ ```
+
+2. Install the notation Azure Key Vault plugin on a Linux environment for remote signing. You can also download the package for other environments by following the [Notation AKV plugin installation guide](https://github.com/Azure/notation-azure-kv#installation-the-akv-plugin).
+
+ ```bash
+ # Create a directory for the plugin
+ mkdir -p ~/.config/notation/plugins/azure-kv
+
+ # Download the plugin
+ curl -Lo notation-azure-kv.tar.gz https://github.com/Azure/notation-azure-kv/releases/download/v1.0.1/notation-azure-kv_1.0.1_linux_amd64.tar.gz
+
+ # Extract to the plugin directory
+ tar xvzf notation-azure-kv.tar.gz -C ~/.config/notation/plugins/azure-kv
+ ```
+
+> [!NOTE]
+> The plugin directory varies depending on the operating system in use. The directory path shown assumes Ubuntu. For more information, see [Notation directory structure for system configuration](https://notaryproject.dev/docs/user-guides/how-to/directory-structure/).
+
+3. List the available plugins.
+
+ ```bash
+ notation plugin ls
+ ```
+
+## Configure environment variables
+
+> [!NOTE]
+> This guide uses environment variables for convenience when configuring the AKV and ACR. Update the values of these environment variables for your specific resources.
+
+1. Configure AKV resource names.
+
+ ```bash
+ # Name of the existing Azure Key Vault used to store the signing keys
+ AKV_NAME=myakv
+
+ # Name of the certificate created or imported in AKV
+ CERT_NAME=wabbit-networks-io
+
+ # X.509 certificate subject
+ CERT_SUBJECT="CN=wabbit-networks.io,O=Notation,L=Seattle,ST=WA,C=US"
+ ```
+
+2. Configure ACR and image resource names.
+
+ ```bash
+ # Name of the existing registry example: myregistry.azurecr.io
+ ACR_NAME=myregistry
+ # Existing full domain of the ACR
+ REGISTRY=$ACR_NAME.azurecr.io
+ # Container name inside ACR where image will be stored
+ REPO=net-monitor
+ TAG=v1
+ # Source code directory containing Dockerfile to build
+ IMAGE_SOURCE=https://github.com/wabbit-networks/net-monitor.git#main
+ ```
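As a quick sanity check (a sketch, not a tutorial step), the variables above compose into the fully qualified image reference that the later build, sign, and verify steps operate on:

```shell
# Compose the fully qualified image reference from the variables above
ACR_NAME=myregistry
REGISTRY=$ACR_NAME.azurecr.io
REPO=net-monitor
TAG=v1

IMAGE=$REGISTRY/$REPO:$TAG
echo "$IMAGE"   # myregistry.azurecr.io/net-monitor:v1
```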
+
+## Sign in with Azure CLI
+
+```bash
+az login
+```
+
+To learn more about Azure CLI and how to sign in with it, see [Sign in with Azure CLI](/cli/azure/authenticate-azure-cli).
++
+## Create or import a certificate issued by a CA in AKV
+
+### Certificate requirements
+
+When creating certificates for signing and verification, the certificates must meet the [Notary Project certificate requirement](https://github.com/notaryproject/specifications/blob/v1.0.0/specs/signature-specification.md#certificate-requirements).
+
+Here are the requirements for root and intermediate certificates:
+- The `basicConstraints` extension must be present and marked as critical. The `CA` field must be set to `true`.
+- The `keyUsage` extension must be present and marked critical. The `keyCertSign` bit must be set.
+
+Here are the requirements for certificates issued by a CA:
+- X.509 certificate properties:
+ - Subject must contain common name (`CN`), country (`C`), state or province (`ST`), and organization (`O`). In this tutorial, `$CERT_SUBJECT` is used as the subject.
+ - X.509 key usage flag must be `DigitalSignature` only.
+ - Extended Key Usages (EKUs) must be empty or `1.3.6.1.5.5.7.3.3` (for Codesigning).
+- Key properties:
+ - The `exportable` property must be set to `false`.
+ - Select a supported key type and size from the [Notary Project specification](https://github.com/notaryproject/specifications/blob/v1.0.0/specs/signature-specification.md#algorithm-selection).
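Before importing a CA-issued certificate, you can sanity-check these properties locally with OpenSSL (a sketch; `cert.pem` is a hypothetical filename for your issued certificate, and the `-ext` option requires OpenSSL 1.1.1 or later):

```shell
# Print the subject, key usage, and EKU of a locally saved certificate.
# cert.pem is a hypothetical filename for the CA-issued signing certificate.
openssl x509 -in cert.pem -noout -subject -ext keyUsage,extendedKeyUsage
```

The key usage output should list only `Digital Signature`, and the Extended Key Usage section, if present, should list only `Code Signing`.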
+
+> [!NOTE]
+> This guide uses version 1.0.1 of the AKV plugin. Prior versions of the plugin had a limitation that required a specific certificate order in a certificate chain. Version 1.0.1 of the plugin doesn't have this limitation, so it's recommended that you use version 1.0.1 or later.
+
+### Create a certificate issued by a CA
+
+Create a certificate signing request (CSR) by following the instructions in [create certificate signing request](../key-vault/certificates/create-certificate-signing-request.md).
+
+> [!IMPORTANT]
+> When merging the CSR, make sure you merge the entire certificate chain returned by the CA vendor.
+
+### Import the certificate in AKV
+
+To import the certificate:
+
+1. Get the certificate file, with the entire certificate chain, from your CA vendor.
+2. Import the certificate into Azure Key Vault by following the instructions in [import a certificate](../key-vault/certificates/tutorial-import-certificate.md).
+
+> [!NOTE]
+> If the certificate does not contain a certificate chain after creation or importing, you can obtain the intermediate and root certificates from your CA vendor. You can ask your vendor to provide you with a PEM file that contains the intermediate certificates (if any) and root certificate. This file can then be used at step 5 of [signing container images](#sign-a-container-image-with-notation-cli-and-akv-plugin).
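If your CA vendor ships the intermediate and root certificates as separate PEM files, one way to assemble the bundle mentioned above is plain concatenation (a sketch; `intermediate.pem` and `root.pem` are hypothetical filenames):

```shell
# Concatenate the intermediate certificate(s) and the root certificate
# into one PEM bundle, intermediates first.
cat intermediate.pem root.pem > ca_bundle.pem

# Sanity check: count the certificates in the bundle.
grep -c "BEGIN CERTIFICATE" ca_bundle.pem
```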
+
+## Sign a container image with Notation CLI and AKV plugin
+
+1. Authenticate to your ACR by using your individual Azure identity.
+
+ ```bash
+ az acr login --name $ACR_NAME
+ ```
+
+> [!IMPORTANT]
+> If you have Docker installed on your system and used `az acr login` or `docker login` to authenticate to your ACR, your credentials are already stored and available to notation. In this case, you don't need to run `notation login` again to authenticate to your ACR. To learn more about authentication options for notation, see [Authenticate with OCI-compliant registries](https://notaryproject.dev/docs/user-guides/how-to/registry-authentication/).
+
+2. Build and push a new image with ACR Tasks. Always use the digest to identify the image for signing, since tags are mutable and can be overwritten.
+
+ ```bash
+ DIGEST=$(az acr build -r $ACR_NAME -t $REGISTRY/${REPO}:$TAG $IMAGE_SOURCE --no-logs --query "outputImages[0].digest" -o tsv)
+ IMAGE=$REGISTRY/${REPO}@$DIGEST
+ ```
+
+ In this tutorial, if the image has already been built and is stored in the registry, the tag serves as a convenient identifier for that image.
+
+ ```bash
+ IMAGE=$REGISTRY/${REPO}:$TAG
+ ```
+
+3. Assign an access policy in AKV using the Azure CLI.
+
+ To sign a container image with a certificate in AKV, a principal must have authorized access to AKV. The principal can be a user principal, service principal, or managed identity. In this tutorial, we assign an access policy to a signed-in user. To learn more about assigning policy to a principal, see [Assign Access Policy](/azure/key-vault/general/assign-access-policy).
+
+ To set the subscription that contains the AKV resources, run the following command:
+
+ ```bash
+ az account set --subscription <your_subscription_id>
+ ```
+
+ If the certificate contains the entire certificate chain, the principal must be granted key permission `Sign`, secret permission `Get`, and certificate permissions `Get`. To grant these permissions to the principal:
+
+ ```bash
+ USER_ID=$(az ad signed-in-user show --query id -o tsv)
+ az keyvault set-policy -n $AKV_NAME --key-permissions sign --secret-permissions get --certificate-permissions get --object-id $USER_ID
+ ```
+
+ If the certificate doesn't contain the chain, the principal must be granted key permission `Sign`, and certificate permissions `Get`. To grant these permissions to the principal:
+
+ ```bash
+ USER_ID=$(az ad signed-in-user show --query id -o tsv)
+ az keyvault set-policy -n $AKV_NAME --key-permissions sign --certificate-permissions get --object-id $USER_ID
+ ```
+
+4. Get the Key ID for a certificate. A certificate in AKV can have multiple versions; the following command gets the Key ID for the latest version of the `$CERT_NAME` certificate.
+
+ ```bash
+ KEY_ID=$(az keyvault certificate show -n $CERT_NAME --vault-name $AKV_NAME --query 'kid' -o tsv)
+ ```
+
+5. Sign the container image with the COSE signature format using the Key ID.
+
+ If the certificate contains the entire certificate chain, run the following command:
+
+ ```bash
+ notation sign --signature-format cose $IMAGE --id $KEY_ID --plugin azure-kv
+ ```
+
+ If the certificate doesn't contain the chain, use the `--plugin-config ca_certs=<ca_bundle_file>` parameter to pass the CA certificates in a PEM file to the AKV plugin:
+
+ ```bash
+ notation sign --signature-format cose $IMAGE --id $KEY_ID --plugin azure-kv --plugin-config ca_certs=<ca_bundle_file>
+ ```
+
+6. View the graph of signed images and associated signatures.
+
+ ```bash
+ notation ls $IMAGE
+ ```
+
+ In the following example output, a signature of type `application/vnd.cncf.notary.signature` identified by digest `sha256:d7258166ca820f5ab7190247663464f2dcb149df4d1b6c4943dcaac59157de8e` is associated with the `$IMAGE`.
+
+ ```
+ myregistry.azurecr.io/net-monitor@sha256:17cc5dd7dfb8739e19e33e43680e43071f07497ed716814f3ac80bd4aac1b58f
+ └── application/vnd.cncf.notary.signature
+     └── sha256:d7258166ca820f5ab7190247663464f2dcb149df4d1b6c4943dcaac59157de8e
+ ```
+
+## Verify a container image with Notation CLI
+
+1. Add the root certificate to a named trust store for signature verification. If you do not have the root certificate, you can obtain it from your CA. The following example adds the root certificate `$ROOT_CERT` to the `$STORE_NAME` trust store.
+
+ ```bash
+ STORE_TYPE="ca"
+ STORE_NAME="wabbit-networks.io"
+ notation cert add --type $STORE_TYPE --store $STORE_NAME $ROOT_CERT
+ ```
+
+2. List the root certificate to confirm the `$ROOT_CERT` is added successfully.
+
+ ```bash
+ notation cert ls
+ ```
+
+3. Configure trust policy before verification.
+
+ Trust policies allow users to specify fine-tuned verification policies. Use the following command to configure the trust policy.
+
+ ```bash
+ cat <<EOF > ./trustpolicy.json
+ {
+ "version": "1.0",
+ "trustPolicies": [
+ {
+ "name": "wabbit-networks-images",
+ "registryScopes": [ "$REGISTRY/$REPO" ],
+ "signatureVerification": {
+ "level" : "strict"
+ },
+ "trustStores": [ "$STORE_TYPE:$STORE_NAME" ],
+ "trustedIdentities": [
+ "x509.subject: $CERT_SUBJECT"
+ ]
+ }
+ ]
+ }
+ EOF
+ ```
+
+ The above `trustpolicy.json` file defines one trust policy named `wabbit-networks-images`. This trust policy applies to all the artifacts stored in the `$REGISTRY/$REPO` repositories. The named trust store `$STORE_NAME` of type `$STORE_TYPE` contains the root certificates. It also assumes that the user trusts a specific identity with the X.509 subject `$CERT_SUBJECT`. For more details, see [Trust store and trust policy specification](https://github.com/notaryproject/notaryproject/blob/v1.0.0/specs/trust-store-trust-policy.md).
+
+4. Use `notation policy` to import the trust policy configuration from `trustpolicy.json`.
+
+ ```bash
+ notation policy import ./trustpolicy.json
+ ```
+
+5. Show the trust policy configuration to confirm its successful import.
+
+ ```bash
+ notation policy show
+ ```
+
+6. Use `notation verify` to verify the integrity of the image:
+
+ ```bash
+ notation verify $IMAGE
+ ```
+
+ Upon successful verification of the image using the trust policy, the sha256 digest of the verified image is returned in a success message. Example output:
+
+ `Successfully verified signature for myregistry.azurecr.io/net-monitor@sha256:17cc5dd7dfb8739e19e33e43680e43071f07497ed716814f3ac80bd4aac1b58f`
+
+## FAQ
+
+- What should I do if the certificate is expired?
+
+ If the certificate has expired, it invalidates the signature. To resolve this issue, you should renew the certificate and sign container images again. Learn more about [Renew your Azure Key Vault certificates](../key-vault/certificates/overview-renew-certificate.md).
+
+- What should I do if the root certificate is expired?
+
+ If the root certificate has expired, it invalidates the signature. To resolve this issue, you should obtain a new certificate from a trusted CA vendor and sign container images again. Replace the expired root certificate with the new one from the CA vendor.
+
+- What should I do if the certificate is revoked?
+
+ If the certificate is revoked, it invalidates the signature. The most common reason for revoking a certificate is when the certificate's private key has been compromised. To resolve this issue, you should obtain a new certificate from a trusted CA vendor and sign container images again.
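For the expiry-related questions above, a quick way to check a locally saved certificate's validity period is OpenSSL (a sketch; `cert.pem` is a hypothetical filename for your signing or root certificate):

```shell
# Print the expiry date; -checkend 0 makes the command exit non-zero
# if the certificate has already expired.
openssl x509 -in cert.pem -noout -enddate -checkend 0
```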
+
+## Next steps
+
+See [Use Image Integrity to validate signed images before deploying them to your Azure Kubernetes Service (AKS) clusters (Preview)](/azure/aks/image-integrity?tabs=azure-cli) and [Ratify on Azure](https://ratify.dev/docs/1.0/quickstarts/ratify-on-azure/) to get started verifying and auditing signed images before deploying them on AKS.
+
+[terms-of-use]: https://azure.microsoft.com/support/legal/preview-supplemental-terms/
copilot Analyze Cost Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/analyze-cost-management.md
Title: Analyze, estimate and optimize cloud costs using Microsoft Copilot for Az
description: Learn about scenarios where Microsoft Copilot for Azure (preview) can use Microsoft Cost Management to help you manage your costs. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Author Api Management Policies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/author-api-management-policies.md
Title: Author API Management policies using Microsoft Copilot for Azure (preview
description: Learn about how Microsoft Copilot for Azure (preview) can generate Azure API Management policies based on your requirements. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Build Infrastructure Deploy Workloads https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/build-infrastructure-deploy-workloads.md
Title: Build infrastructure and deploy workloads using Microsoft Copilot for Azu
description: Learn how Microsoft Copilot for Azure (preview) can help you build custom infrastructure for your workloads and provide templates and scripts to help you deploy. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Capabilities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/capabilities.md
Title: Microsoft Copilot for Azure (preview) capabilities
description: Learn about the things you can do with Microsoft Copilot for Azure (preview). Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Deploy Vms Effectively https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/deploy-vms-effectively.md
Title: Deploy virtual machines effectively using Microsoft Copilot for Azure (pr
description: Learn how Microsoft Copilot for Azure (preview) can help you deploy cost-efficient VMs. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Generate Cli Scripts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/generate-cli-scripts.md
Title: Generate Azure CLI scripts using Microsoft Copilot for Azure (preview)
description: Learn about scenarios where Microsoft Copilot for Azure (preview) can generate Azure CLI scripts for you to customize and use. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Generate Kubernetes Yaml https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/generate-kubernetes-yaml.md
Title: Generate Kubernetes YAML files using Microsoft Copilot for Azure (preview
description: Learn how Microsoft Copilot for Azure (preview) can generate Kubernetes YAML files for you to customize and use. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Get Information Resource Graph https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/get-information-resource-graph.md
Title: Get resource information using Microsoft Copilot for Azure (preview)
description: Learn about scenarios where Microsoft Copilot for Azure (preview) can help with Azure Resource Graph. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Get Monitoring Information https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/get-monitoring-information.md
Title: Get information about Azure Monitor logs using Microsoft Copilot for Azur
description: Learn about scenarios where Microsoft Copilot for Azure (preview) can provide information about Azure Monitor metrics and logs. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Improve Storage Accounts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/improve-storage-accounts.md
Title: Improve security and resiliency of storage accounts using Microsoft Copil
description: Learn how Microsoft Copilot for Azure (preview) can improve the security posture and data resiliency of storage accounts. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Limited Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/limited-access.md
Title: Limited access to Microsoft Copilot for Azure (preview)
description: This article describes the limited access policy for Microsoft Copilot for Azure (preview). Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Optimize Code Application Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/optimize-code-application-insights.md
Title: Discover performance recommendations with Code Optimizations using Micros
description: Learn about scenarios where Microsoft Copilot for Azure (preview) can use Application Insight Code Optimizations to help optimize your apps. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/overview.md
Title: Microsoft Copilot for Azure (preview) overview
description: Learn about Microsoft Copilot for Azure (preview). Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Responsible Ai Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/responsible-ai-faq.md
Title: Responsible AI FAQ for Microsoft Copilot for Azure (preview)
description: Learn how Microsoft Copilot for Azure (preview) uses data and what to expect. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Understand Service Health https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/understand-service-health.md
Title: Understand service health events and status using Microsoft Copilot for A
description: Learn about scenarios where Microsoft Copilot for Azure (preview) can provide information about service health events. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
copilot Work Smarter Edge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/copilot/work-smarter-edge.md
Title: Work smarter with your Azure Stack HCI clusters using Microsoft Copilot f
description: Learn about scenarios where Microsoft Copilot for Azure (preview) can help you work with your Azure Stack HCI clusters. Last updated 11/15/2023 -+ - ignite-2023 - ignite-2023-copilotinAzure
cosmos-db Migration Options https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/vcore/migration-options.md
Title: Options to migrate data from MongoDB description: Review various options to migrate your data from other MongoDB sources to Azure Cosmos DB for MongoDB vCore.--++ Previously updated : 09/12/2023 Last updated : 11/17/2023
+# CustomerIntent: As a MongoDB user, I want to understand the various options available to migrate my data to Azure Cosmos DB for MongoDB vCore, so that I can make an informed decision about which option is best for my use case.
-# Options to migrate data from MongoDB to Azure Cosmos DB for MongoDB vCore
+# What are the options to migrate data from MongoDB to Azure Cosmos DB for MongoDB vCore?
This document describes the various options to lift and shift your MongoDB workloads to the Azure Cosmos DB for MongoDB vCore offering.
-## Premigration assessment
+## Azure Data Studio (Offline)
-Assessment involves finding out whether you're using the [features and syntax that are supported](./compatibility.md). The aim of this stage is to create a list of incompatibilities and warnings, if any. After you have the assessment results, you can try to address the findings during rest of the migration planning.
+The [MongoDB migration extension for Azure Data Studio](/azure-data-studio/extensions/database-migration-for-mongo-extension) is the preferred tool for migrating your MongoDB workloads to the API for MongoDB vCore.
-The [Azure Cosmos DB Migration for MongoDB extension](/azure-data-studio/extensions/database-migration-for-mongo-extension) in Azure Data Studio helps you assess a MongoDB workload for migrating to Azure Cosmos DB for MongoDB. You can use this extension to run an end-to-end assessment on your workload and find out the actions that you may need to take to seamlessly migrate your workloads on Azure Cosmos DB. During the assessment of a MongoDB endpoint, the extension reports all the discovered resources.
+The migration process has two phases:
+
+- **Premigration assessment** - An evaluation of your current MongoDB data estate to determine if there are any incompatibilities.
+- **Migration** - The migration operation using services managed by Azure.
+
+### Premigration assessment
+
+Assessment involves finding out whether you're using the [features and syntax that are supported](./compatibility.md). The purpose of this stage is to identify any incompatibilities or warnings that exist in the current MongoDB solution. You should resolve the issues found in the assessment results before moving on with the migration process.
> [!TIP]
-> We recommend you to go through [the supported features and syntax](./compatibility.md) in detail, as well as perform a proof-of-concept prior to the actual migration.
+> We recommend you review the [supported features and syntax](./compatibility.md) in detail and perform a proof-of-concept prior to the actual migration.
+
+### Migration
+
+Use the graphical user interface to manage the entire migration process from start to finish. The migration is launched in Azure Data Studio but runs in the cloud on Azure-managed resources.
## Native MongoDB tools (Offline)
You can use the native MongoDB tools such as *mongodump/mongorestore*, *mongoexp
| Move whole database (BSON-based) | *mongodump/mongorestore* | - *mongoexport/mongoimport* is the best pair of migration tools for migrating a subset of your MongoDB database.
- - *mongoexport* exports your existing data to a human-readable JSON or CSV file. *mongoexport* takes an argument specifying the subset of your existing data to export.
- - *mongoimport* opens a JSON or CSV file and inserts the content into the target database instance (Azure Cosmos DB for MongoDB vCore in this case.).
- - JSON and CSV aren't a compact format; you may incur excess network charges as *mongoimport* sends data to Azure Cosmos DB for MongoDB vCore.
+ - *mongoexport* exports your existing data to a human-readable JSON or CSV file. *mongoexport* takes an argument specifying the subset of your existing data to export.
+ - *mongoimport* opens a JSON or CSV file and inserts the content into the target database instance (Azure Cosmos DB for MongoDB vCore in this case.).
+ - JSON and CSV aren't compact formats; you could incur excess network charges as *mongoimport* sends data to Azure Cosmos DB for MongoDB vCore.
- *mongodump/mongorestore* is the best pair of migration tools for migrating your entire MongoDB database. The compact BSON format makes more efficient use of network resources as the data is inserted into Azure Cosmos DB for MongoDB vCore.
  - *mongodump* exports your existing data as a BSON file.
  - *mongorestore* imports your BSON file dump into Azure Cosmos DB for MongoDB vCore.
You can use the native MongoDB tools such as *mongodump/mongorestore*, *mongoexp
## Data migration using Azure Databricks (Offline/Online) -- This method offers full control of the migration rate and data transformation. It can also support large datasets that are in TBs in size.
+Migrating using Azure Databricks offers full control of the migration rate and data transformation. This method can also support large datasets that are terabytes in size.
+ - [Azure Databricks](https://azure.microsoft.com/services/databricks/) is a platform as a service (PaaS) offering for [Apache Spark](https://spark.apache.org/). You can use Azure Databricks to do an offline/online migration of databases from MongoDB to Azure Cosmos DB for MongoDB. - Here's how you can [migrate data to Azure Cosmos DB for MongoDB vCore offline using Azure Databricks](../migrate-databricks.md#provision-an-azure-databricks-cluster)
-## Next steps
+## Related content
- Migrate data to Azure Cosmos DB for MongoDB vCore [using native MongoDB tools](how-to-migrate-native-tools.md). - Migrate data to Azure Cosmos DB for MongoDB vCore [using Azure Databricks](../migrate-databricks.md).
cosmos-db Priority Based Execution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/priority-based-execution.md
This feature enables users to execute critical tasks while delaying less importa
> [!NOTE] > The priority-based execution feature doesn't guarantee that low-priority requests are always throttled in favor of high-priority ones. It operates on a best-effort basis, and there are no SLAs linked to the performance of the feature.
+## Use-cases
+
+You can use priority-based execution when your application has different priorities for workloads running on the same container. For example,
+
+- Prioritizing read, write, or query operations.
+- Prioritizing user actions vs. background operations, such as:
+ - Stored procedures
+ - Data ingestion/migration
 ## Getting started To get started using priority-based execution, navigate to the **Features** page in your Azure Cosmos DB account. Select and enable the **Priority-based execution (preview)** feature.
container.createItem(family, new PartitionKey(family.getLastName()), requestOpti
}).subscribe(); ```+ ## Monitoring Priority-based execution
cost-management-billing Automate Budget Creation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/automate-budget-creation.md
description: This article helps you create budgets with the Budget API and a budget template. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Automate budget creation
cost-management-billing Automation Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/automation-faq.md
Title: Microsoft Cost Management automation FAQ
description: This FAQ is a list of frequently asked questions and answers about Cost Management automation. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Cost Management automation FAQ
cost-management-billing Automation Ingest Usage Details Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/automation-ingest-usage-details-overview.md
description: This article explains how to use cost details records to correlate meter-based charges with the specific resources responsible for the charges so that you can properly reconcile your bill. Previously updated : 05/17/2023 Last updated : 11/17/2023 -+ # Ingest cost details data
cost-management-billing Automation Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/automation-overview.md
description: This article covers common scenarios for Cost Management automation and options available based on your situation. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Cost Management automation overview
cost-management-billing Cost Management Api Permissions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/cost-management-api-permissions.md
description: This article describes what you need to know to successfully assign permissions to an Azure service principal. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Assign permissions to Cost Management APIs
cost-management-billing Get Small Usage Datasets On Demand https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/get-small-usage-datasets-on-demand.md
description: The article explains how you can use the Cost Details API to get raw, unaggregated cost data that corresponds to your Azure bill. Previously updated : 05/10/2023 Last updated : 11/17/2023 -+ # Get small cost datasets on demand
cost-management-billing Get Usage Data Azure Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/get-usage-data-azure-cli.md
description: This article explains how you get usage data with the Azure CLI. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Get usage data with the Azure CLI
cost-management-billing Get Usage Details Legacy Customer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/get-usage-details-legacy-customer.md
description: This article explains how you get cost data if you have a MOSP pay-as-you-go subscription. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Get cost details for a pay-as-you-go subscription
cost-management-billing Migrate Consumption Marketplaces Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-consumption-marketplaces-api.md
description: This article has information to help you migrate from the Consumption Marketplaces API. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Migrate from Consumption Marketplaces API
cost-management-billing Migrate Consumption Usage Details Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-consumption-usage-details-api.md
description: This article has information to help you migrate from the Consumption Usage Details API. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Migrate from Consumption Usage Details API
cost-management-billing Migrate Ea Balance Summary Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-ea-balance-summary-api.md
description: This article has information to help you migrate from the EA Balance Summary API. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Migrate from EA Balance Summary API
cost-management-billing Migrate Ea Price Sheet Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-ea-price-sheet-api.md
description: This article has information to help you migrate from the EA Price Sheet API. Previously updated : 04/05/2023 Last updated : 11/17/2023 -+ # Migrate from EA Price Sheet API
cost-management-billing Migrate Ea Reporting Arm Apis Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-ea-reporting-arm-apis-overview.md
description: This article provides an overview about migrating from Azure Enterprise Reporting to Microsoft Cost Management APIs. Previously updated : 09/15/2023 Last updated : 11/17/2023
cost-management-billing Migrate Ea Reserved Instance Charges Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-ea-reserved-instance-charges-api.md
description: This article has information to help you migrate from the EA Reserved Instance Charges API. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Migrate from EA Reserved Instance Charges API
cost-management-billing Migrate Ea Reserved Instance Recommendations Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-ea-reserved-instance-recommendations-api.md
description: This article has information to help you migrate from the EA Reserved Instance Recommendations API. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Migrate from EA Reserved Instance Recommendations API
cost-management-billing Migrate Ea Reserved Instance Usage Details Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-ea-reserved-instance-usage-details-api.md
description: This article has information to help you migrate from the EA Reserved Instance Usage Details API. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Migrate from EA Reserved Instance Usage Details API
cost-management-billing Migrate Ea Reserved Instance Usage Summary Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-ea-reserved-instance-usage-summary-api.md
description: This article has information to help you migrate from the EA Reserved Instance Usage Summary API. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Migrate from EA Reserved Instance Usage Summary API
cost-management-billing Migrate Ea Usage Details Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/migrate-ea-usage-details-api.md
description: This article has information to help you migrate from the EA Usage Details APIs. Previously updated : 07/18/2022 Last updated : 11/17/2023 -+ # Migrate from EA Usage Details APIs
cost-management-billing Partner Automation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/partner-automation.md
description: This article explains how Microsoft partners and their customers can use Cost Management APIs for common tasks. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Automation for partners
cost-management-billing Tutorial Seed Historical Cost Dataset Exports Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/tutorial-seed-historical-cost-dataset-exports-api.md
Title: Tutorial - Seed a historical cost dataset with the Exports API
description: This tutorial helps your seed a historical cost dataset to visualize cost trends over time. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Tutorial: Seed a historical cost dataset with the Exports API
cost-management-billing Understand Usage Details Fields https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/understand-usage-details-fields.md
description: This article describes the fields in the usage data files. Previously updated : 10/11/2023 Last updated : 11/17/2023
cost-management-billing Usage Details Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/automate/usage-details-best-practices.md
description: This article describes best practices recommended by Microsoft when you work with data in cost details files. Previously updated : 07/15/2022 Last updated : 11/17/2023 -+ # Choose a cost details solution
cost-management-billing Billing Understand Dedicated Hosts Reservation Charges https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/billing-understand-dedicated-hosts-reservation-charges.md
Title: Understand Azure Dedicated Hosts Reserved Instances discount description: Learn how Azure Reserved VM Instance discount is applied to Azure Dedicated Hosts. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Buy Vm Software Reservation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/buy-vm-software-reservation.md
Title: Prepay for Virtual machine software reservations description: Learn how to prepay for Azure virtual machine software reservations to save money. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Calculate Ea Reservations Savings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/calculate-ea-reservations-savings.md
Title: Calculate EA reservations cost savings
description: Learn how Enterprise Agreement users manually calculate their reservations savings. -+ Previously updated : 03/24/2023 Last updated : 11/17/2023
cost-management-billing Charge Back Usage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/charge-back-usage.md
Title: Charge back Azure Reservation costs
-description: Learn how to view Azure Reservation costs for chargeback.
+description: Learn how to view Azure Reservation costs for chargeback
-+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Determine Reservation Purchase https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/determine-reservation-purchase.md
Title: Determine what Azure reservation you should purchase description: This article helps you determine which reservation you should purchase. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Discount Sql Edge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/discount-sql-edge.md
Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Exchange And Refund Azure Reservations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/exchange-and-refund-azure-reservations.md
Title: Self-service exchanges and refunds for Azure Reservations description: Learn how you can exchange or refund Azure Reservations. You must have owner access to the Reservation Order to exchange or refund reservations. -+
cost-management-billing Fabric Capacity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/fabric-capacity.md
Previously updated : 11/02/2023 Last updated : 11/17/2023
cost-management-billing Find Reservation Purchaser From Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/find-reservation-purchaser-from-logs.md
Title: Find a reservation purchaser from Azure Monitor logs description: This article helps find a reservation purchaser with information from Azure Monitor logs. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing How To View Csp Reservations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/how-to-view-csp-reservations.md
description: Learn how you can view Azure Reservations as a Cloud Solution Provi
-+ Last updated 06/03/2022
cost-management-billing Limited Time Central Poland https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/limited-time-central-poland.md
Previously updated : 10/24/2023 Last updated : 11/17/2023
cost-management-billing Limited Time Central Sweden https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/limited-time-central-sweden.md
Previously updated : 08/28/2023 Last updated : 11/17/2023
cost-management-billing Limited Time Us West https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/limited-time-us-west.md
Previously updated : 08/28/2023 Last updated : 11/17/2023
cost-management-billing Manage Reserved Vm Instance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/manage-reserved-vm-instance.md
description: Learn how to manage Azure Reservations. See steps to change the res
-+ Previously updated : 12/06/2022 Last updated : 11/17/2023 # Manage Reservations for Azure resources
cost-management-billing Poland Limited Time Sql Services Reservations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/poland-limited-time-sql-services-reservations.md
Previously updated : 10/27/2023 Last updated : 11/17/2023
cost-management-billing Prepare Buy Reservation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/prepare-buy-reservation.md
Title: Buy an Azure reservation description: Learn about important points to help you buy an Azure reservation. -+
cost-management-billing Prepay App Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/prepay-app-service.md
Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Prepay Databricks Reserved Capacity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/prepay-databricks-reserved-capacity.md
Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Prepay Hana Large Instances Reserved Capacity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/prepay-hana-large-instances-reserved-capacity.md
Title: Save on SAP HANA Large Instances with an Azure reservation description: Understand the things you need to know before you buy a HANA Large Instance reservation and how to make the purchase. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Prepay Jboss Eap Integrated Support App Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/prepay-jboss-eap-integrated-support-app-service.md
Title: Save on JBoss EAP Integrated Support on Azure App Service with reservations description: Learn how you can save on your JBoss EAP Integrated Support fee on Azure App Service. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Prepay Sql Data Warehouse Charges https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/prepay-sql-data-warehouse-charges.md
Previously updated : 10/23/2023 Last updated : 11/17/2023
cost-management-billing Prepay Sql Edge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/prepay-sql-edge.md
Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Reservation Apis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reservation-apis.md
Title: APIs for Azure reservation automation description: Learn about the Azure APIs that you can use to programmatically get reservation information. -+ tags: billing Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Reservation Discount App Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reservation-discount-app-service.md
Previously updated : 06/01/2023 Last updated : 11/17/2023
cost-management-billing Reservation Discount Application https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reservation-discount-application.md
Title: How an Azure reservation discount is applied description: This article helps you understand how reserved instance discounts are generally applied. -+
cost-management-billing Reservation Discount Azure Sql Dw https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reservation-discount-azure-sql-dw.md
Title: How reservation discounts apply to Azure Synapse Analytics (data warehousing only) description: Learn how reservation discounts apply to Azure Synapse Analytics to help you save money. -+
cost-management-billing Reservation Discount Databricks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reservation-discount-databricks.md
Title: How an Azure Databricks prepurchase discount is applied description: Learn how an Azure Databricks prepurchase discount applies to your usage. You can use these Databricks at any time during the purchase term. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Reservation Exchange Policy Changes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reservation-exchange-policy-changes.md
Previously updated : 10/16/2023 Last updated : 11/17/2023 # Changes to the Azure reservation exchange policy
cost-management-billing Reservation Renew https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reservation-renew.md
Title: Automatically renew Azure reservations description: Learn how you can automatically renew Azure reservations to continue getting reservation discounts. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Reservation Utilization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reservation-utilization.md
Title: View Azure reservation utilization description: Learn how to get reservation utilization and details. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Reserved Instance Purchase Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reserved-instance-purchase-recommendations.md
Title: Azure reservation recommendations
description: Learn about Azure reservation recommendations. -+
cost-management-billing Reserved Instance Windows Software Costs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/reserved-instance-windows-software-costs.md
Title: Reservations software costs for Azure description: Learn which software meters are not included in Azure Reserved VM Instance costs. -+ tags: billing Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Save Compute Costs Reservations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/save-compute-costs-reservations.md
Previously updated : 09/12/2023 Last updated : 11/17/2023
cost-management-billing Synapse Analytics Pre Purchase Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/synapse-analytics-pre-purchase-plan.md
Title: Optimize Azure Synapse Analytics costs with a Pre-Purchase Plan description: Learn how you can save on your Azure Synapse Analytics costs when you prepurchase Azure Synapse commit units (SCU) for one year. -+
cost-management-billing Troubleshoot Download Usage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/troubleshoot-download-usage.md
-+ Previously updated : 12/06/2022 Last updated : 11/17/2023 # Troubleshoot Azure reservation download usage details
cost-management-billing Troubleshoot No Eligible Subscriptions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/troubleshoot-no-eligible-subscriptions.md
-+ Last updated 07/20/2023
cost-management-billing Troubleshoot Product Not Available https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/troubleshoot-product-not-available.md
-+ Previously updated : 12/06/2022 Last updated : 11/17/2023 # Troubleshoot reservation type not available
cost-management-billing Troubleshoot Reservation Recommendation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/troubleshoot-reservation-recommendation.md
-+ Last updated 01/06/2023
cost-management-billing Troubleshoot Reservation Transfers Between Tenants https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/troubleshoot-reservation-transfers-between-tenants.md
Previously updated : 04/12/2023 Last updated : 11/17/2023 # Change an Azure reservation directory between tenants
cost-management-billing Troubleshoot Reservation Utilization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/troubleshoot-reservation-utilization.md
-+ Previously updated : 12/06/2022 Last updated : 11/17/2023 # Troubleshoot reservation utilization
cost-management-billing Understand Azure Cache For Redis Reservation Charges https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-azure-cache-for-redis-reservation-charges.md
Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Understand Cosmosdb Reservation Charges https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-cosmosdb-reservation-charges.md
Previously updated : 12/06/2022 Last updated : 11/17/2023 # Understand how the reservation discount is applied to Azure Cosmos DB
cost-management-billing Understand Disk Reservations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-disk-reservations.md
Previously updated : 12/06/2022 Last updated : 11/17/2023 # Understand how your reservation discount is applied to Azure disk storage
cost-management-billing Understand Reservation Charges Mariadb https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-reservation-charges-mariadb.md
Previously updated : 12/06/2022 Last updated : 11/17/2023 # How a reservation discount is applied to Azure Database for MariaDB
cost-management-billing Understand Reservation Charges Mysql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-reservation-charges-mysql.md
Previously updated : 12/06/2022 Last updated : 11/17/2023 # How a reservation discount is applied to Azure Database for MySQL
cost-management-billing Understand Reservation Charges https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-reservation-charges.md
Title: Understand reservations discount for Azure SQL Database | Microsoft Docs description: Learn how a reservation discount is applied to running Azure SQL databases. The discount is applied to these databases on an hourly basis. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Understand Reserved Instance Usage Ea https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-reserved-instance-usage-ea.md
Title: Understand Azure reservations usage for Enterprise Agreement and Microsoft Customer Agreement description: Learn how to read your usage information to understand how an Azure reservation applies to Enterprise Agreement and Microsoft Customer Agreement usage. -+ tags: billing Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Understand Reserved Instance Usage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-reserved-instance-usage.md
Title: Azure reservation usage for an individual subscription description: Learn how to read your usage to understand how the Azure reservation for your individual subscription with pay-as-you-go rates is applied. -+ tags: billing Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Understand Rhel Reservation Charges https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-rhel-reservation-charges.md
Title: Red Hat reservation plan discounts - Azure description: Learn how Red Hat plan discounts are applied to Red Hat software on virtual machines. -+
cost-management-billing Understand Storage Charges https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-storage-charges.md
Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing Understand Suse Reservation Charges https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-suse-reservation-charges.md
Previously updated : 10/25/2023 Last updated : 11/17/2023
cost-management-billing Understand Vm Software Reservation Discount https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/understand-vm-software-reservation-discount.md
Title: Understand how the Azure virtual machine software reservation discount is applied description: Learn how Azure virtual machine software reservation discount is applied before you buy. -+ Previously updated : 12/06/2022 Last updated : 11/17/2023
cost-management-billing View Amortized Costs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/view-amortized-costs.md
Previously updated : 08/07/2023 Last updated : 11/17/2023
cost-management-billing View Purchase Refunds https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/view-purchase-refunds.md
Title: View Azure Reservation purchase and refund transactions description: Learn how to view Azure Reservation purchase and refund transactions.
-ms.reviwer: nitinarora
+ms.reviwer: primittal
cost-management-billing View Reservations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/reservations/view-reservations.md
Title: Permissions to view and manage Azure reservations description: Learn how to view and manage Azure reservations in the Azure portal. -+
cost-management-billing Buy Savings Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/buy-savings-plan.md
Previously updated : 09/07/2023 Last updated : 11/17/2023
cost-management-billing Calculate Ea Savings Plan Savings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/calculate-ea-savings-plan-savings.md
Previously updated : 10/25/2023 Last updated : 11/17/2023
cost-management-billing Cancel Savings Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/cancel-savings-plan.md
Previously updated : 10/12/2022 Last updated : 11/17/2023 # Azure saving plan cancellation policies
cost-management-billing Charge Back Costs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/charge-back-costs.md
Previously updated : 06/14/2023 Last updated : 11/17/2023
cost-management-billing Choose Commitment Amount https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/choose-commitment-amount.md
Previously updated : 08/01/2023 Last updated : 11/17/2023
cost-management-billing Decide Between Savings Plan Reservation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/decide-between-savings-plan-reservation.md
Previously updated : 10/28/2022 Last updated : 11/17/2023 # Decide between a savings plan and a reservation
cost-management-billing Discount Application https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/discount-application.md
Previously updated : 10/20/2022 Last updated : 11/17/2023 # How saving plan discount is applied
cost-management-billing Download Savings Plan Price Sheet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/download-savings-plan-price-sheet.md
Previously updated : 03/15/2023 Last updated : 11/17/2023
cost-management-billing Manage Savings Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/manage-savings-plan.md
Previously updated : 01/30/2023 Last updated : 11/17/2023
cost-management-billing Permission View Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/permission-view-manage.md
Previously updated : 06/16/2023 Last updated : 11/17/2023
cost-management-billing Purchase Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/purchase-recommendations.md
Previously updated : 04/18/2023 Last updated : 11/17/2023 # Azure savings plan recommendations
cost-management-billing Renew Savings Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/renew-savings-plan.md
Previously updated : 10/12/2022 Last updated : 11/17/2023
cost-management-billing Reservation Trade In https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/reservation-trade-in.md
Previously updated : 10/16/2023 Last updated : 11/17/2023
cost-management-billing Savings Plan Compute Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/savings-plan-compute-overview.md
Previously updated : 10/25/2023 Last updated : 11/17/2023
cost-management-billing Scope Savings Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/scope-savings-plan.md
Previously updated : 08/28/2023 Last updated : 11/17/2023 # Savings plan scopes
cost-management-billing Software Costs Not Included https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/software-costs-not-included.md
Previously updated : 10/12/2022 Last updated : 11/17/2023 # Software costs not included in saving plans
cost-management-billing Troubleshoot Savings Plan Utilization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/troubleshoot-savings-plan-utilization.md
Previously updated : 10/14/2022 Last updated : 11/17/2023
cost-management-billing Utilization Cost Reports https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/utilization-cost-reports.md
Previously updated : 06/14/2023 Last updated : 11/17/2023
cost-management-billing View Transactions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/view-transactions.md
Previously updated : 10/12/2022 Last updated : 11/17/2023
cost-management-billing View Utilization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/savings-plan/view-utilization.md
Previously updated : 11/08/2022 Last updated : 11/17/2023
devtest Quickstart Create Enterprise Devtest Subscriptions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest/offer/quickstart-create-enterprise-devtest-subscriptions.md
After you've chosen the account to create an enterprise Azure dev/test subscript
1. You must have access and permissions associated with your identity. 1. You must designate the Account as a dev/test account within the enrollment portal.
-### Next steps
+## Related content
- [Azure EA portal administration](../../cost-management-billing/manage/ea-portal-administration.md) - [Get started with the Azure Enterprise portal](../../cost-management-billing/manage/ea-portal-get-started.md)
dms Known Issues Azure Sql Migration Azure Data Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/known-issues-azure-sql-migration-azure-data-studio.md
WHERE STEP in (3,4,6);
- **Recommendation**: Check if the selected tables exist in the target Azure SQL Database. If this migration is called from a PowerShell script, check if the table list parameter includes the correct table names and is passed into the migration.
+## Error code: 2060 - SqlSchemaCopyFailed
+
+- **Message**:` The SELECT permission was denied on the object 'sql_logins', database 'master', schema 'sys'.`
+
+- **Cause**: The account customers use to connect to Azure SQL Database lacks permission to access the sys.sql_logins table.
+
+- **Recommendation**: There are two ways to mitigate the issue:
+1. Add the 'sysadmin' role to the account, which grants admin permission.
+2. If you can't use an admin account or grant admin permission to the account, create a user in the master database and grant it the dbmanager and loginmanager roles. For example:
+```sql
+-- Run this script in the master database
+CREATE USER testuser FROM LOGIN testlogin;
+EXEC sp_addrolemember 'dbmanager', 'testuser';
+EXEC sp_addrolemember 'loginmanager', 'testuser';
+```
+
+- **Message**:` Failed to get service token from ADF service.`
+
+- **Cause**: The customer's self-hosted integration runtime (SHIR) fails to connect to Data Factory.
+
+- **Recommendation**: See this doc for guidance on resolving it: [Integration runtime Unable to connect to Data Factory](https://learn.microsoft.com/answers/questions/139976/integration-runtime-unable-to-connect-to-data-fact)
+
+- **Message**:` IR Nodes are offline.`
+
+- **Cause**: The network might have been interrupted during migration, causing the IR node to go offline.
+
+- **Recommendation**: Make sure that the machine where SHIR is installed is on.
+
+- **Message**:` Deployed failure: {0}. Object element: {1}.`
+
+- **Cause**: This is the most common error you might encounter. It means that the object can't be deployed to the target because it's unsupported there.
+
+- **Recommendation**: Check the assessment results ([Assessment rules](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql)). The following assessment issues might cause the schema migration to fail:
+
+[BULK INSERT](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#BulkInsert)
+
+[COMPUTE clause](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#ComputeClause)
+
+[Cryptographic provider](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#CryptographicProvider)
+
+[Cross database references](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#CrossDatabaseReferences)
+
+[Database principal alias](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#DatabasePrincipalAlias)
+
+[DISABLE_DEF_CNST_CHK option](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#DisableDefCNSTCHK)
+
+[FASTFIRSTROW hint](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#FastFirstRowHint)
+
+[FILESTREAM](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#FileStream)
+
+[MS DTC](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#MSDTCTransactSQL)
+
+[OPENROWSET (bulk)](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#OpenRowsetWithNonBlobDataSourceBulk)
+
+[OPENROWSET (provider)](https://learn.microsoft.com/azure/azure-sql/migration-guides/database/sql-server-to-sql-database-assessment-rules?view=azuresql#OpenRowsetWithSQLAndNonSQLProvider)
+
+Note: To view error details, open Microsoft Integration Runtime Configuration Manager > Diagnostics > Logging > View logs.
+This opens Event Viewer > Application and Services Logs > Connectors - Integration Runtime, where you can filter for errors.
+ ## Error code: Ext_RestoreSettingsError - **Message**: Unable to read blobs in storage container, exception: The remote server returned an error: (403) Forbidden.; The remote server returned an error: (403) Forbidden
dns Dns Private Resolver Get Started Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dns/dns-private-resolver-get-started-powershell.md
$targetDNS2 = New-AzDnsResolverTargetDnsServerObject -IPAddress 192.168.1.3 -Por
$targetDNS3 = New-AzDnsResolverTargetDnsServerObject -IPAddress 10.0.0.4 -Port 53 $targetDNS4 = New-AzDnsResolverTargetDnsServerObject -IPAddress 10.5.5.5 -Port 53 $forwardingrule = New-AzDnsForwardingRulesetForwardingRule -ResourceGroupName myresourcegroup -DnsForwardingRulesetName myruleset -Name "Internal" -DomainName "internal.contoso.com." -ForwardingRuleState "Enabled" -TargetDnsServer @($targetDNS1,$targetDNS2)
-$forwardingrule = New-AzDnsForwardingRulesetForwardingRule -ResourceGroupName myresourcegroup -DnsForwardingRulesetName myruleset -Name "AzurePrivate" -DomainName "azure.contoso.com" -ForwardingRuleState "Enabled" -TargetDnsServer $targetDNS3
+$forwardingrule = New-AzDnsForwardingRulesetForwardingRule -ResourceGroupName myresourcegroup -DnsForwardingRulesetName myruleset -Name "AzurePrivate" -DomainName "azure.contoso.com." -ForwardingRuleState "Enabled" -TargetDnsServer $targetDNS3
$forwardingrule = New-AzDnsForwardingRulesetForwardingRule -ResourceGroupName myresourcegroup -DnsForwardingRulesetName myruleset -Name "Wildcard" -DomainName "." -ForwardingRuleState "Enabled" -TargetDnsServer $targetDNS4 ```
dns Private Resolver Endpoints Rulesets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dns/private-resolver-endpoints-rulesets.md
Outbound endpoints are also part of the private virtual network address space wh
DNS forwarding rulesets enable you to specify one or more custom DNS servers to answer queries for specific DNS namespaces. The individual [rules](#rules) in a ruleset determine how these DNS names are resolved. Rulesets can also be linked to one or more virtual networks, enabling resources in the VNets to use the forwarding rules that you configure. Rulesets have the following associations: -- A single ruleset can be associated with multiple outbound endpoints.
+- A single ruleset can be associated with up to two outbound endpoints belonging to the same DNS Private Resolver instance. It can't be associated with outbound endpoints in two different DNS Private Resolver instances.
- A ruleset can have up to 1000 DNS forwarding rules. - A ruleset can be linked to up to 500 virtual networks in the same region
event-hubs Azure Event Hubs Kafka Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/azure-event-hubs-kafka-overview.md
As explained [above](#is-apache-kafka-the-right-solution-for-your-workload), the
The client-side [compression](https://cwiki.apache.org/confluence/display/KAFKA/Compression) feature of Apache Kafka compresses a batch of multiple messages into a single message on the producer side and decompresses the batch on the consumer side. The Apache Kafka broker treats the batch as a special message. Kafka producer application developers can enable message compression by setting the compression.type property. In the public preview, the only compression algorithm supported is gzip.
- Compression.type = none | gzip
-These changes are exposed in the header, which then allows the consumer to properly decompress the data. The feature is currently only supported for Apache Kafka traffic producer and consumer traffic and not AMQP or web service traffic.
-The payload of any Event Hubs event is a byte stream and the content can be compressed with an algorithm of your choosing though in public preview, the only option is gzip. The benefits of using Kafka compression are through smaller message size, increased payload size you can transmit, and lower message broker resource consumption.
+`Compression.type = none | gzip`
+
+The feature is currently only supported for Apache Kafka producer and consumer traffic, not AMQP or web service traffic. The payload of any Event Hubs event is a byte stream and the content can be compressed with an algorithm of your choosing, though in public preview the only option is gzip. The benefits of using Kafka compression include smaller message sizes, larger payloads that you can transmit, and lower message broker resource consumption.
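The batch-level compression described above can be illustrated outside Kafka. This is a minimal Python sketch of the principle (not the Kafka wire format), showing why gzip shrinks a batch of similar messages:

```python
import gzip
import json

# A batch of similar messages, as a producer might accumulate before sending.
batch = [{"id": i, "reading": 20.0 + i} for i in range(100)]
raw = json.dumps(batch).encode("utf-8")

# With compression.type=gzip, the producer compresses the whole batch once...
compressed = gzip.compress(raw)

# ...and the consumer transparently decompresses it back into the batch.
restored = json.loads(gzip.decompress(compressed))

assert restored == batch
print(f"{len(raw)} bytes raw -> {len(compressed)} bytes compressed")
```

Because the messages in a batch share structure, compressing the batch as a whole is far more effective than compressing each message individually.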
### Kafka Streams
expressroute Expressroute About Virtual Network Gateways https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/expressroute/expressroute-about-virtual-network-gateways.md
Before you create an ExpressRoute gateway, you must create a gateway subnet. The
>[!NOTE] >[!INCLUDE [vpn-gateway-gwudr-warning.md](../../includes/vpn-gateway-gwudr-warning.md)] >
->- Linking a private DNS resolver to the virtual network where the ExpressRoute virtual network gateway is deployed may cause management connectivity issues and is not recommended.
+>- Linking an Azure DNS private resolver to the virtual network where the ExpressRoute virtual network gateway is deployed may cause management connectivity issues and is not recommended.
When you create the gateway subnet, you specify the number of IP addresses that the subnet contains. The IP addresses in the gateway subnet are allocated to the gateway VMs and gateway services. Some configurations require more IP addresses than others.
expressroute Expressroute Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/expressroute/expressroute-introduction.md
For more information, see [About ExpressRoute Direct](./expressroute-erdirect-ab
### Bandwidth options
-You can purchase ExpressRoute circuits for a wide range of bandwidths. The supported bandwidths are listed as followed. Be sure to check with your connectivity provider to determine the bandwidths they support.
+You can purchase ExpressRoute circuits for a wide range of bandwidths. The supported bandwidths are listed as follows. Be sure to check with your connectivity provider to determine the bandwidths they support.
* 50 Mbps * 100 Mbps
iot-operations Howto Configure Data Lake https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/connect-to-cloud/howto-configure-data-lake.md
Title: Send data from Azure IoT MQ to Data Lake Storage
-#
+ description: Learn how to send data from Azure IoT MQ to Data Lake Storage. + - ignite-2023 Previously updated : 11/01/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to understand how to configure Azure IoT MQ so that I can send data from Azure IoT MQ to Data Lake Storage.
iot-operations Howto Configure Kafka https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/connect-to-cloud/howto-configure-kafka.md
Title: Send and receive messages between Azure IoT MQ and Azure Event Hubs or Kafka
-#
+ description: Learn how to send and receive messages between Azure IoT MQ and Azure Event Hubs or Kafka. + - ignite-2023 Previously updated : 10/31/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to understand how to configure Azure IoT MQ to send and receive messages between Azure IoT MQ and Kafka.
iot-operations Howto Configure Mqtt Bridge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/connect-to-cloud/howto-configure-mqtt-bridge.md
Title: Connect MQTT bridge cloud connector to other MQTT brokers
-#
+ description: Bridge Azure IoT MQ to another MQTT broker. + - ignite-2023 Previously updated : 11/02/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to bridge Azure IoT MQ to another MQTT broker so that I can integrate Azure IoT MQ with other messaging systems.
iot-operations Concept About Distributed Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/develop/concept-about-distributed-apps.md
Title: Develop highly available distributed applications
-#
+ description: Learn how to develop highly available distributed applications that work with Azure IoT MQ. + - ignite-2023 Previously updated : 10/26/2023 Last updated : 11/15/2023 #CustomerIntent: As an developer, I want understand how to develop highly available distributed applications for my IoT Operations solution.
iot-operations Concept About State Store Protocol https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/develop/concept-about-state-store-protocol.md
Title: About Azure IoT MQ state store protocol
-#
+ description: Learn about the fundamentals of the Azure IoT MQ state store protocol
iot-operations Howto Develop Dapr Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/develop/howto-develop-dapr-apps.md
Title: Use Dapr to develop distributed application workloads
-#
+ Title: Use Dapr to develop distributed applications
+ description: Develop distributed applications that talk with Azure IoT MQ using Dapr.
iot-operations Howto Develop Mqttnet Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/develop/howto-develop-mqttnet-apps.md
Title: Use MQTTnet to develop distributed application workloads
-#
+ description: Develop distributed applications that talk with Azure IoT MQ using MQTTnet. + - ignite-2023 Previously updated : 10/29/2023 Last updated : 11/15/2023 #CustomerIntent: As an developer, I want to understand how to use MQTTnet to develop distributed apps that talk with Azure IoT MQ.
iot-operations Howto Autodetect Opcua Assets Using Akri https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-devices-assets/howto-autodetect-opcua-assets-using-akri.md
Last updated 11/14/2023
-# CustomerIntent: As an industrial edge IT or operations user, I want to autodetect and create OPC UA data sources in my
+# CustomerIntent: As an industrial edge IT or operations user, I want to discover and create OPC UA data sources in my
# industrial edge environment so that I can reduce manual configuration overhead.
iot-operations Howto Configure Opc Plc Simulator https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-devices-assets/howto-configure-opc-plc-simulator.md
Title: Configure an OPC PLC simulator+ description: How to configure an OPC PLC simulator
iot-operations Howto Configure Opcua Authentication Options https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-devices-assets/howto-configure-opcua-authentication-options.md
Title: Configure OPC UA authentication options+ description: How to configure OPC UA authentication options to use with Azure IoT OPC UA Broker
iot-operations Overview Akri https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-devices-assets/overview-akri.md
With Azure IoT Akri, you can dynamically provision devices like the following ex
Azure IoT Akri employs standard Kubernetes primitives. The use of Kubernetes primitives lets users apply their expertise creating applications or managing infrastructure. Small devices connected in an Akri-configured site can appear as Kubernetes resources, just like memory or CPUs. The Azure IoT Akri controller enables the cluster operator to start brokers, jobs or other workloads for individual connected devices or groups of devices. These Azure IoT Akri device configurations and properties remain in the cluster so that if there's node failure, other nodes can pick up any lost work. ## Using Azure IoT Akri to discover OPC UA assets
-Azure IoT Akri is a turnkey solution that enables you to autodetect and create assets connected to an OPC UA server at the edge. Azure IoT Akri discovers devices at the edge and maps them to assets. The assets send telemetry to upstream connectors. By using Azure IoT Akri, you eliminate the painstaking process of manually configuring from the cloud and onboarding the assets to your cluster.
+Azure IoT Akri is a turnkey solution that enables you to discover and create assets connected to an OPC UA server at the edge. Azure IoT Akri discovers devices at the edge and maps them to assets. The assets send telemetry to upstream connectors. By using Azure IoT Akri, you eliminate the painstaking process of manually configuring from the cloud and onboarding the assets to your cluster.
The Azure IoT Operations Preview documentation provides guidance for detecting assets at the edge, by using the Azure IoT Operations OPC UA discovery handler and broker. You can use these components to process your OPC UA data and telemetry.
To learn more about the CNCF Akri, see the following open source resources.
In this article, you learned how Azure IoT Akri works and how it enables you to detect devices and add assets at the edge. Here's the suggested next step: > [!div class="nextstepaction"]
-> [Autodetect assets using Azure IoT Akri](howto-autodetect-opcua-assets-using-akri.md)
+> [Discover assets using Azure IoT Akri](howto-autodetect-opcua-assets-using-akri.md)
iot-operations Overview Manage Assets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-devices-assets/overview-manage-assets.md
Title: Manage assets overview+ description: Understand the options to manage the assets that are part of your Azure IoT Operations Preview solution.
The following diagram shows the high-level architecture of Azure IoT Operations.
- **Azure IoT Operations Experience (preview)**. The Operations Experience portal is a web app that lets you create and manage assets, and configure data processing pipelines. The portal simplifies the task of managing assets. Operations Experience is the recommended service to manage assets. - **Azure Device Registry (preview)**. The Device Registry is a service that projects industrial assets as Azure resources. It works together with the Operations Experience to streamline the process of managing assets. Device Registry lets you manage all your assets in the cloud, as true Azure resources contained in a single unified registry. -- **Azure IoT Akri (preview)**. Azure IoT Akri is a service that automatically discovers assets at the edge. The service can detect and create assets in the address space of an OPC UA Server.
+- **Azure IoT Akri (preview)**. Azure IoT Akri is a service that discovers assets at the edge. The service can detect and create assets in the address space of an OPC UA Server.
- **Azure IoT OPC UA Broker (preview)**. OPC UA Broker is a data exchange service that enables assets to exchange data with Azure IoT Operations, based on the widely used OPC UA standard. Azure IoT Operations uses OPC UA Broker to exchange data between OPC UA servers and the Azure IoT MQ service. Each of these services is explained in greater detail in the following sections that discuss use cases for managing assets.
The following features are supported in Azure Device Registry:
|Asset as Azure resource (supports ARG, resource groups, tags, etc.)| Supported | ``✅`` |
-## Discover edge assets automatically
-A common task in complex edge solutions is to discover assets and add them to your Kubernetes cluster. Azure IoT Akri provides this capability. It enables you to automatically detect and add OPC UA assets to your cluster. For administrators who attach devices to or remove them from the cluster, using Azure IoT Akri reduces the level of coordination and manual configuration.
+## Discover edge assets
+A common task in complex edge solutions is to discover assets and add them to your Kubernetes cluster. Azure IoT Akri provides this capability. It enables you to detect and add OPC UA assets to your cluster. For administrators who attach devices to or remove them from the cluster, using Azure IoT Akri reduces the level of coordination and manual configuration.
An Azure IoT Akri deployment can include fixed-network discovery handlers. Discovery handlers enable assets from known network endpoints to find leaf devices as they appear on device interfaces or local subnets. Examples of network endpoints include OPC UA servers at a fixed IP (without network scanning), and network scanning discovery handlers.
-When you install Azure IoT Operations, Azure IoT Akri is installed and configured along with a simulated OPC UA PLC server. Azure IoT Akri should discover the simulated server and expose it as a resource on your cluster, so that you can start to work with automated asset discovery.
+When you install Azure IoT Operations, Azure IoT Akri is installed and configured along with a simulated OPC UA PLC server. Azure IoT Akri should discover the simulated server and expose it as a resource on your cluster, so that you can start to work with asset discovery.
## Use a common data exchange standard for your edge solution A critical need in industrial environments is to have a common standard or protocol for machine-to-machine and machine-to-cloud data exchange. By using a widely supported data exchange protocol, you can simplify the process to enable diverse industrial assets to exchange data with each other, with workloads running in your Kubernetes cluster, and with the cloud. [OPC UA](https://opcfoundation.org/about/opc-technologies/opc-ua/) is a specification for a platform independent service-oriented architecture that enables data exchange in industrial environments.
iot-operations Howto Configure Aks Edge Essentials Layered Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-layered-network/howto-configure-aks-edge-essentials-layered-network.md
Title: Configure Layered Network Management service to enable Azure IoT Operations in an isolated network
-#
+ description: Configure Layered Network Management service to enable Azure IoT Operations in an isolated network. + - ignite-2023 Previously updated : 10/30/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to Azure Arc enable AKS Edge Essentials clusters using Layered Network Management so that I have secure isolate devices.
iot-operations Howto Configure L3 Cluster Layered Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-layered-network/howto-configure-l3-cluster-layered-network.md
Title: Configure level 3 cluster in an Azure IoT Layered Network Management isolated network
-#
+ description: Prepare a level 3 cluster and connect it to the IoT Layered Network Management service + - ignite-2023 Previously updated : 11/07/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure Layered Network Management so that I have secure isolate devices.
iot-operations Howto Configure L4 Cluster Layered Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-layered-network/howto-configure-l4-cluster-layered-network.md
Title: Configure Azure IoT Layered Network Management on level 4 cluster
-#
+ description: Deploy and configure Azure IoT Layered Network Management on a level 4 cluster. + - ignite-2023 Previously updated : 11/07/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure Layered Network Management so that I have secure isolate devices.
iot-operations Howto Configure Layered Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-layered-network/howto-configure-layered-network.md
Title: Create sample network environment for Azure IoT Layered Network Management
-#
+ description: Set up a test or sample network environment for Azure IoT Layered Network Management. + - ignite-2023 Previously updated : 11/07/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure Layered Network Management so that I have secure isolate devices.
iot-operations Howto Deploy Aks Layered Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-layered-network/howto-deploy-aks-layered-network.md
Title: Deploy Azure IoT Layered Network Management to an AKS cluster
-#
+ description: Configure Azure IoT Layered Network Management to an AKS cluster. + - ignite-2023 Previously updated : 11/07/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure Layered Network Management so that I have secure isolate devices.
iot-operations Overview Layered Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-layered-network/overview-layered-network.md
Title: What is Azure IoT Layered Network Management?
-#
+ description: Learn about Azure IoT Layered Network Management. + - ignite-2023 Previously updated : 10/24/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want understand how to use Azure IoT Layered Network Management to secure my devices.
iot-operations Howto Configure Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-mqtt-connectivity/howto-configure-authentication.md
Title: Configure Azure IoT MQ authentication
-#
+ description: Configure Azure IoT MQ authentication. + - ignite-2023 Previously updated : 11/07/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure authentication so that I have secure MQTT broker communications.
iot-operations Howto Configure Authorization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-mqtt-connectivity/howto-configure-authorization.md
Title: Configure Azure IoT MQ authorization
-#
+ description: Configure Azure IoT MQ authorization using BrokerAuthorization. + - ignite-2023 Previously updated : 10/28/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure authorization so that I have secure MQTT broker communications.
iot-operations Howto Configure Availability Scale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-mqtt-connectivity/howto-configure-availability-scale.md
Title: Configure core MQTT broker settings
-#
+ description: Configure core MQTT broker settings for high availability, scale, memory usage, and disk-backed message buffer behavior. + - ignite-2023 Previously updated : 10/27/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to understand the settings for the MQTT broker so that I can configure it for high availability and scale.
iot-operations Howto Configure Brokerlistener https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-mqtt-connectivity/howto-configure-brokerlistener.md
Title: Secure Azure IoT MQ communication using BrokerListener
-#
+ description: Understand how to use the BrokerListener resource to secure Azure IoT MQ communications including authorization, authentication, and TLS. -++ - ignite-2023 Previously updated : 11/05/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want understand options to secure MQTT communications for my IoT Operations solution.
iot-operations Howto Configure Tls Auto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-mqtt-connectivity/howto-configure-tls-auto.md
Title: Configure TLS with automatic certificate management to secure MQTT communication
-#
+ description: Configure TLS with automatic certificate management to secure MQTT communication between the MQTT broker and client. + - ignite-2023 Previously updated : 10/29/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure IoT MQ to use TLS so that I have secure communication between the MQTT broker and client.
iot-operations Howto Configure Tls Manual https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-mqtt-connectivity/howto-configure-tls-manual.md
Title: Configure TLS with manual certificate management to secure MQTT communication
-#
+ description: Configure TLS with manual certificate management to secure MQTT communication between the MQTT broker and client. + - ignite-2023 Previously updated : 10/29/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure IoT MQ to use TLS so that I have secure communication between the MQTT broker and client.
iot-operations Howto Manage Secrets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-mqtt-connectivity/howto-manage-secrets.md
Title: Manage secrets using Azure Key Vault or Kubernetes secrets
-#
+ description: Learn how to manage secrets using Azure Key Vault or Kubernetes secrets. + - ignite-2023 Previously updated : 10/30/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure IoT MQ to use Azure Key Vault or Kubernetes secrets so that I can securely manage secrets.
iot-operations Howto Test Connection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-mqtt-connectivity/howto-test-connection.md
Title: Test connectivity to IoT MQ with MQTT clients
-#
+ description: Learn how to use common and standard MQTT tools to test connectivity to Azure IoT MQ. + - ignite-2023 Previously updated : 11/01/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator or developer, I want to test MQTT connectivity with tools that I'm already familiar with to know that I set up my Azure IoT MQ broker correctly.
iot-operations Overview Iot Mq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/manage-mqtt-connectivity/overview-iot-mq.md
Title: Publish and subscribe MQTT messages using Azure IoT MQ
-#
+ description: Use Azure IoT MQ to publish and subscribe to messages. Destinations include other MQTT brokers, Azure IoT Data Processor, and Azure cloud services. + - ignite-2023 Previously updated : 10/30/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to understand how to I can use Azure IoT MQ to publish and subscribe MQTT topics.
iot-operations Howto Configure Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/monitor/howto-configure-diagnostics.md
Title: Configure Azure IoT MQ diagnostics service
-#
+ Title: Configure MQ diagnostics service
+ description: How to configure Azure IoT MQ diagnostics service.
iot-operations Howto Configure Observability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/monitor/howto-configure-observability.md
Title: Configure observability
-#
+ description: Configure observability features in Azure IoT Operations to monitor the health of your solution.
iot-operations Observability Metrics Akri https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/reference/observability-metrics-akri.md
Title: Metrics for Azure IoT Akri
-#
+ description: Available observability metrics for Azure IoT Akri to monitor the health and performance of your solution.
iot-operations Observability Metrics Layered Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/reference/observability-metrics-layered-network.md
Title: Metrics for Azure IoT Layered Network Management
-#
+ description: Available observability metrics for Azure IoT Layered Network Management to monitor the health and performance of your solution.
iot-operations Observability Metrics Mq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/reference/observability-metrics-mq.md
Title: Metrics for Azure IoT MQ
-#
+ description: Available observability metrics for Azure IoT MQ to monitor the health and performance of your solution.
iot-operations Observability Metrics Opcua Broker https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/reference/observability-metrics-opcua-broker.md
Title: Metrics for Azure IoT OPC UA Broker
-#
+ description: Available observability metrics for Azure IoT OPC UA Broker to monitor the health and performance of your solution.
iot-operations Tutorial Connect Event Grid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/send-view-analyze-data/tutorial-connect-event-grid.md
Title: Configure MQTT bridge between IoT MQ and Azure Event Grid
-#
+ description: Learn how to configure IoT MQ for bi-directional MQTT bridge with Azure Event Grid MQTT broker PaaS. + Previously updated : 11/13/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to configure IoT MQ to bridge to Azure Event Grid MQTT broker PaaS so that I can process my IoT data at the edge and in the cloud.
iot-operations Tutorial Event Driven With Dapr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/send-view-analyze-data/tutorial-event-driven-with-dapr.md
Title: Build event-driven apps with Dapr
-#
+ description: Learn how to create a Dapr application that aggregates data and publishes it on another topic
To verify the MQTT bridge is working, deploy an MQTT client to the cluster.
1. Verify the application is outputting a sliding windows calculation for the various sensors: ```json
- {"timestamp": "2023-11-14T05:21:49.807684+00:00", "window_size": 30, "temperature": {"min": 551.805, "max": 599.746, "mean": 579.929, "median": 581.917, "75_per": 591.678, "count": 29}, "pressure": {"min": 290.361, "max": 299.949, "mean": 295.98575862068964, "median": 296.383, "75_per": 298.336, "count": 29}, "vibration": {"min": 0.00114438, "max": 0.00497965, "mean": 0.0033943155172413792, "median": 0.00355337, "75_per": 0.00433423, "count": 29}}
+ {
+ "timestamp": "2023-11-16T21:59:53.939690+00:00",
+ "window_size": 30,
+ "temperature": {
+ "min": 553.024,
+ "max": 598.907,
+ "mean": 576.4647857142858,
+ "median": 577.4905,
+ "75_per": 585.96125,
+ "count": 28
+ },
+ "pressure": {
+ "min": 290.605,
+ "max": 299.781,
+ "mean": 295.521,
+ "median": 295.648,
+ "75_per": 297.64050000000003,
+ "count": 28
+ },
+ "vibration": {
+ "min": 0.00124192,
+ "max": 0.00491257,
+ "mean": 0.0031171810714285715,
+ "median": 0.003199235,
+ "75_per": 0.0038769150000000003,
+ "count": 28
+ }
+ }
```

## Optional - Create the Dapr application
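The per-sensor values in the JSON above (min, max, mean, median, 75th percentile, count over a window) can be sketched in a few lines of Python. This is an illustrative computation only, not the tutorial's actual Dapr application code:

```python
import statistics

def window_stats(values):
    """Summarize one sliding window of sensor readings (illustrative sketch)."""
    values = sorted(values)
    return {
        "min": values[0],
        "max": values[-1],
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        # quantiles with n=4 returns quartiles; index 2 is the 75th percentile
        # (default "exclusive" method in the statistics module).
        "75_per": statistics.quantiles(values, n=4)[2],
        "count": len(values),
    }

readings = [553.0, 560.5, 571.2, 577.5, 586.0, 598.9]
print(window_stats(readings))
```

The Dapr app in the tutorial maintains one such window per sensor (temperature, pressure, vibration) and publishes the combined result as the JSON payload shown above.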
iot-operations Tutorial Real Time Dashboard Fabric https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/send-view-analyze-data/tutorial-real-time-dashboard-fabric.md
Title: Build a real-time dashboard in Microsoft Fabric with MQTT data
-#
+ description: Learn how to build a real-time dashboard in Microsoft Fabric using MQTT data from IoT MQ + Previously updated : 11/13/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to learn how to build a real-time dashboard in Microsoft Fabric using MQTT data from IoT MQ.
iot-operations Tutorial Upload Mqtt Lakehouse https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-operations/send-view-analyze-data/tutorial-upload-mqtt-lakehouse.md
Title: Upload MQTT data to Microsoft Fabric lakehouse
-#
+ description: Learn how to upload MQTT data from the edge to a Fabric lakehouse + Previously updated : 11/13/2023 Last updated : 11/15/2023 #CustomerIntent: As an operator, I want to learn how to send MQTT data from the edge to a lakehouse in the cloud.
logic-apps Workflow Assistant Standard https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/workflow-assistant-standard.md
+
+ Title: Get AI-assisted help for Standard workflows
+description: Try the workflow assistant for AI-powered help with Standard workflows in Azure Logic Apps.
+
+ms.suite: integration
++ Last updated : 11/17/2023++
+# Get AI-powered help for Standard workflows in Azure Logic Apps (preview)
++
+> [!IMPORTANT]
+> This capability is in preview and is subject to the
+> [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+
+If you're new to Standard workflows in Azure Logic Apps or updating a workflow built by other developers, you might have many questions about workflows, connectors, their operations, and other tasks that you need to complete in Azure Logic Apps. For example, Azure Logic Apps provides 1,000+ connectors for you to use. How do you choose the best ones for your workflow?
+
+In the Azure portal, within the Standard workflow designer, the workflow assistant offers a chat box so that you can ask questions about the currently open workflow or about Azure Logic Apps in general. The assistant generates answers and provides access to Azure Logic Apps documentation and best practices. When you use the assistant, you don't have to switch context to search or browse for documentation online.
++
+The workflow assistant delivers curated information based on reputable knowledge sources, such as the Azure Logic Apps documentation on Microsoft Learn, connector schemas, and tech community blogs. The assistant can also build responses using the currently open workflow in the designer. That way, you can learn how to complete tasks specific to your workflow's context. For example, you can ask how to configure a specific action in the workflow, get recommendations about the action's inputs or outputs, learn how to test that data, and so on.
+
+> [!IMPORTANT]
+>
+> The workflow assistant doesn't collect, save, store, or share any personal or customer data in your
+> Standard logic app workflows nor any information in your chat history. You can open the assistant only
+> from the designer for Standard workflows in the Azure portal, not in Visual Studio Code. You can use the
+> assistant in all Azure regions where Standard workflows and single-tenant Azure Logic Apps are available.
+> However, the assistant currently supports only English for questions and responses.
+>
+> The workflow assistant follows responsible practices in accordance with the
+> [Azure Privacy policy](https://azure.microsoft.com/explore/trusted-cloud/privacy)
+> and follows responsible and ethical AI practices in accordance with the
+> [Microsoft responsible AI principles and approach](https://www.microsoft.com/ai/principles-and-approach).
+> For more information, see [Azure customer data protection](../security/fundamentals/protection-customer-data.md)
+> and [Microsoft data protection and privacy](https://www.microsoft.com/trust-center/privacy).
+
+## Prerequisites
+
+- An Azure account and subscription. If you don't have a subscription, [sign up for a free Azure account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+
+- A [new or existing Standard logic app workflow](create-single-tenant-workflows-azure-portal.md).
+
+## Open the workflow assistant
+
+1. In the [Azure portal](https://portal.azure.com), open your Standard logic app resource and workflow in the designer.
+
+1. On the workflow toolbar, select **Assistant**.
+
+ :::image type="content" source="media/workflow-assistant-standard/open-workflow-assistant.png" alt-text="Screenshot shows Azure portal, Standard logic app with workflow designer opened, and workflow toolbar with Assistant selected." lightbox="media/workflow-assistant-standard/open-workflow-assistant.png":::
+
+ The **Workflow assistant** pane opens on the designer's left side:
+
+ :::image type="content" source="media/workflow-assistant-standard/chat-open-first-time.png" alt-text="Screenshot shows Azure portal, Standard workflow designer, and open workflow assistant pane." lightbox="media/workflow-assistant-standard/chat-open-first-time.png":::
+
+## Example ways to use the assistant
+
+The following table includes only some example use cases, so please share your feedback with the Azure Logic Apps team about how you use the workflow assistant to improve your productivity.
+
+| Use case | Example question | Description |
+|----------|------------------|-------------|
+| Describe the currently open workflow. | **"What does this workflow do?"** | Useful when you use or update a workflow built by other developers or when collaborating with other developers on shared workflows. |
+| Get help with connectors. | - **"Which connectors can send email?"** <br><br>- **"What does the Request trigger do?"** | Useful when you're not sure which connector to use, what connectors are available, or need specific information about a connector. <br><br>The workflow assistant can provide recommendations on connectors or operations, provide best practices about how to use a connector, provide comparisons between connectors, and so on. |
+| Suggest guidance based on your specific scenario. | **"How do I create a workflow that checks an RSS feed and sends me the feed items?"** | Recommend step-by-step information about how to build a workflow based on your scenario, including which connectors to use, how to configure them, and how to process the data. |
+| Recommend patterns. | **"What's a best practice for error handling in my workflow?"** | Provide guidance and best practices for error handling, testing, and other optimizations. |
+
+## Ask your question
+
+1. In the chat box, enter your question about the current workflow or about Azure Logic Apps.
+
+ The following example asks the question, **"What is a trigger?"**
+
+ :::image type="content" source="media/workflow-assistant-standard/ask-question.png" alt-text="Screenshot shows Azure portal, Standard workflow designer, open workflow assistant pane, and chat box with a question entered." lightbox="media/workflow-assistant-standard/ask-question.png":::
+
+ The workflow assistant researches your question and generates an answer, for example:
+
+ :::image type="content" source="media/workflow-assistant-standard/question-response.png" alt-text="Screenshot shows open workflow assistant pane, and chat box with a generated answer to the previously entered question." lightbox="media/workflow-assistant-standard/question-response.png":::
+
+ > [!NOTE]
+ >
+ > If you close the workflow assistant pane, your chat history isn't saved or preserved.
+    > When you reopen the assistant, you start with a new chat box.
+
+1. [Provide optional feedback about your experience with the workflow assistant](#provide-feedback).
+
+## Limitations
+
+- Inaccurate responses
+
+  The workflow assistant can generate responses that appear valid but aren't semantically correct or don't capture the intent behind your prompt. As the language model trains with more data over time, the responses will improve. Always carefully review the assistant's recommendations before you apply them to your workflows.
+
+- No support for conversation threads
+
+  The workflow assistant currently responds only to the immediate question, not to earlier questions in the same chat session.
+
+- Workflow size
+
+  You might experience different performance levels in the workflow assistant, based on factors such as the number of operations in the workflow or its complexity. The assistant is trained on workflows with different complexity levels but still has limited scope and might not handle very large workflows. These limitations are primarily related to token constraints in the queries sent to Azure OpenAI Service. The Azure Logic Apps team is committed to addressing these limitations through iterative updates.
+
+<a name="provide-feedback"></a>
+
+## Provide feedback
+
+The Azure Logic Apps team values your feedback and encourages you to share your experiences, especially if you encounter unexpected responses or have any concerns about the workflow assistant.
+
+In the chat pane, under the workflow assistant's response, choose an option:
+
+- Share constructive feedback about the workflow assistant or its responses.
+
+ 1. Select the thumbs-down icon:
+
+ ![Screenshot shows workflow assistant pane feedback options with down-vote icon selected.](media/workflow-assistant-standard/thumbs-down.png)
+
+ 1. Provide the following information:
+
+ | Item | Description |
+        |------|-------------|
+ | Difficulty | Rate the difficulty level for using the assistant. |
+ | Value | Rate the value that the assistant provided in helping you with your workflow or Azure Logic Apps. |
+ | Comments | Include the following information: <br><br>- The question you asked <br>- Relevant information about your workflow <br>- The assistant's response |
+ | **It's OK to contact me about my feedback** | Select whether you want Microsoft or the Azure Logic Apps team to contact you. |
+
+ 1. When you're done, select **Submit**.
+
+- Report problems with the workflow assistant.
+
+ 1. Select **Report a bug**:
+
+ ![Screenshot shows workflow assistant pane feedback options with selected option for Report a bug.](media/workflow-assistant-standard/report-bug.png)
+
+ The link opens a GitHub page for the Azure Logic Apps customer feedback bug report template.
+
+ 1. Follow the template's prompts to provide the required information and other details about the problem.
+
+ 1. When you're done, select **Submit new issue**.
+
+## Frequently asked questions (FAQ)
+
+**Q**: Can the workflow assistant answer questions about any topic?
+
+**A**: The workflow assistant is trained to answer only questions about Azure Logic Apps. To make sure that responses are grounded and relevant to Azure Logic Apps, the assistant was evaluated using valid and harmful prompts from various sources. The assistant is trained to not answer any harmful questions. If you ask questions about Azure that are unrelated to Azure Logic Apps, the assistant gracefully hands off processing to Azure Copilot.
+
+**Q**: How does the workflow assistant use my query to generate responses?
+
+**A**: The workflow assistant is powered by [Azure OpenAI Service](../ai-services/openai/overview.md) and [ChatGPT](https://openai.com/blog/chatgpt), which use Azure Logic Apps documentation from reputable sources along with internet data that's used to train GPT-3.5 Turbo. This content is processed into a vectorized format, which is then accessible through a backend system built on Azure App Service. Queries are triggered based on interactions with the workflow designer.
+
+When you enter your question in the assistant's chat box, the Azure Logic Apps backend performs preprocessing and forwards the results to a large language model in Azure Open AI Service. This model generates responses based on the current context in the form of the workflow definition's JSON code and your prompt.
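To make the grounding step concrete, here's a minimal, purely illustrative sketch. The real backend's preprocessing is internal to Azure Logic Apps; `build_prompt`, the prompt wording, and the sample workflow fields below are all hypothetical:

```python
import json

def build_prompt(workflow_definition: dict, question: str) -> str:
    """Hypothetical sketch: combine a sanitized workflow JSON definition
    with the user's question so a language model can answer in context."""
    context = json.dumps(workflow_definition, indent=2)
    return (
        "You are an assistant for Azure Logic Apps.\n"
        f"Current workflow definition:\n{context}\n\n"
        f"Question: {question}"
    )

# A minimal, fake workflow definition for illustration
workflow = {"triggers": {"Request": {"type": "Request"}}, "actions": {}}
print(build_prompt(workflow, "What does this workflow do?"))
```

The key point is that only the sanitized definition travels with the prompt, which is why responses can be scoped to the open workflow without storing it.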
+
+**Q**: What data does the workflow assistant collect?
+
+**A**: To provide contextual responses, the workflow assistant relies on your workflow's sanitized JSON definition, which is used only to scope the responses and isn't stored anywhere. The workflow definition is sanitized to make sure that no customer data or secrets are passed as context. For troubleshooting purposes, the assistant collects some telemetry about UI interactions, but omits any customer or personal data.
+
+**Q**: What happens to any personal or customer data entered in the workflow assistant?
+
+**A**: The workflow assistant doesn't collect, save, store, or share any personal or customer data, including any information in the workflow assistant's chat history.
+
+**Q**: Where can I learn about privacy and data protection for Azure?
+
+**A**: The workflow assistant follows responsible practices in accordance with the [Azure Privacy policy](https://azure.microsoft.com/explore/trusted-cloud/privacy). For more information, see [Azure customer data protection](../security/fundamentals/protection-customer-data.md) and [Microsoft data protection and privacy](https://www.microsoft.com/trust-center/privacy).
+
+**Q**: Where can I learn about responsible and ethical AI practices at Microsoft?
+
+**A**: The workflow assistant follows responsible and ethical AI practices in accordance with the [Microsoft responsible AI principles and approach](https://www.microsoft.com/ai/principles-and-approach).
+
+**Q**: Does Azure Logic Apps own the workflows suggested by the workflow assistant?
+
+**A**: The workflow assistant doesn't own the suggestions that the assistant provides to you nor the workflows that you build based on these suggestions.
+
+**Q**: What's the difference between Azure OpenAI Service and ChatGPT?
+
+**A**: [Azure OpenAI Service](../ai-services/openai/overview.md) is an enterprise-ready AI service that's optimized for your business processes and your business data, and designed to meet security and privacy requirements.
+
+[ChatGPT](https://openai.com/blog/chatgpt) is a general-purpose large language model (LLM) built by [OpenAI](https://openai.com) and trained on a massive dataset of text, designed to engage in human-like conversations and answer a wide range of questions on many topics.
+
+## Next steps
+
+[Create an example Standard workflow in single-tenant Azure Logic Apps](create-single-tenant-workflows-azure-portal.md)
machine-learning How To Use Batch Model Openai Embeddings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-use-batch-model-openai-embeddings.md
+
+ Title: 'Run OpenAI models in batch endpoints'
+
+description: In this article, learn how to use batch endpoints with OpenAI models.
+++++++ Last updated : 11/04/2023+++
+# Run OpenAI models in batch endpoints to compute embeddings
++
+Batch Endpoints can deploy models to run inference over large amounts of data, including OpenAI models. In this example, you learn how to create a batch endpoint to deploy the ADA-002 model from OpenAI to compute embeddings at scale, but you can use the same approach for completions and chat completions models. The example uses Microsoft Entra authentication to grant access to the Azure OpenAI resource.
+
+## About this example
+
+In this example, we compute embeddings over a dataset by using the ADA-002 model from OpenAI. We register the model in MLflow format by using the OpenAI flavor, which supports orchestrating all the calls to the OpenAI service at scale.
++
+The files for this example are in:
+
+```azurecli
+cd endpoints/batch/deploy-models/openai-embeddings
+```
+
+### Follow along in Jupyter Notebooks
+
+You can follow along with this sample in the following notebook. In the cloned repository, open the notebook: [deploy-and-test.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb).
+
+## Prerequisites
+++
+### Ensure you have an OpenAI deployment
+
+The example shows how to run OpenAI models hosted in Azure OpenAI Service. To succeed, you need an Azure OpenAI resource correctly deployed in Azure and a deployment for the model you want to use.
++
+Take note of the OpenAI resource being used. We use the name to construct the URL of the resource. Save the URL for later use in the tutorial.
+
+# [Azure CLI](#tab/cli)
+
+```azurecli
+OPENAI_API_BASE="https://<your-azure-openai-resource-name>.openai.azure.com"
+```
+
+# [Python](#tab/python)
+
+```python
+openai_api_base="https://<your-azure-openai-resource-name>.openai.azure.com"
+```
+++
+### Ensure you have a compute cluster where you can deploy the endpoint
+
+Batch endpoints use a compute cluster to run the models. In this example, we use a compute cluster called **batch-cluster**. We create the compute cluster here, but you can skip this step if you already have one:
+
+# [Azure CLI](#tab/cli)
+
+```azurecli
+COMPUTE_NAME="batch-cluster"
+az ml compute create -n $COMPUTE_NAME --type amlcompute --min-instances 0 --max-instances 5
+```
+
+# [Python](#tab/python)
+
+[!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=create_compute)]
+++
+### Decide on the authentication mode
+
+You can access the Azure OpenAI resource in two ways:
+
+* Using Microsoft Entra authentication (recommended).
+* Using an access key.
+
+Using Microsoft Entra is recommended because it helps you avoid managing secrets in the deployments.
+
+# [Microsoft Entra authentication](#tab/ad)
+
+You can configure the identity of the compute to have access to the Azure OpenAI deployment to get predictions. In this way, you don't need to manage permissions for each user of the endpoint. To give the compute cluster's identity access to the Azure OpenAI resource, follow these steps:
+
+1. Ensure or assign an identity to the compute cluster your deployment uses. In this example, we use a compute cluster called **batch-cluster** and assign a system-assigned managed identity, but you can use other alternatives.
+
+ ```azurecli
+ COMPUTE_NAME="batch-cluster"
+ az ml compute update --name $COMPUTE_NAME --identity-type system_assigned
+ ```
+
+1. Get the managed identity principal ID assigned to the compute cluster you plan to use.
+
+ ```azurecli
+    PRINCIPAL_ID=$(az ml compute show -n $COMPUTE_NAME --query identity.principal_id -o tsv)
+ ```
+
+1. Get the unique ID of the resource group where the Azure OpenAI resource is deployed:
+
+ ```azurecli
+ RG="<openai-resource-group-name>"
+ RESOURCE_ID=$(az group show -g $RG --query "id" -o tsv)
+ ```
+
+1. Grant the role **Cognitive Services User** to the managed identity:
+
+ ```azurecli
+ az role assignment create --role "Cognitive Services User" --assignee $PRINCIPAL_ID --scope $RESOURCE_ID
+ ```
+
+# [Access keys](#tab/keys)
+
+You can get an access key and configure the batch deployment to use the access key to get predictions. Grab the access key from your account and keep it for future reference in this tutorial.
++++
+### Register the OpenAI model
+
+Model deployments in batch endpoints can only deploy registered models. You can use an MLflow model with the OpenAI flavor to create a model in your workspace that references a deployment in Azure OpenAI.
+
+1. Create an MLflow model in the workspace's model registry that points to your OpenAI deployment with the model you want to use. Use the MLflow SDK to create the model:
+
+ > [!TIP]
+    > In the cloned repository, the **model** folder already contains an MLflow model that generates embeddings based on the ADA-002 model, in case you want to skip this step.
+
+ ```python
+ import mlflow
+ import openai
+
+ engine = openai.Model.retrieve("text-embedding-ada-002")
+
+ model_info = mlflow.openai.save_model(
+ path="model",
+ model="text-embedding-ada-002",
+ engine=engine.id,
+ task=openai.Embedding,
+ )
+ ```
+
+1. Register the model in the workspace:
+
+ # [Azure CLI](#tab/cli)
+
+ :::code language="azurecli" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/deploy-and-run.sh" ID="register_model" :::
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=register_model)]
++
+## Create a deployment for an OpenAI model
+
+1. First, let's create the endpoint that hosts the model. Decide on the name of the endpoint:
+
+ # [Azure CLI](#tab/cli)
+
+ :::code language="azurecli" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/deploy-and-run.sh" ID="name_endpoint" :::
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=name_endpoint)]
++
+1. Configure the endpoint:
+
+ # [Azure CLI](#tab/cli)
+
+ The following YAML file defines a batch endpoint:
+
+ __endpoint.yml__
+
+ :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/endpoint.yml":::
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=configure_endpoint)]
+
+1. Create the endpoint resource:
+
+ # [Azure CLI](#tab/cli)
+
+ :::code language="azurecli" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/deploy-and-run.sh" ID="create_endpoint" :::
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=create_endpoint)]
+
+1. Our scoring script uses some specific libraries that aren't part of the standard OpenAI SDK, so we need to create an environment that has them. Here, we configure an environment with a base image and a conda YAML file.
+
+ # [Azure CLI](#tab/cli)
+
+ __environment/environment.yml__
+
+ :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/environment/environment.yml":::
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=configure_environment)]
+
+
+
+ The conda YAML looks as follows:
+
+ __conda.yaml__
+
+ :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/environment/conda.yaml":::
+
+1. Let's create a scoring script that performs the execution. In Batch Endpoints, MLflow models don't require a scoring script. However, in this case we want to extend the capabilities of batch endpoints a bit:
+
+ > [!div class="checklist"]
+ > * Allow the endpoint to read multiple data types, including `csv`, `tsv`, `parquet`, `json`, `jsonl`, `arrow`, and `txt`.
+ > * Add some validations to ensure the MLflow model used has an OpenAI flavor on it.
+ > * Format the output in `jsonl` format.
+ > * Add an environment variable `AZUREML_BI_TEXT_COLUMN` to control (optionally) which input field you want to generate embeddings for.
+
+ > [!TIP]
+    > By default, MLflow uses the first text column available in the input data to generate embeddings from. Use the environment variable `AZUREML_BI_TEXT_COLUMN` with the name of an existing column in the input dataset to change the column if needed. Leave it blank if the default behavior works for you.
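    The column-selection rule in the tip can be sketched as follows. This is a simplified illustration only, not the actual batch driver; `pick_text_column` is a hypothetical helper:

```python
import os

def pick_text_column(columns: dict) -> str:
    """Return the column to embed: the AZUREML_BI_TEXT_COLUMN override
    when it's set and present, otherwise the first column that holds
    string values. (Simplified sketch of the rule described above.)"""
    override = os.environ.get("AZUREML_BI_TEXT_COLUMN", "")
    if override and override in columns:
        return override
    for name, values in columns.items():
        if values and isinstance(values[0], str):
            return name
    raise ValueError("no text column found in input data")

# Example input: the integer column is skipped, the first text column wins
data = {"bill_id": [101, 102], "text": ["A bill ...", "An act ..."]}
print(pick_text_column(data))
```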
+
+ The scoring script looks as follows:
+
+ __code/batch_driver.py__
+
+ :::code language="python" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/code/batch_driver.py" :::
+
+1. Once the scoring script is created, it's time to create a batch deployment for it. We use environment variables to configure the OpenAI deployment. Specifically, we use the following keys:
+
+ * `OPENAI_API_BASE` is the URL of the Azure OpenAI resource to use.
+ * `OPENAI_API_VERSION` is the version of the API you plan to use.
+ * `OPENAI_API_TYPE` is the type of API and authentication you want to use.
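    As a rough sketch of how a script might assemble these settings (`load_openai_config` is a hypothetical helper, and the default API version shown is only an example; the actual scoring script ships in the repository):

```python
import os

def load_openai_config() -> dict:
    """Hypothetical helper: collect the OpenAI settings that the
    deployment passes through environment variables. Defaults to
    Microsoft Entra (azure_ad) authentication, which needs no key."""
    config = {
        "api_base": os.environ["OPENAI_API_BASE"],
        "api_version": os.environ.get("OPENAI_API_VERSION", "2023-05-15"),
        "api_type": os.environ.get("OPENAI_API_TYPE", "azure_ad"),
    }
    if config["api_type"] == "azure":  # key-based access needs a secret
        config["api_key"] = os.environ["OPENAI_API_KEY"]
    return config

os.environ["OPENAI_API_BASE"] = "https://example.openai.azure.com"
print(load_openai_config()["api_type"])  # azure_ad
```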
+
+ # [Microsoft Entra authentication](#tab/ad)
+
+    The environment variable `OPENAI_API_TYPE="azure_ad"` instructs OpenAI to use Microsoft Entra authentication, so no key is required to invoke the OpenAI deployment. The identity of the cluster is used instead.
+
+ # [Access keys](#tab/keys)
+
+ To use access keys instead of Microsoft Entra authentication, we need the following environment variables:
+
+ * Use `OPENAI_API_TYPE="azure"`
+ * Use `OPENAI_API_KEY="<YOUR_AZURE_OPENAI_KEY>"`
+
+1. Once we've decided on the authentication mode and the environment variables, we can use them in the deployment. The following example shows how to use Microsoft Entra authentication:
+
+ # [Azure CLI](#tab/cli)
+
+ __deployment.yml__
+
+ :::code language="yaml" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/deployment.yml" highlight="26-28":::
+
+ > [!TIP]
+ > Notice the `environment_variables` section where we indicate the configuration for the OpenAI deployment. The value for `OPENAI_API_BASE` will be set later in the creation command so you don't have to edit the YAML configuration file.
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=configure_deployment)]
+
+ > [!TIP]
+ > Notice the `environment_variables` section where we indicate the configuration for the OpenAI deployment.
+
+1. Now, let's create the deployment.
+
+ # [Azure CLI](#tab/cli)
+
+ :::code language="azurecli" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/deploy-and-run.sh" ID="create_deployment" :::
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=create_deployment)]
+
+ Finally, set the new deployment as the default one:
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=set_default_deployment)]
+
+1. At this point, our batch endpoint is ready to be used.
+
+## Test the deployment
+
+To test our endpoint, we use a sample of the dataset [BillSum: A Corpus for Automatic Summarization of US Legislation](https://arxiv.org/abs/1910.00523). This sample is included in the repository in the **data** folder.
+
+1. Create a data input for this model:
+
+ # [Azure CLI](#tab/cli)
+
+    ```azurecli
+    INPUT_PATH="data"
+    ```
+
+ # [Python](#tab/python)
+
+    ```python
+    from azure.ai.ml import Input
+
+    input = Input(type="uri_folder", path="data")
+    ```
+
+1. Invoke the endpoint:
+
+ # [Azure CLI](#tab/cli)
+
+ :::code language="azurecli" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/deploy-and-run.sh" ID="start_batch_scoring_job" :::
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=start_batch_scoring_job)]
+
+1. Track the progress:
+
+ # [Azure CLI](#tab/cli)
+
+ :::code language="azurecli" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/deploy-and-run.sh" ID="show_job_in_studio" :::
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=get_job)]
+
+1. Once the job is finished, we can download the predictions:
+
+ # [Azure CLI](#tab/cli)
+
+ To download the predictions, use the following command:
+
+ :::code language="azurecli" source="~/azureml-examples-main/cli/endpoints/batch/deploy-models/openai-embeddings/deploy-and-run.sh" ID="download_outputs" :::
+
+ # [Python](#tab/python)
+
+ [!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-models/openai-embeddings/deploy-and-test.ipynb?name=download_outputs)]
+
+1. The output predictions look like the following.
+
+ ```python
+ import pandas as pd
+
+ embeddings = pd.read_json("named-outputs/score/embeddings.jsonl", lines=True)
+ embeddings
+ ```
+
+ __embeddings.jsonl__
+
+    ```json
+    {"file": "billsum-0.csv", "row": 0, "embeddings": [[0, 0, 0, 0, 0, 0, 0]]}
+    {"file": "billsum-0.csv", "row": 1, "embeddings": [[0, 0, 0, 0, 0, 0, 0]]}
+    ```
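    Once downloaded, the `jsonl` records are easy to consume directly. For example, here's a minimal cosine-similarity comparison between two rows, shown with short synthetic vectors (real ADA-002 embeddings have 1,536 dimensions):

```python
import json
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

# Two synthetic records in the same shape as embeddings.jsonl
lines = [
    '{"file": "billsum-0.csv", "row": 0, "embeddings": [[1.0, 0.0, 0.0]]}',
    '{"file": "billsum-0.csv", "row": 1, "embeddings": [[1.0, 1.0, 0.0]]}',
]
records = [json.loads(line) for line in lines]
v0 = records[0]["embeddings"][0]
v1 = records[1]["embeddings"][0]
print(round(cosine(v0, v1), 4))  # 0.7071
```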
+
+## Next steps
+
+* [Create jobs and input data for batch endpoints](how-to-access-data-batch-endpoints-jobs.md)
machine-learning How To Use Batch Scoring Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-use-batch-scoring-pipeline.md
To deploy the pipeline component, we have to create a batch deployment. A deploy
# [Python](#tab/python)
- Our pipeline is defined in a function. To transform it to a component, you'll use the `build()` method. Pipeline components are reusable compute graphs that can be included in batch deployments or used to compose more complex pipelines.
+ Our pipeline is defined in a function. To transform it to a component, you'll use the `component` property from it. Pipeline components are reusable compute graphs that can be included in batch deployments or used to compose more complex pipelines.
[!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-pipelines/batch-scoring-with-preprocessing/sdk-deploy-and-test.ipynb?name=build_pipeline)]
machine-learning How To Use Batch Training Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-use-batch-training-pipeline.md
To deploy the pipeline component, we have to create a batch deployment. A deploy
# [Python](#tab/python)
- Our pipeline is defined in a function. To transform it to a component, you'll use the `build()` method. Pipeline components are reusable compute graphs that can be included in batch deployments or used to compose more complex pipelines.
+ Our pipeline is defined in a function. To transform it to a component, you'll use the `component` property from it. Pipeline components are reusable compute graphs that can be included in batch deployments or used to compose more complex pipelines.
[!notebook-python[] (~/azureml-examples-main/sdk/python/endpoints/batch/deploy-pipelines/training-with-components/sdk-deploy-and-test.ipynb?name=build_pipeline_component)]
managed-grafana Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/managed-grafana/overview.md
Previously updated : 10/27/2023 Last updated : 11/17/2023 # What is Azure Managed Grafana?
Azure Managed Grafana is available in the two service tiers presented below.
| Essential (preview) | Provides the core Grafana functionalities in use with Azure data sources. Since it doesn't provide an SLA guarantee, this tier should be used only for non-production environments. |
| Standard | The default tier, offering better performance, more features and an SLA. It's recommended for most situations. |
-The following table lists the main features supported in each tier:
+The [Azure Managed Grafana pricing page](https://azure.microsoft.com/pricing/details/managed-grafana/) gives more information on these tiers and the following table lists the main features supported in each tier:
| Feature | Essential (preview) | Standard |
|---------|---------------------|----------|
mysql Concepts High Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/concepts-high-availability.md
The overall failover time depends on the current workload and the last checkpoin
> [!NOTE] >Azure Resource Health event is generated in the event of planned failover, representing the failover time during which server was unavailable. The triggered events can be seen when clicked on "Resource Health" in the left pane. User initiated/ Manual failover is represented by status as **"Unavailable"** and tagged as **"Planned"**. Example - "A failover operation was triggered by an authorized user (Planned)". If your resource remains in this state for an extended period of time, please open a [support ticket](https://azure.microsoft.com/support/create-ticket/) and we will assist you.
-
+ ### Unplanned: Automatic failover Unplanned service downtime can be caused by software bugs or infrastructure faults like compute, network, or storage failures, or power outages that affect the availability of the database. If the database becomes unavailable, replication to the standby replica is severed and the standby replica is activated as the primary database. DNS is updated, and clients reconnect to the database server and resume their operations.
Here are some considerations to keep in mind when you use high availability:
- Zone-redundant high availability can be set only when the flexible server is created. - High availability isn't supported in the burstable compute tier. - Restarting the primary database server to pick up static parameter changes also restarts the standby replica.-- Data-in Replication isn't supported for HA servers. - GTID mode will be turned on as the HA solution uses GTID. Check whether your workload has [restrictions or limitations on replication with GTIDs](https://dev.mysql.com/doc/refman/5.7/en/replication-gtids-restrictions.html). + >[!Note] >If you're enabling same-zone HA after the server is created, make sure that the server parameters "enforce_gtid_consistency" and ["gtid_mode"](./concepts-read-replicas.md#global-transaction-identifier-gtid) are set to ON before enabling HA.
If there's a database crash or node failure, the Flexible Server VM is restarted
For zone-redundant HA, while there is no major performance impact for read workloads across availability zones, there might be up to 40 percent drop in write-query latency. The increase in write-latency is due to synchronous replication across Availability zone. The write latency impact is generally twice in zone redundant HA compared to the same zone HA. For same-zone HA, because the primary and the standby replica is in the same zone, the replication latency and consequently the synchronous write latency is lower. In summary, if write-latency is more critical for you compared to availability, you may want to choose same-zone HA but if availability and resiliency of your data is more critical for you at the expense of write-latency drop, you must choose zone-redundant HA. To measure the accurate impact of the latency drop in HA setup, we recommend you to perform performance testing for your workload to take an informed decision.</br> - **How does maintenance of my HA server happen?**</br>
-Planned events like scaling of compute and minor version upgrades happen on the primary and the standby at the same time. You can set the [scheduled maintenance window](./concepts-maintenance.md) for HA servers as you do for flexible servers. The amount of downtime will be the same as the downtime for the Azure Database for MySQL - Flexible Server when HA is disabled. </br>
+Planned events like scaling of compute and minor version upgrades happen on the original standby instance first, followed by a planned failover operation, and then happen on the original primary instance. You can set the [scheduled maintenance window](./concepts-maintenance.md) for HA servers as you do for flexible servers. The amount of downtime is the same as the downtime for the Azure Database for MySQL - Flexible Server when HA is disabled. </br>
- **Can I do a point-in-time restore (PITR) of my HA server?**</br> You can do a [PITR](./concepts-backup-restore.md#point-in-time-restore) for an HA-enabled Azure Database for MySQL - Flexible Server to a new Azure Database for MySQL - Flexible Server that has HA disabled. If the source server was created with zone-redundant HA, you can enable zone-redundant HA or same-zone HA on the restored server later. If the source server was created with same-zone HA, you can enable only same-zone HA on the restored server.</br>
You need to be able to mitigate downtime for your application even when you're n
Yes, read replicas are supported for HA servers.</br> - **Can I use Data-in Replication for HA servers?**</br>
-Data-in Replication isn't supported for HA servers. But Data-in Replication for HA servers is on our roadmap and will be available soon. For now, if you want to use Data-in Replication for migration, you can follow these steps:
- 1. Create the server with zone-redundant HA enabled.
- 1. Disable HA.
- 1. Complete the steps to [set up Data-in Replication](./concepts-data-in-replication.md). (Be sure `gtid_mode` has the same setting on the source and target servers.)
- 1. Post cutovers remove the Data-in Replication configuration.
- 1. Enable HA.
-
+Yes, Data-in Replication is supported for high availability (HA) enabled servers, but only through GTID-based replication.
- **To reduce downtime, can I fail over to the standby server during server restarts or while scaling up or down?** </br> Currently, Azure Database for MySQL - Flexible Server uses planned failover to optimize HA operations, including scaling up/down and planned maintenance, to help reduce downtime. When such an operation starts, it's performed on the original standby instance first, followed by a planned failover operation, and then performed on the original primary instance. </br>
If you create the server with Zone-redundant HA mode enabled then you can change
- Learn about [business continuity](./concepts-business-continuity.md). - Learn about [zone-redundant high availability](./concepts-high-availability.md). - Learn about [backup and recovery](./concepts-backup-restore.md).+
operator-nexus Concepts Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/operator-nexus/concepts-storage.md
Title: Azure Operator Nexus storage appliance description: Get an overview of storage appliance resources for Azure Operator Nexus.-+
items:
### Examples #### Read Write Once (RWO) with nexus-volume storage class The following manifest creates a StatefulSet with a PersistentVolumeClaimTemplate using the nexus-volume storage class in ReadWriteOnce mode.
-```dotnetcli
+```yaml
apiVersion: apps/v1
kind: StatefulSet
metadata:
spec:
storageClassName: nexus-volume ``` Each pod of the StatefulSet will have one PersistentVolumeClaim created.
-```dotnetcli
+```console
# kubectl get pvc
NAME                             STATUS   VOLUME                                     CAPACITY   ACCESS MODES   STORAGECLASS   AGE
test-volume-rwo-test-sts-rwo-0   Bound    pvc-e41fec47-cc43-4cd5-8547-5a4457cbdced   10Gi       RWO            nexus-volume   8m17s
test-volume-rwo-test-sts-rwo-1   Bound    pvc-1589dc79-59d2-4a1d-8043-b6a883b7881d   10Gi       RWO            nexus-volume   7m58s
test-volume-rwo-test-sts-rwo-2   Bound    pvc-82e3beac-fe67-4676-9c61-e982022d443f   10Gi       RWO            nexus-volume   12s
```
-```dotnetcli
+```console
# kubectl get pods -o wide -w
NAME             READY   STATUS    RESTARTS   AGE     IP              NODE                                         NOMINATED NODE   READINESS GATES
test-sts-rwo-0   1/1     Running   0          8m31s   10.245.231.74   nexus-cluster-6a8c4018-agentpool2-md-vhhv6   <none>           <none>
test-sts-rwo-1   1/1     Running   0          8m12s   10.245.126.73   nexus-cluster-6a8c4018-agentpool1-md-27nw4   <none>           <none>
test-sts-rwo-2   1/1     Running   0          26s     10.245.183.9    nexus-cluster-6a8c4018-agentpool1-md-4jprt   <none>           <none>
```
-```dotnetcli
+```console
# kubectl exec test-sts-rwo-0 -- cat /mnt/hostname.txt
Thu Nov 9 21:57:25 UTC 2023 -- test-sts-rwo-0
Thu Nov 9 21:57:26 UTC 2023 -- test-sts-rwo-0
Thu Nov 9 21:58:34 UTC 2023 -- test-sts-rwo-2
``` #### Read Write Many (RWX) with nexus-shared storage class The following manifest creates a Deployment with a PersistentVolumeClaim (PVC) using the nexus-shared storage class in ReadWriteMany mode. The PVC is shared by all pods of the deployment, and all of them can read from and write to it simultaneously.
-```dotnetcli
+```yaml
apiVersion: v1
kind: PersistentVolumeClaim
test-deploy-rwx-fdb8f49c-9zsjf 1/1 Running 0 18s 10.245.126
test-deploy-rwx-fdb8f49c-wdgw7 1/1 Running 0 18s 10.245.231.75 nexus-cluster-6a8c4018-agentpool2-md-vhhv6 <none> <none> ``` It can be observed from the output below that all pods are writing into the same PVC.
-```dotnetcli
+```console
# kubectl exec test-deploy-rwx-fdb8f49c-86pv4 -- cat /mnt/hostname.txt
Thu Nov 9 21:51:41 UTC 2023 -- test-deploy-rwx-fdb8f49c-86pv4
Thu Nov 9 21:51:41 UTC 2023 -- test-deploy-rwx-fdb8f49c-9zsjf
postgresql Concepts Networking Ssl Tls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/flexible-server/concepts-networking-ssl-tls.md
SELECT datname as "Database name", usename as "User name", ssl, client_addr, app
ON pg_stat_ssl.pid = pg_stat_activity.pid ORDER BY ssl; ```+
+> [!NOTE]
+> To enforce the **latest, most secure TLS version** for connections from clients to Azure Database for PostgreSQL - Flexible Server, set **ssl_min_protocol_version** to **1.3**. Clients connecting to your Azure Postgres server are then **required** to use **this version of the protocol only** to communicate securely. However, older clients that don't support this version may be unable to connect to the server.
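The same floor can also be enforced from the client side, so the driver refuses to negotiate anything older than TLS 1.3. Here's a minimal sketch using Python's standard `ssl` module (illustrative only; a libpq-based client would typically pass an `ssl_min_protocol_version=TLSv1.3` connection option instead):

```python
import ssl

# Client-side counterpart of the server's ssl_min_protocol_version setting:
# refuse to negotiate any protocol version older than TLS 1.3.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_3

print(ctx.minimum_version.name)  # TLSv1_3
```

Creating the context succeeds regardless of the server; the restriction only takes effect during the TLS handshake, where a peer limited to TLS 1.2 or older fails to connect.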
+ ## Cipher Suites A **cipher suite** is a set of cryptographic algorithms. TLS/SSL protocols use algorithms from a cipher suite to create keys and encrypt information.
postgresql Concepts Storage Extension https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/flexible-server/concepts-storage-extension.md
Azure Blob Storage can provide the following benefits:
To load data from Azure Blob Storage, you need to [allowlist](../../postgresql/flexible-server/concepts-extensions.md#how-to-use-postgresql-extensions) the **azure_storage** extension and install the **azure_storage** PostgreSQL extension in this database using the CREATE EXTENSION command: ```sql
-SELECT * FROM create_extension('azure_storage');
+ CREATE EXTENSION azure_storage;
``` When you create a storage account, Azure generates two 512-bit storage **account access keys** for that account. These keys can be used to authorize access to data in your storage account via Shared Key authorization, or via SAS tokens that are signed with the shared key. Therefore, before you can import the data, you need to map the storage account by using the **account_add** method, providing the **account access key** that was defined when the account was created. The code snippet shows mapping the storage account *'mystorageaccount'*, where the access key parameter is shown as the string *'SECRET_ACCESS_KEY'*.
The **COPY** command and **blob_get** function support following file extension
To export data from PostgreSQL Flexible Server to Azure Blob Storage, you need to [allowlist](../../postgresql/flexible-server/concepts-extensions.md#how-to-use-postgresql-extensions) the **azure_storage** extension and install the **azure_storage** PostgreSQL extension in the database using the CREATE EXTENSION command: ```sql
-SELECT * FROM create_extension('azure_storage');
+CREATE EXTENSION azure_storage;
``` When you create a storage account, Azure generates two 512-bit storage **account access keys** for that account. These keys can be used to authorize access to data in your storage account via Shared Key authorization, or via SAS tokens that are signed with the shared key. Therefore, before you can import the data, you need to map the storage account by using the account_add method, providing the **account access key** that was defined when the account was created. The code snippet shows mapping the storage account *'mystorageaccount'*, where the access key parameter is shown as the string *'SECRET_ACCESS_KEY'*.
The **COPY** command and **blob_put** function support following file extension
To list objects in Azure Blob Storage, you need to [allowlist](../../postgresql/flexible-server/concepts-extensions.md#how-to-use-postgresql-extensions) the **azure_storage** extension and install the **azure_storage** PostgreSQL extension in the database using the CREATE EXTENSION command: ```sql
-SELECT * FROM create_extension('azure_storage');
+CREATE EXTENSION azure_storage;
``` When you create a storage account, Azure generates two 512-bit storage **account access keys** for that account. These keys can be used to authorize access to data in your storage account via Shared Key authorization, or via SAS tokens that are signed with the shared key. Therefore, before you can import the data, you need to map the storage account by using the account_add method, providing the **account access key** that was defined when the account was created. The code snippet shows mapping the storage account *'mystorageaccount'*, where the access key parameter is shown as the string *'SECRET_ACCESS_KEY'*.
postgresql Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/flexible-server/overview.md
One advantage of running your workload in Azure is global reach. The flexible se
| Norway West | :heavy_check_mark: (v3/v4 only) | :x: | :heavy_check_mark: | :x: | | Qatar Central | :heavy_check_mark: (v3/v4 only) | :heavy_check_mark: | :heavy_check_mark: | :x: | | South Africa North | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
-| South Central US | :heavy_check_mark: (v3/v4 only) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
+| South Central US | :heavy_check_mark: (v3/v4 only) | :x: $ | :heavy_check_mark: | :heavy_check_mark: |
| South India | :heavy_check_mark: | :x: | :heavy_check_mark: | :heavy_check_mark: | | Southeast Asia | :heavy_check_mark:(v3/v4 only) | :x: $ | :heavy_check_mark: | :heavy_check_mark: | | Sweden Central | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
postgresql Concepts Version Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/postgresql/single-server/concepts-version-policy.md
-+ Last updated 09/14/2022
Azure Database for PostgreSQL supports the following database versions.
| PostgreSQL 13 | | X | | PostgreSQL 12 | | X | | PostgreSQL 11 | X | X |
-| PostgreSQL 10 | X | |
+| *PostgreSQL 10 (retired)* | See [policy](#retired-postgresql-engine-versions-not-supported-in-azure-database-for-postgresql) | |
| *PostgreSQL 9.6 (retired)* | See [policy](#retired-postgresql-engine-versions-not-supported-in-azure-database-for-postgresql) | | | *PostgreSQL 9.5 (retired)* | See [policy](#retired-postgresql-engine-versions-not-supported-in-azure-database-for-postgresql) | |
The table below provides the retirement details for PostgreSQL major versions. T
| - | - | | - | | [PostgreSQL 9.5 (retired)](https://www.postgresql.org/about/news/postgresql-132-126-1111-1016-9621-and-9525-released-2165/)| [Features](https://www.postgresql.org/docs/9.5/release-9-5.html) | April 18, 2018 | February 11, 2021 | [PostgreSQL 9.6 (retired)](https://www.postgresql.org/about/news/postgresql-96-released-1703/) | [Features](https://wiki.postgresql.org/wiki/NewIn96) | April 18, 2018 | November 11, 2021
-| [PostgreSQL 10](https://www.postgresql.org/about/news/postgresql-10-released-1786/) | [Features](https://wiki.postgresql.org/wiki/New_in_postgres_10) | June 4, 2018 | November 10, 2022
+| [PostgreSQL 10 (retired)](https://www.postgresql.org/about/news/postgresql-10-released-1786/) | [Features](https://wiki.postgresql.org/wiki/New_in_postgres_10) | June 4, 2018 | November 10, 2022
| [PostgreSQL 11](https://www.postgresql.org/about/news/postgresql-11-released-1894/) | [Features](https://www.postgresql.org/docs/11/release-11.html) | July 24, 2019 | November 9, 2024 [Single Server, Flexible Server] | | [PostgreSQL 12](https://www.postgresql.org/about/news/postgresql-12-released-1976/) | [Features](https://www.postgresql.org/docs/12/release-12.html) | Sept 22, 2020 | November 14, 2024 | [PostgreSQL 13](https://www.postgresql.org/about/news/postgresql-13-released-2077/) | [Features](https://www.postgresql.org/docs/13/release-13.html) | May 25, 2021 | November 13, 2025
route-server Hub Routing Preference Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/route-server/hub-routing-preference-cli.md
Title: Configure routing preference - Azure CLI
-description: Learn how to configure routing preference in Azure Route Server using the Azure CLI to influence its route selection.
+description: Learn how to configure routing preference (Preview) in Azure Route Server using the Azure CLI to influence its route selection.
Last updated 11/15/2023-
- - devx-track-azurecli
- - ignite-2023
+
-#CustomerIntent: As an Azure administrator, I want learn how to use routing preference setting so that I can influence route selection in Azure Route Server by using the Azure CLI.
+#CustomerIntent: As an Azure administrator, I want to learn how to use the routing preference setting so that I can influence route selection in Azure Route Server using the Azure CLI.
# Configure routing preference to influence route selection using the Azure CLI
-Learn how to use routing preference setting in Azure Route Server to influence its route learning and selection. For more information, see [Routing preference](hub-routing-preference.md).
+Learn how to use the routing preference setting in Azure Route Server to influence its route learning and selection. For more information, see [Routing preference (Preview)](hub-routing-preference.md).
+
+> [!IMPORTANT]
+> Routing preference is currently in PREVIEW. See the [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Prerequisites
route-server Hub Routing Preference Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/route-server/hub-routing-preference-portal.md
Title: Configure routing preference - Azure portal
-description: Learn how to configure routing preference in Azure Route Server using the Azure portal to influence its route selection.
+description: Learn how to configure routing preference (Preview) in Azure Route Server using the Azure portal to influence its route selection.
-
- - ignite-2023
Last updated 11/15/2023
-#CustomerIntent: As an Azure administrator, I want learn how to use routing preference setting so that I can influence route selection in Azure Route Server.
+#CustomerIntent: As an Azure administrator, I want to learn how to use the routing preference setting so that I can influence route selection in Azure Route Server using the Azure portal.
# Configure routing preference to influence route selection using the Azure portal
-Learn how to use routing preference setting in Azure Route Server to influence its route learning and selection. For more information, see [Routing preference](hub-routing-preference.md).
+Learn how to use the routing preference setting in Azure Route Server to influence its route learning and selection. For more information, see [Routing preference (Preview)](hub-routing-preference.md).
+
+> [!IMPORTANT]
+> Routing preference is currently in PREVIEW. See the [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Prerequisites
route-server Hub Routing Preference Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/route-server/hub-routing-preference-powershell.md
Title: Configure routing preference - PowerShell
-description: Learn how to configure routing preference in Azure Route Server using Azure PowerShell to influence its route selection.
+description: Learn how to configure routing preference (Preview) in Azure Route Server using Azure PowerShell to influence its route selection.
Last updated 11/15/2023-
- - devx-track-azurepowershell
- - ignite-2023
+
-#CustomerIntent: As an Azure administrator, I want learn how to use routing preference setting so that I can influence route selection in Azure Route Server by using Azure PowerShell.
+#CustomerIntent: As an Azure administrator, I want to learn how to use the routing preference setting so that I can influence route selection in Azure Route Server using Azure PowerShell.
# Configure routing preference to influence route selection using PowerShell
-Learn how to use routing preference setting in Azure Route Server to influence its route learning and selection. For more information, see [Routing preference](hub-routing-preference.md).
+Learn how to use the routing preference setting in Azure Route Server to influence its route learning and selection. For more information, see [Routing preference (Preview)](hub-routing-preference.md).
+
+> [!IMPORTANT]
+> Routing preference is currently in PREVIEW. See the [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
## Prerequisites
route-server Hub Routing Preference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/route-server/hub-routing-preference.md
Title: Routing preference
+ Title: Routing preference (Preview)
-description: Learn about Azure Route Server routing preference feature to change how it can learn routes.
+description: Learn about Azure Route Server routing preference feature to change how the Route Server can learn routes.
-
- - ignite-2023
 Last updated 11/15/2023 #CustomerIntent: As an Azure administrator, I want to learn about the routing preference feature so that I know how to influence route selection in Azure Route Server.
-# Routing preference
+# Routing preference (Preview)
Azure Route Server enables dynamic routing between network virtual appliances (NVAs) and virtual networks (VNets). In addition to supporting third-party NVAs, Route Server also seamlessly integrates with ExpressRoute and VPN gateways. Route Server uses built-in route selection algorithms to make routing decisions to set connection preferences. You can configure routing preference to influence how Route Server selects routes that it learned across site-to-site (S2S) VPN, ExpressRoute and SD-WAN NVAs for the same on-premises destination route prefix.
+> [!IMPORTANT]
+> Routing preference is currently in PREVIEW. See the [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+ ## Routing preference configuration When Route Server has multiple routes to an on-premises destination prefix, Route Server selects the best route(s) in order of preference, as follows:
search Hybrid Search Ranking https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/hybrid-search-ranking.md
For more information, see [How to work with search results](search-pagination-pa
The following diagram illustrates a hybrid query that invokes keyword and vector search, with boosting through scoring profiles, and semantic ranking. A query that generates the previous workflow might look like this:
search Search Limits Quotas Capacity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-limits-quotas-capacity.md
Previously updated : 08/09/2023 Last updated : 11/16/2023 - references_regions - ignite-2023
When estimating document size, remember to consider only those fields that can b
## Vector index size limits
-When you index documents with vector fields, we construct internal vector indexes using the algorithm parameters you provide. The size of these vector indexes is restricted by the memory reserved for vector search for your service's tier (or SKU).
+When you index documents with vector fields, Azure AI Search constructs internal vector indexes using the algorithm parameters you provide. The size of these vector indexes is restricted by the memory reserved for vector search for your service's tier (or SKU).
The service enforces a vector index size quota **for every partition** in your search service. Each extra partition increases the available vector index size quota. This quota is a hard limit to ensure your service remains healthy, which means that further indexing attempts once the limit is exceeded result in failure. You may resume indexing once you free up available quota, either by deleting some vector documents or by scaling up in partitions.
-The table describes the vector index size quota per partition across the service tiers (or SKU). For context, it includes the [storage limits](#storage-limits) for each tier. Use the [Get Service Statistics API (GET /servicestats)](/rest/api/searchservice/get-service-statistics) to retrieve your vector index size quota.
+The table describes the vector index size quota per partition across the service tiers (or SKU). For context, it includes:
-See our [documentation on vector index size](./vector-search-index-size.md) for more details.
++ [Storage limits](#storage-limits) for each tier, repeated here for context.
++ Amount of each partition (in GB) available for vector indexes (created when you add vector fields to an index).
++ Approximate number of embeddings (floating point values) per partition.
+
+Use the [Get Service Statistics API (GET /servicestats)](/rest/api/searchservice/get-service-statistics) to retrieve your vector index size quota. See our [documentation on vector index size](vector-search-index-size.md) for more details.
### Services created prior to July 1, 2023
-| Tier | Storage quota (GB) | Vector index size quota per partition (GB) | Approx. floats per partition (assuming 15% overhead) |
+| Tier | Storage quota (GB) | Vector quota per partition (GB) | Approx. floats per partition (assuming 15% overhead) |
| -- | | | - | | Basic | 2 | 0.5 | 115 million | | S1 | 25 | 1 | 235 million |
The following regions **do not** support increased limits:
- Jio India West - Qatar Central
-| Tier | Storage quota (GB) | Vector index size quota per partition (GB) | Approx. floats per partition (assuming 15% overhead) |
+| Tier | Storage quota (GB) | Vector quota per partition (GB) | Approx. floats per partition (assuming 15% overhead) |
| -- | | | - | | Basic | 2 | 1 | 235 million | | S1 | 25 | 3 | 700 million |
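The "approx. floats per partition" column can be sanity-checked from the quota itself: usable memory is the quota minus the assumed 15% algorithm overhead, divided by 4 bytes per float. A quick sketch (the function name and the GiB reading of "GB" are assumptions for illustration):

```python
def approx_floats(quota_gb: float, overhead: float = 0.15, float_bytes: int = 4) -> float:
    """Approximate number of float32 embedding values that fit in a vector
    quota after reserving `overhead` for the algorithm's data structures."""
    return quota_gb * 1024**3 / (1 + overhead) / float_bytes

# 1 GB and 3 GB quotas land close to the table's ~235 million and ~700 million.
print(f"{approx_floats(1) / 1e6:.0f} million, {approx_floats(3) / 1e6:.0f} million")
```

A document's vector fields consume this budget at (dimensions × 4 bytes) per field, so the per-partition float count translates directly into a ceiling on documents × dimensions.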
search Tutorial Csharp Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/tutorial-csharp-overview.md
ms.devlang: csharp
# 1 - Overview of adding search to a website with .NET
-This tutorial builds a website to search through a catalog of books then deploys the website to an Azure Static Web App.
+This tutorial builds a website to search through a catalog of books and then deploys the website to an Azure Static Web App.
-The application is available:
+## What does the sample do?
-* [Sample](https://github.com/azure-samples/azure-search-dotnet-samples/tree/main/search-website-functions-v4)
-* [Demo website - aka.ms/azs-good-books](https://aka.ms/azs-good-books)
-
-## What does the sample do?
[!INCLUDE [tutorial-overview](includes/tutorial-add-search-website-what-sample-does.md)] ## How is the sample organized?
-The [sample](https://github.com/Azure-Samples/azure-search-dotnet-samples/tree/main/search-website-functions-v4) includes the following:
+The [sample code](https://github.com/Azure-Samples/azure-search-dotnet-samples/tree/main/search-website-functions-v4) includes the following:
|App|Purpose|GitHub<br>Repository<br>Location| |--|--|--|
search Tutorial Javascript Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/tutorial-javascript-overview.md
In this Azure AI Search tutorial, create a web app that searches through a catal
This tutorial is for JavaScript developers who want to create a frontend client app that includes search interactions like faceted navigation, typeahead, and pagination. It also demonstrates the `@azure/search-documents` library in the Azure SDK for JavaScript for calls to Azure AI Search for indexing and query workflows on the backend.
-Source code is available in the [azure-search-javascript-samples](https://github.com/Azure-Samples/azure-search-javascript-samples/tree/main/search-website-functions-v4) GitHub repository.
- ## What does the sample do? [!INCLUDE [tutorial-overview](includes/tutorial-add-search-website-what-sample-does.md)] ## How is the sample organized?
-The [sample](https://github.com/Azure-Samples/azure-search-javascript-samples/tree/main/search-website-functions-v4) includes the following components:
+The [sample code](https://github.com/Azure-Samples/azure-search-javascript-samples/tree/main/search-website-functions-v4) includes the following components:
|App|Purpose|GitHub<br>Repository<br>Location| |--|--|--|
search Tutorial Python Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/tutorial-python-overview.md
ms.devlang: python
This tutorial builds a website to search through a catalog of books and then deploys the website to an Azure Static Web App.
-The application is available:
-* [Sample](https://github.com/Azure-Samples/azure-search-python-samples/tree/main/search-website-functions-v4)
-* [Demo website - aka.ms/azs-good-books](https://aka.ms/azs-good-books)
- ## What does the sample do? [!INCLUDE [tutorial-overview](includes/tutorial-add-search-website-what-sample-does.md)] ## How is the sample organized?
-The [sample](https://github.com/Azure-Samples/azure-search-python-samples/tree/main/search-website-functions-v4) includes the following:
+The [sample code](https://github.com/Azure-Samples/azure-search-python-samples/tree/main/search-website-functions-v4) includes the following:
|App|Purpose|GitHub<br>Repository<br>Location| |--|--|--|
search Vector Search How To Query https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/vector-search-how-to-query.md
Here's a modified example so that you can see the basic structure of a response
**Key points:**
-+ `k` usually determines how many matches are returned. You can assume a `k` of three for this response.
++ `k` determines how many nearest neighbor results are returned. In the example above, a `k` value of three was used. Vector queries always return `k` results, assuming at least `k` documents exist, even if there are documents with poor similarity, because the algorithm identifies only the `k` nearest neighbors to the query vector. As a result, both count and facet aggregations (facet counts) operate on this `k` recall set.
+
++ The **`@search.score`** is determined by the [vector search algorithm](vector-search-ranking.md) (the HNSW algorithm and a `cosine` similarity metric in this example).
+
++ Fields include text and vector values. The content vector field consists of 1536 dimensions for each match, so it's truncated for brevity (normally, you might exclude vector fields from results). The text fields used in the response (`"select": "title, category"`) aren't used during query execution. The match is made on vector data alone. However, a response can include any "retrievable" field in an index. As such, the inclusion of text fields is helpful because their values are easily recognized by users. ## Vector query with filter
search Vector Search Index Size https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/vector-search-index-size.md
- ignite-2023 Previously updated : 11/07/2023 Last updated : 11/16/2023 # Vector index size limit
The size of vector indexes is measured in bytes. The size constraints are based
The service enforces a vector index size quota **based on the number of partitions** in your search service, where the quota per partition varies by tier and also by service creation date (see [Vector index size limits](search-limits-quotas-capacity.md#vector-index-size-limits) in service limits).
-Each extra partition that you add to your service increases the available vector index size quota. This quota is a hard limit to ensure your service remains healthy. It also means that if vector size exceeds this limit, any further indexing requests will result in failure. You can resume indexing once you free up available quota by either deleting some vector documents or by scaling up in partitions.
+Each extra partition that you add to your service increases the available vector index size quota. This quota is a hard limit to ensure your service remains healthy. It also means that if vector size exceeds this limit, any further indexing requests result in failure. You can resume indexing once you free up available quota by either deleting some vector documents or by scaling up in partitions.
-The following limits are for newer search services created *after July 1, 2023*. For more information, including limits for older search services, see [Search service limits](search-limits-quotas-capacity.md).
+The following table shows vector quotas by partition, and by service if all partitions are in use. This table is for newer search services created *after July 1, 2023*. For more information, including limits for older search services and also limits on the approximate number of embeddings per partition, see [Search service limits](search-limits-quotas-capacity.md).
-| Tier | Storage (GB) |Partitions | Vector quota per partition (GB) | Vector quota per service (GB) |
-| -- | - | -|-- | - |
-| Basic | 2 | 1 | 1 | 1 |
-| S1 | 25 | 12 | 3 | 36 |
-| S2 | 100 | 12 |12 | 144 |
-| S3 | 200 | 12 |36 | 432 |
-| L1 | 1,000 | 12 |12 | 144 |
-| L2 | 2,000 | 12 |36 | 432 |
+| Tier | Partitions | Storage (GB) | Vector quota per partition (GB) | Vector quota per service (GB) |
+| -- | - | --|-- | -- |
+| Basic | 1 | 2 | 1 | 1 |
+| S1 | 12 | 25 | 3 | 36 |
+| S2 | 12 | 100 | 12 | 144 |
+| S3 | 12 | 200 | 36 | 432 |
+| L1 | 12 | 1,000 | 12 | 144 |
+| L2 | 12 | 2,000 | 36 | 432 |
**Key points**:
-+ Storage quota is the physical storage available to the search service for all search data. Basic has one partition sized at 2 GB that must accommodate all of the data on the service. S1 can have 12 partitions sized at 25 GB each, for a maximum limit of 300 GB for all search data.
++ Storage quota is the physical storage available to the search service for all search data. Basic has one partition sized at 2 GB that must accommodate all of the data on the service. S1 can have up to 12 partitions, sized at 25 GB each, for a maximum limit of 300 GB for all search data.
-+ Vector quotas for are the vector indexes created for each vector field, and they're enforced at the partition level. On Basic, the sum total of all vector fields can't be more than 1 GB because Basic only has one partition. On S1, which can have up to 12 partitions, the quota for vector data is 3 GB if you've only allocated one partition, or up to 36 GB if you've allocated 12 partitions. For more information about partitions and replicas, see [Estimate and manage capacity](search-capacity-planning.md).
++ Vector quotas apply to the vector indexes created for each vector field, and they're enforced at the partition level. On Basic, the sum total of all vector fields can't be more than 1 GB because Basic has only one partition. On S1, which can have up to 12 partitions, the quota for vector data is 3 GB if you allocate just one partition, or up to 36 GB if you allocate all 12 partitions. For more information about partitions and replicas, see [Estimate and manage capacity](search-capacity-planning.md). ## How to get vector index size
For `Edm.Single`, the size of the data type is 4 bytes.
### Memory Overhead from the Selected Algorithm
-Every approximate nearest neighbor (ANN) algorithm generates additional data structures in memory to enable efficient searching. These structures consume extra space within memory.
+Every approximate nearest neighbor (ANN) algorithm generates extra data structures in memory to enable efficient searching. These structures consume extra space within memory.
**For the HNSW algorithm, the memory overhead ranges between 1% and 20%.**
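Combining the 4-byte `Edm.Single` size with the algorithm overhead gives a back-of-the-envelope estimate of vector index size: raw vector bytes times one plus the overhead. A sketch (the 10% default is an assumed midpoint of the 1-20% HNSW range, and the function name is made up):

```python
def vector_index_bytes(num_vectors: int, dimensions: int,
                       overhead: float = 0.10, float_bytes: int = 4) -> float:
    """Estimated vector index size: raw Edm.Single vector data
    (4 bytes per dimension) plus the ANN algorithm's memory overhead."""
    raw = num_vectors * dimensions * float_bytes
    return raw * (1 + overhead)

# 1 million 1536-dimension embeddings at 10% overhead -> roughly 6.3 GiB
print(f"{vector_index_bytes(1_000_000, 1536) / 1024**3:.2f} GiB")
```

Comparing the estimate against the per-partition quota for your tier tells you roughly how many documents with vector fields a partition can hold before indexing requests start failing.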
search Vector Search Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/vector-search-overview.md
You can index vector data as fields in documents alongside alphanumeric content.
Vector search is available as part of all Azure AI Search tiers in all regions at no extra charge.
+Newer services created after July 1, 2023 support [higher quotas for vector indexes](vector-search-index-size.md).
+ > [!NOTE] > Some older search services created before January 1, 2019 are deployed on infrastructure that doesn't support vector workloads. If you try to add a vector field to a schema and get an error, it's a result of outdated services. In this situation, you must create a new search service to try out the vector feature.
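To picture what a vector query does conceptually, here's a toy brute-force nearest-neighbor search over made-up two-dimensional "embeddings" (a sketch only; Azure AI Search uses approximate algorithms such as HNSW, and real embeddings have hundreds or thousands of dimensions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def knn(query, docs, k):
    """Brute-force k-nearest-neighbor search by cosine similarity.
    Exactly k results come back (when at least k docs exist), even
    for poor matches -- there is no similarity cutoff."""
    ranked = sorted(docs, key=lambda name: cosine(query, docs[name]), reverse=True)
    return ranked[:k]

# Hypothetical document vectors for illustration.
docs = {
    "doc-close": [1.0, 0.0],
    "doc-near": [0.9, 0.1],
    "doc-far": [-1.0, 0.0],   # opposite direction, still returned when k=3
}
print(knn([1.0, 0.0], docs, k=3))  # ['doc-close', 'doc-near', 'doc-far']
```

Note that exactly `k` documents come back whenever at least `k` exist; matching is by vector proximity alone, with no similarity threshold.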
service-bus-messaging Configure Customer Managed Key https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/configure-customer-managed-key.md
After you enable customer-managed keys, you need to associate the customer manag
> * If you are looking to enable Geo-DR on a Service Bus namespace where customer managed key is already set up, then - > * [Set up the access policy](../key-vault/general/assign-access-policy-portal.md) for the managed identity for the secondary namespace to the key vault. > * Pair the primary and secondary namespaces.
+ >
+ > * Once paired, the secondary namespace will use the key vault configured for the primary namespace. If the two namespaces used different key vaults before Geo-DR pairing, you must delegate an access policy or RBAC role to the managed identity of the secondary namespace in the key vault associated with the primary namespace.
## Managed identities There are two types of managed identities that you can assign to a Service Bus namespace.
service-bus-messaging Enable Partitions Premium https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/enable-partitions-premium.md
Service Bus partitions enable queues and topics, or messaging entities, to be pa
> - When creating a partitioned namespace in a region [that supports Availability Zones](service-bus-outages-disasters.md#availability-zones), availability zone support is automatically enabled on the namespace. > - Multiple partitions with lower messaging units (MU) give you better performance than a single partition with higher MUs. > - When using the Service Bus [Geo-disaster recovery](service-bus-geo-dr.md) feature, ensure not to pair a partitioned namespace with a non-partitioned namespace.
+> - It is not possible to [migrate](service-bus-migrate-standard-premium.md) a standard SKU namespace to a Premium SKU partitioned namespace.
+> - JMS is currently not supported on partitioned namespaces.
> - The feature is currently available in the regions noted below. New regions will be added regularly, and we'll keep this article updated with the latest regions as they become available. > > | | | | | |
-> |--|-|-||--|
-> | Australia Central | Central US | Germany West Central | South Central US | West Central US |
-> | Australia Southeast | East Asia | Japan West | South India | West Europe |
-> | Canada Central | East US | North Central US | UAE North | West US |
-> | Canada East | East US 2 EUAP | North Europe | UK South | West US 3 |
-> | Central India | France Central | Norway East | UK West | |
+> |--|-||-|--|
+> | Australia Central | Central US | Italy North | Poland Central | UK South |
+> | Australia East | East Asia | Japan West | South Central US | UK West |
+> | Australia Southeast | East US | Malaysia South | South India | West Central US |
+> | Brazil Southeast | East US 2 EUAP | Mexico Central | Spain Central | West Europe |
+> | Canada Central | France Central | North Central US | Switzerland North | West US |
+> | Canada East | Germany West Central | North Europe | Switzerland West | West US 3 |
+> | Central India | Israel Central | Norway East | UAE North | |
## Use Azure portal When creating a **namespace** in the Azure portal, set the **Partitioning** to **Enabled** and choose the number of partitions, as shown in the following image.
service-bus-messaging Service Bus Geo Dr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/service-bus-geo-dr.md
The Geo-Disaster recovery feature ensures that the entire configuration of a nam
- Identities and encryption settings (customer-managed key encryption or bring your own key (BYOK) encryption)
- Enable auto scale
- Disable local authentication
+- Pairing a [partitioned namespace](enable-partitions-premium.md) with a non-partitioned namespace is not supported.
> [!TIP] > For replicating the contents of queues and topic subscriptions and operating corresponding namespaces in active/active configurations to cope with outages and disasters, don't lean on this Geo-disaster recovery feature set, but follow the [replication guidance](service-bus-federation-overview.md).
service-bus-messaging Service Bus Migrate Standard Premium https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/service-bus-migrate-standard-premium.md
This article describes how to migrate existing standard tier namespaces to the p
Some of the points to note: - This migration is meant to happen in place, meaning that existing sender and receiver applications **don't require any changes to code or configuration**. The existing connection string will automatically point to the new premium namespace.-- If you're using an existing premium name, the **premium** namespace should have **no entities** in it for the migration to succeed.
+- If you're using an existing **premium** namespace, it should have **no entities** in it for the migration to succeed, and should not have [partitioning enabled](enable-partitions-premium.md).
- All **entities** in the standard namespace are **copied** to the premium namespace during the migration process. - Migration supports **1,000 entities per messaging unit** on the premium tier. To identify how many messaging units you need, start with the number of entities that you have on your current standard namespace. - You can't directly migrate from **basic tier** to **premium tier**, but you can do so indirectly by migrating from basic to standard first and then from the standard to premium in the next step.
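Given the 1,000-entities-per-messaging-unit limit above, a quick back-of-envelope calculation gives the minimum number of messaging units (MUs) to provision. This is an illustrative sketch only (`messaging_units_needed` is a hypothetical helper); premium namespaces are provisioned in fixed MU sizes, so round the result up to the nearest available size:

```python
import math

def messaging_units_needed(entity_count: int, entities_per_mu: int = 1000) -> int:
    """Minimum messaging units to hold all migrated entities,
    at 1,000 entities per MU (rounded up)."""
    return math.ceil(entity_count / entities_per_mu)

# Example: a standard namespace with 2,500 entities needs at least 3 MUs.
minimum_mus = messaging_units_needed(2500)
```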
service-connector Concept Service Connector Internals https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/concept-service-connector-internals.md
Last updated 01/17/2023 - # Service Connector internals Service Connector is an Azure extension resource provider designed to provide a simple way to create and manage connections between Azure services.
Service Connector offers the following features:
The concept of *service connection* is a key concept in the resource model of Service Connector. A service connection represents an abstraction of the link between two services. Service connections have the following properties: | Property | Description |
-||-|
+| - | -- |
| Connection Name | The unique name of the service connection. |
-| Source Service Type | Source services are services you can connect to target services. They are usually Azure compute services and they include Azure App Service, Azure Container Apps and Azure Spring Apps. |
+| Source Service Type | Source services are services you can connect to target services. They're usually Azure compute services, including Azure App Service, Azure Functions, Azure Container Apps, and Azure Spring Apps. |
| Target Service Type | Target services are backing services or dependency services that your compute services connect to. Service Connector supports various target service types including major databases, storage, real-time services, state, and secret stores. |
-| Client Type | Client type refers to your compute runtime stack, development framework, or specific type of client library that accepts the specific format of the connection environment variables or properties. |
+| Client Type | Client type refers to your compute runtime stack, development framework, or specific type of client library that accepts the specific format of the connection environment variables or properties. |
| Authentication Type | The authentication type used for the service connection. It could be a secret/connection string, a managed identity, or a service principal. | Source services and target services support multiple simultaneous service connections, which means that you can connect each resource to multiple resources.
az containerapp connection list-configuration --resource-group <source-service-r
Service Connector sets the connection configuration when creating a connection. The environment variable key-value pairs are determined based on your client type and authentication type. For example, using the Azure SDK with a managed identity requires a client ID, client secret, etc. Using a JDBC driver requires a database connection string. Follow these conventions to name the configurations: - Spring Boot client: the Spring Boot library for each target service has its own naming convention. For example, MySQL connection settings would be `spring.datasource.url`, `spring.datasource.username`, `spring.datasource.password`. Kafka connection settings would be `spring.kafka.properties.bootstrap.servers`.- - Other clients:+ - The key name of the first connection configuration uses the format `<Cloud>_<Type>_<Name>`. For example, `AZURE_STORAGEBLOB_RESOURCEENDPOINT`, `CONFLUENTCLOUD_KAFKA_BOOTSTRAPSERVER`. - For the same type of target resource, the key name of the second connection configuration uses the format `<Cloud>_<Type>_<Connection Name>_<Name>`. For example, `AZURE_STORAGEBLOB_CONN2_RESOURCEENDPOINT`, `CONFLUENTCLOUD_KAFKA_CONN2_BOOTSTRAPSERVER`.
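The configuration-key naming convention above can be sketched as a small helper (illustrative only; `connection_config_key` is a hypothetical function, not part of Service Connector):

```python
def connection_config_key(cloud: str, type_: str, name: str,
                          connection_name: str = "") -> str:
    """Build a Service Connector configuration key name.

    First connection to a target type:  <Cloud>_<Type>_<Name>
    Later connections to the same type: <Cloud>_<Type>_<ConnectionName>_<Name>
    """
    parts = [cloud, type_] + ([connection_name] if connection_name else []) + [name]
    return "_".join(p.upper() for p in parts)

# First connection:  "AZURE_STORAGEBLOB_RESOURCEENDPOINT"
first = connection_config_key("Azure", "StorageBlob", "ResourceEndpoint")
# Second connection: "AZURE_STORAGEBLOB_CONN2_RESOURCEENDPOINT"
second = connection_config_key("Azure", "StorageBlob", "ResourceEndpoint", "Conn2")
```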
Service Connector offers three network solutions for users to choose from when c
- The compute resource must have virtual network integration enabled. For Azure App Service, it can be configured in its networking settings; for Azure Spring Apps, users must set VNet injection during the resource creation stage. - The target service must support private endpoints. For a list of supported services, refer to [Private-link resource](/azure/private-link/private-endpoint-overview#private-link-resource).
- When selecting this option, Service Connector doesn't perform any more configurations in the compute or target resources. Instead, it verifies the existence of a valid private endpoint and fails the connection if not found. For convenience, users can select the "New Private Endpoint" checkbox in the Azure Portal when creating a connection. With it, Service Connector automatically creates all related resources for the private endpoint in the proper sequence, simplifying the connection creation process.
+ When selecting this option, Service Connector doesn't perform any more configurations in the compute or target resources. Instead, it verifies the existence of a valid private endpoint and fails the connection if not found. For convenience, users can select the "New Private Endpoint" checkbox in the Azure portal when creating a connection. With it, Service Connector automatically creates all related resources for the private endpoint in the proper sequence, simplifying the connection creation process.
service-connector How To Integrate App Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-app-configuration.md
Last updated 10/26/2023 - # Integrate Azure App Configuration with Service Connector This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure App Configuration to other cloud services using Service Connector. You might still be able to connect to App Configuration using other methods. This page also shows default environment variable names and values you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps ## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps and Azure Spring Apps:
+ | Client type | System-assigned managed identity | User-assigned managed identity | Secret/connection string | Service principal | |-|::|::|::|::|
Use the connection details below to connect compute services to Azure App Config
### System-assigned managed identity
-| Default environment variable name | Description | Sample value |
-|--|||
+| Default environment variable name | Description | Sample value |
+| | - | |
| AZURE_APPCONFIGURATION_ENDPOINT | App Configuration endpoint | `https://<App-Configuration-name>.azconfig.io` | #### Sample code
Refer to the steps and code below to connect to Azure App Configuration using a
### User-assigned managed identity
-| Default environment variable name | Description | Sample value |
-|--|-|--|
+| Default environment variable name | Description | Sample value |
+| | -- | -- |
| AZURE_APPCONFIGURATION_ENDPOINT | App Configuration Endpoint | `https://<App-Configuration-name>.azconfig.io` | | AZURE_APPCONFIGURATION_CLIENTID | Your client ID | `<client-ID>` |
Refer to the steps and code below to connect to Azure App Configuration using a
### Service principal
-| Default environment variable name | Description | Sample value |
-|-|-|-|
+| Default environment variable name | Description | Sample value |
+| -- | -- | - |
| AZURE_APPCONFIGURATION_ENDPOINT | App Configuration Endpoint | `https://<AppConfigurationName>.azconfig.io` | | AZURE_APPCONFIGURATION_CLIENTID | Your client ID | `<client-ID>` | | AZURE_APPCONFIGURATION_CLIENTSECRET | Your client secret | `<client-secret>` |
service-connector How To Integrate Confluent Kafka https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-confluent-kafka.md
Last updated 11/07/2023 - # Integrate Apache Kafka on Confluent Cloud with Service Connector This page shows supported authentication methods and clients to connect Apache Kafka on Confluent Cloud to other cloud services using Service Connector. You might still be able to connect to Apache Kafka on Confluent Cloud in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
This page shows supported authentication methods and clients to connect Apache K
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps ## Supported Authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps and Azure Spring Apps:
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
-|--|-|--|--|-|
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| | -- | | - | -- |
| .NET | | | ![yes icon](./media/green-check.png) | | | Java | | | ![yes icon](./media/green-check.png) | | | Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
Use the connection details below to connect compute services to Kafka. For each
#### SpringBoot client type
-| Default environment variable name | Description | Example value |
-|--||--|
+| Default environment variable name | Description | Example value |
+| | - | -- |
| spring.kafka.properties.bootstrap.servers | Your Kafka bootstrap server | `pkc-<server-name>.eastus.azure.confluent.cloud:9092` | | spring.kafka.properties.sasl.jaas.config | Your Kafka SASL configuration | `org.apache.kafka.common.security.plain.PlainLoginModule required username='<Bootstrap-server-key>' password='<Bootstrap-server-secret>';` | | spring.kafka.properties.schema.registry.url | Your Confluent registry URL | `https://psrc-<server-name>.westus2.azure.confluent.cloud` |
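The `spring.kafka.properties.sasl.jaas.config` value embeds the bootstrap server key and secret inside a JAAS string. If you need to pull those credentials out in other tooling, a parser can be sketched as follows (illustrative only; `parse_plain_jaas` is a hypothetical helper keyed to the PLAIN format shown above):

```python
import re

def parse_plain_jaas(jaas: str) -> dict:
    """Extract username/password from a PlainLoginModule JAAS string,
    like the spring.kafka.properties.sasl.jaas.config value above."""
    m = re.search(r"username='([^']*)'\s+password='([^']*)'", jaas)
    if not m:
        raise ValueError("not a PLAIN JAAS config")
    return {"username": m.group(1), "password": m.group(2)}

# Example JAAS string in the format Service Connector generates:
jaas = ("org.apache.kafka.common.security.plain.PlainLoginModule required "
        "username='my-key' password='my-secret';")
credentials = parse_plain_jaas(jaas)
```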
service-connector How To Integrate Cosmos Cassandra https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-cassandra.md
Previously updated : 10/20/2023 Last updated : 10/25/2023 - # Integrate Azure Cosmos DB for Cassandra with Service Connector This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Cosmos DB for Apache Cassandra to other cloud services using Service Connector. You might still be able to connect to the Azure Cosmos DB for Cassandra in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps ## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps and Azure Spring Apps:
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
-|--|--|--|--|--|
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| | - | - | - | - |
| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Go | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Java | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
+| Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Python | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
Reference the connection details and sample code in the following tables, accord
### System-assigned Managed Identity
-| Default environment variable name | Description | Example value |
-|--|--|--|
-| AZURE_COSMOS_LISTKEYURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-account>/listKeys?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
-| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-account>.documents.azure.com:443/` |
-| AZURE_COSMOS_CONTACTPOINT | Azure Cosmos DB for Apache Cassandra contact point | `<Azure-Cosmos-DB-account>.cassandra.cosmos.azure.com` |
-| AZURE_COSMOS_PORT | Cassandra connection port | 10350 |
-| AZURE_COSMOS_KEYSPACE | Cassandra keyspace | `<keyspace>` |
-| AZURE_COSMOS_USERNAME | Cassandra username | `<username>` |
+| Default environment variable name | Description | Example value |
+| | -- | |
+| AZURE_COSMOS_LISTKEYURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-account>/listKeys?api-version=2021-04-15` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-account>.documents.azure.com:443/` |
+| AZURE_COSMOS_CONTACTPOINT | Azure Cosmos DB for Apache Cassandra contact point | `<Azure-Cosmos-DB-account>.cassandra.cosmos.azure.com` |
+| AZURE_COSMOS_PORT | Cassandra connection port | 10350 |
+| AZURE_COSMOS_KEYSPACE | Cassandra keyspace | `<keyspace>` |
+| AZURE_COSMOS_USERNAME | Cassandra username | `<username>` |
#### Sample code
Refer to the steps and code below to connect to Azure Cosmos DB for Cassandra us
### User-assigned Managed Identity
-| Default environment variable name | Description | Example value |
-|--|--|--|
-| AZURE_COSMOS_LISTKEYURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-account>/listKeys?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
-| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-account>.documents.azure.com:443/` |
-| AZURE_COSMOS_CONTACTPOINT | Azure Cosmos DB for Apache Cassandra contact point | `<Azure-Cosmos-DB-account>.cassandra.cosmos.azure.com` |
-| AZURE_COSMOS_PORT | Cassandra connection port | 10350 |
-| AZURE_COSMOS_KEYSPACE | Cassandra keyspace | `<keyspace>` |
-| AZURE_COSMOS_USERNAME | Cassandra username | `<username>` |
-| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
+| Default environment variable name | Description | Example value |
+| | -- | |
+| AZURE_COSMOS_LISTKEYURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-account>/listKeys?api-version=2021-04-15` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-account>.documents.azure.com:443/` |
+| AZURE_COSMOS_CONTACTPOINT | Azure Cosmos DB for Apache Cassandra contact point | `<Azure-Cosmos-DB-account>.cassandra.cosmos.azure.com` |
+| AZURE_COSMOS_PORT | Cassandra connection port | 10350 |
+| AZURE_COSMOS_KEYSPACE | Cassandra keyspace | `<keyspace>` |
+| AZURE_COSMOS_USERNAME | Cassandra username | `<username>` |
+| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
#### Sample code
Refer to the steps and code below to connect to Azure Cosmos DB for Cassandra us
#### SpringBoot client type
-| Default environment variable name | Description | Example value |
-|-|--|--|
+| Default environment variable name | Description | Example value |
+| -- | -- | -- |
| spring.data.cassandra.contact-points | Azure Cosmos DB for Apache Cassandra contact point | `<Azure-Cosmos-DB-account>.cassandra.cosmos.azure.com` |
-| spring.data.cassandra.port | Cassandra connection port | 10350 |
-| spring.data.cassandra.keyspace-name | Cassandra keyspace | `<keyspace>` |
-| spring.data.cassandra.username | Cassandra username | `<username>` |
-| spring.data.cassandra.password | Cassandra password | `<password>` |
-| spring.data.cassandra.local-datacenter | Azure Region | `<Azure-region>` |
-| spring.data.cassandra.ssl | SSL status | true |
+| spring.data.cassandra.port | Cassandra connection port | 10350 |
+| spring.data.cassandra.keyspace-name | Cassandra keyspace | `<keyspace>` |
+| spring.data.cassandra.username | Cassandra username | `<username>` |
+| spring.data.cassandra.password | Cassandra password | `<password>` |
+| spring.data.cassandra.local-datacenter | Azure Region | `<Azure-region>` |
+| spring.data.cassandra.ssl | SSL status | true |
#### Other client types
-| Default environment variable name | Description | Example value |
-|--|--||
+| Default environment variable name | Description | Example value |
+| | -- | -- |
| AZURE_COSMOS_CONTACTPOINT | Azure Cosmos DB for Apache Cassandra contact point | `<Azure-Cosmos-DB-account>.cassandra.cosmos.azure.com` |
-| AZURE_COSMOS_PORT | Cassandra connection port | 10350 |
-| AZURE_COSMOS_KEYSPACE | Cassandra keyspace | `<keyspace>` |
-| AZURE_COSMOS_USERNAME | Cassandra username | `<username>` |
-| AZURE_COSMOS_PASSWORD | Cassandra password | `<password>` |
+| AZURE_COSMOS_PORT | Cassandra connection port | 10350 |
+| AZURE_COSMOS_KEYSPACE | Cassandra keyspace | `<keyspace>` |
+| AZURE_COSMOS_USERNAME | Cassandra username | `<username>` |
+| AZURE_COSMOS_PASSWORD | Cassandra password | `<password>` |
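The environment variables above can be gathered into a single settings object before handing them to a Cassandra driver. This is an illustrative sketch only (`cassandra_settings` is a hypothetical helper, not part of any SDK); the default port and the TLS requirement follow the table above:

```python
import os

def cassandra_settings(env=os.environ) -> dict:
    """Collect the AZURE_COSMOS_* variables set by Service Connector
    into one settings dict for a Cassandra client."""
    return {
        "contact_point": env["AZURE_COSMOS_CONTACTPOINT"],
        "port": int(env.get("AZURE_COSMOS_PORT", 10350)),
        "keyspace": env["AZURE_COSMOS_KEYSPACE"],
        "username": env["AZURE_COSMOS_USERNAME"],
        "password": env["AZURE_COSMOS_PASSWORD"],
        "ssl": True,  # Azure Cosmos DB for Apache Cassandra requires TLS
    }
```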
#### Sample code Refer to the steps and code below to connect to Azure Cosmos DB for Cassandra using a connection string. [!INCLUDE [code sample for blob](./includes/code-cosmoscassandra-secret.md)] - #### Service principal
-| Default environment variable name | Description | Example value |
-|--|--|--|
-| AZURE_COSMOS_LISTKEYURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-account>/listKeys?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
-| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-account>.documents.azure.com:443/` |
-| AZURE_COSMOS_CONTACTPOINT | Azure Cosmos DB for Apache Cassandra contact point | `<Azure-Cosmos-DB-account>.cassandra.cosmos.azure.com` |
-| AZURE_COSMOS_PORT | Cassandra connection port | 10350 |
-| AZURE_COSMOS_KEYSPACE | Cassandra keyspace | `<keyspace>` |
-| AZURE_COSMOS_USERNAME | Cassandra username | `<username>` |
-| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
-| AZURE_COSMOS_CLIENTSECRET | Your client secret | `<client-secret>` |
-| AZURE_COSMOS_TENANTID | Your tenant ID | `<tenant-ID>` |
+| Default environment variable name | Description | Example value |
+| | -- | |
+| AZURE_COSMOS_LISTKEYURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-account>/listKeys?api-version=2021-04-15` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-account>.documents.azure.com:443/` |
+| AZURE_COSMOS_CONTACTPOINT | Azure Cosmos DB for Apache Cassandra contact point | `<Azure-Cosmos-DB-account>.cassandra.cosmos.azure.com` |
+| AZURE_COSMOS_PORT | Cassandra connection port | 10350 |
+| AZURE_COSMOS_KEYSPACE | Cassandra keyspace | `<keyspace>` |
+| AZURE_COSMOS_USERNAME | Cassandra username | `<username>` |
+| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
+| AZURE_COSMOS_CLIENTSECRET | Your client secret | `<client-secret>` |
+| AZURE_COSMOS_TENANTID | Your tenant ID | `<tenant-ID>` |
#### Sample code
service-connector How To Integrate Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-db.md
Last updated 10/31/2023 - # Integrate Azure Cosmos DB for MongoDB with Service Connector This page shows supported authentication methods and clients, and shows sample code you can use to connect the Azure Cosmos DB for MongoDB to other cloud services using Service Connector. You might still be able to connect to Azure Cosmos DB for MongoDB in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps ## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps, and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
+
+### [Azure App Service](#tab/app-service)
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
-|--|--|--|--|--|
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| | - | - | - | - |
| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Java | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
+| Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Go | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | --
-## Default environment variable names or application properties and sample code
+### [Azure Functions](#tab/azure-functions)
-Use the connection details below to connect compute services to Azure Cosmos DB. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection, as well as sample code. For each example below, replace the placeholder texts `<mongo-db-admin-user>`, `<password>`, `<Azure-Cosmos-DB-API-for-MongoDB-account>`, `<subscription-ID>`, `<resource-group-name>`, `<client-secret>`, and `<tenant-id>` with your own information. For more information about naming conventions, check the [Service Connector internals](concept-service-connector-internals.md#configuration-naming-convention) article.
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| | - | - | - | - |
+| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
+| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Go | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+### [Azure Container Apps](#tab/container-apps)
-### System-assigned managed identity
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| | - | - | - | - |
+| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
+| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Go | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Default environment variable name | Description | Example value |
-|--|--|--|
-| AZURE_COSMOS_LISTCONNECTIONSTRINGURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-API-for-MongoDB-account>/listConnectionStrings?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
-| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-API-for-MongoDB-account>.documents.azure.com:443/` |
+### [Azure Spring Apps](#tab/spring-apps)
-#### Sample code
-Refer to the steps and code below to connect to Azure Cosmos DB for MongoDB using a system-assigned managed identity.
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| - | - | - | - | - |
+| .NET | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
+| Node.js | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Go | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-### User-assigned managed identity
+
-| Default environment variable name | Description | Example value |
-|--|--|--|
-| AZURE_COSMOS_LISTCONNECTIONSTRINGURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-API-for-MongoDB-account>/listConnectionStrings?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
-| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
-| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-API-for-MongoDB-account>.documents.azure.com:443/` |
+## Default environment variable names or application properties and sample code
-#### Sample code
-Refer to the steps and code below to connect to Azure Cosmos DB for MongoDB using a user-assigned managed identity.
+Use the connection details below to connect compute services to Azure Cosmos DB. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection, as well as sample code. For each example below, replace the placeholder texts `<mongo-db-admin-user>`, `<password>`, `<Azure-Cosmos-DB-API-for-MongoDB-account>`, `<subscription-ID>`, `<resource-group-name>`, `<client-secret>`, and `<tenant-id>` with your own information. For more information about naming conventions, check the [Service Connector internals](concept-service-connector-internals.md#configuration-naming-convention) article.
-### Connection string
+### Secret / Connection string
#### SpringBoot client type
Refer to the steps and code below to connect to Azure Cosmos DB for MongoDB usin
|--|-|-|
| AZURE_COSMOS_CONNECTIONSTRING | MongoDB API connection string | `mongodb://<mongo-db-admin-user>:<password>@<mongo-db-server>.mongo.cosmos.azure.com:10255/?ssl=true&replicaSet=globaldb&retrywrites=false&maxIdleTimeMS=120000&appName=@<mongo-db-server>@` |
-#### Sample code
Refer to the steps and code below to connect to Azure Cosmos DB for MongoDB using a connection string. [!INCLUDE [code sample for mongo](./includes/code-cosmosmongo-secret.md)]
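As an illustration of consuming the injected value, the sketch below reads `AZURE_COSMOS_CONNECTIONSTRING` and extracts the host, port, and user with the standard library. The helper name and the scheme check are illustrative; handing the string to a MongoDB driver such as pymongo is shown only as a comment because that package is an assumption here, not part of this page.

```python
import os
from urllib.parse import urlsplit

def read_cosmos_mongo_settings() -> dict:
    """Read the Service Connector-injected connection string and pull out
    the pieces an app typically validates or logs (never log the secret)."""
    conn = os.environ["AZURE_COSMOS_CONNECTIONSTRING"]
    parts = urlsplit(conn)
    if parts.scheme != "mongodb":
        raise ValueError(f"unexpected scheme: {parts.scheme!r}")
    return {"host": parts.hostname, "port": parts.port, "user": parts.username}

# With a MongoDB driver such as pymongo (an assumption, not part of this page):
# client = pymongo.MongoClient(os.environ["AZURE_COSMOS_CONNECTIONSTRING"])
```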
+### System-assigned managed identity
+
+| Default environment variable name | Description | Example value |
+| - | - | -- |
+| AZURE_COSMOS_LISTCONNECTIONSTRINGURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-API-for-MongoDB-account>/listConnectionStrings?api-version=2021-04-15` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-API-for-MongoDB-account>.documents.azure.com:443/` |
+
+Refer to the steps and code below to connect to Azure Cosmos DB for MongoDB using a system-assigned managed identity.
+
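The system-assigned identity flow above boils down to two steps: get a token for `AZURE_COSMOS_SCOPE`, then POST it to `AZURE_COSMOS_LISTCONNECTIONSTRINGURL` to retrieve the connection string. A minimal sketch of the second step, assuming only the default variable names from the table (the helper name is illustrative, and token acquisition via the azure-identity package is left as a comment):

```python
import os
import urllib.request

def build_list_connection_strings_request(token: str) -> urllib.request.Request:
    """Build the management-plane POST that exchanges an access token for
    the Cosmos DB for MongoDB connection strings. Only the default
    Service Connector variable names are assumed."""
    url = os.environ["AZURE_COSMOS_LISTCONNECTIONSTRINGURL"]
    return urllib.request.Request(
        url, data=b"", method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )

# Token acquisition (azure-identity package, an assumption here):
# token = DefaultAzureCredential().get_token(os.environ["AZURE_COSMOS_SCOPE"]).token
```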
+### User-assigned managed identity
+
+| Default environment variable name | Description | Example value |
+| - | - | -- |
+| AZURE_COSMOS_LISTCONNECTIONSTRINGURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-API-for-MongoDB-account>/listConnectionStrings?api-version=2021-04-15` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
+| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-API-for-MongoDB-account>.documents.azure.com:443/` |
+
+Refer to the steps and code below to connect to Azure Cosmos DB for MongoDB using a user-assigned managed identity.
+### Service principal
-| Default environment variable name | Description | Example value |
-|--|--|--|
+| Default environment variable name | Description | Example value |
+| - | - | -- |
| AZURE_COSMOS_LISTCONNECTIONSTRINGURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-API-for-MongoDB-account>/listConnectionStrings?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
-| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
-| AZURE_COSMOS_CLIENTSECRET | Your client secret | `<client-secret>` |
-| AZURE_COSMOS_TENANTID | Your tenant ID | `<tenant-ID>` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
+| AZURE_COSMOS_CLIENTSECRET | Your client secret | `<client-secret>` |
+| AZURE_COSMOS_TENANTID | Your tenant ID | `<tenant-ID>` |
| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-API-for-MongoDB-account>.documents.azure.com:443/` |
-#### Sample code
Refer to the steps and code below to connect to Azure Cosmos DB for MongoDB using a service principal. [!INCLUDE [code sample for mongo](./includes/code-cosmosmongo-me-id.md)]
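For service principal connections, the client ID, client secret, and tenant ID are exchanged for a token through the standard OAuth 2.0 client-credentials flow against the Microsoft identity platform token endpoint. A sketch of building that token request from the default variable names (the helper is illustrative; in practice the azure-identity package's `ClientSecretCredential` performs this exchange for you):

```python
import os
from urllib.parse import urlencode

def build_client_credentials_request():
    """Build the OAuth2 client-credentials token request (URL and form body)
    from the Service Connector service principal variables."""
    tenant = os.environ["AZURE_COSMOS_TENANTID"]
    url = f"https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": os.environ["AZURE_COSMOS_CLIENTID"],
        "client_secret": os.environ["AZURE_COSMOS_CLIENTSECRET"],
        "scope": os.environ["AZURE_COSMOS_SCOPE"],
    }).encode()
    return url, body
```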
service-connector How To Integrate Cosmos Gremlin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-gremlin.md
Last updated 10/31/2023

# Integrate the Azure Cosmos DB for Gremlin with Service Connector

This page shows supported authentication methods and clients, and shows sample code you can use to connect the Azure Cosmos DB for Apache Gremlin to other cloud services using Service Connector. You might still be able to connect to the Azure Cosmos DB for Gremlin in other programming languages without using Service Connector. This page also shows default environment variable names and values you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services

- Azure App Service
+- Azure Functions
- Azure Container Apps
- Azure Spring Apps

## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps and Azure Spring Apps:
| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
|-|--|--|--|--|
Use the connection details below to connect your compute services to Azure Cosmo
### System-assigned managed identity
-| Default environment variable name | Description | Example value |
-|--|--|-|
+| Default environment variable name | Description | Example value |
+| - | - | - |
| AZURE_COSMOS_LISTKEYURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-account>/listKeys?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-account>.documents.azure.com:443/` |
| AZURE_COSMOS_HOSTNAME | Your Gremlin Uniform Resource Identifier (URI) | `<Azure-Cosmos-DB-account>.gremlin.cosmos.azure.com` |
-| AZURE_COSMOS_PORT | Connection port | 443 |
-| AZURE_COSMOS_USERNAME | Your username | `/dbs/<database>/colls/<collection or graphs>` |
+| AZURE_COSMOS_PORT | Connection port | 443 |
+| AZURE_COSMOS_USERNAME | Your username | `/dbs/<database>/colls/<collection or graphs>` |
#### Sample code
Refer to the steps and code below to connect to Azure Cosmos DB for Gremlin usin
### User-assigned managed identity
-| Default environment variable name | Description | Example value |
-|--|--|-|
+| Default environment variable name | Description | Example value |
+| - | - | - |
| AZURE_COSMOS_LISTKEYURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-account>/listKeys?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-account>.documents.azure.com:443/` |
| AZURE_COSMOS_HOSTNAME | Your Gremlin Uniform Resource Identifier (URI) | `<Azure-Cosmos-DB-account>.gremlin.cosmos.azure.com` |
| AZURE_COSMOS_PORT | Connection port | 443 |
Refer to the steps and code below to connect to Azure Cosmos DB for Gremlin usin
### Service principal
-| Default environment variable name | Description | Example value |
-|--|--|-|
+| Default environment variable name | Description | Example value |
+| - | - | - |
| AZURE_COSMOS_LISTKEYURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<Azure-Cosmos-DB-account>/listKeys?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<Azure-Cosmos-DB-account>.documents.azure.com:443/` |
| AZURE_COSMOS_HOSTNAME | Your Gremlin Uniform Resource Identifier (URI) | `<Azure-Cosmos-DB-account>.gremlin.cosmos.azure.com` |
-| AZURE_COSMOS_PORT | Gremlin connection port | 10350 |
-| AZURE_COSMOS_USERNAME | Your username | `</dbs/<database>/colls/<collection or graphs>` |
-| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
-| AZURE_COSMOS_CLIENTSECRET | Your client secret | `<client-secret>` |
-| AZURE_COSMOS_TENANTID | Your tenant ID | `<tenant-ID>` |
+| AZURE_COSMOS_PORT | Gremlin connection port | 10350 |
+| AZURE_COSMOS_USERNAME | Your username | `/dbs/<database>/colls/<collection or graphs>` |
+| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
+| AZURE_COSMOS_CLIENTSECRET | Your client secret | `<client-secret>` |
+| AZURE_COSMOS_TENANTID | Your tenant ID | `<tenant-ID>` |
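Putting the Gremlin variables together: the host and port form the WebSocket endpoint, and `AZURE_COSMOS_USERNAME` carries the `/dbs/<database>/colls/<graph>` resource path. A small sketch under those assumptions (the dict shape and helper name are illustrative; the gremlinpython driver call is only a comment, as that package is not part of this page):

```python
import os

def gremlin_endpoint() -> dict:
    """Assemble Gremlin connection settings from the Service Connector
    default environment variables."""
    host = os.environ["AZURE_COSMOS_HOSTNAME"]
    port = int(os.environ["AZURE_COSMOS_PORT"])
    return {
        "url": f"wss://{host}:{port}/gremlin",
        # Username is the resource path: /dbs/<database>/colls/<graph>
        "username": os.environ["AZURE_COSMOS_USERNAME"],
    }

# With gremlinpython (an assumption here):
# c = client.Client(ep["url"], "g", username=ep["username"], password=key)
```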
#### Sample code
service-connector How To Integrate Cosmos Sql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-sql.md
Last updated 10/24/2023

# Integrate the Azure Cosmos DB for NoSQL with Service Connector

This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Cosmos DB for NoSQL to other cloud services using Service Connector. You might still be able to connect to Azure Cosmos DB for NoSQL in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services

- Azure App Service
+- Azure Functions
- Azure Container Apps
- Azure Spring Apps

## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps and Azure Spring Apps:
| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
|--|--|--|--|--|
Supported authentication and clients for App Service, Container Apps and Azure S
| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
| Python | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
- ## Default environment variable names or application properties and Sample code
Using a system-assigned managed identity as the authentication type is only avai
#### Other client types
-| Default environment variable name | Description | Example value |
-|--|--|--|
+| Default environment variable name | Description | Example value |
+| - | - | -- |
| AZURE_COSMOS_LISTCONNECTIONSTRINGURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<database-server>/listConnectionStrings?api-version=2021-04-15` | | AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` | | AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<database-server>.documents.azure.com:443/` |
service-connector How To Integrate Cosmos Table https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-cosmos-table.md
Last updated 11/01/2023

# Integrate the Azure Cosmos DB for Table with Service Connector

This page shows supported authentication methods and clients, and shows sample code you can use to connect the Azure Cosmos DB for Table to other cloud services using Service Connector. You might still be able to connect to the Azure Cosmos DB for Table in other programming languages without using Service Connector. This page also shows default environment variable names and values you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services

- Azure App Service
+- Azure Functions
- Azure Container Apps
- Azure Spring Apps

## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps and Azure Spring Apps:
| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
|--|--|--|--|--|
Supported authentication and clients for App Service, Container Apps and Azure S
Use the connection details below to connect your compute services to Azure Cosmos DB for Table. For each example below, replace the placeholder texts `<account-name>`, `<table-name>`, `<account-key>`, `<resource-group-name>`, `<subscription-ID>`, `<client-ID>`, `<client-secret>`, `<tenant-id>` with your own information. For more information about naming conventions, check the [Service Connector internals](concept-service-connector-internals.md#configuration-naming-convention) article.

#### System-assigned managed identity
-| Default environment variable name | Description | Example value |
-|--|--|--|
+| Default environment variable name | Description | Example value |
+| - | - | - |
| AZURE_COSMOS_LISTCONNECTIONSTRINGURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<table-name>/listConnectionStrings?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<table-name>.documents.azure.com:443/` |

#### Sample code
Refer to the steps and code below to connect to Azure Cosmos DB for Table using
#### User-assigned managed identity
-| Default environment variable name | Description | Example value |
-|--|--|--|
+| Default environment variable name | Description | Example value |
+| - | - | - |
| AZURE_COSMOS_LISTCONNECTIONSTRINGURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<table-name>/listConnectionStrings?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
-| AZURE_COSMOS_CLIENTID | Your client secret ID | `<client-ID>` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<table-name>.documents.azure.com:443/` |

#### Sample code
Refer to the steps and code below to connect to Azure Cosmos DB for Table using
#### Service principal
-| Default environment variable name | Description | Example value |
-|--|--|--|
+| Default environment variable name | Description | Example value |
+| - | - | - |
| AZURE_COSMOS_LISTCONNECTIONSTRINGURL | The URL to get the connection string | `https://management.azure.com/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.DocumentDB/databaseAccounts/<table-name>/listConnectionStrings?api-version=2021-04-15` |
-| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
-| AZURE_COSMOS_CLIENTID | Your client secret ID | `<client-ID>` |
-| AZURE_COSMOS_CLIENTSECRET | Your client secret | `<client-secret>` |
-| AZURE_COSMOS_TENANTID | Your tenant ID | `<tenant-ID>` |
+| AZURE_COSMOS_SCOPE | Your managed identity scope | `https://management.azure.com/.default` |
+| AZURE_COSMOS_CLIENTID | Your client ID | `<client-ID>` |
+| AZURE_COSMOS_CLIENTSECRET | Your client secret | `<client-secret>` |
+| AZURE_COSMOS_TENANTID | Your tenant ID | `<tenant-ID>` |
| AZURE_COSMOS_RESOURCEENDPOINT | Your resource endpoint | `https://<table-name>.documents.azure.com:443/` |

#### Sample code
service-connector How To Integrate Event Hubs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-event-hubs.md
Last updated 11/03/2023

# Integrate Azure Event Hubs with Service Connector

This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Event Hubs to other cloud services using Service Connector. You might still be able to connect to Event Hubs in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create service connections.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services

- Azure App Service
+- Azure Functions
- Azure Container Apps
- Azure Spring Apps

## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps and Azure Spring Apps:
| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
|-|::|::|::|::|
Use the connection details below to connect compute services to Event Hubs. For
#### Other client types
-| Default environment variable name | Description | Sample value |
-|-|-||
+| Default environment variable name | Description | Sample value |
+| -- | -- | - |
| AZURE_EVENTHUB_FULLYQUALIFIEDNAMESPACE | Event Hubs namespace | `<Event-Hubs-namespace>.servicebus.windows.net` |

#### Sample code
Refer to the steps and code below to connect to Azure Event Hubs using a system-
#### Other client types
-| Default environment variable name | Description | Sample value |
-|-|-||
+| Default environment variable name | Description | Sample value |
+| -- | -- | - |
| AZURE_EVENTHUB_FULLYQUALIFIEDNAMESPACE | Event Hubs namespace | `<Event-Hubs-namespace>.servicebus.windows.net` |
-| AZURE_EVENTHUB_CLIENTID | Your client ID | `<client-ID>` |
+| AZURE_EVENTHUB_CLIENTID | Your client ID | `<client-ID>` |
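Before constructing a client, an app can sanity-check the injected namespace. A minimal sketch assuming only `AZURE_EVENTHUB_FULLYQUALIFIEDNAMESPACE` from the tables above (the azure-eventhub client call is shown as a comment and is an assumption, not this page's sample):

```python
import os

def eventhub_namespace() -> str:
    """Read and sanity-check the fully qualified Event Hubs namespace
    injected by Service Connector."""
    ns = os.environ["AZURE_EVENTHUB_FULLYQUALIFIEDNAMESPACE"]
    if not ns.endswith(".servicebus.windows.net"):
        raise ValueError(f"unexpected namespace: {ns!r}")
    return ns

# With azure-eventhub (an assumption here):
# producer = EventHubProducerClient(
#     fully_qualified_namespace=eventhub_namespace(),
#     eventhub_name="<hub-name>", credential=credential)
```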
#### Sample code
Refer to the steps and code below to connect to Azure Event Hubs using a connect
#### Other client types
-| Default environment variable name | Description | Sample value |
-|-|-||
+| Default environment variable name | Description | Sample value |
+| -- | -- | - |
| AZURE_EVENTHUB_FULLYQUALIFIEDNAMESPACE | Event Hubs namespace | `<Event-Hubs-namespace>.servicebus.windows.net` |
-| AZURE_EVENTHUB_CLIENTID | Your client ID | `<client-ID>` |
-| AZURE_EVENTHUB_CLIENTSECRET | Your client secret | `<client-secret>` |
-| AZURE_EVENTHUB_TENANTID | Your tenant ID | `<tenant-id>` |
+| AZURE_EVENTHUB_CLIENTID | Your client ID | `<client-ID>` |
+| AZURE_EVENTHUB_CLIENTSECRET | Your client secret | `<client-secret>` |
+| AZURE_EVENTHUB_TENANTID | Your tenant ID | `<tenant-id>` |
#### Sample code

Refer to the steps and code below to connect to Azure Event Hubs using a service principal.
service-connector How To Integrate Key Vault https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-key-vault.md
Last updated 11/02/2023

# Integrate Azure Key Vault with Service Connector

> [!NOTE]
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services

- Azure App Service
+- Azure Functions
- Azure Container Apps
- Azure Spring Apps

## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps and Azure Spring Apps:
| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
|--|--|--|-|--|
Use the connection details below to connect compute services to Azure Key Vault.
#### Other client types
-| Default environment variable name | Description | Example value |
-|--|-|--|
+| Default environment variable name | Description | Example value |
+| - | -- | -- |
| AZURE_KEYVAULT_SCOPE | Your Azure RBAC scope | `https://management.azure.com/.default` |
| AZURE_KEYVAULT_RESOURCEENDPOINT | Your Key Vault endpoint | `https://<vault-name>.vault.azure.net/` |
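The Key Vault endpoint variable can be combined with a secret name into a data-plane REST URL. A sketch under that assumption (the secret name and helper are illustrative; `7.4` is a published Key Vault REST API version, and an authenticated GET against the resulting URL would still need a token for the vault):

```python
import os

def key_vault_secret_url(secret_name: str, api_version: str = "7.4") -> str:
    """Build the data-plane URL for reading one secret from the vault
    endpoint injected by Service Connector."""
    endpoint = os.environ["AZURE_KEYVAULT_RESOURCEENDPOINT"].rstrip("/")
    return f"{endpoint}/secrets/{secret_name}?api-version={api_version}"
```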
service-connector How To Integrate Mysql https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-mysql.md
Previously updated : 10/20/2023
Last updated : 10/25/2023

# Integrate Azure Database for MySQL with Service Connector

This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Database for MySQL - Flexible Server to other cloud services using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services

- Azure App Service. You can get the configurations from Azure App Service configurations.
+- Azure Functions. You can get the configurations from Azure Functions configurations.
- Azure Container Apps. You can get the configurations from Azure Container Apps environment variables.
- Azure Spring Apps. You can get the configurations from Azure Spring Apps runtime.

## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps, and Azure Spring Apps:
-
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret/connection string | Service principal |
-||::|:--:|::|::|
-| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Go (go-sql-driver for mysql) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Java (JDBC) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Java - Spring Boot (JDBC) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Node.js (mysql) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Python (mysql-connector-python) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Python-Django | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| PHP (MySQLi) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Ruby (mysql2) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| None | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
+
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret/connection string | Service principal |
+| - | :--: | :--: | :--: | :--: |
+| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Go (go-sql-driver for mysql) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java (JDBC) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java - Spring Boot (JDBC) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Node.js (mysql) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Python (mysql-connector-python) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Python-Django | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| PHP (MySQLi) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Ruby (mysql2) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| None | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
> [!NOTE]
-> System-assigned managed identity, User-assigned managed identity and Service principal are only supported on Azure CLI.
+> System-assigned managed identity, User-assigned managed identity and Service principal are only supported on Azure CLI.
## Default environment variable names or application properties and sample code
Reference the connection details and sample code in following tables, according
#### [.NET](#tab/dotnet)
-| Default environment variable name | Description | Example value |
-|--||-|
-| `AZURE_MYSQL_CONNECTIONSTRING ` | ADO.NET MySQL connection string | `Server=<MySQL-DB-name>.mysql.database.azure.com;Database=<MySQL-DB-name>;Port=3306;User Id=<MySQL-DBusername>;SSL Mode=Required;` |
--
+| Default environment variable name | Description | Example value |
+| - | - | - |
+| `AZURE_MYSQL_CONNECTIONSTRING` | ADO.NET MySQL connection string | `Server=<MySQL-DB-name>.mysql.database.azure.com;Database=<MySQL-DB-name>;Port=3306;User Id=<MySQL-DB-username>;SSL Mode=Required;` |
#### [Java](#tab/java)
-| Default environment variable name | Description | Example value |
-|--|||
-| `AZURE_MYSQL_CONNECTIONSTRING` | JDBC MySQL connection string | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required&user=<MySQL-DB-username>` |
--
+| Default environment variable name | Description | Example value |
+| - | - | - |
+| `AZURE_MYSQL_CONNECTIONSTRING` | JDBC MySQL connection string | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required&user=<MySQL-DB-username>` |
#### [SpringBoot](#tab/springBoot)
-| Application properties | Description | Example value |
-|||--|
-| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication | `true` |
-| `spring.datasource.url` | Spring Boot JDBC database URL | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required` |
-| `spring.datasource.username` | Database username | `<MySQL-DB-username>` |
-
+| Application properties | Description | Example value |
+| - | - | -- |
+| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication | `true` |
+| `spring.datasource.url` | Spring Boot JDBC database URL | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required` |
+| `spring.datasource.username` | Database username | `<MySQL-DB-username>` |
#### [Python](#tab/python)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
-| `AZURE_MYSQL_HOST ` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-
+| Default environment variable name | Description | Example value |
+| - | -- | -- |
+| `AZURE_MYSQL_NAME` | Database name | `<MySQL-DB-name>` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
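For the Python client type, the variables above map naturally onto connection keyword arguments. A sketch with illustrative names (the mysql-connector-python call is only a comment and an assumption; the password or access token comes from your chosen authentication type and is not listed in this table):

```python
import os

def mysql_connect_kwargs() -> dict:
    """Map the Service Connector variables for the Python client type
    onto connection keyword arguments; the dict shape is illustrative."""
    return {
        "database": os.environ["AZURE_MYSQL_NAME"],
        "host": os.environ["AZURE_MYSQL_HOST"],
        "user": os.environ["AZURE_MYSQL_USER"],
    }

# With mysql-connector-python (an assumption here):
# conn = mysql.connector.connect(**mysql_connect_kwargs(), password=password)
```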
#### [Django](#tab/django)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
#### [Go](#tab/go)
-| Default environment variable name | Description | Example value |
-|--||--|
-| `AZURE_MYSQL_CONNECTIONSTRING` | Go-sql-driver connection string | `<MySQL-DB-username>@tcp(<server-host>:<port>)/<MySQL-DB-name>?tls=true` |
-
+| Default environment variable name | Description | Example value |
+| | - | -- |
+| `AZURE_MYSQL_CONNECTIONSTRING` | Go-sql-driver connection string | `<MySQL-DB-username>@tcp(<server-host>:<port>)/<MySQL-DB-name>?tls=true` |
#### [NodeJS](#tab/nodejs)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `MySQL-DB-username` |
-| `AZURE_MYSQL_DATABASE` | Database name | `<database-name>` |
-| `AZURE_MYSQL_PORT` | Port number | `3306` |
-| `AZURE_MYSQL_SSL` | SSL option | `true` |
-
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `MySQL-DB-username` |
+| `AZURE_MYSQL_DATABASE` | Database name | `<database-name>` |
+| `AZURE_MYSQL_PORT` | Port number | `3306` |
+| `AZURE_MYSQL_SSL` | SSL option | `true` |
#### [PHP](#tab/php)
-| Default environment variable name | Description | Example value |
-|-|--|--|
-| `AZURE_MYSQL_DBNAME` | Database name | `<MySQL-DB-name>` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_PORT` | Port number | `3306` |
-| `AZURE_MYSQL_FLAG` | SSL or other flags | `MySQL_CLIENT_SSL` |
-| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>` |
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_MYSQL_DBNAME` | Database name | `<MySQL-DB-name>` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_PORT` | Port number | `3306` |
+| `AZURE_MYSQL_FLAG` | SSL or other flags | `MySQL_CLIENT_SSL` |
+| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>` |
#### [Ruby](#tab/ruby)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_DATABASE` | Database name | `<MySQL-DB-name>` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_SSLMODE` | SSL option | `required` |
-
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_DATABASE` | Database name | `<MySQL-DB-name>` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_SSLMODE` | SSL option | `required` |
Refer to the steps and code below to connect to Azure Database for MySQL using a system-assigned managed identity.
[!INCLUDE [code sample for mysql system mi](./includes/code-mysql-me-id.md)] ### User-assigned Managed Identity
-#### [.NET](#tab/dotnet)
-| Default environment variable name | Description | Example value |
-|--||-|
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
-| `AZURE_MYSQL_CONNECTIONSTRING` | ADO.NET MySQL connection string | `Server=<MySQL-DB-name>.mysql.database.azure.com;Database=<MySQL-DB-name>;Port=3306;User Id=<MySQL-DBusername>;SSL Mode=Required;` |
+#### [.NET](#tab/dotnet)
+| Default environment variable name | Description | Example value |
+| | - | |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
+| `AZURE_MYSQL_CONNECTIONSTRING` | ADO.NET MySQL connection string | `Server=<MySQL-DB-name>.mysql.database.azure.com;Database=<MySQL-DB-name>;Port=3306;User Id=<MySQL-DB-username>;SSL Mode=Required;` |
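The ADO.NET connection string above is a series of `Key=Value;` pairs. As an illustrative sketch (not from the article; the `contoso-db` and `dbuser` values are made up), splitting such a string into its parts looks like this in Python:

```python
# Hypothetical example connection string in the ADO.NET 'Key=Value;' form.
conn_str = ("Server=contoso-db.mysql.database.azure.com;Database=contoso-db;"
            "Port=3306;User Id=dbuser;SSL Mode=Required;")

def parse_ado_net(s: str) -> dict:
    """Parse 'Key=Value;' pairs, ignoring a trailing semicolon."""
    pairs = (part.split("=", 1) for part in s.split(";") if part)
    return {k.strip(): v.strip() for k, v in pairs}

settings = parse_ado_net(conn_str)
print(settings["Server"], settings["Port"], settings["SSL Mode"])
```

Note that keys such as `User Id` and `SSL Mode` contain spaces, so splitting only on `;` and the first `=` keeps them intact.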
#### [Java](#tab/java)
-| Default environment variable name | Description | Example value |
-|--|||
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
-| `AZURE_MYSQL_CONNECTIONSTRING` | JDBC MySQL connection string | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required&user=<MySQL-DB-username>` |
--
+| Default environment variable name | Description | Example value |
+| | - | |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
+| `AZURE_MYSQL_CONNECTIONSTRING` | JDBC MySQL connection string | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required&user=<MySQL-DB-username>` |
#### [SpringBoot](#tab/springBoot)
-| Application properties | Description | Example value |
-|--|--||
-| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication| `true` |
-| `spring.cloud.azure.credential.client-id` | Your client ID | `<identity-client-ID>` |
-| `spring.cloud.azure.credential.client-managed-identity-enabled` | Enable client managed identity | `true` |
-| `spring.datasource.url` | Database URL | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required` |
-| `spring.datasource.username` | Database username | `username` |
-
+| Application properties | Description | Example value |
+| -- | - | -- |
+| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication | `true` |
+| `spring.cloud.azure.credential.client-id` | Your client ID | `<identity-client-ID>` |
+| `spring.cloud.azure.credential.client-managed-identity-enabled` | Enable client managed identity | `true` |
+| `spring.datasource.url` | Database URL | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required` |
+| `spring.datasource.username` | Database username | `username` |
#### [Python](#tab/python)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `identity-client-ID` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
#### [Django](#tab/django)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER ` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
-
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
#### [Go](#tab/go)
-| Default environment variable name | Description | Example value |
-|--||--|
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
-| `AZURE_MYSQL_CONNECTIONSTRING` | Go-sql-driver connection string | `<MySQL-DB-username>@tcp(<server-host>:<port>)/<MySQL-DB-name>?tls=true` |
--
+| Default environment variable name | Description | Example value |
+| | - | -- |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
+| `AZURE_MYSQL_CONNECTIONSTRING` | Go-sql-driver connection string | `<MySQL-DB-username>@tcp(<server-host>:<port>)/<MySQL-DB-name>?tls=true` |
#### [NodeJS](#tab/nodejs)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `MySQL-DB-username` |
-| `AZURE_MYSQL_DATABASE` | Database name | `<database-name>` |
-| `AZURE_MYSQL_PORT` | Port number | `3306` |
-| `AZURE_MYSQL_SSL` | SSL option | `true` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
-
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `MySQL-DB-username` |
+| `AZURE_MYSQL_DATABASE` | Database name | `<database-name>` |
+| `AZURE_MYSQL_PORT` | Port number | `3306` |
+| `AZURE_MYSQL_SSL` | SSL option | `true` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
#### [PHP](#tab/php)
-| Default environment variable name | Description | Example value |
-|-|--|--|
-| `AZURE_MYSQL_DBNAME` | Database name | `<MySQL-DB-name>` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_PORT` | Port number | `3306` |
-| `AZURE_MYSQL_FLAG` | SSL or other flags | `MySQL_CLIENT_SSL` |
-| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_MYSQL_DBNAME` | Database name | `<MySQL-DB-name>` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_PORT` | Port number | `3306` |
+| `AZURE_MYSQL_FLAG` | SSL or other flags | `MySQL_CLIENT_SSL` |
+| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
#### [Ruby](#tab/ruby)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_DATABASE` | Database name | `<MySQL-DB-name>` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_SSLMODE` | SSL option | `required` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_DATABASE` | Database name | `<MySQL-DB-name>` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_SSLMODE` | SSL option | `required` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
Refer to the steps and code below to connect to Azure Database for MySQL using a user-assigned managed identity.
#### [.NET](#tab/dotnet)
-| Default environment variable name | Description | Example value |
-|--||-|
-| `AZURE_MYSQL_CONNECTIONSTRING` | ADO.NET MySQL connection string | `Server=<MySQL-DB-name>.mysql.database.azure.com;Database=<MySQL-DB-name>;Port=3306;User Id=<MySQL-DBusername>;Password=<MySQL-DB-password>;SSL Mode=Required` |
+| Default environment variable name | Description | Example value |
+| | - | - |
+| `AZURE_MYSQL_CONNECTIONSTRING` | ADO.NET MySQL connection string | `Server=<MySQL-DB-name>.mysql.database.azure.com;Database=<MySQL-DB-name>;Port=3306;User Id=<MySQL-DB-username>;Password=<MySQL-DB-password>;SSL Mode=Required` |
#### [Java](#tab/java)
-| Default environment variable name | Description | Example value |
-|--||-|
-| `AZURE_MYSQL_CONNECTIONSTRING` | JDBC MySQL connection string | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required&user=<MySQL-DB-username>&password=<Uri.EscapeDataString(<MySQL-DB-password>)` |
-
+| Default environment variable name | Description | Example value |
+| | - | - |
+| `AZURE_MYSQL_CONNECTIONSTRING` | JDBC MySQL connection string | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required&user=<MySQL-DB-username>&password=<Uri.EscapeDataString(MySQL-DB-password)>` |
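The JDBC connection string above carries the username, password, and SSL mode as URL query parameters. As a minimal illustration (not from the article; the `contoso-db` and `dbuser` values are invented), the standard library can split such a URL apart once the `jdbc:` prefix is removed:

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical JDBC URL with made-up host, database, and user values.
jdbc_url = ("jdbc:mysql://contoso-db.mysql.database.azure.com:3306/contoso-db"
            "?sslmode=required&user=dbuser")

# JDBC URLs carry a "jdbc:" prefix that urllib does not understand; strip it first.
url = urlsplit(jdbc_url.removeprefix("jdbc:"))
params = parse_qs(url.query)

print(url.hostname)       # contoso-db.mysql.database.azure.com
print(url.port)           # 3306
print(params["user"][0])  # dbuser
```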
#### [SpringBoot](#tab/springBoot)
-| Application properties | Description | Example value |
-||-|--|
+| Application properties | Description | Example value |
+| | -- | -- |
| `spring.datasource.url` | Spring Boot JDBC database URL | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required` |
-| `spring.datasource.username` | Database username | `<MySQL-DB-username>` |
-| `spring.datasource.password` | Database password | `MySQL-DB-password` |
+| `spring.datasource.username` | Database username | `<MySQL-DB-username>` |
+| `spring.datasource.password` | Database password | `MySQL-DB-password` |
After creating a `springboot` client type connection, Service Connector automatically adds the properties `spring.datasource.url`, `spring.datasource.username`, and `spring.datasource.password`, so a Spring Boot application can configure its data source beans automatically.
#### [Python](#tab/python)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_PASSWORD` | Database password | `MySQL-DB-password` |
-
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_PASSWORD` | Database password | `MySQL-DB-password` |
#### [Django](#tab/django)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_PASSWORD` | Database password | `MySQL-DB-password` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_PASSWORD` | Database password | `MySQL-DB-password` |
#### [Go](#tab/go)
-| Default environment variable name | Description | Example value |
-|--||--|
-| `AZURE_MYSQL_CONNECTIONSTRING` | Go-sql-driver connection string | `<MySQL-DB-username>:<MySQL-DB-password>@tcp(<server-host>:<port>)/<MySQL-DB-name>?tls=true` |
-
+| Default environment variable name | Description | Example value |
+| | - | - |
+| `AZURE_MYSQL_CONNECTIONSTRING` | Go-sql-driver connection string | `<MySQL-DB-username>:<MySQL-DB-password>@tcp(<server-host>:<port>)/<MySQL-DB-name>?tls=true` |
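The go-sql-driver connection string above uses the DSN form `<user>:<password>@tcp(<host>:<port>)/<database>?<options>`. As an illustrative sketch (not from the article; the `dbuser`, `s3cr3t`, and `contoso-db` values are invented), the parts can be pulled out with a regular expression:

```python
import re

# Hypothetical DSN in the go-sql-driver form shown in the table above.
dsn = "dbuser:s3cr3t@tcp(contoso-db.mysql.database.azure.com:3306)/contoso-db?tls=true"

pattern = re.compile(
    r"(?P<user>[^:@]+):(?P<password>[^@]*)@tcp\("  # credentials
    r"(?P<host>[^:]+):(?P<port>\d+)\)/"            # server host and port
    r"(?P<db>[^?]+)\?(?P<params>.+)"               # database and options
)
m = pattern.fullmatch(dsn)
assert m is not None
print(m["user"], m["host"], m["port"], m["db"], m["params"])
```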
#### [NodeJS](#tab/nodejs)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `MySQL-DB-username` |
-| `AZURE_MYSQL_PASSWORD` | Database password | `MySQL-DB-password` |
-| `AZURE_MYSQL_DATABASE` | Database name | `<database-name>` |
-| `AZURE_MYSQL_PORT` | Port number | `3306` |
-| `AZURE_MYSQL_SSL` | SSL option | `true` |
-
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `MySQL-DB-username` |
+| `AZURE_MYSQL_PASSWORD` | Database password | `MySQL-DB-password` |
+| `AZURE_MYSQL_DATABASE` | Database name | `<database-name>` |
+| `AZURE_MYSQL_PORT` | Port number | `3306` |
+| `AZURE_MYSQL_SSL` | SSL option | `true` |
#### [PHP](#tab/php)
-| Default environment variable name | Description | Example value |
-|-|--|--|
-| `AZURE_MYSQL_DBNAME` | Database name | `<MySQL-DB-name>` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_PORT` | Port number | `3306` |
-| `AZURE_MYSQL_FLAG` | SSL or other flags | `MySQL_CLIENT_SSL` |
-| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>` |
-| `AZURE_MYSQL_PASSWORD` | Database password | `<MySQL-DB-password>` |
--
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_MYSQL_DBNAME` | Database name | `<MySQL-DB-name>` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_PORT` | Port number | `3306` |
+| `AZURE_MYSQL_FLAG` | SSL or other flags | `MySQL_CLIENT_SSL` |
+| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>` |
+| `AZURE_MYSQL_PASSWORD` | Database password | `<MySQL-DB-password>` |
#### [Ruby](#tab/ruby)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_DATABASE` | Database name | `<MySQL-DB-name>` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_PASSWORD` | Database password | `<MySQL-DB-password>` |
-| `AZURE_MYSQL_SSLMODE` | SSL option | `required` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_MYSQL_DATABASE` | Database name | `<MySQL-DB-name>` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_PASSWORD` | Database password | `<MySQL-DB-password>` |
+| `AZURE_MYSQL_SSLMODE` | SSL option | `required` |
Refer to the steps and code below to connect to Azure Database for MySQL using a connection string.
#### [.NET](#tab/dotnet)
-| Default environment variable name | Description | Example value |
-|--||-|
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
-| `AZURE_MYSQL_CONNECTIONSTRING` | ADO.NET MySQL connection string | `Server=<MySQL-DB-name>.mysql.database.azure.com;Database=<MySQL-DB-name>;Port=3306;User Id=<MySQL-DBusername>;SSL Mode=Required` |
-
+| Default environment variable name | Description | Example value |
+| | - | -- |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
+| `AZURE_MYSQL_CONNECTIONSTRING` | ADO.NET MySQL connection string | `Server=<MySQL-DB-name>.mysql.database.azure.com;Database=<MySQL-DB-name>;Port=3306;User Id=<MySQL-DB-username>;SSL Mode=Required` |
#### [Java](#tab/java)
-| Default environment variable name | Description | Example value |
-|-|||
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
-| `AZURE_MYSQL_CONNECTIONSTRING` | JDBC MySQL connection string | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required&user=<MySQL-DB-username>` |
-
+| Default environment variable name | Description | Example value |
+| | - | |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
+| `AZURE_MYSQL_CONNECTIONSTRING` | JDBC MySQL connection string | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required&user=<MySQL-DB-username>` |
#### [SpringBoot](#tab/springBoot)
-| Application properties | Description | Example value |
-|--|--||
-| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication| `true` |
-| `spring.cloud.azure.credential.client-id` | Your client ID | `<client-ID>` |
-| `spring.cloud.azure.credential.client-secret` | Your client secret | `<client-secret>` |
-| `spring.cloud.azure.credential.tenant-id` | Your tenant ID | `<tenant-ID>` |
-| `spring.datasource.url` | Database URL | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required` |
-| `spring.datasource.username` | Database username | `username` |
-
+| Application properties | Description | Example value |
+| | - | -- |
+| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication | `true` |
+| `spring.cloud.azure.credential.client-id` | Your client ID | `<client-ID>` |
+| `spring.cloud.azure.credential.client-secret` | Your client secret | `<client-secret>` |
+| `spring.cloud.azure.credential.tenant-id` | Your tenant ID | `<tenant-ID>` |
+| `spring.datasource.url` | Database URL | `jdbc:mysql://<MySQL-DB-name>.mysql.database.azure.com:3306/<MySQL-DB-name>?sslmode=required` |
+| `spring.datasource.username` | Database username | `username` |
#### [Python](#tab/python)
-| Default environment variable name | Description | Example value |
-|-|--|--|
-| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
-
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
#### [Django](#tab/django)
-| Default environment variable name | Description | Example value |
-|-|--|--|
-| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER ` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
-
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_MYSQL_NAME` | Database name | `MySQL-DB-name` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
#### [Go](#tab/go)
-| Default environment variable name | Description | Example value |
-|--||--|
-| `AZURE_MYSQL_CLIENTID` | Your client ID |`<client-ID>` |
-| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>`
-| `AZURE_MYSQL_CONNECTIONSTRING` | Go-sql-driver connection string | `<MySQL-DB-username>@tcp(<server-host>:<port>)/<MySQL-DB-name>?tls=true` |
-
+| Default environment variable name | Description | Example value |
+| | - | -- |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
+| `AZURE_MYSQL_CONNECTIONSTRING` | Go-sql-driver connection string | `<MySQL-DB-username>@tcp(<server-host>:<port>)/<MySQL-DB-name>?tls=true` |
#### [NodeJS](#tab/nodejs)
-| Default environment variable name | Description | Example value |
-|-|--|--|
-| `AZURE_MYSQL_HOST ` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USER` | Database username | `MySQL-DB-username` |
-| `AZURE_MYSQL_DATABASE` | Database name | `<database-name>` |
-| `AZURE_MYSQL_PORT ` | Port number | `3306` |
-| `AZURE_MYSQL_SSL` | SSL option | `true` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
-
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USER` | Database username | `MySQL-DB-username` |
+| `AZURE_MYSQL_DATABASE` | Database name | `<database-name>` |
+| `AZURE_MYSQL_PORT` | Port number | `3306` |
+| `AZURE_MYSQL_SSL` | SSL option | `true` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
#### [PHP](#tab/php)
-| Default environment variable name | Description | Example value |
-|-|--|--|
-| `AZURE_MYSQL_DBNAME` | Database name | `<MySQL-DB-name>` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_PORT` | Port number | `3306` |
-| `AZURE_MYSQL_FLAG` | SSL or other flags | `MySQL_CLIENT_SSL` |
-| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_MYSQL_DBNAME` | Database name | `<MySQL-DB-name>` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_PORT` | Port number | `3306` |
+| `AZURE_MYSQL_FLAG` | SSL or other flags | `MySQL_CLIENT_SSL` |
+| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
#### [Ruby](#tab/ruby)
-| Default environment variable name | Description | Example value |
-|-|-|--|
-| `AZURE_MYSQL_DATABASE` | Database name | `<MySQL-DB-name>` |
-| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
-| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
-| `AZURE_MYSQL_SSLMODE` | SSL option | `required` |
-| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_MYSQL_CLIENTSECRET` | Your client secret| `<client-secret>` |
-| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_MYSQL_DATABASE` | Database name | `<MySQL-DB-name>` |
+| `AZURE_MYSQL_HOST` | Database host URL | `<MySQL-DB-name>.mysql.database.azure.com` |
+| `AZURE_MYSQL_USERNAME` | Database username | `<MySQL-DB-username>@<MySQL-DB-name>` |
+| `AZURE_MYSQL_SSLMODE` | SSL option | `required` |
+| `AZURE_MYSQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_MYSQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_MYSQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
Refer to the steps and code below to connect to Azure Database for MySQL using a service principal.
Follow the documentations to learn more about Service Connector. > [!div class="nextstepaction"]
-> [Learn about Service Connector concepts](./concept-service-connector-internals.md)
+> [Learn about Service Connector concepts](./concept-service-connector-internals.md)
service-connector How To Integrate Postgres https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-postgres.md
Previously updated : 10/20/2023 Last updated : 10/25/2023
# Integrate Azure Database for PostgreSQL with Service Connector
This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Database for PostgreSQL to other cloud services using Service Connector. You might still be able to connect to Azure Database for PostgreSQL in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
## Supported compute services
- Azure App Service
+- Azure Functions
- Azure Container Apps
- Azure Spring Apps
## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps, and Azure Spring Apps:
-
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret/connection string | Service principal |
-||::|:--:|::|::|
-| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Go (pg) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Java (JDBC) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Java - Spring Boot (JDBC) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Node.js (pg) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| PHP (native) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Python (psycopg2) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Python-Django | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Ruby (ruby-pg) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| None | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
+
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret/connection string | Service principal |
+| - | :--: | :--: | :--: | :--: |
+| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Go (pg) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java (JDBC) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java - Spring Boot (JDBC) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Node.js (pg) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| PHP (native) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Python (psycopg2) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Python-Django | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Ruby (ruby-pg) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| None | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
> [!NOTE]
-> System-assigned managed identity, User-assigned managed identity and Service principal are only supported on Azure CLI.
+> System-assigned managed identity, user-assigned managed identity, and service principal are only supported with the Azure CLI.
## Default environment variable names or application properties and sample code
Reference the connection details and sample code in the following tables, accord
#### [.NET](#tab/dotnet)
-| Default environment variable name | Description | Example value |
-||--||
-| `AZURE_POSTGRESQL_CONNECTIONSTRING` | .NET PostgreSQL connection string | `Server=<PostgreSQL-server-name>.postgres.database.azure.com;Database=<database-name>;Port=5432;Ssl Mode=Require;User Id=<username>;` |
-
+| Default environment variable name | Description | Example value |
+| - | | |
+| `AZURE_POSTGRESQL_CONNECTIONSTRING` | .NET PostgreSQL connection string | `Server=<PostgreSQL-server-name>.postgres.database.azure.com;Database=<database-name>;Port=5432;Ssl Mode=Require;User Id=<username>;` |
#### [Java](#tab/java)
-| Default environment variable name | Description | Example value |
-|-|--|--|
+| Default environment variable name | Description | Example value |
+| - | | - |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | JDBC PostgreSQL connection string | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require&user=<username>` |

#### [SpringBoot](#tab/springBoot)
Reference the connection details and sample code in the following tables, accord
| `spring.datasource.url` | Database URL | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require` |
| `spring.datasource.username` | Database username | `username` |

#### [Python](#tab/python)
-| Default environment variable name | Description | Example value |
-|--|-||
+| Default environment variable name | Description | Example value |
+| - | -- | |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | psycopg2 connection string | `dbname=<database-name> host=<PostgreSQL-server-name>.postgres.database.azure.com port=5432 sslmode=require user=<username>` |

#### [Django](#tab/django)
-| Default environment variable name | Description | Example value |
-|--|-|--|
-| `AZURE_POSTGRESQL_NAME` | Database name | `<database-name>` |
-| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
-| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
-
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_POSTGRESQL_NAME` | Database name | `<database-name>` |
+| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
+| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
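The Django variables above can be wired into a settings module. Here is a minimal sketch; the `build_database_settings` helper name and the token-as-password pattern are illustrative assumptions, not part of Service Connector itself:

```python
import os

def build_database_settings(password: str) -> dict:
    """Assemble Django's DATABASES["default"] entry from the Service
    Connector environment variables listed above. With a managed
    identity, `password` would typically be a Microsoft Entra access
    token (for example from azure-identity's DefaultAzureCredential,
    scope https://ossrdbms-aad.database.windows.net/.default)."""
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ["AZURE_POSTGRESQL_NAME"],
        "HOST": os.environ["AZURE_POSTGRESQL_HOST"],
        "USER": os.environ["AZURE_POSTGRESQL_USER"],
        "PASSWORD": password,
        "OPTIONS": {"sslmode": "require"},
    }
```

In `settings.py` you would then set `DATABASES = {"default": build_database_settings(token)}` after acquiring the token.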
#### [Go](#tab/go)
-| Default environment variable name | Description | Example value |
-|-|||
-| `AZURE_POSTGRESQL_CONNECTIONSTRING` | Go postgres connection string | `host=<PostgreSQL-server-name>.postgres.database.azure.com dbname=<database-name> sslmode=require user=<username>`|
+| Default environment variable name | Description | Example value |
+| - | -- | -- |
+| `AZURE_POSTGRESQL_CONNECTIONSTRING` | Go postgres connection string | `host=<PostgreSQL-server-name>.postgres.database.azure.com dbname=<database-name> sslmode=require user=<username>` |
#### [NodeJS](#tab/nodejs)
-| Default environment variable name | Description | Example value |
-|--|-|--|
-| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
-| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
-| `AZURE_POSTGRESQL_DATABASE` | Database name | `<database-name>` |
-| `AZURE_POSTGRESQL_PORT` | Port number | `5432` |
-| `AZURE_POSTGRESQL_SSL` | SSL option | `true` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
+| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
+| `AZURE_POSTGRESQL_DATABASE` | Database name | `<database-name>` |
+| `AZURE_POSTGRESQL_PORT` | Port number | `5432` |
+| `AZURE_POSTGRESQL_SSL` | SSL option | `true` |
#### [PHP](#tab/php)
-| Default environment variable name | Description | Example value |
-|--|||
+| Default environment variable name | Description | Example value |
+| - | - | |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | PHP native postgres connection string | `host=<PostgreSQL-server-name>.postgres.database.azure.com port=5432 dbname=<database-name> sslmode=require user=<username>` |

#### [Ruby](#tab/ruby)
-| Default environment variable name | Description | Example value |
-|--||-|
+| Default environment variable name | Description | Example value |
+| - | - | |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | Ruby postgres connection string | `host=<your-postgres-server-name>.postgres.database.azure.com port=5432 dbname=<database-name> sslmode=require user=<username>` |
Refer to the steps and code below to connect to Azure Database for PostgreSQL us
#### [.NET](#tab/dotnet)
-| Default environment variable name | Description | Example value |
-||--||
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
-| `AZURE_POSTGRESQL_CONNECTIONSTRING` | .NET PostgreSQL connection string | `Server=<PostgreSQL-server-name>.postgres.database.azure.com;Database=<database-name>;Port=5432;Ssl Mode=Require;User Id=<username>;` |
+| Default environment variable name | Description | Example value |
+| - | | |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
+| `AZURE_POSTGRESQL_CONNECTIONSTRING` | .NET PostgreSQL connection string | `Server=<PostgreSQL-server-name>.postgres.database.azure.com;Database=<database-name>;Port=5432;Ssl Mode=Require;User Id=<username>;` |
#### [Java](#tab/java)
-| Default environment variable name | Description | Example value |
-|--|--||
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
-| `AZURE_POSTGRESQL_CONNECTIONSTRING` | JDBC PostgreSQL connection string | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require&user=<username>` |
+| Default environment variable name | Description | Example value |
+| - | | - |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
+| `AZURE_POSTGRESQL_CONNECTIONSTRING` | JDBC PostgreSQL connection string | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require&user=<username>` |
#### [SpringBoot](#tab/springBoot)
-| Application properties | Description | Example value |
-||-||
-| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication | `true` |
-| `spring.cloud.azure.credential.client-id` | Your client ID | `<identity-client-ID>` |
-| `spring.cloud.azure.credential.client-managed-identity-enabled`| Enable client managed identity | `true` |
-| `spring.datasource.url` | Database URL | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require` |
-| `spring.datasource.username` | Database username | `username` |
+| Application properties | Description | Example value |
+| -- | - | |
+| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication | `true` |
+| `spring.cloud.azure.credential.client-id` | Your client ID | `<identity-client-ID>` |
+| `spring.cloud.azure.credential.client-managed-identity-enabled` | Enable client managed identity | `true` |
+| `spring.datasource.url` | Database URL | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require` |
+| `spring.datasource.username` | Database username | `username` |
#### [Python](#tab/python)
-| Default environment variable name | Description | Example value |
-|--|-||
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
+| Default environment variable name | Description | Example value |
+| - | -- | |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | psycopg2 connection string | `dbname=<database-name> host=<PostgreSQL-server-name>.postgres.database.azure.com port=5432 sslmode=require user=<username>` |

#### [Django](#tab/django)
-| Default environment variable name | Description | Example value |
-|--|-|--|
-| `AZURE_POSTGRESQL_NAME` | Database name | `<database-name>` |
-| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
-| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<<identity-client-ID>>` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_POSTGRESQL_NAME` | Database name | `<database-name>` |
+| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
+| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
#### [Go](#tab/go)
Refer to the steps and code below to connect to Azure Database for PostgreSQL us
#### [NodeJS](#tab/nodejs)
-| Default environment variable name | Description | Example value |
-|--|-|--|
-| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
-| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
-| `AZURE_POSTGRESQL_DATABASE` | Database name | `<database-name>` |
-| `AZURE_POSTGRESQL_PORT` | Port number | `5432` |
-| `AZURE_POSTGRESQL_SSL` | SSL option | `true` |
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
+| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
+| `AZURE_POSTGRESQL_DATABASE` | Database name | `<database-name>` |
+| `AZURE_POSTGRESQL_PORT` | Port number | `5432` |
+| `AZURE_POSTGRESQL_SSL` | SSL option | `true` |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<identity-client-ID>` |
#### [PHP](#tab/php)
Refer to the steps and code below to connect to Azure Database for PostgreSQL us
#### [.NET](#tab/dotnet)
-| Default environment variable name | Description | Example value |
-|--|--||
-| `AZURE_POSTGRESQL_CONNECTIONSTRING` | .NET PostgreSQL connection string | `Server=<PostgreSQL-server-name>.postgres.database.azure.com;Database=<database-name>;Port=5432;Ssl Mode=Require;User Id=<username>;` |
+| Default environment variable name | Description | Example value |
+| - | | |
+| `AZURE_POSTGRESQL_CONNECTIONSTRING` | .NET PostgreSQL connection string | `Server=<PostgreSQL-server-name>.postgres.database.azure.com;Database=<database-name>;Port=5432;Ssl Mode=Require;User Id=<username>;` |
#### [Java](#tab/java)
-| Default environment variable name | Description | Example value |
-|--|--|-|
+| Default environment variable name | Description | Example value |
+| - | | |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | JDBC PostgreSQL connection string | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require&user=<username>&password=<password>` |

#### [SpringBoot](#tab/springBoot)
-| Application properties | Description | Example value |
-||-||
-| `spring.datasource.url` | Database URL | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require` |
-| `spring.datasource.username` | Database username | `<username>` |
-| `spring.datasource.password` | Database password | `<password>` |
+| Application properties | Description | Example value |
+| | -- | |
+| `spring.datasource.url` | Database URL | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require` |
+| `spring.datasource.username` | Database username | `<username>` |
+| `spring.datasource.password` | Database password | `<password>` |
#### [Python](#tab/python)
-| Default environment variable name | Description | Example value |
-|--|-||
+| Default environment variable name | Description | Example value |
+| - | -- | -- |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | psycopg2 connection string | `dbname=<database-name> host=<PostgreSQL-server-name>.postgres.database.azure.com port=5432 sslmode=require user=<username> password=<password>` |

#### [Django](#tab/django)
-| Default environment variable name | Description | Example value |
-|--|-|--|
-| `AZURE_POSTGRESQL_NAME` | Database name | `<database-name>` |
-| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
-| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
-| `AZURE_POSTGRESQL_PASSWORD` | Database password | `<database-password>` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_POSTGRESQL_NAME` | Database name | `<database-name>` |
+| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
+| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
+| `AZURE_POSTGRESQL_PASSWORD` | Database password | `<database-password>` |
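For the secret flow, all values come directly from the environment. A minimal `settings.py` sketch (the helper name is illustrative):

```python
import os

def database_settings_from_env() -> dict:
    """Map the secret-based environment variables in the table above
    onto Django's DATABASES["default"] entry."""
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ["AZURE_POSTGRESQL_NAME"],
        "HOST": os.environ["AZURE_POSTGRESQL_HOST"],
        "USER": os.environ["AZURE_POSTGRESQL_USER"],
        "PASSWORD": os.environ["AZURE_POSTGRESQL_PASSWORD"],
        "OPTIONS": {"sslmode": "require"},
    }
```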
#### [Go](#tab/go)
Refer to the steps and code below to connect to Azure Database for PostgreSQL us
#### [NodeJS](#tab/nodejs)
-| Default environment variable name | Description | Example value |
-|--|-|--|
-| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
-| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
-| `AZURE_POSTGRESQL_PASSWORD` | Database password | `<password>` |
-| `AZURE_POSTGRESQL_DATABASE` | Database name | `<database-name>` |
-| `AZURE_POSTGRESQL_PORT` | Port number | `5432` |
-| `AZURE_POSTGRESQL_SSL` | SSL option | `true` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
+| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
+| `AZURE_POSTGRESQL_PASSWORD` | Database password | `<password>` |
+| `AZURE_POSTGRESQL_DATABASE` | Database name | `<database-name>` |
+| `AZURE_POSTGRESQL_PORT` | Port number | `5432` |
+| `AZURE_POSTGRESQL_SSL` | SSL option | `true` |
#### [PHP](#tab/php)
Refer to the steps and code below to connect to Azure Database for PostgreSQL us
#### [.NET](#tab/dotnet)
-| Default environment variable name | Description | Example value |
-|-|--||
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
+| Default environment variable name | Description | Example value |
+| - | | |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | .NET PostgreSQL connection string | `Server=<PostgreSQL-server-name>.postgres.database.azure.com;Database=<database-name>;Port=5432;Ssl Mode=Require;User Id=<username>;` |

#### [Java](#tab/java)
-| Default environment variable name | Description | Example value |
-|-|--||
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
+| Default environment variable name | Description | Example value |
+| - | | - |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | JDBC PostgreSQL connection string | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require&user=<username>` |

#### [SpringBoot](#tab/springBoot)
-| Application properties | Description | Example value |
-||-||
-| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication | `true` |
-| `spring.cloud.azure.credential.client-id` | Your client ID | `<client-ID>` |
-| `spring.cloud.azure.credential.client-secret` | Your client secret | `<client-secret>` |
-| `spring.cloud.azure.credential.tenant-id` | Your tenant ID | `<tenant-ID>` |
-| `spring.datasource.url` | Database URL | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require` |
-| `spring.datasource.username` | Database username | `username` |
+| Application properties | Description | Example value |
+| | - | |
+| `spring.datasource.azure.passwordless-enabled` | Enable passwordless authentication | `true` |
+| `spring.cloud.azure.credential.client-id` | Your client ID | `<client-ID>` |
+| `spring.cloud.azure.credential.client-secret` | Your client secret | `<client-secret>` |
+| `spring.cloud.azure.credential.tenant-id` | Your tenant ID | `<tenant-ID>` |
+| `spring.datasource.url` | Database URL | `jdbc:postgresql://<PostgreSQL-server-name>.postgres.database.azure.com:5432/<database-name>?sslmode=require` |
+| `spring.datasource.username` | Database username | `username` |
#### [Python](#tab/python)
-| Default environment variable name | Description | Example value |
-|--|-||
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client SECRET | `<client-secret>` |
-| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
+| Default environment variable name | Description | Example value |
+| - | -- | |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
| `AZURE_POSTGRESQL_CONNECTIONSTRING` | psycopg2 connection string | `dbname=<database-name> host=<PostgreSQL-server-name>.postgres.database.azure.com port=5432 sslmode=require user=<username>` |

#### [Django](#tab/django)
-| Default environment variable name | Description | Example value |
-|--|-|--|
-| `AZURE_POSTGRESQL_NAME` | Database name | `<database-name>` |
-| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
-| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client SECRET| `<client-secret>` |
-| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
-
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_POSTGRESQL_NAME` | Database name | `<database-name>` |
+| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
+| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
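The three service principal variables above are typically fed to a credential object. A small sketch; the `service_principal_config` helper is an illustrative assumption:

```python
import os

def service_principal_config() -> dict:
    """Collect the service principal values that Service Connector
    injects, using the variable names from the table above."""
    return {
        "client_id": os.environ["AZURE_POSTGRESQL_CLIENTID"],
        "client_secret": os.environ["AZURE_POSTGRESQL_CLIENTSECRET"],
        "tenant_id": os.environ["AZURE_POSTGRESQL_TENANTID"],
    }

# These values would typically construct an
# azure.identity.ClientSecretCredential, whose access token is then
# used as the database password, for example:
#   credential = ClientSecretCredential(**service_principal_config())
#   token = credential.get_token(
#       "https://ossrdbms-aad.database.windows.net/.default")
```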
#### [Go](#tab/go)
Refer to the steps and code below to connect to Azure Database for PostgreSQL us
#### [NodeJS](#tab/nodejs)
-| Default environment variable name | Description | Example value |
-|--|--|--|
-| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
-| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
-| `AZURE_POSTGRESQL_DATABASE` | Database name | `<database-name>` |
-| `AZURE_POSTGRESQL_PORT` | Port number | `5432` |
-| `AZURE_POSTGRESQL_SSL` | SSL option | `true` |
-| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
-| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
-| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
-
+| Default environment variable name | Description | Example value |
+| | | -- |
+| `AZURE_POSTGRESQL_HOST` | Database host URL | `<PostgreSQL-server-name>.postgres.database.azure.com` |
+| `AZURE_POSTGRESQL_USER` | Database username | `<username>` |
+| `AZURE_POSTGRESQL_DATABASE` | Database name | `<database-name>` |
+| `AZURE_POSTGRESQL_PORT` | Port number | `5432` |
+| `AZURE_POSTGRESQL_SSL` | SSL option | `true` |
+| `AZURE_POSTGRESQL_CLIENTID` | Your client ID | `<client-ID>` |
+| `AZURE_POSTGRESQL_CLIENTSECRET` | Your client secret | `<client-secret>` |
+| `AZURE_POSTGRESQL_TENANTID` | Your tenant ID | `<tenant-ID>` |
#### [PHP](#tab/php)
Refer to the steps and code below to connect to Azure Database for PostgreSQL us
Refer to the steps and code below to connect to Azure Database for PostgreSQL using a service principal. [!INCLUDE [code sample for postgresql service principal](./includes/code-postgres-me-id.md)]

## Next steps

Follow the tutorials listed below to learn more about Service Connector.

> [!div class="nextstepaction"]
-> [Learn about Service Connector concepts](./concept-service-connector-internals.md)
+> [Learn about Service Connector concepts](./concept-service-connector-internals.md)
service-connector How To Integrate Redis Cache https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-redis-cache.md
Last updated 10/31/2023

# Integrate Azure Cache for Redis with Service Connector

This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Cache for Redis to other cloud services using Service Connector. You might still be able to connect to Azure Cache for Redis in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services

- Azure App Service
+- Azure Functions
- Azure Container Apps
- Azure Spring Apps

## Supported Authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
-|--|-|--|--|-|
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| | -- | | - | -- |
| .NET | | | ![yes icon](./media/green-check.png) | |
| Go | | | ![yes icon](./media/green-check.png) | |
| Java | | | ![yes icon](./media/green-check.png) | |
Use the environment variable names and application properties listed below to co
#### [.NET](#tab/dotnet)
-| Default environment variable name | Description | Example value |
-|--|-|-|
+| Default environment variable name | Description | Example value |
+| | -- | - |
| AZURE_REDIS_CONNECTIONSTRING | StackExchange.Redis connection string | `<redis-server-name>.redis.cache.windows.net:6380,password=<redis-key>,ssl=True,defaultDatabase=0` |

#### [Java](#tab/java)
-| Default environment variable name | Description | Example value |
-|--|-|-|
+| Default environment variable name | Description | Example value |
+| | -- | - |
| AZURE_REDIS_CONNECTIONSTRING | Jedis connection string | `rediss://:<redis-key>@<redis-server-name>.redis.cache.windows.net:6380/0` |

#### [SpringBoot](#tab/springBoot)
-| Application properties | Description | Example value |
-||-|--|
+| Application properties | Description | Example value |
+| - | -- | -- |
| spring.redis.host | Redis host | `<redis-server-name>.redis.cache.windows.net` |
| spring.redis.port | Redis port | `6380` |
| spring.redis.database | Redis database | `0` |
Use the environment variable names and application properties listed below to co
#### [NodeJS](#tab/nodejs)
-| Default environment variable name | Description | Example value |
-|--||-|
+| Default environment variable name | Description | Example value |
+| | - | - |
| AZURE_REDIS_CONNECTIONSTRING | node-redis connection string | `rediss://:<redis-key>@<redis-server-name>.redis.cache.windows.net:6380/0` |
service-connector How To Integrate Service Bus https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-service-bus.md
Previously updated : 08/11/2022
Last updated : 10/25/2023

# Integrate Service Bus with Service Connector

This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Service Bus to other cloud services using Service Connector. You might still be able to connect to Service Bus in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create service connections.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services

- Azure App Service
+- Azure Functions
- Azure Container Apps
- Azure Spring Apps

## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
| Client type | System-assigned managed identity | User-assigned managed identity | Secret/connection string | Service principal |
|--|::|::|::|::|
Use the connection details below to connect compute services to Service Bus. For
#### Other client types
-| Default environment variable name | Description | Sample value |
-| -- | -- | -- |
+| Default environment variable name | Description | Sample value |
+| - | | -- |
| AZURE_SERVICEBUS_FULLYQUALIFIEDNAMESPACE | Service Bus namespace | `<Service-Bus-namespace>.servicebus.windows.net` |

#### Sample code
Refer to the steps and code below to connect to Service Bus using a system-assig
#### Other client types
-| Default environment variable name | Description | Sample value |
-| - | -| - |
+| Default environment variable name | Description | Sample value |
+| - | | -- |
| AZURE_SERVICEBUS_FULLYQUALIFIEDNAMESPACE | Service Bus namespace | `<Service-Bus-namespace>.servicebus.windows.net` |
-| AZURE_SERVICEBUS_CLIENTID | Your client ID | `<client-ID>` |
+| AZURE_SERVICEBUS_CLIENTID | Your client ID | `<client-ID>` |
#### Sample code

Refer to the steps and code below to connect to Service Bus using a user-assigned managed identity.
Refer to the steps and code below to connect to Service Bus using a user-assigne
#### SpringBoot client type

> [!div class="mx-tdBreakAll"]
-> | Default environment variable name | Description | Sample value |
-> | -- | -- | |
+>
+> | Default environment variable name | Description | Sample value |
+> | -- | -- | |
> | spring.cloud.azure.servicebus.connection-string | Service Bus connection string | `Endpoint=sb://<Service-Bus-namespace>.servicebus.windows.net/;SharedAccessKeyName=<access-key-name>;SharedAccessKey=<access-key-value>` |

#### Other client types
Refer to the steps and code below to connect to Service Bus using a user-assigne
Refer to the steps and code below to connect to Service Bus using a connection string. [!INCLUDE [code sample for service bus](./includes/code-servicebus-secret.md)]

### Service principal

#### SpringBoot client type
-| Default environment variable name | Description | Sample value |
-|--|--|--|
+| Default environment variable name | Description | Sample value |
+| | | -- |
| spring.cloud.azure.servicebus.namespace | Service Bus namespace | `<Service-Bus-namespace>.servicebus.windows.net` |
| spring.cloud.azure.client-id | Your client ID | `<client-ID>` |
| spring.cloud.azure.tenant-id | Your tenant ID | `<tenant-ID>` |
service-connector How To Integrate Signalr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-signalr.md
- kr2b-contr-experiment
- event-tier1-build-2022

# Integrate Azure SignalR Service with Service Connector

This article shows supported authentication methods and clients, and shows sample code you can use to connect Azure SignalR Service to other cloud services using Service Connector. This article also shows default environment variable name and value (or Spring Boot configuration) that you get when you create the service connection.
This article supported authentication methods and clients, and shows sample code
## Supported compute service

- Azure App Service
+- Azure Functions
- Azure Container Apps

## Supported authentication types and client types
-Supported authentication and clients for App Service and Container Apps:
+Supported authentication and clients for App Service, Azure Functions, and Container Apps:
| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
|-|--|--|--|--|
Refer to the steps and code below to connect to Azure SignalR Service using a co
### Service Principal
- | Default environment variable name | Description | Example value |
- | | | |
- | AZURE_SIGNALR_CONNECTIONSTRING | SignalR Service connection string with Service Principal | `Endpoint=https://<SignalR-name>.service.signalr.net;AuthType=aad;ClientId=<client-ID>;ClientSecret=<client-secret>;TenantId=<tenant-ID>;Version=1.0;` |
+| Default environment variable name | Description | Example value |
+| | -- | -- |
+| AZURE_SIGNALR_CONNECTIONSTRING | SignalR Service connection string with Service Principal | `Endpoint=https://<SignalR-name>.service.signalr.net;AuthType=aad;ClientId=<client-ID>;ClientSecret=<client-secret>;TenantId=<tenant-ID>;Version=1.0;` |
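A connection string like the one above is a `key=value` list separated by semicolons. A minimal sketch of splitting it into its parts (the `parse_connection_string` helper is illustrative, not a SignalR SDK API):

```python
def parse_connection_string(cs: str) -> dict:
    """Split a key=value;key=value connection string, such as the
    SignalR example above, into a dict. Each pair is split on its
    first '=' so values containing '=' are preserved."""
    pairs = [p for p in cs.strip().rstrip(";").split(";") if p]
    return dict(p.split("=", 1) for p in pairs)
```

For the service principal form, the resulting dict carries `Endpoint`, `AuthType`, `ClientId`, `ClientSecret`, `TenantId`, and `Version` keys.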
#### Sample code

Refer to the steps and code below to connect to Azure SignalR Service using a service principal.
service-connector How To Integrate Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-sql-database.md
Last updated 10/26/2023 - # Integrate Azure SQL Database with Service Connector This page shows supported authentication methods and clients, and shows sample code you can use to connect compute services to Azure SQL Database using Service Connector. You might still be able to connect to Azure SQL Database using other methods. This page also shows default environment variable names and values you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps ## Supported authentication types and clients
-Supported authentication and clients for App Service, Container Apps, and Azure Spring Apps:
-
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret/connection string | Service principal |
-|--|::|:--:|::|::|
-| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Go | | | ![yes icon](./media/green-check.png) | |
-| Java | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Java - Spring Boot | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| PHP | | | ![yes icon](./media/green-check.png) | |
-| Python | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Python - Django | | | ![yes icon](./media/green-check.png) | |
-| Ruby | | | ![yes icon](./media/green-check.png) | |
-| None | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
+
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret/connection string | Service principal |
+| | :--: | :--: | :--: | :--: |
+| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Go | | | ![yes icon](./media/green-check.png) | |
+| Java | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java - Spring Boot | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| PHP | | | ![yes icon](./media/green-check.png) | |
+| Python | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Python - Django | | | ![yes icon](./media/green-check.png) | |
+| Ruby | | | ![yes icon](./media/green-check.png) | |
+| None | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
> [!NOTE]
-> System-assigned managed identity,User-assigned managed identity and Service principal are only supported on Azure CLI.
+> System-assigned managed identity, user-assigned managed identity, and service principal are only supported on Azure CLI.
## Default environment variable names or application properties and sample code
Refer to the steps and code below to connect to Azure SQL Database using a user-
#### [SpringBoot](#tab/sql-secret-spring) > [!div class="mx-tdBreakAll"]
-> | Default environment variable name | Description | Sample value |
-> |--|-|-|
-> | `spring.datasource.url` | Azure SQL Database datasource URL | `jdbc:sqlserver://<sql-server>.database.windows.net:1433;databaseName=<sql-db>;` |
-> | `spring.datasource.username` | Azure SQL Database datasource username | `<sql-user>` |
-> | `spring.datasource.password` | Azure SQL Database datasource password | `<sql-password>` |
+>
+> | Default environment variable name | Description | Sample value |
+> | | -- | - |
+> | `spring.datasource.url` | Azure SQL Database datasource URL | `jdbc:sqlserver://<sql-server>.database.windows.net:1433;databaseName=<sql-db>;` |
+> | `spring.datasource.username` | Azure SQL Database datasource username | `<sql-user>` |
+> | `spring.datasource.password` | Azure SQL Database datasource password | `<sql-password>` |
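The `spring.datasource.url` value above follows a fixed pattern, so it can be assembled from the server and database names alone. A minimal sketch (Python here for brevity; the `build_jdbc_url` helper is hypothetical, not something Service Connector provides):

```python
def build_jdbc_url(server: str, database: str, port: int = 1433) -> str:
    """Assemble an Azure SQL JDBC URL in the shape Service Connector
    writes to spring.datasource.url (see the table above)."""
    return (
        f"jdbc:sqlserver://{server}.database.windows.net:{port};"
        f"databaseName={database};"
    )


print(build_jdbc_url("<sql-server>", "<sql-db>"))
```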
#### [Python](#tab/sql-secret-python)
Refer to the steps and code below to connect to Azure SQL Database using a conne
Refer to the steps and code below to connect to Azure SQL Database using a service principal. [!INCLUDE [code sample for sql](./includes/code-sql-me-id.md)] + ## Next steps Follow the tutorial listed below to learn more about Service Connector.
service-connector How To Integrate Storage Blob https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-storage-blob.md
Previously updated : 10/20/2023 Last updated : 10/25/2023 - # Integrate Azure Blob Storage with Service Connector This page shows the supported authentication types, client types, and sample code for connecting to Azure Blob Storage using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
This page shows the supported authentication types, client types and sample code
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps ## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
-
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
-|--|--|--|--|--|
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| | - | - | - | - |
| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Java | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Java - Spring Boot | | | ![yes icon](./media/green-check.png) |
+| Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Python | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Go | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Go | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
| None | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | - ## Default environment variable names or application properties and sample code
Supported authentication and clients for App Service, Container Apps and Azure S
Reference the connection details and sample code in the following tables, according to your connection's authentication type and client type, to connect compute services to Azure Blob Storage. You can learn more about [Service Connector environment variable naming convention](concept-service-connector-internals.md). ### System-assigned managed identity+ For default environment variables and sample code of other authentication types, refer to the beginning of this article.
-| Default environment variable name | Description | Example value |
-||--||
+| Default environment variable name | Description | Example value |
+| - | | |
| AZURE_STORAGEBLOB_RESOURCEENDPOINT | Blob Storage endpoint | `https://<storage-account-name>.blob.core.windows.net/` | - #### Sample code Refer to the steps and code below to connect to Azure Blob Storage using a system-assigned managed identity.
Refer to the steps and code below to connect to Azure Blob Storage using a syste
For default environment variables and sample code of other authentication types, refer to the beginning of this article.
-| Default environment variable name | Description | Example value |
-||--||
+| Default environment variable name | Description | Example value |
+| - | | |
| AZURE_STORAGEBLOB_RESOURCEENDPOINT | Blob Storage endpoint | `https://<storage-account-name>.blob.core.windows.net/` | | AZURE_STORAGEBLOB_CLIENTID | Your client ID | `<client-ID>` |
For default environment variables and sample code of other authentication types,
#### SpringBoot client type
-| Application properties | Description | Example value |
-|--|--||
+| Application properties | Description | Example value |
+| | | |
| azure.storage.account-name | Your Blob storage-account-name | `<storage-account-name>` |
-| azure.storage.account-key | Your Blob Storage account key | `<account-key>` |
+| azure.storage.account-key | Your Blob Storage account key | `<account-key>` |
| azure.storage.blob-endpoint | Your Blob Storage endpoint | `https://<storage-account-name>.blob.core.windows.net/` | - #### Other client types | Default environment variable name | Description | Example value | ||--|| | AZURE_STORAGEBLOB_CONNECTIONSTRING | Blob Storage connection string | `DefaultEndpointsProtocol=https;AccountName=<account name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net` |-
+| Default environment variable name | Description | Example value |
+| - | | - |
+| AZURE_STORAGEBLOB_CONNECTIONSTRING | Blob Storage connection string | `DefaultEndpointsProtocol=https;AccountName=<account name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net` |
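The connection string above encodes the account name and endpoint suffix, so the Blob Storage endpoint can be recovered from it directly. A standard-library-only Python sketch (the helper name is illustrative, not part of any SDK):

```python
def blob_endpoint_from_connection_string(raw: str) -> str:
    """Derive the Blob Storage endpoint from an account connection string of
    the DefaultEndpointsProtocol=...;AccountName=...;AccountKey=...;
    EndpointSuffix=... shape shown above."""
    parts = {}
    for segment in raw.strip().rstrip(";").split(";"):
        # Split on the first '=' only: the base64 AccountKey may itself
        # end in '=' padding characters.
        key, _, value = segment.partition("=")
        parts[key] = value
    return (
        f"{parts['DefaultEndpointsProtocol']}://"
        f"{parts['AccountName']}.blob.{parts['EndpointSuffix']}/"
    )
```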
#### Sample code
Refer to the steps and code below to connect to Azure Blob Storage using a conne
For default environment variables and sample code of other authentication types, refer to the beginning of this article.
-| Default environment variable name | Description | Example value |
-||--||
+| Default environment variable name | Description | Example value |
+| - | | |
| AZURE_STORAGEBLOB_RESOURCEENDPOINT | Blob Storage endpoint | `https://<storage-account-name>.blob.core.windows.net/` | | AZURE_STORAGEBLOB_CLIENTID | Your client ID | `<client-ID>` | | AZURE_STORAGEBLOB_CLIENTSECRET | Your client secret | `<client-secret>` |
service-connector How To Integrate Storage File https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-storage-file.md
Last updated 11/02/2023 - # Integrate Azure Files with Service Connector This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure File Storage to other cloud services using Service Connector. You might still be able to connect to Azure File Storage in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps ## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
-| Client Type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
-|--|-|--|--|-|
+| Client Type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| | -- | | - | -- |
| .NET | | | ![yes icon](./media/green-check.png) | | | Java | | | ![yes icon](./media/green-check.png) | | | Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
Use the connection details below to connect compute services to Azure File Stora
#### SpringBoot client type
-| Application properties | Description | Example value |
-|--|||
+| Application properties | Description | Example value |
+| | - | |
| azure.storage.account-name | File storage account name | `<storage-account-name>` | | azure.storage.account-key | File storage account key | `<storage-account-key>` | | azure.storage.file-endpoint | File storage endpoint | `https://<storage-account-name>.file.core.windows.net/` |
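The file endpoint above follows the same `https://<account>.<service>.core.windows.net/` pattern as the blob, queue, and table endpoints elsewhere in these articles. A small Python sketch of that pattern (the `storage_endpoint` helper is a name invented here for illustration):

```python
def storage_endpoint(account_name: str, service: str) -> str:
    """Build the per-service Azure Storage endpoint, e.g.
    https://<account>.file.core.windows.net/ for Azure Files."""
    if service not in {"blob", "file", "queue", "table"}:
        raise ValueError(f"unknown storage service: {service}")
    return f"https://{account_name}.{service}.core.windows.net/"


print(storage_endpoint("<storage-account-name>", "file"))
```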
service-connector How To Integrate Storage Queue https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-storage-queue.md
Previously updated : 08/11/2022 Last updated : 10/25/2023 - # Integrate Azure Queue Storage with Service Connector This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Queue Storage to other cloud services using Service Connector. You might still be able to connect to Azure Queue Storage in other programming languages without using Service Connector. This page also shows default environment variable names and values (or Spring Boot configuration) you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps ## Supported authentication types and client types
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
-|--|--|--|--|--|
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| | - | - | - | - |
| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Java | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-|Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
+| Java - Spring Boot | | | ![yes icon](./media/green-check.png) | |
| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Python | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
Use the connection details below to connect compute services to Queue Storage. F
### System-assigned managed identity
-| Default environment variable name | Description | Example value |
-|-||-|
+| Default environment variable name | Description | Example value |
+| -- | - | - |
| AZURE_STORAGEQUEUE_RESOURCEENDPOINT | Queue storage endpoint | `https://<storage-account-name>.queue.core.windows.net/` | #### Sample code
Refer to the steps and code below to connect to Azure Queue Storage using a syst
### User-assigned managed identity
-| Default environment variable name | Description | Example value |
-|-||-|
+| Default environment variable name | Description | Example value |
+| -- | - | - |
| AZURE_STORAGEQUEUE_RESOURCEENDPOINT | Queue storage endpoint | `https://<storage-account-name>.queue.core.windows.net/` | | AZURE_STORAGEQUEUE_CLIENTID | Your client ID | `<client-ID>` |
Refer to the steps and code below to connect to Azure Queue Storage using a conn
### Service principal
-| Default environment variable name | Description | Example value |
-|-||-|
+| Default environment variable name | Description | Example value |
+| -- | - | - |
| AZURE_STORAGEQUEUE_RESOURCEENDPOINT | Queue storage endpoint | `https://<storage-account-name>.queue.core.windows.net/` | | AZURE_STORAGEQUEUE_CLIENTID | Your client ID | `<client-ID>` | | AZURE_STORAGEQUEUE_CLIENTSECRET | Your client secret | `<client-secret>` |
service-connector How To Integrate Storage Table https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-storage-table.md
Last updated 10/24/2023 - # Integrate Azure Table Storage with Service Connector This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Table Storage to other cloud services using Service Connector. You might still be able to connect to Azure Table Storage in other programming languages without using Service Connector. This page also shows default environment variable names and values you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
-| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
-|-|-|--|--|-|
-| .NET |![yes icon](./media/green-check.png)|![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) |![yes icon](./media/green-check.png)|
-| Java |![yes icon](./media/green-check.png)|![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) |![yes icon](./media/green-check.png)|
-| Node.js |![yes icon](./media/green-check.png)|![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) |![yes icon](./media/green-check.png)|
-| Python |![yes icon](./media/green-check.png)|![yes icon](./media/green-check.png)| ![yes icon](./media/green-check.png) |![yes icon](./media/green-check.png)|
+| Client type | System-assigned managed identity | User-assigned managed identity | Secret / connection string | Service principal |
+| -- | - | - | - | - |
+| .NET | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Java | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Node.js | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Python | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
## Default environment variable names or application properties and sample code
service-connector How To Integrate Web Pubsub https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-integrate-web-pubsub.md
Last updated 10/26/2023 - # Integrate Azure Web PubSub with Service Connector This page shows supported authentication methods and clients, and shows sample code you can use to connect Azure Web PubSub to other cloud services using Service Connector. You might still be able to connect to Azure Web PubSub using other methods. This page also shows default environment variable names and values you get when you create the service connection.
This page shows supported authentication methods and clients, and shows sample c
## Supported compute services - Azure App Service
+- Azure Functions
- Azure Container Apps - Azure Spring Apps
-Supported authentication and clients for App Service, Container Apps and Azure Spring Apps:
+Supported authentication and clients for App Service, Azure Functions, Container Apps, and Azure Spring Apps:
| Client type | System-assigned managed identity | User-assigned managed identity | Secret/connection string | Service principal | |-|:-:|:-:|:-:|:-:|
Use the environment variable names and application properties listed below, acco
### System-assigned managed identity
-| Default environment variable name | Description | Sample value |
-| | | - |
+| Default environment variable name | Description | Sample value |
+| | | |
| AZURE_WEBPUBSUB_HOST | Azure Web PubSub host | `<name>.webpubsub.azure.com` | #### Sample code
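Note that `AZURE_WEBPUBSUB_HOST` stores a bare host name, not a URL. An illustrative Python sketch (not the article's official sample; the `webpubsub_endpoint` helper is a name chosen here) that turns it into an https endpoint:

```python
import os


def webpubsub_endpoint(host: str) -> str:
    """Prefix the bare host from AZURE_WEBPUBSUB_HOST (e.g.
    <name>.webpubsub.azure.com) with the https scheme."""
    return f"https://{host}"


# Fall back to the documented placeholder when the variable is not set.
host = os.environ.get("AZURE_WEBPUBSUB_HOST", "<name>.webpubsub.azure.com")
print(webpubsub_endpoint(host))
```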
Refer to the steps and code below to connect to Azure Web PubSub using a system-
### User-assigned managed identity
-| Default environment variable name | Description | Sample value |
-| | | |
-| AZURE_WEBPUBSUB_HOST | Azure Web PubSub host | `<name>.webpubsub.azure.com` |
-| AZURE_WEBPUBSUB_CLIENTID | Azure Web PubSub client ID | `<client-id>` |
+| Default environment variable name | Description | Sample value |
+| | -- | |
+| AZURE_WEBPUBSUB_HOST | Azure Web PubSub host | `<name>.webpubsub.azure.com` |
+| AZURE_WEBPUBSUB_CLIENTID | Azure Web PubSub client ID | `<client-id>` |
#### Sample code
Refer to the steps and code below to connect to Azure Web PubSub using a user-as
### Connection string > [!div class="mx-tdBreakAll"]
-> | Default environment variable name | Description | Sample value |
-> | | --| -|
-> | AZURE_WEBPUBSUB_CONNECTIONSTRING | Azure Web PubSub connection string | `Endpoint=https://<name>.webpubsub.azure.com;AccessKey=<access-key>;Version=1.0;` |
+>
+> | Default environment variable name | Description | Sample value |
+> | | - | -- |
+> | AZURE_WEBPUBSUB_CONNECTIONSTRING | Azure Web PubSub connection string | `Endpoint=https://<name>.webpubsub.azure.com;AccessKey=<access-key>;Version=1.0;` |
#### Sample code
Refer to the steps and code below to connect to Azure Web PubSub using a connect
### Service principal
-| Default environment variable name | Description | Sample value |
-| | -| --|
-| AZURE_WEBPUBSUB_HOST | Azure Web PubSub host | `<name>.webpubsub.azure.com` |
-| AZURE_WEBPUBSUB_CLIENTID | Azure Web PubSub client ID | `<client-id>` |
-| AZURE_WEBPUBSUB_CLIENTSECRET | Azure Web PubSub client secret | `<client-secret>` |
-| AZURE_WEBPUBSUB_TENANTID | Azure Web PubSub tenant ID | `<tenant-id>` |
+| Default environment variable name | Description | Sample value |
+| | | |
+| AZURE_WEBPUBSUB_HOST | Azure Web PubSub host | `<name>.webpubsub.azure.com` |
+| AZURE_WEBPUBSUB_CLIENTID | Azure Web PubSub client ID | `<client-id>` |
+| AZURE_WEBPUBSUB_CLIENTSECRET | Azure Web PubSub client secret | `<client-secret>` |
+| AZURE_WEBPUBSUB_TENANTID | Azure Web PubSub tenant ID | `<tenant-id>` |
#### Sample code
service-connector How To Manage Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-manage-authentication.md
description: Learn how to select and manage authentication parameters in Service
Previously updated : 03/07/2023 Last updated : 10/25/2023 - # Manage authentication within Service Connector In this guide, learn about the different authentication options available in Service Connector, and how to customize environment variables.
In this guide, learn about the different authentication options available in Ser
## Start creating a new connection 1. Within your App Service, Container Apps, or Azure Spring Apps instance, open Service Connector and fill out the form in the **Basics** tab with the required information about your compute and target services.
-1. Select **Next : Authentication**.
+2. Select **Next : Authentication**.
## Select an authentication option
Select one of the four different authentication options offered by Service Conne
Service Connector offers the following authentication options:
-| Target resource | System assigned managed identity | User assigned managed identity | Connection string | Service principal |
-|-|--|--|--|--|
+| Target resource | System assigned managed identity | User assigned managed identity | Connection string | Service principal |
+| -- | - | - | - | - |
| App Configuration | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Azure SQL | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | |
-| Azure Cache for Redis | | | ![yes icon](./media/green-check.png) | |
-| Azure Cache for Redis Enterprise | | | ![yes icon](./media/green-check.png) | |
-| Azure Cosmos DB - Cassandra | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
+| Azure SQL | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | |
+| Azure Cache for Redis | | | ![yes icon](./media/green-check.png) | |
+| Azure Cache for Redis Enterprise | | | ![yes icon](./media/green-check.png) | |
+| Azure Cosmos DB - Cassandra | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
| Azure Cosmos - Gremlin | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Azure Cosmos DB for MongoDB | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Azure Cosmos Table | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Azure Cosmos - SQL | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | Blob Storage | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Confluent Cloud | | | ![yes icon](./media/green-check.png) | |
+| Confluent Cloud | | | ![yes icon](./media/green-check.png) | |
| Event Hubs | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Keyvault | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) |
-| MySQL single server | ![yes icon](./media/green-check.png) | | | |
-| MySQL flexible server | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | |
-| Postgres single server | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | |
-| Postgres, flexible server | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | |
+| Keyvault | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) |
+| MySQL single server | ![yes icon](./media/green-check.png) | | | |
+| MySQL flexible server | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | |
+| Postgres single server | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | |
+| Postgres, flexible server | ![yes icon](./media/green-check.png) | | ![yes icon](./media/green-check.png) | |
| Storage Queue | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
-| Storage File | | | ![yes icon](./media/green-check.png) | |
-| Storage Table | | | ![yes icon](./media/green-check.png) | |
+| Storage File | | | ![yes icon](./media/green-check.png) | |
+| Storage Table | | | ![yes icon](./media/green-check.png) | |
| Service Bus | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | SignalR | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | | WebPub Sub | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) | ![yes icon](./media/green-check.png) |
Service Connector offers the following authentication options:
When using a system-assigned managed identity, optionally review or update its authentication configuration by following these steps: 1. Select **Advanced** to display more options.
-1. Under **Role**, review the default role selected for your source service or choose another one from the list.
-1. Under **Configuration information**, Service Connector lists a series of configuration settings that will be generated when you create the connection. This list consists of environment variables or application properties. It varies depending on the target resource and authentication method selected. Optionally select the edit button in front of each configuration setting to edit its key.
-1. Select **Done** to confirm.
+2. Under **Role**, review the default role selected for your source service or choose another one from the list.
+3. Under **Configuration information**, Service Connector lists a series of configuration settings that will be generated when you create the connection. This list consists of environment variables or application properties. It varies depending on the target resource and authentication method selected. Optionally select the edit button in front of each configuration setting to edit its key.
+4. Select **Done** to confirm.
- :::image type="content" source="./media/manage-authentication/managed-identity-advanced.png" alt-text="Screenshot of the Azure portal, showing advanced authentication configuration for a system-assigned managed identity.":::
+ :::image type="content" source="./media/manage-authentication/managed-identity-advanced.png" alt-text="Screenshot of the Azure portal, showing advanced authentication configuration for a system-assigned managed identity.":::
## [User assigned managed identity](#tab/user-assigned-identity) When using a user-assigned managed identity, review or edit its authentication settings by following these steps:
-1. Under **Subscription**, select the Azure subscription that contains your user-assigned managed identity.
-1. Under **User assigned managed identity**, select the managed identity you want to use.
+1. Under **Subscription**, select the Azure subscription that contains your user-assigned managed identity.
+2. Under **User assigned managed identity**, select the managed identity you want to use.
- :::image type="content" source="./media/manage-authentication/user-assigned-identity-basic.png" alt-text="Screenshot of the Azure portal, showing basic authentication configuration for a user-assigned managed identity.":::
+ :::image type="content" source="./media/manage-authentication/user-assigned-identity-basic.png" alt-text="Screenshot of the Azure portal, showing basic authentication configuration for a user-assigned managed identity.":::
+3. Optionally select **Advanced** to display more options.
-1. Optionally select **Advanced** to display more options.
1. Under **Role**, review the default role selected for your source service or choose another one from the list.
- 1. Under **Configuration information**, Service Connector lists a series of configuration settings that will be generated when you create the connection. This list consists of environment variables or application properties and varies depending on the target resource and authentication method selected. Optionally select the edit button in front of each configuration setting to edit its key.
- 1. Select **Done** to confirm.
+ 2. Under **Configuration information**, Service Connector lists a series of configuration settings that will be generated when you create the connection. This list consists of environment variables or application properties and varies depending on the target resource and authentication method selected. Optionally select the edit button in front of each configuration setting to edit its key.
+ 3. Select **Done** to confirm.
- :::image type="content" source="./media/manage-authentication/user-assigned-identity-advanced.png" alt-text="Screenshot of the Azure portal, showing advanced authentication configuration for a user-assigned managed identity.":::
+ :::image type="content" source="./media/manage-authentication/user-assigned-identity-advanced.png" alt-text="Screenshot of the Azure portal, showing advanced authentication configuration for a user-assigned managed identity.":::
## [Connection string](#tab/connection-string)
When using a connection string, review or edit its authentication settings by following these steps:
1. Optionally select **Store Secret in Key Vault** to save your connection credentials in Azure Key Vault. This option lets you select an existing Key Vault connection from a drop-down list or create a new connection to a new or an existing Key Vault.
- :::image type="content" source="./media/manage-authentication/connection-string-basic-with-key-vault.png" alt-text="Screenshot of the Azure portal, showing basic authentication configuration to authenticate with a connection-string.":::
+ :::image type="content" source="./media/manage-authentication/connection-string-basic-with-key-vault.png" alt-text="Screenshot of the Azure portal, showing basic authentication configuration to authenticate with a connection-string.":::
+2. Optionally select **Advanced** to display more options.
-1. Optionally select **Advanced** to display more options.
1. Under **Configuration information**, Service Connector lists a series of configuration settings that will be generated when you create the connection. This list consists of environment variables or application properties and varies depending on the target resource and authentication method selected. Optionally select the edit button in front of each configuration setting to edit its key.
- 1. Select **Done** to confirm.
+ 2. Select **Done** to confirm.
- :::image type="content" source="./media/manage-authentication/connection-string-advanced.png" alt-text="Screenshot of the Azure portal, showing advanced authentication configuration to authenticate with a connection-string.":::
+ :::image type="content" source="./media/manage-authentication/connection-string-advanced.png" alt-text="Screenshot of the Azure portal, showing advanced authentication configuration to authenticate with a connection-string.":::
## [Service principal](#tab/service-principal)

When connecting Azure services using a service principal, review or edit authentication settings by following these steps:

1. Choose a service principal by entering an object ID or name and selecting your service principal.
-1. Under **Secret**, enter the secret of the service principal.
-1. Optionally select **Store Secret in Key Vault** to save your connection credentials in Azure Key Vault. This option lets you select an existing Key Vault connection from a drop-down list or create a new connection to a new or an existing Key Vault.
+2. Under **Secret**, enter the secret of the service principal.
+3. Optionally select **Store Secret in Key Vault** to save your connection credentials in Azure Key Vault. This option lets you select an existing Key Vault connection from a drop-down list or create a new connection to a new or an existing Key Vault.
- :::image type="content" source="./media/manage-authentication/service-principal-basic-with-key-vault.png" alt-text="Screenshot of the Azure portal, showing basic authentication configuration to authenticate with a service principal.":::
+ :::image type="content" source="./media/manage-authentication/service-principal-basic-with-key-vault.png" alt-text="Screenshot of the Azure portal, showing basic authentication configuration to authenticate with a service principal.":::
+4. Optionally select **Advanced** to display more options.
-1. Optionally select **Advanced** to display more options.
1. Under **Configuration information**, Service Connector lists a series of configuration settings that will be generated when you create the connection. This list consists of environment variables or application properties and varies depending on the target resource and authentication method selected. Optionally select the edit button in front of each configuration setting to edit its key.
- 1. Select **Done** to confirm.
+ 2. Select **Done** to confirm.
- :::image type="content" source="./media/manage-authentication/service-principal-advanced.png" alt-text="Screenshot of the Azure portal, showing advanced authentication configuration to authenticate with a service principal.":::
-
-1. Select **Review + Create** and then **Create** to finalize the creation of the connection.
+ :::image type="content" source="./media/manage-authentication/service-principal-advanced.png" alt-text="Screenshot of the Azure portal, showing advanced authentication configuration to authenticate with a service principal.":::
+5. Select **Review + Create** and then **Create** to finalize the creation of the connection.
You can review authentication configuration on the following pages in the Azure portal:
- When creating the connection, select the **Review + Create** tab and check the information listed under **Authentication**.
- :::image type="content" source="./media/manage-authentication/review-authentication.png" alt-text="Screenshot of the Azure portal, showing a summary of connection authentication configuration.":::
-
+ :::image type="content" source="./media/manage-authentication/review-authentication.png" alt-text="Screenshot of the Azure portal, showing a summary of connection authentication configuration.":::
- After you've created the connection, in the **Service connector** page, configuration keys are listed.
- :::image type="content" source="./media/manage-authentication/review-keys-after-creation.png" alt-text="Screenshot of the Azure portal, showing a summary of authentication configuration keys.":::
-
+ :::image type="content" source="./media/manage-authentication/review-keys-after-creation.png" alt-text="Screenshot of the Azure portal, showing a summary of authentication configuration keys.":::
## Next steps
service-connector How To Use Service Connector In Function https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/how-to-use-service-connector-in-function.md
+
+ Title: How Service Connector helps Azure Functions connect to services
+description: Learn how to use Service Connector to connect to services in Azure Functions.
+++ Last updated : 10/25/2023++
+# How Service Connector helps Azure Functions connect to services
+
+Azure Functions is one of the compute services supported by Service Connector. We recommend using bindings to connect Azure Functions with other services, although you can also use client SDKs. This article aims to help you understand:
+
+* The relationship between Service Connector and Functions bindings.
+* The process used by Service Connector to connect Functions to other Azure services using bindings or the SDK.
+* The respective responsibilities of Service Connector and the user in each scenario.
+
+## Prerequisites
+
+* This guide assumes that you already know the [basic concepts of Service Connector](concept-service-connector-internals.md).
+* This guide assumes you know the concepts presented in the [Azure Functions developer guide](../azure-functions/functions-reference.md) and [how to connect a function to Azure services](../azure-functions/add-bindings-existing-function.md).
+
+## Service Connector and Azure Functions bindings
+
+### Bindings in Azure Functions
+
+A binding is an Azure Functions concept that provides a simple way to connect functions to services without working with client SDKs in your function code.
+
+Bindings can be inputs, outputs, or triggers. Bindings let you configure the connection to services so that the Functions host can handle the data access for you. For more information, see [Azure Functions triggers and bindings concepts](../azure-functions/functions-triggers-bindings.md).
+
+Function bindings support both secret/connection string and identity-based authentication types.
+
+### Service Connector
+
+Service Connector is an Azure service that helps developers easily connect compute services to target backing services. Azure Functions is one of the [compute services supported by Service Connector](./overview.md#what-services-are-supported-by-service-connector).
+
+Compared to a function binding, which is a logical abstraction, Service Connector is an Azure service that you can directly operate on. It provides APIs for the whole lifecycle of a connection, such as create, delete, validate, and list configurations.
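
These lifecycle operations map onto the `az functionapp connection` command group. A hedged sketch (resource names are placeholders; verify the subcommands against your Azure CLI version):

```azurecli
# List connections on a function app.
az functionapp connection list -g "<function-rg>" -n "<function-name>" --output table

# Check the health of a connection.
az functionapp connection validate -g "<function-rg>" -n "<function-name>" --connection "<connection-name>"

# Show the configuration settings generated for a connection.
az functionapp connection list-configuration -g "<function-rg>" -n "<function-name>" --connection "<connection-name>"

# Delete a connection.
az functionapp connection delete -g "<function-rg>" -n "<function-name>" --connection "<connection-name>"
```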
+
+Service Connector also supports both secret/connection string and identity-based authentication types.
+
+### Connection in an Azure Functions binding
+
+In Functions bindings, `connection` is a property defined in a binding file (usually the *function.json* file) in your function folder. It defines the app setting name, or prefix, that the Functions runtime uses to authenticate to the target service.
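
For instance, a blob input binding's *function.json* entry might look like the following sketch (binding name, path, and connection name are illustrative):

```json
{
  "bindings": [
    {
      "name": "inputBlob",
      "type": "blob",
      "direction": "in",
      "path": "testcontainer/test.txt",
      "connection": "MyStorageConn"
    }
  ]
}
```

Here the runtime looks up an app setting named `MyStorageConn` (or settings prefixed with `MyStorageConn__`, for identity-based connections) to authenticate to the storage account.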
+
+### Connection in Service Connector
+
+A `connection` in Service Connector refers to a specific Azure resource that belongs to Service Connector.
+
+The `connection` used by Azure Functions bindings corresponds to the `configuration name` used by Service Connector. The configuration name refers to the app setting key names that Service Connector saves into the compute service's configuration.
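
The app-setting lookup that a `connection` value implies can be sketched in plain Python. This is an illustration only, not the Functions host implementation, and the setting names are hypothetical:

```python
# Illustrative sketch: how a binding's "connection" value resolves against
# app settings. An exact key match is treated as a secret/connection string;
# keys sharing the "<name>__" prefix (for example "<name>__serviceUri")
# configure an identity-based connection.
def resolve_connection(app_settings: dict, connection: str):
    if connection in app_settings:
        return {"type": "connectionString", "value": app_settings[connection]}
    prefixed = {k: v for k, v in app_settings.items()
                if k.startswith(connection + "__")}
    if prefixed:
        return {"type": "identity", "settings": prefixed}
    return None

settings = {
    "MyBlobConn": "DefaultEndpointsProtocol=https;AccountName=...;",
    "MyIdConn__serviceUri": "https://myaccount.blob.core.windows.net",
}
assert resolve_connection(settings, "MyBlobConn")["type"] == "connectionString"
assert resolve_connection(settings, "MyIdConn")["type"] == "identity"
```

When Service Connector writes the app setting, the key it chooses must be the same name the binding's `connection` property points at, which is why the two concepts correspond.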
+
+## Connecting Azure Functions to other cloud services using Service Connector
+
+Service Connector reduces the effort needed to connect Azure Functions to cloud services using bindings or SDKs. It takes over cloud resource configuration such as app settings, networking, identity, and permission assignment, so that you can focus on your function's business logic. The following sections describe how Service Connector simplifies function connections for different connection mechanisms and authentication methods.
+
+### Binding
+
+* Secret/connection string
+
+| Scenario | Operation | Description | Without Service Connector | With Service Connector |
+| --- | --- | --- | --- | --- |
+| Local project | Add binding | Add a binding in a function according to the target service type and binding type (in/out/trigger). | User | User |
+| | Consume binding | Set a connection string for authentication in `local.settings.json`, and change the function code to consume the variable defined in the binding. | User | User |
+| Cloud resource | Configure app settings | Configure connection string as an app setting in function resource's configurations. | User | Service Connector |
+| | Configure network | Make sure the target service's network configuration allows access from the function resource. | User | Service Connector |
+
+* Identity based authentication
+
+| Scenario | Operation | Description | Without Service Connector | With Service Connector |
+| -- | - | -- | - | - |
+| Local project | Add binding | Add a binding in a function according to the target service type and binding type (in/out/trigger). | User | User |
+| | Consume binding | Set a connection string for authentication in `local.settings.json`, and change the function code to consume the variable defined in the binding. | User | User |
+| Cloud resource | Configure app settings | Configure the function app's settings for the identity-based connection, such as service endpoints. | User | Service Connector |
+| | Configure network | Make sure the target service's network configuration allows access from the function resource. | User | Service Connector |
+| | Configure identity | Make sure system identity is enabled when using system identity to authenticate. | User | Service Connector |
+| | Permission assignment | Assign the necessary roles to the identity so that it can access the target service. | User | Service Connector |
+
+When using Service Connector with function bindings, pay special attention to the function's key name configured by Service Connector. Make sure it's the same key name as the one defined in the `connection` property in the binding file. If it's different, change the name in the binding file or use Service Connector's `customize keys` feature to customize [Service Connector's default configuration names](./how-to-integrate-storage-blob.md).
+
+### SDK
+
+* Secret/connection string
+
+| Scenario | Operation | Description | Without Service Connector | With Service Connector |
+| -- | - | - | - | - |
+| Local project | Add dependency | Add dependency package according to the target service and your runtime. | User | User |
+| | Initialize SDK client | Set a connection string for authentication in `local.settings.json`. Initialize the target service SDK using the connection string. | User | User |
+| Cloud resource | Configure app settings | Configure a connection string as an app setting in the function's configuration. | User | Service Connector |
+| | Configure network | Make sure the target service's network configuration allows access from the function resource. | User | Service Connector |
+
+* Identity based authentication
+
+| Scenario | Operation | Description | Without Service Connector | With Service Connector |
+| -- | - | - | - | - |
+| Local project | Add dependency | Add dependency package according to the target service and your runtime. | User | User |
+| | Initialize SDK client | Set a connection string for authentication in `local.settings.json`. Initialize the target service SDK using the connection string. | User | User |
+| Cloud resource | Configure app settings | Configure a connection string as an app setting in the function's configuration. | User | Service Connector |
+| | Configure network | Make sure the target service's network configuration allows access from the function resource. | User | Service Connector |
+| | Configure identity | Make sure system identity is enabled when using system identity to authenticate. | User | Service Connector |
+| | Permission assignment | Assign the necessary roles to the identity so that it can access the target service. | User | Service Connector |
+
+## Next steps
+
+Learn how to integrate different target services and read about their configuration settings and authentication methods.
+
+> [!div class="nextstepaction"]
+> [Learn about how to integrate storage blob](./how-to-integrate-storage-blob.md)
service-connector Quickstart Cli App Service Connection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/quickstart-cli-app-service-connection.md
Last updated 04/13/2023
ms.devlang: azurecli

# Quickstart: Create a service connection in App Service with the Azure CLI
-The [Azure CLI](/cli/azure) is a set of commands used to create and manage Azure resources. The Azure CLI is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation. This quickstart shows you the options to create Azure Web PubSub instance with the Azure CLI.
+The [Azure CLI](/cli/azure) is a set of commands used to create and manage Azure resources. The Azure CLI is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation. This quickstart shows you the options to create a service connection with the Azure CLI.
[!INCLUDE [quickstarts-free-trial-note](../../includes/quickstarts-free-trial-note.md)] [!INCLUDE [azure-cli-prepare-your-environment.md](~/articles/reusable-content/azure-cli/azure-cli-prepare-your-environment.md)]

- This quickstart requires version 2.30.0 or higher of the Azure CLI. If using Azure Cloud Shell, the latest version is already installed.
- This quickstart assumes that you already have at least an App Service running on Azure. If you don't have an App Service, [create one](../app-service/quickstart-dotnetcore.md).

## Initial set-up

1. If you're using Service Connector for the first time, start by running the command [az provider register](/cli/azure/provider#az-provider-register) to register the Service Connector resource provider.
- ```azurecli
- az provider register -n Microsoft.ServiceLinker
- ```
-
- > [!TIP]
- > You can check if the resource provider has already been registered by running the command `az provider show -n "Microsoft.ServiceLinker" --query registrationState`. If the output is `Registered`, then Service Connector has already been registered.
+ ```azurecli
+ az provider register -n Microsoft.ServiceLinker
+ ```
+ > [!TIP]
+ > You can check if the resource provider has already been registered by running the command `az provider show -n "Microsoft.ServiceLinker" --query registrationState`. If the output is `Registered`, then Service Connector has already been registered.
+ >
+2. Optionally, use the Azure CLI [az webapp connection list-support-types](/cli/azure/webapp/connection#az-webapp-connection-list-support-types) command to get a list of supported target services for App Service.
-1. Optionally, use the Azure CLI [az webapp connection list-support-types](/cli/azure/webapp/connection#az-webapp-connection-list-support-types) command to get a list of supported target services for App Service.
+ ```azurecli
+ az webapp connection list-support-types --output table
+ ```
- ```azurecli
- az webapp connection list-support-types --output table
- ```
-
## Create a service connection

#### [Using an access key](#tab/Using-access-key)
service-connector Quickstart Cli Functions Connection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/quickstart-cli-functions-connection.md
+
+ Title: Quickstart - Create a service connection in Azure Functions with the Azure CLI
+description: Quickstart showing how to create a service connection in Azure Functions with the Azure CLI
++++ Last updated : 10/25/2023
+ms.devlang: azurecli
++
+# Quickstart: Create a service connection in Azure Functions with the Azure CLI
+
+This quickstart shows you how to connect Azure Functions to other cloud resources using the Azure CLI and Service Connector. Service Connector lets you quickly connect compute services to cloud services, while managing your connection's authentication and networking settings.
+++
+- This quickstart requires version 2.30.0 or higher of the Azure CLI. If using Azure Cloud Shell, the latest version is already installed.
+- This quickstart assumes that you already have an Azure Function. If you don't have one yet, [create an Azure Function](../azure-functions/create-first-function-cli-python.md).
+- This quickstart assumes that you already have an Azure Storage account. If you don't have one yet, [create an Azure Storage account](../storage/common/storage-account-create.md).
+
+## Initial set-up
+
+1. If you're using Service Connector for the first time, start by running the command [az provider register](/cli/azure/provider#az-provider-register) to register the Service Connector resource provider.
+
+ ```azurecli
+ az provider register -n Microsoft.ServiceLinker
+ ```
+
+ > [!TIP]
+ > You can check if the resource provider has already been registered by running the command `az provider show -n "Microsoft.ServiceLinker" --query registrationState`. If the output is `Registered`, then Service Connector has already been registered.
+ >
+2. Optionally, use the Azure CLI [az functionapp connection list-support-types](/cli/azure/functionapp/connection#az-functionapp-connection-list-support-types) command to get a list of supported target services for Function App.
+
+ ```azurecli
+ az functionapp connection list-support-types --output table
+ ```
+
+## Create a service connection
+
+#### [Using an access key](#tab/Using-access-key)
+
+Use the Azure CLI [az functionapp connection create](/cli/azure/functionapp/connection/create) command to create a service connection to Azure Blob Storage with an access key, providing the following information:
+
+- **Source compute service resource group name:** the resource group name of the Function App.
+- **Function App name:** the name of your Function App that connects to the target service.
+- **Target service resource group name:** the resource group name of the Blob Storage.
+- **Storage account name:** the account name of your Blob Storage.
+
+```azurecli
+az functionapp connection create storage-blob --secret
+```
+
+> [!NOTE]
+> If you don't have a Blob Storage, you can run `az functionapp connection create storage-blob --new --secret` to provision a new one and directly get connected to your function app.
+
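The parameters above can also be passed explicitly as flags instead of answering interactive prompts. A sketch (flag names follow the `az functionapp connection create storage-blob` reference; verify against your CLI version):

```azurecli
az functionapp connection create storage-blob \
    -g "<function-app-resource-group>" -n "<function-app-name>" \
    --tg "<storage-account-resource-group>" --account "<storage-account-name>" \
    --secret
```
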
+#### [Using a managed identity](#tab/Using-Managed-Identity)
+
+> [!IMPORTANT]
+> Using a managed identity requires you to have permission to perform [Azure AD role assignments](../active-directory/managed-identities-azure-resources/howto-assign-access-portal.md). If you don't have this permission, your connection creation will fail. You can ask your subscription owner to grant you the permission, or use an access key instead to create the connection.
+
+Use the Azure CLI [az functionapp connection](/cli/azure/functionapp/connection) command to create a service connection to Azure Blob Storage with a system-assigned managed identity, providing the following information:
+
+- **Source compute service resource group name:** the resource group name of the Function App.
+- **Function App name:** the name of your Function App that connects to the target service.
+- **Target service resource group name:** the resource group name of the Blob Storage.
+- **Storage account name:** the account name of your Blob Storage.
+
+```azurecli
+az functionapp connection create storage-blob --system-identity
+```
+
+> [!NOTE]
+> If you don't have a Blob Storage, you can run `az functionapp connection create storage-blob --new --system-identity` to provision a new one and directly get connected to your function app.
+++
+## View connections
+
+Use the Azure CLI [az functionapp connection list](/cli/azure/functionapp/connection#az-functionapp-connection-list) command to list connections to your Function App, providing the following information:
+
+- **Source compute service resource group name:** the resource group name of the Function App.
+- **Function App name:** the name of your Function App that connects to the target service.
+
+```azurecli
+az functionapp connection list -g "<your-function-app-resource-group>" -n "<your-function-app-name>" --output table
+```
+
+## Next steps
+
+Follow the tutorials below to start building your own function application with Service Connector.
+
+> [!div class="nextstepaction"]
+> [Tutorial: Python function with Azure Queue Storage as trigger](./tutorial-python-functions-storage-queue-as-trigger.md)
+
+> [!div class="nextstepaction"]
+> [Tutorial: Python function with Azure Blob Storage as input](./tutorial-python-functions-storage-blob-as-input.md)
+
+> [!div class="nextstepaction"]
+> [Tutorial: Python function with Azure Table Storage as output](./tutorial-python-functions-storage-table-as-output.md)
service-connector Quickstart Portal Functions Connection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/quickstart-portal-functions-connection.md
+
+ Title: Quickstart - Create a service connection in a function app from the Azure portal
+description: Quickstart showing how to create a service connection in a function app from the Azure portal
++++ Last updated : 10/25/2023+
+# Quickstart: Create a service connection in a function app from the Azure portal
+
+Get started with Service Connector by using the Azure portal to create a new service connection for Azure Functions in a function app.
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free).
+- A Function App in a [region supported by Service Connector](./concept-region-support.md). If you don't have one yet, [create one](../azure-functions/create-first-function-cli-python.md).
+
+## Sign in to Azure
+
+Sign in to the Azure portal at [https://portal.azure.com/](https://portal.azure.com/) with your Azure account.
+
+## Create a new service connection in Function App
+
+1. To create a new service connection in Function App, select the **Search resources, services and docs (G +/)** search bar at the top of the Azure portal, type ***Function App***, and select **Function App**.
+
+ :::image type="content" source="./media/function-app-quickstart/select-function-app.png" alt-text="Screenshot of the Azure portal, selecting Function App.":::
+2. Select the Function App resource you want to connect to a target resource.
+3. Select **Service Connector** from the left table of contents. Then select **Create**.
+
+ :::image type="content" source="./media/function-app-quickstart/select-service-connector.png" alt-text="Screenshot of the Azure portal, selecting Service Connector and creating new connection.":::
+4. Select or enter the following settings.
+
+ | Setting | Example | Description |
+ | --- | --- | --- |
+ | **Service type** | Storage - Blob | The target service type. If you don't have an Azure Blob Storage resource, you can [create one](../storage/blobs/storage-quickstart-blobs-portal.md) or use another service type. |
+ | **Subscription** | My subscription | The subscription for your target service (the service you want to connect to). The default value is the subscription for this Function App resource. |
+ | **Connection name** | *my_connection* | The connection name that identifies the connection between your Function App and target service. Use the connection name provided by Service Connector or choose your own connection name. |
+ | **Storage account** | *my_storage_account* | The target storage account you want to connect to. Target service instances to choose from vary according to the selected service type. |
+ | **Client type** | The same app stack on this Function App | The default value comes from the Function App runtime stack. Select the app stack that's on this Function App instance. |
+5. Select **Next: Authentication** to choose an authentication method.
+
+ ### [System-assigned managed identity](#tab/SMI)
+
+ System-assigned managed identity is the recommended authentication option. Select **System-assigned managed identity** to connect through an identity that's generated in Azure Active Directory and tied to the lifecycle of the service instance.
+
+ ### [User-assigned managed identity](#tab/UMI)
+
+ Select **User-assigned managed identity** to authenticate through a standalone identity assigned to one or more instances of an Azure service.
+
+ ### [Connection string](#tab/CS)
+
+ Select **Connection string** to generate or configure one or multiple key-value pairs with pure secrets or tokens.
+
+ ### [Service principal](#tab/SP)
+
+ Select **Service principal** to use a service principal that defines the access policy and permissions for the user/application in Azure Active Directory.
+6. Select **Next: Networking** to configure the network access to your target service and select **Configure firewall rules to enable access to your target service**.
+7. Select **Next: Review + Create** to review the provided information. Then select **Create** to create the service connection. This operation may take a minute to complete.
+
+## View service connections in Function App
+
+1. The **Service Connector** tab displays existing function app connections.
+2. Select **Validate** to check your connection. You can see the connection validation details in the panel on the right.
+
+ :::image type="content" source="./media/function-app-quickstart/list-and-validate.png" alt-text="Screenshot of the Azure portal, listing and validating the connection.":::
+
+## Next steps
+
+Follow the tutorials to start building your own function application with Service Connector.
+
+> [!div class="nextstepaction"]
+> [Tutorial: Python function with Azure Queue Storage as trigger](./tutorial-python-functions-storage-queue-as-trigger.md)
+
+> [!div class="nextstepaction"]
+> [Tutorial: Python function with Azure Blob Storage as input](./tutorial-python-functions-storage-blob-as-input.md)
+
+> [!div class="nextstepaction"]
+> [Tutorial: Python function with Azure Table Storage as output](./tutorial-python-functions-storage-table-as-output.md)
service-connector Tutorial Python Functions Storage Blob As Input https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-python-functions-storage-blob-as-input.md
+
+ Title: 'Tutorial: Python function with Azure Blob Storage as input'
+description: Learn how you can connect a Python function to a storage blob as input using Service Connector
++++ Last updated : 10/25/2023+
+# Tutorial: Python function with Azure Blob Storage as input
+
+In this tutorial, you learn how to configure a Python function with Storage Blob as input by completing the following tasks:
+
+> [!div class="checklist"]
+> * Use Visual Studio Code to create a Python function project.
+> * Change the code to add a storage blob input binding.
+> * Use Visual Studio Code to run the function locally.
+> * Use the Azure CLI to create a connection between the Azure function and the storage blob with Service Connector.
+> * Use Visual Studio Code to deploy your function.
+
+An overview of the function project components in this tutorial:
+
+| Project Component | Selection / Solution |
+| --- | --- |
+| Source Service | Azure Function |
+| Target Service | Azure Storage Blob |
+| Function Binding | HTTP trigger, Storage Blob as Input |
+| Local Project Auth Type | Connection String |
+| Cloud Function Auth Type | System-Assigned Managed Identity |
+
+## Prerequisites
+
+- Install [Visual Studio Code](https://code.visualstudio.com) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
+- Azure CLI. You can use it in [Azure Cloud Shell](https://shell.azure.com/) or [install it locally](/cli/azure/install-azure-cli).
+- An Azure Storage Account and a Storage blob. If you don't have an Azure Storage account, [create one](../storage/common/storage-account-create.md).
+- This guide assumes you know the concepts presented in the [Functions developer guide](../azure-functions/functions-reference.md) and [how to connect to services in Functions](../azure-functions/add-bindings-existing-function.md).
+
+## Create a Python function project
+
+Follow the [tutorial to create a local Azure Functions project](../azure-functions/create-first-function-vs-code-python.md?pivots=python-mode-configuration#create-an-azure-functions-project), and provide the following information at the prompts:
+
+| Prompt | Selection |
+| --- | --- |
+| **Select a language** | Choose `Python`. (v1 programming model) |
+| **Select a Python interpreter to create a virtual environment** | Choose your preferred Python interpreter. If an option isn't shown, type in the full path to your Python binary. |
+| **Select a template for your project's first function** | Choose `HTTP trigger`. |
+| **Provide a function name** | Enter `BlobStorageInputFunc`. |
+| **Authorization level** | Choose `Anonymous`, which lets anyone call your function endpoint.  |
+
+You have created a Python function project with an HTTP trigger.
+
+## Add a Blob Storage input binding
+
+Binding attributes are defined in the *function.json* file for a given function. To create a binding, right-click (Ctrl+click on macOS) the `function.json` file in your function folder and choose **Add binding...**. Follow the prompts to define the following binding properties for the new binding:
+
+| Prompt | Value | Description |
+| --- | --- | --- |
+| **Select binding direction** | `in` | The binding is an input binding. |
+| **Select binding with direction...** | `Azure Blob Storage` | The binding is an Azure Storage blob binding. |
+| **The name used to identify this binding in your code** | `inputBlob` | Name that identifies the binding parameter referenced in your code. |
+| **The path within your storage account from which the blob will be read** | `testcontainer/test.txt` | The blob path your function reads as input. Prepare a file named `test.txt` with `Hello, World!` as its content, create a container named `testcontainer`, and upload the file to the container. |
+| **Select setting from "local.settings.json"** | `Create new local app settings` | Select the storage account your function reads from as input. Visual Studio Code retrieves its connection string for the local project connection. |
+
+To check that the binding was added successfully:
+
+1. Open the `BlobStorageInputFunc/function.json` file and check that a new binding with `type: blob` and `direction: in` was added to this file.
+1. Open the `local.settings.json` file and check that a new key-value pair `<your-storage-account-name>_STORAGE: <your-storage-account-connection-string>` containing your storage account connection string was added to this file.
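For reference, the resulting binding entry in `function.json` looks similar to the following sketch. The exact setting name (shown here as `<your-storage-account-name>_STORAGE`) depends on your storage account:

```json
{
  "name": "inputBlob",
  "type": "blob",
  "direction": "in",
  "path": "testcontainer/test.txt",
  "connection": "<your-storage-account-name>_STORAGE"
}
```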
+
+After the binding is added, update your function code to consume the binding by replacing the contents of `BlobStorageInputFunc/__init__.py` with the following Python code.
+
+```python
+import logging
+import azure.functions as func
+
+def main(req: func.HttpRequest, inputBlob: bytes) -> func.HttpResponse:
+    logging.info('Python HTTP trigger function processed a request.')
+    return func.HttpResponse('The triggered function executed successfully. And read blob content: {}'.format(inputBlob))
+```
+
+## Run the function locally
+
+Follow the [tutorial](../azure-functions/create-first-function-vs-code-python.md?pivots=python-mode-configuration#run-the-function-locally) to run the function locally and verify the blob input.
+
+1. If you're prompted to connect to storage, select the storage account you used when creating the Azure Function resource. This value is used internally by the Azure Functions runtime and isn't necessarily the same as the storage account you use for input.
+1. To start the function locally, press <kbd>F5</kbd> or select the **Run and Debug** icon in the left-hand side Activity bar.
+1. To verify the function can read the blob, right-click the function in the Visual Studio Code **WORKSPACE** view, select `Execute Function Now...`, and check the function response. The response message should contain the content of your blob file.
+
+## Create a connection using Service Connector
+
+You just ran the project and verified the function locally, and your local project connects to your storage blob using a connection string.
+
+Now you'll learn how to configure the connection between the Azure Function and Azure Blob Storage, so that your function can read the blob after being deployed to the cloud. In the cloud environment, we demonstrate how to authenticate using a system-assigned managed identity.
+
+1. Open the `function.json` file in your local project, change the value of the `connection` property in `bindings` to be `MyBlobInputConnection`.
+1. Run the following Azure CLI command to create a connection between your Azure Function and your Azure Storage.
+
+```azurecli
+az functionapp connection create storage-blob --source-id "<your-function-resource-id>" --target-id "<your-storage-blob-resource-id>" --system-identity --customized-keys AZURE_STORAGEBLOB_RESOURCEENDPOINT=MyBlobInputConnection__serviceUri
+```
+
+* `--source-id` format: `/subscriptions/{subscription}/resourceGroups/{source_resource_group}/providers/Microsoft.Web/sites/{site}`
+* `--target-id` format: `/subscriptions/{subscription}/resourceGroups/{target_resource_group}/providers/Microsoft.Storage/storageAccounts/{account}/blobServices/default`
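If it helps to see these ID formats concretely, here's a minimal Python sketch that assembles both values from placeholder components (all names below are hypothetical examples, not values from your subscription):

```python
# Hypothetical placeholder values for illustration only.
subscription = "00000000-0000-0000-0000-000000000000"
source_rg, site = "my-function-rg", "my-function-app"
target_rg, account = "my-storage-rg", "mystorageaccount"

# --source-id: the Azure resource ID of the function app.
source_id = (
    f"/subscriptions/{subscription}/resourceGroups/{source_rg}"
    f"/providers/Microsoft.Web/sites/{site}"
)

# --target-id: the blob service resource ID, which ends in /blobServices/default.
target_id = (
    f"/subscriptions/{subscription}/resourceGroups/{target_rg}"
    f"/providers/Microsoft.Storage/storageAccounts/{account}/blobServices/default"
)

print(source_id)
print(target_id)
```

In practice, you can also look up the real IDs with `az functionapp show --query id` and `az storage account show --query id`, appending `/blobServices/default` to the storage account ID.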
+
+You have created a connection between Azure Function and Azure Blob Storage using Service Connector, with a system-assigned managed identity.
+
+Service Connector configured a `MyBlobInputConnection__serviceUri` variable in the function's app settings. The function binding runtime uses it to connect to the storage, so that the function can read data from the blob storage. You can learn more about [how Service Connector helps Azure Functions connect to services](./how-to-use-service-connector-in-function.md).
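As a sketch of the result, the function app then carries an app setting shaped like the following (the endpoint shown uses the standard blob endpoint format; substitute your own account name):

```json
{
  "MyBlobInputConnection__serviceUri": "https://<your-storage-account-name>.blob.core.windows.net"
}
```

The `__serviceUri` suffix tells the Functions binding runtime to use an identity-based connection instead of a connection string.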
+
+## Deploy your function to Azure
+
+Now you can deploy your function to Azure and verify the storage blob input binding works.
+
+1. Follow the [tutorial](../azure-functions/create-first-function-vs-code-python.md?pivots=python-mode-configuration#deploy-the-project-to-azure) to deploy your function to Azure.
+1. To verify the function can read the blob, right-click the function in the Visual Studio Code **RESOURCES** view, select `Execute Function Now...`, and check the function response. The response message should contain the content of your blob file.
+
+## Troubleshoot
+
+If there are any errors related to the storage host, such as `No such host is known (<account-name>.blob.core.windows.net:443)`, check whether the connection string you use to connect to Azure Storage contains the blob endpoint. If it doesn't, go to Azure Storage in the Azure portal, copy the connection string from the `Access keys` blade, and replace the value.
+
+If the error happens when you start the project locally, check the `local.settings.json` file.
+
+If the error happens when you deploy your function to the cloud (in this case, function deployment usually fails on `Syncing triggers`), check your function's App Settings.
+
+## Clean up resources
+
+If you're not going to continue to use this project, delete the Function App resource you created earlier.
+
+### [Portal](#tab/azure-portal)
+
+1. In the Azure portal, open the Function App resource and select **Delete**.
+1. Enter the app name and select **Delete** to confirm.
+
+### [Azure CLI](#tab/azure-cli)
+
+Run the following command in the Azure CLI and replace all placeholders with your own information.
+
+```azurecli
+az functionapp delete --name <function-name> --resource-group <resource-group>
+```
+++
+## Next steps
+
+Read the articles below to learn more about Service Connector concepts and how it helps Azure Functions connect to other cloud services.
+
+> [!div class="nextstepaction"]
+> [Learn about Service Connector concepts](./concept-service-connector-internals.md)
+
+> [!div class="nextstepaction"]
+> [Use Service Connector to connect Azure Functions to other services](./how-to-use-service-connector-in-function.md)
service-connector Tutorial Python Functions Storage Queue As Trigger https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-python-functions-storage-queue-as-trigger.md
+
+ Title: 'Tutorial: Python function with Azure Queue Storage as trigger'
+description: Learn how you can connect a Python function to a storage queue as trigger using Service Connector
+Last updated: 10/25/2023
+# Tutorial: Python function with Azure Queue Storage as trigger
+
+In this tutorial, you learn how to configure a Python function with Storage Queue as trigger by completing the following tasks.
+
+> [!div class="checklist"]
+> * Use Visual Studio Code to create a Python function project.
+> * Use Visual Studio Code to run the function locally.
+> * Use the Azure CLI to create a connection between Azure Function and Storage Queue with Service Connector.
+> * Use Visual Studio to deploy your function.
+
+An overview of the function project components in this tutorial:
+
+| Project Component | Selection / Solution |
+| --- | --- |
+| Source Service | Azure Function |
+| Target Service | Azure Storage Queue |
+| Function Binding | Storage Queue as Trigger |
+| Local Project Auth Type | Connection String |
+| Cloud Function Auth Type | Connection String |
+
+## Prerequisites
+
+- Install [Visual Studio Code](https://code.visualstudio.com) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
+- The Azure CLI. You can use it in [Azure Cloud Shell](https://shell.azure.com/) or [install it locally](/cli/azure/install-azure-cli).
+- An Azure Storage account and a storage queue. If you don't have an Azure Storage account, [create one](../storage/common/storage-account-create.md).
+- This guide assumes you know the basic concepts presented in the [Azure Functions developer guide](../azure-functions/functions-reference.md) and [how to connect to services in Functions](../azure-functions/add-bindings-existing-function.md).
+
+## Create a Python function project
+
+Follow the [tutorial to create a local Azure Functions project](../azure-functions/create-first-function-vs-code-python.md?pivots=python-mode-configuration#create-an-azure-functions-project), and provide the following information at the prompts:
+
+| Prompt | Selection |
+| --- | --- |
+| **Select a language** | Choose `Python`. (v1 programming language model) |
+| **Select a Python interpreter to create a virtual environment** | Choose your preferred Python interpreter. If an option isn't shown, type in the full path to your Python binary. |
+| **Select a template for your project's first function** | Choose `Azure Queue Storage trigger`. |
+| **Provide a function name** | Enter `QueueStorageTriggerFunc`. |
+| **Select setting from "local.settings.json"** | Choose `Create new local app settings`, which lets you select your Storage Account and provide your queue name that works as the trigger. |
+
+You have created a Python function project with Azure Storage Queue as the trigger. The local project connects to Azure Storage using the connection string saved in the `local.settings.json` file. Finally, the `main` function in the function's `__init__.py` file consumes the connection through the function binding defined in the `function.json` file.
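To sketch how these pieces fit together, the generated trigger binding in `function.json` typically looks similar to the following (the parameter name, queue name, and setting name depend on what you entered at the prompts):

```json
{
  "name": "msg",
  "type": "queueTrigger",
  "direction": "in",
  "queueName": "<your-queue-name>",
  "connection": "<your-storage-account-name>_STORAGE"
}
```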
+
+## Run the function locally
+
+Follow the [tutorial](../azure-functions/create-first-function-vs-code-python.md?pivots=python-mode-configuration#run-the-function-locally) to run the function locally and verify the trigger.
+
+1. If you're prompted to connect to storage, select the storage account you chose when creating the Azure Function resource. This value is used by the Azure Functions runtime and isn't necessarily the same as the storage account you use for the trigger.
+1. To start the function locally, press <kbd>F5</kbd> or select the **Run and Debug** icon in the left-hand side Activity bar.
+1. To verify the trigger works properly, keep the function running locally, open the Storage Queue blade in the Azure portal, select **Add message**, and provide a test message. You should see in your Visual Studio Code terminal that the function is triggered and the queue item is processed.
+
+## Create a connection using Service Connector
+
+In the last step, you verified the function project locally. Now you'll learn how to configure the connection between the Azure Function and Azure Storage Queue in the cloud, so that your function can be triggered by the storage queue after it's deployed to the cloud.
+
+1. Open the `function.json` file in your local project, change the value of the `connection` property in `bindings` to be `AZURE_STORAGEQUEUE_CONNECTIONSTRING`.
+1. Run the following Azure CLI command to create a connection between your Azure Function and your Azure storage account.
+
+```azurecli
+az functionapp connection create storage-queue --source-id "<your-function-resource-id>" --target-id "<your-storage-queue-resource-id>" --secret
+```
+
+* `--source-id` format: `/subscriptions/{subscription}/resourceGroups/{source_resource_group}/providers/Microsoft.Web/sites/{site}`
+* `--target-id` format: `/subscriptions/{subscription}/resourceGroups/{target_resource_group}/providers/Microsoft.Storage/storageAccounts/{account}/queueServices/default`
+
+This step creates a Service Connector resource that configures an `AZURE_STORAGEQUEUE_CONNECTIONSTRING` variable in the function's App Settings. The function binding runtime uses it to connect to the storage, so that the function can accept triggers from the storage queue. For more information, go to [how Service Connector helps Azure Functions connect to services](./how-to-use-service-connector-in-function.md).
+
+## Deploy your function to Azure
+
+Now you can deploy your function to Azure and verify the storage queue trigger works.
+
+1. Follow this [Azure Functions tutorial](../azure-functions/create-first-function-vs-code-python.md?pivots=python-mode-configuration#deploy-the-project-to-azure) to deploy your function to Azure.
+1. Open the Storage Queue blade in the Azure portal, select **Add message**, and provide a test message. You should see in your function logs that the function is triggered and the queue item is processed.
+
+## Troubleshoot
+
+If there are any errors related to the storage host, such as `No such host is known (<account-name>.queue.core.windows.net:443)`, check whether the connection string you use to connect to Azure Storage contains the queue endpoint. If it doesn't, go to Azure Storage in the Azure portal, copy the connection string from the `Access keys` blade, and replace the value.
+
+If this error happens when you start the project locally, check the `local.settings.json` file.
+
+If this error happens when you deploy your function to the cloud (in this case, function deployment usually fails on `Syncing triggers`), check your function's App Settings.
+
+## Clean up resources
+
+If you're not going to continue to use this project, delete the Function App resource you created earlier.
+
+### [Portal](#tab/azure-portal)
+
+1. In the Azure portal, open the Function App resource and select **Delete**.
+1. Enter the app name and select **Delete** to confirm.
+
+### [Azure CLI](#tab/azure-cli)
+
+Run the following command in the Azure CLI and replace all placeholders with your own information.
+
+```azurecli
+az functionapp delete --name <function-name> --resource-group <resource-group>
+```
+++
+## Next steps
+
+Read the articles below to learn more about Service Connector concepts and how it helps Azure Functions connect to services.
+
+> [!div class="nextstepaction"]
+> [Learn about Service Connector concepts](./concept-service-connector-internals.md)
+
+> [!div class="nextstepaction"]
+> [Use Service Connector to connect Azure Functions to other cloud services](./how-to-use-service-connector-in-function.md)
service-connector Tutorial Python Functions Storage Table As Output https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-connector/tutorial-python-functions-storage-table-as-output.md
+
+ Title: 'Tutorial: Python function with Azure Table Storage as output'
+description: Learn how you can connect a Python function to a storage table as output using Service Connector
+Last updated: 11/14/2023
+# Tutorial: Python function with Azure Table Storage as output
+
+In this tutorial, you learn how to configure a Python function with Storage Table as output by completing the following tasks.
+
+> [!div class="checklist"]
+> * Use Visual Studio Code to create a Python function project.
+> * Add a Storage Table output function binding.
+> * Use Visual Studio Code to run the function locally.
+> * Use the Azure CLI to create a connection between Azure Function and Storage Table with Service Connector.
+> * Use Visual Studio to deploy your function.
+
+An overview of the function project components in this tutorial:
+
+| Project Component | Selection / Solution |
+| --- | --- |
+| Source Service | Azure Function |
+| Target Service | Azure Storage Table |
+| Function Binding | HTTP trigger, Storage Table as Output |
+| Local Project Auth Type | Connection String |
+| Cloud Function Auth Type | Connection String |
+
+## Prerequisites
+
+- Install [Visual Studio Code](https://code.visualstudio.com) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
+- The Azure CLI. You can use it in [Azure Cloud Shell](https://shell.azure.com/) or [install it locally](/cli/azure/install-azure-cli).
+- An Azure Storage Account and a Storage Table. If you don't have a storage account, [create one](../storage/common/storage-account-create.md).
+- The guide assumes you know the concepts presented in the [Functions developer guide](../azure-functions/functions-reference.md) and [how to connect to services in Azure Functions](../azure-functions/add-bindings-existing-function.md).
+
+## Create a Python function project
+
+Follow the [tutorial to create a local Azure Functions project](../azure-functions/create-first-function-vs-code-python.md?pivots=python-mode-configuration#create-an-azure-functions-project), and provide the following information at the prompts:
+
+| Prompt | Selection |
+| --- | --- |
+| **Select a language** | Choose `Python`. (v1 programming language model) |
+| **Select a Python interpreter to create a virtual environment** | Choose your preferred Python interpreter. If an option isn't shown, type in the full path to your Python binary. |
+| **Select a template for your project's first function** | Choose `HTTP trigger`. |
+| **Provide a function name** | Enter `TableStorageOutputFunc`. |
+| **Authorization level** | Choose `Anonymous`, which lets anyone call your function endpoint.  |
+
+You have created a Python function project with an HTTP trigger.
+
+## Add a storage table output binding
+
+Binding attributes are defined in the *function.json* file for a given function. To create a binding, right-click (Ctrl+click on macOS) the `function.json` file in your function folder and choose **Add binding...**. Follow the prompts to define the following binding properties for the new binding:
+
+| Prompt | Value | Description |
+| --- | --- | --- |
+| **Select binding direction** | `out` | The binding is an output binding. |
+| **Select binding with direction...** | `Azure Table Storage` | The binding is an Azure Storage table binding. |
+| **The name used to identify this binding in your code** | `outMessage` | Name that identifies the binding parameter referenced in your code. |
+| **Table name in storage account where data will be written** | `testTable` | The table your function writes to as output. Create a table named `testTable` in your storage account if it doesn't exist. |
+| **Select setting from "local.settings.json"** | `Create new local app settings` | Select the storage account your function writes to as output. Visual Studio Code retrieves its connection string for the local project connection. |
+
+To check the binding was added successfully:
+
+1. Open the `TableStorageOutputFunc/function.json` file and check that a new binding with `type: table` and `direction: out` was added to this file.
+1. Open the `local.settings.json` file and check that a new key-value pair `<your-storage-account-name>_STORAGE: <your-storage-account-connection-string>` containing your storage account connection string was added to this file.
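For reference, the resulting table output binding entry in `function.json` looks similar to this sketch (the setting name depends on your storage account):

```json
{
  "name": "outMessage",
  "type": "table",
  "direction": "out",
  "tableName": "testTable",
  "connection": "<your-storage-account-name>_STORAGE"
}
```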
+
+After the binding is added, update your function code to consume the binding by replacing the contents of `TableStorageOutputFunc/__init__.py` with the following Python code.
+
+```python
+import logging
+import uuid
+import json
+import azure.functions as func
+
+def main(req: func.HttpRequest, outMessage: func.Out[str]) -> func.HttpResponse:
+    rowKey = str(uuid.uuid4())
+    data = {
+        "Name": "Output binding message",
+        "PartitionKey": "message",
+        "RowKey": rowKey
+    }
+
+    outMessage.set(json.dumps(data))
+    return func.HttpResponse(f"Message created with the rowKey: {rowKey}")
+```
+
+## Run the function locally
+
+Follow the [tutorial](../azure-functions/create-first-function-vs-code-python.md?pivots=python-mode-configuration#run-the-function-locally) to run the function locally and verify the table output.
+
+1. If you're prompted to connect to a storage account, select the storage account you chose when creating the Azure Function resource. This value is used by the Azure Functions runtime and isn't necessarily the same storage account you use for the output.
+1. To start the function locally, press <kbd>F5</kbd> or select the **Run and Debug** icon in the left-hand side Activity bar.
+1. To verify the function can write to your table, right-click the function in the Visual Studio Code **WORKSPACE** view, select `Execute Function Now...`, and check the function response. The response message should contain the `rowKey` that was written to the table.
+
+## Create a connection using Service Connector
+
+In the last step, you verified the function project locally. Now you'll learn how to configure the connection between the Azure Function and Azure Storage Table in the cloud, so that your function can write to your storage table after being deployed to the cloud.
+
+1. Open the `function.json` file in your local project, change the value of the `connection` property in `bindings` to be `AZURE_STORAGETABLE_CONNECTIONSTRING`.
+1. Run the following Azure CLI command to create a connection between your Azure Function and your Azure Storage.
+
+```azurecli
+az functionapp connection create storage-table --source-id "<your-function-resource-id>" --target-id "<your-storage-table-resource-id>" --secret
+```
+
+* `--source-id` format: `/subscriptions/{subscription}/resourceGroups/{source_resource_group}/providers/Microsoft.Web/sites/{site}`
+* `--target-id` format: `/subscriptions/{subscription}/resourceGroups/{target_resource_group}/providers/Microsoft.Storage/storageAccounts/{account}/tableServices/default`
+
+You've created a Service Connector resource that configures an `AZURE_STORAGETABLE_CONNECTIONSTRING` variable in the function's App Settings. This app setting will then be consumed by the function binding to connect to the storage, so that the function can write to the storage table. You can learn more about [how Service Connector helps Azure Functions connect to services](./how-to-use-service-connector-in-function.md).
+
+## Deploy your function to Azure
+
+Now you can deploy your function to Azure and verify the storage table output binding works.
+
+1. Follow this [Azure Functions tutorial](../azure-functions/create-first-function-vs-code-python.md?pivots=python-mode-configuration#deploy-the-project-to-azure) to deploy your function to Azure.
+1. To verify the function can write to the table, right-click the function in the Visual Studio Code **RESOURCES** view, select `Execute Function Now...`, and check the function response. The response message should contain the `rowKey` the function just wrote to your table.
+
+## Troubleshoot
+
+If there are any errors related to the storage host, such as `No such host is known (<account-name>.table.core.windows.net:443)`, check whether the connection string you use to connect to Azure Storage contains the table endpoint. If it doesn't, go to Azure Storage in the Azure portal, copy the connection string from the `Access keys` blade, and replace the value.
+
+If this error happens when you start the project locally, check the `local.settings.json` file.
+
+If it happens when you deploy your function to the cloud (in this case, function deployment usually fails on `Syncing triggers`), check your function's App Settings.
+
+## Clean up resources
+
+If you're not going to continue to use this project, delete the Function App resource you created earlier.
+
+### [Portal](#tab/azure-portal)
+
+1. In the Azure portal, open the Function App resource and select **Delete**.
+1. Enter the app name and select **Delete** to confirm.
+
+### [Azure CLI](#tab/azure-cli)
+
+Run the following command in the Azure CLI and replace all placeholders with your own information.
+
+```azurecli
+az functionapp delete --name <function-name> --resource-group <resource-group>
+```
+++
+## Next steps
+
+Read the articles below to learn more about Service Connector concepts and how it helps Azure Functions connect to other cloud services.
+
+> [!div class="nextstepaction"]
+> [Learn about Service Connector concepts](./concept-service-connector-internals.md)
+
+> [!div class="nextstepaction"]
+> [Use Service Connector to connect Azure Functions to other cloud services](./how-to-use-service-connector-in-function.md)
spring-apps How To Configure Enterprise Spring Cloud Gateway https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spring-apps/how-to-configure-enterprise-spring-cloud-gateway.md
You can now view the state of the Spring Cloud Gateway on the **Spring Cloud Gateway** page.
Use the following Azure CLI commands to enable or disable VMware Spring Cloud Gateway: ```azurecli
-az spring spring-cloud-gateway create \
+az spring gateway create \
--resource-group <resource-group-name> \ --service <Azure-Spring-Apps-service-instance-name> ``` ```azurecli
-az spring spring-cloud-gateway delete \
+az spring gateway delete \
--resource-group <resource-group-name> \ --service <Azure-Spring-Apps-instance-name> ```
Use the following steps to restart VMware Spring Cloud Gateway by using the Azure portal.
Use the following Azure CLI command to restart the gateway: ```azurecli
-az spring spring-cloud-gateway restart \
+az spring gateway restart \
--resource-group <resource-group-name> \ --service <Azure-Spring-Apps-service-instance-name> ```
VMware Spring Cloud Gateway supports authentication and authorization through single sign-on (SSO).
| Property | Required? | Description | |-|--|-|
-| `issuerUri` | Yes | The URI that's asserted as its issuer identifier. For example, if `issuer-uri` is `https://example.com`, an OpenID Provider Configuration Request is made to `https://example.com/.well-known/openid-configuration`. The result is expected to be an OpenID Provider Configuration Response. |
+| `issuerUri` | Yes | The URI that is asserted as its issuer identifier. For example, if `issuerUri` is `https://example.com`, an OpenID Provider Configuration Request is made to `https://example.com/.well-known/openid-configuration`. The result is expected to be an OpenID Provider Configuration Response. |
| `clientId` | Yes | The OpenID Connect client ID from your identity provider. | | `clientSecret` | Yes | The OpenID Connect client secret from your identity provider. | | `scope` | Yes | A list of scopes to include in JWT identity tokens. This list should be based on the scopes that your identity provider allows. |
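To illustrate how the configuration request URL is derived from `issuerUri`, here's a minimal Python sketch using the example issuer from the table above:

```python
issuer_uri = "https://example.com"

# OpenID Connect discovery: the provider configuration document lives at a
# well-known path relative to the issuer URI.
discovery_url = issuer_uri.rstrip("/") + "/.well-known/openid-configuration"
print(discovery_url)  # https://example.com/.well-known/openid-configuration
```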
storage Data Lake Storage Acl Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/data-lake-storage-acl-powershell.md
This section shows you how to:
### Set an ACL
-Use the `set-AzDataLakeGen2ItemAclObject` cmdlet to create an ACL for the owning user, owning group, or other users. Then, use the `Update-AzDataLakeGen2Item` cmdlet to commit the ACL.
+Use the `Set-AzDataLakeGen2ItemAclObject` cmdlet to create an ACL for the owning user, owning group, or other users. Then, use the `Update-AzDataLakeGen2Item` cmdlet to commit the ACL.
This example sets the ACL on the root directory of a **container** for the owning user, owning group, or other users, and then prints the ACL to the console. ```powershell $filesystemName = "my-file-system"
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rw-
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType group -Permission rw- -InputObject $acl
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType other -Permission -wx -InputObject $acl
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rw-
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType group -Permission rw- -InputObject $acl
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType other -Permission -wx -InputObject $acl
Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Acl $acl $filesystem = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName $filesystem.ACL
This example sets the ACL on a **directory** for the owning user, owning group,
```powershell $filesystemName = "my-file-system" $dirname = "my-directory/"
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rw-
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType group -Permission rw- -InputObject $acl
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType other -Permission -wx -InputObject $acl
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rw-
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType group -Permission rw- -InputObject $acl
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType other -Permission -wx -InputObject $acl
Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Acl $acl $dir = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname $dir.ACL ``` > [!NOTE]
-> If you want to set a **default** ACL entry, use the **-DefaultScope** parameter when you run the **Set-AzDataLakeGen2ItemAclObject** command. For example: `$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rwx -DefaultScope`.
+> If you want to set a **default** ACL entry, use the **-DefaultScope** parameter when you run the **Set-AzDataLakeGen2ItemAclObject** command. For example: `$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rwx -DefaultScope`.
This example sets the ACL on a **file** for the owning user, owning group, or other users, and then prints the ACL to the console. ```powershell $filesystemName = "my-file-system" $filePath = "my-directory/upload.txt"
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rw-
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType group -Permission rw- -InputObject $acl
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType other -Permission "-wx" -InputObject $acl
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rw-
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType group -Permission rw- -InputObject $acl
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType other -Permission "-wx" -InputObject $acl
Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $filePath -Acl $acl $file = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $filePath $file.ACL
Set-AzDataLakeGen2AclRecursive -Context $ctx -FileSystem $filesystemName -Path $
``` > [!NOTE]
-> If you want to set a **default** ACL entry, use the **-DefaultScope** parameter when you run the **Set-AzDataLakeGen2ItemAclObject** command. For example: `$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rwx -DefaultScope`.
+> If you want to set a **default** ACL entry, use the **-DefaultScope** parameter when you run the **Set-AzDataLakeGen2ItemAclObject** command. For example: `$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -Permission rwx -DefaultScope`.
To see an example that sets ACLs recursively in batches by specifying a batch size, see the [Set-AzDataLakeGen2AclRecursive](/powershell/module/az.storage/set-azdatalakegen2aclrecursive) reference article.
This section shows you how to:
### Update an ACL
-First, get the ACL. Then, use the `set-AzDataLakeGen2ItemAclObject` cmdlet to add or update an ACL entry. Use the `Update-AzDataLakeGen2Item` cmdlet to commit the ACL.
+First, get the ACL. Then, use the `Set-AzDataLakeGen2ItemAclObject` cmdlet to add or update an ACL entry. Use the `Update-AzDataLakeGen2Item` cmdlet to commit the ACL.
This example creates or updates the ACL on a **directory** for a user.
This example creates or updates the ACL on a **directory** for a user.
$filesystemName = "my-file-system" $dirname = "my-directory/" $acl = (Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname).ACL
-$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityID xxxxxxxx-xxxx-xxxxxxxxxxx -Permission r-x -InputObject $acl
+$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityID xxxxxxxx-xxxx-xxxxxxxxxxx -Permission r-x -InputObject $acl
Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Acl $acl ``` > [!NOTE]
-> If you want to update a **default** ACL entry, use the **-DefaultScope** parameter when you run the **Set-AzDataLakeGen2ItemAclObject** command. For example: `$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityID xxxxxxxx-xxxx-xxxxxxxxxxx -Permission r-x -DefaultScope`.
+> If you want to update a **default** ACL entry, use the **-DefaultScope** parameter when you run the **Set-AzDataLakeGen2ItemAclObject** command. For example: `$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityID xxxxxxxx-xxxx-xxxxxxxxxxx -Permission r-x -DefaultScope`.
### Update ACLs recursively
Remove-AzDataLakeGen2AclRecursive -Context $ctx -FileSystem $filesystemName -Ac
``` > [!NOTE]
-> If you want to remove a **default** ACL entry, use the **-DefaultScope** parameter when you run the **Set-AzDataLakeGen2ItemAclObject** command. For example: `$acl = set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityId $userID -Permission "" -DefaultScope`.
+> If you want to remove a **default** ACL entry, use the **-DefaultScope** parameter when you run the **Set-AzDataLakeGen2ItemAclObject** command. For example: `$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityId $userID -Permission "" -DefaultScope`.
To see an example that removes ACLs recursively in batches by specifying a batch size, see the [Remove-AzDataLakeGen2AclRecursive](/powershell/module/az.storage/remove-azdatalakegen2aclrecursive) reference article.
storage Storage Blob Index How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-index-how-to.md
Within the Azure portal, the blob index tags filter automatically applies the `@
3. To find all blobs that match a specific blob tag, use the `az storage blob filter` command. ```azurecli
- az storage blob filter --account-name mystorageaccount --tag-filter ""tag1"='value1' and "tag2"='value2'" --auth-mode login
+ az storage blob filter --account-name mystorageaccount --tag-filter """tag1""='value1' and ""tag2""='value2'" --auth-mode login
``` 4. To find blobs only in a specific container, include the container name in the `--tag-filter` parameter. ```azurecli
- az storage blob filter --account-name mystorageaccount --tag-filter ""@container"='myContainer' and "tag1"='value1' and "tag2"='value2'" --auth-mode login
+ az storage blob filter --account-name mystorageaccount --tag-filter """@container""='myContainer' and ""tag1""='value1' and ""tag2""='value2'" --auth-mode login
``` ### [AzCopy](#tab/azcopy)
storage Storage Quickstart Static Website Terraform https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-quickstart-static-website-terraform.md
+
+ Title: 'Quickstart: Deploy a static website on Azure Storage using Terraform'
+description: Learn how to deploy an Azure Storage account with static website hosting enabled.
+++ Last updated : 11/17/2021++
+content_well_notification:
+ - AI-contribution
++
+# Quickstart: Deploy a static website on Azure Storage using Terraform
+
+In this quickstart, you learn how to deploy an [Azure Storage account](https://www.terraform.io/docs/providers/azurerm/r/storage_account.html) with static website hosting enabled.
++
+In this article, you learn how to:
+
+> [!div class="checklist"]
+> * Create a random value (to be used in the resource group name) using [random_pet](https://registry.terraform.io/providers/hashicorp/random/latest/docs/resources/pet)
+> * Create an Azure resource group using [azurerm_resource_group](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/resource_group)
+> * Create a random value (to be used in the storage account name) using [random_string](https://registry.terraform.io/providers/hashicorp/random/latest/docs/resources/string)
+> * Create a storage account with a static website using [azurerm_storage_account](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/storage_account)
+> * Create a storage account blob using [azurerm_storage_blob](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/storage_blob)
+
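The random-suffix naming in the checklist exists because storage account names must be globally unique, 3 to 24 characters, and lowercase alphanumeric. A hypothetical Python sketch of that constraint (the function name and prefix are illustrative, not from the Terraform sample):

```python
import random
import string

def random_storage_account_name(prefix: str = "storage", length: int = 8) -> str:
    """Build a candidate storage account name: a fixed prefix plus a random
    lowercase alphanumeric suffix, mirroring what Terraform's random_string
    resource produces.

    Azure requires 3-24 characters, lowercase letters and digits only.
    """
    suffix = "".join(random.choices(string.ascii_lowercase + string.digits, k=length))
    name = prefix + suffix
    if not (3 <= len(name) <= 24) or not name.isalnum() or name != name.lower():
        raise ValueError(f"invalid storage account name: {name}")
    return name

print(random_storage_account_name())
```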
+## Prerequisites
+
+- [Install and configure Terraform](/azure/developer/terraform/quickstart-configure)
+
+## Implement the Terraform code
+
+> [!NOTE]
+> The sample code for this article is located in the [Azure Terraform GitHub repo](https://github.com/Azure/terraform/tree/master/quickstart/101-storage-account-with-static-website). You can view the log file containing the [test results from current and previous versions of Terraform](https://github.com/Azure/terraform/tree/master/quickstart/101-storage-account-with-static-website/TestRecord.md).
+>
+> See more [articles and sample code showing how to use Terraform to manage Azure resources](/azure/terraform)
+
+1. Create a directory in which to test the sample Terraform code and make it the current directory.
+
+1. Create a file named `providers.tf` and insert the following code:
+
+ :::code language="Terraform" source="~/terraform_samples/quickstart/101-storage-account-with-static-website/providers.tf":::
+
+1. Create a file named `main.tf` and insert the following code:
+
+ :::code language="Terraform" source="~/terraform_samples/quickstart/101-storage-account-with-static-website/main.tf":::
+
+1. Create a file named `variables.tf` and insert the following code:
+
+ :::code language="Terraform" source="~/terraform_samples/quickstart/101-storage-account-with-static-website/variables.tf":::
+
+1. Create a file named `outputs.tf` and insert the following code:
+
+ :::code language="Terraform" source="~/terraform_samples/quickstart/101-storage-account-with-static-website/outputs.tf":::
+
+1. Create a file named `index.html` and insert the following code:
+
+    :::code language="html" source="~/terraform_samples/quickstart/101-storage-account-with-static-website/index.html":::
+
+## Initialize Terraform
++
+## Create a Terraform execution plan
++
+## Apply a Terraform execution plan
++
+## Verify the results
+
+#### [Azure CLI](#tab/azure-cli)
+
+1. Get the URL to the static website.
+
+ ```console
+ primary_web_host=$(terraform output -raw primary_web_host)
+ ```
+
+1. Open a browser and enter the URL in your browser's address bar.
+
+ :::image type="content" source="./media/storage-quickstart-static-website-terraform/static-website-running-in-storage-account.png" alt-text="Screenshot of the static web site stored in an Azure storage account.":::
+
+#### [Azure PowerShell](#tab/azure-powershell)
+
+1. Get the URL to the static website.
+
+ ```console
+ $primary_web_host=$(terraform output -raw primary_web_host)
+ ```
+
+1. Open a browser and enter the URL in your browser's address bar.
+
+ :::image type="content" source="./media/storage-quickstart-static-website-terraform/static-website-running-in-storage-account.png" alt-text="Screenshot of the static web site stored in an Azure storage account.":::
+++
+## Clean up resources
++
+## Troubleshoot Terraform on Azure
+
+[Troubleshoot common problems when using Terraform on Azure](/azure/developer/terraform/troubleshoot)
+
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Introduction to Azure Blob Storage](./storage-blobs-introduction.md)
storage Storage Use Azcopy Blobs Copy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/storage-use-azcopy-blobs-copy.md
You can tweak your copy operation by using optional flags. Here's a few examples
|Scenario|Flag| |||
-|Copy blobs as Block, Page, or Append Blobs.|**--blob-type**=\[BlockBlob\|PageBlob\|AppendBlob\]|
-|Copy to a specific access tier (such as the archive tier).|**--block-blob-tier**=\[None\|Hot\|Cool\|Archive\]|
-|Automatically decompress files.|**--decompress**=\[gzip\|deflate\]|
+|Copy blobs as Block, Page, or Append Blobs.|**--blob-type**=[BlockBlob\|PageBlob\|AppendBlob]|
+|Copy to a specific access tier (such as the archive tier).|**--block-blob-tier**=[None\|Hot\|Cool\|Archive]|
+|Automatically decompress files.|**--decompress**=[gzip\|deflate]|
For a complete list, see [options](storage-ref-azcopy-copy.md#options).
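The `--decompress` flag shown above inflates downloaded blobs whose content encoding is `gzip` or `deflate`. The underlying operation can be sketched in Python; this illustrates the two encodings only and is not AzCopy's implementation:

```python
import gzip
import zlib

def decompress_payload(data: bytes, content_encoding: str) -> bytes:
    """Inflate a payload according to its Content-Encoding header value.

    Only gzip and deflate (zlib-wrapped) are handled here, matching the
    values accepted by --decompress; anything else is returned unchanged.
    """
    encoding = content_encoding.strip().lower()
    if encoding == "gzip":
        return gzip.decompress(data)
    if encoding == "deflate":
        return zlib.decompress(data)
    return data

# Round trip: compress a blob body, then inflate it as the flag would.
body = b"example blob contents"
assert decompress_payload(gzip.compress(body), "gzip") == body
assert decompress_payload(zlib.compress(body), "deflate") == body
```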
See these articles to configure settings, optimize performance, and troubleshoot
- [Optimize the performance of AzCopy](storage-use-azcopy-optimize.md) - [Find errors and resume jobs by using log and plan files in AzCopy](storage-use-azcopy-configure.md) - [Troubleshoot problems with AzCopy v10](storage-use-azcopy-troubleshoot.md)
+- [Use AzCopy to copy blobs between Azure storage accounts with network restrictions](/troubleshoot/azure/azure-storage/copy-blobs-between-storage-accounts-network-restriction?toc=/azure/storage/blobs/toc.json&bc=/azure/storage/blobs/breadcrumb/toc.json)
+
storage Use Container Storage With Local Disk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/container-storage/use-container-storage-with-local-disk.md
description: Configure Azure Container Storage Preview for use with Ephemeral Di
Previously updated : 11/06/2023 Last updated : 11/16/2023
If you want to delete a storage pool, run the following command. Replace `<stora
kubectl delete sp -n acstor <storage-pool-name> ```
+## Use volume replication for storage pools (optional)
+
+Applications that require the extremely low latency or high performance of local NVMe can leverage storage replication for improved resiliency. [Sign up here](https://aka.ms/NVMeReplication).
+ ## See also - [What is Azure Container Storage?](container-storage-introduction.md)
storage Storage Queues Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/queues/storage-queues-introduction.md
# What is Azure Queue Storage?
-Azure Queue Storage is a service for storing large numbers of messages. You access messages from anywhere in the world via authenticated calls using HTTP or HTTPS. A queue message can be up to 64 KB in size. A queue may contain millions of messages, up to the total capacity limit of a storage account. Queues are commonly used to create a backlog of work to process asynchronously.
+Azure Queue Storage is a service for storing large numbers of messages. You access messages from anywhere in the world via authenticated calls using HTTP or HTTPS. A queue message can be up to 64 KB in size. A queue may contain millions of messages, up to the total capacity limit of a storage account. Queues are commonly used to create a backlog of work to process asynchronously, like in the [Web-Queue-Worker architectural style](/azure/architecture/guide/architecture-styles/web-queue-worker).
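The asynchronous backlog pattern described above can be sketched with an in-process queue; this is an illustration of the Web-Queue-Worker flow, not the Azure SDK (which you'd use to talk to a real queue):

```python
import queue
import threading

MAX_MESSAGE_BYTES = 64 * 1024  # Queue Storage's per-message size limit

backlog = queue.Queue()
processed = []

def worker() -> None:
    # Drain the backlog asynchronously, as a worker role would.
    while True:
        message = backlog.get()
        if message is None:  # sentinel: no more work
            break
        processed.append(message.decode())
        backlog.task_done()

def enqueue(message: bytes) -> None:
    if len(message) > MAX_MESSAGE_BYTES:
        raise ValueError("Queue Storage messages are limited to 64 KB")
    backlog.put(message)

t = threading.Thread(target=worker)
t.start()
for i in range(3):
    enqueue(f"job-{i}".encode())
backlog.put(None)
t.join()
print(processed)  # ['job-0', 'job-1', 'job-2']
```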
## Queue Storage concepts
stream-analytics Machine Learning Udf https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/stream-analytics/machine-learning-udf.md
You can achieve low latency by ensuring that your Azure Kubernetes Service (AKS)
* Not Found (404) * Unauthorized (401)
+## Limitations
+
+If you're using an Azure Machine Learning managed endpoint, Stream Analytics can currently access only endpoints that have public network access enabled. For more information, see [Azure Machine Learning private endpoints](/azure/machine-learning/concept-secure-online-endpoint?view=azureml-api-2&tabs=cli#secure-inbound-scoring-requests).
+ ## Next steps * [Tutorial: Azure Stream Analytics JavaScript user-defined functions](stream-analytics-javascript-user-defined-functions.md)
stream-analytics Stream Analytics Parsing Protobuf https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/stream-analytics/stream-analytics-parsing-protobuf.md
Title: Parsing Protobuf
-description: This article describes how to use Azure Stream Analytics with protobuf as a data input
+ Title: Parse Protobuf
+description: This article describes how to use Azure Stream Analytics with Protobuf as data input.
# Parse Protobuf in Azure Stream Analytics
-Azure Stream Analytics supports processing events in protocol buffer data formats. You can use the built-in protobuf deserializer when configuring your inputs. To use the built-in deserializer, specify the protobuf definition file, message type, and prefix style.
+Azure Stream Analytics supports processing events in Protocol Buffer (Protobuf) data formats. You can use the built-in Protobuf deserializer when configuring your inputs. To use the built-in deserializer, specify the Protobuf definition file, message type, and prefix style.
-To configure your stream analytics job to deserialize events in protobuf, use the following guidance:
+## Steps to configure a Stream Analytics job
-1. After creating your stream analytics job, click on **Inputs**
-1. Click on **Add input** and select what input you want to configure to open the input configuration blade
-1. Select **Event serialization format** to show a dropdown and select **Protobuf**
+To configure your Stream Analytics job to deserialize events in Protobuf:
+1. After you create your Stream Analytics job, select **Inputs**.
+1. Select **Add input**, and then select what input you want to configure to open the pane for input configuration.
+1. Select **Event serialization format** to show a dropdown list, and then select **Protobuf (preview)**.
-Complete the configuration using the following guidance:
+ :::image type="content" source="./media/protobuf/protobuf-input-config.png" alt-text=" Screenshot that shows selections for configuring Protobuf for an Azure Stream Analytics job." lightbox="./media/protobuf/protobuf-input-config.png" :::
-| Property name | Description |
-||-|
-| Protobuf definition file | A file that specifies the structure and datatypes of your protobuf events |
-| Message type | The message type that you want to deserialize |
-| Prefix style | It is used to determine how long a message is to deserialize protobuf events correctly |
+1. Complete the configuration by using the following guidance:
+ | Property name | Description |
+ ||-|
+ | **Protobuf definition file** | A file that specifies the structure and data types of your Protobuf events |
+ | **Message type** | The message type that you want to deserialize |
+    | **Prefix style** | The setting that determines the length of each message, so that Protobuf events deserialize correctly |
-> [!NOTE]
-> To learn more about Protobuf datatypes, visit the [Official Protocol Buffers Documentation](https://protobuf.dev/reference/protobuf/google.protobuf/) .
->
+ :::image type="content" source="./media/protobuf/protobuf.png" alt-text=" Screenshot that shows the input boxes on the configuration pane for an Azure Stream Analytics job, after you select Protobuf as the event serialization format." lightbox="./media/protobuf/protobuf.png" :::
+
+To learn more about Protobuf data types, see the [official Protocol Buffers documentation](https://protobuf.dev/reference/protobuf/google.protobuf/).
+
+## Limitations
+
+- The Protobuf deserializer takes only one Protobuf definition file at a time. Imports to custom-made Protobuf definition files aren't supported. For example:
-### Limitations
+ :::image type="content" source="./media/protobuf/one-proto-example.png" alt-text=" Screenshot that shows an example of a custom-made Protobuf definition file." lightbox="./media/protobuf/one-proto-example.png" :::
-1. Protobuf Deserializer takes only one (1) protobuf definition file at a time. Imports to custom-made protobuf definition files aren't supported.
- For example:
- :::image type="content" source="./media/protobuf/one-proto-example.png" alt-text=" Screenshot showing how an example of a custom-made protobuf definition file." lightbox="./media/protobuf/one-proto-example.png" :::
+ This Protobuf definition file refers to another Protobuf definition file in its imports. Because the Protobuf deserializer would have only the current Protobuf definition file and not know what *carseat.proto* is, it would be unable to deserialize correctly.
- This protobuf definition file refers to another protobuf definition file in its imports. Because the protobuf deserializer would have only the current protobuf definition file and not know what carseat.proto is, it would be unable to deserialize correctly.
+- Enumerations aren't supported. If the Protobuf definition file contains enumerations, the `enum` field is empty when the Protobuf events deserialize. This condition leads to data loss.
-2. Enums aren't supported. If the protobuf definition file contains enums, then protobuf events deserialize, but the enum field is empty, leading to data loss.
+- Maps in Protobuf aren't supported. Maps in Protobuf result in an error about missing a string key.
-3. Maps in protobuf are currently not supported. Maps in protobuf result in an error about missing a string key.
+- When a Protobuf definition file contains a namespace or package, the message type must include it. For example:
-4. When a protobuf definition file contains a namespace or package, the message type must include it.
- For example:
- :::image type="content" source="./media/protobuf/proto-namespace-example.png" alt-text=" Screenshot showing an example of a protobuf definition file with a namespace." lightbox="./media/protobuf/proto-namespace-example.png" :::
+ :::image type="content" source="./media/protobuf/proto-namespace-example.png" alt-text=" Screenshot that shows an example of a Protobuf definition file with a namespace." lightbox="./media/protobuf/proto-namespace-example.png" :::
- In the Protobuf Deserializer in portal, the message type must be **namespacetest.Volunteer** instead of the usual **Volunteer**.
+ In the Protobuf deserializer in the portal, the message type must be `namespacetest.Volunteer` instead of the usual `Volunteer`.
-5. When sending messages that were serialized using Google.Protobuf, the prefix type should be set to base128 since that is the most cross-compatible type.
+- When you're sending messages that were serialized via `google.protobuf`, the prefix type should be set to `base128` because that's the most cross-compatible type.
-6. Service Messages aren't supported in the protobuf deserializers. Your job throws an exception if you attempt to use a service message.
- For example:
- :::image type="content" source="./media/protobuf/service-message-proto.png" alt-text=" Screenshot showing an example of a service message." lightbox="./media/protobuf/service-message-proto.png" :::
-
-8. Current datatypes not supported:
- * Any
- * One of (related to enums)
- * Durations
- * Struct
- * Field Mask (Not supported by protobuf-net)
- * List Value
- * Value
- * Null Value
- * Empty
+- Service messages aren't supported in the Protobuf deserializers. Your job throws an exception if you try to use a service message. For example:
+
+ :::image type="content" source="./media/protobuf/service-message-proto.png" alt-text=" Screenshot that shows an example of a service message." lightbox="./media/protobuf/service-message-proto.png" :::
+
+- These data types aren't supported:
+
+ - `Any`
+ - `One of` (related to enumerations)
+ - `Durations`
+ - `Struct`
+ - `Field Mask` (not supported by protobuf-net)
+ - `List Value`
+ - `Value`
+ - `Null Value`
+ - `Empty`
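The `base128` prefix style recommended above is a Protobuf varint: each byte carries 7 bits of the message length, and the high bit signals that more bytes follow. A minimal sketch of the wire format (illustrative only, not the deserializer's code):

```python
def encode_varint(n: int) -> bytes:
    """Encode a non-negative integer as a base-128 (varint) length prefix."""
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def decode_varint(data: bytes) -> tuple[int, int]:
    """Decode a varint; return (value, number of bytes consumed)."""
    value = shift = 0
    for i, byte in enumerate(data):
        value |= (byte & 0x7F) << shift
        if not byte & 0x80:
            return value, i + 1
        shift += 7
    raise ValueError("truncated varint")

# A 300-byte message is prefixed with the two bytes 0xAC 0x02.
assert encode_varint(300) == b"\xac\x02"
assert decode_varint(b"\xac\x02")[0] == 300
```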
> [!NOTE]
-> For direct help with using the protobuf deserializer, please reach out to [askasa@microsoft.com](mailto:askasa@microsoft.com).
->
+> For direct help with using the Protobuf deserializer, send email to [askasa@microsoft.com](mailto:askasa@microsoft.com).
+
+## See also
-## See Also
-[Data Types in Azure Stream Analytics](/stream-analytics-query/data-types-azure-stream-analytics)
+[Data types in Azure Stream Analytics](/stream-analytics-query/data-types-azure-stream-analytics)
update-manager Manage Updates Customized Images https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/update-manager/manage-updates-customized-images.md
With marketplace images, support is validated even before Update Manager operati
For instance, an assessment call attempts to fetch the latest patch that's available from the image's OS family to check support. It stores this support-related data in an Azure Resource Graph table, which you can query to see the support status for your Azure Compute Gallery image.
+## Check the preview
+
+Start the asynchronous support check by using one of the following methods:
+
+- API Action Invocation:
+ 1. [Assess patches](/rest/api/compute/virtual-machines/assess-patches?tabs=HTTP).
+ 1. [Install patches](/rest/api/compute/virtual-machines/install-patches?tabs=HTTP).
+
+- Portal operations. Try the preview:
+ 1. [On-demand check for updates](view-updates.md)
+ 1. [One-time update](deploy-updates.md)
+
+Validate the VM support state in Azure Resource Graph:
+
+- Table:
+
+ `patchassessmentresources`
+- Resource:
+
+ `Microsoft.compute/virtualmachines/patchassessmentresults/configurationStatus.vmGuestPatchReadiness.detectedVMGuestPatchSupportState. [Possible values: Unknown, Supported, Unsupported, UnableToDetermine]`
+
+ :::image type="content" source="./media/manage-updates-customized-images/resource-graph-view.png" alt-text="Screenshot that shows the resource in Azure Resource Graph Explorer.":::
+
+We recommend that you run the Assess Patches API after the VM is provisioned and the public preview prerequisites are set. This action validates the support state of the VM. If the VM is supported, you can run the Install Patches API to begin patching.
## Limitations
virtual-machine-scale-sets Virtual Machine Scale Sets Health Extension https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machine-scale-sets/virtual-machine-scale-sets-health-extension.md
The following JSON shows the schema for the Application Health extension. The ex
| protocol | `http` or `https` or `tcp` | string | | port | Optional when protocol is `http` or `https`, mandatory when protocol is `tcp` | int | | requestPath | Mandatory when protocol is `http` or `https`, not allowed when protocol is `tcp` | string |
+| intervalInSeconds | Optional, default is 5 seconds. This is the interval between each health probe. For example, if intervalInSeconds == 5, a probe will be sent to the local application endpoint once every 5 seconds. | int |
+| numberOfProbes | Optional, default is 1. This is the number of consecutive probes required for the health status to change. For example, if numberOfProbes == 3, you will need 3 consecutive "Healthy" signals to change the health status from "Unhealthy" into "Healthy" state. The same requirement applies to change health status into "Unhealthy" state. | int |
## Extension schema for Rich Health States
The following JSON shows the schema for the Rich Health States extension. The ex
| protocol | `http` or `https` or `tcp` | string | | port | Optional when protocol is `http` or `https`, mandatory when protocol is `tcp` | int | | requestPath | Mandatory when protocol is `http` or `https`, not allowed when protocol is `tcp` | string |
-| intervalInSeconds | Optional, default is 5 seconds | int |
-| numberOfProbes | Optional, default is 1 | int |
+| intervalInSeconds | Optional, default is 5 seconds. This is the interval between each health probe. For example, if intervalInSeconds == 5, a probe will be sent to the local application endpoint once every 5 seconds. | int |
+| numberOfProbes | Optional, default is 1. This is the number of consecutive probes required for the health status to change. For example, if numberOfProbes == 3, you will need 3 consecutive "Healthy" signals to change the health status from "Unhealthy"/"Unknown" into "Healthy" state. The same requirement applies to change health status into "Unhealthy" or "Unknown" state. | int |
| gracePeriod | Optional, default = `intervalInSeconds` * `numberOfProbes`; maximum grace period is 7200 seconds | int |
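The consecutive-probe rule for `numberOfProbes` behaves like a small state machine: the reported status flips only after enough probes in a row disagree with it. A hypothetical Python sketch (not the extension's implementation):

```python
class HealthTracker:
    """Flip the reported health status only after number_of_probes
    consecutive probe results that disagree with the current status."""

    def __init__(self, number_of_probes: int = 1, initial: str = "Healthy"):
        self.number_of_probes = number_of_probes
        self.status = initial
        self._candidate = None  # the status the recent streak points toward
        self._streak = 0

    def observe(self, result: str) -> str:
        if result == self.status:
            # A probe agreeing with the current status resets any streak.
            self._candidate, self._streak = None, 0
            return self.status
        if result != self._candidate:
            self._candidate, self._streak = result, 0
        self._streak += 1
        if self._streak >= self.number_of_probes:
            self.status = result
            self._candidate, self._streak = None, 0
        return self.status

tracker = HealthTracker(number_of_probes=3, initial="Unhealthy")
tracker.observe("Healthy")         # streak of 1: still Unhealthy
tracker.observe("Healthy")         # streak of 2: still Unhealthy
print(tracker.observe("Healthy"))  # third consecutive signal flips it
```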
virtual-machines Disks Incremental Snapshots https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/disks-incremental-snapshots.md
Title: Create an incremental snapshot
-description: Learn about incremental snapshots for managed disks, including how to create them using the Azure portal, Azure PowerShell module, and Azure Resource Manager.
+description: Learn about incremental snapshots for managed disks, including how to create them and the performance impact when restoring snapshots.
Previously updated : 10/24/2023 Last updated : 11/17/2023 ms.devlang: azurecli
virtual-machines Msv3 Mdsv3 Medium Series https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/msv3-mdsv3-medium-series.md
Last updated 08/10/2023
-# Msv3 and Mdsv3 Medium Memory Series (Preview)
+# Msv3 and Mdsv3 Medium Memory Series
-> [!IMPORTANT]
-> The Msv3 and Mdsv3 Medium Memory Series is currently in preview. Previews are made available to you on the condition that you agree to the [supplemental terms of use](https://azure.microsoft.com/support/legal/preview-supplemental-terms/). Some aspects of this feature may change prior to general availability (GA).
->
-> Customers can [sign up for Msv3 and Mdsv3 Medium Memory Series preview today](https://forms.office.com/r/s0fKkC420i). Msv3 and Mdsv3 Medium Memory Series VMs preview is available in the West Europe, North Europe, East US 2 and East US Azure regions.
-
-The Msv3 and Mdsv3 Medium Memory(MM) series, powered by 4<sup>th</sup> generation Intel® Xeon® Scalable processors, are the next generation of memory-optimized VM sizes delivering faster performance, lower total cost of ownership and improved resilience to failures compared to previous generation Mv2 VMs. The Mv3 MM offers VM sizes of up to 3TB of memory and 4,000 MBps throughout to remote storage and provides up to 25% networking performance improvements over previous generations.
+The Msv3 and Mdsv3 Medium Memory (MM) series, powered by 4<sup>th</sup> generation Intel® Xeon® Scalable processors, are the next generation of memory-optimized VM sizes delivering faster performance, lower total cost of ownership, and improved resilience to failures compared to the previous-generation Mv2 VMs. The Mv3 MM series offers VM sizes of up to 4 TB of memory and 4,000 MBps of throughput to remote storage, and provides up to 25% networking performance improvements over previous generations.
## Msv3 Medium Memory series
The Msv3 and Mdsv3 Medium Memory(MM) series, powered by 4<sup>th</sup> generatio
|Standard_M96s_1_v3|96|974|64|65,000/ 1,560|65,000/ 1,560|8|16,000| |Standard_M96s_2_v3|96|1,946|64|130,000/ 3,120|130,000/ 3,120|8|30,000| |Standard_M176s_3_v3|176|2794|64|130,000/ 4,000|130,000/ 4,000|8|40,000|
+|Standard_M176s_4_v3|176|3892|64|130,000/ 4,000|130,000/ 4,000|8|40,000|
## Mdsv3 Medium Memory series
These virtual machines feature local SSD storage (up to 400 GiB).
|Standard_M96ds_1_v3|96|974|400|64|40,000/400|65,000/ 1,560|65,000/ 1,560|8|16,000| |Standard_M96ds_2_v3|96|1,946|400|64|160,000/1600|130,000/ 3,120|130,000/ 3,120|8|30,000| |Standard_M176ds_3_v3|176|2794|400|64|160,000/1600|130,000/ 4,000|130,000/ 4,000|8|40,000|
+|Standard_M176ds_4_v3|176|3892|400|64|160,000/1600|130,000/ 4,000|130,000/ 4,000|8|40,000|
<sup>*</sup> Read iops is optimized for sequential reads<br>
virtual-machines Create Managed Disk From Snapshot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/scripts/create-managed-disk-from-snapshot.md
Title: Create managed disk from snapshot (Linux) - CLI sample
-description: Azure CLI Script Sample - Create a managed disk from a snapshot
+description: Azure CLI Script Sample - restore a disk from a snapshot and learn about the performance impact of restoring managed disk snapshots
documentationcenter: storage
ms.devlang: azurecli
vm-linux Previously updated : 10/24/2023 Last updated : 11/17/2023
virtual-machines Virtual Machines Powershell Sample Create Managed Disk From Snapshot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/scripts/virtual-machines-powershell-sample-create-managed-disk-from-snapshot.md
Title: Create managed disk from snapshot - PowerShell sample
-description: Azure PowerShell Script Sample - Create a managed disk from a snapshot
+description: Azure PowerShell Sample - restore a disk from a snapshot and learn about the performance impact of restoring managed disk snapshots
documentationcenter: storage
vm-windows Previously updated : 10/24/2023 Last updated : 11/17/2023
virtual-network Default Outbound Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/default-outbound-access.md
If you deploy a virtual machine in Azure and it doesn't have explicit outbound c
* Loss of IP address
- * Customers don't own the default outbound access IP. This IP might change, and any dependency on it could cause issues in the future.
+ * Customers don't own the default outbound access IP. This IP might change, and any dependency on it could cause issues in the future.
## How can I transition to an explicit method of public connectivity (and disable default outbound access)?-
-There are multiple ways to turn off default outbound access:
-
+
+There are multiple ways to turn off default outbound access. The following sections describe the options available to you.
+
>[!Important] > Private Subnet is currently in public preview. It is provided without a service-level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).-
-* Utilize the Private Subnet parameter
- * Creating a subnet to be Private prevents any virtual machines on the subnet from utilizing default outbound access to connect to public endpoints.
- * The parameter to create a Private Subnet can only be modified during the creation of a subnet.
- * VMs on a Private Subnet can still access the Internet using explicit outbound connectivity.
-
+
+### Utilize the Private Subnet parameter
+
+* Creating a subnet to be Private prevents any virtual machines on the subnet from utilizing default outbound access to connect to public endpoints.
+
+* The parameter to create a Private subnet can only be set during the creation of a subnet.
+
+* VMs on a Private subnet can still access the Internet using explicit outbound connectivity.
+
> [!NOTE] > Certain services will not function on a virtual machine in a Private Subnet without an explicit method of egress (examples are Windows Activation and Windows Updates).-
-* Add an explicit outbound connectivity method.
-
- * Associate a NAT gateway to the subnet of your virtual machine.
-
- * Associate a standard load balancer configured with outbound rules.
-
- * Associate a Standard public IP to any of the virtual machine's network interfaces (if there are multiple network interfaces, having a single NIC with a standard public IP prevents default outbound access for the virtual machine).
-
-* Use Flexible orchestration mode for Virtual Machine Scale Sets.
-
- * Flexible scale sets are secure by default. Any instances created via Flexible scale sets don't have the default outbound access IP associated with them, so an explicit outbound method is required. For more information, see [Flexible orchestration mode for Virtual Machine Scale Sets](../../virtual-machines/flexible-virtual-machine-scale-sets.md)
-
+
+#### Add the Private subnet feature
+
+* From the Azure portal, ensure the option to enable Private subnet is selected when creating a subnet as part of the Virtual Network create experience.
+
+
+* Using the Azure CLI, when creating a subnet with [az network vnet subnet create](https://learn.microsoft.com/cli/azure/network/vnet/subnet?view=azure-cli-latest#az-network-vnet-subnet-create), use the `--default-outbound` option and set it to `false`.
+
+* Using an Azure Resource Manager template, set the value of the `defaultOutboundAccess` parameter to `false`.
+
+### Add an explicit outbound connectivity method
+
+* Associate a NAT gateway to the subnet of your virtual machine.
+
+* Associate a standard load balancer configured with outbound rules.
+
+* Associate a Standard public IP to any of the virtual machine's network interfaces (if there are multiple network interfaces, having a single NIC with a standard public IP prevents default outbound access for the virtual machine).
+
+### Use Flexible orchestration mode for Virtual Machine Scale Sets
+
+* Flexible scale sets are secure by default. Any instances created via Flexible scale sets don't have the default outbound access IP associated with them, so an explicit outbound method is required. For more information, see [Flexible orchestration mode for Virtual Machine Scale Sets](../../virtual-machines/flexible-virtual-machine-scale-sets.md)
+
>[!Important] > When a load balancer backend pool is configured by IP address, it will use default outbound access due to an ongoing known issue. For secure by default configuration and applications with demanding outbound needs, associate a NAT gateway to the VMs in your load balancer's backend pool to secure traffic. See more on existing [known issues](../../load-balancer/whats-new.md#known-issues).