Updates from: 05/06/2022 01:21:38
Category Microsoft Docs article Related commit history on GitHub Change details
compliance Compliance Manager Alert Policies https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/compliance/compliance-manager-alert-policies.md
Compliance Manager can alert you to changes as soon as they happen so that you ca
To create alerts, you first set up an alert policy to outline the conditions that trigger an alert and the frequency of notifications. When we detect a match to your policy conditions, you'll receive an email notification with details so you can determine whether to investigate or take further action. -
-All alerts are listed on the **Alerts** tab in Compliance Manger, and all alert policies are listed on the **Alert Policies tab**.
+All alerts are listed on the **Alerts** tab in Compliance Manager, and all alert policies are listed on the **Alert policies** tab. All organizations have a [default score change policy](#default-score-change-policy) already set up for them.
## Understanding the Alerts and Alert policies pages
You can create policies to alert you when certain changes or events related to i
- **Test status change**: a user has changed the testing status of an improvement action. - **Evidence change**: a user has uploaded or deleted an evidence document in the **Documents** tab of the improvement action.
+#### Default score change policy
+
+Compliance Manager sets up a default alert policy that generates an alert whenever an improvement action's score changes. Most settings for the default policy can't be edited, but you can add more recipients for notifications.
+
+Here are the settings for the default policy:
+
+- All matches that are detected within a span of 60 minutes will be grouped into a single alert to reduce excessive notifications. For example, if five improvement actions experience a score change within one hour, one alert will be generated.
+
+- The severity level for these alerts is **medium**.
+
+- The Global Admin for your organization is the default recipient of alert notifications.
+
+- You can add more alert recipients by following these steps:
+ - On the **Alert policies** page, find the **Compliance Manager default alert policy**.
+ - Check the box to the left of its name and select the **Edit** button near the top, above the filters.
+ - Select the **Next** button until you come to the **Alert recipients** page.
+ - Select **+Select recipients** and, on the flyout pane, check the boxes next to the users you want to receive the email notification. When done, select **Add recipient**, then select **Next**.
+ - On the **Review and finish** page, select **Update** to save your changes.
+
+- The default policy can't be deleted, but you can disable it by [following the steps outlined below](#activate-or-inactivate-a-policy).
++ ### Policy creation steps To create a policy to generate alerts based on one or more events, follow the steps below:
compliance Dlp Learn About Dlp https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/compliance/dlp-learn-about-dlp.md
After the policy's synced to the right locations, it starts to evaluate content
## Viewing policy application results
-DLP reports a vast amount of information into Microsoft Purview from monitoring, policy matches and actions, and user activities. You'll need to consume and act on that information to tune your policies and triage actions taken on sensitive items. The telemetry goes into the [Microsoft Purview compliance portal Audit Logs](search-the-audit-log-in-security-and-compliance.md#search-the-audit-log-in-the-compliance-center) first, is processed, and makes its way to different reporting tools. Each reporting tool has a different purpose.
+DLP reports a vast amount of information into Microsoft Purview from monitoring, policy matches and actions, and user activities. You'll need to consume and act on that information to tune your policies and triage actions taken on sensitive items. The telemetry goes into the [Microsoft Purview compliance portal Audit Logs](search-the-audit-log-in-security-and-compliance.md#search-the-audit-log-in-the-compliance-portal) first, is processed, and makes its way to different reporting tools. Each reporting tool has a different purpose.
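If you prefer to pull this telemetry directly rather than through the reporting tools, the sketch below uses the `Search-UnifiedAuditLog` cmdlet in Exchange Online PowerShell; the record type, date range, and selected columns are assumptions chosen for illustration.

```powershell
# Minimal sketch (assumed record type and dates): pull recent DLP audit records
# from the unified audit log before they surface in the reporting tools
Connect-ExchangeOnline
Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) `
    -RecordType ComplianceDLPExchange -ResultSize 100 |
    Select-Object CreationDate, UserIds, Operations
```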
### DLP Alerts Dashboard
compliance Microsoft 365 Compliance Center Redirection https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/compliance/microsoft-365-compliance-center-redirection.md
Title: "Redirection of users from the Office 365 Security and Compliance center to the Microsoft Purview compliance portal"
+ Title: "Redirection of users from the Office 365 Security and Compliance Center to the Microsoft Purview compliance portal"
f1.keywords: - NOCSH
audience: ITPro ms.localizationpriority: medium
-description: Learn about automatic redirection of users from the Office 365 Security and Compliance center users to the Microsoft Purview compliance portal.
+description: Learn about automatic redirection of users from the Office 365 Security and Compliance Center users to the Microsoft Purview compliance portal.
-# Redirection of users from the Office 365 Security and Compliance center to the Microsoft Purview compliance portal
+# Redirection of users from the Office 365 Security and Compliance Center to the Microsoft Purview compliance portal
[!include[Purview banner](../includes/purview-rebrand-banner.md)]
Automatic redirection is enabled by default for all users accessing compliance-r
- [Content search](search-for-content.md) - [eDiscovery (Standard)](get-started-core-ediscovery.md) - [Data classification](data-classification-overview.md)-- [Microsoft Purview Data Loss Prevention (DLP)](dlp-learn-about-dlp.md)
+- [Data loss prevention (DLP)](dlp-learn-about-dlp.md)
- [Data subject requests](/compliance/regulatory/gdpr-manage-gdpr-data-subject-requests-with-the-dsr-case-tool)-- [Information governance](manage-data-governance.md)
+- [Data lifecycle management](manage-data-governance.md) (formerly **Information governance**)
- [Records management](records-management.md) Users are automatically routed to the same compliance solutions in the <a href="https://go.microsoft.com/fwlink/p/?linkid=2077149" target="_blank">compliance portal</a>.
-This feature and associated controls does not enable the automatic redirection of Security features for Microsoft Defender for Office 365. To enable the redirection for security features, see [Redirecting accounts from Microsoft Defender for Office 365 to the Microsoft 365 Defender portal](/microsoft-365/security/defender/microsoft-365-security-mdo-redirection) for details.
+This feature and associated controls don't enable the automatic redirection of security features for Microsoft Defender for Office 365. For more information, see [Microsoft Defender for Office 365 in Microsoft 365 Defender](/microsoft-365/security/defender/microsoft-365-security-center-mdo).
## Related information
compliance Ome Message Access Logs https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/compliance/ome-message-access-logs.md
audience: Admin
ms.localizationpriority: medium Previously updated : 04/21/2022 Last updated : 05/04/2022 - Strat_O365_IP - M365-security-compliance
The access log contains entries for messages sent through the encrypted message
- Attachment download - mail replies and forwards
+For more information on the message access log schema, see [Search the audit log in the compliance portal](search-the-audit-log-in-security-and-compliance.md#encrypted-message-portal-activities).
+ ## Search for events in the message access logs To view the events captured in the message access logs:
compliance Retention Settings https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/compliance/retention-settings.md
For overview information about policies for retention and how retention works in
## Scopes - adaptive and static
-If you are unfamiliar with adaptive and static scopes, and to help you choose which one to use when you configure a policy for retention, see [Adaptive or static policy scopes for retention](retention.md#adaptive-or-static-policy-scopes-for-retention).
+If you're unfamiliar with adaptive and static scopes, and to help you choose which one to use when you configure a policy for retention, see [Adaptive or static policy scopes for retention](retention.md#adaptive-or-static-policy-scopes-for-retention).
When you've decided whether to use an adaptive or static scope, use the following information to help you configure it: - [Configuration information for adaptive scopes](#configuration-information-for-adaptive-scopes)
When you've decided whether to use an adaptive or static scope, use the followin
### Configuration information for adaptive scopes
-When you choose to use adaptive scopes, you are prompted to select what type of adaptive scope you want. There are three different types of adaptive scopes and each one supports different attributes or properties:
+When you choose to use adaptive scopes, you're prompted to select what type of adaptive scope you want. There are three different types of adaptive scopes and each one supports different attributes or properties:
| Adaptive scope type | Attributes or properties supported include | |:--|:--|
The attribute names for users and groups are based on [filterable recipient prop
The attributes and properties listed in the table can be easily specified when you configure an adaptive scope by using the simple query builder. Additional attributes and properties are supported with the advanced query builder, as described in the following section. > [!TIP]
-> For additional information about using the advanced query builder, see the following webinars:
+> For more information about using the advanced query builder, see the following webinars:
> - [Building Advanced Queries for Users and Groups with Adaptive Policy Scopes](https://mipc.eventbuilder.com/event/52683/occurrence/49452/recording?rauth=853.3181650.1f2b6e8b4a05b4441f19b890dfeadcec24c4325e90ac492b7a58eb3045c546ea) > - [Building Advanced Queries for SharePoint Sites with Adaptive Policy Scopes](https://aka.ms/AdaptivePolicyScopes-AdvancedSharePoint)
Specifically for SharePoint sites, there might be additional SharePoint configur
1. In the [Microsoft Purview compliance portal](https://compliance.microsoft.com/), navigate to one of the following locations:
- - If you are using the records management solution:
+ - If you're using the records management solution:
- **Solutions** > **Records management** > **Adaptive scopes** tab > + **Create scope**
- - If you are using the data lifecycle management solution:
+ - If you're using the data lifecycle management solution:
- **Solutions** > **Data lifecycle management** > **Adaptive scopes** tab > + **Create scope** Don't immediately see your solution in the navigation pane? First select **Show all**.
Specifically for SharePoint sites, there might be additional SharePoint configur
- For **SharePoint sites** scopes, use Keyword Query Language (KQL). You might already be familiar with using KQL to search SharePoint by using indexed site properties. To help you specify these KQL queries, see [Keyword Query Language (KQL) syntax reference](/sharepoint/dev/general-development/keyword-query-language-kql-syntax-reference). For example, because SharePoint sites scopes automatically include all SharePoint site types, which include Microsoft 365 group-connected and OneDrive sites, you can use the indexed site property **SiteTemplate** to include or exclude specific site types. The templates you can specify:
- - SITEPAGEPUBLISHING for modern communication sites
- - GROUP for Microsoft 365 group-connected sites
- - TEAMCHANNEL for Microsoft Teams private channel sites
- - STS for a classic SharePoint team site
- - SPSPERS for OneDrive sites
+ - `SITEPAGEPUBLISHING` for modern communication sites
+ - `GROUP` for Microsoft 365 group-connected sites
+ - `TEAMCHANNEL` for Microsoft Teams private channel sites
+ - `STS` for a classic SharePoint team site
+ - `SPSPERS` for OneDrive sites
So to create an adaptive scope that includes only modern communication sites and excludes Microsoft 365 group-connected and OneDrive sites, specify the following KQL query: ````console
To run a query using PowerShell:
1. [Connect to Exchange Online PowerShell](/powershell/exchange/connect-to-exchange-online-powershell) using an account with [appropriate Exchange Online Administrator permissions](/powershell/exchange/find-exchange-cmdlet-permissions#use-powershell-to-find-the-permissions-required-to-run-a-cmdlet).
-2. Use either [Get-Recipient](/powershell/module/exchange/get-recipient) or [Get-Mailbox](/powershell/module/exchange/get-mailbox) with the *-Filter* parameter and your [OPATH query](/powershell/exchange/filter-properties) for the adaptive scope enclosed in curly brackets (`{`,`}`). If your attribute values are strings, enclose these values in double or single quotes.
+2. Use either [Get-Recipient](/powershell/module/exchange/get-recipient), [Get-Mailbox](/powershell/module/exchange/get-mailbox), or [Get-User](/powershell/module/exchange/get-user) with the *-Filter* parameter and your [OPATH query](/powershell/exchange/filter-properties) for the adaptive scope enclosed in curly brackets (`{`,`}`). If your attribute values are strings, enclose these values in double or single quotes.
- You can determine whether to use `Get-Mailbox` or `Get-Recipient` for validation by identifying which cmdlet is supported by the [OPATH property](/powershell/exchange/filter-properties) that you choose for your query.
+ You can determine whether to use Get-Mailbox, Get-Recipient, or Get-User for validation by identifying which cmdlet is supported by the [OPATH property](/powershell/exchange/filter-properties) that you choose for your query.
> [!IMPORTANT]
- > `Get-Mailbox` does not support the *MailUser* recipient type, so `Get-Recipient` must be used to validate queries that include on-premises mailboxes in a hybrid environment.
+ > Get-Mailbox does not support the *MailUser* recipient type, so Get-Recipient or Get-User must be used to validate queries that include on-premises mailboxes in a hybrid environment.
- To validate a **User** scope, use either:
- - `Get-Mailbox` with `-RecipientTypeDetails UserMailbox` or
- - `Get-Recipient` with `-RecipientTypeDetails UserMailbox,MailUser`
+ To validate a **User** scope, use the appropriate command:
+ - `Get-Mailbox` with *-RecipientTypeDetails UserMailbox,SharedMailbox,RoomMailbox,EquipmentMailbox*
+ - `Get-Recipient` with *-RecipientTypeDetails UserMailbox,MailUser,SharedMailbox,RoomMailbox,EquipmentMailbox*
To validate a **Microsoft 365 Group** scope, use:
- - `Get-Mailbox` or `Get-Recipient` with `-RecipientTypeDetails GroupMailbox`
+ - `Get-Mailbox` with *-GroupMailbox* or `Get-Recipient` with *-RecipientTypeDetails GroupMailbox*
For example, to validate a **User** scope, you could use:
To run a query using PowerShell:
```PowerShell Get-Mailbox -RecipientTypeDetails GroupMailbox -Filter {CustomAttribute15 -eq "Marketing"} -ResultSize Unlimited ```
+
+ > [!TIP]
+ > When you use these commands to validate a user scope, if the number of recipients returned is higher than expected, it might be because it includes users who don't have a valid license for adaptive scopes. These users won't have the retention settings applied to them.
+ >
+ > For example, in a hybrid environment, you might have unlicensed synchronized user accounts without an Exchange mailbox on-premises or in Exchange Online. You can identify these users by running the following command: `Get-User -RecipientTypeDetails User`
3. Verify that the output matches the expected users or groups for your adaptive scope. If it doesn't, check your query and the values with the relevant administrator for Azure AD or Exchange.
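If the recipient count from these validation commands looks higher than expected, a quick way to surface the unlicensed synchronized accounts mentioned in the tip above is a sketch like the following (it assumes you're still connected to Exchange Online PowerShell):

```powershell
# Count synchronized user accounts that have no Exchange mailbox; these users
# won't have the retention settings applied to them (see the tip above)
(Get-User -RecipientTypeDetails User -ResultSize Unlimited | Measure-Object).Count
```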
When you choose to use static scopes, you must then decide whether to apply the
#### A policy that applies to entire locations
-With the exception of Skype for Business, the default is that all instances for the selected locations are automatically included in the policy without you having to specify them as included.
+Except for Skype for Business, the default is that all instances for the selected locations are automatically included in the policy without you having to specify them as included.
For example, **All recipients** for the **Exchange email** location. With this default setting, all existing user mailboxes will be included in the policy, and any new mailboxes created after the policy is applied will automatically inherit the policy.
Locations in policies for retention identify specific Microsoft 365 services tha
Both the **Exchange email** location and the **Exchange public folders** location require mailboxes to have at least 10 MB of data before retention settings will apply to them.
-The **Exchange email** location supports retention for users' email, calendar, and other mailbox items, by applying retention settings at the level of a mailbox. Shared mailboxes are also supported.
+The **Exchange email** location supports retention for users' email, calendar, and other mailbox items, by applying retention settings at the level of a mailbox. Shared mailboxes and resource mailboxes for equipment and rooms are also supported.
-Resource mailboxes, contacts, and Microsoft 365 group mailboxes aren't supported for Exchange email. For Microsoft 365 group mailboxes, select the **Microsoft 365 Groups** location instead. Although the Exchange location initially allows a group mailbox to be selected for a static scope, when you try to save the retention policy, you receive an error that "RemoteGroupMailbox" is not a valid selection for this location.
+Email contacts and Microsoft 365 group mailboxes aren't supported for Exchange email. For Microsoft 365 group mailboxes, select the **Microsoft 365 Groups** location instead. Although the Exchange location initially allows a group mailbox to be selected for a static scope, when you try to save the retention policy, you receive an error that "RemoteGroupMailbox" isn't a valid selection for this location.
Depending on your policy configuration, [inactive mailboxes](inactive-mailboxes-in-office-365.md) might be included or not:
When you configure an auto-apply policy that uses sensitive information types an
### Configuration information for SharePoint sites and OneDrive accounts
-When you choose the **SharePoint sites** location, the policy for retention can retain and delete documents in SharePoint communication sites, team sites that aren't connected by Microsoft 365 groups, and classic sites. Unless you are using [adaptive policy scopes](#exceptions-for-adaptive-policy-scopes), Team sites connected by Microsoft 365 groups aren't supported with this option and instead, use the **Microsoft 365 Groups** location that applies to content in the group's mailbox, site, and files.
+When you choose the **SharePoint sites** location, the policy for retention can retain and delete documents in SharePoint communication sites, team sites that aren't connected by Microsoft 365 groups, and classic sites. Unless you're using [adaptive policy scopes](#exceptions-for-adaptive-policy-scopes), Team sites connected by Microsoft 365 groups aren't supported with this option and instead, use the **Microsoft 365 Groups** location that applies to content in the group's mailbox, site, and files.
For detailed information about what's included and excluded when you configure retention settings for SharePoint and OneDrive, see [What's included for retention and deletion](retention-policies-sharepoint.md#whats-included-for-retention-and-deletion).
Mailboxes that you target with this policy location require at least 10 MB of da
> [!NOTE] > Even though a Microsoft 365 group has an Exchange mailbox, a retention policy for the **Exchange email** location won't include content in Microsoft 365 group mailboxes.
-If you use static scopes: Although the **Exchange email** location for a static scope initially allows you to specify a group mailbox to be included or excluded, when you try to save the retention policy, you'll see an error that "RemoteGroupMailbox" is not a valid selection for the Exchange location.
+If you use static scopes: Although the **Exchange email** location for a static scope initially allows you to specify a group mailbox to be included or excluded, when you try to save the retention policy, you'll see an error that "RemoteGroupMailbox" isn't a valid selection for the Exchange location.
By default, a retention policy applied to a Microsoft 365 group includes the group mailbox and SharePoint teams site. Files stored in the SharePoint teams site are covered with this location, but not Teams chats or Teams channel messages that have their own retention policy locations.
-To change the default because you want the retention policy to apply to either just the Microsoft 365 mailboxes, or just the connected SharePoint teams sites, use the [Set-RetentionCompliancePolicy](/powershell/module/exchange/set-retentioncompliancepolicy) PowerShell cmdlet with the *Applications* parameter with one of the following values:
+To change the default because you want the retention policy to apply to either just the Microsoft 365 mailboxes, or just the connected SharePoint teams sites, use the [Set-RetentionCompliancePolicy](/powershell/module/exchange/set-retentioncompliancepolicy) PowerShell cmdlet and the *Applications* parameter with one of the following values:
- `Group:Exchange` for just Microsoft 365 mailboxes that are connected to the group. - `Group:SharePoint` for just SharePoint sites that are connected to the group.
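As a hedged example, the following sketch (run from Security &amp; Compliance PowerShell) changes an existing Microsoft 365 Groups retention policy so that it covers only the group mailboxes; the policy name is a placeholder.

```powershell
# Limit an existing Microsoft 365 Groups retention policy to group mailboxes only
# ("M365 Groups retention policy" is a placeholder name)
Set-RetentionCompliancePolicy -Identity "M365 Groups retention policy" -Applications "Group:Exchange"
```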
When a policy for retention (static policy scope or adaptive) is applied to a Mi
- The group-connected SharePoint site is preserved and continues to be managed by the retention policy with the **Microsoft 365 Groups** location. The site is still accessible to the people who had access to it before the group was deleted, and any new permissions must now be managed via SharePoint.
- At this point, you can't exclude the site from the Microsoft 365 Groups location, because you can't specify the deleted group. If you need to release the retention policy from this site, contact Microsoft Support. For example, open a [service request in the Microsoft 365 Admin Center](https://admin.microsoft.com/Adminportal/Home#/support).
+ At this point, you can't exclude the site from the Microsoft 365 Groups location, because you can't specify the deleted group. If you need to release the retention policy from this site, contact Microsoft Support. For example, [open a support request in the Microsoft 365 Admin Center](/microsoft-365/admin/get-help-support#online-support).
- The mailbox for the deleted group becomes inactive and like the SharePoint site, remains subject to retention settings. For more information, see [Inactive mailboxes in Exchange Online](inactive-mailboxes-in-office-365.md).
By choosing the settings for retaining and deleting content, your policy for ret
### Retaining content for a specific period of time
-When you configure a retention label or policy to retain content, you choose to retain items for a specific number of days, months (assumes 30 days for a month), or years. Or alternatively, retain the items forever. The retention period is not calculated from the time the policy was assigned, but according to the start of the retention period specified.
+When you configure a retention label or policy to retain content, you choose to retain items for a specific number of days, months (assumes 30 days for a month), or years. Or alternatively, retain the items forever. The retention period isn't calculated from the time the policy was assigned, but according to the start of the retention period specified.
For the start of the retention period, you can choose when the content was created or, supported only for files in the SharePoint, OneDrive, and Microsoft 365 Groups locations, when the content was last modified. For retention labels, you can also start the retention period from when the content was labeled, or when an event occurs.
Examples:
- Exchange: If you want to retain items in a mailbox for seven years, and a message was sent six years ago, the message will be retained for only one year. For Exchange items, the age is based on the date received for incoming email, or the date sent for outgoing email. Retaining items based on when they were last modified applies only to site content in OneDrive and SharePoint.
-At the end of the retention period, you choose whether you want the content to be permanently deleted. For example, for retention polices:
+At the end of the retention period, you choose whether you want the content to be permanently deleted. For example, for retention policies:
![Retention settings page.](../media/b05f84e5-fc71-4717-8f7b-d06a29dc4f29.png)
Before you configure retention, first familiarize yourself with capacity and sto
Retention settings can retain and then delete items, or delete old items without retaining them.
-In both cases, if your retention settings delete items, it's important to understand that the time period you specify is not calculated from the time the policy was assigned, but according to the start of the retention period specified. For example, from the time when the item was created or modified, or labeled.
+In both cases, if your retention settings delete items, it's important to understand that the time period you specify isn't calculated from the time the policy was assigned, but according to the start of the retention period specified. For example, from the time when the item was created or modified, or labeled.
For this reason, first consider the age of the existing content and how the settings might impact that content. Consider communicating your chosen settings to your users and help desk before the settings are applied to content, which gives them time to assess the possible impact. ### A policy that applies to entire locations
-When you choose locations, with the exception of Skype for Business, the default setting is **All** when the status of the location is **On**.
+When you choose locations, except for Skype for Business, the default setting is **All** when the status of the location is **On**.
When a retention policy applies to any combination of entire locations, there is no limit to the number of recipients, sites, accounts, groups, etc., that the policy can include.
compliance Search The Audit Log In Security And Compliance https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/compliance/search-the-audit-log-in-security-and-compliance.md
- admindeeplinkMAC
-# Search the audit log in the compliance center
+# Search the audit log in the compliance portal
[!include[Purview banner](../includes/purview-rebrand-banner.md)]
Be sure to read the following items before you start searching the audit log.
- You have to be assigned the View-Only Audit Logs or Audit Logs role in Exchange Online to search the audit log. By default, these roles are assigned to the Compliance Management and Organization Management role groups on the **Permissions** page in the Exchange admin center. Global administrators in Office 365 and Microsoft 365 are automatically added as members of the Organization Management role group in Exchange Online. To give a user the ability to search the audit log with the minimum level of privileges, you can create a custom role group in Exchange Online, add the View-Only Audit Logs or Audit Logs role, and then add the user as a member of the new role group. For more information, see [Manage role groups in Exchange Online](/Exchange/permissions-exo/role-groups).
- > [!IMPORTANT]
> If you assign a user the View-Only Audit Logs or Audit Logs role on the **Permissions** page in the compliance portal, they won't be able to search the audit log. You have to assign the permissions in Exchange Online. This is because the underlying cmdlet used to search the audit log is an Exchange Online cmdlet. - When an audited activity is performed by a user or admin, an audit record is generated and stored in the audit log for your organization. The length of time that an audit record is retained (and searchable in the audit log) depends on your Office 365 or Microsoft 365 Enterprise subscription, and specifically the type of the license that is assigned to specific users.
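For the custom role group approach described above, a minimal Exchange Online PowerShell sketch might look like the following; the role group name and member are placeholders.

```powershell
# Create a custom role group with only the View-Only Audit Logs role and add a member
# (role group name and user are placeholders)
New-RoleGroup -Name "Audit Log Readers" -Roles "View-Only Audit Logs" -Members "adele@contoso.onmicrosoft.com"
```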
Be sure to read the following items before you start searching the audit log.
> [!NOTE] > If your organization participated in the private preview program for the one-year retention of audit records, the retention duration for audit records that were generated before the general availability rollout date will not be reset.
- - For users assigned any other (non-E5) Office 365 or Microsoft 365 license, audit records are retained for 90 days. For a list of Office 365 and Microsoft 365 subscriptions that support unified audit logging, see [the security and compliance center service description](/office365/servicedescriptions/office-365-platform-service-description/office-365-securitycompliance-center).
+ - For users assigned any other (non-E5) Office 365 or Microsoft 365 license, audit records are retained for 90 days. For a list of Office 365 and Microsoft 365 subscriptions that support unified audit logging, see [the security and compliance portal service description](/office365/servicedescriptions/office-365-platform-service-description/office-365-securitycompliance-center).
> [!NOTE] > Even when mailbox auditing on by default is turned on, you might notice that mailbox audit events for some users aren't found in audit log searches in the compliance portal or via the Office 365 Management Activity API. For more information, see [More information about mailbox audit logging](enable-mailbox-auditing.md#more-information).
You can export the results of an audit log search to a comma-separated value (CS
#### More information about exporting and viewing audit log search results -- When you download all search results, the CSV file contains the columns **CreationDate**, **UserIds**, **Operations**, and **AuditData**. The **AuditData** column contains additional information about each event (similar to the detailed information displayed on the flyout page when you view the search results in the compliance center). The data in this column consists of a JSON object that contains multiple properties from the audit log record. Each *property:value* pair in the JSON object is separated by a comma. You can use the JSON transform tool in the Power Query Editor in Excel to split **AuditData** column into multiple columns so that each property in the JSON object has its own column. This lets you sort and filter on one or more of these properties. For step-by-step instructions using the Power Query Editor to transform the JSON object, see [Export, configure, and view audit log records](export-view-audit-log-records.md).
+- When you download all search results, the CSV file contains the columns **CreationDate**, **UserIds**, **Operations**, and **AuditData**. The **AuditData** column contains additional information about each event (similar to the detailed information displayed on the flyout page when you view the search results in the compliance portal). The data in this column consists of a JSON object that contains multiple properties from the audit log record. Each *property:value* pair in the JSON object is separated by a comma. You can use the JSON transform tool in the Power Query Editor in Excel to split **AuditData** column into multiple columns so that each property in the JSON object has its own column. This lets you sort and filter on one or more of these properties. For step-by-step instructions using the Power Query Editor to transform the JSON object, see [Export, configure, and view audit log records](export-view-audit-log-records.md).
After you split the **AuditData** column, you can filter on the **Operations** column to display the detailed properties for a specific type of activity.
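If you'd rather script the transformation than use the Power Query Editor, a minimal PowerShell sketch that expands the **AuditData** JSON from an exported CSV is shown below; the file name and the selected properties are assumptions.

```powershell
# Expand the AuditData JSON column of an exported audit log search-results CSV
# (file name is a placeholder; CreationTime, UserId, and Operation are common audit record properties)
Import-Csv .\AuditLogSearchResults.csv |
    ForEach-Object { $_.AuditData | ConvertFrom-Json } |
    Select-Object CreationTime, UserId, Operation |
    Format-Table -AutoSize
```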
You can export the results of an audit log search to a comma-separated value (CS
## Audited activities
-The tables in this section describe the activities that are audited in Microsoft 365. You can search for these events by searching the audit log in the security and compliance center.
+The tables in this section describe the activities that are audited in Microsoft 365. You can search for these events by searching the audit log in the security and compliance portal.
These tables group related activities or the activities from a specific service. The tables include the friendly name that's displayed in the **Activities** drop-down list and the name of the corresponding operation that appears in the detailed information of an audit record and in the CSV file when you export the search results. For descriptions of the detailed information, see [Detailed properties in the audit log](detailed-properties-in-the-office-365-audit-log.md).
Click one of the following links to go to a specific table.
:::column-end::: :::row-end:::
+ :::column:::
+ [Encrypted message portal activities](#encrypted-message-portal-activities)
+ :::column-end:::
+ :::column:::
+
+ :::column-end:::
+ :::column:::
+
+ :::column-end:::
+ ### File and page activities The following table describes the file and page activities in SharePoint Online and OneDrive for Business.
The following table lists Azure AD directory and domain-related activities that
### eDiscovery activities
-Content Search and eDiscovery-related activities that are performed in the security and compliance center or by running the corresponding PowerShell cmdlets are logged in the audit log. This includes the following activities:
+Content Search and eDiscovery-related activities that are performed in the security and compliance portal or by running the corresponding PowerShell cmdlets are logged in the audit log. This includes the following activities:
- Creating and managing eDiscovery cases
As previously explained, audit records for activities performed by users assigne
Yes. The Office 365 Management Activity API is used to fetch the audit logs programmatically. To get started, see [Get started with Office 365 Management APIs](/office/office-365-management-api/get-started-with-office-365-management-apis).
-**Are there other ways to get auditing logs other than using the security and compliance center or the Office 365 Management Activity API?**
+**Are there other ways to get auditing logs other than using the security and compliance portal or the Office 365 Management Activity API?**
No. These are the only two ways to get data from the auditing service.
compliance Use Network Upload To Import Pst Files https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/compliance/use-network-upload-to-import-pst-files.md
Now you're ready to use the AzCopy tool to upload PST files to Microsoft 365. Th
| Field | Description | |:--|:--| | Source |The first field specifies the source directory in your organization that contains the PST files that will be uploaded to Microsoft 365. Alternatively, you can specify an Azure Storage location as the source location of the PST files to upload. <br/> Be sure to surround the value of this field with double-quotation marks (" "). <br/> <br/>**Examples**: <br/>`"\\FILESERVER01\PSTs"` <br/> Or <br/>`"https://storageaccountid.blob.core.windows.net/PSTs?sp=racwdl&st=2021-09-21T07:25:53Z&se=2021-09-21T15:25:53Z&sv=2020-08-04&sr=c&sig=xxxxxx"`|
- | Destination |Specifies the SAS URL that you obtained in Step 1. <br/> Be sure to surround the value of this parameter with double-quotation marks (" ").<br/><br/>**Note:** If you use the SAS URL in a script or batch file, watch out for certain characters that need to be escaped. For example, you have to change `%` to `%%` and change `&` to `^&`.<br/><br/>**Tip:** (Optional) You can specify a subfolder in the Azure Storage location to upload the PST files to. You do this by adding a subfolder location (after "ingestiondata") in the SAS URL. The first example doesn't specify a subfolder. That means the PSTs are uploaded to the root (named *ingestiondata*) of the Azure Storage location. The second example uploads the PST files to a subfolder (named *PSTFiles*) in the root of the Azure Storage location. <br/><br/>**Examples**: <br/> `"https://3c3e5952a2764023ad14984.blob.core.windows.net/ingestiondata?sv=2012-02-12&amp;se=9999-12-31T23%3A59%3A59Z&amp;sr=c&amp;si=IngestionSasForAzCopy201601121920498117&amp;sig=Vt5S4hVzlzMcBkuH8bH711atBffdrOS72TlV1mNdORg%3D"` <br/> Or <br/> `"https://3c3e5952a2764023ad14984.blob.core.windows.net/ingestiondata/PSTFiles?sv=2012-02-12&amp;se=9999-12-31T23%3A59%3A59Z&amp;sr=c&amp;si=IngestionSasForAzCopy201601121920498117&amp;sig=Vt5S4hVzlzMcBkuH8bH711atBffdrOS72TlV1mNdORg%3D"` <br/> |
- | `--recursive` |This optional flag specifies the recursive mode so that the AzCopy tool copies PSTs files that are located in subfolders in the source directory that is specified by the source field. The default value for this flag is `true`. <br/>**Note:** If you include this flag, PST files in subfolders will have a different file pathname in the Azure Storage location after they're uploaded. You'll have to specify the exact file pathname in the CSV file that you create in Step 4.|
+ | Destination |Specifies the SAS URL that you obtained in Step 1. <br/> Be sure to surround the value of this parameter with double-quotation marks (" ").<br/><br/>**Note:** If you use the SAS URL in a script or batch file, watch out for certain characters that need to be escaped. For example, you have to change `%` to `%%` and change `&` to `^&`.<br/><br/>**Tip:** (Optional) You can specify a subfolder in the Azure Storage location to upload the PST files to. You do this by adding a subfolder location (after "ingestiondata") in the SAS URL. The first example doesn't specify a subfolder. That means the PST files are uploaded to the root (named *ingestiondata*) of the Azure Storage location. The second example uploads the PST files to a subfolder (named *PSTFiles*) in the root of the Azure Storage location. <br/><br/>**Examples**: <br/> `"https://3c3e5952a2764023ad14984.blob.core.windows.net/ingestiondata?sv=2012-02-12&amp;se=9999-12-31T23%3A59%3A59Z&amp;sr=c&amp;si=IngestionSasForAzCopy201601121920498117&amp;sig=Vt5S4hVzlzMcBkuH8bH711atBffdrOS72TlV1mNdORg%3D"` <br/> Or <br/> `"https://3c3e5952a2764023ad14984.blob.core.windows.net/ingestiondata/PSTFiles?sv=2012-02-12&amp;se=9999-12-31T23%3A59%3A59Z&amp;sr=c&amp;si=IngestionSasForAzCopy201601121920498117&amp;sig=Vt5S4hVzlzMcBkuH8bH711atBffdrOS72TlV1mNdORg%3D"` <br/> |
+ | `--recursive` |This optional flag specifies the recursive mode so that the AzCopy tool copies PST files that are located in subfolders in the source directory that is specified by the source field. The default value for this flag is `true`. <br/>**Note:** If you include this flag, PST files in subfolders will have a different file pathname in the Azure Storage location after they're uploaded. You'll have to specify the exact file pathname in the CSV file that you create in Step 4.|
| `--s2s-preserve-access-tier` | This optional flag is only required when the source location is a general-purpose v2 Azure Storage location that supports access tiers. For the PST Import scenario, there is no need to preserve the access tier when you copy PST files from your Azure Storage account to the Microsoft-provided Azure Storage location. In this case, you can include this flag and use a value of `false`. You don't need to use this flag when copy PST files from a classic Azure Storage account, which doesn't support access tiers.| |||
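Putting the fields above together, an AzCopy command might look like the following sketch; the source path and SAS URL are the placeholder values from the table, not real values.

```powershell
# Sketch only: upload PST files from a file share to the ingestiondata container
# (the SAS URL is the placeholder value obtained in Step 1)
azcopy copy "\\FILESERVER01\PSTs" "https://3c3e5952a2764023ad14984.blob.core.windows.net/ingestiondata?<SAS token>" --recursive=true
```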
After the PST files have been uploaded to the Azure Storage location for your or
| `Name` <br/> |Specifies the name of the PST file that will be imported to the user mailbox. The value for this parameter is case-sensitive. The file name of each PST file in the mapping file for an import job must be unique. <br/> <br/>**Important:** The case for the PST file name in the CSV file must be the same as the PST file that was uploaded to the Azure Storage location in Step 2. For example, if you use `annb.pst` in the `Name` parameter in the CSV file, but the name of the actual PST file is `AnnB.pst`, the import for that PST file will fail. Be sure that the name of the PST in the CSV file uses the same case as the actual PST file. <br/> | `annb.pst` <br/> | | `Mailbox` <br/> |Specifies the email address of the mailbox that the PST file will be imported to. You can't specify a public folder because the PST Import Service doesn't support importing PST files to public folders. <br/> To import a PST file to an inactive mailbox, you have to specify the mailbox GUID for this parameter. To obtain this GUID, run the following PowerShell command in Exchange Online: `Get-Mailbox <identity of inactive mailbox> -InactiveMailboxOnly | FL Guid` <br/> <br/>**Note:** Sometimes you might have multiple mailboxes with the same email address, where one mailbox is an active mailbox and the other mailbox is in a soft-deleted (or inactive) state. In these situations, you have to specify the mailbox GUID to uniquely identify the mailbox to import the PST file to. To obtain this GUID for active mailboxes, run the following PowerShell command: `Get-Mailbox <identity of active mailbox> | FL Guid`. To obtain the GUID for soft-deleted (or inactive) mailboxes, run this command `Get-Mailbox <identity of soft-deleted or inactive mailbox> -SoftDeletedMailbox | FL Guid`. <br/> | `annb@contoso.onmicrosoft.com` <br/> Or <br/> `2d7a87fe-d6a2-40cc-8aff-1ebea80d4ae7` <br/> | | `IsArchive` <br/> | Specifies whether to import the PST file to the user's archive mailbox. There are two options: <br/><br/>**FALSE:** Imports the PST file to the user's primary mailbox. <br/> **TRUE:** Imports the PST file to the user's archive mailbox. This assumes that the [user's archive mailbox is enabled](enable-archive-mailboxes.md). <br/><br/>If you set this parameter to `TRUE` and the user's archive mailbox isn't enabled, the import for that user will fail. If an import fails for one user (because their archive isn't enabled and this property is set to `TRUE`), the other users in the import job won't be affected. <br/> If you leave this parameter blank, the PST file is imported to the user's primary mailbox. <br/> <br/>**Note:** To import a PST file to a cloud-based archive mailbox for a user whose primary mailbox is on-premises, just specify `TRUE` for this parameter and specify the email address for the user's on-premises mailbox for the `Mailbox` parameter. <br/> | `FALSE` <br/> Or <br/> `TRUE` <br/> |
- | `TargetRootFolder` <br/> | Specifies the mailbox folder that the PST file is imported to. <br/> <br/> If you leave this parameter blank, the PST file will be imported to a new folder named **Imported** at the root level of the mailbox (the same level as the Inbox folder and the other default mailbox folders). <br/> <br/> If you specify `/`, the folders and items in the PST file are imported to the top of the folder structure in the target mailbox or archive. If a folder exists in the target mailbox (for example, default folders such as Inbox, Sent Items, and Deleted Items), the items in that folder in the PST are merged into the existing folder in the target mailbox. For example, if the PST file contains an Inbox folder, items in that folder are imported to the Inbox folder in the target mailbox. New folders are created if they don't exist in the folder structure for the target mailbox. <br/><br/> If you specify `/<foldername>`, items and folders in the PST file are imported to a folder named *\<foldername\>* . For example, if you use `/ImportedPst`, items would be imported to a folder named **ImportedPst**. This folder will be located in the user's mailbox at the same level as the Inbox folder. <br/><br/> **Tip:** Consider running a few test batches to experiment with this parameter so you can determine the best folder location to import PSTs files to. <br/> |(leave blank) <br/> Or <br/> `/` <br/> Or <br/> `/ImportedPst` <br/> |
+ | `TargetRootFolder` <br/> | Specifies the mailbox folder that the PST file is imported to. <br/> <br/> If you leave this parameter blank, the PST file will be imported to a new folder named **Imported** at the root level of the mailbox (the same level as the Inbox folder and the other default mailbox folders). <br/> <br/> If you specify `/`, the folders and items in the PST file are imported to the top of the folder structure in the target mailbox or archive. If a folder exists in the target mailbox (for example, default folders such as Inbox, Sent Items, and Deleted Items), the items in that folder in the PST are merged into the existing folder in the target mailbox. For example, if the PST file contains an Inbox folder, items in that folder are imported to the Inbox folder in the target mailbox. New folders are created if they don't exist in the folder structure for the target mailbox. <br/><br/> If you specify `/<foldername>`, items and folders in the PST file are imported to a folder named *\<foldername\>* . For example, if you use `/ImportedPst`, items would be imported to a folder named **ImportedPst**. This folder will be located in the user's mailbox at the same level as the Inbox folder. <br/><br/> **Tip:** Consider running a few test batches to experiment with this parameter so you can determine the best folder location to import PST files. <br/> |(leave blank) <br/> Or <br/> `/` <br/> Or <br/> `/ImportedPst` <br/> |
| `ContentCodePage` <br/> |This optional parameter specifies a numeric value for the code page to use for importing PST files in the ANSI file format. This parameter is used for importing PST files from Chinese, Japanese, and Korean (CJK) organizations because these languages typically use a double byte character set (DBCS) for character encoding. If this parameter isn't used to import PST files for languages that use DBCS for mailbox folder names, the folder names are often garbled after they're imported. <br/><br/> For a list of supported values to use for this parameter, see [Code Page Identifiers](/windows/win32/intl/code-page-identifiers). <br/> <br/>**Note:** As previously stated, this is an optional parameter and you don't have to include it in the CSV file. Or you can include it and leave the value blank for one or more rows. <br/> |(leave blank) <br/> Or <br/> `932` (which is the code page identifier for ANSI/OEM Japanese) <br/> | | `SPFileContainer` <br/> |For PST Import, leave this parameter blank. <br/> |Not applicable <br/> | | `SPManifestContainer` <br/> |For PST Import, leave this parameter blank. <br/> |Not applicable <br/> |
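A hedged sample of one mapping-file row, assembled from the parameters above, is shown below as a PowerShell here-string; the Workload, FilePath, and SPSiteUrl columns aren't described in this excerpt and are assumptions based on the PST Import mapping format.

```powershell
# Write a one-row PST Import mapping file (column set is an assumption; values
# reuse the examples from the table above)
@"
Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder,ContentCodePage,SPFileContainer,SPManifestContainer,SPSiteUrl
Exchange,,annb.pst,annb@contoso.onmicrosoft.com,FALSE,/ImportedPst,,,,
"@ | Set-Content -Path .\PstImportMappingFile.csv
```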
enterprise Cross Tenant Mailbox Migration https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/enterprise/cross-tenant-mailbox-migration.md
ms.prod: microsoft-365-enterprise
f1.keywords: - NOCSH Previously updated : 09/21/2020 Last updated : 05/05/2022 - it-pro
Get-MoveRequest -Flags "CrossTenant"
```powershell # Now sync the changes from On-Premises to Azure and Exchange Online in the Target tenant # This action should create the target mail enabled users (MEUs) in the Target tenant
- Start-ADSyncCycle
+ Start-ADSyncSyncCycle
``` **How do we access Outlook on Day 1 after the user mailbox is moved?**
Exchange mailbox moves using MRS craft the targetAddress on the original source
Mailbox permissions include Send on Behalf of and Mailbox Access: -- Send On Behalf Of (AD:publicDelegates) stores the DN of recipients with access to a userΓÇÖs mailbox as a delegate. This value is stored in Active Directory and currently does not move as part of the mailbox transition. If the source mailbox has publicDelegates set, you will need to restamp the publicDelegates on the target Mailbox once the MEU to Mailbox conversion completes in the target environment by running `Set-Mailbox <principle> -GrantSendOnBehalfTo <delegate>`.
+- Send On Behalf Of (AD:publicDelegates) stores the DN of recipients with access to a user's mailbox as a delegate. This value is stored in Active Directory and currently does not move as part of the mailbox transition. If the source mailbox has publicDelegates set, you will need to restamp the publicDelegates on the target Mailbox once the MEU to Mailbox conversion completes in the target environment by running `Set-Mailbox <principal> -GrantSendOnBehalfTo <delegate>`.
- Mailbox Permissions that are stored in the mailbox will move with the mailbox when both the principal and the delegate are moved to the target system. For example, the user TestUser_7 is granted FullAccess to the mailbox TestUser_8 in the tenant SourceCompany.onmicrosoft.com. After the mailbox move completes to TargetCompany.onmicrosoft.com, the same permissions are set up in the target directory. Examples using *Get-MailboxPermission* for TestUser_7 in both source and target tenants are shown below. Exchange cmdlets are prefixed with source and target accordingly.
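A minimal sketch of the two checks described above, after the MEU-to-mailbox conversion completes in the target tenant (user names are placeholders):

```powershell
# Restamp Send On Behalf Of on the target mailbox, then verify the delegate's access
# (placeholders: TestUser_8 is the mailbox, TestUser_7 is the delegate)
Set-Mailbox TestUser_8 -GrantSendOnBehalfTo TestUser_7
Get-MailboxPermission -Identity TestUser_8 -User TestUser_7
```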
includes Office 365 U.S. Government Dod Endpoints https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/includes/office-365-u.s.-government-dod-endpoints.md
<!--THIS FILE IS AUTOMATICALLY GENERATED. MANUAL CHANGES WILL BE OVERWRITTEN.--> <!--Please contact the Office 365 Endpoints team with any questions.-->
-<!--USGovDoD endpoints version 2022042800-->
-<!--File generated 2022-04-29 08:00:04.2241-->
+<!--USGovDoD endpoints version 2022050400-->
+<!--File generated 2022-05-04 17:00:03.2586-->
## Exchange Online
ID | Category | ER | Addresses | Ports
26 | Allow<BR>Required | Yes | `*.compliance.apps.mil, *.security.apps.mil, compliance.apps.mil, security.apps.mil`<BR>`23.103.191.0/24, 23.103.199.0/25, 23.103.204.0/22, 52.181.167.52/32, 52.181.167.91/32, 52.182.95.219/32, 2001:489a:2202::/62, 2001:489a:2202:8::/62, 2001:489a:2202:2000::/63` | **TCP:** 443, 80 28 | Default<BR>Required | No | `activity.windows.com, dod.activity.windows.us` | **TCP:** 443 29 | Default<BR>Required | No | `dod-mtis.cortana.ai` | **TCP:** 443
+30 | Default<BR>Required | No | `*.aadrm.us, *.informationprotection.azure.us` |
includes Office 365 U.S. Government Gcc High Endpoints https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/includes/office-365-u.s.-government-gcc-high-endpoints.md
<!--THIS FILE IS AUTOMATICALLY GENERATED. MANUAL CHANGES WILL BE OVERWRITTEN.--> <!--Please contact the Office 365 Endpoints team with any questions.-->
-<!--USGovGCCHigh endpoints version 2022042800-->
-<!--File generated 2022-04-29 08:00:05.6638-->
+<!--USGovGCCHigh endpoints version 2022050400-->
+<!--File generated 2022-05-04 17:00:04.9643-->
## Exchange Online
ID | Category | ER | Addresses | Ports
26 | Allow<BR>Required | Yes | `*.compliance.microsoft.us, *.security.microsoft.us, compliance.microsoft.us, security.microsoft.us`<BR>`13.72.179.197/32, 13.72.183.70/32, 23.103.191.0/24, 23.103.199.128/25, 23.103.208.0/22, 52.227.170.14/32, 52.227.170.120/32, 52.227.178.94/32, 52.227.180.138/32, 52.227.182.149/32, 52.238.74.212/32, 52.244.65.13/32, 2001:489a:2202:4::/62, 2001:489a:2202:c::/62, 2001:489a:2202:2000::/63` | **TCP:** 443, 80 28 | Default<BR>Required | No | `activity.windows.com, gcc-high.activity.windows.us` | **TCP:** 443 29 | Default<BR>Required | No | `gcch-mtis.cortana.ai` | **TCP:** 443
+30 | Default<BR>Required | No | `*.aadrm.us, *.informationprotection.azure.us` |
security Exposed Apis Create App Webapp https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/security/defender-endpoint/exposed-apis-create-app-webapp.md
$authBody = [Ordered] @{
} $authResponse = Invoke-RestMethod -Method Post -Uri $oAuthUri -Body $authBody -ErrorAction Stop $token = $authResponse.access_token
+$token
``` ### Use C#:
The following code was tested with NuGet Microsoft.IdentityModel.Clients.ActiveD
ClientCredential clientCredential = new ClientCredential(appId, appSecret); AuthenticationResult authenticationResult = auth.AcquireTokenAsync(wdatpResourceId, clientCredential).GetAwaiter().GetResult(); string token = authenticationResult.AccessToken;
+ Console.WriteLine(token);
```
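Whichever snippet you use to acquire the token, a quick hedged check that it works is to call a Defender for Endpoint API with it; the alerts endpoint below is used only for illustration.

```powershell
# Use the acquired token as a bearer token against the Defender for Endpoint API
$headers = @{ Authorization = "Bearer $token" }
Invoke-RestMethod -Method Get -Uri "https://api.securitycenter.microsoft.com/api/alerts" -Headers $headers
```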
security Gov https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/security/defender-endpoint/gov.md
Windows 10, version 1709|![No.](images/svg/check-no.svg) <br /> Note: Won't be s
Windows 10, version 1703 and earlier|![No.](images/svg/check-no.svg) <br /> Note: Won't be supported|![No](images/svg/check-no.svg) <br /> Note: Won't be supported|![No](images/svg/check-no.svg) <br /> Note: Won't be supported Windows Server 2022|![Yes.](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg) Windows Server 2019 (with [KB4586839](https://support.microsoft.com/help/4586839) <sup>1</sup>)|![Yes.](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)
-Windows Server 2016 (Modern) <sup>2</sup>|![Yes.](images/svg/check-yes.svg) <br /> Public preview|![Yes](images/svg/check-yes.svg) <br /> Public preview|![Yes](images/svg/check-yes.svg) <br /> Public preview
-Windows Server 2012 R2 (Modern) <sup>2</sup>|![Yes.](images/svg/check-yes.svg) <br /> Public preview|![Yes](images/svg/check-yes.svg) <br /> Public preview|![Yes](images/svg/check-yes.svg) <br /> Public preview
+Windows Server 2016 (Modern) <sup>2</sup>|![Yes.](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)
+Windows Server 2012 R2 (Modern) <sup>2</sup>|![Yes.](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)
Windows Server 2016 (Legacy) <sup>3</sup>|![Yes.](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg) Windows Server 2012 R2 (Legacy) <sup>3</sup>|![Yes.](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg) Windows Server 2008 R2 SP1 (Legacy) <sup>3</sup>|![Yes.](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)|![Yes](images/svg/check-yes.svg)
security Microsoft Defender Endpoint Android https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/security/defender-endpoint/microsoft-defender-endpoint-android.md
This topic describes how to install, configure, update, and use Defender for End
### Prerequisites - **For end users**:
- - Microsoft Defender for Endpoint license assigned to the end user(s) of the app. See [Microsoft Defender for Endpoint licensing requirements](/microsoft-365/security/defender-endpoint/minimum-requirements#licensing-requirements)
+ - Microsoft Defender for Endpoint license assigned to the end user(s) of the app. See [Microsoft Defender for Endpoint licensing requirements](/microsoft-365/security/defender-endpoint/minimum-requirements#licensing-requirements).
+ - An Intune license is needed before onboarding Android devices.
- Intune Company Portal app can be downloaded from [Google Play](https://play.google.com/store/apps/details?id=com.microsoft.windowsintune.companyportal) and is available on the Android device. - Additionally, device(s) can be [enrolled](/mem/intune/user-help/enroll-device-android-company-portal) via the Intune Company Portal app to enforce Intune device compliance policies. This requires the end user to be assigned a Microsoft Intune license. - For more information on how to assign licenses, see [Assign licenses to users](/azure/active-directory/users-groups-roles/licensing-groups-assign).
security Run Advanced Query Sample Powershell https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/security/defender-endpoint/run-advanced-query-sample-powershell.md
where
Run the following query: ```powershell
-$query = 'RegistryEvents | limit 10' # Paste your own query here
+$query = 'DeviceRegistryEvents | limit 10' # Paste your own query here
$url = "https://api.securitycenter.microsoft.com/api/advancedqueries/run" $headers = @{
security Configure Event Hub https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/security/defender/configure-event-hub.md
Learn how to configure your Event Hub so that it can ingest events from Microsof
:::image type="content" source="../../media/759498162a4e93cbf17c4130d704d164.png" alt-text="The event hubs properties section in the Microsoft Azure portal" lightbox="../../media/759498162a4e93cbf17c4130d704d164.png":::
-1. Once the Event Hub Namespace is created, you will need to add the App Registration Service Principal as Reader, Azure Event Hub Data Receiver, and the user who will be logging into Microsoft 365 Defender as Contributor (you can also do this at Resource Group or Subscription level).
+1. Once the Event Hub Namespace is created, you will need to add the App Registration Service Principal as Reader, Azure Event Hubs Data Receiver, and the user who will be logging into Microsoft 365 Defender as Contributor (you can also do this at Resource Group or Subscription level).
You do this step at **Event Hub Namespace** \> **Access Control (IAM)** \> **Add** and verify under **Role assignments**:
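If you prefer to script these role assignments rather than add them in the portal, the following is a minimal sketch using the Az PowerShell module's `New-AzRoleAssignment` cmdlet; the application ID, user principal name, and namespace resource ID shown are placeholder assumptions to replace with your own values.

```powershell
# Requires the Az PowerShell module (Install-Module Az) and an authenticated session (Connect-AzAccount).

# Placeholder values (assumptions): your App Registration's application (client) ID,
# the user who will sign in to Microsoft 365 Defender, and your Event Hub Namespace resource ID.
$appId       = "00000000-0000-0000-0000-000000000000"
$userUpn     = "admin@contoso.com"
$namespaceId = "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.EventHub/namespaces/<namespace-name>"

# Grant the App Registration service principal the Reader and Azure Event Hubs Data Receiver roles on the namespace.
New-AzRoleAssignment -ApplicationId $appId -RoleDefinitionName "Reader" -Scope $namespaceId
New-AzRoleAssignment -ApplicationId $appId -RoleDefinitionName "Azure Event Hubs Data Receiver" -Scope $namespaceId

# Grant the user who signs in to Microsoft 365 Defender the Contributor role
# (this can also be granted at the resource group or subscription level).
New-AzRoleAssignment -SignInName $userUpn -RoleDefinitionName "Contributor" -Scope $namespaceId
```

You can then verify the assignments under **Access Control (IAM)** \> **Role assignments** as described above.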
For these Event Hubs (not the namespace), you will need to configure a Shared Access Policy.
**Forward events to event hub**: Select this checkbox.
- **Event-Hub Resource ID**: This value is the Event Hub Namespace Resource ID you recorded when you setup the Event Hub.
+ **Event-Hub Resource ID**: This value is the Event Hub Namespace Resource ID you recorded when you set up the Event Hub.
**Event-Hub name**: If you created an Event Hub inside your Event Hub Namespace, paste the Event Hub name you recorded above.
test-base Overview https://github.com/MicrosoftDocs/microsoft-365-docs/commits/public/microsoft-365/test-base/overview.md
f1.keywords: NOCSH
# What is Test Base for Microsoft 365?
-Test Base for Microsoft 365 (Test Base) is Microsoft's validation service based in the secure Azure environment.
-With Test Base, Software Vendors (SVs) and System Integrators (SIs) can accelerate the validation of their applications against pre-released Windows security and feature builds. This is a highly engaged collaboration between SV partners and Microsoft enabling joint testing, validation and remediation.
+Test Base is an Azure service that enables data-driven application testing while providing user access to intelligent testing from anywhere in the world.
-Test Base provides a great opportunity to build and maintain a secure validation service on Azure, where customers and partners can stage and test their application's workloads against our pre-released security updates.
+The following entities are encouraged to onboard their applications, binaries, and test scripts onto the Test Base for Microsoft 365 service: Independent Software Vendors (ISVs) and System Integrators (SIs) who want to validate their applications, and IT Professionals who want to validate their line-of-business (LOB) applications through integration with Microsoft Intune.
-With Test Base, SVs are provided with more visibility into potential issues that could hinder their application(s) from performing at its best on the new OS release before Microsoft releases the update to the market.
+## Why test your application with Test Base?
-This new service will help SVs make testing efforts simpler and more efficient. Enterprise customers will benefit from SV and Microsoft testing together in a collaborative environment and gain more confidence that their applications will work as expected.
+The Test Base for Microsoft 365 service can accommodate the expansion of your testing matrix as necessary so you will have confidence in the integrity, compatibility, and usability of your applications.
-**Advantages Test Base offers Enterprises and their SV partners include**:
+Test Base enables your application to continue working as expected even as platform dependencies vary and new updates are applied by the Windows update service. With Test Base, you can avoid the aggravation, protracted time commitments, and expense of setting up and maintaining a complex lab environment for testing your applications.
-- Faster rollout of security updates to secure your devices;
-- Lowered update validation costs by hosting the OS changes and application in the same environment;
-- World-class intelligence report from Microsoft about your apps (code coverage, API impact analysis, and so on);
-- Microsoft's expertise in shifting test content and harnesses to Azure.
+In addition, you can automatically test compatibility against security and feature updates for Windows by using secure virtual machines (VMs) while also obtaining access to world-class intelligence for testing your applications. You can also get your apps tested for compatibility against pre-release Windows security updates by submitting a request for access.
-## Guide to navigating the Test Base portal
+## How does Test Base work?
-This guide is divided into four (4) parts to ensure a hitch free experience while using our service:
+To sign up for the Test Base service, see [Create a new Test Base account](createAccount.md).
-1. The **Overview** which provides detailed, step-by-step guidelines on how to upload your application via our self-serve onboarding portal.
+After a customer has enrolled in the Test Base service, it is a simple matter to begin uploading application packages for testing.
-2. The **Quickstarts** section, which provides information on the format for the zipped folder structure and what you need to know when preparing your test scripts.
+Following a successful upload, packages are tested against Windows pre-release updates.
-3. The **How-to guide** which provides detailed outline on how to use Test Base to infer test results.
+After initial tests are successfully completed, the customer can do a deep dive with insights on performance and regression analysis to detect whether pre-release content updates have degraded application performance in any way.
-4. The **Reference** section that provides answers to the typical questions we receive from our customers.
+However, if the package fails any test, the customer can leverage insights from memory or CPU regressions to remediate the failure and then update the package as necessary.
-## Test Base has reached general availability
+With Test Base, the customer can use a single location to manage all packages being tested, which can also facilitate uploading and updating packages to generate new application versions as needed.
-Test Base has officially been declared General Availability during the Microsoft Ignite conference in November 2021.
+> [!NOTE]
+> **So that customers can take advantage of pre-release update content, they must specifically request access to it. Once your request for access to pre-release updates is approved, your uploaded packages will automatically get scheduled to be tested against the pre-release Windows updates for the OS versions selected during onboarding**.
-This means anyone with a valid enterprise Azure account is able to onboard their test collateral and quickly start testing their applications on the service.
+Then, as new Windows pre-release updates become available, application packages are automatically tested with new pre-release content. Thereafter, an additional round of insights may be required. If customers do not specifically request access, then application packages will be tested against only the current released version of Windows.
-## Who should onboard?
-
-We're encouraging all Software Vendors (SVs), System Integrators (SIs) to onboard their applications, binaries, and test scripts onto the service.
+After packages are successfully tested, customers can deliver them to their software customers and end users with confidence and the assurance that Test Base did its job.
## Next steps

Follow the link to get started

> [!div class="nextstepaction"]
-> [Next step](createaccount.md)
+> [Create a new Test Base account | Microsoft Docs](createaccount.md)