Updates from: 05/31/2024 05:29:24
Service Microsoft Docs article Related commit history on GitHub Change details
SharePoint B2b Sync https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/b2b-sync.md
Previously updated : 02/04/2019 Last updated : 04/23/2024 Title: "B2B Sync"-+
This article gives an overview of the B2B Sync experience and describes these re
- Content shared from a tenant in one cloud (for example, Microsoft Azure China) can't be synced by a user in a different cloud (for example, Microsoft Azure Commercial). - On the Mac, Files On-Demand thumbnails will not display from external organization's sites. Thumbnails will display correctly for files from the user's own organization.-- On the Mac, if the guest account was created with a different email address format than the form they are using with the sync app, the external site's content cannot be synced. For example, first.last@fabrikam.com vs alias@fabrikam.com.
+- On the Mac, if the guest account was created with a different email address format than the form they are using with the sync app, the external site's content cannot be synced. For example, <first.last@fabrikam.com> vs <alias@fabrikam.com>.
- On the Mac, the external content may be placed on the local computer in the user's own organization's folder instead of one with the external organization's name.
- Interactive authentication UI for guest accounts from an external organization is not supported by the sync client.
SharePoint Business Requirements https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/business-requirements.md
recommendations: true audience: Admin f1.keywords: NOCSH-+ ms.localizationpriority: medium
SharePoint Compliant Environment https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/compliant-environment.md
recommendations: true audience: Admin f1.keywords: NOCSH-+ ms.localizationpriority: medium
SharePoint Deploy File Collaboration https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/deploy-file-collaboration.md
recommendations: true audience: Admin f1.keywords: NOCSH-+ ms.localizationpriority: medium
SharePoint Hybrid https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/hybrid.md
recommendations: true audience: Admin f1.keywords: NOCSH-+ ms.localizationpriority: medium
SharePoint Modern Experience Sharing Permissions https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/modern-experience-sharing-permissions.md
recommendations: true
audience: Admin f1.keywords: - NOCSH-+ - Strat_SP_modern
SharePoint Plan File Sync https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/plan-file-sync.md
recommendations: true audience: Admin f1.keywords: NOCSH-+ ms.localizationpriority: medium
SharePoint Plan For Sharepoint Onedrive https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/plan-for-sharepoint-onedrive.md
recommendations: true audience: Admin f1.keywords: NOCSH-+ ms.localizationpriority: medium
SharePoint Roll Out Sharepoint Onedrive https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/roll-out-sharepoint-onedrive.md
recommendations: true audience: Admin f1.keywords: NOCSH-+ ms.localizationpriority: medium
SharePoint Setup Wizard https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/setup-wizard.md
recommendations: true audience: Admin f1.keywords: NOCSH-+ ms.localizationpriority: medium
SharePoint Sharepoint Copilot Best Practices https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointOnline/sharepoint-copilot-best-practices.md
audience: Admin
f1.keywords: - NOCSH-+ - M365-collaboration
SharePoint Specify Configuration Database Settings https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/PSConfigHelp/specify-configuration-database-settings.md
You must provide connectivity settings to an existing database server through an
### Database server
-You must type the name for the computer that is running a supported version of the 64 bit edition of SQL Server 2014 SP1. You can type the name of the computer that is running SQL Server as either server or server\instance.
+You must type the name for the computer that is running a supported version of the 64-bit edition of SQL Server 2014 SP1. You can type the name of the computer that is running SQL Server as either server or server\instance.
### Database name
-If you are creating a new configuration database, you can either type the name of the database for the configuration wizard to create, or you can type the name of a database that has been provisioned in advance. If you want to use an existing database you must run Psconfig.exe from the command line to connect to that database, and it must not contain any tables, stored procedures, or other objects.
+If you're creating a new configuration database, you can either type the name of the database for the configuration wizard to create, or you can type the name of a database that has been provisioned in advance. If you want to use an existing database, you must run Psconfig.exe from the command line to connect to that database, and it must not contain any tables, stored procedures, or other objects.
-If you are connecting to an existing configuration database, you can click **Retrieve Database Names**. The configuration databases that exist on the computer that is running SQL Server will be returned, and you can choose the appropriate configuration database.
+If you're connecting to an existing configuration database, you can click **Retrieve Database Names**. The configuration databases that exist on the computer that is running SQL Server will be returned, and you can choose the appropriate configuration database.
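The command-line path mentioned above can be sketched as follows. This is a hedged example, not the article's own: the server, database, account, and passphrase values are placeholders, and exact Psconfig.exe switches vary by SharePoint version, so verify them with `psconfig.exe -help configdb`:

```powershell
# Sketch: connect this server to an existing configuration database.
# Run from an elevated prompt in the SharePoint bin directory.
# All names below are placeholders.
psconfig.exe -cmd configdb -connect `
    -server "SQL01" `
    -database "SharePoint_Config" `
    -user "CONTOSO\spfarm" `
    -password "<password>" `
    -passphrase "<farm passphrase>"
```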
### Database access account

You must enter the credentials for an existing user account that will always be used to connect to the configuration database. If the configuration database is hosted on a different computer, you must provide the credentials for a domain account.
-Although you can enter either a local or a domain account in installations in which you are using a SQL Server installation, a local account will work only for single-server deployments. We recommend that you use a domain account so that you preserve the flexibility to later add more computers to the farm.
+Although you can enter either a local account or a domain account when you're using a SQL Server installation, a local account will work only for single-server deployments. We recommend that you use a domain account so that you preserve the flexibility to later add more computers to the farm.
-To deploy SharePoint products in a server farm environment, you will need a unique domain user account that you can specify as the SharePoint service account. This user account is used to access the configuration database. The database access account will be used for both initial database configuration, and ongoing connections from servers in this farm to the databases.
+To deploy SharePoint products in a server farm environment, you'll need a unique domain user account that you can specify as the SharePoint service account. This user account is used to access the configuration database. The database access account will be used for both initial database configuration, and ongoing connections from servers in this farm to the databases.
> [!IMPORTANT]
> Ensure that your domain does not have Group Policy that prohibits the account chosen as your database access account from running as a service.
-This account also acts as the application pool identity for the SharePoint Central Administration application pool and it is the account under which the SharePoint Timer service runs. The SharePoint 2016 Products Configuration Wizard adds this account to the SQL Server Logins, the SQL Server Database Creator server role, and the SQL Server Security Administrators server role. We recommend that you follow the principle of least privilege and do not make this user account a member of any particular security group on your web servers or your database servers.
+This account also acts as the application pool identity for the SharePoint Central Administration application pool and it's the account under which the SharePoint Timer service runs. The SharePoint 2016 Products Configuration Wizard adds this account to the SQL Server Logins, the SQL Server Database Creator server role, and the SQL Server Security Administrators server role. We recommend that you follow the principle of least privilege and don't make this user account a member of any particular security group on your web servers or your database servers.
-The database access account must have the following permissions: security administrator, database creator, and database owner (DBO) of all SharePoint databases. If you run Psconfig.exe from the command line to install and specify the account using SQL authentication, then permissions for this account must be configured in SQL Server. The configuration wizard does not perform this configuration when you run the Psconfig.exe file from the command line with the SQL authentication option.
+The database access account must have the following permissions: security administrator, database creator, and database owner (DBO) of all SharePoint databases. If you run Psconfig.exe from the command line to install and specify the account using SQL authentication, then permissions for this account must be configured in SQL Server. The configuration wizard doesn't perform this configuration when you run the Psconfig.exe file from the command line with the SQL authentication option.
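When you pre-configure those SQL Server permissions yourself (for example, before a SQL-authentication Psconfig.exe run), the grants can be sketched with the SqlServer PowerShell module. The instance and account names are assumptions for illustration:

```powershell
# Sketch: grant the database access account the securityadmin and
# dbcreator server roles described above. Requires the SqlServer
# module; run as a SQL Server administrator. Names are placeholders.
Import-Module SqlServer
$instance = "SQL01"
$account  = "CONTOSO\spfarm"

Invoke-Sqlcmd -ServerInstance $instance -Query @"
IF NOT EXISTS (SELECT 1 FROM sys.server_principals WHERE name = N'$account')
    CREATE LOGIN [$account] FROM WINDOWS;
ALTER SERVER ROLE [securityadmin] ADD MEMBER [$account];
ALTER SERVER ROLE [dbcreator] ADD MEMBER [$account];
"@
```

Database ownership (DBO) of the SharePoint databases is then established per database as they are created or attached.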
The account that you specify for database access must have the following properties, at minimum:
SharePoint After Installing .NET Security Patches To Address CVE 2018 8421 Sharepoint Crawler May Fail https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/after-installing-.NET-security-patches-to-address-CVE-2018-8421-SharePoint-crawler-may-fail.md
description: "Learn how to fix SharePoint Search after applying the September 20
## Symptoms:
-After applying September 2018 .NET 4.6 or later updates, the crawl will fail and you will see entries similar to the following:
+After applying September 2018 .NET 4.6 or later updates, the crawl will fail and you'll see entries similar to the following:
```xml
<date> <time> mssdmn.exe (0x2730) 0x00D8 SharePoint Foundation Database fa45 High System.TypeInitializationException: The type initializer for 'System.Data.SqlClient.SqlConnection' threw an exception. > System.IO.FileNotFoundException: C:\Program Files\Microsoft Office Servers\16.0\bin\mssdmn.exe at System.Diagnostics.FileVersionInfo.GetVersionInfo(String fileName) at System.Configuration.ClientConfigPaths.SetNamesAndVersion(String applicationFilename, Assembly exeAssembly, Boolean isHttp) at
Microsoft.SharePoint.Utilities.SqlSession.ExecuteReader(SqlCommand command, Comm
```
## Cause:
-The configuration section for 'SqlColumnEncryptionEnclaveProviders' isn't defined but it's required by .NET Sql Client.
+The configuration section for `SqlColumnEncryptionEnclaveProviders` isn't defined but it's required by .NET Sql Client.
## Solution 1:
-Create a config file for mssdmn.exe if it doesn't exist already (by default it does not). Ensure the config file has the following content:
+Create a config file for mssdmn.exe if it doesn't exist already (by default it doesn't). Ensure the config file has the following content:
Content of mssdmn.exe.config:
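A sketch of a config file that declares the missing section described in the Cause section is shown below. Verify the exact `type` string (assembly version and public key token) against the System.Data assembly installed on your servers before using it:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <configSections>
    <!-- Declares the SqlColumnEncryptionEnclaveProviders section so the
         .NET SqlClient type initializer can resolve it. -->
    <section name="SqlColumnEncryptionEnclaveProviders"
             type="System.Data.SqlClient.SqlColumnEncryptionEnclaveProviderConfigurationSection, System.Data, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
  </configSections>
</configuration>
```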
SharePoint An Introduction To Recommendations And Popular Items https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/an-introduction-to-recommendations-and-popular-items.md
Here's how the Usage analytics features work:
2. The usage event is recorded in the **Event store**.
-3. The usage events are sent to the **Analytics Processing Component**, where they are analyzed. The result is sent to the **Search index**.
+3. The usage events are sent to the **Analytics Processing Component**, where they're analyzed. The result is sent to the **Search index**.
4. When visitors browse to a page that contains a **Recommendations** or **Popular Items Web Part**, a query is automatically issued and sent to the search index.
To view the usage event definitions, do the following:
> [!IMPORTANT]
> One important aspect of how Usage analytics works is step 2 of the overview: *The usage event is recorded in the Event store*. In the Event store, each usage event must be recorded using *the URL of the item*.
-This is especially important in a cross-site publishing scenario (see [An introduction to cross-site publishing in SharePoint Server](an-introduction-to-cross-site-publishing.md)). With cross-site publishing, content is stored in an Authoring site collection and displayed in a Publishing site collection. Managed navigation is used together with category pages and catalog item pages to display content (see [Stage 8: Assign a category page and a catalog item page to a term in SharePoint Server](stage-8-assign-a-category-page-and-a-catalog-item-page-to-a-term.md)). This means that when a visitor views an item on the publishing site, the usage event happens on the catalog item page, for example `http://www.contoso/sites/Pages/ContosoItemPage.aspx`. Because the same catalog item page is used to display many items, the usage event cannot be recorded using the URL of the catalog item page. For Usage analytics to work in a cross-site publishing scenario, the usage event must be recorded using the URL of the item in the authoring site collection, for example `http://www.contoso/sites/catalog/Lists/Products/DispForm.aspx?ID=12`.
+This is especially important in a cross-site publishing scenario (see [An introduction to cross-site publishing in SharePoint Server](an-introduction-to-cross-site-publishing.md)). With cross-site publishing, content is stored in an Authoring site collection and displayed in a Publishing site collection. Managed navigation is used together with category pages and catalog item pages to display content (see [Stage 8: Assign a category page and a catalog item page to a term in SharePoint Server](stage-8-assign-a-category-page-and-a-catalog-item-page-to-a-term.md)). This means that when a visitor views an item on the publishing site, the usage event happens on the catalog item page, for example `http://www.contoso/sites/Pages/ContosoItemPage.aspx`. Because the same catalog item page is used to display many items, the usage event can't be recorded using the URL of the catalog item page. For Usage analytics to work in a cross-site publishing scenario, the usage event must be recorded using the URL of the item in the authoring site collection, for example `http://www.contoso/sites/catalog/Lists/Products/DispForm.aspx?ID=12`.
![Usage Event Recorded Using URL](../media/OTCSP_IntroductionRecommendations2.jpg)
-Depending on how you've set up your website, SharePoint Server can automatically record the usage event on the URL of the item in the authoring site. Here's the question that you must ask yourself: which Web Part are you using to display items on your catalog item page? If the answer is Catalog Item Reuse Web Parts, then you have nothing to worry about. The Catalog Item Reuse Web Part will automatically make sure that usage events are recorded correctly in the Event store. But, if you are using a Content Search Web Part to display items on your catalog item page, you must do some additional configuration steps. But don't worry, all these steps will be explained later in this series.
+Depending on how you've set up your website, SharePoint Server can automatically record the usage event on the URL of the item in the authoring site. Here's the question that you must ask yourself: which Web Part are you using to display items on your catalog item page? If the answer is Catalog Item Reuse Web Parts, then you have nothing to worry about. The Catalog Item Reuse Web Part will automatically make sure that usage events are recorded correctly in the Event store. But, if you're using a Content Search Web Part to display items on your catalog item page, you must do some extra configuration steps. But don't worry, all these steps will be explained later in this series.
Before we move on, there's one more thing that you need to know about: the *UsageAnalyticsID* managed property.
The default Usage analytics calculation will consider the color of items when ca
![Item Recommendations](../media/OTCSP_ItemRecommendations.png)
-The two recommended keyboards are the same product, and so are the two mouse devices. These are not good recommendations. To get recommendations that will ignore the product color, we have to change the mapping of the *UsageAnalyticsID* managed property.
+The two recommended keyboards are the same product, and so are the two mouse devices. These aren't good recommendations. To get recommendations that will ignore the product color, we have to change the mapping of the *UsageAnalyticsID* managed property.
By default, *UsageAnalyticsID* is mapped to the crawled property *ows_ProductCatalogItemNumber*. If you used the Product Catalog site template when you created your authoring site collection (as explained in [Stage 1: Create site collections for cross-site publishing in SharePoint Server](stage-1-create-site-collections-for-cross-site-publishing.md)), this crawled property represents the site column *Item Number* in your *Products* list.
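Changing that mapping can be sketched with the search PowerShell cmdlets. This is a hedged outline, not the series' own steps: the color-independent crawled property name (*ows_ProductNumber*) is an assumption for illustration, and a full crawl is needed afterward for the change to take effect:

```powershell
# Sketch: remap UsageAnalyticsId to a crawled property that is the
# same for all color variants of a product.
$ssa = Get-SPEnterpriseSearchServiceApplication
$mp  = Get-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Identity "UsageAnalyticsId"

# Remove the existing mapping(s) on the managed property.
Get-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $mp |
    Remove-SPEnterpriseSearchMetadataMapping -Confirm:$false

# Map it to the color-independent crawled property (name is illustrative).
$cp = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa -Name "ows_ProductNumber"
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa -ManagedProperty $mp -CrawledProperty $cp
```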
SharePoint Assign Or Remove Administrators Of Service Applications https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/assign-or-remove-administrators-of-service-applications.md
description: "Learn how to assign or remove service administrators to a service
[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
-An administrator of a SharePoint Server service application must be a member of the Farm Administrators group to assign or remove additional administrators to that service application. Service application administrators are granted security-trimmed access to the SharePoint Central Administration Web site and can manage settings related to the service application but must be a member of the Farm Administrators group to add and remove other service application administrators.
+An administrator of a SharePoint Server service application must be a member of the Farm Administrators group to assign or remove other administrators to that service application. Service application administrators are granted security-trimmed access to the SharePoint Central Administration Web site and can manage settings related to the service application but must be a member of the Farm Administrators group to add and remove other service application administrators.
> [!NOTE]
> By default, members of the Farm Administrators group have permissions to manage all service applications.
-You can assign or remove service application administrators by using the SharePoint Central Administration websiteor by using Microsoft PowerShell.
+You can assign or remove service application administrators by using the SharePoint Central Administration website or by using Microsoft PowerShell.
## To assign or remove administrators to a service application by using Central Administration
You can assign or remove service application administrators by using the SharePo
6. To remove an administrator:
- - In the second text box on the page, select the administrator whom you want to remove. Note that this step does not remove the user from the system; it merely revokes the user's administrative permissions to the selected service application.
+ - In the second text box on the page, select the administrator whom you want to remove. This step doesn't remove the user from the system; it merely revokes the user's administrative permissions to the selected service application.
- Click **Remove**.
- After you have finished removing administrators, click **OK**.
You can assign or remove service application administrators by using the SharePo
- You must have membership in the **db_owner** fixed database role on all databases that are to be updated.
- - You must be a member of the Administrators group on the server on which you are running the PowerShell cmdlet.
+ - You must be a member of the Administrators group on the server on which you're running the PowerShell cmdlet.
> [!NOTE]
> If these permissions are not satisfied, contact your Setup administrator or SQL Server administrator to request these permissions.
- For additional information about PowerShell permissions, see [Permissions](/powershell/module/sharepoint-server/?view=sharepoint-ps#section3&preserve-view=true) and [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true)
+ For more information about PowerShell permissions, see [Permissions](/powershell/module/sharepoint-server/?view=sharepoint-ps#section3&preserve-view=true) and [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true)
2. Start the SharePoint Management Shell.
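The PowerShell route follows the standard get-ACL, grant, set-ACL pattern for service application security. A minimal sketch, in which the service application name and user account are placeholders:

```powershell
# Sketch: add a user as an administrator of a service application.
# The display name and account below are placeholders.
$serviceApp = Get-SPServiceApplication -Name "Managed Metadata Service"
$security   = Get-SPServiceApplicationSecurity $serviceApp -Admin
$principal  = New-SPClaimsPrincipal -Identity "CONTOSO\janedoe" -IdentityType WindowsSamAccountName

# Grant full control on the service application's admin ACL,
# then write the updated ACL back.
Grant-SPObjectSecurity $security $principal "Full Control"
Set-SPServiceApplicationSecurity $serviceApp -Admin $security

# To remove an administrator instead:
# Revoke-SPObjectSecurity $security $principal
# Set-SPServiceApplicationSecurity $serviceApp -Admin $security
```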
SharePoint Back Up Customizations https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/back-up-customizations.md
Before you begin this operation, review the following list of possible customiza
- Assemblies -- Authored site elements, which are typically created by web designers, are not explicitly compiled and are located in a content database. Authored site elements include the following:
+- Authored site elements, which are typically created by web designers, aren't explicitly compiled and are located in a content database. Authored site elements include the following:
- Master pages
Before you begin this operation, review the following list of possible customiza
- Changes to sites created by direct editing through the browser -- Developed customizations that are not packaged as solutions
+- Developed customizations that aren't packaged as solutions
> [!NOTE]
> Each of these kinds of customizations requires a different type of backup.
The method that you use to back up solution packages is determined by whether th
Trusted solutions are solution packages that farm administrators deploy. Trusted solutions are deployed to the entire farm and can be used on any site within the farm. Trusted solutions are stored in the configuration database. Trusted solutions are backed up when a farm is backed up by using SharePoint Server backup, and are included in configuration-only backups. You can also back up trusted solutions as a group or individually. Trusted solutions are visible in the backup hierarchy.
-Sandboxed solutions are solution packages that site collection administrators can deploy to a single site collection. Sandboxed solutions are stored in the content database that is associated with the site collection to which the solution packages are deployed. They are included in SharePoint Server farm, web application, content database, and site collection backups, but are not visible in the backup hierarchy and cannot be selected or backed up individually.
+Sandboxed solutions are solution packages that site collection administrators can deploy to a single site collection. Sandboxed solutions are stored in the content database that is associated with the site collection to which the solution packages are deployed. They're included in SharePoint Server farm, web application, content database, and site collection backups, but aren't visible in the backup hierarchy and can't be selected or backed up individually.
We recommend that you keep a backup of the original .wsp file and the source code used to build the .wsp file for both trusted solutions and sandboxed solutions.
We recommend that you keep a backup of the original .wsp file and the source cod
- **db_owner** fixed database role on all databases that are to be updated.
- - Administrators group on the server on which you are running the PowerShell cmdlets.
+ - Administrators group on the server on which you're running the PowerShell cmdlets.
An administrator can use the **Add-SPShellAdmin** cmdlet to grant permissions to use SharePoint Server cmdlets.

> [!NOTE]
- > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
+ > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For more information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
2. Start the SharePoint Management Shell.
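Backing up trusted solutions from the shell can be sketched as follows; the backup share path and solution file name are placeholders:

```powershell
# Sketch: back up only the trusted solution store to a UNC share.
# The share path is a placeholder; the account running the backup
# needs write access to it.
Backup-SPFarm -Directory \\BackupServer\Backups -BackupMethod Full -Item "Farm\Solutions"

# Or back up a single trusted solution by name:
# Backup-SPFarm -Directory \\BackupServer\Backups -BackupMethod Full -Item "Farm\Solutions\contoso-solution.wsp"
```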
For more information, see [Backup-SPFarm](/powershell/module/sharepoint-server/B
### Backing up sandboxed solutions in SharePoint Server <a name="SandboxedSolutions"> </a>
-You cannot back up only sandboxed solutions. Instead, you must back up the farm, Web application, or content database with which the sandboxed solution is associated.
+You can't back up only sandboxed solutions. Instead, you must back up the farm, Web application, or content database with which the sandboxed solution is associated.
## Back up authored site elements in SharePoint Server <a name="AuthoredSite"> </a>
-You cannot back up only authored site elements. Instead, you must back up the farm, Web application, or content database with which the authored site element is associated.
+You can't back up only authored site elements. Instead, you must back up the farm, Web application, or content database with which the authored site element is associated.
## Back up workflows in SharePoint Server <a name="Workflows"> </a>

Workflows are a special case of customizations that you can back up. Make sure that your backup and recovery plan addresses any of the following scenarios that apply to your environment:

-- Declarative workflows, such as those that were created in SharePoint Designer, are stored in the content database for the site collection to which they are deployed. Backing up the content database protects these workflows.
+- Declarative workflows, such as those that were created in SharePoint Designer, are stored in the content database for the site collection to which they're deployed. Backing up the content database protects these workflows.
- Custom declarative workflow actions have components in the following three locations:
Workflows are a special case of customizations that you can back up. Make sure t
- The XML definition files (.ACTIONS files) are stored in the 16\TEMPLATE\< _LCID_>\Workflow directory.
- - An XML entry to mark the action as an authorized type is stored in the Web.config file for the Web applications in which it is used.
+ - An XML entry to mark the action as an authorized type is stored in the Web.config file for the Web applications in which it's used.
If the farm workflows use custom actions, you should use a file backup system to protect these files and XML entries. Similar to features such as Web Parts and event receivers, these files should be reapplied to the farm as needed after recovery.
Workflows are a special case of customizations that you can back up. Make sure t
- If you create a custom workflow that interacts with a site collection other than the one where the workflow is deployed, you must back up both site collections to protect the workflow. This includes workflows that write to a history list or other custom list in another site collection. Performing a farm backup is sufficient to back up all site collections in the farm and all workflows that are associated with them. -- Workflows that are not yet deployed must be backed up and restored separately. When you are developing a new workflow but have not yet deployed it to the SharePoint Server farm, make sure that you back up the folder where you store the workflow project files by a file system backup application.
+- Workflows that aren't yet deployed must be backed up and restored separately. When you're developing a new workflow but haven't yet deployed it to the SharePoint Server farm, make sure that you back up the folder where you store the workflow project files by a file system backup application.
## Back up changes to the Web.config file in SharePoint Server <a name="WebConfig"> </a>

A common customization to SharePoint Server is to change the Web.config file. We strongly recommend that you make changes to the Web.config file by using Central Administration or the SharePoint Server APIs and object model. Because these changes are stored in the configuration database, they can be recovered from a farm or configuration-only backup.
-Changes to the Web.config file that are not made by using Central Administration or the SharePoint Server APIs and object model should be protected by using a file system backup.
+Changes to the Web.config file that aren't made by using Central Administration or the SharePoint Server APIs and object model should be protected by using a file system backup.
> [!NOTE]
> If you are using forms-based authentication, provider registration in the Web.config file is manual, and is not protected by SharePoint Server backup. In this case, make sure that you back up the Web.config file by using a file system backup.
Changes to the Web.config file that are not made by using Central Administration
## Back up third-party products in SharePoint Server <a name="ThirdParty"> </a>
-If third-party products are deployed as solution packages, they are protected by SharePoint Server backup. We recommend that you keep all the original files, distribution media, documentation, and the license and product keys that are required for installation.
+If third-party products are deployed as solution packages, they're protected by SharePoint Server backup. We recommend that you keep all the original files, distribution media, documentation, and the license and product keys that are required for installation.
-## Back up developed customizations that are not packaged as solutions in SharePoint Server
+## Back up developed customizations that aren't packaged as solutions in SharePoint Server
<a name="DevelopedCustomizations"> </a>
-Backing up developed customizations that are not deployed as solution packages can be a complex process because the customization file locations might not be stored in standardized places and SharePoint Server does not automatically back them up.
+Backing up developed customizations that aren't deployed as solution packages can be a complex process because the customization file locations might not be stored in standardized places and SharePoint Server doesn't automatically back them up.
Consult with the development team or customization vendor to determine whether the customizations involve additional add-in software or files in other locations. We recommend that you back up these directories with a file system backup solution. The following table lists locations where developed customizations are typically stored on Web servers.
SharePoint Capacity Planning https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/capacity-planning.md
Workload describes the demand that the system will need to sustain, the user bas
|Outlook Social Connector <br/> || |Other interactions (Custom Applications/Web services) <br/> || -- **Concurrent users** - It is most common to measure the concurrency of operations executed on the server farm as the number of distinct users generating requests in a given time frame. The key metrics are the daily average and the concurrent users at peak load.
+- **Concurrent users** - It's most common to measure the concurrency of operations executed on the server farm as the number of distinct users generating requests in a given time frame. The key metrics are the daily average and the concurrent users at peak load.
- **Requests per second (RPS)** - RPS is a commonly used indicator that describes the demand on the server farm, expressed as the number of requests processed by the farm per second, with no differentiation between the type or size of requests. Every organization's user base generates system load at a rate that is dependent on the organization's unique usage characteristics. For more information, see [Glossary](/previous-versions/office/sharepoint-server-2010/ff758647(v=office.14)#glossary). -- **Total daily requests** - Total daily requests is a good indicator of the overall load the system will need to handle. It is most common to measure all requests except authentication handshake requests (HTTP status 401) over a 24-hour period.
+- **Total daily requests** - Total daily requests is a good indicator of the overall load the system will need to handle. It's most common to measure all requests except authentication handshake requests (HTTP status 401) over a 24-hour period.
- **Total daily users** - Total users is another key indicator of the overall load the system will need to handle. This measurement is the actual number of unique users in a 24-hour period, not the total number of employees in the organization.
Workload describes the demand that the system will need to sustain, the user bas
#### Estimating your production workload
-In estimating the required throughput your farm needs to be able to sustain, begin with estimating the mix of transactions that will be used in your farm. Focus on analyzing the most frequently used transactions that the system will serve, and understand how frequently they will be used and by how many users. This understanding will help you later when you validate whether the farm can sustain such loads in pre-production testing.
+In estimating the required throughput your farm needs to be able to sustain, begin with estimating the mix of transactions that will be used in your farm. Focus on analyzing the most frequently used transactions that the system will serve, and understand how frequently they'll be used and by how many users. This understanding will help you later when you validate whether the farm can sustain such loads in preproduction testing.
The following diagram describes the relationship of the workload and load on the system:
To estimate your expected workload, collect the following information:
- Web page browses. - File downloads and uploads. - Office Web Application views and edits in the browser.
- - Co-authoring interactions.
+ - Coauthoring interactions.
- SharePoint Workspace site syncs. - Outlook Social Connections. - RSS sync (in Outlook or other viewers).
To estimate your expected workload, collect the following information:
- The total number of users per day that are expected to utilize each capability. - Derive the estimated concurrent users and high-level Requests per second.
- You will be making some assumptions. For example:
+ You'll be making some assumptions. For example:
- Present concurrency. - The factor of RPS per concurrent user that's different across capabilities. Use the workload table earlier in this section for your estimates. It's important to focus on peak hours, rather than average throughput. Planning for peak activity allows you to properly size your SharePoint Server 2013-based solution.
-If you have an existing Office SharePoint Server 2007 solution, you can mine the IIS log files or look to other Web monitoring tools you have to better understand some of the expected behaviors from the existing solution. Otherwise, see the instructions in the section below for more details. If you're not migrating from an existing solution, you should fill out the table using rough estimates. In later steps, you will need to validate your assumptions and tune the system.
+If you have an existing Office SharePoint Server 2007 solution, you can mine the IIS log files or look to other Web monitoring tools you have to better understand some of the expected behaviors from the existing solution. Otherwise, see the instructions in the section below for more details. If you're not migrating from an existing solution, you should fill out the table using rough estimates. In later steps, you'll need to validate your assumptions and tune the system.
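The derivation above (daily users, assumed concurrency, assumed RPS factor) can be sketched as a quick calculation. All numbers here are hypothetical examples, not sizing guidance:

```python
def estimate_peak_rps(total_daily_users, peak_concurrency_pct, rps_per_concurrent_user):
    """Derive estimated concurrent users at peak, then peak requests
    per second, from assumed workload factors."""
    concurrent_users = total_daily_users * peak_concurrency_pct / 100
    return concurrent_users * rps_per_concurrent_user

# Hypothetical example: 10,000 daily users, 10% concurrent at peak,
# each concurrent user generating 0.5 requests per second.
print(estimate_peak_rps(10_000, 10, 0.5))  # 500.0
```

Treat any such figure as a starting assumption to be validated against preproduction test results before you size hardware.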
#### Analyzing your SharePoint Server 2013 IIS Logs You'll need to extract data from the ULS and IIS logs to discover key metrics about an existing SharePoint Server 2013 deployment. For example: - How many users are active.-- How heavily they are using the system.
+- How heavily they're using the system.
- What kind of requests are coming in. - What kind of clients the requests originate from.
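A minimal sketch of that log mining, assuming the IIS W3C extended log format with a `#Fields:` header (field names and order vary with your logging configuration, and the log lines below are invented examples):

```python
def summarize_iis_log(lines):
    """Count distinct authenticated users and total requests from
    W3C extended format IIS log lines, skipping authentication
    handshake requests (HTTP status 401)."""
    fields = []
    users = set()
    total_requests = 0
    for line in lines:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]  # column names follow the directive
            continue
        if line.startswith("#") or not line.strip():
            continue  # other directives and blank lines
        row = dict(zip(fields, line.split()))
        if row.get("sc-status") == "401":
            continue  # authentication handshake, not a user request
        total_requests += 1
        if row.get("cs-username", "-") != "-":
            users.add(row["cs-username"])
    return len(users), total_requests

sample = [
    "#Fields: date time cs-username cs-uri-stem sc-status",
    "2013-05-01 10:00:01 CONTOSO\\alice /sites/team/default.aspx 200",
    "2013-05-01 10:00:02 CONTOSO\\bob /sites/team/doc.docx 200",
    "2013-05-01 10:00:03 CONTOSO\\alice /sites/team/doc.docx 401",
]
print(summarize_iis_log(sample))  # (2, 2)
```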
Dataset describes the volume of content stored in the system and how it can be d
- **Content size** - Understanding the size of the content that you expect to store in the SharePoint Server 2013 system is important for planning and architecting the system storage, and also for properly sizing the Search solution that will crawl and index this content. The content size is described in total disk space. If you're migrating content from an existing deployment, you might find it simple to identify the total content size to move. During planning, you should leave room for growth over time based. -- Total number of documents - Other than the data corpus size, it's important to track the overall number of items. The system reacts differently if 100 GB of data is composed of 50 files of 2 GB each versus 100,000 files of 1 KB each. In large deployments, the less stress there is on a single item, document, or area of documents, the better performance will be. Widely distributed content like multiple smaller files across many sites and site collection is easier to serve then a single large document library with large files.
+- Total number of documents - Other than the data corpus size, it's important to track the overall number of items. The system reacts differently if 100 GB of data is composed of 50 files of 2 GB each versus 100,000 files of 1 KB each. In large deployments, the less stress there is on a single item, document, or area of documents, the better performance will be. Widely distributed content like multiple smaller files across many sites and site collections is easier to serve than a single large document library with large files.
-- **Maximum site collection size** - It's important to identify what's the biggest unit of content that you will store in SharePoint Server 2013; usually it's an organizational need that prevents you from splitting that unit of content. Average size of all site collections and the estimated total number of site collections are additional indicators that will help you identify your preferred data architecture.
+- **Maximum site collection size** - It's important to identify the biggest unit of content that you'll store in SharePoint Server 2013; usually it's an organizational need that prevents you from splitting that unit of content. Average size of all site collections and the estimated total number of site collections are other indicators that will help you identify your preferred data architecture.
- **Service applications data characteristics** - In addition to analyzing the storage needs for the content store, you should analyze and estimate the sizes of other SharePoint Server 2013 stores, including:
Dataset describes the volume of content stored in the system and how it can be d
### Setting Farm Performance and Reliability Targets
-One of the deliverables of [Step 1: Model](capacity-planning.md#step1) is a good understanding of the performance and reliability targets that best fit the needs of your organization. A properly designed SharePoint Server 2013 solution should be able to achieve "four nines" (99.99%) of uptime with sub-second server responsiveness.
+One of the deliverables of [Step 1: Model](capacity-planning.md#step1) is a good understanding of the performance and reliability targets that best fit the needs of your organization. A properly designed SharePoint Server 2013 solution should be able to achieve "four nines" (99.99%) of uptime with subsecond server responsiveness.
The indicators used to describe the performance and reliability of the farm can include: -- **Server availability**: Usually described by the percent of overall uptime of the system. You should track any unexpected downtime and compare the overall availability to the organizational target you set. The targets are commonly described by a number of nines (that's, 99%, 99.9%, 99.99%)
+- **Server availability**: Described by the percent of overall uptime of the system. You should track any unexpected downtime and compare the overall availability to the organizational target you set. The targets are commonly described by a number of nines (that is, 99%, 99.9%, 99.99%).
- **Server responsiveness**: The time it takes the farm to serve requests is a good indicator to track the health of the farm. This indicator is named _server side latency_. It's common to use the average or median (the 50th percentile) latency of the daily requests being served. The targets are commonly described in seconds or fractions of seconds. If the target is to serve pages in less than two seconds, the server-side goal should be in fractions of a second. This increased performance allows time for the page to reach the client and to render in the browser. Also, longer server response times are usually an indication of an unhealthy farm. RPS can rarely keep up if you spend more than a second on the server for most requests. -- **Server spikiness**: Another good server side latency indicator worth tracking is the behavior of the slowest 5% of all requests. Slower requests are usually the requests that hit the system when it's under higher load or even more commonly, requests that are impacted by less frequent activity that occur while users interact with the system; a healthy system is one that has the slowest requests under control as well. The target here is similar to Server Responsiveness, but to achieve sub-second response on server spikiness, you will need to build the system with numerous spare resources to handle the spikes in load.
+- **Server spikiness**: Another good server side latency indicator worth tracking is the behavior of the slowest 5% of all requests. Slower requests are usually the requests that hit the system when it's under higher load or, even more commonly, requests that are impacted by less frequent activity that occurs while users interact with the system; a healthy system is one that has the slowest requests under control as well. The target here is similar to Server Responsiveness, but to achieve subsecond response on server spikiness, you'll need to build the system with numerous spare resources to handle the spikes in load.
-- **System resource utilization**: Other common indicators used to track the health of the system are a collection of system counters that indicate the health of each server in the farm topology. The most frequently used indicators to track are % CPU utilization and Available Memory; however, there are several more counters that can help identify a non-healthy system; more details can be found in [Step 5: Monitor and Maintain](capacity-planning.md#step5).
+- **System resource utilization**: Other common indicators used to track the health of the system are a collection of system counters that indicate the health of each server in the farm topology. The most frequently used indicators to track are % CPU utilization and Available Memory; however, there are several more counters that can help identify a nonhealthy system; more details can be found in [Step 5: Monitor and Maintain](capacity-planning.md#step5).
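For intuition only, the downtime budget implied by each availability target can be computed directly:

```python
def downtime_minutes_per_year(availability_pct):
    """Minutes of unplanned downtime per year allowed by an
    availability target expressed as a percentage."""
    return (100 - availability_pct) / 100 * 365 * 24 * 60

for target in (99.0, 99.9, 99.99):
    print(f"{target}% uptime allows about "
          f"{downtime_minutes_per_year(target):.0f} minutes/year of downtime")
```

"Four nines" leaves roughly 53 minutes of downtime per year, which is why meeting that target requires redundancy at every tier of the farm.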
## Step 2: Design <a name="step2"> </a>
By the end of this step you should have a design for your physical topology and
The hardware specifications and the number of machines you lay out are tightly related. To handle a specific load, there are several solutions you can choose to deploy. It's common to either use a small set of strong machines (scale up) or a larger set of smaller machines (scale out); each solution has its advantages and disadvantages when it comes to capacity, redundancy, power, cost, space, and other considerations.
-We recommend that you begin this step by determining your architecture and topology. Define how you plan to layout the different farms and the different services in each farm, and then pick the hardware specifications for each of the individual servers in your design. You can also execute this process by identifying the hardware specifications you're expected to deploy (many organizations are constrained to a certain company standard) and then define your architecture and topology.
+We recommend that you begin this step by determining your architecture and topology. Define how you plan to lay out the different farms and the different services in each farm, and then pick the hardware specifications for each of the individual servers in your design. You can also execute this process by identifying the hardware specifications you're expected to deploy (many organizations are constrained to a certain company standard) and then define your architecture and topology.
Use the following table to record your design parameters. The data included is sample data; don't use it to size your farm. It's intended to demonstrate how to use this table for your own data.
Selecting the right specifications for the machines in your farm is a crucial st
The core capacity and performance hardware features of servers reflect four main categories: processing power, disk performance, network capacity, and memory capabilities of a system.
-Another thing to consider is using virtualized machines. A SharePoint Server 2013 farm can be deployed using virtual machines. Although virtualization has not been found to add any performance benefits, it does provide manageability benefits. Virtualizing SQL Server-based computers is not recommended, but there may be certain benefits to virtualizing the Web server and application server tiers. For more information, see [Virtualization planning](/previous-versions/office/sharepoint-server-2010/ff607968(v=office.14)) (/previous-versions/office/sharepoint-server-2010/ff607968(v=office.14)).
+Another thing to consider is using virtualized machines. A SharePoint Server 2013 farm can be deployed using virtual machines. Although virtualization hasn't been found to add any performance benefits, it does provide manageability benefits. Virtualizing SQL Server-based computers isn't recommended, but there may be certain benefits to virtualizing the Web server and application server tiers. For more information, see [Virtualization planning](/previous-versions/office/sharepoint-server-2010/ff607968(v=office.14)).
For more information about hardware requirements, see [Hardware and software requirements for SharePoint Server 2016](../install/hardware-and-software-requirements.md).
The memory requirements of database servers are tightly dependent on the databas
#### Choosing Networks
-In addition to the benefit offered to users, if clients have fast data access through the network, a distributed farm must have fast access for inter-server communication. This is especially true when you distribute services across multiple servers or federate some services to other farms. There is significant traffic in a farm across the web server tier, the application server tier, and the database server tier, and network can easily become a bottleneck under certain conditions like dealing with large files or high loads.
+In addition to the fast data access the network offers clients, a distributed farm must have fast inter-server communication. This is especially true when you distribute services across multiple servers or federate some services to other farms. There's significant traffic in a farm across the web server tier, the application server tier, and the database server tier, and the network can easily become a bottleneck under certain conditions, such as when dealing with large files or high loads.
Web servers and application servers should be configured to use at least two network interface cards (NICs): one NIC to handle end-user traffic and the other to handle the inter-server communication. Network latency between servers can have a significant effect on performance. Therefore, it's important to maintain less than 1 millisecond of network latency between the web server and the SQL Server-based computers hosting the content databases. The SQL Server-based computers that host each service application database should also be as close as possible to the consuming application server. The network between farm servers should have at least 1 Gbps of bandwidth.

#### Choosing Disks and Storage
-Disk management's not simply a function of providing sufficient space for your data. You must assess the on-going demand and growth, and make sure that the storage architecture is not slowing the system down. You should always make sure that you have at least 30 percent additional capacity on each disk, above your highest data requirement estimate, to leave room for future growth. Additionally, in most production environments, disk speed (IOps) is crucial to providing sufficient throughput to satisfy the servers' storage demands. You must estimate the amount of traffic (IOps) the major databases will require in your deployment and allocate enough disks to satisfy that traffic.
+Disk management isn't simply a function of providing sufficient space for your data. You must assess the ongoing demand and growth, and make sure that the storage architecture isn't slowing the system down. You should always make sure that you have at least 30 percent additional capacity on each disk, above your highest data requirement estimate, to leave room for future growth. Additionally, in most production environments, disk speed (IOps) is crucial to providing sufficient throughput to satisfy the servers' storage demands. You must estimate the amount of traffic (IOps) the major databases will require in your deployment and allocate enough disks to satisfy that traffic.
For more information about how to choose disks for database servers, see [Storage and SQL Server capacity planning and configuration (SharePoint Server)](storage-and-sql-server-capacity-planning-and-configuration.md).
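The two sizing checks above (30 percent capacity headroom, and enough disks for the estimated IOps demand) can be sketched as follows; all the numbers are hypothetical examples:

```python
import math

def required_capacity_gb(highest_estimate_gb, headroom_pct=30):
    """Disk capacity needed after reserving headroom above the
    highest data requirement estimate."""
    return round(highest_estimate_gb * (1 + headroom_pct / 100), 2)

def disks_needed_for_iops(required_iops, iops_per_disk):
    """Whole disks needed to satisfy an estimated IOps demand."""
    return math.ceil(required_iops / iops_per_disk)

# Hypothetical example: 1,000 GB highest estimate; 2,000 IOps demand
# served by disks rated at 150 IOps each.
print(required_capacity_gb(1000))        # 1300.0
print(disks_needed_for_iops(2000, 150))  # 14
```

The per-disk IOps rating and the demand figure depend entirely on your storage hardware and measured workload; use vendor specifications and test results, not these placeholder values.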
The testing and optimization stage is an important component of effective capaci
Once you have tested your environment, you can analyze the test results to determine what changes must be made in order to achieve the performance and capacity targets you established in [Step 1: Model](capacity-planning.md#step1).
-Following are the sub steps for pre-production:
+Following are the substeps for preproduction:
- Create the test environment that mimics the initial architecture you designed in [Step 2: Design](capacity-planning.md#step2).
Before you deploy SharePoint Server 2013 to a production environment, it's impor
### Optimize
-If you cannot meet your capacity and performance targets by scaling your farm hardware or making changes to the topology, you may have to consider revising your solution. For example, if your initial requirements were for a single farm for collaboration, Search and Social, you may have to federate some services such as search to a dedicated services farm, or split the workload across more farms. One alternative is to deploy a dedicated farm for social and another for team collaboration.
+If you can't meet your capacity and performance targets by scaling your farm hardware or making changes to the topology, you may have to consider revising your solution. For example, if your initial requirements were for a single farm for collaboration, Search and Social, you may have to federate some services such as search to a dedicated services farm, or split the workload across more farms. One alternative is to deploy a dedicated farm for social and another for team collaboration.
## Step 4: Deploy <a name="step4"> </a>
Once you have executed your final round of tests and confirmed that the architec
The appropriate rollout strategy will vary depending on the environment and situation. While SharePoint Server 2013 deployment generally is outside the scope of this document, there are certain suggested activities that may come out of the capacity planning exercise. Here are some examples: -- **Deploying a new SharePoint Server 2013 farm:** The capacity planning exercise should have guided and confirmed plans for a design and deployment of SharePoint Server 2016. In this case, the rollout will be the first broad deployment of SharePoint Server 2013. It will require moving or rebuilding the servers and services that were used during the capacity planning exercises into production. This is the most straight-forward scenario because there are not any upgrades or modifications needed to an existing farm.
+- **Deploying a new SharePoint Server 2013 farm:** The capacity planning exercise should have guided and confirmed plans for a design and deployment of SharePoint Server 2016. In this case, the rollout will be the first broad deployment of SharePoint Server 2013. It will require moving or rebuilding the servers and services that were used during the capacity planning exercises into production. This is the most straightforward scenario because there aren't any upgrades or modifications needed to an existing farm.
- **Upgrading an Office SharePoint Server 2007 farm to SharePoint Server 2013:** The capacity planning exercise should have validated the design for a farm that can meet existing demands and scale up to meet increased demand and usage of a SharePoint Server 2013 farm. Part of the capacity planning exercise should have included test migrations to validate how long the upgrade process will take, whether any custom code must be modified or replaced, whether any third-party tools have to be updated, and so on. At the conclusion of capacity planning you should have a validated design, an understanding of how much time it will take to upgrade, and a plan for how best to work through the upgrade process - for example, an in-place upgrade, or migrating content databases into a new farm. If you're doing an in-place upgrade, then during capacity planning you may have found that additional or upgraded hardware will be needed, along with considerations for downtime. Part of the output from the planning exercise should be a list of the hardware changes that are needed and a detailed plan to deploy the hardware changes to the farm first. Once the hardware platform that was validated during capacity planning is in place, you can move forward with the process of upgrading to SharePoint Server 2013.
SharePoint Configure Automatic Password Change https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/configure-automatic-password-change.md
Use the Password Management Settings page of Central Administration to configure
4. In the **Notification E-Mail Address** section of the Password Management Settings page, enter the e-mail address of one person or group to be notified of any imminent password change or expiration events.
-5. If automatic password change is not configured for a managed account, enter a numeric value in the **Account Monitoring Process Settings** section that indicates the number of days before password expiration that a notification will be sent to the e-mail address configured in the **Notification E-Mail Address** section.
+5. If automatic password change isn't configured for a managed account, enter a numeric value in the **Account Monitoring Process Settings** section that indicates the number of days before password expiration that a notification will be sent to the e-mail address configured in the **Notification E-Mail Address** section.
6. In the **Automatic Password Change Settings** section, enter a numeric value that indicates the number of seconds that automatic password change will wait (after notifying services of a pending password change) before starting the change. Enter a numeric value that indicates the number of times a password change will be tried before the process stops.
Use the following guidance to avoid the most common issues that can occur when y
### Password mismatch
-If the automatic password change process fails because there is a password mismatch between Active Directory Domain Services (AD DS) and SharePoint Server, the password change process can result in access denial at logon, an account lockout, or AD DS read errors. If any of these issues occur, make sure that your AD DS passwords are configured correctly and that the AD DS account has read access for setup. Use Microsoft PowerShell to fix any password mismatch issues that might occur, and then resume the password change process.
+If the automatic password change process fails because there's a password mismatch between Active Directory Domain Services (AD DS) and SharePoint Server, the password change process can result in access denial at logon, an account lockout, or AD DS read errors. If any of these issues occur, make sure that your AD DS passwords are configured correctly and that the AD DS account has read access for setup. Use Microsoft PowerShell to fix any password mismatch issues that might occur, and then resume the password change process.
**To correct for a password mismatch by using PowerShell**
If the automatic password change process fails because there is a password misma
- **db_owner** fixed database role on all databases that are to be updated.
- - Administrators group on the server on which you are running the PowerShell cmdlets.
+ - Administrators group on the server on which you're running the PowerShell cmdlets.
- Add memberships that are required beyond the minimums above. An administrator can use the **Add-SPShellAdmin** cmdlet to grant permissions to use SharePoint Server cmdlets. > [!NOTE]
- > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
+ > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For more information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
2. Start the SharePoint Management Shell.
If the automatic password change process fails because there is a password misma
### Service account provisioning failure
-If service account provisioning or re-provisioning fails on one or more servers in the farm, check the status of the Timer Service. If the Timer Service has stopped, restart it. Consider using the following Stsadm command to immediately start Timer Service administration jobs: `stsadm -o execadmsvcjobs`
+If service account provisioning or reprovisioning fails on one or more servers in the farm, check the status of the Timer Service. If the Timer Service has stopped, restart it. Consider using the following Stsadm command to immediately start Timer Service administration jobs: `stsadm -o execadmsvcjobs`
-If restarting the Timer Service does not resolve the issue, use PowerShell to repair the managed account on each server in the farm that has experienced a provisioning failure.
+If restarting the Timer Service doesn't resolve the issue, use PowerShell to repair the managed account on each server in the farm that has experienced a provisioning failure.
**To resolve a service account provisioning failure**
If restarting the Timer Service does not resolve the issue, use PowerShell to re
- **db_owner** fixed database role on all databases that are to be updated.
- - Administrators group on the server on which you are running the PowerShell cmdlets.
+ - Administrators group on the server on which you're running the PowerShell cmdlets.
- Add memberships that are required beyond the minimums above. An administrator can use the **Add-SPShellAdmin** cmdlet to grant permissions to use SharePoint Server cmdlets. > [!NOTE]
- > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
+ > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For more information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
2. Start the SharePoint Management Shell.
If restarting the Timer Service does not resolve the issue, use PowerShell to re
For more information, see [Repair-SPManagedAccountDeployment](/powershell/module/sharepoint-server/Repair-SPManagedAccountDeployment?view=sharepoint-ps&preserve-view=true).
-If the previous procedure does not resolve a service account provisioning failure, it is likely because the farm encryption key cannot be decrypted. If this is the issue, use PowerShell to update the local server pass phrase to match the pass phrase for the farm.
+If the previous procedure doesn't resolve a service account provisioning failure, it's likely because the farm encryption key can't be decrypted. If this is the issue, use PowerShell to update the local server pass phrase to match the pass phrase for the farm.
**To update the local server pass phrase**
If the previous procedure does not resolve a service account provisioning failur
- **db_owner** fixed database role on all databases that are to be updated.
- - Administrators group on the server on which you are running the PowerShell cmdlets.
+ - Administrators group on the server on which you're running the PowerShell cmdlets.
- Add memberships that are required beyond the minimums above. An administrator can use the **Add-SPShellAdmin** cmdlet to grant permissions to use SharePoint Server cmdlets. > [!NOTE]
- > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
+ > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For more information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
2. Start the SharePoint Management Shell.
For more information, see [Set-SPPassPhrase](/powershell/module/sharepoint-serve
### Imminent password expiration
-If the password is about to expire, but automatic password change has not been configured for this account, use PowerShell to update the account password to a new value that can be chosen by the administrator or automatically generated. After you have updated the account password, make sure that the Timer Service is started and the Administrator Service is enabled on all servers in the farm. Then, the password change can be propagated to all of the servers in the farm.
+If the password is about to expire, but automatic password change hasn't been configured for this account, use PowerShell to update the account password to a new value that can be chosen by the administrator or automatically generated. After you have updated the account password, make sure that the Timer Service is started and the Administrator Service is enabled on all servers in the farm. Then, the password change can be propagated to all of the servers in the farm.
> [!NOTE]
> When an administrator performs a password change for the servers in the SharePoint search topology, there is an implied query downtime when the services are restarted. The query downtime is typically in the range of 3-5 minutes.
If the password is about to expire, but automatic password change has not been c
- **db_owner** fixed database role on all databases that are to be updated.
- - Administrators group on the server on which you are running the PowerShell cmdlets.
+ - Administrators group on the server on which you're running the PowerShell cmdlets.
- Add memberships that are required beyond the minimums above. An administrator can use the **Add-SPShellAdmin** cmdlet to grant permissions to use SharePoint Server cmdlets.
> [!NOTE]
- > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
+ > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For more information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
2. Start the SharePoint Management Shell.
SharePoint Create An Audience For Sharepoint Server https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/create-an-audience-for-sharepoint-server.md
Learn how to use a Microsoft PowerShell script to create an audience.
- You must read [about_Execution_Policies](/previous-versions//dd347641(v=technet.10)).
-2. Copy the following variable declarations, and paste them into a text editor such as Notepad. Set input values specific to your organization. You will use these values in step 3. Save the file, and name it Audiences.ps1.
+2. Copy the following variable declarations, and paste them into a text editor such as Notepad. Set input values specific to your organization. You'll use these values in step 3. Save the file, and name it Audiences.ps1.
```
## Settings you may want to change for Audience Name and Description ##
Learn how to use a Microsoft PowerShell script to create an audience.
./Audiences.ps1
```
-For additional information about PowerShell scripts and .ps1 files, see [Running Windows PowerShell Scripts](/previous-versions/windows/it-pro/windows-powershell-1.0/ee176949(v=technet.10)).
+For more information about PowerShell scripts and .ps1 files, see [Running Windows PowerShell Scripts](/previous-versions/windows/it-pro/windows-powershell-1.0/ee176949(v=technet.10)).
-For additional information about how to create audiences, see [AudienceRuleComponent class](/previous-versions/office/sharepoint-server/ms578007(v=office.15)).
+For more information about how to create audiences, see [AudienceRuleComponent class](/previous-versions/office/sharepoint-server/ms578007(v=office.15)).
SharePoint Create New Certificates https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/create-new-certificates.md
The cmdlet parameters are:
| | |
|FriendlyName| The friendly name for the certificate. This name can be used to help you remember the purpose of this certificate. The friendly name will only be visible to SharePoint farm administrators, not to end users.|
|CommonName | The primary DNS domain name or IP address that this certificate will be assigned to. Fully Qualified Domain Names (FQDNs) are recommended.|
-|AlternativeNames | Additional DNS domain names or IP addresses that this certificate will be assigned to. Fully Qualified Domain Names (FQDNs) are recommended.|
+|AlternativeNames | Other DNS domain names or IP addresses that this certificate will be assigned to. Fully Qualified Domain Names (FQDNs) are recommended.|
|OrganizationalUnit | The name of your department within your organization or company. If this parameter isn't specified, the default organizational unit of the farm will be used.|
|Organization| The legally registered name of your organization or company. If this parameter isn't specified, the default organization of the farm will be used.|
|Locality | The name of the city or locality where your organization is legally located. Don't abbreviate the name. If this parameter isn't specified, the default locality of the farm will be used.|
|State | The name of the state or province where your organization is legally located. Don't abbreviate the name. If this parameter isn't specified, the default state of the farm will be used.|
|Country | The two letter country code where your organization is legally located. This must be an ISO 3166-1 alpha-2 country code. If this parameter isn't specified, the default country of the farm will be used.|
|Exportable| Specifies whether the private key of the certificate may be exported. If this parameter isn't specified, the private key of certificate deployed to the Windows Certificate Store on each server in the SharePoint farm won't be exportable, and SharePoint won't allow you to export the private key from within the SharePoint administration interface.|
-|KeySize | Specifies to use the RSA key algorithm for your certificate, and the size of your public and private RSA keys in bits. Larger key sizes provide more cryptographic strength than smaller key sizes, but they're also more computationally expensive and take more time to complete the SSL / TLS connection. Select `2048` if you're unsure, which key size to use. Key sizes larger than `4096` are not recommended. If neither this parameter nor the `EllipticCurve` parameter is specified, the default key algorithm and key size / elliptic curve of the farm will be used.|
-|EllipticCurve|Specifies to use the elliptic curve cryptography key algorithm for your certificate, and the elliptic curve of your public and private ECC keys. Larger elliptic curves provide more cryptographic strength than smaller elliptic curves, but they're also more computationally expensive and take more time to complete the SSL / TLS connection. Select `nistP256` if you're unsure, which elliptic curve to use. Elliptic curves larger than `nistP384` are not recommended. If neither this parameter nor the `KeySize` parameter is specified, the default key algorithm and key size / elliptic curve of the farm will be used.|
-|HashAlgorithm|Specifies the hash algorithm of your certificate signature, which your certificate authority will use to verify that your certificate request hasn't been tampered with. Larger hash algorithms provide more cryptographic strength than smaller hash algorithms, but they're also more computationally expensive. Select `SHA256` if you're unsure, which hash algorithm to use. Hash algorithms larger than `SHA384` are not recommended. If this parameter isn't specified, the default hash algorithm of the farm will be used.|
+|KeySize | Specifies to use the RSA key algorithm for your certificate, and the size of your public and private RSA keys in bits. Larger key sizes provide more cryptographic strength than smaller key sizes, but they're also more computationally expensive and take more time to complete the SSL / TLS connection. Select `2048` if you're unsure which key size to use. Key sizes larger than `4096` aren't recommended. If neither this parameter nor the `EllipticCurve` parameter is specified, the default key algorithm and key size / elliptic curve of the farm will be used.|
+|EllipticCurve|Specifies to use the elliptic curve cryptography key algorithm for your certificate, and the elliptic curve of your public and private ECC keys. Larger elliptic curves provide more cryptographic strength than smaller elliptic curves, but they're also more computationally expensive and take more time to complete the SSL / TLS connection. Select `nistP256` if you're unsure which elliptic curve to use. Elliptic curves larger than `nistP384` aren't recommended. If neither this parameter nor the `KeySize` parameter is specified, the default key algorithm and key size / elliptic curve of the farm will be used.|
+|HashAlgorithm|Specifies the hash algorithm of your certificate signature, which your certificate authority will use to verify that your certificate request hasn't been tampered with. Larger hash algorithms provide more cryptographic strength than smaller hash algorithms, but they're also more computationally expensive. Select `SHA256` if you're unsure which hash algorithm to use. Hash algorithms larger than `SHA384` aren't recommended. If this parameter isn't specified, the default hash algorithm of the farm will be used.|
|Path|Specifies the path to the certificate signing request file that will be generated.|
|Force| Specifies to overwrite a file if it already exists at the specified path.|
-|AssignmentCollection| Manages objects for the purpose of proper disposal. Use of objects, such as SPWeb or SPSite, can use large amounts of memory and use of these objects in Windows PowerShell scripts requires proper memory management. Using the `SPAssignment` object, you can assign objects to a variable and dispose of the objects after they are needed to free up memory. When SPWeb, SPSite, or `SPSiteAdministration` objects are used, the objects are automatically disposed of if an assignment collection or the Global parameter is not used.|
-|WhatIf|Shows what would happen if the cmdlet runs. The cmdlet is not run.|
+|AssignmentCollection| Manages objects for proper disposal. Use of objects, such as SPWeb or SPSite, can use large amounts of memory and use of these objects in Windows PowerShell scripts requires proper memory management. Using the `SPAssignment` object, you can assign objects to a variable and dispose of the objects after they're needed to free up memory. When SPWeb, SPSite, or `SPSiteAdministration` objects are used, the objects are automatically disposed of if an assignment collection or the Global parameter isn't used.|
+|WhatIf|Shows what would happen if the cmdlet runs. The cmdlet isn't run.|
|Confirm|Prompts you for confirmation before running the cmdlet.|

Example cmdlet syntax:
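The example invocation itself did not survive extraction here. A plausible sketch, assuming the cmdlet in question is `New-SPCertificate` (SharePoint Server Subscription Edition) and using placeholder host names and paths:

```powershell
# Hypothetical example: generate a certificate signing request using the
# parameters described in the table above. All names are placeholders.
New-SPCertificate -FriendlyName "Team Sites Certificate" `
    -CommonName "sharepoint.contoso.com" `
    -AlternativeNames "www.contoso.com" `
    -KeySize 2048 -HashAlgorithm SHA256 `
    -Exportable `
    -Path "\\fileserver\share\TeamSitesCertificate.txt"
```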
SharePoint Data Refresh Using A Specified Account https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/data-refresh-using-a-specified-account.md
description: "Learn to configure scheduled data refresh in Power Pivot for Share
In this article, we'll take a look at configuring scheduled data refresh in SQL Server 2012 Power Pivot for SharePoint 2013 by using an account that you specify.
-We recommend [using Secure Store to store your data refresh credentials](data-refresh-using-secure-store.md), but if Secure Store is not available or if you working in a test or pre-production environment, this article will show you how to easily configure the account of your choice for scheduled data refresh.
+We recommend [using Secure Store to store your data refresh credentials](data-refresh-using-secure-store.md), but if Secure Store isn't available or if you're working in a test or preproduction environment, this article will show you how to easily configure the account of your choice for scheduled data refresh.
## Before you begin <a name="begin"> </a>
-Before starting, you will need:
+Before starting, you'll need:
- An Active Directory account that you can use to access the data sources used in your report. We'll refer to this as the data access account. We'll look at how to configure the account for access to your data sources in this article, so you just need the account itself to get started.
-- Contribute access to the SharePoint document library that you will be using.
+- Contribute access to the SharePoint document library that you'll be using.
Additionally, be sure that [Excel Services](excel-services-overview.md) is configured in your SharePoint Server 2013 farm.
Now that the workbook has been saved to a SharePoint document library, let's con
You can test if data refresh is working properly by making some changes to your data, and then setting the workbook to refresh right away by using the **Also refresh as soon as possible** option.
-Note that anytime you make changes to the data refresh settings page, such as selecting the **Also refresh as soon as possible** check box, you'll need to reenter the password of the data access account.
+Anytime you make changes to the data refresh settings page, such as selecting the **Also refresh as soon as possible** check box, you'll need to reenter the password of the data access account.
## See also <a name="ver"> </a>
SharePoint Deployment Considerations For Implementing Microsoft Identity Manager With Share https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/deployment-considerations-for-implementing-microsoft-identity-manager-with-share.md
During MIM Synchronization Setup, the remote database access depends on the acce
## Set access rights if SQL Server is installed on a remote server
-If you install SQL Server on a remote computer, that is, on a different computer than the one running MIM, be sure that the policy for the SQL Server service account allows users access to that computer from the network. If access is not allowed, MIM setup will fail.
+If you install SQL Server on a remote computer, that is, on a different computer than the one running MIM, be sure that the policy for the SQL Server service account allows users access to that computer from the network. If access isn't allowed, MIM setup will fail.
> [!IMPORTANT]
> If you install SQL Server on a remote computer and allow network access to the remote computer, you'll receive a security warning from MIM setup. For this scenario, the warning can be ignored.
After you use Export Management Agent, you can then use the **Import Management
## Populate the displayName attribute in the metaverse to make search results easier to identify
-When listing objects by using Metaverse Search, MIM returns results identified by the **displayName** attribute. If the **displayName** attribute is not populated, the search results are identified by the globally unique identifier (GUID). For more information on how to use metaverse search, see [Using Metaverse Search](/previous-versions/mim/jj572785(v=ws.10))
+When listing objects by using Metaverse Search, MIM returns results identified by the **displayName** attribute. If the **displayName** attribute isn't populated, the search results are identified by the globally unique identifier (GUID). For more information on how to use metaverse search, see [Using Metaverse Search](/previous-versions/mim/jj572785(v=ws.10))
## Design your flow rules to act upon the state of an object
With Preview, you can run test synchronizations and view the results without com
## Schedule a recurring run profile using the Delta Synchronization step to process disconnectors automatically
-Objects that fail to join are not reevaluated by the Delta Import and Delta Synchronization run profile step and might remain as disconnectors. Running a Delta Synchronization step on a regular basis will reevaluates and processes these disconnectors. For more information on how to run profile steps, see [Configuring Management Agents](/previous-versions/mim/jj590191(v=ws.10)).
+Objects that fail to join aren't reevaluated by the Delta Import and Delta Synchronization run profile step and might remain as disconnectors. Running a Delta Synchronization step regularly reevaluates and processes these disconnectors. For more information on how to run profile steps, see [Configuring Management Agents](/previous-versions/mim/jj590191(v=ws.10)).
## Save and clear the management agent run history in Operations regularly
-Operations records a history of every management agent run. Each management agent run history is saved in the SQL Server database, and can cause the database to grow over time, affecting performance. The run history can be saved using Operations. For more information on how to use Operations, see [Using Operations](/previous-versions/mim/jj590289(v=ws.10)).
+Operations records a history of every management agent run. Each management agent run history is saved in the SQL Server database, and can cause the database to grow over time, affecting performance. The run history can be saved using Operations. For more information on how to use Operations, see [Using Operations](/previous-versions/mim/jj590289(v=ws.10)).
> [!NOTE]
> Deleting very large numbers of runs at once may take considerable time. It's recommended you delete no more than 100 runs at a time.

## Use multiple partitions in a management agent to control synchronization of single object types
-To control synchronization of single object types in a file-based management agent, create a partition for each object type. For example, to synchronize the object types **mailbox** and **group**, create two partitions in the management agent, and assign **mailbox** to one partition and **group** to the other. Then, create a management agent run profile for each partition. With this configuration, you've one management agent with the flexibility to synchronize one or both of the selected object types. For more information on how to use partitions, see [The Metaverse and the Connector Space](/previous-versions/mim/jj590171(v=ws.10))
+To control synchronization of single object types in a file-based management agent, create a partition for each object type. For example, to synchronize the object types **mailbox** and **group**, create two partitions in the management agent, and assign **mailbox** to one partition and **group** to the other. Then, create a management agent run profile for each partition. With this configuration, you have one management agent with the flexibility to synchronize one or both of the selected object types. For more information on how to use partitions, see [The Metaverse and the Connector Space](/previous-versions/mim/jj590171(v=ws.10))
## Capacity Planning
-There are a number of variables that can affect the overall capacity and performance of MIM deployment.
+There are many variables that can affect the overall capacity and performance of MIM deployment.
-Performance can be negatively impacted if all the databases in the system are created with a smaller size and set to auto-grow especially by small increments. A minimum of 16 GB of RAM for the SQL Servers is required but you'll benefit from more memory. You should have at least 16 CPU cores on the SQL servers but more cores will help overall performance.
+Performance can be negatively impacted if all the databases in the system are created with a smaller size and set to autogrow especially by small increments. A minimum of 16 GB of RAM for the SQL Servers is required but you'll benefit from more memory. You should have at least 16 CPU cores on the SQL servers but more cores will help overall performance.
Finally, it's recommended not to run MIM and SharePoint databases together on the same server.
The MIM solution is designed to be highly available to prevent any single point
> [!NOTE]
> The information in this section consists of recommendations only.
-- **MIM Synchronization Service** - although clustering of the MIM Synchronization Service is not supported, a warm standby server could be deployed to assume the workload of the primary in the event of a failure. However, hardware failure should not be a concern as the MIM Synchronization Service will be running on a virtual machine hosted on multiple physical nodes. Also in case of a software failure, the virtual machine hosting the synchronization server could be quickly recovered from a previous backup or rebuilt from scratch. A down time of this service has no impact on end-user interactions with the solution. It would only delay the fulfillment of all access provisioning and deprovisioning requests. When the service is brought back online those operations would resume with no data loss. The warm standby of the MIM Synchronization Service will be connected to the same SQL Server database as the primary instance and will have to be activated through a script in case the primary instance goes down and cannot be restarted in a timely manner. Note that the MIM Management Agent used to synchronize data between the MIM Synchronization Service database and the MIM Service database will have to point to the local MIM Service instance.
+- **MIM Synchronization Service** - although clustering of the MIM Synchronization Service isn't supported, a warm standby server could be deployed to assume the workload of the primary in the event of a failure. However, hardware failure shouldn't be a concern as the MIM Synchronization Service will be running on a virtual machine hosted on multiple physical nodes. Also if there is a software failure, the virtual machine hosting the synchronization server could be quickly recovered from a previous backup or rebuilt from scratch. A down time of this service has no impact on end-user interactions with the solution. It would only delay the fulfillment of all access provisioning and deprovisioning requests. When the service is brought back online those operations would resume with no data loss. The warm standby of the MIM Synchronization Service will be connected to the same SQL Server database as the primary instance and will have to be activated through a script in case the primary instance goes down and can't be restarted in a timely manner. Note that the MIM Management Agent used to synchronize data between the MIM Synchronization Service database and the MIM Service database will have to point to the local MIM Service instance.
- **SQL Server** - A SQL Server cluster is required by the MIM solution to provide high-availability for the database layer. The MIM cluster will consist of two servers with specifications detailed in the previous paragraphs. Even though each SQL node has both SQL instances installed, only one of the two instances will be active at a given time.
- The design takes into account the best use of the clustered virtual machines without oversubscribing each node and potentially causing both nodes to go down in case of failover.
    The design takes into account the best use of the clustered virtual machines without oversubscribing each node and potentially causing both nodes to go down if a failover occurs.
- As the databases are hosted on a remote SQL Server the network connection between the MIM Servers and SQL Servers must be 1 Gbit. 100 Mbit network will not provide enough bandwidth and will degrade synchronization performance by 20 to 30 percent.
+ As the databases are hosted on a remote SQL Server the network connection between the MIM Servers and SQL Servers must be 1 Gbit. 100 Mbit network won't provide enough bandwidth and will degrade synchronization performance by 20 to 30 percent.
## Always use Active Directory Import as the sync setting in User Profile Administration
-If you plan to use the MIM Synchronization service, do not select it. Instead select the **Use SharePoint Active Directory Import** option. There is a known issue with Audience compilation and Manager attribute if the **Enable External Identity Manager** option is selected.
+If you plan to use the MIM Synchronization service, don't select it. Instead select the **Use SharePoint Active Directory Import** option. There's a known issue with Audience compilation and Manager attribute if the **Enable External Identity Manager** option is selected.
> [!NOTE] > This issue is fixed in the February 2017 Public Update (PU), see [February 21, 2017, update for SharePoint Server 2016 (KB3141517)](https://support.microsoft.com/help/3141517/february-21-2017-update-for-sharepoint-server-2016-kb3141517)
-## Do not switch between synchronization types
+## Don't switch between synchronization types
If you switch from one synchronization type to another by using the **Configure Synchronization Settings** in the SharePoint Central Administration website, you'll experience issues with no objects being returned when an import on the SharePoint Connector instance is started, and no results in the ULS logs.
Microsoft Identity Manager supports exporting user profile pictures from SharePo
## No BCS Integration to support additional Profile Properties
-There is no Business Connectivity Services integration to support profile properties in MIM. You can manually configure Connectors to achieve this.
+There's no Business Connectivity Services integration to support profile properties in MIM. You can manually configure Connectors to achieve this.
## User Profile properties
-New user profile properties can be created in SharePoint Servers; however, the mappings are not created in SharePoint, but within MIM.
+New user profile properties can be created in SharePoint Servers; however, the mappings aren't created in SharePoint, but within MIM.
## NetBios name
SharePoint Disaster Recovery Best Practices For Sharepoint Server And Access Services https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/disaster-recovery-best-practices-for-sharepoint-server-and-access-services.md
Regardless of your choice of technologies there are a few requirements and best-
## Step 1: Setting up SharePoint Server for Disaster Recovery
-The goal of this step is to create a smoother disaster recovery experience by removing potential points of failure. By matching Authentication Realms, and Database Server ReferenceIDs so they are the same in the disaster recovery farm as in the primary farm, you will be prepared for recovery. Likewise, it is essential to know which databases must be managed in order to recover successfully.
+The goal of this step is to create a smoother disaster recovery experience by removing potential points of failure. By matching Authentication Realms, and Database Server ReferenceIDs so they're the same in the disaster recovery farm as in the primary farm, you'll be prepared for recovery. Likewise, it's essential to know which databases must be managed in order to recover successfully.
Let's drill into the details, below.
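The realm matching described above can be sketched with the farm authentication realm cmdlets. This is a minimal sketch; the GUID shown is a placeholder, and you should use the value actually returned by your primary farm.

```powershell
# On the primary farm: read the current authentication realm.
Get-SPAuthenticationRealm

# On the disaster recovery farm: set the realm to the same value
# (placeholder GUID shown; substitute the value returned above).
Set-SPAuthenticationRealm -Realm "11111111-2222-3333-4444-555555555555"
```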
$newdbserver = New-SPAccessServicesDatabaseServer -ServiceContext $context -Data
```
-Write the ServerRefID to the screen for use when registering the secondary farm Access Services Database Server
+Write the ServerRefID to the screen for use when registering the secondary farm Access Services Database Server.
```
$ServerRefID
These databases need to be managed as a part of your Disaster Recovery strategy.
|:--|:--|
|App Management database <br/> |Contains Access app registrations and app principals. <br/> |
|Subscription Settings database <br/> |Manages the unique identities provided to Access apps to create the URL for the Access application. <br/> |
-|Secure Store database <br/> |The Secure Store Service can be leveraged to provide alternate authentication methods for Access apps. The guide referred to earlier doesn't cover doing this, but we will add the Secure Store database to our strategy for completeness. <br/> |
+|Secure Store database <br/> |The Secure Store Service can be used to provide alternate authentication methods for Access apps. The guide referred to earlier doesn't cover doing this, but we'll add the Secure Store database to our strategy for completeness. <br/> |
|SharePoint Content database <br/> |These databases contain the site collections into which Access apps have been deployed. <br/> |
|Access Services Application databases <br/> |The databases containing the actual data you need to preserve for the Access Services application to function. <br/> |
After failing-over to the secondary datacenter, you need to use the five differe
> [!NOTE]
> This article only deals with the five database types listed in the table above. To successfully recover a full SharePoint Server farm after a data center failover, additional steps are needed and the reader is directed to review the steps in [Plan for high availability and disaster recovery for SharePoint Server](high-availability-and-disaster-recovery-concepts.md).
-For the test environment we discuss in this article, that means the following databases are recovered from the Primary SQL Server SQL01 to the Secondary SQL Server SQL02 in the DR site.
+For the test environment we discuss in this article, the following databases are recovered from the Primary SQL Server SQL01 to the Secondary SQL Server SQL02 in the DR site.
- App Management database
Use the following PowerShell commands:
$secstoreproxy = New-SPSecureStoreServiceApplicationProxy -Name "Secure Store Proxy" -ServiceApplication $secstore
```
-Also note that if you are using the secure store in the secondary farm you will need to generate a new secure store encryption key before you can leverage any Applications registered there.
+Also note that if you're using the secure store in the secondary farm you'll need to generate a new secure store encryption key before you can leverage any Applications registered there.
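Generating that key can be sketched as follows, assuming the `Update-SPSecureStoreMasterKey` cmdlet and a passphrase of your choosing; `$secstoreproxy` is the proxy created above.

```powershell
# Hypothetical example: generate a new master encryption key for the
# Secure Store service application proxy on the secondary farm.
Update-SPSecureStoreMasterKey -ServiceApplicationProxy $secstoreproxy -Passphrase "P@ssphrase-Of-Your-Choosing"
```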
### b. Attach the content database(s)
At this point we have almost everything we need to support Access Services in Di
The key elements to consider here are the domains you had specified in the primary site and the domains you intend to use in the secondary DR site. If you plan to use the same domains, repoint the CNAME record for the SP Apps domain to the secondary SharePoint server, for example repoint **\*.contosoapps.com** to the secondary SharePoint Server.
-Make sure you setup the App Urls in Central Administration on the DR site.
+Make sure you set up the App URLs in Central Administration on the DR site.
1. Open Central Administration, select **Apps**.
2. Select **Configure App URLs**.
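The same configuration can be sketched in PowerShell; the app domain and prefix values below are assumptions for illustration and should match the values used in the primary farm.

```powershell
# Hypothetical example: set the app domain and app prefix on the DR farm.
Set-SPAppDomain -AppDomain "contosoapps.com"
Set-SPAppSiteSubscriptionName -Name "app" -Confirm:$false
```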
-Recovering the App Management Database does not preserve the App Domain even though it does preserve the App Prefix.
+Recovering the App Management Database doesn't preserve the App Domain even though it does preserve the App Prefix.
> [!IMPORTANT]
> Failing to set the App Domain will result in a DNS lookup failure and a site not found error in the browser.

### b. Set up Access Database Logins for the secondary site
-Access Services requires the Contained Databases feature of SQL Server, which supports contained database logins. However, Access Services in SharePoint 2013 and 2016 only partially leverages this feature, and so the database logins are actually stored in the Master DB, just like any other login. The downside to this is that on failover we need to regenerate any missing logins and ensure we set the same password for the account.
+Access Services requires the Contained Databases feature of SQL Server, which supports contained database logins. However, Access Services in SharePoint 2013 and 2016 only partially uses this feature, and so the database logins are stored in the Master DB, just like any other login. The downside to this is that on failover we need to regenerate any missing logins and ensure we set the same password for the account.
Fortunately, Microsoft has produced an easy way to do this documented right here (and we'll be using this article in step 1, below) [How to transfer logins and passwords between instances of SQL Server](https://support.microsoft.com/kb/918992).
SharePoint Server 2016 has been tested in a disaster recovery scenario using SQL
In all scenarios we were able to successfully recover the Access Applications on the Secondary SharePoint farm and perform all CRUD operations post failover, after following the guidance in this document.
-The key elements are : Ensure both server farms are setup with matching Authentication Realms. Ensure Access Services database servers are referenced with the same **ServerReferenceID**. Transfer SQL Logins from Production to DR SQL Servers.
+The key elements are: Ensure both server farms are set up with matching Authentication Realms. Ensure Access Services database servers are referenced with the same **ServerReferenceID**. Transfer SQL Logins from Production to DR SQL Servers.
## See also
SharePoint Enterprise Intranet Collaboration Performance And Capacity https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/enterprise-intranet-collaboration-performance-and-capacity.md
This article contains guidance on performance and capacity planning for an enter
- The **test farm workload and dataset** that was used to generate test load -- **Test results and analysis** that demonstrate and explain trends in throughput, latency and hardware demand under load at specific scale points.
+- **Test results and analysis** that demonstrate and explain trends in throughput, latency, and hardware demand under load at specific scale points.
Use the information in this article to understand the characteristics of the scenario under both normal and peak loads, and how performance trends change when farm servers are scaled out. This article can also help you estimate an appropriate starting point for your planned architecture, and the factors that are important to consider when you plan for the resources your farm will need to maintain acceptable levels of performance under peak load.
Use the information in this article to understand the characteristics of the sce
This article provides guidance about how to scale out servers in a SharePoint Server 2013 enterprise intranet collaboration solution. Capacity planning informs decisions about hardware to purchase and system configurations that optimize your solution.
-Individual SharePoint Server 2013 farms are unique, and each farm has different requirements that depend on hardware, user behavior, the configuration of installed features, and many other factors. Therefore, supplement this guidance with additional testing on your own hardware in your own environment. If your planned design and workload resembles the environment described in this article, you can use this article to draw conclusions about how to scale your environment.
+Individual SharePoint Server 2013 farms are unique, and each farm has different requirements that depend on hardware, user behavior, the configuration of installed features, and many other factors. Therefore, supplement this guidance with other testing on your own hardware in your own environment. If your planned design and workload resembles the environment described in this article, you can use this article to draw conclusions about how to scale your environment.
-Test results that appear in this article were produced in a test lab, using a workload, dataset, and architecture emulate a production environment under highly controlled conditions. While great care was exercised in designing these tests, the performance characteristics of a test lab are never the same as the behavior of a production environment. These test results do not represent the performance and capacity characteristics of a production farm. Instead, the test results demonstrate observed trends in throughput, latency, and hardware demand, and provide analysis of the observed data that can help you make decisions about how to plan capacity and manage your own farm.
+Test results that appear in this article were produced in a test lab, using a workload, dataset, and architecture that emulate a production environment under highly controlled conditions. While great care was exercised in designing these tests, the performance characteristics of a test lab are never the same as the behavior of a production environment. These test results don't represent the performance and capacity characteristics of a production farm. Instead, the test results demonstrate observed trends in throughput, latency, and hardware demand, and provide analysis of the observed data that can help you make decisions about how to plan capacity and manage your own farm.
This article includes the following:
These articles provide the following information:
## Glossary <a name="Glossary"> </a>
-Here are some specialized terms that you will encounter in this article.
+Here are some specialized terms that you'll encounter in this article.
-- **RPS:** Requests per second, or the number of requests a that a farm or server receives in one second. This is a common measurement of server and farm load.
+- **RPS:** Requests per second, or the number of requests that a farm or server receives in one second. This is a common measurement of server and farm load.
- Note that requests differ from page loads. A page contains several components, each of which creates one or more requests when a browser loads the page. Therefore, one page load creates several requests. Typically, authentication checks and events that use insignificant resources are not counted in RPS measurements.
+ Requests differ from page loads. A page contains several components, each of which creates one or more requests when a browser loads the page. Therefore, one page load creates several requests. Typically, authentication checks and events that use insignificant resources aren't counted in RPS measurements.
- **Green Zone:** Green Zone represents a defined set of load characteristics under normal operation conditions, up to expected daily peak loads. A farm that operates in this range should be able to sustain response times and latency that are within acceptable parameters.
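The distinction the RPS definition draws between requests and page loads can be illustrated with trivial arithmetic: because one page load fans out into several requests, page loads per second understate the load a farm actually sees. A sketch with made-up numbers:

```python
# Illustrative arithmetic for the RPS glossary entry: one page load
# triggers several requests, so RPS is page-load rate times the number
# of requests each page generates.  The figures below are invented.

def rps(page_loads_per_second, requests_per_page):
    """Requests per second generated by a given page-load rate."""
    return page_loads_per_second * requests_per_page

# 50 page loads/sec, each page triggering 12 requests:
print(rps(50, 12))  # 600 requests per second reach the farm
```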
This section describes the approach that we took to scale this lab environment.
3. We disabled the Distributed Cache Service on the web servers.
-4. We scaled out additional web servers to the maximum for the scope of testing.
+4. We scaled out more web servers to the maximum for the scope of testing.
-5. We conducted additional testing to compare the performance characteristics of SharePoint Server 2013 and SharePoint Server 2010.
+5. We conducted more testing to compare the performance characteristics of SharePoint Server 2013 and SharePoint Server 2010.
### Methodology and test notes <a name="Methodology"> </a>
Because this article provides results from a test lab environment, we could cont
- Between test runs, we modified only one variable at a time to make it easy to compare results between test runs. -- The database servers were not part of a cluster because redundancy was not necessary for the purposes of these tests.
+- The database servers weren't part of a cluster because redundancy wasn't necessary for the purposes of these tests.
-- Search crawl was not running during the tests. Of course, it might be running in a production environment. To take this into account, we lowered the SQL Server CPU utilization in our definitions of 'Green Zone' and 'Red Zone' to accommodate the resources that a running search crawl would normally consume during testing.
+- Search crawl wasn't running during the tests. It might be running in a production environment. To take this into account, we lowered the SQL Server CPU utilization in our definitions of 'Green Zone' and 'Red Zone' to accommodate the resources that a running search crawl would normally consume during testing.
## Specifications <a name="Specs"> </a>
The farm has from one to ten virtual web servers. An additional dedicated virtual
#### Database Servers
-One physical database server runs the default SQL Server instance that has the SharePoint databases. The logging database is not tracked in this article.
+One physical database server runs the default SQL Server instance that has the SharePoint databases. The logging database isn't tracked in this article.
> [!NOTE] > If you enable usage reporting, we recommend that you store the logging database on a separate Logical Unit Number (LUN). Large deployments and some medium deployments might require a dedicated logging database server to accommodate the processor demand of a high log volume. > In this lab environment, logging was constrained, and the logging database was stored in a separate instance of SQL Server.
The dataset for the lab environment in this article, which represents a typical
The following results are ordered based on the scaling approach that is described in the [Overview](enterprise-intranet-collaboration-performance-and-capacity.md#Overview) section of this article.
-### Web server scale out
+### Web server scale-out
This section describes the test results that were obtained when we scaled out the number of web servers in this lab environment.
In our testing, we found the following:
- The environment scaled to ten web servers per database server. The increase in throughput was fairly linear. -- Even up to the maximum tested scale of ten web servers, the addition of more database servers did not increase throughput. The bottleneck was generally confined to web server resources.
+- Even up to the maximum tested scale of ten web servers, the addition of more database servers didn't increase throughput. The bottleneck was confined to web server resources.
-- The average latency at green zone was almost constant throughout the whole test. The number of web servers and throughput did not affect green zone latency. Red Zone latency data shows an expected trend line. Latency is very high at a single web server. A curve between 2 and 10 web servers remains comfortably within Red Zone criteria.
+- The average latency at green zone was almost constant throughout the whole test. The number of web servers and throughput didn't affect green zone latency. Red Zone latency data shows an expected trend line. Latency is high at a single web server. A curve between 2 and 10 web servers remains comfortably within Red Zone criteria.
> [!NOTE] > Latency may be mildly affected when you move the Distributed Cache service from a farm's web servers to a server that is dedicated to the Distributed Cache. This may occur because Distributed Cache traffic, which was previously internal to each web server, begins traversing the network. Test scale-out performance in your own environment to determine whether this tradeoff is significant. Note that latency in our test environment increased mildly when the Distributed Cache service was migrated to a dedicated server. Latency decreased with each added web server as the nominal added latency was offset by the decreased processing and memory load on the web servers. > For more information about Distributed Cache capacity planning, see [Plan for feeds and the Distributed Cache service in SharePoint Server](plan-for-feeds-and-the-distributed-cache-service.md). -- When performance testing was conducted for SharePoint Server 2010, the database server became a bottleneck at maximum throughput using four web servers. Because of improvements in caching and database usage characteristics in SharePoint Server 2013, the average load on the database server layer is significantly lower than it was in SharePoint Server 2010, and it was not necessary to scale out the database servers during testing.
+- When performance testing was conducted for SharePoint Server 2010, the database server became a bottleneck at maximum throughput using four web servers. Because of improvements in caching and database usage characteristics in SharePoint Server 2013, the average load on the database server layer is lower than it was in SharePoint Server 2010, and it wasn't necessary to scale out the database servers during testing.
For more information about SharePoint Server 2010 test results for this scenario, see [Enterprise intranet collaboration environment lab study (SharePoint Server 2010)](/previous-versions/office/sharepoint-server-2010/ff758657(v=office.14))
This section provides information about how performance for this workload varied
#### Workload
-To compare SharePoint Server 2013 with SharePoint Server 2010, we used a different test mix from the one outlined in the [Specifications](#Specs) section. This was necessary because some SharePoint Server 2013 features (such as the Distributed Cache Service) and operations were not available in SharePoint Server 2010.
+To compare SharePoint Server 2013 with SharePoint Server 2010, we used a different test mix from the one outlined in the [Specifications](#Specs) section. This was necessary because some SharePoint Server 2013 features (such as the Distributed Cache Service) and operations weren't available in SharePoint Server 2010.
#### Test methodology
This upgraded environment was then tested again on the upgraded servers that hos
- We tested two environments for comparison. One environment used physical server hardware, and the other environment used virtual machines to run the web servers on a Hyper-V host. In both cases, the database server ran on a physical server. -- We did not modify the dataset after the content database upgrade for the SharePoint Server 2013 tests.
+- We didn't modify the dataset after the content database upgrade for the SharePoint Server 2013 tests.
- The test mix for SharePoint Server 2010 excluded new SharePoint Server 2013-specific operations, and resembled the enterprise intranet collaboration solution that was tested and described earlier in this article.
The goal of the testing was to apply similar loads against both SharePoint Serve
#### Analysis -- In general, SharePoint Server 2013 performed better than SharePoint Server 2010 when scaled out to five web servers, but SharePoint Server 2010 results were better at two web servers. Testing against the upgraded SharePoint Server 2013 server farm did not involve post-upgrade optimizations or take advantage of SharePoint Server 2013 performance improvements such as the Distributed Cache Service or Request Manager. SharePoint Server 2013 test results, therefore, are significantly different from results in a real-world environment.
+- In general, SharePoint Server 2013 performed better than SharePoint Server 2010 when scaled out to five web servers, but SharePoint Server 2010 results were better at two web servers. Testing against the upgraded SharePoint Server 2013 server farm didn't involve post-upgrade optimizations or take advantage of SharePoint Server 2013 performance improvements such as the Distributed Cache Service or Request Manager. SharePoint Server 2013 test results, therefore, are significantly different from results in a real-world environment.
- The relationship between data trends in the graphs in this section shows how the SharePoint Server 2013 resource management model prioritizes the use of processor resources over disk IOPS.
SharePoint General Guidance For Hosters In Sharepoint Server 2013 https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/general-guidance-for-hosters-in-sharepoint-server-2013.md
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 52e844d0-5f65-4091-8c16-549b1f25a587
-description: "Administrator guidance for multi-tenant hosts in SharePoint Server."
+description: "Administrator guidance for multitenant hosts in SharePoint Server."
# General guidance for hosters in SharePoint Server 2013
For information about architectural, security, operational, and management guida
## Main Characteristics of Multi-Tenancy and Considerations
-When considering multi-tenancy, there are two extremes for the implementation model of a multi-tenant hosting platform:
+When considering multi-tenancy, there are two extremes for the implementation model of a multitenant hosting platform:
- **Dedicated**
These extremes lie at opposite ends of a spectrum as shown in the following diag
![This diagram shows the two implementation methods to consider when deploying a multi-tenant hosting platform](../media/DedicatedShared.jpg)
-The decision to implement a multi-tenant hosting platform comes down to selecting between these two implementation models. The decision is typically based on the required functional components, and the following service-level related attributes that ordinarily influence the choice of an appropriate implementation model for a multi-tenant hosting platform.
+The decision to implement a multitenant hosting platform comes down to selecting between these two implementation models. The decision is typically based on the required functional components, and the following service-level related attributes that ordinarily influence the choice of an appropriate implementation model for a multitenant hosting platform.
- Cost Saving
The decision to implement a multi-tenant hosting platform comes down to selectin
- Quality of Service
-These key attributes are important to consider when you make the decisions for selecting the appropriate architecture, implementation, and deployment of a multi-tenant SharePoint Server 2013 hosting platform. The following diagram shows how these key attributes vary from one implementation model to the other.
+These key attributes are important to consider when you make the decisions for selecting the appropriate architecture, implementation, and deployment of a multitenant SharePoint Server 2013 hosting platform. The following diagram shows how these key attributes vary from one implementation model to the other.
![This diagram shows key attributes of Multi-tenant Hosting Platforms](../media/KeyAttributes.jpg)
In SharePoint Server 2010, a new shared service model was implemented. This new
In SharePoint, multi-tenancy refers to the ability to partition the data of otherwise shared services to accommodate multiple tenants. This contrasts with setting up separate dedicated hardware or even running multiple dedicated instances of a given service.
-Services can be configured to share data across all tenants or to partition data for each tenant, for example, provide data isolation. Each service can be set up differently. Services can be created in partitioned mode by using Microsoft PowerShell or in un-partitioned mode by using Microsoft PowerShell or Central Administration. Once created, the mode of the Service cannot be changed. To achieve partitioning, both the service and the service connection must be deployed in partitioned mode. The service connection is called a proxy in Microsoft PowerShell.
+Services can be configured to share data across all tenants or to partition data for each tenant (that is, to provide data isolation). Each service can be set up differently. Services can be created in partitioned mode by using Microsoft PowerShell, or in unpartitioned mode by using Microsoft PowerShell or Central Administration. Once created, the mode of the service can't be changed. To achieve partitioning, both the service and the service connection must be deployed in partitioned mode. The service connection is called a proxy in Microsoft PowerShell.
-Not all services can be partitioned. Services that do not store tenant data, such as PowerPoint Automation Services, do not have to be partitioned. These services can be shared across multiple tenants without risk of exposing tenant-specific data.
+Not all services can be partitioned. Services that don't store tenant data, such as PowerPoint Automation Services, don't have to be partitioned. These services can be shared across multiple tenants without risk of exposing tenant-specific data.
The following diagram shows how service data is partitioned: ![This diagram shows how data is partitioned in a multi-tenancy platform](../media/PartitionData.jpg)
-In SharePoint, multi-tenancy is tied to site subscriptions. A site subscription is a logical group of site collections that can share settings, features, and service data. Site collections for each tenant are brought together with a subscription ID. The subscription ID is used to map features, services, and sites to tenants and also to partition service data according to tenant. The Subscription Settings service keeps track of multi-tenant services and subscription IDs.
+In SharePoint, multi-tenancy is tied to site subscriptions. A site subscription is a logical group of site collections that can share settings, features, and service data. Site collections for each tenant are brought together with a subscription ID. The subscription ID is used to map features, services, and sites to tenants and also to partition service data according to tenant. The Subscription Settings service keeps track of multitenant services and subscription IDs.
-Here is how it works:
+Here's how it works:
-1. Farm administrators deploy services to the farm. This includes the Subscription Settings service. Service applications can be either deployed as partitioned where data is isolated for each tenant, or un-partitioned where data is shared across all tenants. Some services do not store tenant data and are shared across all tenants without a partition.
+1. Farm administrators deploy services to the farm. This includes the Subscription Settings service. Service applications can be either deployed as partitioned where data is isolated for each tenant, or unpartitioned where data is shared across all tenants. Some services don't store tenant data and are shared across all tenants without a partition.
-2. Farm administrators deploy a Tenant Administration site for each tenant by using Microsoft PowerShell. The Tenant Administration site is associated with a subscription ID. Administrators can deploy additional site collections for each tenant. Each site collection is tied to the subscription ID of the tenant.
+2. Farm administrators deploy a Tenant Administration site for each tenant by using Microsoft PowerShell. The Tenant Administration site is associated with a subscription ID. Administrators can deploy other site collections for each tenant. Each site collection is tied to the subscription ID of the tenant.
3. All service applications that are connected at the web application level are available for use by site collections within the web application. Administrators choose the services to offer and become active for each tenant. The subscription ID for a tenant is used to map services to the site collections.
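The partitioning model in the steps above can be sketched as plain data structures (this is not the SharePoint API, just the underlying idea): a partitioned service keys its data by subscription ID, so lookups are always scoped to one tenant and tenants never see each other's rows.

```python
# A minimal, hypothetical sketch of a partitioned service: data is
# stored per subscription ID, so each tenant's reads and writes are
# isolated.  Class and tenant names are illustrative only.

class PartitionedService:
    def __init__(self):
        self._data = {}  # subscription ID -> that tenant's key/value data

    def store(self, subscription_id, key, value):
        self._data.setdefault(subscription_id, {})[key] = value

    def fetch(self, subscription_id, key):
        # Lookups are scoped to the caller's subscription ID only.
        return self._data.get(subscription_id, {}).get(key)

svc = PartitionedService()
svc.store("tenant-A", "setting", "blue")
svc.store("tenant-B", "setting", "green")
print(svc.fetch("tenant-A", "setting"))  # blue
print(svc.fetch("tenant-B", "setting"))  # green
```

An unpartitioned service, by contrast, would use a single shared store with no subscription ID in the key.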
By following these guidelines, future SharePoint upgrades can be scoped to eithe
#### Physical vs. Logical Data Partitioning
-Data partitioning plays a big role in deciding which approach to take for a SharePoint deployment then a physical partition is required. The solution is to have a farm and even a single Identity Provider per tenant. But if some data can be shared we can then move towards sharing elements of the infrastructure across multiple tenants, database servers, some SharePoint services, and the farm.
+Data partitioning plays a large role in deciding which approach to take. For a SharePoint deployment where a physical partition is required, the solution is to have a farm, and even a single Identity Provider, per tenant. But if some data can be shared, we can move towards sharing elements of the infrastructure across multiple tenants: database servers, some SharePoint services, and the farm.
#### What Identity and Authentication model do I want?
-Depending upon authentication requirements, custom code may be required. When using Windows Authentication, code is not required and the SharePoint people picker is fully functional. Security Assertion Markup Language (SAML) or Forms Based authentication (FBA) require a custom claims provider that implements search and validates procedures.
+Depending upon authentication requirements, custom code may be required. When using Windows Authentication, code isn't required and the SharePoint people picker is fully functional. Security Assertion Markup Language (SAML) or Forms-Based Authentication (FBA) requires a custom claims provider that implements search and validation procedures.
> [!NOTE] > The previous considerations are valid for both single-tenant per farm and multi-tenant per farm.
-For additional information about SAML authentication and FBA authentication in SharePoint Server 2013, see [Configure SAML-based claims authentication with AD FS in SharePoint Server](../security-for-sharepoint-server/security-for-sharepoint-server.md) and [Configure forms-based authentication for a claims-based web application in SharePoint Server](../security-for-sharepoint-server/security-for-sharepoint-server.md).
+For more information about SAML authentication and FBA authentication in SharePoint Server 2013, see [Configure SAML-based claims authentication with AD FS in SharePoint Server](../security-for-sharepoint-server/security-for-sharepoint-server.md) and [Configure forms-based authentication for a claims-based web application in SharePoint Server](../security-for-sharepoint-server/security-for-sharepoint-server.md).
#### The Tenant Administrator Experience
-Depending on the topology, the administrator experience can be significantly different:
+Depending on the topology, the administrator experience can be different:
-- In a one tenant per farm approach there are no in-product possibilities to separate the Service administration experience from the tenant administration experience. In other words, there's no way to delegate specific functionalities of the the SharePoint Central Administration website to the tenant administrator to provision new site collections. You also can't configure a default Search Center for the tenant without permitting any Farm Topology change or Service configuration change.
+- In a one tenant per farm approach there are no in-product possibilities to separate the Service administration experience from the tenant administration experience. In other words, there's no way to delegate specific functionalities of the SharePoint Central Administration website to the tenant administrator to provision new site collections. You also can't configure a default Search Center for the tenant without permitting any Farm Topology change or Service configuration change.
-- In a multiple tenants per farm approach, the product provides an in-product tenant administration console. In this approach, the Tenant Administration Site Collection lets tenant administrators perform specific functionalities. In the previous configuration, there are specific functionalities in the Central Administration that cannot be performed without causing potential unexpected results. For example, don't create site collections in Central Administration.
+- In a multiple tenants per farm approach, the product provides an in-product tenant administration console. In this approach, the Tenant Administration Site Collection lets tenant administrators perform specific functionalities. In the previous configuration, there are specific functionalities in the Central Administration that can't be performed without causing potential unexpected results. For example, don't create site collections in Central Administration.
## Different Type of Topologies
There are three topologies available for hosting in SharePoint Server 2016 in an on-
#### Services and Functionalities
-The collection of features and services that enable multi-tenancy with SharePoint Server 2013 comprise a base platform that is typically extended to provide the end-to-end service offering by hosting organizations. A number of key operational service management aspects of SharePoint Server 2013 behave differently when they use multi-tenancy, and therefore require careful consideration. These aspects vary between different organizations based on service levels and operational capabilities. Hosting partners should guarantee that satisfactory planning for the required additional customization, either with custom solutions or Microsoft PowerShell is in place as soon as possible, to meet customer expectations. Examples in this space include:
+The collection of features and services that enable multi-tenancy with SharePoint Server 2013 comprises a base platform that hosting organizations typically extend to provide the end-to-end service offering. Many key operational service management aspects of SharePoint Server 2013 behave differently when they use multi-tenancy, and therefore require careful consideration. These aspects vary between organizations based on service levels and operational capabilities. Hosting partners should ensure that satisfactory planning for the required customization, whether with custom solutions or Microsoft PowerShell, is in place as soon as possible to meet customer expectations. Examples in this space include:
- Initial tenant provisioning
The service applications available in a SharePoint Server 2013 on-premises envir
> [!NOTE] > The **Supported for Multi-Tenancy** column indicates whether a service application can be configured in multi-tenancy; if it can't, you'll get an error message.
-Along with the previous considerations, tenant provisioning and de-provisioning processes and scripts have to account for each service application that stores tenant data. For some service applications, all the management of tenant data is moved to elements of the tenant administration site, whereas with some, a combination of farm-level and tenant-level administration is required.
+Along with the previous considerations, tenant provisioning and deprovisioning processes and scripts have to account for each service application that stores tenant data. For some service applications, all the management of tenant data is moved to elements of the tenant administration site, whereas with some, a combination of farm-level and tenant-level administration is required.
SharePoint How To Display Recommendations And Popular Items In Sharepoint Server https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/how-to-display-recommendations-and-popular-items-in-sharepoint-server.md
description: "Learn how to display recommendations and popular items on a ShareP
This series of articles explains how to use the Usage analytics feature that was introduced in SharePoint Server to display "smart content" on your website. By "smart content" we mean content such as recommendations and popular items. -- *Recommendations* are based on how other visitors have interacted with your website. For example, on a product page, you can display "People who viewed this product also viewed *\<other products\>* ."
+- *Recommendations* are based on how other visitors have interacted with your website. For example, on a product page, you can display "People who viewed this product also viewed *\<other products\>*."
- *Popular items* are the most frequently viewed items on the whole site, or in a specific category. For example, the most viewed laptops on a site that sells technology products.
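The "people who viewed this also viewed" idea above can be sketched as pair counting over visits: tally how often two items appear in the same visit, then recommend the most frequent co-viewed items. This is only a toy illustration of the concept, not the SharePoint Usage analytics implementation; the item names are invented.

```python
# Toy co-view recommender: count item pairs that occur in the same
# visit, then rank co-viewed items for a given product.  Illustrative
# only -- SharePoint's Usage analytics does this at scale internally.
from collections import Counter
from itertools import combinations

def co_view_counts(visits):
    counts = Counter()
    for viewed in visits:
        for a, b in combinations(sorted(set(viewed)), 2):
            counts[(a, b)] += 1  # count the pair in both directions
            counts[(b, a)] += 1
    return counts

def recommend(item, counts, n=2):
    ranked = [(pair[1], c) for pair, c in counts.items() if pair[0] == item]
    return [other for other, _ in sorted(ranked, key=lambda t: -t[1])[:n]]

visits = [["laptop", "mouse", "bag"], ["laptop", "mouse"], ["laptop", "dock"]]
counts = co_view_counts(visits)
print(recommend("laptop", counts, n=1))  # ['mouse']
```

"Popular items" is the simpler cousin: just a flat view count per item, optionally filtered to one category.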
-To help explain, we'll use examples from a fictitious company called Contoso. Here's what the website will look like when it is finished.
+To help explain, we'll use examples from a fictitious company called Contoso. Here's what the website will look like when it's finished.
![Examples](../media/OTCSP_Examples.png)
To help explain, we'll use examples from a fictitious company called Contoso. He
- [View and configure usage analytics reports in SharePoint Server](view-and-configure-usage-analytics-reports.md)
-The scenario in this set of topics draws heavily upon the concepts, feature descriptions, and procedures introduced in the [How to set up a product-centric website in SharePoint Server](how-to-set-up-a-product-centric-website.md) series. We recommend reading it first, as it will makes things much easier to understand.
+The scenario in this set of articles draws heavily upon the concepts, feature descriptions, and procedures introduced in the [How to set up a product-centric website in SharePoint Server](how-to-set-up-a-product-centric-website.md) series. We recommend reading it first, as it will make things easier to understand.
SharePoint Install Microsoft Identity Manager For User Profiles In Sharepoint Server https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/install-microsoft-identity-manager-for-user-profiles-in-sharepoint-server.md
Which option is right for you?
|&nbsp;|&nbsp;|&nbsp;| |:--|:--|:--| ||**Microsoft Identity Management server** <br/> |**Active Directory Import** <br/> |
-|Pros <br/> |1. Flexibility allows for customized import. <br/> 2. Can be customized for bidirectional flow. <br/> 3. Imports user profile photos automatically. <br/> 4. Supports non-Active Directory LDAP sources. <br/> 5. Multi-forest scenarios are supported. <br/> |1. Very fast and performant. <br/> 2. Known to be reliable (used by Microsoft 365). <br/> 3. Configurable inside of Central Administration. (Less complex.) <br/> |
-|Cons <br/> |1. A separate MIM server is recommended for use with your SharePoint farm. <br/> 2. The more customized the more complex the architecture, deployment, and management. <br/> |1. Import is unidirectional (changes go from Active Directory to SharePoint Server Profile). <br/> 2. Import from a single Active Directory forest only. <br/> 3. Does not import user photos. <br/> 4. Supports Active Directory LDAP only. <br/> 5. Multi-forest scenarios are not supported. <br/> |
+|Pros <br/> |1. Flexibility allows for customized import. <br/> 2. Can be customized for bidirectional flow. <br/> 3. Imports user profile photos automatically. <br/> 4. Supports non-Active Directory LDAP sources. <br/> 5. Multi-forest scenarios are supported. <br/> |1. Fast and performant. <br/> 2. Known to be reliable (used by Microsoft 365). <br/> 3. Configurable inside of Central Administration. (Less complex.) <br/> |
+|Cons <br/> |1. A separate MIM server is recommended for use with your SharePoint farm. <br/> 2. The more customized the more complex the architecture, deployment, and management. <br/> |1. Import is unidirectional (changes go from Active Directory to SharePoint Server Profile). <br/> 2. Import from a single Active Directory forest only. <br/> 3. Doesn't import user photos. <br/> 4. Supports Active Directory LDAP only. <br/> 5. Multi-forest scenarios aren't supported. <br/> |
> [!TIP] > If you need details, or you need to set up Active Directory Import for your SharePoint Server installation, try [these steps](./configure-profile-synchronization-by-using-sharepoint-active-directory-import.md).
Which option is right for you?
## Choosing MIM for use with SharePoint Server <a name="BKMK_ChooseMIM"> </a>
-If you choose MIM, there are some **prerequisites** of which you should be aware. You will need:
+If you choose MIM, there are some **prerequisites** of which you should be aware. You'll need:
1. For SharePoint Server 2016, a Windows Server 2012 R2 computer or virtual machine for the installation of MIM components. For SharePoint Server 2019, a Windows Server 2016 computer is required. For SharePoint Server Subscription Edition, a Windows Server 2019 computer is required.
If you choose MIM, there are some **prerequisites** of which you should be aware
During these steps, you'll actually install three different elements essential to MIM. The first install will be of the MIM software, itself. You'll also need the SharePoint Management Agent.
-1. First, download and install MIM to the server where you want to install.
+1. First, download and install MIM on the server where you want to install it.
-2. Extract the .zip file and double-click Setup.exe. (Setup.exe is usually found in the SynchronizationService folder of the MIM media.)
+2. Extract the .zip file and double-click Setup.exe. (Setup.exe is found in the SynchronizationService folder of the MIM media.)
3. Click **Next** > accept the end-user license agreement, and click **Next** through the feature selection screen. (You don't need to change the default selection.)
During these steps, you'll actually install three different elements essential t
6. Next, set up the security groups that are needed for MIM to function. You can leave these as default if you wish, but in that case your security groups will be created on the local machine where MIM is being installed. If you have more than one machine configured to run MIM, you may want to create these security groups in Active Directory (AD). Do this in the same domain as the machines where MIM is configured, and enter the group names into this page of the wizard.
-7. The next step (firewall rules) is optional. We recommend you do not check the firewall rule checkbox.
+7. The next step (firewall rules) is optional. We recommend you don't check the firewall rule checkbox.
8. Click to install MIM.
During these steps, you'll actually install three different elements essential t
> [!NOTE] > You will need to back up the keys generated at this point if you are to move to another database server. Save these keys to a secure location and make certain you back up the key file along with the database backup so they're both available in a disaster recovery scenario.
-10. MIM installation should complete. You should log off and back onto your server again to ensure the MIM cache is updated.
+10. MIM installation should complete. You should log off and back on to your server again to ensure the MIM cache is updated.
-11. Once you log on again, ensure the MIM service is running on the server by going to Services (or Start or Windows key> **Run** > services.msc) and then locating the **Forefront Identity Manager Synchronization Service**. No mistake. The service name has not changed!
+11. Once you log on again, ensure the MIM service is running on the server by going to Services (or Start or Windows key > **Run** > services.msc) and then locating the **Forefront Identity Manager Synchronization Service**. No mistake. The service name hasn't changed!
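As a quick alternative to the Services console, the same check can be sketched in PowerShell. This assumes the short service name behind the **Forefront Identity Manager Synchronization Service** display name is `FIMSynchronizationService` (the usual name for MIM 2016 installations); verify the name on your server before relying on it.

```powershell
# Check that the MIM synchronization service is installed and running.
# FIMSynchronizationService is the short service name; the display name
# is still "Forefront Identity Manager Synchronization Service".
Get-Service -Name FIMSynchronizationService |
    Select-Object Status, Name, DisplayName

# Start the service if it isn't running yet.
if ((Get-Service -Name FIMSynchronizationService).Status -ne 'Running') {
    Start-Service -Name FIMSynchronizationService
}
```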
### Install the SharePoint Management Agent (Forefront Identity Manager Connector for SharePoint)
SharePoint Managing A Minrole Server Farm In Sharepoint Server 2016 https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/managing-a-minrole-server-farm-in-sharepoint-server-2016.md
This is a new page in the System Settings category of Central Administration. It
The **Auto Provision** column displays whether the service is enabled in the farm. If the value **Yes** is displayed, service instances for this service will be started on the appropriate MinRole-managed servers in the farm. If the value **No** is displayed, service instances for this service will be stopped on the appropriate MinRole-managed servers in the farm.
-The **Action** column displays one of three values depending on the type of service it is and whether it is enabled in the farm: **Manage Service application**, **Disable Auto Provision**, and **Enable Auto Provision**.
+The **Action** column displays one of three values, depending on the type of service and whether it's enabled in the farm: **Manage Service application**, **Disable Auto Provision**, and **Enable Auto Provision**.
The **Manage Service Application** value indicates that the service is associated with a service application. This service will be enabled or disabled in the farm by its service application, typically when you create or delete the service application. Click the link to access the Service Application Management page.
The **Disable Auto Provision** link disables the service in the farm. When you c
The **Enable Auto Provision** link enables the service in the farm. When you click this link, service instances for this service will be started on the appropriate MinRole-managed servers in the farm.
-The **Compliant** column displays whether the service is in compliance on every server in the farm. If this service is not in compliance on one or more servers, a **Fix** link will be provided. Click this link to automatically reconfigure the service instances of this service to match the expected configuration.
+The **Compliant** column displays whether the service is in compliance on every server in the farm. If this service isn't in compliance on one or more servers, a **Fix** link will be provided. Click this link to automatically reconfigure the service instances of this service to match the expected configuration.
> [!NOTE] > Only members of the local Administrators group on the server that hosts Central Administration have access to the **Fix** link.
In previous releases of SharePoint, this page was accessible only to members of
The role of the server is now displayed next to the name of the server.
-The **Compliant** column has been added to the page. It displays whether the service instance is in compliance on this server. If this service instance is not in compliance on this server, a Fix link will be provided. Click this link to automatically reconfigure the service instance on this server to match the expected configuration.
+The **Compliant** column has been added to the page. It displays whether the service instance is in compliance on this server. If this service instance isn't in compliance on this server, a Fix link will be provided. Click this link to automatically reconfigure the service instance on this server to match the expected configuration.
> [!NOTE] > Only members of the local Administrators group on the server that hosts the Central Administration have access to the Fix link.
New PowerShell cmdlets have been introduced to manage the services in the farm.
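As a minimal sketch, the auto-provision state shown in Central Administration can also be inspected and changed from the SharePoint Management Shell using the service-level cmdlets added in SharePoint Server 2016; the service name used below is only an example, so substitute a service from your own farm.

```powershell
# List all farm services and whether MinRole will auto-provision them.
Get-SPService | Select-Object TypeName, AutoProvision

# Enable auto-provision for a service: its service instances are started
# on the appropriate MinRole-managed servers. (Example service name.)
Start-SPService -Identity "Claims to Windows Token Service"

# Disable auto-provision: the service instances are stopped on
# MinRole-managed servers.
Stop-SPService -Identity "Claims to Windows Token Service"
```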
### Health monitoring
-A new health analyzer rule has been created to ensure that your servers are operating in their optimal MinRole configuration. The **Server role configuration isn't correct** rule runs every night at midnight on each server in your farm. It scans all service instances on the server to detect if any are not in compliance. If any service instance is not in compliance, the health rule will automatically reconfigure it to match the expected configuration. No manual intervention by the SharePoint farm administrator is required.
+A new health analyzer rule has been created to ensure that your servers are operating in their optimal MinRole configuration. The **Server role configuration isn't correct** rule runs every night at midnight on each server in your farm. It scans all service instances on the server to detect if any aren't in compliance. If any service instance isn't in compliance, the health rule will automatically reconfigure it to match the expected configuration. No manual intervention by the SharePoint farm administrator is required.
:::image type="content" alt-text="Displays health rules for MinRole topology in SharePoint Servers 2016 and 2019." source="../media/df3dd75f-d64f-4a1f-8d5c-57daecc9cb38.PNG" lightbox="../media/df3dd75f-d64f-4a1f-8d5c-57daecc9cb38.PNG":::
-The automatic repair functionality of the health rule can be disabled by the SharePoint farm administrator while still allowing the health rule to run. If the health rule detects that a server is not in compliance and the automatic repair functionality is disabled, it will generate a health report in Central Administration. The health report will identify which servers are not in compliance, offer the ability to automatically repair the servers, and provide instructions on how to manually repair the servers.
+The automatic repair functionality of the health rule can be disabled by the SharePoint farm administrator while still allowing the health rule to run. If the health rule detects that a server isn't in compliance and the automatic repair functionality is disabled, it will generate a health report in Central Administration. The health report will identify which servers aren't in compliance, offer the ability to automatically repair the servers, and provide instructions on how to manually repair the servers.
-The SharePoint farm administrator can control the health rule schedule, changing it to run more frequently or less frequently or disabling it so that it is never scheduled. It can also run on demand.
+The SharePoint farm administrator can control the health rule schedule, changing it to run more or less frequently, or disabling it so that it's never scheduled. It can also run on demand.
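For example, the rule can be located and disabled from the SharePoint Management Shell with the standard health analyzer cmdlets. This is a sketch: the `Summary` filter below is an assumption about how the rule is titled, so confirm that it matches exactly one rule before disabling anything.

```powershell
# Find the MinRole compliance health rule by its summary text.
# (The wildcard filter is illustrative; verify the match first.)
$rule = Get-SPHealthAnalysisRule |
    Where-Object { $_.Summary -like "*Server role configuration*" }

# Prevent the rule from running on its schedule. It remains available
# to run on demand from Central Administration.
Disable-SPHealthAnalysisRule -Identity $rule -Confirm:$false
```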
> [!NOTE] > This health rule will not scan or repair servers that are assigned to the Custom role. A server assigned to the Custom role will not be managed by MinRole. ## Developers: How to assign services to server roles
-If you are a SharePoint developer intending to create an application with services, it is recommended that you assign each type of service instance to one or more server roles supported by MinRole:
+If you're a SharePoint developer intending to create an application with services, it's recommended that you assign each type of service instance to one or more server roles supported by MinRole:
**Assign services to server roles**
SharePoint Monitor And Manage App Licenses https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/monitor-and-manage-app-licenses.md
description: "Learn how SharePoint Server farm administrators assign, monitor, a
[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
-You can use the SharePoint Central Administration website to monitor and manage licenses for apps for SharePoint. Licenses for apps for SharePoint are digital sets of verifiable information that state the user rights for a app for SharePoint. Apps that are distributed through the SharePoint Store are the only apps that have built-in licenses that SharePoint Server recognizes.
+You can use the SharePoint Central Administration website to monitor and manage licenses for apps for SharePoint. Licenses for apps for SharePoint are digital sets of verifiable information that state the user rights for an app for SharePoint. Apps that are distributed through the SharePoint Store are the only apps that have built-in licenses that SharePoint Server recognizes.
Members of the Farm Administrators group manage licenses for apps and can also assign license managers for others to manage app for SharePoint licenses.
-Here are the basics of what SharePoint Server does and does not provide for apps for SharePoint licensing:
+Here are the basics of what SharePoint Server does and doesn't provide for apps for SharePoint licensing:
- SharePoint Server provides:
Here are the basics of what SharePoint Server does and does not provide for apps
- APIs for developers to query for license information -- SharePoint Server does not enforce app for SharePoint licenses.
+- SharePoint Server doesn't enforce app for SharePoint licenses.
- Developers must add code in their apps for SharePoint to retrieve license information and react accordingly. -- All app for SharePoint licenses are bound to a specific SharePoint Server deployment but can be transferred to a different SharePoint Server deployment three times.
+- All app for SharePoint licenses are bound to a specific SharePoint Server deployment but can be transferred to a different SharePoint Server deployment up to three times.
## Monitoring and managing app licenses <a name="proc1"> </a>
-A farm administrator or a license manager can check the licenses for all apps for SharePoint on the App Licenses page. It is important to track the number of licenses that are available for each app for SharePoint so that users do not exceed this number. An administrator can assign additional users to a app for SharePoint license, purchase additional licenses for an app, and also add managers to a license.
+A farm administrator or a license manager can check the licenses for all apps for SharePoint on the App Licenses page. It's important to track the number of licenses that are available for each app for SharePoint so that users don't exceed this number. An administrator can assign more users to an app for SharePoint license, purchase more licenses for an app, and also add managers to a license.
**To view app license details**
A farm administrator or a license manager can check the licenses for all apps fo
2. On the **Actions** drop down list, click **Recover License**.
- The app for SharePoint details show any changes the administrator has made.
+ The app for SharePoint details show any changes the administrator has made.
**To add a license manager**
SharePoint Overview Of Access Services In Sharepoint Server 2013 https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/overview-of-access-services-in-sharepoint-server-2013.md
description: "Learn how to use Access Services in SharePoint Server to share sol
[!INCLUDE[appliesto-2013-xxx-xxx-xxx-xxx-md](../includes/appliesto-2013-xxx-xxx-xxx-xxx-md.md)]
-Access Services in SharePoint Server 2013 are service applications that enable you to share two types of Access 2013 solutions on the web.
+Access Services in SharePoint Server 2013 is a set of service applications that enable you to share two types of Access 2013 solutions on the web.
## Access apps are new in SharePoint Server 2013
-Access apps for SharePoint are a new type of database that you build in Access 2013, then use and share with others as an app for SharePoint in a web browser. To build an Access app, you select the type of data that you want to track (contacts, tasks, projects, and so on). Access creates the database structure as well as various views that let you add and edit data. Navigation and basic commands are built-in, so you can start to use the app right away. Once the Access app is running, it's a straightforward task to customize and enhance the Access app over time.
+Access apps for SharePoint are a new type of database that you build in Access 2013, then use and share with others as an app for SharePoint in a web browser. To build an Access app, you select the type of data that you want to track (contacts, tasks, projects, and so on). Access creates the database structure and various views that let you add and edit data. Navigation and basic commands are built in, so you can start to use the app right away. Once the Access app is running, it's a straightforward task to customize and enhance the Access app over time.
## Access web databases are supported for backward compatibility
-By default, you cannot create a web database by using Access 2013. However, you can still view and edit a web database that was previously created by using Access 2010 and SharePoint Server 2010, and you can republish it to SharePoint Server 2016.
+By default, you can't create a web database by using Access 2013. However, you can still view and edit a web database that was previously created by using Access 2010 and SharePoint Server 2010, and you can republish it to SharePoint Server 2016.
-There is no way to automatically convert a web database to an Access app. If you want to manually convert a web database to an Access app to take advantage of new functionality and features, you can do the following:
+There's no way to automatically convert a web database to an Access app. If you want to manually convert a web database to an Access app to take advantage of new functionality and features, you can do the following:
- Import the data from the web database into a new Access app.
SharePoint Performance Testing https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/performance-testing.md
description: Learn about how to plan and execute performance testing of a ShareP
[!INCLUDE[appliesto-2013-xxx-xxx-xxx-xxx-md](../includes/appliesto-2013-xxx-xxx-xxx-xxx-md.md)]
-This article describes how to test the performance of SharePoint Server 2013. The testing and optimization stage is a critical component of effective capacity management. You should test new architectures before you deploy them to production and you should conduct acceptance testing with following monitoring best practices in order to ensure the architectures you design achieve the performance and capacity targets. This allows you to identify and optimize potential bottlenecks before they impact users in a live deployment. If you are upgrading from an Office SharePoint Server 2007 environment and plan to make architectural changes, or are estimating user load of the new SharePoint Server 2013 features, then testing particularly important to make sure your new SharePoint Server 2013-based environment will meet performance and capacity targets.
+This article describes how to test the performance of SharePoint Server 2013. The testing and optimization stage is a critical component of effective capacity management. You should test new architectures before you deploy them to production, and you should conduct acceptance testing while following monitoring best practices to ensure that the architectures you design achieve the performance and capacity targets. This allows you to identify and optimize potential bottlenecks before they impact users in a live deployment. If you're upgrading from an Office SharePoint Server 2007 environment and plan to make architectural changes, or are estimating user load of the new SharePoint Server 2013 features, then testing is particularly important to make sure your new SharePoint Server 2013-based environment will meet performance and capacity targets.
Once you have tested your environment, you can analyze the test results to determine what changes need to be made in order to achieve the performance and capacity targets you established in [Step 1: Model](capacity-planning.md#step1) of [Capacity planning for SharePoint Server 2013](capacity-planning.md).
-These are the recommended sub steps you should follow for pre-production:
+These are the recommended substeps you should follow for preproduction:
- Create the test environment that mimics the initial architecture you designed in [Step 2: Design](capacity-planning.md#step2).
Verify that your plan includes:
- Hardware that is designed to operate at expected production performance targets. Always measure the performance of test systems conservatively. -- If you have custom code or custom component, it is important that you test the performance of those components in isolation first to validate their performance and stability. After they are stable, you should test the system with those components installed and compare performance to the farm without them installed. Custom components are often a major culprit of performance and reliability problems in production systems.
+- If you have custom code or custom components, it's important that you test the performance of those components in isolation first to validate their performance and stability. After they're stable, you should test the system with those components installed and compare performance to the farm without them installed. Custom components are often a major culprit of performance and reliability problems in production systems.
- Know the goal of your testing. Understand ahead of time what your testing objectives are. Is it to validate the performance of some new custom components that were developed for the farm? Is it to see how long it will take to crawl and index a set of content? Is it to determine how many requests per second your farm can support? There can be many different objectives during a test, and the first step in developing a good test plan is deciding what your objectives are. -- Understand how to measure for your testing goal. If you are interested in measuring the throughput capacity of your farm, for example, you will want to measure the RPS and page latency. If you are measuring for search performance, then you will want to measure crawl time and document indexing rates. If your testing objective is well understood, that will help you clearly define what key performance indicators you need to validate in order to complete your tests.
+- Understand how to measure for your testing goal. If you're interested in measuring the throughput capacity of your farm, for example, you'll want to measure the RPS and page latency. If you're measuring for search performance, then you'll want to measure crawl time and document indexing rates. If your testing objective is well understood, that will help you clearly define what key performance indicators you need to validate in order to complete your tests.
## Create the Test Environment <a name="createenvironment"> </a> Once your test objectives have been decided, your measurements have been defined, and you have determined what the capacity requirements are for your farm (from steps 1 and 2 of this process), the next objective will be to design and create the test environment. The effort to create a test environment is often underestimated. It should duplicate the production environment as closely as possible. Some of the features and functionality you should consider when designing your test environment include: -- **Authentication**: Decide whether the farm will use Active Directory Domain Services (AD DS), forms-based authentication (and if so with what directory), claims-based authentication, etc. Regardless of which directory you are using, how many users do you need in your test environment and how are you going to create them? How many groups or roles are you going to need and how will you create and populate them? You also need to ensure that you have enough resources allocated to your authentication services that they don't become a bottleneck during testing.
+- **Authentication**: Decide whether the farm will use Active Directory Domain Services (AD DS), forms-based authentication (and if so with what directory), claims-based authentication, etc. Regardless of which directory you're using, how many users do you need in your test environment and how are you going to create them? How many groups or roles are you going to need and how will you create and populate them? You also need to ensure that you have enough resources allocated to your authentication services that they don't become a bottleneck during testing.
-- **DNS**: Know what the namespaces are that you will need during your testing. Identify which servers will be responding to those requests and make sure you've included a plan that has what IP addresses will be used by which servers, and what DNS entries you will need to create.
+- **DNS**: Know what the namespaces are that you'll need during your testing. Identify which servers will be responding to those requests and make sure you've included a plan that has what IP addresses will be used by which servers, and what DNS entries you'll need to create.
-- **Load balancing**: Assuming you are using more than one server (which you normally would or you likely wouldn't have enough load to warrant load testing), you will need some kind of load balancer solution. That could be a hardware load balancing device, or you could use software load balancing like Windows NLB. Figure out what you will use and write down all of the configuration information you will need to get it set up quickly and efficiently. Another thing to remember is that load test agents typically try and resolve the address to a URL only once every 30 minutes. That means that you should not use a local hosts file or round robin DNS for load balancing because the test agents will likely end up going to the same server for every single request, instead of balancing around all available servers.
+- **Load balancing**: Assuming you're using more than one server (which you normally would or you likely wouldn't have enough load to warrant load testing), you'll need some kind of load balancer solution. That could be a hardware load balancing device, or you could use software load balancing like Windows NLB. Figure out what you will use and write down all of the configuration information you'll need to get it set up quickly and efficiently. Another thing to remember is that load test agents typically try to resolve the address to a URL only once every 30 minutes. That means that you shouldn't use a local hosts file or round robin DNS for load balancing because the test agents will likely end up going to the same server for every single request, instead of balancing around all available servers.
-- **Test servers**: When you plan your test environment, you not only need to plan for the servers for the SharePoint Server 2013 farm, you also need to plan for the machines needed to execute the tests. Typically that will include three servers at a minimum; more may be necessary. If you are using Visual Studio Team System (Team Test Load Agent) to do the testing, one machine will be used as the load test controller. There are generally two or more machines that are used as load test agents. The agents are the machines that take the instructions from the test controller about what to test and issue the requests to the SharePoint Server 2013 farm. The test results themselves are stored on a SQL Server-based computer. You should not use the same SQL Server-based computer that is used for the SharePoint Server 2016 farm, because writing the test data will skew the available SQL Server resources for the SharePoint Server 2013 farm. You also need to monitor your test servers when running your tests, the same way as you would monitor the servers in the SharePoint Server 2013 farm, or domain controllers, etc. to make sure that the test results are representative of the farm you're setting up. Sometimes the load agents or controller can become the bottleneck themselves. If that happens then the throughput, you see in your test is typically not the maximum farm can support.
+- **Test servers**: When you plan your test environment, you not only need to plan for the servers for the SharePoint Server 2013 farm, you also need to plan for the machines needed to execute the tests. Typically that will include three servers at a minimum; more may be necessary. If you're using Visual Studio Team System (Team Test Load Agent) to do the testing, one machine will be used as the load test controller. There are generally two or more machines that are used as load test agents. The agents are the machines that take the instructions from the test controller about what to test and issue the requests to the SharePoint Server 2013 farm. The test results themselves are stored on a SQL Server-based computer. You shouldn't use the same SQL Server-based computer that is used for the SharePoint Server 2013 farm, because writing the test data will skew the available SQL Server resources for the SharePoint Server 2013 farm. You also need to monitor your test servers when running your tests, the same way as you would monitor the servers in the SharePoint Server 2013 farm, or domain controllers, etc. to make sure that the test results are representative of the farm you're setting up. Sometimes the load agents or controller can become the bottleneck themselves. If that happens, the throughput you see in your test is typically not the maximum the farm can support.
- **SQL Server**: In your test environment, follow the guidance in the sections "Configure SQL Server" and "Validate and monitor storage and SQL Server performance" in the article [Storage and SQL Server capacity planning and configuration (SharePoint Server)](storage-and-sql-server-capacity-planning-and-configuration.md). -- **Dataset validation**: As you decide what content you are going to run tests against, remember that in the best case scenario you will use data from an existing production system. For example, you can back up your content databases from a production farm and restore them into your test environment, then attach the databases to bring the content into the farm. Anytime you run tests against made up or sample data, you run the risk of having your results skewed because of differences in your content corpus.
+- **Dataset validation**: As you decide what content you're going to run tests against, remember that in the best case scenario you'll use data from an existing production system. For example, you can back up your content databases from a production farm and restore them into your test environment, then attach the databases to bring the content into the farm. Anytime you run tests against made up or sample data, you run the risk of having your results skewed because of differences in your content corpus.
If you do have to create sample data, there are a few considerations to keep in mind as you build out that content:
If you do have to create sample data, there are a few considerations to keep in
- You should have an idea of the customizations the production site will be using. For example, master pages, style sheets, JavaScript, etc. should all be implemented in the test environment as closely as possible to the production environment. -- Determine how many SharePoint groups and/or permission levels you are going to need, and how you are going to associate users with them.
+- Determine how many SharePoint groups and/or permission levels you're going to need, and how you're going to associate users with them.
- Figure out whether you'll need to do profile imports, and how long that will take. - Determine whether you'll need Audiences, and how you'll create and populate them. -- Determine whether you need additional search content sources, and what you will need to create them. If you won't need to create them, determine whether you'll have network access to be able to crawl them.
+- Determine whether you need more search content sources, and what you will need to create them. If you won't need to create them, determine whether you'll have network access to be able to crawl them.
-- Determine whether you have enough sample data - documents, lists, list items, etc. If not, create a plan for how you will create this content.
+- Determine whether you have enough sample data - documents, lists, list items, etc. If not, create a plan for how you'll create this content.
- Have a plan for enough unique content to adequately test search. A common mistake is to upload the same document - maybe hundreds or even thousands of times - to different document libraries with different names. That can impact search performance because the query processor will spend an inordinate amount of time doing duplicate detection that it wouldn't otherwise have to in a production environment with real content. ## Create Tests and Tools <a name="createtests"> </a>
-After the test environment is functional, it is time to create and fine-tune the tests that will be used to measure the performance capacity of the farm. This section will at times make references specifically to Visual Studio Team System (Team Test Load Agent), but many of the concepts are applicable irrespective of which load test tool you use. For more information about testing tools available for Azure DevOps (formerly VSTS), see [DevOps tools overview for Azure DevOps](/azure/devops/user-guide/devops-alm-overview).
+After the test environment is functional, it's time to create and fine-tune the tests that will be used to measure the performance capacity of the farm. This section will at times make references specifically to Visual Studio Team System (Team Test Load Agent), but many of the concepts are applicable irrespective of which load test tool you use. For more information about testing tools available for Azure DevOps (formerly VSTS), see [DevOps tools overview for Azure DevOps](/azure/devops/user-guide/devops-alm-overview).
-You can also use the SharePoint Load Test Kit (LTK) with VSTS for load testing of SharePoint 2010 farms. The Load Test Kit generates a Visual Studio Team System 2008 load test based on Windows SharePoint Services 3.0 and Microsoft Office SharePoint Server 2007 IIS logs. The VSTS load test can be used to generate synthetic load against SharePoint Foundation 2010 or SharePoint Server 2010 as part of a capacity planning exercise or a pre-upgrade stress test.
+You can also use the SharePoint Load Test Kit (LTK) with VSTS for load testing of SharePoint 2010 farms. The Load Test Kit generates a Visual Studio Team System 2008 load test based on Windows SharePoint Services 3.0 and Microsoft Office SharePoint Server 2007 IIS logs. The VSTS load test can be used to generate synthetic load against SharePoint Foundation 2010 or SharePoint Server 2010 as part of a capacity planning exercise or a preupgrade stress test.
The Load Test Kit is included in the Microsoft SharePoint 2010 Administration Toolkit v2.0, available from the [Microsoft Download Center](https://www.microsoft.com/download/details.aspx?id=20022).
-A key criterion to the success of the tests is to be able to effectively simulate a realistic workload by generating requests across a wide range of the test site data, just as users would access a wide range of content in a production SharePoint Server 2013 farm. In order to do that, you will typically need to construct your tests such that they are data driven. Rather than creating hundreds of individual tests that are hard-coded to access a specific page, you should use just a few tests that use data sources containing the URLs for those items to dynamically access that set of pages.
+A key criterion to the success of the tests is to be able to effectively simulate a realistic workload by generating requests across a wide range of the test site data, just as users would access a wide range of content in a production SharePoint Server 2013 farm. In order to do that, you'll typically need to construct your tests such that they're data driven. Rather than creating hundreds of individual tests that are hard-coded to access a specific page, you should use just a few tests that use data sources containing the URLs for those items to dynamically access that set of pages.
- In Visual Studio Team System (Team Test Load Agent), a data source can come in a variety of formats, but a CSV file format is often easiest to manage and transport between development and test environments. Keep in mind that creating CSV files with that content might require the creation of custom tools to enumerate the SharePoint Server 2013-based environment and record the various URLs being used.
+ In Visual Studio Team System (Team Test Load Agent), a data source can come in various formats, but a CSV file format is often easiest to manage and transport between development and test environments. Keep in mind that creating CSV files with that content might require the creation of custom tools to enumerate the SharePoint Server 2013-based environment and record the various URLs being used.
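As a concrete illustration, the CSV data source described above is just a header row (which becomes the data-binding column name) followed by one URL per line. A minimal sketch of a tool that writes one, assuming a URL list has already been gathered from the farm (the sample URLs here are hypothetical placeholders, not real sites):

```python
import csv
import io

def write_url_data_source(urls, fileobj):
    # Write a one-column CSV that a data-driven web test can bind to.
    # Enumerating a real SharePoint farm would require its object model or
    # REST API; the URL list passed in here is a stand-in for that output.
    writer = csv.writer(fileobj)
    writer.writerow(["Url"])  # header row becomes the binding column name
    for url in urls:
        writer.writerow([url])

# Hypothetical sample URLs for illustration only
sample_urls = [
    "http://contoso/sites/team1/default.aspx",
    "http://contoso/sites/team2/Shared%20Documents/Forms/AllItems.aspx",
]
buf = io.StringIO()
write_url_data_source(sample_urls, buf)
print(buf.getvalue())
```

The resulting file can then be attached to a web test as a data source and bound to the request URL in place of the recorded hard-coded value.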
You may need to use tools for tasks like:
- Creating a list of sample search keywords and phrases -- Populating SharePoint groups and permission levels with users and Active Directory groups (or roles if you are using forms based authentication)
+- Populating SharePoint groups and permission levels with users and Active Directory groups (or roles if you're using forms based authentication)
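A tool for the first task - producing sample search keywords and phrases - can be as simple as combining word lists into unique phrases and writing them out as a data source. A rough sketch (the word lists are made up; in practice the terms should come from your actual content corpus):

```python
import itertools

def generate_search_phrases(adjectives, nouns, limit=None):
    # Combine two word lists into unique two-word search phrases.
    phrases = [f"{a} {n}" for a, n in itertools.product(adjectives, nouns)]
    return phrases[:limit] if limit is not None else phrases

phrases = generate_search_phrases(
    ["quarterly", "annual"], ["report", "budget", "review"]
)
print(len(phrases), "unique phrases")
```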
When creating the web tests, there are other best practices that you should observe and implement. They include: -- Record simple web tests as a starting point. Those tests will have hard-coded values in them for parameters like URL, ID's, etc. Replace those hard-coded values with links from your CSV files. Data binding those values in Visual Studio Team System (Team Test Load Agent) is extremely easy.
+- Record simple web tests as a starting point. Those tests will have hard-coded values in them for parameters like URL, IDs, etc. Replace those hard-coded values with links from your CSV files. Data binding those values in Visual Studio Team System (Team Test Load Agent) is easy.
-- Always have validation rules for your test. For example, when requesting a page, if an error occurs you will often get the error.aspx page in response. From a web test perspective, it appears as just another positive response, because you get an HTTP status code of 200 (successful) in the load test results. Obviously an error has occurred though so that should be tracked differently. Creating one or more validation rules allows you to trap when certain text is sent as a response so that the validation fails and the request is marked as a failure. For example, in Visual Studio Team System (Team Test Load Agent) a simple validation rule might be a ResponseUrl validation - it records a failure if the page that is rendered after redirects is not the same response page that was recorded in the test. You could also add a FindText rule that will record a failure if it finds the word "access denied", for example, in the response.
+- Always have validation rules for your test. For example, when requesting a page, if an error occurs you'll often get the error.aspx page in response. From a web test perspective, it appears as just another positive response, because you get an HTTP status code of 200 (successful) in the load test results. An error has occurred, though, so it should be tracked differently. Creating one or more validation rules allows you to trap when certain text is sent as a response so that the validation fails and the request is marked as a failure. For example, in Visual Studio Team System (Team Test Load Agent) a simple validation rule might be a ResponseUrl validation - it records a failure if the page that is rendered after redirects isn't the same response page that was recorded in the test. You could also add a FindText rule that will record a failure if it finds the word "access denied," for example, in the response.
-- Use multiple users in different roles for tests. Certain behaviors such as output caching work differently depending on the rights of the current user. For example, a site collection administrator or an authenticated user with approval or authoring rights will not get cached results because we always want them to see the most current version of content. Anonymous users, however, will get the cached content. You need to make sure that your test users are in a mix of these roles that approximately matches the mix of users in the production environment. For example, in production there are probably only two or three site collection administrators, so you should not create tests where 10% of the page requests are made by user accounts that are site collection administrators over the test content.
+- Use multiple users in different roles for tests. Certain behaviors such as output caching work differently depending on the rights of the current user. For example, a site collection administrator or an authenticated user with approval or authoring rights won't get cached results because we always want them to see the most current version of content. Anonymous users, however, will get the cached content. You need to make sure that your test users are in a mix of these roles that approximately matches the mix of users in the production environment. For example, in production there are probably only two or three site collection administrators, so you shouldn't create tests where 10% of the page requests are made by user accounts that are site collection administrators over the test content.
- Parsing dependent requests is a setting in Visual Studio Team System (Team Test Load Agent) that determines whether the test agent should attempt to retrieve just the page, or the page and all associated requests that are part of the page, such as images, style sheets, scripts, etc. When load testing, we usually ignore these items for a few reasons: - After a user hits a site the first time these items are often cached by the local browser
- - These items don't typically come from SQL Server in a SharePoint Server 2013-based environment. With BLOB caching turned on, they are instead served by the Web servers so they don't generate SQL Server load.
+ - These items don't typically come from SQL Server in a SharePoint Server 2013-based environment. With BLOB caching turned on, they're instead served by the Web servers so they don't generate SQL Server load.
-If you regularly have a high percentage of first time users to your site, or you have disabled browser caching, or for some reason you don't intend to use the blob cache, then it may make sense to enable parsing dependent requests in your tests. However this is really the exception and not the rule of thumb for most implementations. Be aware that if you do turn this on it can significantly inflate the RPS numbers reported by the test controller. These requests are served so quickly it may mislead you into thinking that there is more capacity available in the farm than there actually is.
+If you regularly have a high percentage of first time users to your site, or you have disabled browser caching, or for some reason you don't intend to use the blob cache, then it may make sense to enable parsing dependent requests in your tests. However, this is really the exception and not the rule for most implementations. Be aware that if you do turn this on, it can significantly inflate the RPS numbers reported by the test controller. These requests are served so quickly that it may mislead you into thinking that there's more capacity available in the farm than there actually is.
- Remember to model client application activity as well. Client applications, such as Microsoft Word, PowerPoint, Excel, and Outlook generate requests to SharePoint Server 2013 farms as well. They add load to the environment by sending the server requests such as retrieving RSS feeds, acquiring social information, requesting details on site and list structure, synchronizing data, etc. These types of requests should be included and modeled if you have those clients in your implementation. -- In most cases a web test should only contain a single request. It's easier to fine-tune and troubleshoot your testing harness and individual requests if the test only contains a single request. Web tests will typically need to contain multiple requests if the operation it is simulating is composed of multiple requests. For example, to test this set of actions you will need a test with multiple step: checking out a document, editing it, checking it in and publishing it. It also requires reserving state between the steps - for example, the same user account should be used to check it out, make the edits, and check it back in. Those multi-step operations that require state to be carried forward between each step are best served by multiple requests in a single web test.
+- In most cases a web test should only contain a single request. It's easier to fine-tune and troubleshoot your testing harness and individual requests if the test only contains a single request. Web tests will typically need to contain multiple requests if the operation it's simulating is composed of multiple requests. For example, to test this set of actions you'll need a test with multiple steps: checking out a document, editing it, checking it in, and publishing it. It also requires reserving state between the steps - for example, the same user account should be used to check it out, make the edits, and check it back in. Those multi-step operations that require state to be carried forward between each step are best served by multiple requests in a single web test.
- Test each web test individually. Make sure that each test is able to complete successfully before running it in a larger load test. Confirm that all of the names for web applications resolve, and that the user accounts used in the test have sufficient rights to execute the test.
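The validation-rule behavior described above (a ResponseUrl check plus a FindText check layered on top of the HTTP status) can be sketched as follows. The rule names mirror the Visual Studio concepts, but this standalone function is only an illustration of the logic, not the product API:

```python
def response_passes_validation(status_code, final_url, expected_url, body,
                               fail_text="access denied"):
    # Status 200 alone isn't enough: an error.aspx page also returns 200.
    if status_code != 200:
        return False
    # ResponseUrl-style rule: fail if redirects landed somewhere unexpected.
    if final_url != expected_url:
        return False
    # FindText-style rule: fail if known error text appears in the response.
    if fail_text.lower() in body.lower():
        return False
    return True
```

With rules like these in place, a request that was redirected to error.aspx, or whose body contains "Access Denied", is recorded as a failure even though its status code was 200.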
There are some additional best practices that should be observed and implemented when running load tests:
- Use a reasonable read/write ratio in your tests. Overloading the number of writes in a test can significantly impact the overall throughput of a test. Even on collaboration farms, the read/write ratios tend to have many more reads than writes. -- Consider the impact of other resource intensive operations and decide whether they should be occurring during the load test. For example, operations like backup and restore are not generally done during a load test. A full search crawl may not be usually run during a load test, whereas an incremental crawl may be normal. You need to consider how those tasks will be scheduled in production - will they be running at peak load times? If not, then they should probably be excluded during load testing, when you are trying to determine the maximum steady state load you can support for peak traffic.
+- Consider the impact of other resource intensive operations and decide whether they should be occurring during the load test. For example, operations like backup and restore are not generally done during a load test. A full search crawl isn't usually run during a load test, whereas an incremental crawl may be normal. You need to consider how those tasks will be scheduled in production - will they be running at peak load times? If not, then they should probably be excluded during load testing, when you're trying to determine the maximum steady state load you can support for peak traffic.
-- Don't use think times. Think times are a feature of Visual Studio Team System (Team Test Load Agent) that allow you to simulate the time that users pause between clicks on a page. For example a typical user might load a page, spend three minutes reading it, then click a link on the page to visit another site. Trying to model this in a test environment is nearly impossible to do correctly, and effectively doesn't add value to the test results. It's difficult to model because most organizations don't have a way to monitor different users and the time they spend between clicks on different types of SharePoint sites (like publishing versus search versus collaboration, etc.). It also doesn't really add value because even though a user may pause between page requests, the SharePoint Server 2013-based servers do not. They just get a steady stream of requests that may have peaks and valleys over time, but they are not waiting idly as each user pauses between clicking links on a page.
+- Don't use think times. Think times are a feature of Visual Studio Team System (Team Test Load Agent) that allow you to simulate the time that users pause between clicks on a page. For example a typical user might load a page, spend three minutes reading it, then click a link on the page to visit another site. Trying to model this in a test environment is nearly impossible to do correctly, and effectively doesn't add value to the test results. It's difficult to model because most organizations don't have a way to monitor different users and the time they spend between clicks on different types of SharePoint sites (like publishing versus search versus collaboration, etc.). It also doesn't really add value because even though a user may pause between page requests, the SharePoint Server 2013-based servers don't. They just get a steady stream of requests that may have peaks and valleys over time, but they aren't waiting idly as each user pauses between clicking links on a page.
-- Understand the difference between users and requests. Visual Studio Team System (Team Test Load Agent) has load pattern where it asks you to enter the number of users to simulate. This doesn't have anything to do with application users, it's really just how many threads are going to be used on the load test agents to generate requests. A common mistake is thinking that if the deployment will have 5,000 users, for example, then 5,000 is the number that should be used in Visual Studio Team System (Team Test Load Agent) - it is not! That's one of the many reasons why when estimating capacity planning requirements, the usage requirements should be based on number of requests per second and not number of users. In a Visual Studio Team System (Team Test Load Agent) load test, you will find that you can often generate hundreds of requests per second using only 50 to 75 load test "users".
+- Understand the difference between users and requests. Visual Studio Team System (Team Test Load Agent) has a load pattern setting where it asks you to enter the number of users to simulate. This doesn't have anything to do with application users, it's just how many threads are going to be used on the load test agents to generate requests. A common mistake is thinking that if the deployment will have 5,000 users, for example, then 5,000 is the number that should be used in Visual Studio Team System (Team Test Load Agent) - it isn't! That's one of the many reasons why when estimating capacity planning requirements, the usage requirements should be based on number of requests per second and not number of users. In a Visual Studio Team System (Team Test Load Agent) load test, you'll find that you can often generate hundreds of requests per second using only 50 to 75 load test "users".
-- Use a constant load pattern for the most reliable and reproducible test results. In Visual Studio Team System (Team Test Load Agent), you have the option of basing load on a constant number of users (threads, as explained in the previous point), a stepped up load pattern of users, or a goal based usage test. A stepped load pattern is when you start with a lower number of users and then "step up" adding additional users every few minutes. A goal based usage test is when you establish a threshold for a certain diagnostic counter, like CPU utilization, and test attempts to drive the load to keep that counter between a minimum and maximum threshold that you define for it. However, if you are just trying to determine the maximum throughput your SharePoint Server 2013 farm can sustain during peak load, it is more effective and accurate to just pick a constant load pattern. That allows you to more easily identify how much load the system can take before starting to regularly exceed the thresholds that should be maintained in a healthy farm.
+- Use a constant load pattern for the most reliable and reproducible test results. In Visual Studio Team System (Team Test Load Agent), you have the option of basing load on a constant number of users (threads, as explained in the previous point), a stepped up load pattern of users, or a goal based usage test. A stepped load pattern is when you start with a lower number of users and then "step up" adding additional users every few minutes. A goal based usage test is when you establish a threshold for a certain diagnostic counter, like CPU utilization, and test attempts to drive the load to keep that counter between a minimum and maximum threshold that you define for it. However, if you're just trying to determine the maximum throughput your SharePoint Server 2013 farm can sustain during peak load, it's more effective and accurate to just pick a constant load pattern. That allows you to more easily identify how much load the system can take before starting to regularly exceed the thresholds that should be maintained in a healthy farm.
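The users-versus-requests distinction above has a simple arithmetic consequence: with think times disabled, each load-test "user" (thread) issues its next request as soon as the previous one completes, so steady-state throughput is roughly the thread count divided by the average response time. A sketch of that estimate:

```python
def estimated_rps(threads, avg_response_seconds):
    # With no think time, each thread runs a closed request loop, so
    # throughput is approximately concurrency / latency (Little's Law).
    return threads / avg_response_seconds

# 50 load-test "users" against pages averaging 100 ms each
print(estimated_rps(50, 0.1))
```

This is why 50 to 75 test "users" can easily produce hundreds of requests per second against a responsive farm.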
-Each time you run a load test remember that it is changing data in the database. Whether that's uploading documents, editing list items, or just recording activity in the usage database, there will be data that is written to SQL Server. To ensure a consistent and legitimate set of test results from each load test, you should have a backup available before you run the first load test. After each load test is complete the backup should be used to restore the content back to the way, it was before the test was started.
+Each time you run a load test remember that it's changing data in the database. Whether that's uploading documents, editing list items, or just recording activity in the usage database, there will be data that is written to SQL Server. To ensure a consistent and legitimate set of test results from each load test, you should have a backup available before you run the first load test. After each load test is complete, the backup should be used to restore the content back to the way it was before the test started.
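One way to ground the read/write ratio mentioned in the best practices above is to classify requests from production IIS logs by HTTP verb. This crude classifier (GETs as reads, everything else as writes) is only a first approximation; real SharePoint traffic warrants finer treatment:

```python
def read_write_counts(http_methods):
    # Rough classification: GET requests are reads; POST, PUT, and the
    # rest are treated as writes.
    reads = sum(1 for m in http_methods if m.upper() == "GET")
    return reads, len(http_methods) - reads

reads, writes = read_write_counts(["GET", "GET", "POST", "GET", "PUT"])
print(f"{reads} reads : {writes} writes")
```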
## See also <a name="createtests"> </a>
SharePoint Plan A Business Connectivity Services Solution https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/plan-a-business-connectivity-services-solution.md
description: "Create a plan for your Microsoft Business Connectivity Services so
Microsoft Business Connectivity Services solutions integrate external data deeply into SharePoint Server and Office. Each Business Connectivity Services solution is custom-built using Visual Studio. There are no out-of-the-box Business Connectivity Services configurations or templates that you can use.
-This article takes you through five questions that you must answer before you can design your Business Connectivity Services solution. Be sure to collect all this information and communicate it to all the key stakeholders to review and approve. When you do this, you will help ensure that everyone involved has the same understanding of the needs of the project and how the solution will work.
+This article takes you through five questions that you must answer before you can design your Business Connectivity Services solution. Be sure to collect all this information and communicate it to all the key stakeholders to review and approve. When you do this, you'll help ensure that everyone involved has the same understanding of the needs of the project and how the solution will work.
## Where is the data? <a name="section1"> </a> Your first step in planning your Business Connectivity Services solution is to understand where the external data that you want is. You need to understand this from three perspectives.
-You will need to know who has daily administrative responsibility over the external data source. This is the group that you will need to work with to help set up connectivity to the external data. They will be able to tell you how the data is made available for external consumption, how it is secured, and so on. You might need them to create credentials in the external system for you to use. Be prepared to answer their questions on the impact of your Business Connectivity Services solution on their data and their external system.
+You'll need to know who has daily administrative responsibility over the external data source. This is the group that you'll need to work with to help set up connectivity to the external data. They'll be able to tell you how the data is made available for external consumption, how it's secured, and so on. You might need them to create credentials in the external system for you to use. Be prepared to answer their questions on the impact of your Business Connectivity Services solution on their data and their external system.
### Network considerations
-You also need to consider where the external data source is in relation to the network that Business Connectivity Services and your users will be on. To help you figure this out, draw a diagram of the three components on your network and see where they lie. For example, you can see whether they are all on your internal network and inside your firewall. Or, you could see that the Business Connectivity Services infrastructure and the external data source are separated by a firewall or boundary network and that they are on completely separate networks. Here are some basic rules that you can use to guide your design:
+You also need to consider where the external data source is in relation to the network that Business Connectivity Services and your users will be on. To help you figure this out, draw a diagram of the three components on your network and see where they lie. For example, you can see whether they're all on your internal network and inside your firewall. Or, you could see that the Business Connectivity Services infrastructure and the external data source are separated by a firewall or boundary network and that they are on completely separate networks. Here are some basic rules that you can use to guide your design:
- If the external data source is outside of your network, such as on the Internet, Business Connectivity Services will need to communicate with the external data source through your corporate firewall and you need to plan for that traffic.
## How is the data surfaced? <a name="section2"> </a>
-Business Connectivity Services solutions can connect to an external data source through OData, SQL Server, Windows Communication Foundation (WCF) service, and .NET Assemblies. You need to know (and you can find this out from the external system administrators) how the data is surfaced for external consumption. How the external data is surfaced determines what development tools you will use to create the external content type. The following table shows you which tools to use based on the external data source.
+Business Connectivity Services solutions can connect to an external data source through OData, SQL Server, Windows Communication Foundation (WCF) service, and .NET Assemblies. You need to know (and you can find this out from the external system administrators) how the data is surfaced for external consumption. How the external data is surfaced determines what development tools you'll use to create the external content type. The following table shows you which tools to use based on the external data source.
## How is the data secured? <a name="section3"> </a>
-Business Connectivity Services handles all authentications for communications between itself and the external system. Basically, Business Connectivity Services presents the external system with information that allows the external system to authenticate (determine whether you are who you say you are) the request and then authorize access to data in the external system. Business Connectivity Services supports many types of authentication.
+Business Connectivity Services handles all authentications for communications between itself and the external system. Basically, Business Connectivity Services presents the external system with information that allows the external system to authenticate (determine whether you're who you say you are) the request and then authorize access to data in the external system. Business Connectivity Services supports many types of authentication.
-For your Business Connectivity Services solution design, you have to know what authentication mechanism the external system requires. This way, you will know how to configure Business Connectivity Services so that it presents the authentication information in the manner that the external system requires. Business Connectivity Services supports three authentication models:
+For your Business Connectivity Services solution design, you have to know what authentication mechanism the external system requires. This way, you'll know how to configure Business Connectivity Services so that it presents the authentication information in the manner that the external system requires. Business Connectivity Services supports three authentication models:
-- **Credentials-based authentication** In credentials-based authentication models, credentials are passed from Business Connectivity Services to the external system. Credentials are a combination of a user name and some form of password. Business Connectivity Services has a number of ways of doing this, including passing the credentials of the user who is logged on, passing the credentials of the service that is making the request, or mapping the credentials of the user who is logged on to a different set of credentials that the external system recognizes.
+- **Credentials-based authentication** In credentials-based authentication models, credentials are passed from Business Connectivity Services to the external system. Credentials are a combination of a user name and some form of password. Business Connectivity Services has many ways of doing this, including passing the credentials of the user who is logged on, passing the credentials of the service that is making the request, or mapping the credentials of the user who is logged on to a different set of credentials that the external system recognizes.
-- **Claims-based authentication** In some authentication scenarios, the external system will not accept credentials directly from Business Connectivity Services. However, the external system will accept them from a third-party authentication service that it trusts. The third-party authentication service (a security token provider) accepts a grouping of information (known as assertions) about the requestor. The whole grouping is known as a claim, and a claim can contain more info about the requestor than just the user name and password. For example, a claim can contain metadata about the requestor, such as the requestor's email address or the security groups to which the requestor belongs. The third-party authentication service performs the authentication of the requestor based on the assertions in the claim and creates a security token for the requestor to use. The requestor (Business Connectivity Services) then presents the security token to the external system, and the external system looks to see what data the requestor has been authorized to access.
+- **Claims-based authentication** In some authentication scenarios, the external system won't accept credentials directly from Business Connectivity Services. However, the external system will accept them from a third-party authentication service that it trusts. The third-party authentication service (a security token provider) accepts a grouping of information (known as assertions) about the requestor. The whole grouping is known as a claim, and a claim can contain more info about the requestor than just the user name and password. For example, a claim can contain metadata about the requestor, such as the requestor's email address or the security groups to which the requestor belongs. The third-party authentication service performs the authentication of the requestor based on the assertions in the claim and creates a security token for the requestor to use. The requestor (Business Connectivity Services) then presents the security token to the external system, and the external system looks to see what data the requestor has been authorized to access.
-- **Custom authentication** If the external system that you are working with does not support credentials-based or claims-based authentication, then you will have to develop, test, and implement a custom solution that takes the credentials that Business Connectivity Services can produce and translates them into a format that the external system will accept. You can implement a custom authentication solution for OData data sources that are secured either by OAuth or a custom ASP.NET HTTP module and are on premises.
+- **Custom authentication** If the external system that you're working with doesn't support credentials-based or claims-based authentication, then you'll have to develop, test, and implement a custom solution that takes the credentials that Business Connectivity Services can produce and translates them into a format that the external system will accept. You can implement a custom authentication solution for OData data sources that are secured either by OAuth or a custom ASP.NET HTTP module and are on premises.
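The claims-based flow described above can be sketched in a few lines of Python. This is a toy model under stated assumptions: the class names, the token format, and the assertion fields are all illustrative, not part of any Business Connectivity Services or Windows identity API.

```python
import hashlib


class SecurityTokenProvider:
    """Hypothetical third-party security token service (STS)."""

    def issue_token(self, assertions):
        # Authenticate the requestor based on the assertions in the claim,
        # then mint an opaque security token that the external system trusts.
        if "user" not in assertions:
            raise ValueError("claim must assert a user identity")
        payload = "|".join(f"{k}={v}" for k, v in sorted(assertions.items()))
        return hashlib.sha256(payload.encode()).hexdigest()


class ExternalSystem:
    """Hypothetical external system that trusts the STS, not the requestor."""

    def __init__(self, sts, acl):
        self.sts = sts
        self.acl = acl  # asserted user identity -> data that user may read

    def read_data(self, assertions, token):
        # Verify the token really came from the trusted STS, then check
        # what data the requestor has been authorized to access.
        if token != self.sts.issue_token(assertions):
            raise PermissionError("token was not issued by the trusted STS")
        return self.acl.get(assertions["user"], [])


sts = SecurityTokenProvider()
system = ExternalSystem(sts, {"adele@contoso.com": ["orders", "invoices"]})

# Business Connectivity Services acts as the requestor: it sends the claim
# to the STS, receives a security token, and presents it to the system.
claim = {"user": "adele@contoso.com", "group": "sales"}
token = sts.issue_token(claim)
print(system.read_data(claim, token))
```

Note that the external system never sees a password from the requestor; it only verifies that the token came from the token provider it trusts, which is the essential point of the claims-based model.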
## How will the data be consumed? <a name="section4"> </a>

As part of your requirement gathering, you need to find out from your business stakeholders what they need the solution to do and how they need users to interact with it. They might need the users to interact with the data in SharePoint Server, via external lists, external web parts, and Office apps. Or, they might need the solution to surface data through Office apps and SharePoint applications. For more info about apps for Office and SharePoint, see [(OLD) Overview of apps for SharePoint 2016](./plan-for-apps-for-sharepoint.md). Or, the solution might require some other combination of browser, client, and application access to the external data.
-How users access the data affects how you will scope the external content type that Business Connectivity Services uses to access the external data. If your Business Connectivity Services solution requires an apps for Office and SharePoint application, then the external content type must be scoped to that application. If your Business Connectivity Services solution will not use apps for Office and SharePoint to access external data, then the external content type must be scoped to the Business Data Connectivity service application.
+How users access the data affects how you'll scope the external content type that Business Connectivity Services uses to access the external data. If your Business Connectivity Services solution requires an apps for Office and SharePoint application, then the external content type must be scoped to that application. If your Business Connectivity Services solution won't use apps for Office and SharePoint to access external data, then the external content type must be scoped to the Business Data Connectivity service application.
**Business Connectivity Services-scoped external content types** are stored centrally in the BDC Metadata Store and a farm administrator manages security on them. You can share these external content types with multiple Business Connectivity Services web applications.
## How will you assign permissions to the solution? <a name="section5"> </a>
-In every Business Connectivity Services solution, you must plan who will have which permissions on which objects. This is how you both restrict and grant access to the solution to the appropriate users in the appropriate way. You will have to work with the external system administrator and the SharePoint Server farm administrators, site collection administrators, and site administrators to configure permissions. At the most fundamental level however, here is what you must consider during your planning.
+In every Business Connectivity Services solution, you must plan who will have which permissions on which objects. This is how you both restrict and grant access to the solution to the appropriate users in the appropriate way. You'll have to work with the external system administrator and the SharePoint Server farm administrators, site collection administrators, and site administrators to configure permissions. At the most fundamental level, however, here's what you must consider during your planning.
There are three fundamental roles that are involved with every Business Connectivity Services solution:
- **User roles** People in these roles consume and manipulate the external data in the Business Connectivity Services solution. There can be multiple user roles in your solution, each with different levels of permissions. For example, in a support-ticketing system scenario that uses Business Connectivity Services to integrate external information into the solution, the Tier I Help Desk technicians might be granted only the ability to read and start workflows on a ticket, while Tier II and Tier III technicians have the ability to update tickets.
-There are also four main aspects to every Business Connectivity Services solution for which you will manage permissions:
+There are also four main aspects to every Business Connectivity Services solution for which you'll manage permissions:
-- **External system** Every external system will have a method for performing authentication and authorization. (For more information, see [How is the data secured?](plan-a-business-connectivity-services-solution.md#section3) earlier in this article.) You need to work with the external system administrator to identify how to grant access to the solution users according to the principle of least privileges. In general, you will map a group of users from the Business Connectivity Services side to a single account on the external system side and use the single external system account to restrict access. Another way is to do a 1:1 mapping between individual accounts on each system. In either case, unless the external system can directly accept the credentials with which the user authenticates to SharePoint Server, you will need to use the [Secure Store Service](/previous-versions/office/sharepoint-server-2010/ee806889(v=office.14)). For more in-depth information about the authentication models that Business Connectivity Services supports, see [Business Connectivity Services security overview (SharePoint 2010)](/previous-versions/office/sharepoint-server-2010/ee661743(v=office.14)).
+- **External system** Every external system will have a method for performing authentication and authorization. (For more information, see [How is the data secured?](plan-a-business-connectivity-services-solution.md#section3) earlier in this article.) You need to work with the external system administrator to identify how to grant access to the solution users according to the principle of least privilege. In general, you'll map a group of users from the Business Connectivity Services side to a single account on the external system side and use the single external system account to restrict access. Another way is to do a 1:1 mapping between individual accounts on each system. In either case, unless the external system can directly accept the credentials with which the user authenticates to SharePoint Server, you'll need to use the [Secure Store Service](/previous-versions/office/sharepoint-server-2010/ee806889(v=office.14)). For more in-depth information about the authentication models that Business Connectivity Services supports, see [Business Connectivity Services security overview (SharePoint 2010)](/previous-versions/office/sharepoint-server-2010/ee661743(v=office.14)).
- **Business Connectivity Services central infrastructure** In Central Administration, you manage the assignment of permissions to the BDC Metadata Store. In the BDC Metadata Store, you manage BDC models, external systems, and external content types. You must assign execute permissions on an external content type to all users who will be using the Business Connectivity Services solution. The following tables provide a detailed mapping of abilities, permissions, and objects.
|Execute a method instance|Execute|The method instance|
|Set permissions on a method instance|SetPermissions|The method instance|
-- **The development environment** When you are developing a Business Connectivity Services solution, including the external content type, and any apps for SharePoint and connection settings objects, it is a best practice to use a development environment that is separate from your production environment. In the development environment, you can grant higher levels of permissions to the developers than you would usually do in your production environment.
+- **The development environment** When you're developing a Business Connectivity Services solution, including the external content type, and any apps for SharePoint and connection settings objects, it's a best practice to use a development environment that is separate from your production environment. In the development environment, you can grant higher levels of permissions to the developers than you would usually do in your production environment.
-- **The user environment** All external data will be accessed through external lists, external data columns, Business Data Web Parts, apps for SharePoint, or Office. For apps for SharePoint, you can choose to let the app for Office and SharePoint enforce permissions. In this case, if the users can access the app for Office and SharePoint, then they can access all the external data that is surfaced in the app for Office and SharePoint. You will have to work with site and site collection administrators to plan and implement permissions to the external data in your solution.
+- **The user environment** All external data will be accessed through external lists, external data columns, Business Data Web Parts, apps for SharePoint, or Office. For apps for SharePoint, you can choose to let the app for Office and SharePoint enforce permissions. In this case, if the users can access the app for Office and SharePoint, then they can access all the external data that is surfaced in the app for Office and SharePoint. You'll have to work with site and site collection administrators to plan and implement permissions to the external data in your solution.
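The group-to-account mapping described earlier (mapping a group of Business Connectivity Services users to a single external system account) can be sketched as follows. The class and account names are illustrative assumptions only; this is not the actual Secure Store Service API, just the shape of the mapping it maintains.

```python
class SecureStoreSketch:
    """Toy model of mapping SharePoint users to external-system credentials."""

    def __init__(self):
        self.group_members = {}  # group name -> set of SharePoint users
        self.group_creds = {}    # group name -> one shared external account

    def map_group(self, group, members, external_account):
        self.group_members[group] = set(members)
        self.group_creds[group] = external_account

    def credentials_for(self, user):
        # Many users resolve to one shared account; the external system
        # then restricts access based on that single account's permissions.
        for group, members in self.group_members.items():
            if user in members:
                return self.group_creds[group]
        raise LookupError(f"no credential mapping for {user}")


store = SecureStoreSketch()
store.map_group("BCS-Readers", ["adele", "ben"], ("svc_bcs_read", "p@ss"))
print(store.credentials_for("adele"))  # both members share the same account
```

A 1:1 mapping is the degenerate case of the same structure: one-member groups, each with its own external account.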
## See also <a name="section5"> </a>
SharePoint Plan For Apps For Sharepoint https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/plan-for-apps-for-sharepoint.md
Configuring apps for SharePoint requires the following:
- If you want to monitor apps, then Search must be configured.
-- You'll need SSL Certificates If you are using SSL to help secure traffic. You must create a wildcard certificate to use for all app URLs.
+- You'll need SSL certificates if you're using SSL to help secure traffic. You must create a wildcard certificate to use for all app URLs.
-- Each app for SharePoint that is installed creates a subweb under the site on which it is installed with its own URL. This means that environments that contain many apps for SharePoint will have many additional subwebs. Be sure to consider this when planning for capacity for your farm.
+- Each app for SharePoint that is installed creates a subweb under the site on which it's installed with its own URL. This means that environments that contain many apps for SharePoint will have many additional subwebs. Be sure to consider this when planning for capacity for your farm.
-Additionally, using apps for SharePoint requires a separate DNS domain configuration (discussed below), as well as the Subscription Settings and App Management service applications. We cover how to configure the separate app domain and the service applications in the [apps for SharePoint configuration article](configure-an-environment-for-apps-for-sharepoint.md).
+Additionally, using apps for SharePoint requires a separate DNS domain configuration (discussed below), and the Subscription Settings and App Management service applications. We cover how to configure the separate app domain and the service applications in the [apps for SharePoint configuration article](configure-an-environment-for-apps-for-sharepoint.md).
## Plan app configuration settings <a name="AppConfig"> </a>
-With apps for SharePoint, apps are deployed to their own web site in a special, isolated domain name, instead of in the same domain name as your farm. Processes run under that domain name and do not affect the SharePoint sites. This difference in domain names provides a layer of isolation for the apps.
+With apps for SharePoint, apps are deployed to their own web site in a special, isolated domain name, instead of in the same domain name as your farm. Processes run under that domain name and don't affect the SharePoint sites. This difference in domain names provides a layer of isolation for the apps.
You must set up a Domain Name System (DNS) domain name to provide a host name for the installed apps. By using a separate domain name, apps for SharePoint are separated from SharePoint sites to prevent unauthorized access to user data and to reduce the possibility of cross-site scripting attacks.
The details of how to configure the app domain are covered in [Configure an envi
### How SharePoint uses the domain
-Each app for SharePoint has a unique URL, which is made up of the app domain plus a prefix and an Apphash. The format is as follows: prefix-Apphash.domain.com. The Apphash is an arbitrarily-assigned unique identifier for each app for SharePoint. These URLs are generated automatically depending on the settings that you specify. You do not have to create or manage these URLs separately; instead you configure a wildcard entry in DNS to provide the URLs for all apps.
+Each app for SharePoint has a unique URL, which is made up of the app domain plus a prefix and an Apphash. The format is as follows: prefix-Apphash.domain.com. The Apphash is an arbitrarily assigned unique identifier for each app for SharePoint. These URLs are generated automatically depending on the settings that you specify. You don't have to create or manage these URLs separately; instead you configure a wildcard entry in DNS to provide the URLs for all apps.
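The prefix-Apphash.domain.com scheme above can be sketched as a small URL builder. The domain name and the hex-style Apphash here are assumptions for illustration; SharePoint assigns the real Apphash automatically.

```python
import secrets


def app_url(prefix, app_domain, apphash=None):
    """Build an app URL of the form prefix-Apphash.domain.com.

    The Apphash is an arbitrarily assigned unique identifier; this sketch
    fakes one with a random hex string when none is supplied.
    """
    apphash = apphash or secrets.token_hex(6)
    return f"https://{prefix}-{apphash}.{app_domain}"


# A single wildcard DNS entry (*.contoso-apps.com) resolves every URL this
# produces, which is why no per-app DNS record is needed.
print(app_url("apps", "contoso-apps.com", apphash="a1b2c3d4e5f6"))
# https://apps-a1b2c3d4e5f6.contoso-apps.com
```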
When you install an app to a site, a subweb of that site is created to host the app content. The subweb for the app is hierarchically below the site collection, but has an isolated unique host header instead of being under the site's URL. The following diagram shows the relationship between the site's URL and the app's URL:
In this diagram, the Main SharePoint Site is the site on which the user installe
### Determine the domain name to use
-When you choose the domain name and prefixes to use for your environment, consider the following:
+When you choose the domain name and prefix to use for your environment, consider the following:
- **Use a unique domain name, not a subdomain**
- For security reasons, we highly recommend that you not use a subdomain of the root domain name that hosts SharePoint Server or other applications. For example, if the SharePoint sites are at Contoso.com, do not use Apps.Contoso.com. Instead use a unique name such as Contoso-Apps.com. This is because other applications that run under that host name might contain sensitive information that is stored in cookies that might not be protected.
+ For security reasons, we highly recommend that you not use a subdomain of the root domain name that hosts SharePoint Server or other applications. For example, if the SharePoint sites are at Contoso.com, don't use Apps.Contoso.com. Instead use a unique name such as Contoso-Apps.com. This is because other applications that run under that host name might contain sensitive information that is stored in cookies that might not be protected.
- **The app domain should be in the Internet or Restricted sites security zone in Internet Explorer**
- For security reasons, we recommend that you configure the app domain to be in either the Internet or Restricted sites security zone in Internet Explorer options, and not in the Intranet zone or Trusted sites zone. Internet Explorer security settings for the Intranet zone or Trusted sites zone do not provide a sufficient level of isolation of apps from user data in SharePoint sites.
+ For security reasons, we recommend that you configure the app domain to be in either the Internet or Restricted sites security zone in Internet Explorer options, and not in the Intranet zone or Trusted sites zone. Internet Explorer security settings for the Intranet zone or Trusted sites zone don't provide a sufficient level of isolation of apps from user data in SharePoint sites.
- **For multi-tenancy environments, use unique prefixes for each tenant's apps**
- **Keep prefixes short and simple**
- Prefixes must be less than 48 characters and cannot contain special characters or dashes.
+ Prefixes must be fewer than 48 characters and cannot contain special characters or dashes.
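The prefix rules above (fewer than 48 characters, no special characters or dashes) can be checked with a small validator. The alphanumeric-only regex is this sketch's interpretation of "no special characters", not an official definition:

```python
import re


def is_valid_prefix(prefix):
    # Fewer than 48 characters, and letters/digits only
    # (no dashes, no special characters).
    return len(prefix) < 48 and bool(re.fullmatch(r"[A-Za-z0-9]+", prefix))


print(is_valid_prefix("contosoapps"))   # True
print(is_valid_prefix("contoso-apps"))  # False: dash not allowed
```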
## Recommended logical architecture <a name="AppConfig"> </a>
-As a best practice, we recommend that you use a single web application that uses host-named site collections (host headers) instead of multiple web applications that use path-named site collections in your environment. When you use multiple web applications and path-named site collections you might have to complete additional configuration steps to guarantee that requests for apps for SharePoint are routed to the correct web application.
+As a best practice, we recommend that you use a single web application that uses host-named site collections (host headers) instead of multiple web applications that use path-named site collections in your environment. When you use multiple web applications and path-named site collections, you might have to complete additional configuration steps to guarantee that requests for apps for SharePoint are routed to the correct web application.
## Plan App Catalog <a name="AppGallery"> </a>
See [Monitor apps for SharePoint for SharePoint Server](monitor-apps-for-sharepo
## Plan for app licenses <a name="AppLicenses"> </a>
-SharePoint Server does not enforce app licenses. Developers who build apps must add code that retrieves license information and then addresses users. SharePoint Server provides the storage and together with SharePoint Store web services the app license renewal. SharePoint Store handles payments for the licenses, issues the correct licenses, and provides the process to verify license integrity. Note that licensing only works for apps that are distributed through the SharePoint Store. Apps that you purchase from another source and apps that you develop internally must implement their own licensing mechanisms. SharePoint Server supports the following app licenses formats:
+SharePoint Server doesn't enforce app licenses. Developers who build apps must add code that retrieves license information and then addresses users. SharePoint Server provides the license storage and, together with SharePoint Store web services, the app license renewal. SharePoint Store handles payments for the licenses, issues the correct licenses, and provides the process to verify license integrity. Note that licensing only works for apps that are distributed through the SharePoint Store. Apps that you purchase from another source and apps that you develop internally must implement their own licensing mechanisms. SharePoint Server supports the following app license formats:
|**License Type**|**Duration**|**User Limit**|
|:--|:--|:--|
SharePoint Plan For Internet Intranet And Extranet Publishing Sites https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/plan-for-internet-intranet-and-extranet-publishing-sites.md
This article builds on the information in [Overview of publishing to Internet, i
## Determine the SharePoint architecture
-If you are planning to build a publishing site based on SharePoint Server, you have probably already made some decisions about what kind of sites are needed, and how you want them distributed. For example, if you are building an Internet business site, you probably already know how many public-facing sites you require for each region or brand. If you are building an intranet site for your company, you have probably already determined how many internal sites you require for each division or group. It is not necessary to have a detailed plan of the physical architecture at this stage in the project. However, when you plan to set up a publishing site, it's important to consider if you have to plan for sites to be inside or outside a firewall, and if some sites will allow anonymous users. For more information about how to plan for sites, see [Plan sites and site collections in SharePoint Server](../sites/plan-sites-and-site-collections.md).
+If you're planning to build a publishing site based on SharePoint Server, you have probably already made some decisions about what kind of sites are needed, and how you want them distributed. For example, if you're building an Internet business site, you probably already know how many public-facing sites you require for each region or brand. If you're building an intranet site for your company, you have probably already determined how many internal sites you require for each division or group. It isn't necessary to have a detailed plan of the physical architecture at this stage in the project. However, when you plan to set up a publishing site, it's important to consider if you have to plan for sites to be inside or outside a firewall, and if some sites will allow anonymous users. For more information about how to plan for sites, see [Plan sites and site collections in SharePoint Server](../sites/plan-sites-and-site-collections.md).
-## Determine what type of content you will have on each site
+## Determine what type of content you'll have on each site
-What type of content do you plan to publish on the site? Is the site an intranet knowledge base that will have only Pages library content? Or is the site an Internet business site that will display lists of products and descriptions, together with Pages library content? If you plan to use cross-site publishing, how many catalogs do you have? Are any catalogs in external data sources? Will you import catalogs into the site, or will you create content directly in SharePoint Server? Knowing what type of content you have will help you determine what publishing method you will use later in this article.
+What type of content do you plan to publish on the site? Is the site an intranet knowledge base that will have only Pages library content? Or is the site an Internet business site that will display lists of products and descriptions, together with Pages library content? If you plan to use cross-site publishing, how many catalogs do you have? Are any catalogs in external data sources? Will you import catalogs into the site, or will you create content directly in SharePoint Server? Knowing what type of content you have will help you determine what publishing method you'll use later in this article.
## Decide on multilingual support
-Does your site require that content be provided in more than one language, or to more than one locale? Even if you currently plan to only create and publish content in a single language, you should carefully consider whether that business requirement might change in the future. If there is a chance you might eventually want to use variations on the site, you should plan for using variations now. By setting up your site structure now in preparation for using variations, you will save yourself and your organization time and resources in the future. If you have to change the site structure when you switch to using variations later, it is often more difficult and can affect the URLs that you planned for your sites. For more information about variations, see [Variations overview in SharePoint Server](variations-overview.md). For information about how to plan variations, see [Plan for variations in SharePoint Server](plan-for-variations.md).
+Does your site require that content be provided in more than one language, or to more than one locale? Even if you currently plan to only create and publish content in a single language, you should carefully consider whether that business requirement might change in the future. If there's a chance you might eventually want to use variations on the site, you should plan for using variations now. By setting up your site structure now in preparation for using variations, you'll save yourself and your organization time and resources in the future. If you have to change the site structure when you switch to using variations later, it's often more difficult and can affect the URLs that you planned for your sites. For more information about variations, see [Variations overview in SharePoint Server](variations-overview.md). For information about how to plan variations, see [Plan for variations in SharePoint Server](plan-for-variations.md).
## Decide which publishing method to use

SharePoint Server has two ways that you can make published content available to users: author-in-place and cross-site publishing. Deciding which publishing method to use is an important step in planning publishing sites. The publishing method that you select will lead to additional planning steps, and some steps are unique to each method.
-- **Author-in-place** Uses a single site collection to author content and make it available to readers of your site. If you plan to publish only Pages library content, and you do not have to author on more than one site, or publish to more than one site, and you do not have a business need to author separately from your production environment, you should use author-in-place. If you must publish multilingual content, you can still use variations to make content available to sites in multiple languages or regions. Author-in-place is available in both SharePoint Server and SharePoint in Microsoft 365.
+- **Author-in-place** Uses a single site collection to author content and make it available to readers of your site. If you plan to publish only Pages library content, and you don't have to author on more than one site, or publish to more than one site, and you don't have a business need to author separately from your production environment, you should use author-in-place. If you must publish multilingual content, you can still use variations to make content available to sites in multiple languages or regions. Author-in-place is available in both SharePoint Server and SharePoint in Microsoft 365.
- **Cross-site publishing** Uses one or more site collections to author content, and one or more site collections to control the design of the site and the display of the content. If you want to separate your authoring and publishing environments, you should use cross-site publishing. If you plan to publish only Pages library content, but you want to author in more than one site, or publish to more than one site, you should also use cross-site publishing. Cross-site publishing is available only in SharePoint Server. For more information about cross-site publishing, see [Overview of cross-site publishing in SharePoint Server](overview-of-cross-site-publishing.md). For information about how to plan for cross-site publishing, see [Plan for cross-site publishing in SharePoint Server](plan-for-cross-site-publishing.md).
Consider cross-site publishing if you answer yes to any of the following questio
- Do you want to keep the authoring environment separate from the publishing environment?
-- Do you have multiple site collections represented, on either the authoring or publishing side — for example, n:1 or 1:n?
+- Do you have multiple site collections represented, on either the authoring or publishing side—for example, n:1 or 1:n?
- Do you want to have multiple sites for different brands?
- If you plan to use variations or if you plan to translate content for multiple sites, do you want to have different URLs for different locale-specific variants of your site?
-Although cross-site publishing is the recommended method to use for making content available to multiple sites, it might not be the right method for your publishing solution. You should not use cross-site publishing if you do not plan to use variations with unique URLs, or publish to multiple sites, and you want to author content on the same site collection in which it is published. Use author-in-place instead.
+Although cross-site publishing is the recommended method to use for making content available to multiple sites, it might not be the right method for your publishing solution. You shouldn't use cross-site publishing if you don't plan to use variations with unique URLs, or publish to multiple sites, and you want to author content on the same site collection in which it's published. Use author-in-place instead.
Use the following flowchart to help you determine which publishing method to use:
After you have determined the publishing method to use, there are additional pla
- [Plan for large Pages libraries (SharePoint Server 2010)](/previous-versions/office/sharepoint-server-2010/ee721053(v=office.14)) describes the use of large Pages libraries in SharePoint Server publishing sites. Also, this article contains information to help you determine whether to use large Pages libraries with your publishing solution and information about how to plan for them.
-- [Plan for variations in SharePoint Server](plan-for-variations.md) provides information about important items that you should consider when you are using variations in publishing sites, and it describes the tasks that are involved in planning a solution that uses variations in SharePoint Server.
+- [Plan for variations in SharePoint Server](plan-for-variations.md) provides information about important items that you should consider when you're using variations in publishing sites, and it describes the tasks that are involved in planning a solution that uses variations in SharePoint Server.
- [Plan navigation term sets in SharePoint Server](plan-navigation-term-sets.md) provides information about how to create the navigation term set to provide site navigation for SharePoint Server publishing sites.
SharePoint Plan Profile Synchronization For Sharepoint Server 2013 https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/plan-profile-synchronization-for-sharepoint-server-2013.md
description: "Learn how to implement profile synchronization in SharePoint Serve
[!INCLUDE[appliesto-2013-xxx-xxx-xxx-xxx-md](../includes/appliesto-2013-xxx-xxx-xxx-xxx-md.md)]
-Profile synchronization (also known as "profile sync") allows you to create user profiles by importing information from other systems that are used in your organization. Before you read this article you should understand the concepts introduced in the article [Overview of profile synchronization in SharePoint Server 2013](profile-synchronization-in-sharepoint-server-2013.md). Profile synchronization is also used in server-to-server authentication which enables servers to access and request resources from one another server on behalf of users. For more information, see [Server-to-server authentication and user profiles in SharePoint Server](../security-for-sharepoint-server/server-to-server-authentication-and-user-profiles.md).
+Profile synchronization (also known as "profile sync") allows you to create user profiles by importing information from other systems that are used in your organization. Before you read this article, you should understand the concepts introduced in the article [Overview of profile synchronization in SharePoint Server 2013](profile-synchronization-in-sharepoint-server-2013.md). Profile synchronization is also used in server-to-server authentication, which enables servers to access and request resources from one another on behalf of users. For more information, see [Server-to-server authentication and user profiles in SharePoint Server](../security-for-sharepoint-server/server-to-server-authentication-and-user-profiles.md).
This article describes:
- The external content types that will have to be created, if any.
-This article does not describe how to implement your plan. That information is covered in the article [Synchronize user and group profiles in SharePoint Server 2013](configure-profile-synchronization.md).
+This article doesn't describe how to implement your plan. That information is covered in the article [Synchronize user and group profiles in SharePoint Server 2013](configure-profile-synchronization.md).
## Before you begin <a name="beforeyoubegin"> </a>
Before you work through the planning tasks in this article, you should already:
## About planning for profile synchronization <a name="about"> </a>
-As the first step towards planning for profile synchronization, you'll identify synchronization connections, and collect information that you will need when you create the connection. If you will need any external content types, you'll document the requirements for those external content types, provide the requirements to a developer, and receive the details that you'll use to specify a synchronization connection to the business system.
+As the first step towards planning for profile synchronization, you'll identify synchronization connections, and collect information that you'll need when you create the connection. If you'll need any external content types, you'll document the requirements for those external content types, provide the requirements to a developer, and receive the details that you'll use to specify a synchronization connection to the business system.
Next, you'll determine how to map user profile properties to information in the external systems so that they can be synchronized.
Finally, you'll answer more straightforward questions such as whether you'll synchronize groups in addition to users.
## Plan synchronization connections <a name="connections"> </a>
-Each property in a user's profile can come from an external system. There are two types of external systems: directory services and business systems. Throughout this article, the phrase business system is used to mean an external system that is not a directory service. SAP, Siebel, SQL Server, and custom applications are all examples of business systems.
+Each property in a user's profile can come from an external system. There are two types of external systems: directory services and business systems. Throughout this article, the phrase business system is used to mean an external system that isn't a directory service. SAP, Siebel, SQL Server, and custom applications are all examples of business systems.
> [!NOTE]
> For a list of supported directory services, see [Profile synchronization overview](profile-synchronization-in-sharepoint-server-2013.md).
-In SharePoint Server 2013, a synchronization connection is a way to obtain user profile information from an external system. To import profiles from one of the supported directory services, you create a synchronization connection to the directory service. To import additional profile properties from a business system, you create an external content type to bring the data from the business system into SharePoint Server 2013, and then create a synchronization connection to the external content type. The following sections explain how to collect the information that you will need about each synchronization connection.
+In SharePoint Server 2013, a synchronization connection is a way to obtain user profile information from an external system. To import profiles from one of the supported directory services, you create a synchronization connection to the directory service. To import additional profile properties from a business system, you create an external content type to bring the data from the business system into SharePoint Server 2013, and then create a synchronization connection to the external content type. The following sections explain how to collect the information that you'll need about each synchronization connection.
### Connections to directory services
-Each user who you want to have a profile in SharePoint Server 2013 must have an identity in a directory service. (If users are not represented in a directory service, you can't synchronize user profiles.) Identify which directory services contain information about these users. Unless you can access the directory service yourself, you should also identify an administrator of the directory service. You will need this person's help to collect some information that will be needed to create synchronization connections.
+Each user who you want to have a profile in SharePoint Server 2013 must have an identity in a directory service. (If users aren't represented in a directory service, you can't synchronize user profiles.) Identify which directory services contain information about these users. Unless you can access the directory service yourself, you should also identify an administrator of the directory service. You'll need this person's help to collect some information that will be needed to create synchronization connections.
The [Connection planning worksheet](https://go.microsoft.com/fwlink/p/?LinkID=268733) contains templates for the information that you need to collect for each type of connection. Each template is in a separate tab that is labeled with the name of the directory service provider to which it applies. Create a tab for each directory service that you identified. Copy the template for the type of directory service into the new tab. Then complete the information on each new tab according to the following table.
The [Connection planning worksheet](https://go.microsoft.com/fwlink/p/?LinkID=26
|Forest <br/> |AD DS <br/> |The name of the directory service forest. <br/> |
|Domain controller <br/> |AD DS <br/> |The name of the preferred domain controller. You only have to identify the domain controller if there are multiple domain controllers in the forest and you want to synchronize with a specific domain controller. <br/> |
|Authentication provider type <br/> |All <br/> | The type of authentication SharePoint Server 2013 should use to connect to the directory service. This is one of the following: <br/> Windows authentication <br/> Forms-based authentication <br/> Claims-based authentication <br/> The systems architect should be able to provide this information. <br/> |
-|Authentication provider <br/> |All <br/> |If forms-based authentication or claims-based authentication will be used, fill in the name of the trusted provider. The systems architect should be able to provide this information. An authentication provider is not needed for Windows authentication. <br/> |
+|Authentication provider <br/> |All <br/> |If forms-based authentication or claims-based authentication will be used, fill in the name of the trusted provider. The systems architect should be able to provide this information. An authentication provider isn't needed for Windows authentication. <br/> |
|Synchronization account <br/> |All <br/> |The account, including the domain, that will be used to connect to the directory service. It is likely that the directory service administrator will create an account to be used for synchronization. <br/> **Note**: The permissions that the synchronization account must have are described in the [Plan account permissions](plan-profile-synchronization-for-sharepoint-server-2013.md#permission) section of this topic. <br/> |
|Synchronization account password <br/> |All <br/> |The password for the synchronization account. <br/> **Note**: You must know the password for the synchronization account. We recommend that you do not record the password in the worksheet. <br/> |
|Connection port <br/> |All <br/> |The port that will be used to connect to the directory service. <br/> |
There are two ways to join the clauses of an exclusion filter:
You can't mix ANDs and ORs in a filter.
-For example, assume that temporary employees in your organization are given Active Directory accounts that begin with "T-". You want to synchronize profiles for all permanent (non-temporary) users whose accounts are not disabled. You could create a filter that uses the clauses in the following table.
+For example, assume that temporary employees in your organization are given Active Directory accounts that begin with "T-". You want to synchronize profiles for all permanent (non-temporary) users whose accounts aren't disabled. You could create a filter that uses the clauses in the following table.
> [!NOTE]
> After any changes are made to a filter, a full synchronization is required.
You can't create a filter that is based on membership in a directory service group.
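The temporary-employee example above can be sketched in a few lines of code. This is an illustrative model only (the function and clause names are hypothetical; SharePoint evaluates filters inside the synchronization engine, not through any public API like this), but it shows the key rule: every clause in a filter is joined with the same operator.

```python
# Hypothetical model of an exclusion filter. All clauses are joined with the
# SAME operator (OR here); SharePoint does not allow mixing ANDs and ORs.

def is_excluded(account_name, account_disabled):
    """Return True when the profile matches the exclusion filter."""
    clauses = [
        account_name.startswith("T-"),  # temporary employees
        account_disabled,               # disabled accounts
    ]
    return any(clauses)  # OR across clauses; all(clauses) would model AND

# Only profiles that are NOT excluded are synchronized.
profiles = [("T-1042", False), ("jdoe", False), ("asmith", True)]
synchronized = [name for name, disabled in profiles if not is_excluded(name, disabled)]
```

With these sample accounts, only `jdoe` is synchronized: `T-1042` matches the temporary-employee clause and `asmith` matches the disabled-account clause.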
### Connections to business systems
-To import properties from a business system, you will need an external content type that brings the property value from the external system into SharePoint Server 2013. This article does not cover how to create an external content type. That task is usually done by a developer. This article describes the data that you must collect and give to the developer, and tells you what to do with the information that you receive. For developer information, see [External content types in SharePoint 2013](/sharepoint/dev/general-development/external-content-types-in-sharepoint).
+To import properties from a business system, you'll need an external content type that brings the property value from the external system into SharePoint Server 2013. This article doesn't cover how to create an external content type. That task is usually done by a developer. This article describes the data that you must collect and give to the developer, and tells you what to do with the information that you receive. For developer information, see [External content types in SharePoint 2013](/sharepoint/dev/general-development/external-content-types-in-sharepoint).
You can use the [External content type planning worksheet](https://go.microsoft.com/fwlink/p/?LinkId=268734) to specify the external content types to be created. Go through the User Profile Properties Planning worksheet that you completed when you read the article [Plan user profiles in SharePoint Server](plan-user-profiles.md). In the External Content Type Planning worksheet, create one row for each user profile property that comes from a business system. Fill in the first three columns of each row according to the instructions in the following table.
The Connection Planning worksheet ([User profile properties and profile synchron
To indicate that a user profile property comes from an external system, you map the property to a specific attribute of the external system. By default, certain user profile properties are mapped. You can only map a profile property to an attribute whose data type is compatible with the data type of the property. For example, you can't map the **SPS-HireDate** user profile property to the **homePhone** Active Directory attribute because **SPS-HireDate** is a date and **homePhone** is a Unicode string. For a list of which user profile property data types are compatible with which AD DS data types, see [User profile property data types in SharePoint Server 2013](/previous-versions/office/sharepoint-server-2010/hh227257(v=office.14)).
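The compatibility rule can be thought of as a lookup table. The sketch below is illustrative only; the real compatibility matrix is in the linked article, and the two type names here are examples rather than a complete list.

```python
# Illustrative compatibility check, in the spirit of the rule described above.
# The real matrix of compatible types is in the linked SharePoint article.
COMPATIBLE = {
    "date": {"date"},
    "string": {"unicode-string"},
}

def can_map(profile_property_type, directory_attribute_type):
    """A profile property may only map to an attribute of a compatible type."""
    return directory_attribute_type in COMPATIBLE.get(profile_property_type, set())

# SPS-HireDate is a date; homePhone is a Unicode string, so this mapping is invalid.
hire_date_to_home_phone = can_map("date", "unicode-string")  # False
```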
-When you synchronize profile information, in addition to importing profile properties from external systems, you can also write data back to a directory service. You can't write data back to a business system. To indicate that SharePoint Server 2013 should export a user profile property, you map the property, and set the direction of the mapping to **Export**. Each property can only be mapped in one direction. You can't both import and export the same user profile property. The data that is exported overwrites any values that might already be present in the directory service. This is true for multivalued properties alsoΓÇöthe exported value is not appended to the existing values, it overwrites them.
+When you synchronize profile information, in addition to importing profile properties from external systems, you can also write data back to a directory service. You can't write data back to a business system. To indicate that SharePoint Server 2013 should export a user profile property, you map the property, and set the direction of the mapping to **Export**. Each property can only be mapped in one direction. You can't both import and export the same user profile property. The data that is exported overwrites any values that might already be present in the directory service. This is true for multivalued properties as well; the exported value isn't appended to the existing values, it overwrites them.
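The overwrite semantics for multivalued properties can be pictured as a plain replacement, never an append. The sketch below uses hypothetical names (this is not a SharePoint API) to show the behavior described above.

```python
# Sketch of the export semantics described above: exported values REPLACE
# whatever is already stored in the directory, even for multivalued
# attributes. Names are illustrative; this is not a SharePoint API.

def export_property(directory_entry, attribute, exported_values):
    """Overwrite the attribute; exported values are not appended."""
    directory_entry[attribute] = list(exported_values)
    return directory_entry

entry = {"proxyAddresses": ["smtp:old@contoso.com", "smtp:older@contoso.com"]}
export_property(entry, "proxyAddresses", ["smtp:new@contoso.com"])
# entry["proxyAddresses"] now holds only the exported value.
```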
Examine the User Profile Properties Planning worksheet that you completed as you read the [Plan user profiles in SharePoint Server](plan-user-profiles.md) topic. For each row (property) whose value will be imported from an external system, fill in the final three columns according to the instructions in the following table.
For each row (property) whose value will be exported to a directory service, fil
By default, SharePoint Server 2013 synchronizes groups, such as distribution lists, when it synchronizes user profiles. You can turn off this functionality from the Configure Synchronization Settings page of Central Administration. Synchronizing groups is only supported for AD DS.
-If you synchronize groups in addition to users, SharePoint Server 2013 imports information about the groups and about which users are members of the groups. Synchronizing a group does not create a profile for the group, and causes no additional user profiles to be created. In SharePoint Server 2013, groups are only used to create audiences and to display which memberships a visitor has in common with the person whose My Site the person is visiting.
+If you synchronize groups in addition to users, SharePoint Server 2013 imports information about the groups and about which users are members of the groups. Synchronizing a group doesn't create a profile for the group, and causes no additional user profiles to be created. In SharePoint Server 2013, groups are only used to create audiences and to display which memberships a visitor has in common with the person whose My Site the person is visiting.
If you decide to synchronize groups, SharePoint Server 2013 will import information about all of the groups that exist in the directory service containers that you are synchronizing unless you choose to exclude groups by using a filter. The filter for excluding groups differs from the filter for excluding users, although both follow the same format.
The synchronization account for a connection to Active Directory Domain Services must meet the following requirements:
- It must have Replicate Directory Changes permission on the domain with which you'll synchronize.
- The Replicate Directory Changes permission allows an account to query for the changes in the directory. This permission does not allow an account to make any changes in the directory.
+ The Replicate Directory Changes permission allows an account to query for the changes in the directory. This permission doesn't allow an account to make any changes in the directory.
- If the domain controller is running Windows Server 2003, the synchronization account must be a member of the Pre-Windows 2000 Compatible Access built-in group.
The synchronization account for a connection to a Sun Java System Directory Server must have the following permissions:
- Read, Write, Compare, and Search permissions to the RootDSE.
-- To perform incremental synchronization, the synchronization account must also have Read, Compare, and Search permissions to the change log (cn=changelog). If the change log does not exist, you must create it before synchronizing.
+- To perform incremental synchronization, the synchronization account must also have Read, Compare, and Search permissions to the change log (cn=changelog). If the change log doesn't exist, you must create it before synchronizing.
### IBM Tivoli version 5.2
SharePoint Plan Search For Sharepoint Cross Site Publishing Sites https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/plan-search-for-sharepoint-cross-site-publishing-sites.md
description: "Learn how to plan for search-driven pages for SharePoint cross-site publishing sites."
[!INCLUDE[appliesto-xxx-2016-xxx-xxx-xxx-md](../includes/appliesto-xxx-2016-xxx-xxx-xxx-md.md)]
-Search-driven pages are pages that use search technology to dynamically show content. This article describes features that you will use when you set up search-driven pages, such as managed properties, refiners, result sources, and recommendations, and what you must consider when you set up and use these features.
+Search-driven pages are pages that use search technology to dynamically show content. This article describes features that you'll use when you set up search-driven pages, such as managed properties, refiners, result sources, and recommendations, and what you must consider when you set up and use these features.
## Plan content sources and crawling <a name="BKMK_PlanContentSourcesAndCrawling"> </a>
-The default content source is Local SharePoint sites. You can use this content source to crawl all content within the web application. However, we recommend that you create separate content sources for libraries or lists that you share as catalogs. When you set up a content source for libraries or lists that are shared as catalogs, we recommend that you select **Enable Continuous Crawl** when specifying the crawl schedules. A continuous crawl starts at set intervals, which enables the search system to crawl the content and quickly add any changed content to the index. The default crawl interval for continuous crawl is 15 minutes, but you can set shorter or longer intervals. Enabling continuous crawl also means that a site administrator will not have to wait for the search service application administrator to manually start a crawl in order to update the search index with the latest changes from a catalog.
+The default content source is Local SharePoint sites. You can use this content source to crawl all content within the web application. However, we recommend that you create separate content sources for libraries or lists that you share as catalogs. When you set up a content source for libraries or lists that are shared as catalogs, we recommend that you select **Enable Continuous Crawl** when specifying the crawl schedules. A continuous crawl starts at set intervals, which enables the search system to crawl the content and quickly add any changed content to the index. The default crawl interval for continuous crawl is 15 minutes, but you can set shorter or longer intervals. Enabling continuous crawl also means that a site administrator won't have to wait for the search service application administrator to manually start a crawl in order to update the search index with the latest changes from a catalog.
For information about how to set up a content source and manage continuous crawling, see "Configure search for cross-site publishing" in [Configure cross-site publishing in SharePoint Server](configure-cross-site-publishing.md).
When you plan to use refiners and faceted navigation, consider the questions in the following sections.
### What type of refiners do you want to use?
-There are two types of refiners: stand-alone and for faceted navigation. You can use just one type on the site, or a combination of both. The structure of the content and what kind of navigation you use on the site will determine the type of refiners that you should choose.
+There are two types of refiners: stand-alone and for faceted navigation. You can use just one type on the site, or a combination of both. The structure of the content and what kind of navigation you use on the site will determine the type of refiners that you should choose.
-- Stand-alone refiners are usually used in scenarios where you have unstructured content. Within this content, you can identify several managed properties that can be used as refiners across all content. However, you do not want the refiners to change depending on a term in a term set. For example, in an intranet scenario, you can add stand-alone refiners to a Search Center page. These stand-alone refiners are typically managed properties that apply to most items in the intranet, such as Author and Date.
+- Stand-alone refiners are used in scenarios where you have unstructured content. Within this content, you can identify several managed properties that can be used as refiners across all content. However, you don't want the refiners to change depending on a term in a term set. For example, in an intranet scenario, you can add stand-alone refiners to a Search Center page. These stand-alone refiners are typically managed properties that apply to most items in the intranet, such as Author and Date.
-- Refiners for faceted navigation are used in scenarios where you have structured content, such as catalog content. This content is tied to a term set, and you want to have different refiners for different terms. For example, in an Internet business scenario where a product catalog of electronic products is shown, a term set is used to categorize the different products ΓÇö for example, Computers or Cameras. After you enable the managed properties Screen Size and Megapixels as refiners, you can configure faceted navigation so that Screen Size appears as a refiner for Computers, and Megapixels appears as a refiner for Cameras. This means that you can guide users to content that is relevant for a specific category. This makes it easier and faster to browse through catalog content.
+- Refiners for faceted navigation are used in scenarios where you have structured content, such as catalog content. This content is tied to a term set, and you want to have different refiners for different terms. For example, in an Internet business scenario where a product catalog of electronic products is shown, a term set is used to categorize the different products, such as Computers or Cameras. After you enable the managed properties Screen Size and Megapixels as refiners, you can configure faceted navigation so that Screen Size appears as a refiner for Computers, and Megapixels appears as a refiner for Cameras. This means that you can guide users to content that is relevant for a specific category. This makes it easier and faster to browse through catalog content.
### How do you identify a refiner?
When identifying which managed properties you want to specify as refiners, consider what kind of information users want to differentiate on and browse to.
-When using refiners for faceted navigation, it is especially important that you identify refiners that represent information that users will find useful when they browse through a catalog. You configure refiners in Term Store Management. You can set refiners for a specific term set, you can configure refiners to apply to all terms in a term set, or you can set specific refiners for a specific term. Consider the following:
+When using refiners for faceted navigation, it's especially important that you identify refiners that represent information that users will find useful when they browse through a catalog. You configure refiners in Term Store Management. You can set refiners for a specific term set, you can configure refiners to apply to all terms in a term set, or you can set specific refiners for a specific term. Consider the following:
- Which managed properties represent information that users want to quickly browse to for all my catalog items?
-- Which managed properties represent information that is unique for only a sub-set of my catalog items?
+- Which managed properties represent information that is unique for only a subset of my catalog items?
-Note that adding many refiners to your page may increase the time to process a query. For more information, see [Estimate capacity and performance for Web Content Management (SharePoint Server 2013)](web-content-management-capacity-and-performance.md)
+Adding many refiners to your page may increase the time to process a query. For more information, see [Estimate capacity and performance for Web Content Management (SharePoint Server 2013)](web-content-management-capacity-and-performance.md).
### How should you configure refiners in the Term Store Management Tool?
For information about how to add refiners and configure faceted navigation, see
## Plan result sources and query rules <a name="BKMK_PlanResultSourcesAndQueryRules"> </a>
-Result sources narrow the scope of search results that are retrieved. SharePoint Server 2016 provides many pre-defined result sources. Many of the pre-defined result sources have a corresponding Web Part where the result source is specified as part of the query. For example, the result source Local Video Results is set as part of the query that is used in the Videos Web Part.
+Result sources narrow the scope of search results that are retrieved. SharePoint Server 2016 provides many predefined result sources. Many of the predefined result sources have a corresponding Web Part where the result source is specified as part of the query. For example, the result source Local Video Results is set as part of the query that is used in the Videos Web Part.
-You manage result sources in Central Administration and in site collection administration. If you are familiar with Keyword Query Language (KQL), you can create custom result sources.
+You manage result sources in Central Administration and in site collection administration. If you're familiar with Keyword Query Language (KQL), you can create custom result sources.
All available result sources appear in a list when you build the query in a Content Search Web Part. Users who configure this Web Part can easily narrow the scope of results that can be shown in the Web Part. For example, for an intranet site, a site collection administrator can create a result source named My PowerPoint Presentations, and configure it to narrow the scope of the search results to PowerPoint presentations created by the user who is logged on to the site. Any user can add a Content Search Web Part to their My Site and configure the Web Part by selecting the result source My PowerPoint Presentations. When users browse to their My Sites, the Web Part only shows PowerPoint presentations that they have created themselves. Another example is an Internet site, where a list that is shared as a catalog is used to maintain product data in many languages. A site collection administrator can create a result source named US English products and configure it to narrow the scope of search results to products with the language tag en-us. A user who has the Contribute permission level can then add a Content Search Web Part, and configure it to show only products that have information in US English, by selecting the US English products result source.
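A custom result source narrows results through a KQL query template. As a rough sketch of the two examples above (the managed property names should be verified against your search schema, and the language-tag property depends on how the catalog column is mapped; `{searchTerms}` and `{User.Name}` are query template variables), the templates might look something like:

```
{searchTerms} FileExtension:pptx Author:{User.Name}
```

Here `FileExtension:pptx` restricts results to PowerPoint presentations and `Author:{User.Name}` restricts them to items created by the user who is logged on; a language-restricted template would follow the same pattern with the appropriate language-tag property.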
-In addition to the pre-defined result sources, SharePoint Server 2016 automatically creates a result source when you connect a catalog to a publishing site. The automatically created result source is added to the result sources in your publishing site. This result source limits search results to the URL of the catalog, which means that only content from that catalog will be shown when that result source is selected in a Web Part.
+In addition to the predefined result sources, SharePoint Server 2016 automatically creates a result source when you connect a catalog to a publishing site. The automatically created result source is added to the result sources in your publishing site. This result source limits search results to the URL of the catalog, which means that only content from that catalog will be shown when that result source is selected in a Web Part.
> [!NOTE]
> Before you create any result sources, start a full crawl of the catalog content, and connect a catalog to your publishing site. For information about how to configure result sources, see [Configure result sources for web content management in SharePoint Server](configure-result-sources-for-web-content-management.md).
-Query rules can be specified for one or more result sources. Result sources are used as part of the query in Web Parts that use search technology, and you can easily influence how search results are shown for all Search Web Parts on your site. By specifying a limited time period for when a query rule is triggered, you can control when certain items are promoted within your website, without having to worry about how to add or remove content at a particular time. Let's say that you are selling electronic products through a product catalog, and you want to promote pink cameras on Valentine's day in the United States. You can create a query rule for a result source that starts on February 14th, and ends on February 15th. The query rule is triggered if a query contains the term Cameras, and pink cameras appear first in the Search Web Part.
+Query rules can be specified for one or more result sources. Result sources are used as part of the query in Web Parts that use search technology, and you can easily influence how search results are shown for all Search Web Parts on your site. By specifying a limited time period for when a query rule is triggered, you can control when certain items are promoted within your website, without having to worry about how to add or remove content at a particular time. Let's say that you're selling electronic products through a product catalog, and you want to promote pink cameras on Valentine's Day in the United States. You can create a query rule for a result source that starts on February 14th and ends on February 15th. The query rule is triggered if a query contains the term Cameras, and pink cameras appear first in the Search Web Part.
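The trigger logic of such a time-bounded rule can be sketched as a simple predicate. This is illustrative only; real query rules are configured in the search administration UI, not in code, and the names below are hypothetical.

```python
# Sketch of the time-bounded query rule described above: the rule fires only
# when the query mentions cameras AND the date falls in the promotion window.
from datetime import date

def rule_fires(query, today):
    """Promote pink cameras when the query mentions cameras on Feb 14-15."""
    in_window = date(today.year, 2, 14) <= today <= date(today.year, 2, 15)
    return in_window and "cameras" in query.lower()
```

For example, `rule_fires("pink Cameras", date(2024, 2, 14))` is true, while the same query on March 1st is not.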
For information about how to create query rules, see [Create query rules for web content management in SharePoint Server](create-query-rules-for-web-content-management.md).
The Usage Analytics feature in SharePoint Server 2016 automatically tracks how d
- Views - number of views for a single item, page, or document.
-- Recommendations Displayed - number of times a single item, page or document was displayed as a recommendation.
+- Recommendations Displayed - number of times a single item, page, or document was displayed as a recommendation.
- Recommendation Clicks - number of times a single item, page, or document was clicked when it was displayed as a recommendation.
You can use the data that is generated by usage events in the following ways:
- Sort search results by the number of counts of a usage event. For example, show items that have the most view events at the top of search results.
-- View the usage event data in the **Most Popular Items** usage report. This report applies to all items in a library, and lists the most popular items for each usage event — for example, a list of the most viewed pages in a library.
+- View the usage event data in the **Most Popular Items** usage report. This report applies to all items in a library, and lists the most popular items for each usage event—for example, a list of the most viewed pages in a library.
-- View the usage event data in the **Popularity Trends** report. This report applies to a Site Collection, a Site or an individual item in a library or list. The report shows the daily and monthly counts for each usage event — for example, the total views of a page on a specific day.
+- View the usage event data in the **Popularity Trends** report. This report applies to a Site Collection, a Site, or an individual item in a library or list. The report shows the daily and monthly counts for each usage event—for example, the total views of a page on a specific day.
The following illustration shows how usage events are sent from Web Parts, through components in the Search service application, and are then used in usage reports and to show recommendations and popular items in Web Parts.
The preconfigured usage events may not be sufficient for your business needs. To
### Plan to import existing usage events
-When you set up a new site, there has not been any user traffic to generate usage events. Therefore, there will be no recommendations and popular items in the system. To be able to show recommendations and popular items from the beginning, you can import existing events from a previous SharePoint system, or import events from a third-party web analytics provider. To correctly import existing data, the data have to be formatted according to specific rules.
+When you set up a new site, there hasn't been any user traffic to generate usage events. Therefore, there will be no recommendations and popular items in the system. To be able to show recommendations and popular items from the beginning, you can import existing events from a previous SharePoint system, or import events from a third-party web analytics provider. To correctly import existing data, the data must be formatted according to specific rules.
Whenever a usage event occurs, the event is logged to the item in the library that you shared as a catalog. The default method for logging and sending usage events through the system is to use the URL of the item in the library as the ID. However, the usage event data that you import may use a different ID to log usage events, for example InternalNumber. You must change the way that usage events in the library that you shared as a catalog are logged to match how usage events for the imported events are logged. To change this, do the following:
SharePoint Plan Trusted Data Connection Libraries https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/plan-trusted-data-connection-libraries.md
description: "Use Excel Services trusted data connection libraries to manage and
Excel Services provides the ability to connect to external data sources and refresh the data in data-connected Excel workbooks when it renders them in a browser. Data connections can be loaded by using information from the workbook file, but using a data connection library provides an additional layer for data connections so that they can be managed separately from workbooks.
-Trusted data connection libraries are SharePoint Server 2013 data connection libraries that contain data connection files that Excel Services will trust to use to connect to databases. These files contain everything that Excel Services and Excel client have to have to connect to an external data source.
+Trusted data connection libraries are SharePoint Server 2013 data connection libraries that contain data connection files that Excel Services will trust to use to connect to databases. These files contain everything that Excel Services and the Excel client need to connect to an external data source.
Data connection libraries enable broad reuse and sharing of data connections. By using trusted data connection libraries with Excel Services, you can create, deploy, and manage the data connections that your users use. By managing connections in this way, you can ensure that connections are configured correctly and maintained properly and that your users are all using a consistent group of connection files that have been created and approved by authorized users and administrators.
-Excel Services does not use data connection files that are not stored in a trusted data connection library. However, data connection information can be embedded directly in a workbook that is trying to make a connection.
+Excel Services doesn't use data connection files that aren't stored in a trusted data connection library. However, data connection information can be embedded directly in a workbook that is trying to make a connection.
You can create different trusted data connection libraries for different purposes or projects, and you can customize the settings and permissions to the libraries accordingly.
-For workbooks that use the same data connection file, changing the data connection file is all that is required to change connection information; changing the individual workbooks is not necessary.
+For workbooks that use the same data connection file, changing the data connection file is all that is required to change connection information; changing the individual workbooks isn't necessary.
Initially, there are no Excel Services trusted data connection libraries. To store data connection files, you must create at least one trusted data connection library.
SharePoint Plan Visio Services Deployment https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/plan-visio-services-deployment.md
System performance of application servers that are running the Visio Graphics Se
- The performance of the data sources to which diagrams are connected
-- The frequency of data refresh for data-connected diagrams
+- The frequency of data refreshes for data-connected diagrams
- Peak loads of users who are accessing diagrams
System performance of application servers that are running the Visio Graphics Se
- Visio Services cache settings
-The diagram size limit and refresh parameters can be adjusted by the administrator. Being able to adjust these parameters can help you adjust the performance of the server. If changing these parameters does not provide the desired performance, you may have to add processing capacity or memory.
+The diagram size limit and refresh parameters can be adjusted by the administrator. Being able to adjust these parameters can help you adjust the performance of the server. If changing these parameters doesn't provide the desired performance, you may have to add processing capacity or memory.
-When planning system resources for Visio Services, the most important factor is peak load. For example, if users will make heaviest use of the Visio Services functionality early Monday morning, plan your server capacity for that peak load. Peak load times can vary widely depending on how Visio Services is used within your organization. It is important to estimate peak loads as best as possible to avoid overtaxing system resources.
+When planning system resources for Visio Services, the most important factor is peak load. For example, if users will make heaviest use of the Visio Services functionality early Monday morning, plan your server capacity for that peak load. Peak load times can vary widely depending on how Visio Services is used within your organization. It's important to estimate peak loads as best as possible to avoid overtaxing system resources.
In addition to SharePoint Server performance considerations, you should also examine the performance impact of Visio Services on your other systems. For example, if you have a data-connected diagram that is querying data from an Oracle database, what is the effect of your Visio Services peak load on that Oracle database? Large numbers of users querying any data source at the same time could put a strain on the resources of that data source. The following best practices can be used to optimize the performance of Visio
-- Monitor the performance of the application servers in the farm and add CPU and memory or additional Front-end role servers if they are needed to handle peak loads.
+- Monitor the performance of the application servers in the farm and add CPU and memory or additional Front-end role servers if they're needed to handle peak loads.
- Limit the maximum diagram size.
For many deployments, a single Visio Graphics Service Application is sufficient.
## Use a Visio Services pilot deployment
-To help determine capacity requirements for Visio Services, consider rolling Visio Services out to a limited pilot group that is representative of typical users. Giving a fairly small number of people access to Visio Services functionality lets you monitor server resource usage and effect on related systems, such as external data sources, without overtaxing system resources.
+To help determine capacity requirements for Visio Services, consider rolling Visio Services out to a limited pilot group that is representative of typical users. Giving a fairly small number of people access to Visio Services functionality lets you monitor server resource usage and the effect on related systems, such as external data sources, without overtaxing system resources.
Once you have compiled performance data for the pilot group, you can extrapolate system requirements for Visio Services when you deploy it across your whole organization. The pilot data will also help you determine peak load requirements and times when peak loads are likely to occur.
By monitoring other affected systems—such as data sources used by data-connect
## Monitor system resources consumed by Visio Services
-We highly recommend that you monitor system resources consumed by Visio Services—alongside the other services in your SharePoint Server farm. It is typical for resource usage to increase over time as additional users are brought online and existing users make more use of Visio Services and other SharePoint Server technologies.
+We highly recommend that you monitor system resources consumed by Visio Services—alongside the other services in your SharePoint Server farm. It's typical for resource usage to increase over time as additional users are brought online and existing users make more use of Visio Services and other SharePoint Server technologies.
The SharePoint Server services architecture enables easy addition of servers to the farm. As user demands increase, you can continue to add servers to the farm to provide additional capacity and redundancy.
By monitoring resource usage, you can predict when additional capacity is likely
## Backup and recovery of data used by Visio Services
-Visio Services settings and Visio documents stored in SharePoint Server libraries can be backed up by the farm administrator when doing a standard farm backup. However, be aware that when working with Visio documents that are connected to data sources that are outside the farm, the data to which the Visio documents are connected is not backed up as part of a standard farm backup. In this case, the administrator of the system where the data resides should perform a separate backup procedure.
+Visio Services settings and Visio documents stored in SharePoint Server libraries can be backed up by the farm administrator when doing a standard farm backup. However, be aware that when working with Visio documents that are connected to data sources that are outside the farm, the data to which the Visio documents are connected isn't backed up as part of a standard farm backup. In this case, the administrator of the system where the data resides should perform a separate backup procedure.
## Requirements for authors of Visio diagrams
-Visio Services lets you display Visio diagrams in a Web Part without the need to have Visio installed on the client computer. However, Visio Services does not allow for creating or editing Visio diagrams. As part of your deployment plan for Visio Services, you should also plan for the needs of diagram authors within your organization. Each diagram author who has to use Visio Services must have a copy of Visio Professional, Visio Premium, or Visio Pro for Microsoft 365.
+Visio Services lets you display Visio diagrams in a Web Part without the need to have Visio installed on the client computer. However, Visio Services doesn't allow for creating or editing Visio diagrams. As part of your deployment plan for Visio Services, you should also plan for the needs of diagram authors within your organization. Each diagram author who has to use Visio Services must have a copy of Visio Professional, Visio Premium, or Visio Pro for Microsoft 365.
SharePoint Remote Share Provider https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/remote-share-provider.md
description: "Learn about a new RBS provider."
[!INCLUDE[appliesto-xxx-xxx-xxx-SUB-xxx-md](../includes/appliesto-xxx-xxx-xxx-SUB-xxx-md.md)]
-With organizations increasingly using SharePoint for rich contents rather than normal documents, the storage requirements have grown multifold. Administrators must regularly review and clean up contents in the SharePoint. By default, all the structured content such as metadata, or unstructured content such as files, are stored in content databases in the SQL server attached to SharePoint Server. Unstructured data in SharePoint are stored in content database as Binary Large Object (BLOB) and they are immutable.
+With organizations increasingly using SharePoint for rich content rather than plain documents, storage requirements have grown manyfold. Administrators must regularly review and clean up content in SharePoint. By default, all structured content, such as metadata, and unstructured content, such as files, is stored in content databases in the SQL Server instance attached to SharePoint Server. Unstructured data in SharePoint is stored in the content database as Binary Large Objects (BLOBs), and they're immutable.
In SharePoint 2013, Remote BLOB Storage (RBS) technology was introduced in SQL Server to offload BLOBs from the content database, and a SQL FILESTREAM provider was supplied at that time. In SharePoint Server Subscription Edition, the new Remote Share Provider was created to lower the overall cost of SharePoint deployments in on-premises environments, offering a reasonably priced and easy-to-use storage solution that offloads content from SQL Server to network SMB storage.
In SharePoint Server Subscription Edition, we provide a new RBS provider **Remot
Following are the key features of Remote Share Provider:
- This provider supports Binary Large Object (BLOB) storage offload to a remote SMB system, freeing content database storage on the SQL Server side. Therefore, within the same content database limitation of 200-GB size, more file volume can be stored in one content database. Hence, it helps reduce the cost not only of storage but also of maintenance.
-- There is a PowerShell Cmdlet to check the data completeness to figure out storage problem.
+- There's a PowerShell cmdlet to check data completeness and identify storage problems.
- By applying the existing backup and restore methodology of SMB system, it provides relatively reasonable disaster recovery.

## Limitations of Remote Share Provider
-Remote Share Provider introduces the new storage system into SharePoint. Any other system can introduce complexity and reliability downgrade in some circumstances. As it is based on RBS, following are some of the limitations that are also applicable to Remote Share Provider:
+Remote Share Provider introduces a new storage system into SharePoint. Any additional system can introduce complexity and reduce reliability in some circumstances. As it's based on RBS, the following limitations of RBS also apply to Remote Share Provider:
-- Encryption is not supported on BLOBs, even if transparent data encryption is enabled.
-- RBS does not support using data compression.
-- As content database and BLOB storages are separated, backup and restore from farm and content database level are not enough for disaster recovering. BLOB storages need to be backed up and recovered at the same time when performing farm and content database level backup and recovery.
+- Encryption isn't supported on BLOBs, even if transparent data encryption is enabled.
+- RBS doesn't support using data compression.
+- As the content database and BLOB storages are separated, backup and restore at the farm and content database level aren't enough for disaster recovery. BLOB storages need to be backed up and recovered at the same time when performing farm and content database level backup and recovery.
## Advantages and disadvantages of Remote Share Provider
When compared with storing BLOBs inside SQL server, there are advantages and dis
- Requires an additional backup and restore step for the remote storage (SMB).
- Requires separate configuration of security and data protection in the remote storage (SMB).
-- Another storage layer reduces availability and reliability of the overall system that is High availability and Disaster Recovery (HADR) will not work by default until you set up HADR for SMB storage.
+- Another storage layer reduces the availability and reliability of the overall system; that is, High Availability and Disaster Recovery (HADR) won't work by default until you set up HADR for the SMB storage.
## Planning Remote Share Provider

Remote Share Provider is suitable for scenarios where you need:
- A huge volume of content in a site collection, which can cause problems in storage cost and system performance.
-- Site collection is not for time-critical business. If the site collection is down, you can take time to restore content database and remote BLOB storage. Service downtime for this specific site collection will not have significant impact to organization business.
+- Site collection isn't for time-critical business. If the site collection is down, you can take time to restore content database and remote BLOB storage. Service downtime for this specific site collection won't have significant impact to organization business.
- More READ operations than WRITE operations on that site collection.

A backup and restore methodology must be planned for the remote storage system, as SharePoint or SQL Server backup and restore might not be able to cover BLOBs stored in the remote storage system.
-It is recommended to use System Center Data Protection Manager (DPM) to manage backup and restore so that content database and remote BLOB storage can be backed up at the same time.
+It's recommended to use System Center Data Protection Manager (DPM) to manage backup and restore so that content database and remote BLOB storage can be backed up at the same time.
-In DPM, you can create protection group for both SharePoint content database and remote SMB storage so that these data sets can be backed up/managed together by DPM. For more information, see
+In DPM, you can create protection group for both SharePoint content database and remote SMB storage so that these data sets can be backed up/managed together by DPM. For more information, see:
- [How to backup SharePoint with DPM](/system-center/dpm/back-up-sharepoint)
- [How to backup SQL with DPM](/system-center/dpm/back-up-sql-server)
- [How to backup file data with DPM](/system-center/dpm/back-up-file-data)
-If you do not use DPM to manage backup and restore, you can follow the two steps to back up SharePoint in sequence:
+If you don't use DPM to manage backup and restore, you can follow the two steps to back up SharePoint in sequence:
1. Back up the farm or content database by using the Backup-SPFarm PowerShell cmdlet.
2. Back up the remote SMB storage by using your existing backup tool.
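As a sketch, step 1 might look like the following; this requires a live SharePoint farm, and the backup share path is a placeholder:

```PowerShell
# Step 1: take a full farm backup with the SharePoint Management Shell.
# \\backupserver\spbackup is a placeholder UNC path.
Backup-SPFarm -Directory \\backupserver\spbackup -BackupMethod Full

# Step 2: back up the remote SMB BLOB storage with your existing backup tool.
```

Run the SMB backup only after Backup-SPFarm completes, so the two data sets stay consistent.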
If you do not use DPM to manage backup and restore, you can follow the two steps
To restore SharePoint, reverse the sequence:
1. Restore the SMB storage from backup storage.
-2. Restore content database or farm by using Restore-SPFarm PS cmdlet.
+2. Restore content database or farm by using Restore-SPFarm PowerShell cmdlet.
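A matching sketch for the restore side, again with a placeholder backup share:

```PowerShell
# Restore the farm (or content database) from the most recent full backup,
# after the SMB storage has been restored first.
Restore-SPFarm -Directory \\backupserver\spbackup -RestoreMethod Overwrite
```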
-Remote Share Provider does not provide encryption to ensure the data security. It relies on the security and access control provided by SMB storage. Hence, to keep your BLOB data safe from threats, proper actions must be taken at the storage level.
+Remote Share Provider doesn't provide encryption to ensure the data security. It relies on the security and access control provided by SMB storage. Hence, to keep your BLOB data safe from threats, proper actions must be taken at the storage level.
1. Enable SMB encryption to ensure BLOBs are transferred safely through network and storage.
2. Enable access control so that only limited users can access BLOBs in the SMB storage.
Remote Share Provider does not provide encryption to ensure the data security. I
By using previous FILESTREAM provider, which is default shipped with SQL server, High availability and Disaster Recovery (HADR) is handled by SQL Server HADR cluster.
-By moving to new Remote Share Provider, this SQL Server level HADR cannot cover the BLOBs inside SMB storage. Hence, Remote Share Provider cannot support same as SQL server HADR by default. It requires more cost and effort to set up a HADR-ready SMB storage and integrate with SharePoint and SQL server with layer HADR system.
+By moving to the new Remote Share Provider, this SQL Server level HADR can't cover the BLOBs inside the SMB storage. Hence, Remote Share Provider can't provide the same HADR as SQL Server by default. It requires more cost and effort to set up HADR-ready SMB storage and integrate it with SharePoint and SQL Server in a layered HADR system.
In the past, high availability was supported by setting up a failover SharePoint farm. With Remote Share Provider, this approach can still work.
There are two different configurations for failover farm with Remote Share Provi
:::image type="content" source="../media/smb-storage-new.png" alt-text="smr storage image":::
- For this configuration, there are two sets of SharePoint servers and SQL servers, however, they share the same SMB storage for BLOBs. Real-time database synchronization is set up to stream changes from active SQL server in active SharePoint farm to failover SQL server in fail over SharePoint farm. So if there is any problem in active SharePoint farm, admin can immediately switch to fail over SharePoint farm.
+ For this configuration, there are two sets of SharePoint servers and SQL servers; however, they share the same SMB storage for BLOBs. Real-time database synchronization is set up to stream changes from the active SQL server in the active SharePoint farm to the failover SQL server in the failover SharePoint farm. So if there's any problem in the active SharePoint farm, the admin can immediately switch to the failover SharePoint farm.
2. Share same SMB BLOB storage between active SharePoint farm and failover farm with SMB BLOB storage failover backup.

   :::image type="content" source="../media/smr-blob-new.png" alt-text="smr blob":::
- This configuration is exactly the same as configuration #1 except there is a failover backup for SMB BLOB storage. Not only database is synced but SMB storage is also backed up. In this situation, when active SharePoint farm has problem, it can switch to fail over farm with other setting to change the SMB storage UNC path to failover SMB storage.
+ This configuration is exactly the same as configuration #1 except there's a failover backup for the SMB BLOB storage. Not only is the database synced, but the SMB storage is also backed up. In this situation, when the active SharePoint farm has a problem, you can switch to the failover farm with an additional setting that changes the SMB storage UNC path to the failover SMB storage.
### Security and permission
-Remote Share Provider does not provide encryption to ensure the data security. It relies on the security and access control provided by SMB storage. Therefore, to keep the BLOB data safe from threats, proper actions must be taken at the storage level:
+Remote Share Provider doesn't provide encryption to ensure the data security. It relies on the security and access control provided by SMB storage. Therefore, to keep the BLOB data safe from threats, proper actions must be taken at the storage level:
- Enable SMB encryption to ensure BLOBs are transferred safely through network and storage.
- Enable access control so that only limited users can access BLOBs in the SMB storage.
- Enable BitLocker to strengthen the data safety, if possible.
-- The user account used to perform the steps in the [Provision a BLOB store for each content database](/sharepoint/administration/install-and-configure-rbs#provision) section must be a member of the `db_owner` fixed database role on each database that you are configuring RBS for.
-- The user account installing the client library in the steps in the [Install the RBS client library on SQL Server](/sharepoint/administration/install-and-configure-rbs#library) and each front-end or application server section must be a member of the administrators' group on all of the computers where you are installing the library.
+- The user account used to perform the steps in the [Provision a BLOB store for each content database](/sharepoint/administration/install-and-configure-rbs#provision) section must be a member of the `db_owner` fixed database role on each database that you're configuring RBS for.
+- The user account installing the client library in the steps in the [Install the RBS client library on SQL Server](/sharepoint/administration/install-and-configure-rbs#library) and each front-end or application server section must be a member of the administrators' group on all of the computers where you're installing the library.
- The user account enabling RBS in the [Enable RBS for each content database](/sharepoint/administration/install-and-configure-rbs#enableRBS) section must have sufficient permissions to run Microsoft PowerShell.

## Setting up Remote Share Provider
Msiexec /qn /lvx* rbs_install_log.txt /I RBS.msi ADDLOCAL="Client"
RBS is applied to a specific content database. Hence, every time a new content database needs to use RBS, it must be set up, and then RBS providers can be registered on the content database.
-Ensure that there is a master key for this content database for which you want to apply RBS. If the master key does not exist, create a new one for the content database.
+Ensure that there's a master key for the content database to which you want to apply RBS. If the master key doesn't exist, create a new one for the content database.
To create master key for specific content database:
-1. Confirm that the user account performing these steps is a member of the `db_owner` fixed database role on each database that you are configuring for RBS.
+1. Confirm that the user account performing these steps is a member of the `db_owner` fixed database role on each database that you're configuring for RBS.
2. Open **SQL Server Management Studio**.
3. Connect to the instance of SQL Server that hosts the content database.
4. Expand **Databases**.
5. Select the content database for which you want to create a BLOB store, and then click **New Query**.
-6. Paste the following SQL queries in **Query** pane, and then run them in the sequence listed. In each case, replace `[WSS_Content]` with the content database name, and replace `c:\BlobStore` with the `volume\directory` in which you want the BLOB store created. The provisioning process creates a folder in the location that you specify. You can provision a BLOB store only once. If you attempt to provision the same BLOB store multiple times, you will receive an error.
+6. Paste the following SQL queries in **Query** pane, and then run them in the sequence listed. In each case, replace `[WSS_Content]` with the content database name, and replace `c:\BlobStore` with the `volume\directory` in which you want the BLOB store created. The provisioning process creates a folder in the location that you specify. You can provision a BLOB store only once. If you attempt to provision the same BLOB store multiple times, you'll receive an error.
```SQL #Replace with <your content database>
msiexec /qn /lvc* rbs.log /i rbs.msi TRUSTSERVERCERTIFICATE=true DBNAME="Your co
### Setting up credentials for Remote Share Provider
-To access restricted SMB storage, it is recommended that specific domain account is assigned to Remote Share Provider to READ/WRITE BLOB files in SMB storage. The provider is using PSCredential object to sign-in remote RBS storage with this specific account credential.
+To access restricted SMB storage, it's recommended that a specific domain account is assigned to Remote Share Provider to READ/WRITE BLOB files in the SMB storage. The provider uses a PSCredential object to sign in to the remote RBS storage with this specific account credential.
See [Get-Credential](/powershell/module/microsoft.powershell.security/get-credential) to get PSCredential object for the RBS provider.
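Putting this together, a hedged sketch of obtaining the credential and passing it when registering the store; the account prompt, store name, and UNC path are placeholders:

```PowerShell
# Prompt for the domain account that has READ/WRITE access to the SMB share.
$cred = Get-Credential -Message "Account for remote BLOB storage"

# Pass the credential when registering the remote share BLOB store.
# -Location is assumed here as the UNC path parameter; verify against
# the cmdlet syntax in your environment.
Register-SPRemoteShareBlobStore -ContentDatabase $db -Name "RemoteBlob" `
    -Location "\\fileserver\blobstore" -BlobStoreCredential $cred
```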
The UNC path of the SMB storage this BLOB store will use.
**`[-BlobStoreCredential] <PSCredential>`**
-The PSCredential object, which is used to access the SMB storage. If this parameter is not specified, it will use the service account, which is applied to the current web application.
+The PSCredential object that is used to access the SMB storage. If this parameter isn't specified, the service account that is applied to the current web application is used.
**`[-PoolCapacity] <Int [ValidateRange(1000, 10000)]>`**
-The number of BLOB trunks in each BLOB pool. If this parameter is not specified, it will be set to 1000.
+The number of BLOB trunks in each BLOB pool. If this parameter isn't specified, it will be set to 1000.
Example cmdlet syntax:
Register-SPRemoteShareBlobStore -ContentDatabase $db -Name "RemoteBlob" -Locatio
### Switching and activating BLOB store
-The registered remote share BLOB store will not take effect until it is activated. `Switch-SPBlobStorage` cmdlet needs to be run after `Register-SPRemoteShareBlobStore`, so that new contents to the content database will be routed to the newly created remote share BLOB store.
+The registered remote share BLOB store won't take effect until it's activated. `Switch-SPBlobStorage` cmdlet needs to be run after `Register-SPRemoteShareBlobStore`, so that new contents to the content database will be routed to the newly created remote share BLOB store.
```PowerShell
Switch-SPBlobStorage -RemoteShareBlobStore <SPRemoteShareBlobStorePipeBind>
```
-This cmdlet will switch default BLOB storage of the content database to the given remote share BLOB store. Since BLOB store is unique in farm and linked to specific content database, there is no need to specify the content database in this cmdlet.
+This cmdlet switches the default BLOB storage of the content database to the given remote share BLOB store. Since a BLOB store is unique in the farm and linked to a specific content database, there's no need to specify the content database in this cmdlet.
The remote share BLOB store, which will be active, can either be a remote share BLOB store object or the remote share BLOB store name.
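Since the parameter accepts the store name, a minimal activation sketch using the name registered earlier would be:

```PowerShell
# Activate the remote share BLOB store by name; "RemoteBlob" is the name
# used when the store was registered. New writes then route to this store.
Switch-SPBlobStorage -RemoteShareBlobStore "RemoteBlob"
```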
The capacity of each pool that is used in the BLOB store.
### Migrating BLOBs from one remote share BLOB store to another
-One content database can have several BLOB stores, and at any time only one of these BLOB stores can be active for WRITE while others are just for READ. Sometimes, there is a need to move BLOBs from one BLOB store to another. For example, you may need to set up a new SMB storage and move all BLOBs to the new SMB storage. In this situation, you need to migrate data from SQL or old BLOB stores to the new BLOB store.
+One content database can have several BLOB stores, and at any time only one of these BLOB stores can be active for WRITE while others are just for READ. Sometimes, there's a need to move BLOBs from one BLOB store to another. For example, you may need to set up a new SMB storage and move all BLOBs to the new SMB storage. In this situation, you need to migrate data from SQL or old BLOB stores to the new BLOB store.
By following the sample below, you can migrate your BLOBs from the old BLOB storage to the new active BLOB store by committing Migrate().
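The flow can be sketched as follows. This is a sketch only: the database name, store name, and share path are placeholders, and `Get-SPRemoteShareBlobStore` is assumed to exist alongside the `Register-SPRemoteShareBlobStore` and `Switch-SPBlobStorage` cmdlets shown above.

```powershell
# Sketch: register a second remote share BLOB store, activate it for writes,
# then migrate existing BLOBs into it. Names and paths are placeholders.
$db = Get-SPContentDatabase -Identity "WSS_Content"

# Register the new store that points at the new SMB share.
Register-SPRemoteShareBlobStore -ContentDatabase $db -Name "NewRemoteBlob" -Location "\\fileserver\newblobstore"

# Activate it: new writes go to this store; existing stores stay readable.
# Get-SPRemoteShareBlobStore is an assumed retrieval cmdlet.
$newStore = Get-SPRemoteShareBlobStore -ContentDatabase $db | Where-Object { $_.Name -eq "NewRemoteBlob" }
Switch-SPBlobStorage -RemoteShareBlobStore $newStore

# Finally, commit Migrate() (as referenced above) to move BLOBs from SQL or
# the old store into the newly active store.
```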
The remote share BLOB store needs to be unregistered. It can either be the remot
**`[-Force]`**
-This switch forces unregistration even when the RemoteShareBLOBStore is not active but the BLOBs are still in use. If there's need to unregister such BLOB store, you can run the cmdlet with this switch to ignore the detection of in-use BLOBs in the BLOB store. If the blob store is active in the content database, this cmdlet will throw an exception to prevent unintentional unregistering of a BLOB store in-use.
+This switch forces unregistration even when the RemoteShareBLOBStore isn't active but the BLOBs are still in use. If there's a need to unregister such a BLOB store, you can run the cmdlet with this switch to ignore the detection of in-use BLOBs in the BLOB store. If the BLOB store is active in the content database, this cmdlet will throw an exception to prevent unintentional unregistering of an in-use BLOB store.
-We do not recommend using this **-Force** switch because it will leave BLOBs in the SMB storage behind and make your SharePoint contents unaccessible. We recommend migrating BLOBs firstly and then unregister remote share BLOB store.
+We don't recommend using this **-Force** switch because it will leave BLOBs behind in the SMB storage and make your SharePoint content inaccessible. We recommend migrating the BLOBs first and then unregistering the remote share BLOB store.
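A minimal sketch of the recommended order, assuming the store name below and that the unregister cmdlet accepts the store by name through a `-RemoteShareBlobStore` parameter:

```powershell
# Recommended: migrate BLOBs away first, then unregister the old store.
Unregister-SPRemoteShareBlobStore -RemoteShareBlobStore "OldRemoteBlob"

# Discouraged: -Force skips the in-use BLOB check and can orphan BLOBs on the
# SMB share, leaving SharePoint content inaccessible.
# Unregister-SPRemoteShareBlobStore -RemoteShareBlobStore "OldRemoteBlob" -Force
```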
### Validating data consistency of remote share BLOB store
SharePoint Remove Certificates https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/remove-certificates.md
description: "Learn how SharePoint supports removing certificates."
SharePoint supports removing certificates via the [Remove-SPCertificate](/powershell/module/sharepoint-server/remove-spcertificate) PowerShell cmdlet.
-- By default, SharePoint will not allow you to remove a certificate if it is currently assigned to a SharePoint object. You must override the default behavior if you want to force the removal of a certificate. If you override the default behavior, existing assignments of the certificate are cleared.
+- By default, SharePoint won't allow you to remove a certificate if it's currently assigned to a SharePoint object. You must override the default behavior if you want to force the removal of a certificate. If you override the default behavior, existing assignments of the certificate are cleared.
- The certificate and any private key associated with that certificate is removed from the Windows certificate store on every server in the SharePoint farm.
-- The certificate and any private key associated with it is removed from the SharePoint configuration database.
-- Any previous exports from the certificate through the SharePoint administration interface will not be removed. Those exported files will still exist.
+- The certificate and any private key associated with it is removed from the SharePoint configuration database.
+- Any previous exports from the certificate through the SharePoint administration interface won't be removed. Those exported files will still exist.
Use the `Remove-SPCertificate` cmdlet to remove a certificate from SharePoint.
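For example (the certificate display name below is a placeholder; see the linked cmdlet reference for the exact parameters that force removal of a certificate that is still assigned):

```powershell
# Remove a certificate that is no longer assigned to any SharePoint object.
Remove-SPCertificate -Identity "Intranet SSL Certificate"
```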
SharePoint Restore A Farm https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/restore-a-farm.md
You can restore a SharePoint Server farm by using the SharePoint Central Adminis
## Before you begin <a name="begin"> </a>
-Farm-level recovery is usually performed only after a failure that involves the complete farm, or where partial recovery of part of the farm is not possible. If you only have to restore part of the farm, a specific database, a service application, a list, or document library, or a specific document, use another recovery method. For more information about alternate forms of recovery, see [Related content](#proc4).
+Farm-level recovery is usually performed only after a failure that involves the complete farm, or where partial recovery of part of the farm isn't possible. If you only have to restore part of the farm, such as a specific database, a service application, a list or document library, or a specific document, use another recovery method. For more information about alternate forms of recovery, see [Related content](#proc4).
- Farm recovery is usually performed for any of the following reasons:
+ Farm recovery is usually performed for any of the following reasons:
- Restoring a farm after a fire, disaster, equipment failure, or other data-loss event.
Before you begin this operation, review the following information about how to recover a farm in SharePoint:
-- You cannot back up from one version of SharePoint Server 2019 and restore to another version of SharePoint Server 2019. The same applies to SharePoint Servers 2016 and 2013.
+- You can't back up from one version of SharePoint Server 2019 and restore to another version of SharePoint Server 2019. The same applies to SharePoint Servers 2016 and 2013.
-- Backing up the farm will back up the configuration and Central Administration content databases, but these cannot be restored using SharePoint Server tools. For more information about how to back up and restore all of the farm databases, see [Move all databases in SharePoint Server](move-all-databases.md).
+- Backing up the farm will back up the configuration and Central Administration content databases, but these can't be restored using SharePoint Server tools. For more information about how to back up and restore all of the farm databases, see [Move all databases in SharePoint Server](move-all-databases.md).
-- When you restore the farm by using SharePoint Server, the restore process will not automatically start all of the service applications. You must manually start them by using Central Administration or Microsoft PowerShell. Do not use SharePoint Products Configuration Wizard to start the services because doing this will also re-provision the services and service proxies. For more information, see [Start or stop a service in SharePoint Server](start-or-stop-a-service.md).
+- When you restore the farm by using SharePoint Server, the restore process won't automatically start all of the service applications. You must manually start them by using Central Administration or Microsoft PowerShell. Don't use SharePoint Products Configuration Wizard to start the services because doing this will also reprovision the services and service proxies. For more information, see [Start or stop a service in SharePoint Server](start-or-stop-a-service.md).
- The identifier (ID) of each content database is retained when you restore or reattach a database by using built-in tools. Default change log retention behavior when using built-in tools is as follows:
When a database ID and change log are retained, the search system continues crawling based on the regular schedule that is defined by crawl rules.
- When you restore an existing database and do not use the overwrite option, a new ID is assigned to the restored database, and the database change log is not preserved. The next crawl of the database will add data from the content database to the index.
+ When you restore an existing database and don't use the overwrite option, a new ID is assigned to the restored database, and the database change log isn't preserved. The next crawl of the database will add data from the content database to the index.
- If a restore is performed and the ID in the backup package is already being used in the farm, a new ID is assigned to the restored database and a warning is added to the restore log. The ability to perform an incremental crawl instead of a full crawl depends on the content database ID being the same as before and the change log token that is used by the search system being valid for the current change log in the content database. If the change log is not preserved, the token is not valid and the search system has to perform a full crawl.
+ If a restore is performed and the ID in the backup package is already being used in the farm, a new ID is assigned to the restored database and a warning is added to the restore log. The ability to perform an incremental crawl instead of a full crawl depends on the content database ID being the same as before and the change log token that is used by the search system being valid for the current change log in the content database. If the change log isn't preserved, the token isn't valid and the search system has to perform a full crawl.
-- SharePoint Server backup backs up the Business Data Connectivity service external content type definitions but does not back up the data source itself. To protect the data, you should back up the data source when you back up the Business Data Connectivity service or the farm.
+- SharePoint Server backup backs up the Business Data Connectivity service external content type definitions but doesn't back up the data source itself. To protect the data, you should back up the data source when you back up the Business Data Connectivity service or the farm.
- If you restore the Business Data Connectivity service or the farm and then restore the data source to a different location, you must change the location information in the external content type definition. If you do not, the Business Data Connectivity service might be unable to locate the data source.
+ If you restore the Business Data Connectivity service or the farm and then restore the data source to a different location, you must change the location information in the external content type definition. If you don't, the Business Data Connectivity service might be unable to locate the data source.
-- SharePoint Server restores remote Binary Large Objects (BLOB) stores only if you are using the FILESTREAM remote BLOB store provider to put data in remote BLOB stores.
+- SharePoint Server restores remote Binary Large Objects (BLOB) stores only if you're using the FILESTREAM remote BLOB store provider to put data in remote BLOB stores.
- If you are using another provider, you must manually restore the remote BLOB stores.
+ If you're using another provider, you must manually restore the remote BLOB stores.
-- If you are sharing service applications across farms, be aware that trust certificates that were exchanged are not included in farm backups. You must back up your certificate store separately or keep the certificates in a separate location. When you restore a farm that shares a service application, you must import and redeploy the certificates, and then re-establish any inter-farm trusts.
+- If you're sharing service applications across farms, be aware that trust certificates that were exchanged aren't included in farm backups. You must back up your certificate store separately or keep the certificates in a separate location. When you restore a farm that shares a service application, you must import and redeploy the certificates, and then re-establish any inter-farm trusts.
For more information, see [Exchange trust certificates between farms in SharePoint Server](exchange-trust-certificates-between-farms.md).
-- After a Web application that is configured to use claims-based authentication is restored, duplicate or additional claims providers are often visible. If duplicates appear, then you must manually save each Web application zone to remove them. For more information, see [Restore web applications in SharePoint Server](restore-a-web-application.md).
+- After a Web application that is configured to use claims-based authentication is restored, duplicate or additional claims providers are often visible. If duplicates appear, you must manually save each Web application zone to remove them. For more information, see [Restore web applications in SharePoint Server](restore-a-web-application.md).
- Additional steps are required when you restore a farm that contains a Web application that is configured to use forms-based authentication. For more information, see [Restore web applications in SharePoint Server](restore-a-web-application.md).
You can use Microsoft PowerShell to restore a farm.
- **db_owner** fixed database role on all databases that are to be updated.
- - Administrators group on the server on which you are running the PowerShell cmdlets.
+ - Administrators group on the server on which you're running the PowerShell cmdlets.
An administrator can use the **Add-SPShellAdmin** cmdlet to grant permissions to use SharePoint Server cmdlets.

> [!NOTE]
- > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
+ > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For more information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
2. Open the SharePoint Management Shell.
> [!NOTE]
> If you are not logged on as the Farm account, you are prompted for the Farm account's credentials.
- If you do not specify the `BackupId`, the most recent backup will be used. To view the backups for the farm, at the Microsoft PowerShell command prompt,type the following command:
+ If you don't specify the `BackupId`, the most recent backup will be used. To view the backups for the farm, at the Microsoft PowerShell command prompt, type the following command:
```powershell
Get-SPBackupHistory -Directory <BackupFolder> -ShowBackup [-Verbose]
```
- _\<BackupFolder\>_ is the path of the folder you use for storing backup files.
- You cannot use a configuration-only backup to restore content databases together with the configuration.
+ You can't use a configuration-only backup to restore content databases together with the configuration.
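As a sketch, a farm restore invocation built from the parameters above might look like the following, where the backup share path is a placeholder for your environment:

```powershell
# Restore the most recent backup found in the backup folder.
Restore-SPFarm -Directory \\backupserver\backups -RestoreMethod Overwrite

# Or target a specific backup by the GUID reported by Get-SPBackupHistory:
# Restore-SPFarm -Directory \\backupserver\backups -BackupId <GUID> -RestoreMethod Overwrite
```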
4. To restart a service application, at the PowerShell command prompt, type the following command:
You can use the Central Administration Web site to restore a farm.
2. In Central Administration, on the home page, in the **Backup and Restore** section, click **Restore from a backup**.
-3. On the Restore from Backup ΓÇö Step 1 of 3: Select Backup to Restore page, from the list of backups, select the backup job that contains the farm backup, and then click **Next**. You can view more details about each backup by clicking the (+) next to the backup.
+3. On the Restore from Backup - Step 1 of 3: Select Backup to Restore page, from the list of backups, select the backup job that contains the farm backup, and then click **Next**. You can view more details about each backup by clicking the (+) next to the backup.
> [!NOTE]
> If the correct backup job does not appear, in the **Backup Directory Location** text box, type the Universal Naming Convention (UNC) path of the correct backup folder, and then click **Refresh**. You cannot use a configuration-only backup to restore the farm.
-4. On the Restore from Backup ΓÇö Step 2 of 3: Select Component to Restore page, select the check box that is next to the farm, and then click **Next**.
+4. On the Restore from Backup - Step 2 of 3: Select Component to Restore page, select the check box that is next to the farm, and then click **Next**.
-5. On the Restore from Backup ΓÇö Step 3 of 3: Select Restore Options page, in the **Restore Component** section, make sure that **Farm** appears in the **Restore the following component** list.
+5. On the Restore from Backup - Step 3 of 3: Select Restore Options page, in the **Restore Component** section, make sure that **Farm** appears in the **Restore the following component** list.
In the **Restore Only Configuration Settings** section, make sure that the **Restore content and configuration settings** option is selected.
## Using SQL Server tools to restore a farm <a name="proc3"> </a>
-Although you cannot restore the complete farm by using SQL Server tools, you can restore most of the farm databases. If you restore the databases by using SQL Server tools, you must restore the farm configuration by using Central Administration or PowerShell. For more information about how to restore the farm's configuration settings, see [Restore farm configurations in SharePoint Server](restore-a-farm-configuration.md).
+Although you can't restore the complete farm by using SQL Server tools, you can restore most of the farm databases. If you restore the databases by using SQL Server tools, you must restore the farm configuration by using Central Administration or PowerShell. For more information about how to restore the farm's configuration settings, see [Restore farm configurations in SharePoint Server](restore-a-farm-configuration.md).
> [!NOTE]
> The search index is not stored in SQL Server. If you use SQL Server tools to back up and restore search, you must perform a full crawl after you restore the content database.
Use the following procedure to restore your farm databases.
1. Verify that the user account that is performing this procedure is a member of the **sysadmin** fixed server role.
-2. If the SharePoint Timer service is running, stop the service and wait for several minutes for any currently running stored procedures to finish. Do not restart the service until after you restore all the databases that you have to restore.
+2. If the SharePoint Timer service is running, stop the service and wait for several minutes for any currently running stored procedures to finish. Don't restart the service until after you restore all the databases that you have to restore.
3. Start SQL Server Management Studio and connect to the database server.
5. Right-click the database that you want to restore, point to **Tasks**, point to **Restore**, and then click **Database**.
- The database is automatically taken offline during the recovery operation and cannot be accessed by other processes.
+ The database is automatically taken offline during the recovery operation and can't be accessed by other processes.
6. In the **Restore Database** dialog, specify the destination and the source, and then select the backup set or sets that you want to restore.
7. In the **Select a page** pane, click **Options**.
-8. In the **Restore options** section, select only **Overwrite the existing database**. Unless your environment or policies require otherwise, do not select the other options in this section.
+8. In the **Restore options** section, select only **Overwrite the existing database**. Unless your environment or policies require otherwise, don't select the other options in this section.
9. In the **Recovery state** section:
- If you must restore additional transaction logs, select **RECOVER WITH NORECOVERY**.
- - The third option, **RECOVER WITH STANDBY** is not used in this scenario.
+ - The third option, **RECOVER WITH STANDBY**, isn't used in this scenario.
> [!NOTE]
> For more information about these recovery options, see [Restore Database (Options Page)](https://go.microsoft.com/fwlink/p/?LinkID=717045&clcid=0x409).

10. Click **OK** to complete the recovery operation.
-11. Except for the configuration database, repeat steps 4 through 9 for each database that you are restoring.
+11. Except for the configuration database, repeat steps 4 through 9 for each database that you're restoring.
> [!IMPORTANT]
> If you are restoring the User Profile database (by default named "User Profile Service_ProfileDB_\<GUID\>"), then also restore the Social database (by default named "User Profile Service_SocialDB_\<GUID\>"). Failing to do this can cause inaccuracies in the User Profile data that might be difficult to detect and fix.
SharePoint Restore A Web Application https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/restore-a-web-application.md
When you restore a web application, you also restore the Internet Information Se
Before you begin this operation, review the following information as you prepare to restore a web application:
-- You can only restore one web application at a time by using the procedures in this article. However, you can at the same time restore all the web applications in the farm by restoring the complete farm.
+- You can restore only one web application at a time by using the procedures in this article. However, you can restore all the web applications in the farm at the same time by restoring the complete farm.
- If a web application uses the object cache, you must manually configure two special user accounts for the web application after you restore the web application. For more information about the object cache and how to configure these user accounts, see [Configure object cache user accounts in SharePoint Server](configure-object-cache-user-accounts.md).
-- You cannot use SQL Server tools to restore a web application.
+- You can't use SQL Server tools to restore a web application.
- When you restore a web application that is configured to use claims-based authentication, there are additional steps that you must follow after restoring the web application to restore claims-based authentication.
You can use PowerShell to restore a web application manually or as part of a scr
- **db_owner** fixed database role on all databases that are to be updated.
- - Administrators group on the server on which you are running the PowerShell cmdlets.
+ - Administrators group on the server on which you're running the PowerShell cmdlets.
An administrator can use the **Add-SPShellAdmin** cmdlet to grant permissions to use SharePoint Server cmdlets.

> [!NOTE]
- > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
+ > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For more information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
2. Start the SharePoint Management Shell.
- _\<GUID\>_ is the identifier of the backup to use for the restore operation.
- If you do not specify the value of the `BackupID` parameter, the most recent backup will be used. You cannot restore a web application by using a configuration-only backup. You can view the backups for the farm by typing the following:
+ If you don't specify the value of the `BackupID` parameter, the most recent backup will be used. You can't restore a web application by using a configuration-only backup. You can view the backups for the farm by typing the following:
```powershell
Get-SPBackupHistory -Directory <BackupFolderName> -ShowBackup
```
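As a sketch, restoring a single web application (rather than the complete farm) passes the backup component path with `-Item`; the share path, item path, and web application name below are assumptions to adapt to your own farm and backup set:

```powershell
# Restore one web application from the farm backup set.
Restore-SPFarm -Directory \\backupserver\backups -RestoreMethod Overwrite -Item "Farm\Microsoft SharePoint Foundation Web Application\SharePoint - 80"
```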
You can use Central Administration to restore a web application.
## Using SQL Server tools to restore databases associated with a web application in SharePoint Server <a name="proc3"> </a>
-You cannot restore the complete web application by using SQL Server tools. However, you can restore all the databases that are associated with the web application. To restore the complete web application, use either PowerShell or Central Administration.
+You can't restore the complete web application by using SQL Server tools. However, you can restore all the databases that are associated with the web application. To restore the complete web application, use either PowerShell or Central Administration.
**To restore databases associated with a web application by using SQL Server tools**

1. Verify that the user account performing this procedure is a member of the **sysadmin** fixed server role.
-2. If the SharePoint Timer service is running, stop the service and wait for several minutes for any currently running stored procedures to finish. Do not restart the service until after you restore the databases.
+2. If the SharePoint Timer service is running, stop the service and wait for several minutes for any currently running stored procedures to finish. Don't restart the service until after you restore the databases.
3. Start SQL Server Management Studio and connect to the database server.
5. Right-click the database that you want to restore, point to **Tasks**, point to **Restore**, and then click **Database**.
- The database is automatically taken offline during the recovery operation and cannot be accessed by other processes.
+ The database is automatically taken offline during the recovery operation and can't be accessed by other processes.
6. In the **Restore Database** dialog, specify the destination and the source, and then select the backup set or sets that you want to restore.
7. In the **Select a page** pane, click **Options**.
-8. In the **Restore options** section, select only **Overwrite the existing database**. Unless the environment or policies require otherwise, do not select the other options in this section.
+8. In the **Restore options** section, select only **Overwrite the existing database**. Unless the environment or policies require otherwise, don't select the other options in this section.
9. In the **Recovery state** section:
10. Click **OK** to complete the recovery operation.
-11. Repeat steps 4 through 10 for each database that you are restoring.
+11. Repeat steps 4 through 10 for each database that you're restoring.
12. Start the Windows SharePoint Services Timer service.
After you restore a web application that uses forms-based authentication, you mu
## Additional steps to remove duplicate claims providers after restoring a web application that uses claims-based authentication in SharePoint Server <a name="Claims"> </a>
-After a web application that is configured to use claims-based authentication is restored, duplicate or additional claims providers are often visible. You must use the following process to remove the duplicate providers:
+After a web application that is configured to use claims-based authentication is restored, duplicate or additional claims providers are often visible. Use the following process to remove the duplicate providers:
1. In Central Administration, click **Manage Web application**, select a web application that uses claims-based authentication, and then click **Authentication Providers**.
3. Repeat for each zone, and then for each web application that uses claims-based authentication.
-## Additional steps to re-configure object cache user accounts in SharePoint Server
+## Additional steps to reconfigure object cache user accounts in SharePoint Server
<a name="cache"> </a>
-If you configured object cache user accounts for the web application, the restore process will not restore these settings. You must re-configure the settings for the web application. For more information, see [Configure object cache user accounts in SharePoint Server](configure-object-cache-user-accounts.md).
+If you configured object cache user accounts for the web application, the restore process won't restore these settings. You must reconfigure the settings for the web application. For more information, see [Configure object cache user accounts in SharePoint Server](configure-object-cache-user-accounts.md).
## See also <a name="cache"> </a>
SharePoint Search Engine Optimization Seo https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/search-engine-optimization-seo.md
description: "Learn about Search Engine Optimization (SEO) in SharePoint Server
[!INCLUDE[appliesto-xxx-2016-xxx-xxx-xxx-md](../includes/appliesto-xxx-2016-xxx-xxx-xxx-md.md)]
-If you are a website owner, you know how important it is that users can easily find your website by using Internet search engines such as Bing or Google. The higher your website is shown in the search results list, the more likely it is that users will click on it. Just think of your own behavior when looking at search results. When was the last time that you clicked to view the second page of search results?
+If you're a website owner, you know how important it is that users can easily find your website by using Internet search engines such as Bing or Google. The higher your website appears in the search results list, the more likely it is that users will click it. Just think of your own behavior when looking at search results. When was the last time that you clicked to view the second page of search results?
## Optimizing SharePoint Server 2016 websites for Internet search engines
SharePoint Stage 12 Plan To Use Refiners For Faceted Navigation Inpart I https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/stage-12-plan-to-use-refiners-for-faceted-navigation-inpart-i.md
Previous stages of this series explained the following:
- How to upload and add display templates to a CWSP ([Stage 11: Upload and apply display templates to the Content Search Web Part in SharePoint Server](stage-11-upload-and-apply-display-templates-to-the-content-search-web-part.md)).
-After completing these stages, our Contoso site is starting to look good, but now it's time to to make sure that that our site visitors can find the exact product they are looking for, quickly and easily. This article focuses on what we must plan before we begin to configure refiners.
+After completing these stages, our Contoso site is starting to look good, but now it's time to make sure that our site visitors can find the exact product they're looking for, quickly and easily. This article focuses on what we must plan before we begin to configure refiners.
In this article, you'll learn:
In this article, you'll learn:
### How refiners helped plan a trip to Japan <a name="BKMK_HowRefinersHelpedPlanATripToJapan"> </a>
-Although the term "refiners" might sound new to you, there is a high chance you've already used them, and much more often than you think. For example, if you've ever bought a book online, most likely refiners were there to help locate just the right book.
+Although the term "refiners" might sound new to you, there's a high chance you've already used them, and much more often than you think. For example, if you've ever bought a book online, most likely refiners were there to help locate just the right book.
-Let's consider a hypothetical situation in which you are planning a trip to Japan. You visit your favorite online bookstore in search of a travel guide for your trip. You type "Japan" and, pages of search results are returned. Trawling through page after page of search results does not seem like much fun. Luckily, the site designers have provided a way to narrow the search results. On the left side of the page is a "Categories" list, which contains entries such as "Cooking," "Geography," "History," and "Travel." You click "Travel" and in an instant, the search results show only travel books that contains the word "Japan." But, turns out there are many travel books out there on Japan. Therefore, you have to trim the results additionally. As it happens, you want a paperback version. So, still focusing on the lists on the left side of the page, you spot a category called "Format" that contains terms like "Hardcover," "PDF," "Audio," Digital," and "Paperback." So you click "Paperback" and received what you've been after: results for travel books about Japan in paperback! Unfortunately, the number of search results is still too large. Therefore, you continue to use the category lists on the left side of the page until you've drilled right down to five hopeful candidates, one of which makes it over the finish line and straight into your shopping cart.
+Let's consider a hypothetical situation in which you're planning a trip to Japan. You visit your favorite online bookstore in search of a travel guide for your trip. You type "Japan," and pages of search results are returned. Trawling through page after page of search results doesn't seem like much fun. Luckily, the site designers have provided a way to narrow the search results. On the left side of the page is a "Categories" list, which contains entries such as "Cooking," "Geography," "History," and "Travel." You click "Travel" and in an instant, the search results show only travel books that contain the word "Japan." But it turns out there are many travel books out there on Japan, so you have to narrow the results further. As it happens, you want a paperback version. So, still focusing on the lists on the left side of the page, you spot a category called "Format" that contains terms like "Hardcover," "PDF," "Audio," "Digital," and "Paperback." You click "Paperback" and receive what you've been after: results for travel books about Japan in paperback! Unfortunately, the number of search results is still too large. Therefore, you continue to use the category lists on the left side of the page until you've drilled right down to five hopeful candidates, one of which makes it over the finish line and straight into your shopping cart.
Now, here's the techy part: when you were clicking "Travel" and "Paperback" you were, in fact, using refiners. In SharePoint terms, a refiner is a managed property that has been enabled as a refiner. Refiner values are the values of a managed property that has been enabled as a refiner. So in the online shopping trip, "Categories" and "Format" were **refiners**, whereas "Travel" and "Paperback" were **refiner values**.
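To make the refiner/refiner-value distinction concrete, here's a minimal sketch (plain Python, not the SharePoint API) of how selecting a refiner value narrows a result set. The book data and property names are illustrative assumptions:

```python
# Each result carries managed-property values; selecting a refiner value keeps
# only the results whose property matches that value.
results = [
    {"title": "Japan Travel Guide", "Categories": "Travel", "Format": "Paperback"},
    {"title": "A History of Japan", "Categories": "History", "Format": "Hardcover"},
    {"title": "Japan on Foot", "Categories": "Travel", "Format": "PDF"},
]

def refine(results, refiner, value):
    """Keep only results whose refiner (a managed property) has the chosen value."""
    return [r for r in results if r.get(refiner) == value]

travel = refine(results, "Categories", "Travel")    # refiner: Categories, value: Travel
paperback = refine(travel, "Format", "Paperback")   # refiner: Format, value: Paperback
print([r["title"] for r in paperback])              # ['Japan Travel Guide']
```

Each click on a refiner value simply applies another filter on top of the previous result set, which is why the list keeps shrinking as you refine.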
-Earlier in this series, the article [From site column to managed property - What's up with that?](from-site-column-to-managed-propertywhat-s-up-with-that.md) explained how a site column is represented as a managed property after they were crawled. For example, in our Contoso catalog we have a site column named "Contoso Color." For each item in the catalog, we have added a color value, such as red, green, or blue, to this column. To enable our site visitors to narrow search results quickly, (for example, to a particular color), we must enable the managed property that represents the "Contoso Color" site column as a refiner. There is, of course, more to it than that, and we'll show you all these steps in later articles.
+Earlier in this series, the article [From site column to managed property - What's up with that?](from-site-column-to-managed-propertywhat-s-up-with-that.md) explained how a site column is represented as a managed property after it is crawled. For example, in our Contoso catalog we have a site column named "Contoso Color." For each item in the catalog, we have added a color value, such as red, green, or blue, to this column. To enable our site visitors to narrow search results quickly (for example, to a particular color), we must enable the managed property that represents the "Contoso Color" site column as a refiner. There is, of course, more to it than that, and we'll show you all these steps in later articles.
### About refiner types <a name="BKMK_AboutRefinerTypes"> </a>
There are two types of refiners: stand-alone refiners and refiners for faceted navigation.
Going back to the scenario about finding a travel book about Japan, we used stand-alone refiners. Stand-alone refiners are usually applied in scenarios where you have unstructured content, and where the refiners can be applied across all content. These refiners are often used on a search results page to narrow search results.
-Now, you might be thinking that stand-alone refiners seem like an excellent refiner type to use on a search results page. But what about a scenario like our Contoso site, where we show catalog content? On our Contoso site, visitors want to browse the catalog to find what they are looking for. Therefore, they won't enter any words into a search box. Well, remember that we are using a Content Search Web Part to display content on our search page, as explained in [Stage 9: Configure the query in a Content Search Web Part on a category page in SharePoint Server](stage-9-configure-the-query-in-a-content-search-web-part-on-a-category-page.md). Because the Content Search Web Part is using search technology to display search results, we can use refiners to narrow search results that are displayed in the Content Search Web Part.
+Now, you might be thinking that stand-alone refiners seem like an excellent refiner type to use on a search results page. But what about a scenario like our Contoso site, where we show catalog content? On our Contoso site, visitors want to browse the catalog to find what they're looking for. Therefore, they won't enter any words into a search box. Well, remember that we're using a Content Search Web Part to display content on our search page, as explained in [Stage 9: Configure the query in a Content Search Web Part on a category page in SharePoint Server](stage-9-configure-the-query-in-a-content-search-web-part-on-a-category-page.md). Because the Content Search Web Part is using search technology to display search results, we can use refiners to narrow search results that are displayed in the Content Search Web Part.
-So we can use refiners on our category page. That's nice, but again you might be thinking: What if we want to use different refiners for different categories? For example, on the "Televisions" category, we want visitors to be able to refine on "Screen Size." On the "Air conditioners" category it doesn't make sense to have a refiner called "Screen Size," so we do not want to display it there. But, we do want visitors to be able to refine on "Installation type." In addition, there are some common refiners that should be applied to all categories, such as "Brand."
+So we can use refiners on our category page. That's nice, but again you might be thinking: What if we want to use different refiners for different categories? For example, on the "Televisions" category, we want visitors to be able to refine on "Screen Size." On the "Air conditioners" category it doesn't make sense to have a refiner called "Screen Size," so we don't want to display it there. But, we do want visitors to be able to refine on "Installation type." In addition, there are some common refiners that should be applied to all categories, such as "Brand."
Well, you might have guessed it. You can achieve all of this with **refiners for faceted navigation**.
Of these 14 properties, only three represent information that can be applied to
Adding refiners for faceted navigation is performed on the tagging term set on the authoring side. When specifying which refiners to use in which category, it's helpful to use the tagging term set as a guide. In our Contoso scenario, this is the **Product Hierarchy** term set.
-By default, all children of a term inherit refiners that are added to a parent term. For example, a refiner that is added to the term "Cameras" will be applied to all its children, such as "Camcorders," "Camera accessories," and so on You can override this inheritance to add or remove refiners for a child category. Later in this series, we'll look at how this is done. For now, let's concentrate on the planning part.
+By default, all children of a term inherit refiners that are added to a parent term. For example, a refiner that is added to the term "Cameras" will be applied to all its children, such as "Camcorders," "Camera accessories," and so on. You can override this inheritance to add or remove refiners for a child category. Later in this series, we'll look at how this is done. For now, let's concentrate on the planning part.
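The inherit-with-override behavior can be sketched as follows. This is a hypothetical model, not a SharePoint API: the term structure, refiner names, and the `refiners_add`/`refiners_remove` override fields are illustrative assumptions:

```python
# Children inherit the parent's refiners unless an override adds or removes some.
TERM_SET = {
    "Cameras": {"refiners": ["Brand", "Megapixels"], "children": {
        "Camcorders": {"refiners_add": ["Optical zoom"], "children": {}},
        "Camera accessories": {"refiners_remove": ["Megapixels"], "children": {}},
    }},
}

def effective_refiners(node, inherited=()):
    """Combine inherited refiners with this term's own, then apply overrides."""
    refiners = list(inherited) + list(node.get("refiners", []))
    refiners += node.get("refiners_add", [])
    return [r for r in refiners if r not in node.get("refiners_remove", [])]

cameras = TERM_SET["Cameras"]
print(effective_refiners(cameras))  # ['Brand', 'Megapixels']
# "Camcorders" inherits Brand and Megapixels, and adds Optical zoom:
print(effective_refiners(cameras["children"]["Camcorders"],
                         effective_refiners(cameras)))
```

The point of the model: you only plan overrides where a category genuinely differs ("Screen Size" for televisions), and let inheritance carry the common refiners such as "Brand."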
To save space, we'll not look at the complete faceted navigation structure of our Contoso site. Instead, the following table lists the refiners we want to use for the categories "Audio" and "Cameras," and to which term we want to assign them.
To save space, we'll not look at the complete faceted navigation structure of ou
Now that we've identified which refiners to use, the next thing to consider is how we want the refiner values to be displayed.
-Basically, there are two ways refiner values can be displayed: as a list, or grouped in intervals. For refiner values that use the data type Text, Person or Group, Choice or Yes/No, there's not really much to consider. These refiner values will most likely always be displayed in a list as shown in the screen shot below.
+Basically, there are two ways refiner values can be displayed: as a list, or grouped in intervals. For refiner values that use the data type Text, Person or Group, Choice or Yes/No, there's not much to consider. These refiner values will most likely always be displayed in a list as shown in the screenshot below.
![Brand Refiner List](../media/OTCSP_BrandRefinerList.png)
-But, for numeric refiner values, the story is somewhat different. For example, consider all the refiner values for a refiner such as Price. In our Contoso scenario, it could overwhelm our visitors with information and would therefore not be of much us.
+But, for numeric refiner values, the story is different. For example, consider all the refiner values for a refiner such as Price. In our Contoso scenario, listing every value could overwhelm our visitors with information and would therefore not be of much use.
![Price Refiner List](../media/OTCSP_PriceRefinerList.png)
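As a rough sketch of why intervals help, the following groups raw numeric refiner values into a handful of buckets. The thresholds and prices are made-up illustration, not Contoso data:

```python
from bisect import bisect_right
from collections import Counter

def price_intervals(prices, thresholds=(100, 500, 1000)):
    """Group prices into buckets like 'Less than 100', '100-500', ..., 'More than 1000'."""
    labels = [f"Less than {thresholds[0]}"]
    labels += [f"{lo}-{hi}" for lo, hi in zip(thresholds, thresholds[1:])]
    labels.append(f"More than {thresholds[-1]}")
    # bisect_right finds which interval each price falls into.
    return Counter(labels[bisect_right(thresholds, p)] for p in prices)

print(price_intervals([49, 120, 699, 1500, 80]))
```

Instead of one refiner value per distinct price, visitors see a few interval refiner values with counts, which is the grouped display discussed above.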
SharePoint Stage 16 Add A Taxonomy Refinement Panel Web Part To A Publishing Site https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/stage-16-add-a-taxonomy-refinement-panel-web-part-to-a-publishing-site.md
description: "Learn how to add a taxonomy refinement panel web part to a publish
## Quick overview
-[Stage 15: Add refiners for faceted navigation to a publishing site in SharePoint Server](stage-15-add-refiners-for-faceted-navigation-to-a-publishing-site.md) explained how to add refiners for faceted navigation to a publishing site. By using category-specific refiners, visitors can easily find the product they are looking for. But, visitors can't easily see the different subcategories inside a particular category.
+[Stage 15: Add refiners for faceted navigation to a publishing site in SharePoint Server](stage-15-add-refiners-for-faceted-navigation-to-a-publishing-site.md) explained how to add refiners for faceted navigation to a publishing site. By using category-specific refiners, visitors can easily find the product they're looking for. But, visitors can't easily see the different subcategories inside a particular category.
In this article, you'll learn:
## Start stage 16
-Throughout this series, when a new feature was introduced, we started by explaining the feature, and then went on to the procedures for using it. In this stage, we'll flip the sequence, because it will be easier to explain what's going on by using screen shots of the Web Part.
+Throughout this series, when a new feature was introduced, we started by explaining the feature, and then went on to the procedures for using it. In this stage, we'll flip the sequence, because it will be easier to explain what's going on by using screenshots of the Web Part.
### How to add a Taxonomy Refinement Panel Web Part to a page <a name="BKMK_HowToAddATaxonomyRefinementPanelWebPartToAPage"> </a>
Browse to the page where you want to add the Web Part. In our scenario, let's br
5. Save the page.
-That's all there is to it! Without having to do any configuration, the sub-categories under "Cameras" are displayed. Also notice that refiner counts are automatically displayed.
+That's all there is to it! Without having to do any configuration, the subcategories under "Cameras" are displayed. Also notice that refiner counts are automatically displayed.
![Web Part Added](../media/OTCSP_WebPartAdded.png)
If we browse to "Audio," the Audio subcategories are displayed with counts.
![Audio Sub Categories](../media/OTCSP_AudioSubCategories.png)
-Now let's look at how these sub-categories "magically" appear.
+Now let's look at how these subcategories "magically" appear.
### About the Taxonomy Refinement Panel Web Part <a name="BKMK_AboutTheTaxonomyRefinementPanelWebPart"> </a>
-Let's start with a definition of this Web Part: The Taxonomy Refinement Panel Web Part filters search results from an associated Search Web Part, which show refiners based on the current navigation term. For example, in our case the Web Part showed the sub-categories of "Audio" and sub-categories of "Computer".
+Let's start with a definition of this Web Part: The Taxonomy Refinement Panel Web Part filters search results from an associated Search Web Part, showing refiners based on the current navigation term. For example, in our case, the Web Part showed the subcategories of "Audio" and subcategories of "Computer".
For the Taxonomy Refinement Web Part to work correctly, there are two conditions that have to be considered:
-1. The Taxonomy Refinement Web Part must be associated with another Search Web Part on the page that it is added to.
+1. The Taxonomy Refinement Web Part must be associated with another Search Web Part on the page that it's added to.
2. The Taxonomy Refinement Web Part must be associated with the managed property that represents the managed navigation of the site.
Let's start with the first condition.
Unlike the Content Search Web Part, the Taxonomy Refinement Web Part doesn't contain a query. Because it doesn't query for content, it has to receive search results from elsewhere to display content.
-In the following screen shot, the Taxonomy Refinement Panel Web Part is shown in the default edit mode. In the Web Part Tool Pane, in the **Query** section, **Refinement Target** is set to **Content Search - Default**.
+In the following screenshot, the Taxonomy Refinement Panel Web Part is shown in the default edit mode. In the Web Part Tool Pane, in the **Query** section, **Refinement Target** is set to **Content Search - Default**.
![Refinement Target](../media/OTCSP_RefinementTarget.png)
That's very cool, but what makes the Taxonomy Refinement Panel Web Part even coo
![Cameras Fabrikam Refiner](../media/OTCSP_CamerasFabrikam.png)
-To visitors, this makes browsing for products really convenient, because they can immediately see which sub-categories have *Fabrikam* camera products, without having to click back and forth.
+To visitors, this makes browsing for products convenient, because they can immediately see which subcategories have *Fabrikam* camera products, without having to click back and forth.
-So, that was all for this series. If you are setting up your own site, we hope you'll make good use of the features that are described in this series.
+So, that was all for this series. If you're setting up your own site, we hope you'll make good use of the features that are described in this series.
SharePoint Stage 2 Import List Content Into The Product Catalog Site Collection https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/stage-2-import-list-content-into-the-product-catalog-site-collection.md
description: "Learn how to import list content into the Product Catalog Site Col
## Start stage 2
-After we've set up our Product Catalog Site Collection, as described in [Stage 1: Create site collections for cross-site publishing in SharePoint Server](stage-1-create-site-collections-for-cross-site-publishing.md), we can import content into this site collection. To do this, we'll use PowerShell scripts. But before we start, let's take a look at what is automatically created in a **Product Catalog Site Collection**.
+After we've set up our Product Catalog Site Collection, as described in [Stage 1: Create site collections for cross-site publishing in SharePoint Server](stage-1-create-site-collections-for-cross-site-publishing.md), we can import content into this site collection. To do this, use PowerShell scripts. But before we start, let's take a look at what is automatically created in a **Product Catalog Site Collection**.
In our newly created **Product Catalog Site Collection**, we can see a default list template named **Products**.
The **Item Category** site column is associated with the term set named **Produc
![Product Hierarchy](../media/OTCSP_ProductHierarchy.PNG)
-To import list content into the **Products** list, we'll use PowerShell scripts that will:
+To import list content into the **Products** list, use PowerShell scripts that will:
- Add content to the **Products** list.
To import list content into the **Products** list, we'll use PowerShell scripts
- Associate each item in the **Products** list with the correct term from the **Product Hierarchy** term set, and display this in the **Item Category** column in the **Products** list.
-Before we can run the PowerShell scripts, we'll have to prepare the following:
+Before we can run the PowerShell scripts, we have to prepare the following:
- A list of the site columns we want to add to the **Products** list.
The PowerShell scripts, instructions on how to create the tab delimited text fil
After we have run the five PowerShell scripts, we get the following: -- List content in the **Products** list. In our scenario, each list item is a product that Contoso want to display on their website.
+- List content in the **Products** list. In our scenario, each list item is a product that Contoso wants to display on their website.
- Terms in the **Product Hierarchy** term set. In our scenario, the term set reflects how Contoso has categorized their products; for example, one category is called "Laptops" and another "MP3 players." -- In the **Products** list, content in the **Item Category** column is associated with the correct term from the **Product Hierarchy** term set. The following screen shot shows how the list item *Southridge Video Laptop15.4W M1548* is associated with the term *Laptops* through the **Item Category** column.
+- In the **Products** list, content in the **Item Category** column is associated with the correct term from the **Product Hierarchy** term set. The following screenshot shows how the list item *Southridge Video Laptop15.4W M1548* is associated with the term *Laptops* through the **Item Category** column.
![Item Term Connection](../media/OTCSP_ItemTermConnection.PNG)
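Conceptually, the import scripts read tab-delimited product data and associate each row's **Item Category** with a matching term from the **Product Hierarchy** term set. The following is a minimal sketch of that idea in Python — not the actual PowerShell scripts, and the file contents are illustrative:

```python
import csv
import io

# Hypothetical subset of the Product Hierarchy term set.
TERM_SET = {"Laptops", "MP3 players"}

# Hypothetical tab-delimited input (the real scripts read files from disk).
data = "Title\tItem Category\nSouthridge Video Laptop15.4W M1548\tLaptops\n"

products = []
for row in csv.DictReader(io.StringIO(data), delimiter="\t"):
    category = row["Item Category"]
    if category not in TERM_SET:
        raise ValueError(f"No matching term for {category!r}")
    products.append({"Title": row["Title"], "Item Category": category})

print(products[0]["Item Category"])  # Laptops
```

The key step is the lookup against the term set: every list item ends up tagged with a real term, which is what later makes catalog navigation and refiners work.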
So, now that we have content in the **Products** list, the next task is to enabl
#### Concepts
-[Configure cross-site publishing in SharePoint Server](configure-cross-site-publishing.md)
+[Configure cross-site publishing in SharePoint Server](configure-cross-site-publishing.md)
SharePoint Stage 3 How To Enable A List As A Catalog https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/stage-3-how-to-enable-a-list-as-a-catalog.md
description: Learn how to enable a list as a catalog in SharePoint Server 2016.
## Quick overview
-As described in as described in [Stage 2: Import list content into the Product Catalog Site Collection in SharePoint Server](stage-2-import-list-content-into-the-product-catalog-site-collection.md), we've imported content about Contoso's product line into the **Products** list. To display this product information in our Publishing Portal (the Contoso website), we must enable the **Products** list as a catalog.
+As described in [Stage 2: Import list content into the Product Catalog Site Collection in SharePoint Server](stage-2-import-list-content-into-the-product-catalog-site-collection.md), we've imported content about Contoso's product line into the **Products** list. To display this product information in our Publishing Portal (the Contoso website), we must enable the **Products** list as a catalog.
## Start stage 3
By selecting this, we confirm that content from the **Products** list should be
![Anonymous Access](../media/OTCSP_AnonymousAccess.PNG)
-By doing this, we grant anonymous visitors, that is, visitors who are not logged on to Contoso's website, access to view content from this list.
+By doing this, we grant anonymous visitors, that is, visitors who aren't logged on to Contoso's website, access to view content from this list.
-Note that we're **not** granting visitors access to the list itself. All we're doing is granting anonymous visitors access to **view** the catalog content from the search index. Anonymous visitors will never be able to see the actual **Products** list.
+We're **not** granting visitors access to the list itself. All we're doing is granting anonymous visitors access to **view** the catalog content from the search index. Anonymous visitors will never be able to see the actual **Products** list.
![Anonymous Access Image](../media/OTCSP_AnonymousAccessImage.gif)
The terms from the **Product Hierarchy** term set will also be used to create a
![Item URL](../media/OTCSP_ItemURL.PNG)
-The URL to a specific product will be composed of the terms that we specify in **Navigation Hierarchy** (previous step), *and* the values from the fields we specify as **Catalog Item URL Fields**. When selecting these fields, we should use at least one field that contains a product unique value, because we want to use this unique value in the product URL. By doing this, the URL to the product *Fabricam Home Movimaker M300* will differ from the URL to the product *Fabricam Home Movimaker M400* .
+The URL to a specific product will be composed of the terms that we specify in **Navigation Hierarchy** (previous step), *and* the values from the fields we specify as **Catalog Item URL Fields**. When selecting these fields, we should use at least one field that contains a product unique value, because we want to use this unique value in the product URL. By doing this, the URL to the product *Fabricam Home Movimaker M300* will differ from the URL to the product *Fabricam Home Movimaker M400*.
For Contoso, the unique identifier of a product is the value in the **Item Number** column. We also want to use the value of the **Group Number** column. Therefore, we'll add them both. Later in this series, we'll explain why we also want to use **Group Number**.
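The URL composition described above can be sketched as follows. The helper function and the sample term path, item number, and group number are illustrative assumptions, not how SharePoint internally builds the URL:

```python
def item_url(site, nav_terms, item_number, group_number):
    """Compose a friendly catalog item URL from navigation terms plus the
    Catalog Item URL Fields (here: Group Number, then Item Number)."""
    path = "/".join(t.lower().replace(" ", "-") for t in nav_terms)
    return f"{site}/{path}/{group_number}/{item_number}"

url = item_url("https://www.contoso.com", ["Cameras", "Camcorders"], "1010101", "2020")
print(url)  # https://www.contoso.com/cameras/camcorders/2020/1010101
```

Because **Item Number** is unique per product, two products in the same category still get distinct URLs, which is exactly why at least one unique field must be included.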
SharePoint Stage 9 Configure The Query In A Content Search Web Part On A Category Page https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/stage-9-configure-the-query-in-a-content-search-web-part-on-a-category-page.md
In our scenario, we'll add a CSWP to Zone 3.
![CSWP Added](../media/OTCSP_CSWPadded.png)
-The CSWP contains a default query. Therefore, it already displays some content (Audio, Cameras and Computers). But it does not display the content we want to display. To make the Web Part display Contoso catalog content, we must configure the query in the Web Part.
+The CSWP contains a default query. Therefore, it already displays some content (Audio, Cameras, and Computers). But it doesn't display the content we want to display. To make the Web Part display Contoso catalog content, we must configure the query in the Web Part.
### How to configure a query in a Content Search Web Part on a category page <a name="BKMK_HowToConfigureAQueryInAContentSearchWebPartOnACategoryPage"> </a>
A key phrase in this selection is *navigation terms*. This refers to the categor
Remember, one of the first things we did in this series was import catalog content into a list. We also imported terms into the term set **Product Hierarchy**. In [Stage 2: Import list content into the Product Catalog Site Collection in SharePoint Server](stage-2-import-list-content-into-the-product-catalog-site-collection.md), we associated each item in the list with a term from the term set. In [Stage 5: Connect your publishing site to a catalog in SharePoint Server](stage-5-connect-your-publishing-site-to-a-catalog.md), we specified that the full site navigation should contain terms from the **Product Hierarchy** term set. Because we have used *the same term set* to tag the items in our catalog and to build our site navigation, we can use a term from our site navigation to search for catalog items that are tagged with that same term.
-Our query in the CSWP will therefore display search results for items that are in the *catalog - Products Results result source* , and are tagged with either "Audio", or any child of "Audio", for example "MP3 players" or "Speakers".
+Our query in the CSWP will therefore display search results for items that are in the *catalog - Products Results* result source, and are tagged with either "Audio", or any child of "Audio", for example "MP3 players" or "Speakers".
This selection reduced the relevant search results to 114, which is the number of items in our catalog that belong to the "Audio" group.
-Another key phrase from the selection **Restrict by current and child navigation** terms is *current* . More information about the importance of this phrase is provided in [About the query configuration](stage-9-configure-the-query-in-a-content-search-web-part-on-a-category-page.md#BKMK_AboutTheQueryConfiguration) in the next section
+Another key phrase from the selection **Restrict by current and child navigation terms** is *current*. More information about the importance of this phrase is provided in [About the query configuration](stage-9-configure-the-query-in-a-content-search-web-part-on-a-category-page.md#BKMK_AboutTheQueryConfiguration) in the next section.
5. Select **OK**, and save the page.
If we browse to the "MP3" category, we'll see three other different search resul
![MP3 Results](../media/OTCSP_MP3Results.png)
-If you are now thinking "OK, I understand how we got the correct search results for the "Audio" category, because that is the category we clicked, and where we changed the query in the Web Part. But why do we see different search results when we browse the catalog? And shouldn't we change the query for all the other categories also?"
+If you're now thinking, "OK, I understand how we got the correct search results for the 'Audio' category, because that's the category we clicked, and where we changed the query in the Web Part. But why do we see different search results when we browse the catalog? And shouldn't we change the query for all the other categories also?"
Let's take a closer look at what's going on.
Let's take a closer look at what's going on.
We only had to configure one query because the same page is used for all categories. Remember, in [Stage 8: Assign a category page and a catalog item page to a term in SharePoint Server](stage-8-assign-a-category-page-and-a-catalog-item-page-to-a-term.md), when we assigned the page *ContosoCategoryPage.aspx* to all terms within the **Site Navigation** term set. We assigned this page to *all terms*. Therefore, even though we edited this page in the "Audio" category, we could have edited it in any other category, and achieved the same result.
-We only had to configure the query one time, because the query issued from the Web Part differs depending on which category we browse to. Remember that the CSWP contains a query that is automatically issued when someone browses to a page that contains a CSWP, and that search results are displayed in the Web Part. Also, remember that we selected **Restrict by current and child navigation terms** when we configured the query in the Web Part. The word "current" is very important here, because it means that the query issued by the CSWP will change depending on the category the visitor is currently browsing. If you edit the Web Part from another category, you can see that the Web Part has changed.
+We only had to configure the query one time, because the query issued from the Web Part differs depending on which category we browse to. Remember that the CSWP contains a query that is automatically issued when someone browses to a page that contains a CSWP, and that search results are displayed in the Web Part. Also, remember that we selected **Restrict by current and child navigation terms** when we configured the query in the Web Part. The word "current" is important here, because it means that the query issued by the CSWP will change depending on the category the visitor is currently browsing. If you edit the Web Part from another category, you can see that the Web Part has changed.
For example, if we browse to the "Cameras" category and take a closer look at the CSWP, we see that:
So, when we browse to the "Audio" category, the CSWP issues a query for catalog
### How to view details of the query configuration <a name="BKMK_HowToViewDetailsOfTheQueryConfiguration"> </a>
-To view details of the query configuration, click on the **TEST** tab. The actual query issued by the CSWP, is shown in the **Query text** field.
+To view details of the query configuration, click on the **TEST** tab. The actual query issued by the CSWP is shown in the **Query text** field.
![TEST 2](../media/OTCSP_TEST2.png)
In our scenario, the query that is issued by the CSWP from the "Audio" category
`(contentclass:sts_listitem OR IsDocument:True) SPSiteUrl:http://contoso/sites/catalog ListId:3a3f66cd-9741-4f15-b53a-b4b23c3187ea owstaxidProductCatalogItemCategory:#c771504f-6a2f-423f-98de-0e12fcfa08c9`
-If this doesn't make any sense now, don't worry! There is logic to it, and we'll break it down to make it clearer.
+If this doesn't make any sense now, don't worry! There's logic to it, and we'll break it down to make it clearer.
- `(contentclass:sts_listitem OR IsDocument:True) SPSiteUrl:http://contoso/sites/catalog ListId:3a3f66cd-9741-4f15-b53a-b4b23c3187ea` is our catalog result source, *catalog - Products Results*
In our Product catalog site collection, in the **Product Hierarchy** term set, y
So now we have configured the query for the CSWP on our category page. We still have to do some configuration to make it display more than three search results, and also give it a "Contoso look." This will be explained later in this series.
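The query breakdown above can be sketched by recomposing the query text from its two parts: the catalog result source restriction and the filter on the navigation term's GUID. The GUIDs are the ones shown in this example; the helper function is illustrative, not a SharePoint API:

```python
def cswp_query(site_url, list_id, term_guid):
    """Compose the query text a CSWP issues for a given navigation term."""
    result_source = (f"(contentclass:sts_listitem OR IsDocument:True) "
                     f"SPSiteUrl:{site_url} ListId:{list_id}")
    term_filter = f"owstaxidProductCatalogItemCategory:#{term_guid}"
    return f"{result_source} {term_filter}"

q = cswp_query("http://contoso/sites/catalog",
               "3a3f66cd-9741-4f15-b53a-b4b23c3187ea",
               "c771504f-6a2f-423f-98de-0e12fcfa08c9")
print(q)
```

Only the term GUID changes as visitors browse from category to category, which is why a single query configuration serves every category page.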
-The next step is to add a CSWP to our catalog item page
+The next step is to add a CSWP to our catalog item page.
#### Next article in this series
SharePoint Troubleshoot Common Fine Grained Permissions Issues https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/troubleshoot-common-fine-grained-permissions-issues.md
After you implement fine-grained permission, a SharePoint environment could expe
## Recommended issues and resolutions for common fine-grained permissions performance issues <a name="fgpperfissuesolutions"> </a>
-The following issues can help reduce the effect of performance issues that are related to the extensive use of fine-grained permissions. Each of the following issues covers changes to the environment security, object hierarchy or custom code that is contributing to the performance issues that are related to fine-grained permissions. Each issue starts with the following example environment where a single web contains multiple document libraries, each with many uniquely-permissioned child objects.
+The following resolutions can help reduce the effect of performance issues that are related to the extensive use of fine-grained permissions. Each of the following resolutions covers changes to the environment security, object hierarchy, or custom code that is contributing to the performance issues that are related to fine-grained permissions. Each resolution starts with the following example environment where a single web contains multiple document libraries, each with many uniquely permissioned child objects.
![Illustrates a single web that contains multiple document libraries, each with many uniquely permissioned child objects.](../media/FGP_Image3.jpg) ### Issue 1: Remove fine-grained permissions and use security enforcement only at web level <a name="sol1"> </a>
-To re-architect the environment so that it no longer requires fine-grained permissions, an environment cleanup process can be implemented, and then the number of scoped items can be adjusted to improve the scalability of the environment over the longer term. The following recommendations describe the environment cleanup and architectural security changes that are required to achieve this solution.
+To rearchitect the environment so that it no longer requires fine-grained permissions, an environment cleanup process can be implemented, and then the number of scoped items can be adjusted to improve the scalability of the environment over the longer term. The following recommendations describe the environment cleanup and architectural security changes that are required to achieve this solution.
#### Environmental security cleanup
After all item-level scopes are removed, individual scope memberships at the web
#### Environmental security architecture redesign
-After the existing fine-grained permissions and scopes are removed, the long-term architecture plan should be to maintain a unique scope only at the web level. The following diagram shows how this might be structured so that only the web-level scope remains. The core requirement in the architecture is not to have too many items at the same level of hierarchy in the document libraries, because the time that is required to process items in the views increases.
+After the existing fine-grained permissions and scopes are removed, the long-term architecture plan should be to maintain a unique scope only at the web level. The following diagram shows how this might be structured so that only the web-level scope remains. The core requirement in the architecture is to avoid having too many items at the same level of hierarchy in the document libraries, because the time that is required to process items in the views increases.
**Resolution:**
The maximum count of items or folders at any level in the hierarchy should be ab
![Illustrates the architecture how a Web-level scope should be structured.](../media/FGP_Image6.jpg)
-If additional changes are needed to the architecture, consider moving document libraries to different webs or site collections. The number of document libraries could also be changed to more closely support business needs and scaling recommendations that are based on the taxonomy or audience of the stored content.
+If more changes are needed to the architecture, consider moving document libraries to different webs or site collections. The number of document libraries could also be changed to more closely support business needs and scaling recommendations that are based on the taxonomy or audience of the stored content.
### Issue 2: Use fine-grained permissions by hierarchical structure changes <a name="sol2"> </a>
-To re-architect the environment so that it still uses fine-grained permissions, but without causing too many updates to or sizing of a single web scope, consider moving differently secured document libraries to different webs.
+To rearchitect the environment so that it still uses fine-grained permissions, but without causing too many updates to or sizing of a single web scope, consider moving differently secured document libraries to different webs.
#### Environment hierarchy redesign
-In the following diagram, the physical architecture was changed so that each document library is in a uniquely-permissioned web. Additionally, when item-level fine-grained permissions must be preserved, as a best practice the cumulative number of security principals who will be granted access should be limited to approximately 2,000, although this is not a fixed limit. Therefore, the effective membership of each web that includes all Limited Access members users, should be no more than approximately 2,000 users. This keeps each web-level scope from growing too large.
+In the following diagram, the physical architecture was changed so that each document library is in a uniquely permissioned web. Additionally, when item-level fine-grained permissions must be preserved, as a best practice the cumulative number of security principals who will be granted access should be limited to approximately 2,000, although this isn't a fixed limit. Therefore, the effective membership of each web, including all Limited Access member users, should be no more than approximately 2,000 users. This keeps each web-level scope from growing too large.
![Illustrates a document library that is in a uniquely permissioned web. The membership of each web should not exceed 2,000 users.](../media/FGP_Image7.jpg)
-The number of uniquely-scoped children is not a significant issue, and can scale to large numbers. However, the number of principles that will be added as limited access up the chain of scopes to the first uniquely permissioned web will be a limiting factor.
+The number of uniquely scoped children isn't a significant issue, and can scale to large numbers. However, the number of principals that will be added as limited access up the chain of scopes to the first uniquely permissioned web will be a limiting factor.
Lastly, although not specifically an issue about fine-grained permissions, the folder structure should guarantee that no single hierarchical level of the document library ever exceeds about 2,000 items. This limit can help guarantee good performance of views requested by users.

### Issue 3: Use fine-grained permissions by scope structure changes <a name="sol3"> </a>
-To re-architect the environment so that it still uses fine-grained permissions, but without causing too many updates to or sizing of a single web scope, consider using a different process of securing items. This is mainly applicable if the cause of the large number of unique scopes was an automated process such as an event handler or workflow that dynamically changed object permissions. The recommendation in this case is to make a code change to whatever process was creating the unique security scopes.
+To rearchitect the environment so that it still uses fine-grained permissions, but without causing too many updates to or sizing of a single web scope, consider using a different process of securing items. This is applicable if the cause of the large number of unique scopes was an automated process such as an event handler or workflow that dynamically changed object permissions. The recommendation in this case is to make a code change to whatever process was creating the unique security scopes.
#### Dynamic security changing code redesign
-In the following diagram, the scope architecture was changed so that scope membership does not cause ACL recalculation at the parent document library and web. As mentioned earlier, the effective membership of the web that includes all Limited Access members, should be no more than approximately 2,000 to keep the web-level scope from growing too large. In this case, by implementing a new SharePoint group to hold all members who should have Limited Access rights, the scope won't grow too large. When users are added to individual scopes under the web level by using the SharePoint Server **SPRoleAssignmentCollection.AddToCurrentScopeOnly** method, they can also be added, by additional code, to the new group that was established as having Limited Access rights at the web and document library level.
+In the following diagram, the scope architecture was changed so that scope membership doesn't cause ACL recalculation at the parent document library and web. As mentioned earlier, the effective membership of the web, which includes all Limited Access members, should be no more than approximately 2,000 to keep the web-level scope from growing too large. In this case, by implementing a new SharePoint group to hold all members who should have Limited Access rights, the scope won't grow too large. When users are added to individual scopes under the web level by using the SharePoint Server **SPRoleAssignmentCollection.AddToCurrentScopeOnly** method, they can also be added, by extra code, to the new group that was established as having Limited Access rights at the web and document library level.
![Illustrates scope membership which does not cause ACL recalculation at the parent document library and web.](../media/FGP_Image8.jpg)

**Resolution:**
-When item-level fine-grained permissions must be preserved, the cumulative number of security principals who will be granted access should be limited to about 2,000, although this is not a fixed limit. When the number of security principals increases it takes longer to recalculate the binary ACL. If the membership of a scope is changed, the binary ACL must be recalculated. The addition of users at a child item unique scope will cause parent scopes to be updated with the new Limited Access members, even if this ultimately results in no change to the parent scope membership. When this occurs, the binary ACL for the parent scopes must also be recalculated.
+When item-level fine-grained permissions must be preserved, the cumulative number of security principals who will be granted access should be limited to about 2,000, although this isn't a fixed limit. When the number of security principals increases, it takes longer to recalculate the binary ACL. If the membership of a scope is changed, the binary ACL must be recalculated. The addition of users at a child item unique scope will cause parent scopes to be updated with the new Limited Access members, even if this ultimately results in no change to the parent scope membership. When this occurs, the binary ACL for the parent scopes must also be recalculated.
+As in the previous solution, the number of uniquely scoped children isn't a significant issue, and can scale to large numbers. The number of principals that will be added as limited access up the chain of scopes to the first uniquely permissioned web will be a limiting factor.
+As in the previous solution, the number of uniquely scoped children isn't a significant issue, and can scale to large numbers. The number of principles that will be added as limited access up the chain of scopes to the first uniquely permissioned web will be a limiting factor.
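The Limited Access propagation that these resolutions work around can be sketched with a toy model. This is illustrative Python only, not SharePoint code: the `Scope` class and its counters are invented for the sketch, and real binary ACL recalculation is far more involved. It shows why granting rights at each uniquely permissioned child inflates the effective membership of every ancestor scope and triggers a recalculation there:

```python
# Illustrative model only -- not SharePoint code. It sketches why adding a
# principal to a uniquely permissioned child scope forces every ancestor
# scope's binary ACL to be recalculated (Limited Access propagation).

class Scope:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.members = set()          # principals granted rights here
        self.limited_access = set()   # principals with Limited Access only
        self.acl_recalculations = 0   # times the binary ACL was rebuilt

    def grant(self, principal):
        """Grant rights at this scope; ancestors get Limited Access entries."""
        self.members.add(principal)
        self.acl_recalculations += 1
        ancestor = self.parent
        while ancestor is not None:
            ancestor.limited_access.add(principal)
            ancestor.acl_recalculations += 1  # recalculated even if membership is unchanged
            ancestor = ancestor.parent

    def effective_membership(self):
        return len(self.members | self.limited_access)

web = Scope("web")
library = Scope("library", parent=web)
items = [Scope(f"item{i}", parent=library) for i in range(100)]

for i, item in enumerate(items):
    item.grant(f"user{i}")

print(web.effective_membership())  # 100 -- grows with every child grant
print(web.acl_recalculations)      # 100 parent-scope recalculations
```

In this toy model, routing the 100 users through one pre-created Limited Access group (the Issue 3 approach) would keep the web scope's membership at a single principal instead of 100, which is the point of the **AddToCurrentScopeOnly** technique.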
## See also <a name="fgpperfissuesolutions"> </a>
SharePoint Update A Web Application Url And Iis Bindings https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/update-a-web-application-url-and-iis-bindings.md
After you have extended a web application into a zone with a set of Internet Inf
## About updating a web application URL and IIS bindings <a name="section1"> </a>
-Unlike typical IIS applications, you cannot simply use IIS Manager or other IIS metabase tools to modify the bindings of IIS web applications that have been extended with SharePoint Server.
+Unlike typical IIS applications, you can't use IIS Manager or other IIS metabase tools to modify the bindings of IIS web applications that have been extended with SharePoint Server.
-If you modify the IIS bindings of a web application by adding a host header binding or SSL port or by changing a port number, SharePoint Server will not be aware of these changes and will not update the web application's alternate access mapping URLs. If you update the web application's alternate access mappings to change a host header, switch to an SSL URL, or change a port number, SharePoint Server will not automatically update your IIS bindings to match.
+If you modify the IIS bindings of a web application by adding a host header binding or SSL port or by changing a port number, SharePoint Server won't be aware of these changes and won't update the web application's alternate access mapping URLs. If you update the web application's alternate access mappings to change a host header, switch to an SSL URL, or change a port number, SharePoint Server won't automatically update your IIS bindings to match.
To update the URL or IIS bindings of a web application, unextend and reextend the web application and reconfigure the alternate access mapping URLs or IIS website bindings.
-We do not recommend reusing the same IIS website for your HTTP and SSL hosting. Instead, extend a dedicated HTTP and a dedicated SSL website, with each assigned to its own alternate access mapping zone and URLs.
+We don't recommend reusing the same IIS website for your HTTP and SSL hosting. Instead, extend a dedicated HTTP and a dedicated SSL website, with each assigned to its own alternate access mapping zone and URLs.
For more information about alternate access mappings, see [Plan alternate access mappings for SharePoint 2013](plan-alternate-access-mappings.md).
After you have unextended the web application, you can reextend the web applicat
4. In the **Port**, **Host Header**, and **Use Secure Sockets Layer (SSL)** fields, type the IIS bindings you want to use.
-5. In the **Load Balanced URL** section, in the **URL** field, type the URL that users will use to locate this web application. If you are using a load balancer or reverse proxy, this is the URL of the load balancer or reverse proxy.
+5. In the **Load Balanced URL** section, in the **URL** field, type the URL that users will use to locate this web application. If you're using a load balancer or reverse proxy, this is the URL of the load balancer or reverse proxy.
6. In the **Load Balanced URL** section, in the **Zone** list, click the zone that you previously selected.
To complete the process of updating a web application URL or IIS bindings, perfo
### Update the alternate access mapping URLs for the zone
-If you are using a load balancer or a reverse proxy, make sure that your internal URLs are updated in the alternate access mappings to reflect the new IIS bindings. In addition, update your load balancer rules or your reverse proxy rules to align with the new IIS bindings.
+If you're using a load balancer or a reverse proxy, make sure that your internal URLs are updated in the alternate access mappings to reflect the new IIS bindings. In addition, update your load balancer rules or your reverse proxy rules to align with the new IIS bindings.
### Apply an SSL certificate
If Excel Services in SharePoint Server 2013 is part of your deployment, verify t
### Redeploy solutions
-When you remove SharePoint Server from an IIS website, if you are removing the last (or only) website that is associated with the web application, any web application solutions you have deployed will also be removed. If you need these solutions, redeploy them. For additional information about how to manage solutions, see [Install and manage solutions for SharePoint Server](./configure-excel-services.md)
+When you remove SharePoint Server from an IIS website, if you're removing the last (or only) website that is associated with the web application, any web application solutions you have deployed will also be removed. If you need these solutions, redeploy them. For more information about how to manage solutions, see [Install and manage solutions for SharePoint Server](./configure-excel-services.md).
SharePoint Upgrade Sharepoint 2013 To Sharepoint 2016 Through Workflow Manager https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/upgrade-sharepoint-2013-to-sharepoint-2016-through-workflow-manager.md
SharePoint stores the workflow history and workflow task information for SharePo
### Prerequisites
-The following prrequisites must be done installed this upgrade:
+The following prerequisites must be completed before this upgrade:
- Install the latest cumulative update for Workflow Manager by using Web Platform Installer (Web PI).
The following prrequisites must be done installed this upgrade:
Use the following steps to register Workflow Manager with SharePoint Server 2016:
-1. In the SharePoint 2013 farm, on the Central Administration website click **Application Management** and click **Manage Service Applications**, and then delete **Workflow Service Applidcation Proxy**.
+1. In the SharePoint 2013 farm, on the Central Administration website, click **Application Management**, click **Manage Service Applications**, and then delete **Workflow Service Application Proxy**.
2. In the SharePoint Server 2016 farm, run the following Microsoft PowerShell cmdlet to pair SharePoint 2016 with the same Workflow Manager installation:
If your site URL is changed in SharePoint 2016 but the site ID remains the same,
If workflows don't start on some sites, republish the workflows from the affected site. Or, run the **Refresh Trusted Security Token Services Metadata feed** timer job.
-### Issue 3: Workflows fail and return the "Cannot get app principal permission information" error
+### Issue 3: Workflows fail and return the "Cannot get app principal permission information" error
Consider the following scenario:
- You have recently connected sites in the farm to a previously existing instance of Workflow Manager.
-In this scenario, workflows that are created after you connect to the Workflow Manager installation finish successfully. However, workflows that are created before you connect to Workflow Manager don't finish. Instead, they get stuck when they try to finish or they remain in a suspended state. For workflows that remain suspended, you receive an HTTP 500 error. Additionally, the following entry is logged in the ULS log: *Cannot get app principal permission information.*
+In this scenario, workflows that are created after you connect to the Workflow Manager installation finish successfully. However, workflows that are created before you connect to Workflow Manager don't finish. Instead, they get stuck when they try to finish or they remain in a suspended state. For workflows that remain suspended, you receive an HTTP 500 error. Additionally, the following entry is logged in the ULS log: *Cannot get app principal permission information.*
### Cause
To get the SPAuthenticationRealm value of ApplicationID that's stored in the sco
Alternatively, you can find the SPAuthenticationRealm value in ULS log, such as in the following example log entry:
-11/03/2017 12:13:16.72 w3wp.exe (SPWFE01:0x51FC) 0x1298 SharePoint Foundation Authentication Authorization an3eg Medium Cannot get app principal permission information. AppId=i:0i.t|ms.sp.ext|\<SPWeb object ID\>@\<SPAuthenticationRealm\>
+11/03/2017 12:13:16.72 w3wp.exe (SPWFE01:0x51FC) 0x1298 SharePoint Foundation Authentication Authorization an3eg Medium Cannot get app principal permission information. AppId=i:0i.t|ms.sp.ext|\<SPWeb object ID\>@\<SPAuthenticationRealm\>
-11/03/2017 12:13:16.72 w3wp.exe (SPWFE01:0x51FC) 0x1298 SharePoint Foundation General 8nca Medium Application error when access /site/teamsite/teamweb/_vti_bin/client.svc, Error=Object reference not set to an instance of an object. at Microsoft.SharePoint.SPAppRequestContext.EnsureTenantPermissions(SPServiceContext serviceContext, Boolean throwIfAppNotExits, Boolean allowFullReset) at Microsoft.SharePoint.SPAppRequestContext.InitCurrent(HttpContext context) at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.InitCurrentAppPrincipalToken(HttpContext context) at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.PostAuthenticateRequestHandler(Object oSender, EventArgs ea) at System.Web.HttpApplication.SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
+11/03/2017 12:13:16.72 w3wp.exe (SPWFE01:0x51FC) 0x1298 SharePoint Foundation General 8nca Medium Application error when access /site/teamsite/teamweb/_vti_bin/client.svc, Error=Object reference not set to an instance of an object. at Microsoft.SharePoint.SPAppRequestContext.EnsureTenantPermissions(SPServiceContext serviceContext, Boolean throwIfAppNotExits, Boolean allowFullReset) at Microsoft.SharePoint.SPAppRequestContext.InitCurrent(HttpContext context) at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.InitCurrentAppPrincipalToken(HttpContext context) at Microsoft.SharePoint.ApplicationRuntime.SPRequestModule.PostAuthenticateRequestHandler(Object oSender, EventArgs ea) at System.Web.HttpApplication.SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
SharePoint Wan Performance And Testing https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/administration/wan-performance-and-testing.md
SharePoint Server 2013 is optimized to perform well over WAN connections. This a
## Key concepts <a name="section1"> </a> -- Bandwidth ΓÇö The data transfer capacity, or speed of transmission, of a digital communications system as measured in bits-per-second (bps).
+- Bandwidth—The data transfer capacity, or speed of transmission, of a digital communications system, as measured in bits per second (bps).
-- Latency ΓÇö The time that is required for a request to travel from one point on a network to another point.
+- Latency—The time that is required for a request to travel from one point on a network to another point.
- Network congestion—The condition of a network when the current load approaches or exceeds the available resources and bandwidth that are designed to handle that load at a particular location in the network. Packet loss and delays are associated with congestion.
SharePoint Server 2013 is optimized to perform well over WAN connections. This a
SharePoint Server 2013 responds to incoming requests 50% faster than the previous version. It uses available bandwidth between the server and the client almost 40% more efficiently than the previous version. These performance gains were quantified in Microsoft's environment with the busiest SharePoint farms in the world.
-A Microsoft 365 environment demands higher levels of performance over WAN connections because many customers are geographically distributed. As a result, Microsoft 365 was extensively tested under WAN conditions. Testing scenarios included latencies up to 300 milliseconds, which is much higher than latencies between the North America and Asia.
+A Microsoft 365 environment demands higher levels of performance over WAN connections because many customers are geographically distributed. As a result, Microsoft 365 was extensively tested under WAN conditions. Testing scenarios included latencies up to 300 milliseconds, which is higher than typical latencies between North America and Asia.
To achieve up to 40% improvement in the use of available bandwidth (compared to the previous version), optimizations were targeted to various layers of the network stack: - IIS compression and image compression are more effective on the server side. -- Servers respond to http and https requests much faster.
+- Servers respond to HTTP and HTTPS requests faster.
- Low-level TCP/IP optimizations result in better use of the communication ports that are open between the client and the server. The ports ramp up quicker and are used more efficiently. Users benefit not only from the performance gains but also from additional features that improve the experience: -- Active download management and script on demand ΓÇö these optimizations prioritize resources and JavaScript to download the content that is most meaningful to users first.
+- Active download management and script on demand—these optimizations prioritize resources and JavaScript to download the content that is most meaningful to users first.
- Smooth page transitions with animations provide a rich, interactive browser experience. -- Minimal download strategy ΓÇö As users browse SharePoint content, only changes to a page are downloaded and sent to the client.
+- Minimal download strategy—As users browse SharePoint content, only changes to a page are downloaded and sent to the client.
## WAN product-team test results <a name="section3"> </a>
-The following diagrams detail the effect of WAN performance optimizations on one of the most popular pages in SharePoint ΓÇö Teamsite. The diagrams show network traces of Teamsite for both SharePoint 2010 and SharePoint Server 2013 with the following network conditions:
+The following diagrams detail the effect of WAN performance optimizations on one of the most popular pages in SharePoint—Teamsite. The diagrams show network traces of Teamsite for both SharePoint 2010 and SharePoint Server 2013 with the following network conditions:
- Approximately 300 ms latency roundtrip
- 1 Mbps bandwidth connection between the server and clients
-These conditions represent higher latencies and lower bandwidths than are typical for global WAN connections. However, some customers who have extremely remote sites find themselves within this range (for example, mining, oil and gas, and global construction companies). 1 mpbs bandwidth connection is lower than a typical mobile phone connection.
+These conditions represent higher latencies and lower bandwidths than are typical for global WAN connections. However, some customers who have remote sites find themselves within this range (for example, mining, oil and gas, and global construction companies). A 1 Mbps connection is slower than a typical mobile phone connection.
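To get a feel for how these two conditions combine, here's a hedged back-of-the-envelope calculation in Python. The function and the 500 KB / 20 round-trip page profile are illustrative assumptions, not figures from the article; only the ~300 ms latency and 1 Mbps bandwidth mirror the test conditions above.

```python
# Back-of-the-envelope model of the two key WAN factors, bandwidth and
# latency. Illustrative only: real page loads overlap requests and reuse
# connections, so treat this as a rough serialized-cost sketch.

def transfer_seconds(payload_bytes, bandwidth_bps, latency_s, round_trips):
    """Serialized cost: one latency hit per round trip plus raw wire time."""
    wire_time = payload_bytes * 8 / bandwidth_bps
    return round_trips * latency_s + wire_time

# ~300 ms latency and 1 Mbps bandwidth match the test conditions above;
# the 500 KB page fetched in 20 round trips is a hypothetical profile.
t = transfer_seconds(500_000, 1_000_000, 0.300, 20)
print(f"{t:.1f} s")  # 10.0 s: latency cost (6 s) rivals raw wire time (4 s)
```

Even in this crude model, latency round trips account for more than half the total, which is why the optimizations above (fewer, better-pipelined requests) matter as much as compression.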
The following diagram demonstrates that SharePoint Server 2013 makes better use of available communication ports. ![Comparison of port usage between SharePoint 2010 and SharePoint 2013](../media/WAN_perf_ports.gif)
-In the two network traces, the horizontal rows represent the ports that are open. The colored blocks represent content that is traveling over the wire, such as the images, JavaScript, and HTML. In the SharePoint 2010 network trace, the white spaces between the colored blocks represent idle time in which the client or server is waiting for something to happen before completing the next action. In the SharePoint Server 2013 network trace, the network pipe is filled almost 100%. Communication between the client and server is ongoing until the transaction is complete. There is very little or no idle time between actions. These improvements are provided by the optimizations described earlier in this article (minimal download strategy, active download management, and script on demand).
+In the two network traces, the horizontal rows represent the ports that are open. The colored blocks represent content that is traveling over the wire, such as the images, JavaScript, and HTML. In the SharePoint 2010 network trace, the white spaces between the colored blocks represent idle time in which the client or server is waiting for something to happen before completing the next action. In the SharePoint Server 2013 network trace, the network pipe is filled to almost 100% of capacity. Communication between the client and server is ongoing until the transaction is complete. There's little or no idle time between actions. These improvements are provided by the optimizations described earlier in this article (minimal download strategy, active download management, and script on demand).
The following diagram calls attention to the improvement in bandwidth utilization. The blue graphs in both network traces represent the bandwidth utilization. The use of available bandwidth is more efficient in SharePoint Server 2013. ![Comparison of bandwidth utilization between SharePoint 2010 and SharePoint 2013 and illustration of improved efficiency of SharePoint 2013.](../media/WAN_perf_bandwidth.gif)
-This following diagram of the network traces shows that the content that users interact with on the page (the document library, prompts, navigational elements, etc.) are downloaded a full second faster in SharePoint Server 2013 compared to SharePoint 2010. Users can interact with the site much sooner.
+The following diagram of the network traces shows that the content that users interact with on the page (the document library, prompts, navigational elements, and so on) is downloaded a full second faster in SharePoint Server 2013 than in SharePoint 2010. Users can interact with the site sooner.
![Comparison of content download speed between SharePoint 2010 and SharePoint 2013](../media/WAN_perf_content.gif)
Compared to SharePoint 2010, WAN optimizations in SharePoint Server 2013 achieve
- Downloads 65% fewer bytes for images because of better use of image compression. -- Downloads 20% more bytes for the JavaScript which provides quicker and improved functionality in the browser.
+- Downloads 20% more bytes for the JavaScript, which provides quicker and improved functionality in the browser.
- Downloads 15% fewer bytes total.
The simplest method to test performance over WAN connections is to have a user a
For example, during the early adoption phase of SharePoint Server 2013, Microsoft worked with Teck to evaluate WAN performance between the mining company's two datacenters in Santiago, Chile, and Calgary, Canada. Mahmood Jaffer, IT Specialist and SharePoint Architect, created a remote connection from his Canadian office to the datacenter in Santiago, Chile. From a computer in Santiago, he connected to a server running SharePoint Server 2013 in the Calgary datacenter and uploaded several files. He also connected to a server running SharePoint 2010 in Calgary and uploaded files that have the same characteristics. The following table records the results.
-**Teck unit test ΓÇö File upload from Santiago to Calgary (140ms latency) with Riverbed device**
+**Teck unit test—File upload from Santiago to Calgary (140 ms latency) with Riverbed device**
|**File size and type**|**SharePoint 2010**|**SharePoint 2013**| |:--|:--|:--|
For example, during the early adoption phase of SharePoint Server 2013, Microsof
An important consideration for this user test is the use of a WAN accelerator device between the two locations. Teck uses a Riverbed device to accelerate traffic. WAN accelerators look for patterns within packets of data and potentially only send packets that are unique, replacing duplicate packets with content that is cached on the other end. For Teck to obtain accurate results, it was important to use files that have different content for each test, instead of just renaming files.
-To repeat this unit test, the Microsoft SharePoint writing team had colleagues in the Beijing office connect to SharePoint sites in the Redmond office. In this scenario, two writers repeated the test multiple times throughout the day and produced a range of results. Files with different content were used each time to avoid potential caching issues, although a WAN accelerator device is not used between the two locations. The following table records the results.
+To repeat this unit test, the Microsoft SharePoint writing team had colleagues in the Beijing office connect to SharePoint sites in the Redmond office. In this scenario, two writers repeated the test multiple times throughout the day and produced a range of results. Files with different content were used each time to avoid potential caching issues, although no WAN accelerator device was used between the two locations. The following table records the results.
-**Microsoft writing team unit test ΓÇö File upload from Beijing to Redmond (144ms latency)**
+**Microsoft writing team unit test—File upload from Beijing to Redmond (144 ms latency)**
|**File size and type**|**SharePoint 2010**|**SharePoint 2013**| |:--|:--|:--|
Several observations result from a comparison of these two sets of results:
- Simple unit testing can provide meaningful data. In these two cases, the real-world experience is unlikely to be duplicated by plugging numbers for bandwidth and latency into a WAN simulation device.
-Here are recommendations if you conduct your own simple unit testing:
+Here are recommendations if you conduct your own unit testing:
- Use different files that have different content to avoid optimization of WAN accelerator devices on second upload.
Here are recommendations if you conduct your own simple unit testing:
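One way to follow the recommendation above (different files with different content, so a WAN accelerator can't serve cached duplicates on the second upload) is to generate test files from random bytes. A minimal Python sketch, with file names and sizes chosen arbitrarily for illustration:

```python
# Sketch of the recommendation above: generate upload test files with
# unique random content so a WAN accelerator (such as a Riverbed device)
# can't dedupe the second upload. The directory name and sizes here are
# arbitrary examples, not values from the article's tests.
import os

def make_test_files(directory, sizes_bytes):
    os.makedirs(directory, exist_ok=True)
    paths = []
    for i, size in enumerate(sizes_bytes):
        path = os.path.join(directory, f"wan-test-{i}-{size}.bin")
        with open(path, "wb") as f:
            f.write(os.urandom(size))  # unique, incompressible content
        paths.append(path)
    return paths

paths = make_test_files("wan_test_files", [1_000_000, 10_000_000])
for p in paths:
    print(p, os.path.getsize(p))
```

Random bytes are also incompressible, so they defeat both dedupe caches and on-the-fly compression; if you want to model compressible office documents instead, you'd need more realistic content.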
## WAN test tools and scenarios for systematic testing <a name="section5"> </a>
-Before you begin any type of systematic load testing across a WAN environment make, sure that you understand the nature of your network. You should have data about bandwidth, latency, network congestion, packet loss and types of devices between users and the SharePoint front-end web server. This data is not always easy to obtain. However tools, such as System Center Operations Manager, can make it easier.
+Before you begin any type of systematic load testing across a WAN environment, make sure that you understand the nature of your network. You should have data about bandwidth, latency, network congestion, packet loss, and types of devices between users and the SharePoint front-end web server. This data isn't always easy to obtain. However, tools such as System Center Operations Manager can make it easier.
After you understand the network environment, you'll know whether you must address items before you test over the WAN. For initial testing, minimize network congestion and packet loss. Also remove or disable network optimization devices. This will leave you with bandwidth and latency as the two primary factors that will impact your end-users from a network perspective.
### Test tools
-After you address WAN constraints, you can start to use a combination of tools to testing WAN efficiency. Prescriptive tools, such as Visual Studio 2012 Update 1, provide repeatable unit and load testing capabilities. Non-prescriptive tools, such as Microsoft Network Monitor (Netmon) with Visual Round Trip Analyzer, provide end-user oriented monitoring. Both types of tools can be useful because they each provide a different approach to WAN testing and data collection. The combined results can provide a complete view of the impact of WAN connections on user performance.
+After you address WAN constraints, you can start to use a combination of tools to test WAN efficiency. Prescriptive tools, such as Visual Studio 2012 Update 1, provide repeatable unit and load testing capabilities. Nonprescriptive tools, such as Microsoft Network Monitor (Netmon) with Visual Round Trip Analyzer, provide end-user oriented monitoring. Both types of tools can be useful because they each provide a different approach to WAN testing and data collection. The combined results can provide a complete view of the impact of WAN connections on user performance.
The following chart lists the strengths of both tools.
Create test scenarios that reflect the types of actions users will perform as pa
- Add a social tag
-The goal is to have a well-rounded set of unit tests which capture actions that end-users perform in a SharePoint environment and expose any potential latency-sensitive transactions.
+The goal is to have a well-rounded set of unit tests that capture actions that end-users perform in a SharePoint environment and expose any potential latency-sensitive transactions.
-Finally, make sure that you conduct rounds of tests at various times throughout the day to capture differences in network utilization patterns. For example, 09:00 on Monday morning may have a very different network and performance pattern compared to 23:00 on Friday. Also, be aware of events in other geographical regions, such as a natural disaster that results in region-wide power outages, that might impact WAN routing or performance. A comprehensive set of tests spread across different time intervals will provide insight and set expectations about what your end-users will experience when they use SharePoint Server 2013 across the WAN.
+Finally, make sure that you conduct rounds of tests at various times throughout the day to capture differences in network utilization patterns. For example, 09:00 on Monday morning may have a different network and performance pattern compared to 23:00 on Friday. Also, be aware of events in other geographical regions, such as a natural disaster that results in region-wide power outages, that might impact WAN routing or performance. A comprehensive set of tests spread across different time intervals will provide insight and set expectations about what your end-users will experience when they use SharePoint Server 2013 across the WAN.
### Example WAN test using Visual Studio 2013
The test results show that performance is good, especially for the social tasks.
The next set of results shows performance for the same load test across a larger set of geographic locations where Fabrikam employees work. The SharePoint servers are located in Texas, USA.
-**Fabrikam— Results across the feature set for differentlocations**
+**Fabrikam—Results across the feature set for different locations**
![Fabrikam test results for WAN connections initiating in Australia, Germany, India, Singapore, South Africa, and the UK. 2-6 seconds for file download. 3-8 seconds for file upload. Less than 2 seconds for most social tasks.](../media/WAN_CaseStudy_chart2.gif)
Even though there are varying degrees of latency, performance is good for users across the globe. The Fabrikam test results provide an example of systematic WAN testing that uses a load test made up of many SharePoint tasks that are important to the company.
-Fabrikam is an example of a world-wide company that succeeds with a central datacenter model, instead of deploying SharePoint Server 2013 to multiple regions across the world. If you plans include a move from a central datacenter model to multiple SharePoint sites in different geographical regions, make sure that you conduct WAN testing to see whether it is really necessary.
+Fabrikam is an example of a world-wide company that succeeds with a central datacenter model, instead of deploying SharePoint Server 2013 to multiple regions across the world. If your plans include a move from a central datacenter model to multiple SharePoint sites in different geographical regions, make sure that you conduct WAN testing to see whether it's necessary.
## See also <a name="section5"> </a>
SharePoint Document Set Planning https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/governance/document-set-planning.md
This article describes Document Sets and provides guidance on how you can integ
## About Document Sets <a name="bkmk_about_ds"> </a>
-Document Sets are a feature in SharePoint Server that enables an organization to manage a single deliverable, or work product, which can include multiple documents or files. A Document Set is a special kind of folder that combines unique Document Set attributes, the attributes and behavior of folders and documents, and provides a user interface (UI), metadata, and object model elements to help manage all aspects of the work product.
+Document Sets are a feature in SharePoint Server that enables an organization to manage a single deliverable, or work product, which can include multiple documents or files. A Document Set is a special kind of folder that combines unique Document Set attributes with the attributes and behavior of folders and documents, and provides a user interface (UI), metadata, and object model elements to help manage all aspects of the work product.
For teams and users in many organizations, a set of documents, or a work product, is needed to better manage a project or deliverable. For example, a legal team might need to collect, create, and manage various documents, photos, and audio files that are related to a particular case. Or, a sales team might need to compile documents from various sources to create and manage a request for proposal (RFP) for a potential client. Document Sets provide those teams and users with the ability to manage those sets of documents as a single collection, deliverable, or work product. Document Set owners can then create a custom Welcome Page that can display the items included and important information about the work product.
-In SharePoint Server, organizations that want to create and manage Document Sets consistently can configure a Document Set content type for each work product they typically create. A Document Set content type can then define approved content types, attributes, default items, columns, workflows, and policies. Additional customized Document Set content types can then be created from the parent content type, each inheriting properties and settings from the parent Document Set content type. After the content type is added to a library, users can then create a Document Set that inherits the attributes of the Document Set content type by using the **New** command. A Document Set content type provides additional settings that enable you to specify allowed content types, default content, shared columns, Welcome Page columns, and default Welcome Page view.
+In SharePoint Server, organizations that want to create and manage Document Sets consistently can configure a Document Set content type for each work product they typically create. A Document Set content type can then define approved content types, attributes, default items, columns, workflows, and policies. More customized Document Set content types can then be created from the parent content type, each inheriting properties and settings from the parent Document Set content type. After the content type is added to a library, users can then create a Document Set that inherits the attributes of the Document Set content type by using the **New** command. A Document Set content type provides additional settings that enable you to specify allowed content types, default content, shared columns, Welcome Page columns, and default Welcome Page view.
For more information about content types, see [Plan content types and workflows in SharePoint 2013](/previous-versions/office/sharepoint-server-2010/cc262735(v=office.14)).
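To make the inheritance described above concrete, here is a minimal Python sketch of the settings a Document Set content type defines. The class and field names are invented for illustration; SharePoint itself is configured through its UI and object model, not through a structure like this.

```python
from dataclasses import dataclass, field

@dataclass
class DocumentSetContentType:
    """Illustrative model of the settings a Document Set content type defines."""
    name: str
    allowed_content_types: list = field(default_factory=list)
    default_content: list = field(default_factory=list)       # items created with each new set
    shared_columns: list = field(default_factory=list)        # metadata shared by every item
    welcome_page_columns: list = field(default_factory=list)  # shown on the Welcome Page

    def derive(self, name, **overrides):
        """Create a child content type that inherits this parent's settings.

        Any keyword override replaces the inherited value, mirroring how a
        customized child content type departs from its parent.
        """
        inherited = dict(
            allowed_content_types=list(self.allowed_content_types),
            default_content=list(self.default_content),
            shared_columns=list(self.shared_columns),
            welcome_page_columns=list(self.welcome_page_columns),
        )
        inherited.update(overrides)
        return DocumentSetContentType(name=name, **inherited)
```

For example, an "RFP Response" content type derived from a generic "Work Product" parent would inherit its default content and columns unless explicitly overridden.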
For more information about how to create and manage Document Sets in SharePoint
Document Sets in SharePoint Server share many of the same attributes and properties as folders. However, there are some important considerations you should be aware of when planning a Document Set solution.
-- There is no limit on the number of documents that can exist in a Document Set. However, display load times may be limited by the list view threshold which by default is set at 5,000 items. Folders are allowed in document sets, but metadata navigation cannot be used in a Document Set. Therefore, it is important to consider the possibility of exceeding list view thresholds and navigation design concerns when you determine how many items should exist in a Document Set. In addition, when you use the **Send to** feature with a Document Set, the sum for all documents in a Document Set cannot be larger than 50MB. For a collection or work product with a very large number of items, a folder structure in a document library may be a better solution.
+- There's no limit on the number of documents that can exist in a Document Set. However, display load times may be limited by the list view threshold, which by default is set at 5,000 items. Folders are allowed in document sets, but metadata navigation can't be used in a Document Set. Therefore, it's important to consider the possibility of exceeding list view thresholds and navigation design concerns when you determine how many items should exist in a Document Set. In addition, when you use the **Send to** feature with a Document Set, the sum for all documents in a Document Set can't be larger than 50 MB. For a collection or work product with a large number of items, a folder structure in a document library may be a better solution.
-- There is no limit on the number of Document Sets that can exist in a document library. However, the number of Document Sets that can appear in lists will be limited by the list view threshold.
+- There's no limit on the number of Document Sets that can exist in a document library. However, the number of Document Sets that can appear in lists will be limited by the list view threshold.
-- When using shared metadata, if there are more than 10 items that are using shared metadata in a Document Set, metadata updates will be run by a timer job every 15 minutes. For example, if you have 10 documents in the top level of the library, and a single document in a Document Set with shared metadata, the time job will not run. But if you add another Document Set with 9 more documents, the timer job will run.
+- When using shared metadata, if there are more than 10 items that are using shared metadata in a Document Set, metadata updates will be run by a timer job every 15 minutes. For example, if you have 10 documents in the top level of the library, and a single document in a Document Set with shared metadata, the timer job won't run. But if you add another Document Set with nine more documents, the timer job will run.
- When using Document Set routing, Document Sets that are sent to a content organizer will remain in the drop-off library and be moved to the appropriate location by the content organizer processing timer job, which by default runs daily.
SharePoint Records Management In Sharepoint Server https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/governance/records-management-in-sharepoint-server.md
A record is a document or other electronic or physical entity in an organization
- Determines what kinds of information should be considered records.
-- Determines how active documents that will become records should be handled while they are being used, and determines how they should be collected after they are declared to be records.
+- Determines how active documents that will become records should be handled while they're being used, and determines how they should be collected after they're declared to be records.
- Determines in what manner and for how long each record type should be retained to meet legal, business, or regulatory requirements.
-- Researches and implements technological solutions and business processes to help ensure that the organization complies with its records management obligations in a cost-effective and non-intrusive way.
+- Researches and implements technological solutions and business processes to help ensure that the organization complies with its records management obligations in a cost-effective and nonintrusive way.
- Performs records-related tasks such as disposing of expired records or locating and protecting records that are related to external events such as lawsuits.
-Determining which documents and other physical or electronic items in your organization are records is the responsibility of corporate compliance officers, records managers, and lawyers. By carefully categorizing all enterprise content in your organization, these people can help you ensure that documents are retained for the appropriate period of time. A well-designed records management system helps protect an organization legally, helps the organization demonstrate compliance with regulatory obligations, and increases organizational efficiency by promoting the disposition of out-of-date items that are not records.
+Determining which documents and other physical or electronic items in your organization are records is the responsibility of corporate compliance officers, records managers, and lawyers. By carefully categorizing all enterprise content in your organization, these people can help you ensure that documents are retained for the appropriate period of time. A well-designed records management system helps protect an organization legally, helps the organization demonstrate compliance with regulatory obligations, and increases organizational efficiency by promoting the disposition of out-of-date items that aren't records.
![Elements of a records management system](../media/RM_Elements_ZA10062703.gif)
A records management system includes the following elements:
-- **A content analysis** that describes and categorizes content in the enterprise that can become records, that provides source locations, and that describes how the content will move to the records management application.
+- **A content analysis** that describes and categorizes content in the enterprise that can become records, that provides source locations, and that describes how the content will move to the records management application.
- **A file plan** that indicates, for each kind of record in the enterprise, where they should be retained as records, the policies that apply to them, how long they must be retained, how they should be disposed of, and who is responsible for managing them.
A records management system includes the following elements:
- **A method for collecting records that are no longer active** from all record sources, such as collaboration servers, file servers, and email systems.
-- **A method for auditing records** while they are active.
+- **A method for auditing records** while they're active.
- **A method for capturing records' metadata** and audit histories and for maintaining them.
SharePoint Server includes features that can help organizations implement integr
## Overview of records management planning <a name="section2"> </a>
-This topic describes the planning steps that you should take to help make sure that the records management system that you implement based on SharePoint Server will achieve your organization's records management goals. The following is a preview of the records management planning process:
+This article describes the planning steps that you should take to help make sure that the records management system that you implement based on SharePoint Server will achieve your organization's records management goals. The following is a preview of the records management planning process:
1. **Identify records management roles** Successful records management requires specialized roles, such as the following:
This topic describes the planning steps that you should take to help make sure t
- IT personnel to implement the systems that efficiently support records management.
- - Content managers to find where organizational information is kept and to make sure that that their teams follow records management practices.
+ - Content managers to find where organizational information is kept and to make sure that their teams follow records management practices.
2. **Analyze organizational content** Before creating a file plan, records managers and content managers survey document usage in the organization to determine which documents and other items can become records.
-3. **Develop a file plan** After you have analyzed your organizational content and determined retention schedules, fill in the rest of the file plan. File plans differ from organization to organization, but generally they describe the kinds of items the enterprise acknowledges to be records, indicate where they are stored, describe their retention periods, and provide other information, such as who is responsible for managing them and which broader category of records they belong to.
+3. **Develop a file plan** After you have analyzed your organizational content and determined retention schedules, fill in the rest of the file plan. File plans differ from organization to organization, but generally they describe the kinds of items the enterprise acknowledges to be records, indicate where they're stored, describe their retention periods, and provide other information, such as who is responsible for managing them and which broader category of records they belong to.
-4. **Develop retention schedules** For each record type, determine when it is no longer active (being used), how long it should be retained after that, and how it should ultimately be disposed of.
+4. **Develop retention schedules** For each record type, determine when it's no longer active (being used), how long it should be retained after that, and how it should ultimately be disposed of.
5. **Evaluate and improve document management practices** Make sure that required policies are being applied in document repositories. For example, make sure that content is being appropriately audited so that suitable audits are retained together with records.
-6. **Design the records management solution** Determine whether to create a records archive, to manage records in place, or to use a combination of the two approaches. Based on your file plan, design the record archive, or determine how to use existing sites to contain records. Define content types, libraries, policies, and, when it is required, metadata that determines the location to route a document to.
+6. **Design the records management solution** Determine whether to create a records archive, to manage records in place, or to use a combination of the two approaches. Based on your file plan, design the record archive, or determine how to use existing sites to contain records. Define content types, libraries, policies, and, when it's required, metadata that determines the location to route a document to.
-7. **Plan how content becomes records** If you are using SharePoint Server for both active document management and records management, you can create custom workflows to move documents to a records archive. If you are using either SharePoint Server or an external document management system, you can plan and develop interfaces that move content from those systems to the records archive, or that declare a document to be a record but do not move the document. You also create a training plan to teach users how to create and work with records.
+7. **Plan how content becomes records** If you're using SharePoint Server for both active document management and records management, you can create custom workflows to move documents to a records archive. If you're using either SharePoint Server or an external document management system, you can plan and develop interfaces that move content from those systems to the records archive, or that declare a document to be a record but don't move the document. You also create a training plan to teach users how to create and work with records.
-8. **Plan email integration** Determine whether you will manage email records within SharePoint Server , or whether you will manage email records within the email application itself.
+8. **Plan email integration** Determine whether you'll manage email records within SharePoint Server, or whether you'll manage email records within the email application itself.
9. **Plan compliance for social content** If your organization uses social media such as blogs, wikis, or My Sites, determine how this content will become records.
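Steps 3 and 4 above (the file plan and its retention schedules) can be sketched as a simple record structure. This Python sketch uses hypothetical field names chosen to match the description; a real file plan is a planning artifact, not code.

```python
from dataclasses import dataclass

@dataclass
class FilePlanEntry:
    """One row of a file plan, per planning steps 3 and 4 above."""
    record_type: str      # kind of item the enterprise acknowledges as a record
    category: str         # broader category of records it belongs to
    location: str         # where it is retained as a record
    retention_years: int  # how long to keep it after it becomes inactive
    disposition: str      # how it is ultimately disposed of
    responsible: str      # who is responsible for managing it

    def disposition_due(self, declared_year, current_year):
        """True once the retention period has elapsed since the record was declared."""
        return current_year - declared_year >= self.retention_years
```

Enumerating entries like this per record type makes it straightforward to review whether every kind of record has a retention period, a disposition method, and an owner before the solution is designed.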
SharePoint Workflow In Sharepoint Server https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/governance/workflow-in-sharepoint-server.md
ms.localizationpriority: medium ms.assetid: 70d4b4eb-9df0-44e5-980f-542614273439
-description: "Workflows in SharePoint allow you to model and automate business processes. These business processes can range from simple to complex. But most importantly, workflow lets users focus on doing the work -- rather than managing the workflow."
+description: "Workflows in SharePoint allow you to model and automate business processes. These business processes can range from simple to complex. But most importantly, workflow lets users focus on doing the work--rather than managing the workflow."
# Workflow in SharePoint Server [!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
-Workflows in SharePoint allow you to model and automate business processes. These business processes can range from simple to complex. But most importantly, workflow lets users focus on doing the work -- rather than managing the workflow.
-
+Workflows in SharePoint allow you to model and automate business processes. These business processes can range from simple to complex. But most importantly, workflow lets users focus on doing the work--rather than managing the workflow.
+
Workflows help people to collaborate on documents and to manage project tasks by implementing business processes on documents and items in a SharePoint site. Workflows help organizations adhere to consistent business processes, and improve organizational efficiency and productivity.
## SharePoint 2013 Workflow and Power Automate
-SharePoint Server 2019 supports a variety of workflow technologies to meet the needs of our customers. These workflow technologies are described below.
+SharePoint Server 2019 supports various workflow technologies to meet the needs of our customers. These workflow technologies are described below.
-* The SharePoint 2013 Workflow platform is the recommended workflow technology for SharePoint Server 2019. These workflows integrate with Microsoft Workflow Manager and will provide a stable and reliable workflow experience for SharePoint Server 2019.
-* Power Automate is the new cloud-based platform to automate actions across a variety of applications and services. Thanks to hybrid technology, you can also integrate Power Automate into your SharePoint Server environments using the On-Premises Data Gateway. Each related user must be assigned a Power Apps Plan 1 license to use Power Automate with SharePoint Server, which is not included as part of the SharePoint Server license. In addition, Power Automate is currently optimized for non-interactive workflows with SharePoint Server. If you require interactive workflows, we recommend exploring the SharePoint 2013 Workflow platform instead.
-* The SharePoint 2010 Workflow platform is also supported in SharePoint Server 2019 for backward compatibility. This allows SharePoint Sever 2019 to run legacy workflows from previous versions of SharePoint Server. Although SharePoint 2010 workflows are still supported, we don't recommend building new workflows using this technology. Instead, we recommend exploring either SharePoint 2013 workflows or Power Automate.
+* The SharePoint 2013 Workflow platform is the recommended workflow technology for SharePoint Server 2019. These workflows integrate with Microsoft Workflow Manager and will provide a stable and reliable workflow experience for SharePoint Server 2019.
+* Power Automate is the new cloud-based platform to automate actions across various applications and services. Thanks to hybrid technology, you can also integrate Power Automate into your SharePoint Server environments using the On-Premises Data Gateway. Each related user must be assigned a Power Apps Plan 1 license to use Power Automate with SharePoint Server, which isn't included as part of the SharePoint Server license. In addition, Power Automate is currently optimized for non-interactive workflows with SharePoint Server. If you require interactive workflows, we recommend exploring the SharePoint 2013 Workflow platform instead.
+* The SharePoint 2010 Workflow platform is also supported in SharePoint Server 2019 for backward compatibility. This allows SharePoint Server 2019 to run legacy workflows from previous versions of SharePoint Server. Although SharePoint 2010 workflows are still supported, we don't recommend building new workflows using this technology. Instead, we recommend exploring either SharePoint 2013 workflows or Power Automate.
SharePoint Hybrid Site Following https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/hybrid/hybrid-site-following.md
When you enable hybrid site following:
- The **Sites** tile in the app launcher ![Microsoft 365 app launcher icon](../media/0aaa6945-f9a4-4b13-bf5f-d5c5dbe978fb.png) is redirected to Microsoft 365 (SharePoint Server 2016 and SharePoint Server 2019).
-- When a user follows a site in SharePoint Server, it is added to the followed list in both SharePoint Server and Microsoft 365.
+- When a user follows a site in SharePoint Server, it's added to the followed list in both SharePoint Server and Microsoft 365.
While the SharePoint Server followed list continues to be updated, users are directed to the followed list in Microsoft 365, which contains followed sites from both locations. The SharePoint in Microsoft 365 newsfeed functionality is unaffected. Users will continue to have separate newsfeeds in SharePoint Server and Microsoft 365, and each will show activities for sites and documents for SharePoint Server and Microsoft 365, respectively. Also, follow documents functionality remains unaffected, and follow people functionality remains in SharePoint Server only.
-Note that existing followed sites lists in SharePoint Server are not migrated to Microsoft 365 when you turn this feature on (though any sites in the Microsoft 365 list will remain there). Users will have to follow their SharePoint Server sites again once the feature is turned on.
+Existing followed sites lists in SharePoint Server aren't migrated to Microsoft 365 when you turn on this feature (though any sites in the Microsoft 365 list will remain there). Users will have to follow their SharePoint Server sites again once the feature is turned on.
## Setting up hybrid site following
SharePoint Bestpractices For Sharepointserver Installation https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/install/BestPractices-for-SharePointServer-Installation.md
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top - SP2019
-description: "Learn the best practices for SharePoint Server installation and how it will get your servers ready for easy transition to the cloud."
+description: "Learn the best practices for SharePoint Server installation and how it gets your servers ready for easy transition to the cloud."
# Best practices for installation for SharePoint Servers 2016 and 2019
As you prepare for your installation, consider the following:
## Evaluating what features or services are no longer supported
-As you install a new version of SharePoint Server, take time to evaluate any service applications or features that you currently rely on that are no longer supported or listed as "deprecated".
+As you install a new version of SharePoint Server, take time to evaluate any service applications or features that you currently rely on that are no longer supported or listed as "deprecated."
-If you have a reliance on any of the following, plan what you intend to replace that functionality with or if it is no longer being used actively by your company and it is time to phase it out.
+If you rely on any of the following, plan what you intend to replace that functionality with, or determine whether it's no longer actively used by your company and it's time to phase it out.
-Use supported service applications and consider phasing out the use of these deprecated service applications. For any of the SharePoint BI tools, you can use PowerBI as a replacement:
+Use supported service applications and consider phasing out the use of these deprecated service applications. For any of the SharePoint BI tools, you can use Power BI as a replacement:
|Deprecated feature or service|
|:--|
Use supported service applications and consider phasing out the use of these dep
If you currently have SharePoint Server installed, chances are you have made some customizations to suit your business needs.
-If you already have a portion of your company in the cloud or plan to do so in the future, know that certain customizations will not transfer to SharePoint. Here is a list of few of those:
+If you already have a portion of your company in the cloud or plan to do so in the future, know that certain customizations won't transfer to SharePoint. Here's a list of a few of those:
-- Workflows, User Alerts, and custom master pages will not transfer to SharePoint. We recommend you use Power Automate for workflows, reconfigure alerts once migrated, and use the out of the box customization for site look and feel changes.
+- Workflows, User Alerts, and custom master pages won't transfer to SharePoint. We recommend you use Power Automate for workflows, reconfigure alerts once migrated, and use the out of the box customization for site look and feel changes.
-- Custom Search schema will not transfer to SharePoint. When content is migrated to SharePoint, you may want to re-implement any custom Search schema configuration necessary.
+- Custom Search schema won't transfer to SharePoint. When content is migrated to SharePoint, you may want to reimplement any custom Search schema configuration necessary.
-- Use SharePoint Add-ins with the Low Trust model. To learn more, see [Creating SharePoint add in that use low trust authorization](/sharepoint/dev/sp-add-ins/creating-sharepoint-add-ins-that-use-low-trust-authorization).
+- Use SharePoint Add-ins with the Low Trust model. To learn more, see [Creating SharePoint Add-ins that use low-trust authorization](/sharepoint/dev/sp-add-ins/creating-sharepoint-add-ins-that-use-low-trust-authorization).
-- Use SharePoint Framework solutions for custom business solutions. To get started, see [SharePoint Framework Overview](/sharepoint/dev/spfx/sharepoint-framework-overview).
+- Use SharePoint Framework solutions for custom business solutions. To get started, see [SharePoint Framework Overview](/sharepoint/dev/spfx/sharepoint-framework-overview).
## Connect your data the modern way
-Do you use Business Data Connectivity Services (BCS) for any of your data connections? Are your data sources available by using a web service? verify all data sources are available via other means, such as a web service.
+Do you use Business Data Connectivity Services (BCS) for any of your data connections? Are your data sources available by using a web service? Verify all data sources are available via other means, such as a web service.
- Where is your data? Where will it reside?
-Instead of using BCS to display your data, you could use PowerBI and a Data Management Gateway.
+Instead of using BCS to display your data, you could use Power BI and a Data Management Gateway.
## Adopt the modern features
-If a portion of your sites are already in the cloud, or if you intend on moving online in the future, adopting the modern features now will help "futureproof" your installation.
+If a portion of your sites is already in the cloud, or if you intend on moving online in the future, adopting the modern features now will help "futureproof" your installation.
- Use Microsoft 365 Groups and Power Automate. Retire the use of email, Site mailboxes, or Mobile Accounts (SMS/Text Messaging).
- For solutions that intercept and/or modify the HTTP pipeline, you could use Azure Conditional Access Policies by fronting the farm by using the Microsoft Entra application proxy. For more information on how to use AD FS, see [Access Control Policies in Windows Server 2016 AD FS](/windows-server/identity/ad-fs/operations/access-control-policies-in-ad-fs).
-- Implement only the necessary Web Application Policies, such as self-service site creation, Object Cache, and Search Crawler accounts, but try to avoid further usage of Web Application Policies as they are not available in SharePoint.
+- Implement only the necessary Web Application Policies, such as self-service site creation, Object Cache, and Search Crawler accounts, but try to avoid further usage of Web Application Policies as they aren't available in SharePoint.
-- For security purposes, phase out the use of anonymous SharePoint Server sites. Also note that anonymous site access is not available in SharePoint in Microsoft 365.
+- For security purposes, phase out the use of anonymous SharePoint Server sites. Also note that anonymous site access isn't available in SharePoint in Microsoft 365.
SharePoint Install Or Uninstall Language Packs https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/install/install-or-uninstall-language-packs.md
description: "Learn how to download, install, and uninstall language packs for S
Language packs enable site owners and site collection administrators to create SharePoint sites and site collections in multiple languages without requiring separate installations of SharePoint 2013. You install language packs, which contain language-specific site templates, on web and application servers. When an administrator creates a site or a site collection that is based on a language-specific site template, the text that appears on the site or the site collection is displayed in the site template's language. Language packs are typically used in multinational deployments where a single server farm supports users in different locations, or when sites and web pages must be duplicated in one or more languages.
-If users are accessing Project Server 2016 in the SharePoint farm and have to view their project data in another language, they will also have to install a corresponding Project Server 2016 language pack. For more information about Project Server 2016 language packs, see [Deploy language packs in Project Server 2013](/project/deploy-language-packs-in-project-server-2013)
+If users are accessing Project Server 2016 in the SharePoint farm and have to view their project data in another language, they'll also have to install a corresponding Project Server 2016 language pack. For more information about Project Server 2016 language packs, see [Deploy language packs in Project Server 2013](/project/deploy-language-packs-in-project-server-2013).
Word breakers and stemmers enable you to search efficiently and effectively across content on SharePoint sites and site collections in multiple languages without requiring separate installations of SharePoint 2013. Word breakers and stemmers are automatically installed on web and application servers by Setup.
Site owners or site collection administrators who create sites or site collectio
The language that they select has a language identifier (ID). The language ID determines the language that is used to display and interpret text that is on the site or site collection. For example, when a site owner creates a site in French, the site's toolbars, navigation bars, lists, and column headings appear in French. Similarly, if a site owner creates a site in Arabic, the site's toolbars, navigation bars, lists, and column headings appear in Arabic. In addition, the default left-to-right orientation of the site changes to a right-to-left orientation to correctly display Arabic text.
-The language packs that are installed on the web and application servers determine the list of available languages that you can use to create a site or site collection. By default, sites and site collections are created in the language in which SharePoint 2013 was installed. For example, if you install the Spanish version of SharePoint 2013, the default language for sites, site collections, and web pages is Spanish. If someone has to create sites, site collections, or web pages in a language other than the default SharePoint 2013 language, you must install the language pack for that language on the web and application servers. For example, if you are running the French version of SharePoint 2013, and a site owner wants to create sites in French, English, and Spanish, you must install the English and Spanish language packs on the web and application servers.
+The language packs that are installed on the web and application servers determine the list of available languages that you can use to create a site or site collection. By default, sites and site collections are created in the language in which SharePoint 2013 was installed. For example, if you install the Spanish version of SharePoint 2013, the default language for sites, site collections, and web pages is Spanish. If someone has to create sites, site collections, or web pages in a language other than the default SharePoint 2013 language, you must install the language pack for that language on the web and application servers. For example, if you're running the French version of SharePoint 2013, and a site owner wants to create sites in French, English, and Spanish, you must install the English and Spanish language packs on the web and application servers.
By default, when a site owner creates a new web page in a site, the site displays text in the language that is specified by the language ID.
-Language packs are not bundled into multilingual installation packages. You must install a specific language pack for each language that you want to support. Also, language packs must be installed on each web and application server to make sure that each web and application server can display content in the specified language.
+Language packs aren't bundled into multilingual installation packages. You must install a specific language pack for each language that you want to support. Also, language packs must be installed on each web and application server to make sure that each web and application server can display content in the specified language.
> [!IMPORTANT]
> You cannot change an existing site, site collection, or web page from one language to another by applying different language-specific site templates. After you use a language-specific site template for a site or a site collection, the site or site collection always displays content in the language of the original site template. Only a limited set of language packs are available for SharePoint 2013.
-Although a site owner specifies a language ID for a site, some user interface elements such as error messages, notifications, and dialoges do not display in the language that was specified. This is because SharePoint 2013 relies on several supporting technologies (for example, the Microsoft .NET Framework, Microsoft Windows Workflow Foundation, Microsoft ASP.NET, and SQL Server), some of which are localized into only a limited number of languages. If a user interface element is generated by any of the supporting technologies that are not localized into the language that the site owner specified for the site, the user interface element appears in English. For example, if a site owner creates a site in Hebrew, and the .NET Framework component displays a notification message, the notification message will not display in Hebrew because the .NET Framework is not localized into Hebrew. This situation can occur when sites are created in any language except the following: Chinese, French, German, Italian, Japanese, Korean, and Spanish.
+Although a site owner specifies a language ID for a site, some user interface elements such as error messages, notifications, and dialogs don't display in the language that was specified. This is because SharePoint 2013 relies on several supporting technologies (for example, the Microsoft .NET Framework, Microsoft Windows Workflow Foundation, Microsoft ASP.NET, and SQL Server), some of which are localized into only a limited number of languages. If a user interface element is generated by any of the supporting technologies that aren't localized into the language that the site owner specified for the site, the user interface element appears in English. For example, if a site owner creates a site in Hebrew, and the .NET Framework component displays a notification message, the notification message won't display in Hebrew because the .NET Framework isn't localized into Hebrew. This situation can occur when sites are created in any language except the following: Chinese, French, German, Italian, Japanese, Korean, and Spanish.
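The fallback described above can be sketched as a toy function. This is illustrative only, not SharePoint code; the set of fully localized languages is taken from the paragraph above, and the function name is ours:

```python
# Illustrative sketch of the UI-language fallback described above.
# Supporting technologies (such as the .NET Framework) are fully localized
# into only these languages; for any other site language, UI elements they
# generate fall back to English.
FULLY_LOCALIZED = {"Chinese", "French", "German", "Italian", "Japanese", "Korean", "Spanish"}

def ui_element_language(site_language: str) -> str:
    """Language used for UI elements generated by supporting technologies."""
    return site_language if site_language in FULLY_LOCALIZED else "English"

print(ui_element_language("Hebrew"))  # falls back to English
print(ui_element_language("French"))  # stays French
```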
-Each language pack that you install creates a folder at %COMMONPROGRAMFILES%\Microsoft Shared\Web server extensions\15\LAYOUTS\Locale_ID that contains language-specific data. In each locale_ID folder, you must have only one HTML error file that contains the error information that is used when a file cannot be found. Anytime a file cannot be found for any site in that language, this file will be used. You can specify the file to use by setting the **FileNotFoundPage()** for each web application.
+Each language pack that you install creates a folder at %COMMONPROGRAMFILES%\Microsoft Shared\Web server extensions\15\LAYOUTS\Locale_ID that contains language-specific data. In each locale_ID folder, you must have only one HTML error file that contains the error information that is used when a file can't be found. Anytime a file can't be found for any site in that language, this file will be used. You can specify the file to use by setting the **FileNotFoundPage()** for each web application.
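As a sketch, assuming the SharePoint Management Shell, a placeholder web application URL, and a hypothetical error file name, the **FileNotFoundPage** property can be set like this:

```powershell
# Placeholder URL and file name - substitute your own values
$webApp = Get-SPWebApplication "https://sharepoint.contoso.com"
$webApp.FileNotFoundPage = "Custom404.html"
$webApp.Update()
```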
-In some cases, some text might originate from the original installation language, which can create a mixed-language experience. This kind of mixed-language experience is typically seen only by content creators or site owners and is not seen by site users.
+In some cases, some text might originate from the original installation language, which can create a mixed-language experience. This kind of mixed-language experience is typically seen only by content creators or site owners and isn't seen by site users.
## Downloading language packs <a name="section2"> </a>
-Follow these steps for each language that you want to support. If you decide to download more than one language, please be aware that a unique file that has a common name is downloaded for each language. Therefore, make sure that you download each language pack to a separate folder on the hard disk so that you do not overwrite a language pack of a different language.
+Follow these steps for each language that you want to support. If you decide to download more than one language, please be aware that a unique file that has a common name is downloaded for each language. Therefore, make sure that you download each language pack to a separate folder on the hard disk so that you don't overwrite a language pack of a different language.
> [!IMPORTANT]
> By default, the Microsoft PowerShell Help files are installed in English (en-us). To view these files in the same language as the operating system, install the language pack for the same language in which the operating system was installed.
You can download language packs from the same location where you downloaded Shar
## Installing language packs on the web and application servers <a name="section4"> </a>
-After you install the necessary language files on the web and application servers, you can install the language packs. Language packs are available as individual downloads (one download for each supported language). If you have a server farm environment and you are installing language packs to support multiple languages, you must install the language packs on each web and application server.
+After you install the necessary language files on the web and application servers, you can install the language packs. Language packs are available as individual downloads (one download for each supported language). If you have a server farm environment and you're installing language packs to support multiple languages, you must install the language packs on each web and application server.
> [!IMPORTANT]
> The language pack is installed in its native language. The procedure that follows is for the English language pack.
4. The Setup wizard runs and installs the language pack.
-5. Rerun the SharePoint Products Configuration Wizard by using the default settings. If you do not run the SharePoint Products Configuration Wizard after you install a language pack, the language pack will not be installed correctly.
+5. Rerun the SharePoint Products Configuration Wizard by using the default settings. If you don't run the SharePoint Products Configuration Wizard after you install a language pack, the language pack won't be installed correctly.
- The SharePoint Products Configuration Wizard runs in the language of the base installation of SharePoint 2013, not in the language of the language pack that you just installed.
+ The SharePoint Products Configuration Wizard runs in the language of the base installation of SharePoint 2013, not in the language of the language pack that you installed.
**To rerun the SharePoint 2013 Configuration Wizard**
5. On the **Modify Server Farm Settings** page, click **Do not disconnect from this server farm**, and then click **Next**.
-6. If the **Modify SharePoint Central Administration Web Administration Settings** page appears, do not change any of the default settings, and then click **Next**.
+6. If the **Modify SharePoint Central Administration Web Administration Settings** page appears, don't change any of the default settings, and then click **Next**.
7. After you complete the Completing the SharePoint Products and Technologies Configuration Wizard, click **Next**.
9. After you install a new language pack and rerun the SharePoint 2013 Configuration Wizard, you must deactivate and then reactivate any language-specific features before you use the new language pack.
-When you install language packs, the language-specific site templates are installed in the %COMMONPROGRAMFILES%\Microsoft Shared\Web server extensions\15\TEMPLATE\ _LanguageID_ directory, where _LanguageID_ is the Language ID number for the language that you are installing. For example, the United States English language pack installs to the %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\15\TEMPLATE\1033 directory. After you install a language pack, site owners and site collection administrators can create sites and site collections based on the language-specific site templates by specifying a language when they are creating a new SharePoint site or site collection.
+When you install language packs, the language-specific site templates are installed in the %COMMONPROGRAMFILES%\Microsoft Shared\Web server extensions\15\TEMPLATE\ _LanguageID_ directory, where _LanguageID_ is the Language ID number for the language that you're installing. For example, the United States English language pack installs to the %COMMONPROGRAMFILES%\Microsoft Shared\Web Server Extensions\15\TEMPLATE\1033 directory. After you install a language pack, site owners and site collection administrators can create sites and site collections based on the language-specific site templates by specifying a language when they're creating a new SharePoint site or site collection.
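The folder layout described above can be sketched as a small helper. This is hypothetical illustration only; 1033 is the standard Windows LCID for English (United States):

```python
# Hypothetical helper illustrating the documented folder layout: each
# language pack installs its site templates under TEMPLATE\<LanguageID>.
COMMON = r"%COMMONPROGRAMFILES%\Microsoft Shared\Web server extensions\15"

def template_path(language_id: int) -> str:
    # For example, 1033 is the LCID for English (United States)
    return rf"{COMMON}\TEMPLATE\{language_id}"

print(template_path(1033))
```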
## Uninstalling language packs <a name="section5"> </a>
-If you no longer have to support a language for which you have installed a language pack, you can remove the language pack by using the Control Panel. Removing a language pack removes the language-specific site templates from the computer. All sites that were created that have those language-specific site templates will no longer work (It could cause many issues that is, the URL will produce a HTTP 500 - Internal server error page, broken layout, mixture of default, and uninstalled language.). Reinstalling the language pack will make the site functional again.
+If you no longer have to support a language for which you have installed a language pack, you can remove the language pack by using the Control Panel. Removing a language pack removes the language-specific site templates from the computer. All sites that were created with those language-specific site templates will no longer work. (This can cause many issues; for example, the URL produces an HTTP 500 - Internal server error page, a broken layout, or a mixture of the default and the uninstalled language.) Reinstalling the language pack makes the site functional again.
-You cannot remove the language pack for the version of SharePoint 2013 that you have installed on the server. For example, if you are running the Japanese version of SharePoint 2013, you cannot uninstall the Japanese language support for SharePoint 2013.
+You can't remove the language pack for the version of SharePoint 2013 that you have installed on the server. For example, if you're running the Japanese version of SharePoint 2013, you can't uninstall the Japanese language support for SharePoint 2013.
SharePoint How To Change The Order In Which Search Results Are Displayed https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/search/how-to-change-the-order-in-which-search-results-are-displayed.md
description: "Learn how to change the order in which search results are displaye
In the series [How to change the way search results are displayed in SharePoint Server](how-to-change-the-way-search-results-are-displayed.md) we explained how to customize the way search results are displayed by adding custom icons and properties.
-When it comes to displaying search results, design and content are indeed very important. However, there is one thing that often trumps them both: the order in which search results are displayed.
+When it comes to displaying search results, design and content are indeed important. However, there's one thing that often trumps them both: the order in which search results are displayed.
Think of your own behavior when looking at search results. How often do you click to view the second page of search results? Often, the answer is "rarely."
-So, when displaying search results, it is important that the results that your users are looking for are displayed as high up in the search results list as possible. This article, an addendum to the [How to change the way search results are displayed in SharePoint Server](how-to-change-the-way-search-results-are-displayed.md) series, explains how to use a query rule to change the order in which classic search results are displayed. To demonstrate how query rules work, we'll use an example from an internal Microsoft Search Center.
+So, when displaying search results, it's important that the results that your users are looking for are displayed as high up in the search results list as possible. This article, an addendum to the [How to change the way search results are displayed in SharePoint Server](how-to-change-the-way-search-results-are-displayed.md) series, explains how to use a query rule to change the order in which classic search results are displayed. To demonstrate how query rules work, we'll use an example from an internal Microsoft Search Center.
In this article, you'll learn:
As you know, Microsoft publishes thousands of articles across TechNet, MSDN, and Office.com. To help in the publishing process, we use several SharePoint lists. Each item in a list represents an article or a media file. To make it easy to find information about a particular list item, we created a Search Center that searches across these lists.
-The following screen shot shows the default order in which search results were displayed in our Search Center. Notice that search results for articles and images were displayed in a mixed order.
+The following screenshot shows the default order in which search results were displayed in our Search Center. Notice that search results for articles and images were displayed in a mixed order.
![Search Results Default Order](../media/OTCSP_BeforeResults.png)
-When users search for something in this Search Center, they are usually looking for information about an article. So, to make it easier for users to find information about articles, we wanted to change the order of the search results so that images would be displayed at the bottom. To do this, we had to create a query rule.
+When users search for something in this Search Center, they're usually looking for information about an article. So, to make it easier for users to find information about articles, we wanted to change the order of the search results so that images would be displayed at the bottom. To do this, we had to create a query rule.
## When using query rules: define before you assign <a name="BKMK_WhenUsingQueryRulesDefineBeforeYouAssign"> </a>
A query rule is largely what the name implies: a rule that can be applied to que
Basically, you have to define two things: a condition and an action. Simply put, this comes down to defining the following:
- *"when X (condition), do Y (action)".*
+ *"when X (condition), do Y (action)."*
-In our Search Center scenario, we knew the action part: *Display list items that represent images at the bottom of the search results list* .
+In our Search Center scenario, we knew the action part: *Display list items that represent images at the bottom of the search results list*.
-In our lists, we use the site column *Content Type* to differentiate between the type of articles or media types a list item represents. For example, all images have the value "Art" for *Content Type* .
+In our lists, we use the site column *Content Type* to differentiate between the type of articles or media types a list item represents. For example, all images have the value "Art" for *Content Type*.
![Art Content Type](../media/OTCSP_ArtListItem.png)

Based on this, we were able to define the condition part so that our final definition was:
- *When list items are of Content Type "Art", display these at the end of the search results list.*
+ *When list items are of Content Type "Art," display these at the end of the search results list.*
So, with the definition in place, we could begin to create the query rule that would make this happen.
To save space, we'll only show you how to create a query rule as a Site collecti
4. On the **Add Query Rule** page, in the **Rule name** field, enter a name for the query rule.
- In our Search Center scenario, we named the query rule *Demote Art* .
+ In our Search Center scenario, we named the query rule *Demote Art*.
![Query Rule Name](../media/OTCSP_QueryRuleName.jpg)
![Remove Condition](../media/OTCSP_RemoveCondition.jpg)
-6. In the Actions section, specify what you want the query rule to do when it is triggered.
+6. In the Actions section, specify what you want the query rule to do when it's triggered.
In our Search Center scenario, we selected **Change ranked results by changing the query**. This opened a dialog where we could define what we wanted the query rule to do.
![Manual Condition](../media/OTCSP_ManualCondition.jpg)
- Remember, we wanted list items of Content Type *Art* to be displayed at the end of the search results list. So, in the **Manual condition** field, we entered *ContentType:Art* , and selected **Demote to bottom**.
+ Remember, we wanted list items of Content Type *Art* to be displayed at the end of the search results list. So, in the **Manual condition** field, we entered *ContentType:Art*, and selected **Demote to bottom**.
![Demote content](../media/DemoteToBottom.jpg)
- **ContentType** is the managed property that represents the site column Content Type. [How to display values from custom managed properties in search results - option 1 in SharePoint Server](display-values-custom-managed-properties.md) explains how to find managed property names.
- - The colon : means "contains".
+ - The colon (:) means "contains".
- **Art** is the managed property value.
- **Demote to bottom** is the action that should be taken.
- Put it together, and it matches the definition we specified: *When list items are of Content Type "Art", display these at the end of the search results list* .
+ Put it together, and it matches the definition we specified: *When list items are of Content Type "Art," display these at the end of the search results list*.
8. Select **OK**, and then **Save**.
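For reference, a demotion like this is implemented by rewriting the query with an XRANK clause that applies a large negative constant boost. The exact query text that SharePoint generates may differ; the following is only a sketch of the shape of the rewritten query, with an illustrative boost value:

```
{searchTerms} XRANK(cb=-100000) ContentType:Art
```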
When we now entered a search in the Search Center, we could see that articles we
## How do I know that the query rule's been applied? <a name="BKMK_HowdoIKnowthattheQueryRulesBeenApplied"> </a>
-In our Search Center scenario, we could easily verify that the query rule we created was being applied. But, if you are uncertain about whether your query rule is being applied, the **Search Results Web Part** can give you an answer.
+In our Search Center scenario, we could easily verify that the query rule we created was being applied. But, if you're uncertain about whether your query rule is being applied, the **Search Results Web Part** can give you an answer.
Here are the steps to verify that a query rule is being applied:
In our Search Center scenario, we could verify that our query rule was working b
1. In the field **Applied query rules**, the name of our query rule, Demote Art, was shown.
-2. In the **Query text** section, XRANK was applied to *ContentType:Art* .
+2. In the **Query text** section, XRANK was applied to *ContentType:Art*.
## Think two times before you apply a query rule <a name="BKMK_ThinkTwiceBeforeApplyingaQueryRule"> </a>
-Even though this was a fairly simple query rule, we saw that the effect was very noticeable. So a word of warning: even though query rules are great for changing the order in which classic search results are displayed, you should think carefully before you apply too many of them. The effects can be very large, and the more complex query rules that you have, the more performance resources each query will require.
+Even though this was a fairly simple query rule, we saw that the effect was noticeable. So a word of warning: even though query rules are great for changing the order in which classic search results are displayed, you should think carefully before you apply too many of them. The effects can be large, and the more complex query rules that you have, the more performance resources each query will require.
-But, if they are used with caution, you can make the users of your Search Center very happy customers.
+But, if they're used with caution, you can make the users of your Search Center happy customers.
SharePoint How To Display Values From Custom Managed Properties In Search Resultsoption 2 https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/search/how-to-display-values-from-custom-managed-properties-in-search-resultsoption-2.md
description: "Learn a second option for displaying values from custom managed pr
[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
-In [How to display values from custom managed properties in search results - option 1 in SharePoint Server](display-values-custom-managed-properties.md) we showed a simple method to add a custom icon and values from two custom managed properties to your classic search results. In this topic, we'll look at a somewhat fuller method for changing the way classic search results are displayed that includes if statements and hit highlighting. In this article, you'll learn:
+In [How to display values from custom managed properties in search results - option 1 in SharePoint Server](display-values-custom-managed-properties.md) we showed a simple method to add a custom icon and values from two custom managed properties to your classic search results. In this article, we'll look at a fuller method for changing the way classic search results are displayed that includes if statements and hit highlighting. You'll learn:
- [Strategy for killing three birds with one stone - search results version](how-to-display-values-from-custom-managed-properties-in-search-resultsoption-2.md#BKMK_StrategyforKillingThreeBirdsWithOneStoneSearchResultsVersion)
First, let's state what we want to achieve:
- Get automatically improved relevancy for our classic search results.
-Before we look at details about how to achieve these goals, let's look at the strategy we want to follow. If this gets a bit complex, please try to hang in there. Hopefully it'll be clear by the end.
+Before we look at details about how to achieve these goals, let's look at the strategy we want to follow. If this gets a bit complex, please try to hang in there. Hopefully it will be clear by the end.
First, remember how we can think about hit highlighting:
Specifically, that means that we have to do the following:
- Add the custom managed properties to an item display template.
-- In the item display template, create a variable that will be used by the property *HitHighlightedSummary* to display our two custom managed properties with hit highlighting.
+- In the item display template, create a variable that will be used by the property *HitHighlightedSummary* to display our two custom managed properties with hit highlighting.
- In the item display template, leave the reference `_#=ctx.RenderBody(ctx)=#_` so that the *Item_CommonItem_Body* display template will render the search result. This makes sure that we get automatically improved relevancy.
Next, you have to do some configuration on the **Search Results Web Part**. Here
![Highlighted Properties Added](../media/OTCSP_HighlightedPropertiesAdded.png)
-5. Select **Apply** to save the changes. Thehe **Display Templates** section closes.
+5. Select **Apply** to save the changes. The **Display Templates** section closes.
6. To reopen the section, select **Display Templates**, and select **Use result types to display items**.
Next, you have to do some configuration on the **Search Results Web Part**. Here
Next, you have to create variables in the item display template that will be used and rendered by the *Item_CommonItem_Body* display template. Here's what you should do:
-10. Because you have no guarantee that the values of your custom properties will contain any of the entered query words, that is, hit highlighting won't be used, you have to create variables that guarantee that that the value of your custom properties will be displayed regardless of hit highlighting.
+10. Because you have no guarantee that the values of your custom properties will contain any of the entered query words, that is, hit highlighting won't be used, you have to create variables that guarantee that the value of your custom properties will be displayed regardless of hit highlighting.
- The following screen shots show how we created two such variables for our custom properties *ContentSummaryOWSMTXT* and *owstaxIdTechnicalSubject*.
+ The following screenshots show how we created two such variables for our custom properties *ContentSummaryOWSMTXT* and *owstaxIdTechnicalSubject*.
![Two Variables](../media/OTCSP_TwoVariables.png)
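The idea in step 10 can be sketched in plain JavaScript. This is an illustrative sketch only; the function and variable names are hypothetical, not the article's actual display template code. It shows the fallback the variables guarantee: prefer the hit-highlighted fragment, but show the raw managed property value when no query words matched.

```javascript
// Hypothetical sketch of the step-10 fallback (names are illustrative):
// prefer the hit-highlighted fragment, but guarantee the raw property
// value is displayed when hit highlighting produced nothing.
function pickDisplayValue(hitHighlighted, rawValue) {
  if (hitHighlighted && hitHighlighted.trim().length > 0) {
    return hitHighlighted; // query words matched; show the highlighted text
  }
  return rawValue || ""; // no match; show the plain property value
}
```

In a real display template, the two inputs would come from the item's managed properties, such as *ContentSummaryOWSMTXT* and *owstaxIdTechnicalSubject*.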
After we made these changes, when users entered a query in the Search Center, th
- The value of *ContentSummaryOWSMTXT* with hit highlighting
-- The value of *owstaxIdTechnicalSubject* (The query words did not match the property value, but because of the variable that we created in step 10, the value appears.)
+- The value of *owstaxIdTechnicalSubject* (The query words didn't match the property value, but because of the variable that we created in step 10, the value appears.)
- A link to the item in the list
-We wanted to make one little change to how the value for *owstaxIdTechnicalSubject* appears. We wanted to give users a bit more context as to what this value represents. Therefore, we decided to add the text "Technical Subject:" before the value. Also, because this value is not always present for all list items, we decided it should only display when a value was present.
+We wanted to make one little change to how the value for *owstaxIdTechnicalSubject* appears. We wanted to give users a bit more context as to what this value represents. Therefore, we decided to add the text "Technical Subject:" before the value. Also, because this value isn't always present for all list items, we decided it should only display when a value was present.
To do this, we made a change to the variable that overrides the *HitHighlightedSummary* property:
Note that we added a slightly different color to the text "Technical Subject:".
![Final Search Result](../media/OTCSP_FinalSearchResult.png)
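The conditional label described above can be sketched like this. The function name, markup, and color value are illustrative assumptions, not the article's actual template code; the point is that the label only renders when the property has a value.

```javascript
// Hypothetical helper: prefix the owstaxIdTechnicalSubject value with a
// label, but emit nothing when the list item has no value for it.
function formatTechnicalSubject(value) {
  if (!value) {
    return ""; // value absent: display nothing, per the requirement above
  }
  return '<span style="color:#777">Technical Subject:</span> ' + value;
}
```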
-In [How to create a new result type in SharePoint Server](how-to-create-a-new-result-type.md), we had decided we wanted 6 different result types. After creating the *TechNet content* result type and display template, it was very easy to copy this work over to the other 5 result types.
+In [How to create a new result type in SharePoint Server](how-to-create-a-new-result-type.md), we had decided we wanted six different result types. After creating the *TechNet content* result type and display template, it was easy to copy this work over to the other five result types.
And here's the result:
SharePoint How To Display Values From Custom Managed Properties In The Hover Panel https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/search/how-to-display-values-from-custom-managed-properties-in-the-hover-panel.md
By default, the rendering of the hover panel is performed by the three common ho
![Default Rendering](../media/OTCSP_DefaultRendering.png)
-To make life as easy as possible when you are adding custom properties to your hover panel, you should leave these three common hover panel display templates as they are, and instead concentrate on the result type specific hover panel display template (highlighted in the illustration below). That's what we did in our Search Center scenario, and it's what we'll demonstrate in this article.
+To make life as easy as possible when you're adding custom properties to your hover panel, you should leave these three common hover panel display templates as they are, and instead concentrate on the result type specific hover panel display template (highlighted in the illustration below). That's what we did in our Search Center scenario, and it's what we'll demonstrate in this article.
![Hover Panel Display Template](../media/HoverPanelDisplayTemplate.png)
-This may seem confusing now, but we'll show you all the the steps that are required in the next two sections. So let's get started!
+This may seem confusing now, but we'll show you all the steps that are required in the next two sections. So let's get started!
## How to copy an existing hover panel display template <a name="BKMK_HowtoCopyanExistingHoverPanelDisplayTemplate"> </a>
-Remember when we created the custom item display template *TechNet content* , we started by copying the item display template named *Item_Default* (see [How to create a new result type in SharePoint Server](how-to-create-a-new-result-type.md) for more information). The *Item_Default* display template contains a reference to *the Item_Default_HoverPanel* hover panel display template. Because we copied the *Item_Default* display template, our *TechNet content* display template also contains a reference to the *Item_Default_HoverPanel* .
+Remember when we created the custom item display template *TechNet content*, we started by copying the item display template named *Item_Default* (see [How to create a new result type in SharePoint Server](how-to-create-a-new-result-type.md) for more information). The *Item_Default* display template contains a reference to the *Item_Default_HoverPanel* hover panel display template. Because we copied the *Item_Default* display template, our *TechNet content* display template also contains a reference to the *Item_Default_HoverPanel*.
![Item Default Link](../media/OTCSP_Item_DefaultLink.png)
-We wanted to use the *Item_Default_HoverPanel* hover panel display template as a basis when we added custom properties to our hover panel. Therefore, in our mapped network drive, we copied the *Item_Default_HoverPanel* display template
+We wanted to use the *Item_Default_HoverPanel* hover panel display template as a basis when we added custom properties to our hover panel. Therefore, in our mapped network drive, we copied the *Item_Default_HoverPanel* display template.
![Item Default Displayed](../mediefaultCopy.png)
-and gave it a new name: *TechNet_Content_HoverPanel* .
+And gave it a new name: *TechNet_Content_HoverPanel*.
![TechNet Hover Panel](../media/OTCSP_TechNetHoverPanel.png)
We wanted to add the values from the following four site columns to the hover pa
- Submission Contact
-The following screen shot shows how these values are maintained for one item in our internal list.
+The following screenshot shows how these values are maintained for one item in our internal list.
![List Item](../media/OTCSP_ListItem.png)
When adding custom properties to a hover panel, we have to add them to the **item display template** (highlighted in the illustration below).
-Again, because this is not really intuitive: *When adding custom properties to a hover panel, we have to add them to the item display template*.
+Again, because this isn't intuitive: *When adding custom properties to a hover panel, we have to add them to the item display template*.
![Result Type Specific DT](../media/OTCSP_ResultTypeSpecificDT.png)
By doing a new search and hovering over a search result, we saw that the four cu
![Custom Properties Displayed](../media/OTCSP_CustomPropertiesDisplayed.png)
-But, we are not completely through yet. The values for *Internal Writer* and *Submission Contact* appeared differently. The screen shot might not show it clearly, but hopefully you can see that the value for *Internal Writer* appeared well, but the value for *Submission Contact* was very long and contained an ugly GUID.
+But, we aren't completely through yet. The values for *Internal Writer* and *Submission Contact* appeared differently. The screenshot might not show it clearly, but hopefully you can see that the value for *Internal Writer* appeared well, but the value for *Submission Contact* was long and contained an ugly GUID.
Both these values come from a site column of type **Person or Group**. The difference is that in the site column settings, *Internal Writer* is configured to show **Name**, whereas *Submission Contact* is configured to show **Name (with presence)**.
To make *Submission Contact* appear correctly, we copied the **HP.GetAuthorsHtml
![Authors Method](../media/OTCSP_AuthorsMethod.png)
-And now the hover panel was starting to look really good.
+And now the hover panel was starting to look good.
![Final Hover Panel](../media/OTCSP_FinalHoverPanel.png)
-But to make the hover panel even more helpful, we wanted to add an action to the bottom of the hover panel. will show how to do this this.
+But to make the hover panel even more helpful, we wanted to add an action to the bottom of the hover panel. The next article in this series shows how to do this.
### Next article in this series
SharePoint Federal Information Processing Standard Security Standards https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/security-for-sharepoint-server/federal-information-processing-standard-security-standards.md
description: "Learn about the Federal Information Processing Standard (FIPS) wit
[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
-SharePoint Server uses several Windows encryption algorithms for computing hash values that do not comply with Federal Information Processing Standard (FIPS) 140-2, *Security Requirements for Cryptographic Modules* . These algorithms are not used for security purposes; they are used for internal processing. For example, SharePoint Server uses MD5 to create hash values that are used as unique identifiers.
+SharePoint Server uses several Windows encryption algorithms for computing hash values that don't comply with Federal Information Processing Standard (FIPS) 140-2, *Security Requirements for Cryptographic Modules*. These algorithms aren't used for security purposes; they're used for internal processing. For example, SharePoint Server uses MD5 to create hash values that are used as unique identifiers.
<a name="intro"> </a>
-Because SharePoint Server uses these algorithms, it does not support the Windows security policy setting that requires FIPS compliant algorithms for encryption and hashing. This Windows security policy is managed through the **FIPSAlgorithmPolicy** registry key in Windows, which is described in the "Configure FIPS policy for a mixed environment" section of the following topic:
+Because SharePoint Server uses these algorithms, it doesn't support the Windows security policy setting that requires FIPS-compliant algorithms for encryption and hashing. This Windows security policy is managed through the **FIPSAlgorithmPolicy** registry key in Windows, which is described in the "Configure FIPS policy for a mixed environment" section of the following article:
- [Additional System Countermeasures](/previous-versions/windows/it-pro/windows-vista/cc766392(v=ws.10))
FIPS 140-2 defines security standards that the United States and Canadian govern
- [FIPS Publications](https://go.microsoft.com/fwlink/p/?LinkId=209157)
-The goal of FIPS is to provide a standardized way to ensure the security and privacy of sensitive information in computer systems of the United States and Canadian governments. Using a FIPS compliant algorithm for encryption of data over an open network is a key requirement for FISMA certification. The Windows FIPSAlgorithmPolicy registry key is neither necessary nor sufficient for FISMA certification, it is a useful enforcement tool for many solutions, but not SharePoint Server.
+The goal of FIPS is to provide a standardized way to ensure the security and privacy of sensitive information in computer systems of the United States and Canadian governments. Using a FIPS-compliant algorithm for encryption of data over an open network is a key requirement for FISMA certification. The Windows FIPSAlgorithmPolicy registry key is neither necessary nor sufficient for FISMA certification; it's a useful enforcement tool for many solutions, but not for SharePoint Server.
The FIPS contribution to FISMA certification is the strength of encryption used for security purposes. Security-related encryption within SharePoint Server is performed by using FIPS-compliant cipher suites.
-For additional information about FISMA see,[Federal Information Security Management Act (FISMA) Implementation Project](https://go.microsoft.com/fwlink/?LinkId=242329)
+For more information about FISMA, see [Federal Information Security Management Act (FISMA) Implementation Project](https://go.microsoft.com/fwlink/?LinkId=242329).
SharePoint Alternate Access Urls Have Not Been Configured https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/alternate-access-urls-have-not-been-configured.md
Title: "Alternate access URLs have not been configured (SharePoint Server)"
+ Title: "Alternate access URLs haven't been configured (SharePoint Server)"
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 7f65b235-8e0d-48d8-acca-e2e0295e6522
-description: "Learn how to resolve the SharePoint Health Analyzer rule: Alternate access URLs have not been configured, for SharePoint Server."
+description: "Learn how to resolve the SharePoint Health Analyzer rule: Alternate access URLs haven't been configured, for SharePoint Server."
-# Alternate access URLs have not been configured (SharePoint Server)
+# Alternate access URLs haven't been configured (SharePoint Server)
[!INCLUDE[appliesto-2013-xxx-xxx-xxx-xxx-md](../includes/appliesto-2013-xxx-xxx-xxx-xxx-md.md)]
>[!IMPORTANT]
>This health analyzer rule only applies to SharePoint 2010 as this was removed in [KB4011601](https://support.microsoft.com/help/4011601) for SharePoint Server 2013 and [KB4011576](https://support.microsoft.com/help/4011576) for SharePoint Server 2016.
- **Rule Name:** Alternate access URLs have not been configured.
+ **Rule Name:** Alternate access URLs haven't been configured.
- **Summary:** A default zone URL must not point to the computer name of a front-end Web server. Because this installation has more than one front-end Web server, an incorrectly configured default zone URL can result in a variety of errors, including incorrect links and failed operations.
+ **Summary:** A default zone URL must not point to the computer name of a front-end Web server. Because this installation has more than one front-end Web server, an incorrectly configured default zone URL can result in various errors, including incorrect links and failed operations.
**Cause:** A default zone URL is pointing to the computer name of a front-end Web server.
description: "Learn how to resolve the SharePoint Health Analyzer rule: Alternat
1. Verify that the user account that is performing this procedure is a member of the Farm Administrators group.
-2. On the the SharePoint Central Administration website home page, in the **System Settings** section, click **Configure alternate access mappings**.
+2. On the SharePoint Central Administration website home page, in the **System Settings** section, click **Configure alternate access mappings**.
3. On the Alternate Access Mappings page, on the **Alternate Access Mapping Collection** menu, click **Change Alternate Access Mapping Collection**.
SharePoint Database Has Large Amounts Of Unused Space https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/database-has-large-amounts-of-unused-space.md
description: "Learn how to resolve the SharePoint Health Analyzer rule: Database
**Rule Name:** Database has large amounts of unused space.
- **Summary:** The database has large amounts of unused space allocated on the disk. This database uses a large amount of space on the file system unless it is shrunk down to a small size. This event occurs if the unused space is more than 20% of the disk space and the unused space is greater than the auto-growth size plus 50 megabytes (MB).
+ **Summary:** The database has large amounts of unused space allocated on the disk. This database uses a large amount of space on the file system unless it's shrunk down to a small size. This event occurs if the unused space is more than 20% of the disk space and the unused space is greater than the autogrowth size plus 50 megabytes (MB).
**Cause:** Many activities can create unused space in the database. These activities include running the Windows PowerShell [Move-SPSite](/powershell/module/sharepoint-server/Move-SPSite?view=sharepoint-ps&preserve-view=true) command, and deleting documents, document libraries, lists, list items, and sites.
**Resolution: Ignore this event, or shrink the database if you have to.**
-- Normally you can safely ignore this event. You shrink a database only if it proves absolutely necessary — for example, when you have performed an operation that removes a very large quantity of data from a database, and the free space is not expected to be used again. You can shrink the database by using the DBCC ShrinkDatabase command or SQL Server Management Studio. For more information, see [DBCC SHRINKDATABASE (Transact-SQL)](/sql/t-sql/database-console-commands/dbcc-shrinkdatabase-transact-sql) (https://go.microsoft.com/fwlink/p/?LinkID=110852) and [Shrink a Database](/sql/relational-databases/databases/shrink-a-database?viewFallbackFrom=sql-server-2014) (https://go.microsoft.com/fwlink/p/?LinkID=224904).
+- Normally you can safely ignore this event. You shrink a database only if it proves necessary; for example, when you have performed an operation that removes a large quantity of data from a database, and the free space isn't expected to be used again. You can shrink the database by using the DBCC ShrinkDatabase command or SQL Server Management Studio. For more information, see [DBCC SHRINKDATABASE (Transact-SQL)](/sql/t-sql/database-console-commands/dbcc-shrinkdatabase-transact-sql) (https://go.microsoft.com/fwlink/p/?LinkID=110852) and [Shrink a Database](/sql/relational-databases/databases/shrink-a-database?viewFallbackFrom=sql-server-2014) (https://go.microsoft.com/fwlink/p/?LinkID=224904).
- The white paper [Database maintenance for SharePoint](https://go.microsoft.com/fwlink/p/?LinkID=229104) provides very important guidelines for shrinking a database. We strongly recommend that you read this white paper before you shrink a database.
+ The white paper [Database maintenance for SharePoint](https://go.microsoft.com/fwlink/p/?LinkID=229104) provides important guidelines for shrinking a database. We strongly recommend that you read this white paper before you shrink a database.
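As a command-fragment sketch, the DBCC command mentioned above might be used as follows. The database name and target free-space percentage are placeholders, and this is not a recommendation to shrink; review the white paper first.

```sql
-- Placeholder database name; 10 is the target percent of free space
-- to leave in the database files after the shrink completes.
DBCC SHRINKDATABASE (N'WSS_Content', 10);
```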
## See also
SharePoint Databases Within This Farm Are Set To Read Only And Will Fail To Upgrade Unless https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/databases-within-this-farm-are-set-to-read-only-and-will-fail-to-upgrade-unless.md
Title: "Databases within this farm are set to read only and will fail to upgrade unless it is set to a read-write state (SharePoint Server)"
+ Title: "Databases within this farm are set to read only and will fail to upgrade unless it's set to a read-write state (SharePoint Server)"
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 85aa1c35-1322-42ad-a625-1496aee67858
-description: "Learn how to resolve the SharePoint Health Analyzer rule: Databases within this farm are set to read only and will fail to upgrade unless it is set to a read-write state, for SharePoint Server."
+description: "Learn how to resolve the SharePoint Health Analyzer rule: Databases within this farm are set to read only and will fail to upgrade unless it's set to a read-write state, for SharePoint Server."
-# Databases within this farm are set to read only and will fail to upgrade unless it is set to a read-write state (SharePoint Server)
+# Databases within this farm are set to read only and will fail to upgrade unless it's set to a read-write state (SharePoint Server)
[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
- **Rule Name:** Databases within this farm are set to read only and will fail to upgrade unless it is set to a read-write state.
+ **Rule Name:** Databases within this farm are set to read only and will fail to upgrade unless it's set to a read-write state.
- **Summary:** The databases are set to read-only and cannot be upgraded.
+ **Summary:** The databases are set to read-only and can't be upgraded.
**Cause:** The databases are set to read-only.
SharePoint One Or More App Domains For Web Applications Aren T Configured Correctly https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/one-or-more-app-domains-for-web-applications-aren-t-configured-correctly.md
description: "Learn how to resolve the SharePoint Health Analyzer rule: One or m
**Rule Name:** One or more app domains for web applications aren't configured correctly.
- **Summary:** This health rule checks to see if the multiple app domains feature is enabled by looking at the state of the _Microsoft.SharePoint.Administration.SPWebService.ContentService.SupportMultipleAppDomains_ property. If this is enabled, the health rule then checks to see if there are multiple web application zones in each web application. If there are, it continues to check if there is an app domain defined for each web application zone. The health rule alert is triggered if the final condition is not met. It is also triggered if the web application and app domain are not using the same Internet Information Services (IIS) port binding, web application zone, application pool account, and authentication type.
+ **Summary:** This health rule checks to see if the multiple app domains feature is enabled by looking at the state of the _Microsoft.SharePoint.Administration.SPWebService.ContentService.SupportMultipleAppDomains_ property. If this is enabled, the health rule then checks to see if there are multiple web application zones in each web application. If there are, it continues to check if there's an app domain defined for each web application zone. The health rule alert is triggered if the final condition isn't met. It's also triggered if the web application and app domain aren't using the same Internet Information Services (IIS) port binding, web application zone, application pool account, and authentication type.
- **Cause:** The SharePoint Server environment is not set to use multiple app domain, or the web application is incorrectly configured for multiple web application zones.
+ **Cause:** The SharePoint Server environment isn't set to use multiple app domains, or the web application is incorrectly configured for multiple web application zones.
**Resolution:**
SharePoint One Or More Services Have Started Or Stopped Unexpectedly https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/one-or-more-services-have-started-or-stopped-unexpectedly.md
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: f91de311-e531-4a15-bcc4-b4af01774e0b
-description: "Learn how to resolve the SharePoint Health Analyzer rul: eOne or more services have started or stopped unexpectedly, for SharePoint Server."
+description: "Learn how to resolve the SharePoint Health Analyzer rule: One or more services have started or stopped unexpectedly, for SharePoint Server."
# One or more services have started or stopped unexpectedly (SharePoint Server)
description: "Learn how to resolve the SharePoint Health Analyzer rul: eOne or m
**Rule Name:** One or more services have started or stopped unexpectedly.
- **Summary:** A critical service required for the SharePoint farm to function is not running.
+ **Summary:** A critical service required for the SharePoint farm to function isn't running.
- **Cause:** One or more critical services are not running on the specified server.
+ **Cause:** One or more critical services aren't running on the specified server.
**Resolution: Start the service that is not running**
1. Verify that the user account that is performing this procedure is a member of the Administrators group on the local computer.
-2. In Server Manager, click **Tools**, and then click **Services**.
+2. In Server Manager, select **Tools**, and then select **Services**.
-3. Right-click the service that you want to start, and then click **Start**.
+3. Right-click the service that you want to start, and then select **Start**.
SharePoint Outbound E Mail Has Not Been Configured https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/outbound-e-mail-has-not-been-configured.md
Title: "Outbound e-mail has not been configured (SharePoint Server)"
+ Title: "Outbound e-mail hasn't been configured (SharePoint Server)"
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 53885793-4150-4212-af04-6ea2e6e066f7
-description: "Learn how to resolve the SharePoint Health Analyzer rule: Outbound email has not been configured, for SharePoint Server."
+description: "Learn how to resolve the SharePoint Health Analyzer rule: Outbound email hasn't been configured, for SharePoint Server."
-# Outbound e-mail has not been configured (SharePoint Server)
+# Outbound e-mail hasn't been configured (SharePoint Server)
[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
- **Rule Name:** Outbound email has not been configured.
+ **Rule Name:** Outbound email hasn't been configured.
- **Summary:** An outgoing email server has not been configured on this SharePoint Server deployment. With no SMPT server configured for outgoing email, SharePoint Server can't send email messages, including alert email, confirmation email, invitation email, and email about exceeding quotas.
+ **Summary:** An outgoing email server hasn't been configured on this SharePoint Server deployment. With no SMTP server configured for outgoing email, SharePoint Server can't send email messages, including alert email, confirmation email, invitation email, and email about exceeding quotas.
**Cause:** An SMTP email server hasn't yet been configured in the farm.
description: "Learn how to resolve the SharePoint Health Analyzer rule: Outbound
1. Verify that the user account that is performing this procedure is a member of the Farm Administrators group.
-2. On the SharePoint Central Administration website, click **System Settings**.
+2. On the SharePoint Central Administration website, select **System Settings**.
-3. On the System Settings page, in the **E-Mail and Text Messages (SMS)** section, click **Configure outgoing e-mail settings**.
+3. On the System Settings page, in the **E-Mail and Text Messages (SMS)** section, select **Configure outgoing e-mail settings**.
4. On the Outgoing E-Mail Settings page, type the SMTP server information in the **Outbound SMTP server** box, and then specify the addresses and the character set that you want to use.
-5. Click **OK**.
+5. Select **OK**.
## See also
-#### Concepts
+### Concepts
[Plan email integration for a SharePoint Server farm](../administration/email-integration-planning.md)
SharePoint People Search Relevance Is Not Optimized When The Active Directory Has Errors In https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/people-search-relevance-is-not-optimized-when-the-active-directory-has-errors-in.md
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 982d3ab2-d1a6-4625-b4c6-e12a1f4532d5
-description: "Learn how to resolve the SharePoint Health Analyzer rule: People Search relevance is not optimized when the Active Directory has errors in the manager reporting structure, for SharePoint Server."
+description: "Learn how to resolve the SharePoint Health Analyzer rule: People Search relevance isn't optimized when the Active Directory has errors in the manager reporting structure, for SharePoint Server."
# People Search relevance is not optimized when the Active Directory has errors in the manager reporting structure (SharePoint Server)
description: "Learn how to resolve the SharePoint Health Analyzer rule: People S
>[!IMPORTANT]
>This health analyzer rule only applies to SharePoint 2010 as this was removed in [KB4011601](https://support.microsoft.com/help/4011601) for SharePoint Server 2013 and [KB4011576](https://support.microsoft.com/help/4011576) for SharePoint Server 2016.
- **Rule Name:** People Search relevance is not optimized when the Active Directory has errors in the manager reporting structure.
+ **Rule Name:** People Search relevance isn't optimized when the Active Directory has errors in the manager reporting structure.
**Summary:** In Active Directory Domain Services (AD DS), only company leaders should have the **Manager** property set to NULL. If the **Manager** property is set to NULL for other users, people search relevance is reduced. To optimize people search relevance, explicitly specify company leaders. People search can then use this information to improve relevance.
- **Cause:** Company leaders have not been explicitly specified.
+ **Cause:** Company leaders haven't been explicitly specified.
**Resolution: Specify company leaders.**
description: "Learn how to resolve the SharePoint Health Analyzer rule: People S
- **db_owner** fixed database role on all databases that are to be updated.
- - Administrators group on the server on which you are running the Microsoft PowerShell cmdlets.
+ - Administrators group on the server on which you're running the Microsoft PowerShell cmdlets.
- Add memberships that are required beyond the minimums above. An administrator can use the **Add-SPShellAdmin** cmdlet to grant permissions to use SharePoint Server cmdlets.
> [!NOTE]
- > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
+ > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For more information about PowerShell permissions, see [Add-SPShellAdmin](/powershell/module/sharepoint-server/Add-SPShellAdmin?view=sharepoint-ps&preserve-view=true).
2. Start the SharePoint Management Shell.
description: "Learn how to resolve the SharePoint Health Analyzer rule: People S
Add-SPProfileLeader -ProfileServiceApplicationProxy $upaProxy -Name "<Domain\UserName>"
```
- where *\<Domain\UserName\>* is the user account that you want to add as a leader ΓÇö for example, Contoso\Joe.Healy. For more information, see [Add-SPProfileLeader](/powershell/module/sharepoint-server/Add-SPProfileLeader?view=sharepoint-ps&preserve-view=true).
+ where *\<Domain\UserName\>* is the user account that you want to add as a leader, for example, Contoso\Joe.Healy. For more information, see [Add-SPProfileLeader](/powershell/module/sharepoint-server/Add-SPProfileLeader?view=sharepoint-ps&preserve-view=true).
-5. You are prompted to confirm. Type **Y** to confirm.
+5. You're prompted to confirm. Type **Y** to confirm.
6. Run a full crawl on the content source that contains the start address (URL) of the User Profile application.
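Steps 2 through 5 above can be sketched end to end in the SharePoint Management Shell. This is a sketch only: locating the proxy by `TypeName` wildcard is an assumption about the farm, and `Contoso\Joe.Healy` is the example account from the article.

```powershell
# Sketch only: run in the SharePoint Management Shell on a farm server.
# Filtering the proxy by TypeName is an assumption; adjust the filter if
# your farm names the User Profile service application proxy differently.
$upaProxy = Get-SPServiceApplicationProxy | Where-Object {
    $_.TypeName -like "*User Profile*"
}

# Add the company leader (example account from the article).
Add-SPProfileLeader -ProfileServiceApplicationProxy $upaProxy -Name "Contoso\Joe.Healy"

# Optionally confirm the current leader list afterward.
Get-SPProfileLeader -ProfileServiceApplicationProxy $upaProxy
```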
SharePoint Productpatch Installation Or Server Upgrade Required https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/productpatch-installation-or-server-upgrade-required.md
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 89377b22-b1de-4f11-9a16-e54783c046fc
-description: "Learn how to resolve the SharePoint Health Analyzer rule: Product / patch installation or server upgrade required,,,for SharePoint Server."
+description: "Learn how to resolve the SharePoint Health Analyzer rule: Product / patch installation or server upgrade required for SharePoint Server."
# Product / patch installation or server upgrade required (SharePoint Server)

[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
- **Rule Name:** Product / patch installation or server upgrade required
+ **Rule Name:** Product / patch installation or server upgrade required.
**Summary:** You must install all required products on all servers in the farm, and all products should have the same software update and upgrade level across the farm.
- **Cause:** You have not installed required products or updates, or the server needs to be upgraded.
+ **Cause:** You haven't installed required products or updates, or the server needs to be upgraded.
**Resolution: Install software updates and security updates or upgrade the server.**
description: "Learn how to resolve the SharePoint Health Analyzer rule: Product
For more information, see [Deploy software updates for SharePoint Server 2016](../upgrade-and-update/deploy-updates-for-sharepoint-server-2016.md).
-If a previous upgrade attempt has failed, you must resolve upgrade issues before attempting upgrade again. You can use the SharePoint Central Administration website to find information about current and previous upgrade attempts and determine issues that may be preventing upgrade from succeeding. To do this in Central Administration, in the **Upgrade and Migration** section, click **Check upgrade status**.
+If a previous upgrade attempt has failed, you must resolve upgrade issues before attempting upgrade again. You can use the SharePoint Central Administration website to find information about current and previous upgrade attempts and determine issues that may be preventing upgrade from succeeding. To do this in Central Administration, in the **Upgrade and Migration** section, select **Check upgrade status**.
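As a rough alternative to checking upgrade status in Central Administration, similar information can be read from the SharePoint Management Shell. A minimal sketch, assuming you run it on a farm server:

```powershell
# Sketch only: check product/patch state and pending-upgrade status from
# the SharePoint Management Shell instead of Central Administration.
Get-SPProduct -Local   # refreshes and lists installed products and patches

# NeedsUpgrade is True when the local server still has an upgrade pending.
(Get-SPServer -Identity $env:COMPUTERNAME).NeedsUpgrade
```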
SharePoint The Security Token Service Is Not Available https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/the-security-token-service-is-not-available.md
description: "Learn how to resolve the SharePoint Health Analyzer rule: The Secu
[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
- **Rule Name:** The Security Token Service is not available.
+ **Rule Name:** The Security Token Service isn't available.
- **Summary:** The Security Token Service is not issuing tokens.
+ **Summary:** The Security Token Service isn't issuing tokens.
**Cause:** The service could be malfunctioning or in a bad state, some assemblies are missing when you deploy the custom claims provider, or the STS certificate has expired.
description: "Learn how to resolve the SharePoint Health Analyzer rule: The Secu
1. Verify that the user account that is performing this procedure is a member of the Farm Administrators group.
-2. Identify the server on which this event occurs. On the SharePoint Central Administration website, in the **Monitoring** section, click **Review problems and solutions**, and then find the name of the server in the **Failing Servers** column. If there are multiple failing servers in a server farm, you must repeat the following steps on each failing server.
+2. Identify the server on which this event occurs. On the SharePoint Central Administration website, in the **Monitoring** section, select **Review problems and solutions**, and then find the name of the server in the **Failing Servers** column. If there are multiple failing servers in a server farm, you must repeat the following steps on each failing server.
3. Verify that the user account that is performing the following steps is a member of the Administrators group on the local computer that you identified in the previous step.
4. Log on to the server on which this event occurs.
-5. Open **Server Manager**, click **Tools**, and then click **Internet Information Services (IIS) Manager**.
+5. Open **Server Manager**, select **Tools**, and then select **Internet Information Services (IIS) Manager**.
-6. In the Internet Information Services management console, in the **Connections** pane, expand the tree view, and then click **Application Pools**.
+6. In the Internet Information Services management console, in the **Connections** pane, expand the tree view, and then select **Application Pools**.
-7. In the **Application Pools** list, right-click **SecurityTokenServiceApplicationPool**, and then click **Start**. If the application pool is started already, click **Stop** and then, in the **Action** pane, click **Start** to restart it.
+7. In the **Application Pools** list, right-click **SecurityTokenServiceApplicationPool**, and then select **Start**. If the application pool is started already, select **Stop** and then, in the **Action** pane, select **Start** to restart it.
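The IIS Manager steps above can also be performed from an elevated PowerShell prompt on the failing server; the pool name below matches the default shown in step 7.

```powershell
# Sketch only: restart the Security Token Service application pool from
# an elevated prompt instead of clicking through IIS Manager.
Import-Module WebAdministration
Restart-WebAppPool -Name "SecurityTokenServiceApplicationPool"
```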
**Resolution: Install the missing assemblies into the global assembly cache (GAC) manually.**
description: "Learn how to resolve the SharePoint Health Analyzer rule: The Secu
**Resolution: Update the STS certificate**
- Confirm whether the STS certificate has expired by looking for Windows Application event log Event ID 8311 for source "SharePoint Foundation", category Topology, and with "NotTimeValid" in the message. This indicates an expired STS certificate. For more information on updating the STS certificate, please see [Replace the STS certificate for SharePoint Server](/SharePoint/administration/replace-the-sts-certificate).
+ Confirm whether the STS certificate has expired by looking for Windows Application event log Event ID 8311 for source "SharePoint Foundation," category Topology, and with "NotTimeValid" in the message. This indicates an expired STS certificate. For more information on updating the STS certificate, see [Replace the STS certificate for SharePoint Server](/SharePoint/administration/replace-the-sts-certificate).
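The event log check described above can be scripted. A sketch, with one assumption: the source name "SharePoint Foundation" is taken from the text, so verify it matches the source shown in Event Viewer on your server.

```powershell
# Sketch only: find Event ID 8311 entries whose message mentions
# "NotTimeValid" (an expired STS certificate). The source name is an
# assumption taken from the article text.
Get-EventLog -LogName Application -Source "SharePoint Foundation" |
    Where-Object { $_.EventID -eq 8311 -and $_.Message -match "NotTimeValid" } |
    Select-Object TimeGenerated, Message -First 5
```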
SharePoint Verify That The Critical User Profile Application And User Profile Proxy Applica https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/technical-reference/verify-that-the-critical-user-profile-application-and-user-profile-proxy-applica.md
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: bfe89420-2a60-4ce7-8b11-2b45fd38f822
-description: "Learn how to resolve the SharePoint Health Analyzer rule: Verify that the critical User Profile Application and User Profile Proxy Application timer jobs are available and have not been mistakenly deleted, for SharePoint Server."
+description: "Learn how to resolve the SharePoint Health Analyzer rule: Verify that the critical User Profile Application and User Profile Proxy Application timer jobs are available and haven't been mistakenly deleted, for SharePoint Server."
# Verify that the critical User Profile Application and User Profile Proxy Application timer jobs are available and have not been mistakenly deleted (SharePoint Server)

[!INCLUDE[appliesto-2013-2016-2019-SUB-xxx-md](../includes/appliesto-2013-2016-2019-SUB-xxx-md.md)]
- **Rule Name:** Verify that the critical User Profile Application and User Profile Proxy Application timer jobs are available and have not been mistakenly deleted.
+ **Rule Name:** Verify that the critical User Profile Application and User Profile Proxy Application timer jobs are available and haven't been mistakenly deleted.
- **Summary:** User Profile Service Application or User Profile service proxy timer jobs are not available and might have been deleted.
+ **Summary:** User Profile Service Application or User Profile service proxy timer jobs aren't available and might have been deleted.
 **Cause:** A required timer job for the User Profile service or the User Profile service proxy is missing.

 **Resolution: Edit the rule definition so that the configuration is automatically repaired.**
-1. On the SharePoint Central Administration website, click **Monitoring**.
+1. On the SharePoint Central Administration website, select **Monitoring**.
-2. On the Monitoring page, in the **Health Analyzer** section, click **Review rule definitions**.
+2. On the Monitoring page, in the **Health Analyzer** section, select **Review rule definitions**.
-3. On the Health Analyzer Rule Definitions - All Rules page, in the **Category: Configuration** section, click the name of the rule.
+3. On the Health Analyzer Rule Definitions - All Rules page, in the **Category: Configuration** section, select the name of the rule.
-4. In the **Health Analyzer Rule Definitions** dialog, click **Edit Item**.
+4. In the **Health Analyzer Rule Definitions** dialog, select **Edit Item**.
-5. Select the **Repair Automatically** check box, and then click **Save**.
+5. Select the **Repair Automatically** check box, and then select **Save**.
The system automatically creates the missing timer jobs.
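One way to confirm the repair took effect is to list the User Profile timer jobs from the SharePoint Management Shell. The wildcard filter below is illustrative, not the exact job names.

```powershell
# Sketch only: after the automatic repair, confirm the User Profile
# timer jobs are present again. "*Profile*" is an illustrative filter.
Get-SPTimerJob | Where-Object { $_.Name -like "*Profile*" } |
    Select-Object Name, LastRunTime
```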
SharePoint Video Cleanup Of Databases After Upgrade Procedure https://github.com/MicrosoftDocs/OfficeDocs-SharePoint/commits/public/SharePoint/SharePointServer/upgrade-and-update/video-cleanup-of-databases-after-upgrade-procedure.md
- IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: e0797f61-5b78-4a8d-92ea-17e9f5b5ab56
-description: "Learn how to use Windows Powershell scripts to assist in cleaning up SharePoint Server 2016 databases after a successful upgrade procedure."
+description: "Learn how to use Windows PowerShell scripts to help clean up SharePoint Server 2016 databases after a successful upgrade procedure."
# Video: Cleanup of databases after upgrade procedure
Listen to two Microsoft Field Engineers as they talk about best practices on how to clean up databases after an upgrade procedure.

> [!VIDEO https://www.microsoft.com/videoplayer/embed/42e6f458-0d0c-4c79-a6bd-58844f06f484?autoplay=false]
-For a list of the sample scripts to use, see [Post-upgrade scripts](https://gallery.technet.microsoft.com/sharepoint/Post-Upgrade-Cleanup-35099a7a)
+For a list of the sample scripts to use, see [Post-upgrade scripts](https://gallery.technet.microsoft.com/sharepoint/Post-Upgrade-Cleanup-35099a7a).
-For additional information on how to handle cleanup of a database, see [Post Upgrade Cleanup - Missing Server Side Dependencies](/archive/blogs/dawiese/post-upgrade-cleanup-missing-server-side-dependencies)
+For more information on how to handle cleanup of a database, see [Post Upgrade Cleanup - Missing Server Side Dependencies](/archive/blogs/dawiese/post-upgrade-cleanup-missing-server-side-dependencies).