Introduction and Business Case

I’m wrapping up a 6-month engagement as a Solution Architect and Business Analyst for a global company that runs capital build projects (adding a new production wing to an existing facility, or building a brand new production facility).  Capital build projects can cost millions of dollars, so gaining even small efficiencies across a handful of build projects annually can add up to substantial savings.

The goal of our project was to provide a solution that would allow build projects to store, organize and easily find build project documentation throughout (and after) the life of their build. A secondary goal was to help align build projects within this organization with their corporate build project process, and in doing so gain efficiencies that reduce project costs. Even a 0.5% efficiency gain is substantial: across ten $100MM USD build projects, that works out to $5MM in savings.  In addition, aligning with the corporate build project process can help projects create and maintain safe construction environments.

The solution’s core requirements were:
-Inject the company’s core build project methodology into the solution
-Allow for external sharing with other companies
-Help project teams store, organize and access project documents over the life of the project
-Allow programmatic creation of the solution (template) when new build projects begin (this is where the PNP Remote Provisioning Engine came in)
-Make processes that are labor intensive quick and easy

The solution was developed in O365, specifically SharePoint Online.  We chose this platform for a few reasons:

-SharePoint was immediately available.  No need to hunt for custom software (it’s out there), pilot it, wrap consulting services around it, purchase it and roll it out.  My project team – who live in this space – consider that a 1-2 year journey with fairly sizable costs.
-O365 allows for easy external sharing.
-SharePoint + minimal custom development met our requirements.

Pause.  These three reasons were the primary drivers for building a low-code solution on top of SharePoint Online.  Project cost:  $125K USD.  Time to completion:  6 months.  Compare this to finding, testing and acquiring custom software and you can see why going this route with SharePoint Online is compelling for an organization.

Technical Stuff for the Nerd in You (and me)
We used the following on this project:
-1 solution architect and BA (me)
-1 developer with SharePoint development, JavaScript, REST services, etc. experience
-PnP Remote Provisioning Engine (learn more here, here and here)
-Request list customized in InfoPath (this was the list Capital Build Projects would use to request a SharePoint site based on our template)
-Information Architecture built around the company’s build project methodology
-Term store with terms that support the company’s build project methodology
-Site columns
-20+ libraries that support the company’s build project methodology
-Content search web parts + metadata to build some cool pages that aggregate related files from across the 20+ libraries onto a single page (the files belonged in their own libraries but had a meaningful association that merited these pages)
-Automated subsite creation process (select type of subsite, libraries you want on subsite, etc.)
-2 other automated processes
-Custom branding

Notes on the PnP Remote Provisioning Engine
Our experience was that the PnP Provisioning Engine was strong when it came to provisioning structure (lists, libraries, site columns, etc.) and we were happy with the outcome.
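
For context, here is a minimal sketch of the basic extract-and-apply flow we are describing, using PnP PowerShell.  The site URLs and file paths are placeholders, and the cmdlet names are the ones from the PnP PowerShell module of that era (newer releases rename them to Get-PnPSiteTemplate / Invoke-PnPSiteTemplate).

```powershell
# Extract the structure of the source (template) site to a provisioning template XML file.
# URLs and paths are placeholders.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/BuildTemplate" -Credentials (Get-Credential)
Get-PnPProvisioningTemplate -Out "C:\Provisioning\BuildProject.xml"

# Apply that structure (lists, libraries, site columns, content types, etc.)
# to a newly created build project site.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/NewBuildProject" -Credentials (Get-Credential)
Apply-PnPProvisioningTemplate -Path "C:\Provisioning\BuildProject.xml"
```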

Here are a few of the limitations we ran into when using the PnP Provisioning Engine to create new sites based on a source (template) site.  Most of these appear to be limitations of our solution rather than of the PnP Remote Provisioning Engine itself.

Site Settings that did not Carry Over
Certain settings did not carry over to our target site. Most of these were not HUGE and we found workarounds.  One example of this was turning on content types in a document library, then editing the local document content type in the library to hide the Title column.  This setting did not come through on our target site collections.

Pages
Only the home page gets copied to the target site.  If your site template has multiple pages, as ours did, you may have to do some Kung Fu to get all your pages copied to your target site(s).  Our developer created a function that iterated through all of the pages, temporarily setting each one as the home page and copying it to the target site individually with the provisioning engine (a sketch of this approach follows).
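
A minimal sketch of that page-copying loop, assuming the pages live in the SitePages library and that only the Pages handler of the engine is needed for this step (page names, site URLs and the $cred variable are placeholders):

```powershell
# Iterate the source site's pages, temporarily promote each one to home page, extract just
# the Pages portion of the template, and apply it to the target site.
$pages = @("Home.aspx", "Schedule.aspx", "Safety.aspx", "Closeout.aspx")

foreach ($page in $pages) {
    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/BuildTemplate" -Credentials $cred
    Set-PnPHomePage -RootFolderRelativeUrl "SitePages/$page"            # promote this page
    Get-PnPProvisioningTemplate -Out "C:\Provisioning\$page.xml" -Handlers Pages

    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/NewBuildProject" -Credentials $cred
    Apply-PnPProvisioningTemplate -Path "C:\Provisioning\$page.xml"     # copies the "home" page over
}

# Put the real home page back on both sites when the loop is done.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/BuildTemplate" -Credentials $cred
Set-PnPHomePage -RootFolderRelativeUrl "SitePages/Home.aspx"
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/NewBuildProject" -Credentials $cred
Set-PnPHomePage -RootFolderRelativeUrl "SitePages/Home.aspx"
```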

Page Data
Only the web parts and data on the home page were applied to the target site by the PnP Provisioning Engine; data on the other pages was not.

Relative URLs
Some of the web parts on our home page linked to files using server-relative URL paths.  The provisioning engine was not able to apply these URLs to the web parts on the home page of the target site.  We created a PowerShell function to update the web part URL paths to the new site URL before applying the PnP template (a sketch is below).
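
One straightforward way to do that kind of rewrite is a plain string replacement over the exported template XML before it is applied; the sketch below assumes that approach, with placeholder paths and URLs:

```powershell
# Rewrite server-relative URLs in the exported template so the home page web parts
# point at files on the new site rather than the source site.
$templatePath            = "C:\Provisioning\BuildProject.xml"
$sourceServerRelativeUrl = "/sites/BuildTemplate"
$targetServerRelativeUrl = "/sites/NewBuildProject"

$xml = Get-Content -Path $templatePath -Raw
$xml = $xml.Replace($sourceServerRelativeUrl, $targetServerRelativeUrl)
Set-Content -Path $templatePath -Value $xml
```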

We have a script running on a server as a scheduled job every morning at 6 a.m.  The script reads a SharePoint Online list to see if there are new requests for our template.  If there are new requests, the script looks at a field called SharePoint URL to get the URL for the new site.  Since we know the URL before we provision our template, we can use PowerShell ahead of time to adjust the web part property (relative URL) that links to the reference file.  A sketch of this polling step follows.
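
A sketch of that polling step, assuming a request list called "Site Requests" with a Status column and a "SharePoint URL" column whose internal name is SharePointURL (these names are assumptions for illustration):

```powershell
# Runs on the server each morning: look for new requests and read the target site URL.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/BuildRequests" -Credentials $cred

$newRequests = Get-PnPListItem -List "Site Requests" |
    Where-Object { $_["Status"] -eq "New" }

foreach ($request in $newRequests) {
    $targetUrl = $request["SharePointURL"]   # assumed internal name of the "SharePoint URL" field
    Write-Host "Provisioning build project template for $targetUrl"
    # ...rewrite URLs in the template XML, apply the template, upload assets,
    #    then mark the request as processed...
}
```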

Web Part Properties/Data
We had a content search web part that used a custom display template on two of our pages.  The provisioning engine was not able to properly set the web part properties on the target site to point at the correct custom display template. We created a PowerShell function to update/set these properties before applying the PnP template.
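
As an illustration of the same idea (though our actual function adjusted the properties before the template was applied), here is a sketch that sets the property directly on a target page with Set-PnPWebPartProperty once the page exists; the page URL, web part title and the ItemTemplateId property name are assumptions:

```powershell
# Point the Content Search Web Part on a target page at the custom item display template.
# The page URL, web part title and the "ItemTemplateId" property name are assumptions.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/NewBuildProject" -Credentials $cred

$pageUrl = "/sites/NewBuildProject/SitePages/RelatedFiles.aspx"
$webPart = Get-PnPWebPart -ServerRelativePageUrl $pageUrl |
    Where-Object { $_.WebPart.Title -eq "Related Files" }

Set-PnPWebPartProperty -ServerRelativePageUrl $pageUrl -Identity $webPart.Id `
    -Key "ItemTemplateId" `
    -Value "~sitecollection/_catalogs/masterpage/Display Templates/Content Web Parts/Item_BuildDocs.js"
```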

Branding assets and Code Files
We had several files that we used for running custom code to perform actions for users, as well as to apply custom branding.  PnP was not able to copy these custom files and artifacts stored on the site over to the new site. Our developer created a single folder called _BrandingAssets inside the Master Page Gallery. This folder stored all of the artifacts the site needed, such as images, JavaScript files, CSS files, custom display templates, etc.  Using PowerShell, we extracted all of the files in that folder on the source site, along with their metadata, and stored them in a local folder on a server/local file share.  After applying the PnP template, we uploaded the files from the server file share to the _BrandingAssets folder on the target site and set the metadata on the new site.  A sketch of this export/import follows.
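
A sketch of that export/import (paths are placeholders, and the metadata handling is reduced to a single Title field here; the real function restored the full set of metadata):

```powershell
$localPath = "C:\Provisioning\_BrandingAssets"

# --- Export from the source site ---
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/BuildTemplate" -Credentials $cred
$files = Get-PnPFolderItem -FolderSiteRelativeUrl "_catalogs/masterpage/_BrandingAssets" -ItemType File
foreach ($file in $files) {
    Get-PnPFile -Url $file.ServerRelativeUrl -Path $localPath -FileName $file.Name -AsFile -Force
}

# --- Import into the target site (after Apply-PnPProvisioningTemplate has run) ---
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/NewBuildProject" -Credentials $cred
foreach ($file in Get-ChildItem -Path $localPath) {
    # Add-PnPFile uploads the file and sets list item values in one call;
    # the real function set the content type and the rest of the metadata as well.
    Add-PnPFile -Path $file.FullName -Folder "_catalogs/masterpage/_BrandingAssets" `
        -Values @{ Title = $file.BaseName }
}
```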

List Data
Another issue we ran into was populating lists with data.  The design of the PnP Remote Provisioning Engine seems to be oriented around copying structure from a source site to a target site, not the content within that structure.  Our template, however, used about 15 lists, each with data.  These lists contained information that was used in many instances by code processes running in the target site.  We created a custom function to extract the data from each list and save it as XML on the server/local file share. Then, after the PnP template was applied, we loaded the data from the server/local file share into the newly created target site.

Some of the rules for this function include (a sketch of the export/import follows the list):

-Only lists with names that start with _Config* would have their data extracted. All other list data would be ignored.

-Since Get-PnPListItem returns all fields by default (including non-updatable fields such as Created, Modified, etc.), we created a view on each list named Export. This view contained only the fields that we wanted to export and load into the new site. If we tried to load non-updatable field data, the script would error out.

-Lists containing User or SharePoint Group fields needed a lookup of the new IDs on the target site before the data was applied to the list.

-Identify any URLs that reference the template site and replace them with the new site URL.
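
A sketch of the export/import, with hypothetical column names (Title, TemplateName, SortOrder) standing in for whatever the Export view exposed; the real function also remapped user/group IDs and template-site URLs per the rules above:

```powershell
$exportPath = "C:\Provisioning\ListData"
$null = New-Item -ItemType Directory -Path $exportPath -Force

# --- Export the _Config* lists from the source site ---
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/BuildTemplate" -Credentials $cred
$configLists = Get-PnPList | Where-Object { $_.Title -like "_Config*" }
foreach ($list in $configLists) {
    Get-PnPListItem -List $list -Fields "Title", "TemplateName", "SortOrder" |
        ForEach-Object { $_.FieldValues } |
        Export-Clixml -Path (Join-Path $exportPath "$($list.Title).xml")
}

# --- Load the data into the target site after the template has been applied ---
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/NewBuildProject" -Credentials $cred
foreach ($file in Get-ChildItem -Path $exportPath -Filter "*.xml") {
    $listTitle = [System.IO.Path]::GetFileNameWithoutExtension($file.Name)
    foreach ($row in Import-Clixml -Path $file.FullName) {
        Add-PnPListItem -List $listTitle -Values @{
            Title        = $row["Title"]
            TemplateName = $row["TemplateName"]
            SortOrder    = $row["SortOrder"]
        }
    }
}
```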

As an example of the above, our target site gave authorized users the ability to create templated subsites by clicking a button and filling out a short form.  The user could choose to create 1 of 3 templated subsites (WSP files).  Along with this, we had some JavaScript that let the user choose which template they wanted, verify that the subsite name was not already taken, and choose which document libraries they wanted to show on the subsite home page.  To bring this all together, one of our lists was named _Config_Subsites, and it contained the name of each subsite template, which corresponded to the WSP template name in the Solutions gallery in Site Settings.  The JavaScript code tied each selection available to the user in the form to data in this list, which gave the JavaScript function the information it needed to select the right WSP file when creating the subsite.

Solution Files and List Templates
Site template (WSP) and list template (STP) files were not extracted as part of the PnP engine. A PowerShell function was created to extract the files and their metadata and save them locally on the server. Then, after the PnP template was applied, PowerShell was used to upload the files and set the metadata (a sketch follows).
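
A sketch along the same lines as the branding-assets copy, pulling from the standard gallery locations (_catalogs/solutions for WSPs and _catalogs/lt for STPs); note that uploaded sandbox solutions still have to be activated separately:

```powershell
$localPath = "C:\Provisioning\Galleries"

# --- Export the gallery files from the source site ---
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/BuildTemplate" -Credentials $cred
foreach ($gallery in "_catalogs/solutions", "_catalogs/lt") {
    $localGalleryPath = Join-Path $localPath ($gallery.Replace("/", "\"))
    $null = New-Item -ItemType Directory -Path $localGalleryPath -Force
    foreach ($file in Get-PnPFolderItem -FolderSiteRelativeUrl $gallery -ItemType File) {
        Get-PnPFile -Url $file.ServerRelativeUrl -Path $localGalleryPath -FileName $file.Name -AsFile -Force
    }
}

# --- Upload them to the target site after the template has been applied ---
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/NewBuildProject" -Credentials $cred
foreach ($gallery in "_catalogs/solutions", "_catalogs/lt") {
    $localGalleryPath = Join-Path $localPath ($gallery.Replace("/", "\"))
    foreach ($file in Get-ChildItem -Path $localGalleryPath) {
        Add-PnPFile -Path $file.FullName -Folder $gallery
    }
}
```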

Conclusion
The PnP Remote Provisioning Engine is strong when it comes to copying source template structure (lists, libraries, etc.) to target sites.  For templates with content, a migration tool or a solution similar to the one we developed would need to be pursued.