
Monday, September 17, 2018

Unified SharePoint 2013 on-premises and SPO site provisioning with SharePoint Framework (SPFx), Azure Logic Apps, Queue, Azure Function, and Azure Automation using Hybrid Runbook Worker


In a previous post we discussed Office 365 site self-provisioning with SharePoint Framework (SPFx), Azure Logic Apps, Queue, and Azure Function. In our company we are utilizing the same architecture and framework to provision BOTH SharePoint Online sites and SharePoint on-premises sites. The architecture also unifies site provisioning for SharePoint on-premises 2013, 2016, and 2019. Here are the architecture and detailed design for the implementation.

The architecture is almost identical to the one we implemented previously for SharePoint Online. The only difference is that we have added Azure Automation with a Hybrid Runbook Worker that can run the SharePoint on-premises site creation. The previous Azure Function now calls the Azure Automation webhook.

The key part here is to configure the Azure Automation runbook to run PowerShell through the Hybrid Runbook Worker. This way, the PowerShell from Azure Automation can provision the on-premises site through the on-premises site creation method. Here is the architecture of the Azure Automation Hybrid Runbook Worker.


Once you have set up the Azure Automation Hybrid Runbook Worker, everything else is almost the same as what we configured previously for the SharePoint Online site provisioning process. The overall architecture diagram is listed below.



  1. A user enters site provisioning information in a SharePoint Framework (SPFx) form on an SPO site.
  2. The newly created list item triggers the Azure Logic App to send the request for approval.
  3. If the request is approved, the Azure Logic App sends the request to an Azure queue with a request ID, which is the list item ID. We have separated the requests into two queues: if the request is for on-premises site creation, it is sent to the on-premises queue.
  4. The new message in the Azure queue triggers the Azure Function.
  5. The Azure Function reads the request details from the queue message.
  6. The Azure Function calls Azure Automation through a webhook. Azure Automation uses the Hybrid Runbook Worker to provision the site collection using the on-premises web services, then performs the post-provisioning steps.
  7. Azure Automation updates the SPO site request list with the correct status.
  8. The Logic App reads the SPO site request list status.
  9. The Logic App sends an email notification about the site creation.
This process is almost identical to the SharePoint Online site provisioning process except for steps 6 and 7, where the Azure Function calls Azure Automation. You can refer to the previous blog for the other steps.
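To make the routing in step 3 concrete, the queue choice boils down to a simple condition on the request type. This is a minimal sketch; the property and queue names are hypothetical, and in the actual Logic App this is implemented as a Condition action with a "Put a message on a queue" action in each branch.

```javascript
// Hypothetical illustration of the step-3 routing decision.
// Property and queue names are assumptions, not our actual Logic App definition.
function chooseQueue(request) {
  // On-premises requests go to a dedicated queue whose messages are
  // processed by the Azure Function that calls the Azure Automation webhook.
  return request.siteType === "OnPremises"
    ? "onprem-site-requests"
    : "spo-site-requests";
}
```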

In step 6, the Azure Function calls Azure Automation through a webhook. You can set up an Azure Automation runbook webhook following the procedure described here. The Azure Automation runbook will look like the one below. Remember to select the Hybrid Worker for "Run On". Also record the webhook URL right after it is created; you may not be able to retrieve it later!


Then you can call this Azure Automation runbook from the Azure Function as shown below.

#Import PnP module from the function's library
Import-Module "D:\home\site\wwwroot\qcsbxssspqueseprocessPS\Modules\SharePointPnPPowerShellOnline\3.0.1808.1\SharePointPnPPowerShellOnline.psd1" -Global;

#Get trigger content
$requestBody = Get-Content $triggerInput -Raw | ConvertFrom-Json
#$itemID = $requestBody.ItemId
$siteTitle = $requestBody.SiteTitle
$siteURL = $requestBody.SiteUrl
$output_SiteTitle = "SITETITLE: " + $siteTitle
Write-Output $output_SiteTitle
$output_SiteUrl = "SITEURL: " + $siteURL
Write-Output $output_SiteUrl

try
{
    #region constructing webhook call body
    $webhookurl = 'https://s2events.azure-automation.net/webhooks?token=<webhookToken>'
    $body = @{"SITETITLE" = $siteTitle; "SITEURL" = $siteURL}
    $params = @{
        ContentType = 'application/json'
        Headers = @{'from' = 'Harry Chen'; 'Date' = "$(Get-Date)"}
        Body = ($body | ConvertTo-Json)
        Method = 'Post'
        URI = $webhookurl
    }

    #Invoke the webhook call
    Invoke-RestMethod @params -Verbose
    Write-Output "Call Invoked - awaiting updating list item"

    # Todo: Wait and query SharePoint
    #Connect-PnPOnline -Url $requestListSiteUrl -Credentials $creds
    #Set-PnPListItem -List $requestList -Identity $itemID -Values @{"Provisioning_x0020_Status" = "Provisioned";}
    Write-Output "Item Updated"
    #endregion
}
catch [System.Exception]
{
    $output = "Error Details: " + $_.Exception.Message
    Write-Output $output
}


In step 7, we use a web service to create the on-premises site. We first tried the SharePoint app model on-premises with just CSOM, as described by Vesa Juvonen. However, we ran into some issues. As a result, we are using the on-premises admin web service "_vti_adm/Admin.asmx?WSDL" to create the on-premises site.
The Azure Automation runbook code looks like the following.

......

[CmdletBinding()]
Param(
    [object]$WebhookData,
    [string]$siteTitle,
    [string]$siteUrl)

if ($WebhookData)
{
    Write-Output ("Starting runbook from webhook")
    # Collect properties of WebhookData
    $WebhookName = $WebHookData.WebhookName
    $WebhookHeaders = $WebHookData.RequestHeader
    $WebhookBody = $WebHookData.RequestBody

    # Collect individual headers. Input converted from JSON.
    $From = $WebhookHeaders.From
    $InputBody = (ConvertFrom-Json -InputObject $WebhookBody)
    Write-Verbose "WebhookBody: $InputBody"
    Write-Output -InputObject ('Runbook started from webhook {0} by {1}.' -f $WebhookName, $From)

    $siteTitle = $InputBody.siteTitle
    $output = "Site Title is: " + $siteTitle
    Write-Output $output
    $siteUrl = $InputBody.siteUrl
    $output = "Site Url is: " + $siteUrl
    Write-Output $output
}
else
{
    Write-Output ("Starting runbook manually")
    Write-Output ("Input parameters are the following:")
    $output = "Site Title is: " + $siteTitle
    Write-Output $output
    $output = "Site Url is: " + $siteUrl
    Write-Output $output
}

try
{
    $user = "owner"          # Owner login
    $userName = "owner"      # Owner name
    $pwd = 'password'
    $adminSiteUrl = "http://SPadmin:port#/"
    $securePwd = ConvertTo-SecureString $pwd -AsPlainText -Force
    $cred = New-Object PSCredential($user, $securePwd)
    $wsdlUrl = $adminSiteUrl + "/_vti_adm/Admin.asmx?WSDL"
    $svc = New-WebServiceProxy -Uri $wsdlUrl -Credential $cred
    $output = "Before Running Web Service - Site URL:" + $siteUrl + " Site Title: " + $siteTitle
    Write-Output $output
    $svc.CreateSite(
        $siteUrl,   # URL
        $siteTitle, # Title
        "",         # Description
        1033,       # LCID
        "STS#0",    # WebTemplate
        $user,      # Owner Login
        $userName,  # Owner Name
        "",         # Owner Email
        "",         # PortalUrl
        "")         # PortalName

    # Todo: Update the SPO list status with Completed status
    # Post-provisioning steps
}
catch
{
    Write-Output "`n[Main] Errors found:`n$_"
    # Todo: Update the SPO list status with error status
}


You can see that this is a remote site provisioning process, and the Azure Hybrid Runbook Worker can be any on-premises server; it does not need to be a SharePoint server. We have verified that the admin web service is supported on SharePoint 2013, 2016, and 2019, so the code is the same for all SharePoint versions.


At this point, we are using the SharePoint farm account to provision the on-premises site. We are testing whether another account could be used to create sites.

Now we have combined both SharePoint Online and SharePoint on-premises site creation into the same Azure process!

Thursday, March 15, 2018

Why SharePoint 2013 users added individually are magically removed, and how to fix the issue


Recently, SharePoint 2013 users reported that users they added to a SharePoint site were not able to access it. As a matter of fact, the added users were magically removed from the site!

After debugging this issue, we found the root cause: the SharePoint 2013 user profile SID is out of sync with the AD SID. We believe SharePoint recognized these users as invalid and removed them from the site. Let's dig into the details.



First, how does a user's AD SID change? We worked with the AD team and identified two scenarios in which a SID can change.

After a user leaves the company and the AD account is terminated, if the same user rejoins the company, AD will assign the same login ID and UPN. However, this user will have a different SID. The following query displays one user in this case. You will see two entries; the second is from the user rejoining the company.

Query is here:
Get-ADObject -LDAPFilter "(samaccountname=userId)" -IncludeDeletedObjects -Properties objectsid,whencreated,whenchanged | select name,whencreated,whenchanged,objectsid | fl 

If a user changes AD domain, for example from "domain1" to "domain2", the AD SID will change but the SID history will track the old SID. Below is an example.

The query is listed here: get-aduser -Identity <userId> -Server domain.mycompany.com -Properties sidhistory


Second, let's identify why SharePoint removes these users.

Here are the in-depth details of what happens internally when you delete a user and re-create it with the same name:

At some point the user had been imported by User Profile Synchronization (Profile Sync), deleted from Active Directory, recreated in Active Directory with the same account name, and then re-imported by Profile Sync.  When the user is re-imported, their SID is not updated in the UserProfile_Full table.  Now the SID in the UPA doesn’t match the SID in the UserInfo table for the site collections.

This causes a chain-reaction as below:
  • Import a user using Profile Sync. – They get a record created with proper SID in UserProfile_Full table and UserProfileValue table in the Profile database. The SIDs match in both tables at this point.
  • Delete and re-create that user in Active Directory. – They will have the same account name, but a new SID.
  • Run another Profile Sync. – The existing profile record will be updated with the new (good) SID in the UserProfileValue table, but the SID stored in UserProfile_Full will not be updated. It will retain the old (bad) SID. We now have a SID mismatch.
  • Give the user permission to a site, list, document, etc. -- It will be added to the site permissions with the new (good) SID.
  • The user opens a file in Office Web Apps. -- Part of the Office Web Apps authentication process (OAuth) is to call out to the User Profile Service Application (UPA) to get information about the user to augment their claims set and use that to open the file.
  • The UPA returns the old (Bad) SID in the Oauth token.
  • The Oauth token is presented to the SharePoint site to try to open the document.
  • The authorization process finds the user by account name in site permissions – Since the user has the same account name but different SID, the existing user record gets deleted from the site collection, removing all user permissions.
  • In SharePoint, the SID is treated as the unique ID for the user. It doesn’t matter what the account name is, if you have a different SID, you are a different user.
  • Since we can’t have more than one user with the same account name active at any given time, the original user record is marked as deleted and all of the permissions for that user are removed.  
  • This is why the user gets “Access Denied” and must be added back to site permissions.
  • When the user is added back to the site, they are added back using their correct (good) SID.  This effectively marks their ‘Bad’ record in the UserInfo table as deleted, and re-activates their ‘good’ record. – The user is fine until they go through the Oauth process again.
Note: The above scenario involves Office Web Apps (OWA), but the same thing could happen with any feature that uses OAuth. This includes (but is not limited to): Office Web Apps, Workflow, Site Mailboxes, SharePoint-hosted Apps, and Provider-hosted Apps.
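To make the key point above concrete: SharePoint keys users on SID, not account name. A minimal conceptual sketch with made-up data (this is an illustration, not SharePoint's actual API):

```javascript
// SharePoint treats the SID as the unique user ID.
// Same login name but a different SID means a different user.
function isSameUser(a, b) {
  return a.sid === b.sid;
}

// Hypothetical example data: the same login before and after
// the account was deleted and re-created in AD.
var before = { login: "contoso\\user1", sid: "S-1-5-21-1111" };
var after  = { login: "contoso\\user1", sid: "S-1-5-21-2222" };
// isSameUser(before, after) is false even though the logins match,
// which is why the old record gets marked deleted.
```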

The third step is to identify the users whose SIDs are out of sync. There are two ways to find users with mismatched SIDs.

The first way is to loop through the SharePoint user profiles and compare each SID with the SID from an AD query. Here is a snippet of the code.

# Get SID from SharePoint user profile

$snapin = Get-PSSnapin | Where-Object {$_.Name -eq 'Microsoft.SharePoint.Powershell'}
if ($snapin -eq $null) {
  Add-PsSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
}
$userID = "domain\ferrari2"
$ca = Get-SPWebApplication -IncludeCentralAdministration | Where-Object {$_.IsAdministrationWebApplication}
$spsite = $ca.Url
$site = Get-SPSite $spsite
$context = Get-SPServiceContext $site
$upsa = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)
$userProfile = $upsa.GetEnumerator() | Where-Object {$_.AccountName -eq $userID}

$userProfSID = $userProfile["SID"].Value
$userProfSID

# Get SID from AD
$adUser = Get-ADUser -Server "domain" "ferrari2"
$adUserSID = $adUser.SID.Value
$adUserSID

The second way is to identify the profiles where the SIDs don't match between the UserProfile_Full and UserProfileValue tables.


select upf.RecordId, upf.NTName, upf.PreferredName,  upv.PropertyVal as [SIDfromUserProfileValue], pl.PropertyName, upv.PropertyID
into #temp
from UserProfile_Full upf (nolock)
join UserProfileValue upv (nolock) on upf.RecordID = upv.RecordID
join PropertyList pl (nolock) on pl.PropertyID = upv.PropertyID
where upv.propertyid = 2
select upf.RecordId, upf.NTName, upf.PreferredName, upf.SID as [SIDfromUserProfile_Full], #temp.SIDfromUserProfileValue
from UserProfile_Full upf (nolock)
join #temp on upf.RecordID = #temp.recordid
where upf.SID != #temp.SIDfromUserProfileValue
drop table #temp


Now the fourth step is to fix the issue.

We need to update the SID in the UserProfile_Full table. One way to do this would be to delete all of the problem profiles and re-import them. However, all of those users would lose profile data that was manually entered (like "About Me", "Ask me about", "Skills", "Interests", etc.).


Instead, you can run Move-SPUser to update the SID in the UserProfile_Full table to be the “Good” SID for the user. Since we’ll be passing the same account name as both the ‘old’ and ‘new’ account, the SID will be the only change for the user.  Here’s an example of running this for one user:
$url = "http://www.contoso.com/"
$claimsAcc = "i:0#.w|contoso\user1"
$user = Get-SPUser -Identity $claimsAcc -Web $url
Move-SPUser -Identity $user -NewAlias $claimsAcc -IgnoreSID 

If you have a large number of users in this state, you'll want to run this in a script that loops through each user. We can use the previous script to dump the users with mismatched SIDs into a CSV file, then use the following script to fix the SID issue.
#Synopsis: Use this to run move-spuser against a list of account names stored in a CSV
#The script calls move-spuser to fix the issue.  Move-spuser is a farm-wide operation, so it only needs to be run once per-user.
#The "$url" variable can really be any site collection in the farm.  The script just requires a single "SPWeb" object so that it can establish the proper context.
#Just set the top three variables: $url, $path, $logfile
$url = "http://team.contoso.com"  # Any site collection
$path = "c:\problemUsers.csv" # The input file with user names to migrate
$logfile = "c:\move-SPUserLog.txt" # The output log file
Add-PSSnapin microsoft.sharepoint.powershell -ea SilentlyContinue
$date = Get-Date -Format U
"Started Move-SPUser at " + $date + " (UTC time)" | out-file $logfile -append
"===============================================" | out-file $logfile -append
$ErrorActionPreference = "stop"
$csv = Import-Csv -Path $path
[array]$NeedtoFix = @()
$web = get-spweb $url
foreach($line in $csv)
{$NeedtoFix += $line}
$fixTotal = $NeedtoFix.Count
for($j=0; $j -lt $fixTotal; $j++)
{
$acc = $NeedtoFix[$j].ntname
$claimsAcc = "i:0#.w|"+$acc
"Fixing user: " + ($j+1) + " out of " + $fixTotal + " --> " + $claimsAcc | out-file $logfile -Append
    try{$user = $web.EnsureUser($claimsAcc)
        Move-SPUser -Identity $user -NewAlias $user.UserLogin -IgnoreSID -confirm:$false
        write-host "Fixed user: " ($j+1) " out of " $fixTotal " --> " $claimsAcc
        }
   catch [system.Exception]
        {"ERROR!!! for user: " + $claimsAcc + " -- " + $_.Exception.Message | out-file $logfile -append}
}

You might need to run this fix on a recurring schedule so users do not run into the issue again.
Thanks to Prerna Vashistha from Microsoft for the detailed explanation and the workaround.

Friday, December 2, 2016

Auditing solutions deployed to SharePoint environments

With increasing SharePoint 2013 customization using the new app model, there are incidents where these customizations do not work correctly in production even though they work in non-production environments. As the SharePoint architect, I'm working with the team to come up with a deployment auditing process to ensure that what we deployed is what we should have deployed. This auditing process reports the details of the customization components, especially the version labels, which has been beneficial for deployment verification and debugging. Here are some of the auditing reports that have helped eliminate customization discrepancies between environments.

1. Audit the apps deployed to a given site collection and the app details, including the versions. Here are the PowerShell commands.

param([string]$siteUrl)

$snapin = Get-PSSnapin | Where-Object {$_.Name -eq 'Microsoft.SharePoint.Powershell'}
if ($snapin -eq $null) {
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
}

############################################################################################################################################
function ListApps([string]$siteUrl, [string]$reportName)
{
    $appInstances = Get-SPAppInstance -Web $siteUrl

    if ($appInstances -ne $null)
    {
        foreach ($eachAppInstance in $appInstances)
        {
            $title = $eachAppInstance.Title
            $version = $eachAppInstance.App.VersionString
            $AppPrincipalId = $eachAppInstance.AppPrincipalId
            $Id = $eachAppInstance.Id

            Write-Output "$title; $version; $Id; $AppPrincipalId" | Out-File $reportName -Append
        }
    }
}

############################################################################################################################################

ListApps https://sharepoint.mycompany.com/sites/criticalsite E:\Applist.txt


The result looks like the output below, and you can compare the results across different environments. There are references you can use for the different app APIs.



2. Audit any customized DLLs and their versions. Here are some PowerShell commands.

This one lists ALL DLLs in the current directory recursively.

Get-ChildItem -Filter *.dll -Recurse | Select-Object -ExpandProperty VersionInfo


This one lists only the named DLL.

Get-ChildItem -Filter *.dll -Recurse | Where-Object {$_.name -Match "mycompany.com.sp.application1.common.dll"} | Select-Object -ExpandProperty VersionInfo 



3. Audit all farm solutions and their status. Here are the PowerShell commands.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Get all Solutions deployed
Get-SPSolution | Select-Object * | Out-File c:\solution.txt

# Connect to the Farm
$SPfarm = [Microsoft.SharePoint.Administration.SPFarm]::get_Local()

# What Solution are we looking for?
$solution = "mycompany.com.sp.apps.workflows.wsp";

# Get the solutions
$currentSolution = $SPfarm.get_Solutions() | Where-Object { $_.DisplayName -eq $solution; }
$currentSolution


We are adding other audits, such as features, permissions, and configurations, and will share those in the future.

Tuesday, November 10, 2015

Strike through SharePoint 2013 document item based on its status using JS Link

We have a requirement to strike through a SharePoint 2013 document item based on its "Obsolete" status. If Obsolete is "Yes", the column named "Part Number" should be struck through and highlighted in red, as in the screenshot below.


The easiest way to implement this is to use the new SharePoint 2013 JS Link feature. There are several good JS Link references for highlighting a SharePoint list row and coloring list items differently. You can refer to my previous blog to configure the JS Link. If you add the following JavaScript, named Obsolete.js, as the JS Link on the display web part, you will get the desired result.

(function () {

    // Create an object that holds the context information about the field
    // whose output rendering we want to change
    var priorityFiledContext = {};
    priorityFiledContext.Templates = {};
    priorityFiledContext.Templates.Fields = {
        // Apply the new rendering for the field on the list view
        "Part_x0020_Number": { "View": obsolateFiledTemplate }
    };

    SPClientTemplates.TemplateManager.RegisterTemplateOverrides(priorityFiledContext);

})();

// This function provides the rendering logic for the list view
function obsolateFiledTemplate(ctx) {

    var title = ctx.CurrentItem[ctx.CurrentFieldSchema.Name];
    var obsolate = ctx.CurrentItem.Obsolate;
    console.log(obsolate);

    // Return an HTML element styled according to the Obsolete value
    if (obsolate == "Yes") {
        return "<span style='color:#f00'><del>" + title + "</del></span>";
    }
    else {
        return "<span style='color:#000000'>" + title + "</span>";
    }
}

The following script will highlight the row in red when the Obsolete value is "Yes".


(function () {
    var overrideCtx = {};
    overrideCtx.Templates = {};
    overrideCtx.OnPostRender = HighlightRowOverride;
    SPClientTemplates.TemplateManager.RegisterTemplateOverrides(overrideCtx);
})();

function HighlightRowOverride(ctx) {

    for (var i = 0; i < ctx.ListData.Row.length; ++i) {

        var listItem = ctx.ListData.Row[i];
        var iid = GenerateIIDForListItem(ctx, listItem);
        var row = document.getElementById(iid);

        if (listItem.Obsolate == "Yes") {
            row.style.backgroundColor = "rgba(255, 0, 0, 0.5)";
        }
    }

    ctx.skipNextAnimation = true;
}

You can see this implementation is very flexible, and we could use it for many other scenarios.

If you need to strike through the document Name field, you can look at the field using View Source. Here is the field definition.

{"Name": LinkFilename",
"FieldType": "Computed",
"RealFieldName": "FileLeafRef",
"DisplayName": "Name",
"ID": "5cc6dc79-3710-4374-b433-61cb4a686c12",
"ClassInfo": "Menu",
"Type": "Computed",
"Filterable": "FALSE",
"listItemMenu": "TRUE",
"CalloutMenu": "TRUE",
"AllowGridEditing": "TRUE"}


You can then change these two lines of the JavaScript.

    priorityFiledContext.Templates.Fields = {
        "LinkFilename": { "View": obsolateFiledTemplate }
    };

 var title = ctx.CurrentItem.FileLeafRef;

Please note that you could use PowerShell or the server-side API to automatically set the JS Link on the web part if there are many lists or libraries to configure.

Friday, November 6, 2015

Use JSLink to auto populate the SharePoint item field without server side code like custom field

Recently we had a requirement to automatically generate a unique number string for a field called "Part Number" in many SharePoint document libraries. This unique number string needs to follow some business logic, as shown below.



There are several ways we could implement this. One old-school method is to implement the field as a custom field with business logic that generates the unique number string. However, we have had several custom fields on SharePoint that added complexity to SharePoint upgrades. In this blog, we will use a new client-side implementation, JS Link, that has no server-side component. This way, the solution is portable and upgrades are extremely simple.

The first step is to write a JS Link script named PartNum.js to replace the out-of-the-box "Part Number" text field behavior. You will need to find the field's internal name using the approach published here; the internal field name for "Part Number" is "Part_x0020_Number" in our case. The script will look like the one below. You will need to replace the guid() function with your real business logic.

(function () {
    var overrideCtx = {};
    overrideCtx.Templates = {};
    overrideCtx.Templates.Fields = {
        'Part_x0020_Number': { 'NewForm': renderTitle, 'EditForm': renderTitle }
    };
    SPClientTemplates.TemplateManager.RegisterTemplateOverrides(overrideCtx);
})();

function renderTitle(ctx) {

    var formCtx = SPClientTemplates.Utility.GetFormContextForCurrentField(ctx);

    RegisterCallBacks(formCtx);

    var fieldVal = ctx.CurrentFieldValue ? ctx.CurrentFieldValue.toString() : null;

    if (fieldVal) {
        var html = '<div>';
        html += '<input id="uniquePartNum" type="text" value="' + fieldVal + '">';
        html += '</div>';
        return html;
    }
    else {
        var newGUID = guid();
        var html = '<div>';
        html += '<input id="uniquePartNum" type="text" value="' + newGUID + '">';
        html += '</div>';
        return html;
    }
}

//registers callback functions with SharePoint
function RegisterCallBacks(formCtx) {
    //This is what happens when the user clicks Save/Submit
    formCtx.registerGetValueCallback(formCtx.fieldName, function () {
        //get the value from the input element
        var e = document.getElementById("uniquePartNum");
        return e.value;
    });
}

// Generate GUID
function guid() {
  function s4() {
    return Math.floor((1 + Math.random()) * 0x10000)
      .toString(16)
      .substring(1);
  }
  return s4() + s4() + '-' + s4() + '-' + s4() + '-' +
    s4() + '-' + s4() + s4() + s4();
}

The second step is to open SharePoint Designer, go to your SiteAssets library, and drop PartNum.js there.

The third step is to configure the document item edit form to use this script. Go to your document library and, from the library's ribbon, choose 'Form Web Parts > Default Edit Form'. Edit the web part, then add the script path to the JS Link field.






Now if you save the changes, you will see the "Part Number" field automatically populated with a GUID. You can modify the script to generate the string based on your business requirements.
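For example, instead of a random GUID, the business logic might build the part number from a prefix and a sequence number. A minimal sketch (the format below is a made-up example, not our actual numbering scheme):

```javascript
// Hypothetical part-number format: PREFIX-000042
// (prefix plus a zero-padded 6-digit sequence)
function makePartNumber(prefix, sequence) {
  var padded = ("000000" + sequence).slice(-6); // zero-pad to 6 digits
  return prefix + "-" + padded;
}

// makePartNumber("PN", 42) returns "PN-000042"
```

A generator like this would replace the guid() call in renderTitle, with the sequence tracked by whatever backing store the business logic requires.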

Since Microsoft has changed the development direction for SharePoint 2013 and 2016, you should always consider the SharePoint add-in approach instead of server-side solutions. There are more samples from Microsoft that you can refer to here.

Friday, September 25, 2015

How to resolve the issue where the SharePoint OOTB 2010 approval workflow does not create tasks for everyone when a large AD group is used for approvals

We have one SharePoint 2013 list with an out-of-the-box SharePoint 2010 approval workflow running. The workflow has an AD group as the approver. The AD group has 66 users across five different AD domains. The workflow ran fine for years until recently. Now it does not create tasks for all 66 users; it only created 62 tasks, and four tasks were missing. In addition, here are some other interesting findings.
  • If we add all 66 users individually to the workflow approval field, tasks are created for everyone
  • If we split the large AD group into three AD groups evenly and add the three AD groups to the workflow approval field, tasks are created for everyone
  • We have a different, larger AD group with 300 people, and the workflow creates all 300 tasks
  • Many workflows with different large AD groups as approvers have similar issues
  • Some users do not have proper display names but instead have their user ID as the display name
  • We have another group with 105 users, and the workflow only creates 102 tasks, as in the following screenshot



There are three different exceptions in the ULS logs for the workflows.


1. The first one is a transaction timeout.
“Error in persisting workflow: System.Transactions.TransactionAbortedException: The transaction has aborted. ---> System.TimeoutException: Transaction Timeout     -
-- End of inner exception stack trace ---   
 at System.Transactions.TransactionStateAborted.CreateAbortingClone(InternalTransaction tx)   
 at System.Transactions.DependentTransaction..ctor(IsolationLevel isoLevel, InternalTransaction internalTransaction, Boolean blocking)   
 at System.Transactions.Transaction.DependentClone(DependentCloneOption cloneOption)   
 at System.Transactions.TransactionScope.SetCurrent(Transaction newCurrent)   
 at System.Transactions.TransactionScope..ctor(TransactionScopeOption scopeOption)   
 at Microsoft.SharePoint.Workflow.SPWinOePersistenceService.Commit(Transaction transaction, ICollection items)”


2. The second is that the group cannot be found.
SharePoint Foundation General 8vid Verbose Could not initialize group data from https://sharepointdev.mycompany.com/mydepart/Finance/SalesCert: Microsoft.SharePoint.SPException: Group cannot be found. c7a11c9d-0430-e0c9-8fa7-25092a0dc879

3. The third one is that some users are treated as Windows accounts, not claims.
SharePoint Foundation Security ahluw Verbose Entering: GetByLoginNoThrow(user1@qca.mycompany.com) c7a11c9d-0430-e0c9-8fa7-25092a0dc879
SharePoint Foundation Security ahluz Verbose Exiting: GetByLoginNoThrow(user1@qca.mycompany.com) c7a11c9d-0430-e0c9-8fa7-25092a0dc879

Based on these three exceptions, we tried increasing the workflow timeout and the SharePoint timeout; however, the issue was not resolved. We also tried creating the same workflow in a different SharePoint farm, a different web application, a different site, with different AD groups, different numbers of users in the AD group, and a different domain controller. The workflow failed inconsistently.

Finally, we suspected there might be an issue retrieving ALL of the users from the group during workflow execution. We had seen a SharePoint people picker performance issue before on SharePoint 2007. As a result, we set the SharePoint people picker AD search property to help the AD query. The syntax is as below.

stsadm -o setproperty -pn peoplepicker-searchadforests -pv "forest:corp.mycompany.com;domain:na.mycompany.com;domain:ap.mycompany.com;domain:sa.mycompany.com;domain:mea.mycompany.com;domain:eu.mycompany.com" -url https://webapp.mycompany.com

You could verify the configuration using the following command.
stsadm -o getproperty -pn peoplepicker-searchadforests -url https://webapp.mycompany.com

After this configuration, all the workflows were able to create all tasks for every approver!




Thursday, September 24, 2015

Access excel workbook content through excel REST API on SharePoint 2013

The Excel Services REST API is a new feature of Excel Services that enables you to access Microsoft Excel workbook data by using a uniform resource locator (URL) address. Using the REST API, you can retrieve resources such as ranges, charts, tables, and PivotTables from workbooks stored on SharePoint Server 2013.

The REST API provides a flexible way to use Excel data. In addition to directly linking to Excel resources using the REST API, you can also make basic HTTP calls programmatically to use Excel data in external programs. With the REST API, code is portable: there are no assemblies to distribute, no additional references are required in your project, and there is no additional object model to learn. As long as you understand the URL convention for accessing resources, you only have to build the URL and then call it with a standard HttpWebRequest object. Because the REST API works over HTTP, you can call it from any language, such as PHP, Perl, Java, C#, and JavaScript.

In this blog, I'll use a simple C# console application to show how the Excel Services REST API can return all the data in a PivotTable. It demonstrates a basic pattern for retrieving resources from an Excel workbook on SharePoint Server 2013 using the REST API. The code uses a user name and password to authenticate to SharePoint. The example retrieves a PivotTable named "ProjectData" from a workbook named "QMET - Project Data.xlsx" that is stored in a Reports library named "Reports" in a ProjectBICenter site.

The key to using the REST API is understanding how to construct the correct URL to a resource. A REST URL in Excel Services consists of three required parts and one optional part.
  • The first part is the location of the ExcelRest.aspx page. ExcelRest.aspx is the entry point to the Excel Services REST API.
  • The second part is the workbook location relative to the ExcelRest.aspx page.
  • The third part is the path of the requested resource inside the workbook, for example a PivotTable in the workbook named "QMET - Project Data.xlsx".
  • The fourth (optional) part is the format you would like returned. The valid formats are html, atom, and json.

Here is an example of the REST URL that returns PivotTable data in JSON format.
https://webapp.mycompany.com/projectsite/ProjectBICenter/_vti_bin/ExcelRest.aspx/Sample%20Reports/QMET%20-%20Project%20Data.xlsx/Model/PivotTables('ProjectData')?$format=json
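The four parts above can also be assembled programmatically. Here is a small JavaScript sketch; the helper name and the per-segment percent-encoding approach are my own, not part of Excel Services:

```javascript
// Build an Excel Services REST URL from its parts.
// Encodes each path segment (spaces become %20) while keeping the slashes.
function excelRestUrl(siteUrl, workbookPath, resourcePath, format) {
  var enc = function (p) {
    return p.split("/").map(encodeURIComponent).join("/");
  };
  return siteUrl.replace(/\/$/, "") + "/_vti_bin/ExcelRest.aspx/" +
    enc(workbookPath) + "/" + enc(resourcePath) + "?$format=" + format;
}

// Reproduces the example URL above:
// excelRestUrl("https://webapp.mycompany.com/projectsite/ProjectBICenter",
//   "Sample Reports/QMET - Project Data.xlsx",
//   "Model/PivotTables('ProjectData')", "json")
```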

You can test the URL above directly in the browser with different formats and validate the data. The following C# code can then serve as an example to port to other languages.

using System;
using System.IO;
using System.Net;

namespace RestExcelClient
{
    class Program
    {
        static void Main(string[] args)
        {
            // REST to get all data from PivotTable named ProjectData
            string jsonRequest = "https://webapp.mycompany.com/projectsite/ProjectBICenter/_vti_bin/ExcelRest.aspx/Sample%20Reports/QMET%20-%20Project%20Data.xlsx/Model/PivotTables('ProjectData')?$format=json";

            // Set up HttpWebRequest call to use local user name and password
            CredentialCache credCache = new CredentialCache();

           // If running on Windows, you could use the logged-in user account; uncomment the line below
           //credCache.Add(new Uri(jsonRequest), "NTLM", CredentialCache.DefaultNetworkCredentials);

            credCache.Add(new Uri(jsonRequest), "NTLM", new NetworkCredential("username", "password", "domain.mycompany.com"));
            HttpWebRequest spRequest = (HttpWebRequest)HttpWebRequest.Create(jsonRequest);
            spRequest.Credentials = credCache;
            spRequest.UserAgent = "Mozilla/4.0+(compatible;+MSIE+5.01;+Windows+NT+5.0)";
            spRequest.Method = "GET";
            spRequest.Accept = "application/json;odata=verbose";
            HttpWebResponse endpointResponse = (HttpWebResponse)spRequest.GetResponse();

            try
            {
                // Get HttpWebResponse with Json data as string
                Stream webStream = endpointResponse.GetResponseStream();
                StreamReader responseReader = new StreamReader(webStream);
                string response = responseReader.ReadToEnd();

                Console.WriteLine(response);

                // Next task is to parse the json string result
                //DataContractJsonSerializer jsonSerializer = new DataContractJsonSerializer(typeof(Response));
                //object objResponse = jsonSerializer.ReadObject(response.GetResponseStream());
                responseReader.Close();

            }
            catch (Exception e)
            {
                Console.Out.WriteLine(e.Message);
            }      

        }


    }

}


There are several key configurations we need to point out here.
  1. Add the Excel Services Application trusted data connection libraries
  2. Add the Excel Services Application trusted file locations
  3. Make sure the security token app user has access to the external data
  4. Enable "Data refresh from REST enabled" for the Excel Services Application trusted file location, as described in this blog

You might need to parse the resulting JSON data as described here. You can use a handy online JSON viewer to view the JSON returned.
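As a quick illustration of consuming the response, plain JSON.parse is enough once you know the shape; note that the payload below is a simplified, hypothetical example for illustration only, since the real Excel Services response is a verbose OData document.

```javascript
// Hypothetical, simplified payload -- not the actual Excel Services schema.
var raw = '{"rows":[{"Project":"Alpha","Cost":120},{"Project":"Beta","Cost":80}]}';

var data = JSON.parse(raw);
// Sum the Cost column across all rows
var total = data.rows.reduce(function (sum, r) { return sum + r.Cost; }, 0);
console.log(total); // 200
```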