
Why I’m loving having my head in the Microsoft cloud

Microsoft Disclaimer – Yes, I am a Microsoft evangelist. I have been working with Microsoft tech for the entirety of my working career. I know there are competing technologies out there, some of them perhaps more feature-rich, or cheaper, or whatever .. but this is merely a conversational piece about my experience, and I still believe Microsoft is the ONLY supplier you can go to in order to get the complete service across all scenarios from a single supplier … please don’t troll, we don’t feed them here!

I have to admit up front .. I have never really been what you might call an “early adopter”. I didn’t get my first mobile phone until 2001. Everyone I knew had been using laptops for years before I got my first one in 2010, and until recently all of my backups went to portable disk drives sat in the drawer of my office at home.

Things have changed though .. life is different and a whole lot easier .. I moved all of my stuff to “the cloud” just over a year back, and my cloud has a Microsoft logo!

Terminology Disclaimer – Yes, I know .. “The Cloud” .. we used to just call these things Data Centres or 3rd Party Hosting. But you gotta keep up with the times eh!

Cloud Services
So .. when I’m talking about “moving to the cloud” what exactly am I referring to? Well .. I’ve split this into two sections: Personal Use and Business Use.

  • Personal Use
    • Email (Hotmail)
    • Personal file storage (SkyDrive)
    • Sync between phone and desktop (Windows Phone)

  • Business Use
    • Email (Office 365 – Exchange Online)
    • Document Storage (Office 365 – SharePoint Online)
    • Collaborative workspaces (can’t believe I’m actually using this phrase.. sorry!) (Office 365 – SharePoint Online)
    • Instant Messaging / Video Conferencing / Desktop Sharing (Office 365 – Lync Online)
    • Development Source Control (Team Foundation Service)

… and I could get all of this for £107 per year …


Personal Use – Email (free)
I guess with this I’ve been a “cloud” user for quite some time. I’ve actually had my Hotmail account for over 16 years (since 1997 .. shortly after it was purchased by Microsoft). Ok .. so back in my youth I needed an email address, and Hotmail was doing the rounds at school as the “cool new thing”, so I signed up (didn’t really use it much for the first couple of years though).

That effectively took care of my personal email needs, and 16 years later I’m still using the same email address (and thanks to the pretty damned awesome junk mail filtering Microsoft have in place, I get very little spam at all, if any!).

Personal Use – File Storage (£32 per year)
The next thing to sort out was personal file storage .. fast forward about a decade and my email account led me, of course, to SkyDrive. This service has been around since 2007 and I got it “for free” because of my Hotmail account. Things really started to get interesting when LiveMesh was launched (the beta coming out in 2008), which allowed you to start syncing files on your workstation with your SkyDrive account.

However .. LiveMesh was quite limited as it would only sync a maximum of 5GB of files to the cloud (regardless of how much free space you had .. I had 25GB of space in my SkyDrive account). I dabbled with it, starting off by syncing My Documents and My Pictures .. but the whole experience was a little bit clunky and, to be honest, with a 5GB maximum it was never going to be the most useful service to me. I still had tonnes of files sat on my workstation which I needed physical backups for, and only having one computer I didn’t really get any value out of the “access my files elsewhere” service which LiveMesh also offered.

Then came the game-changer! SkyDrive for Windows launched. This was huge (for me at least) as Microsoft had effectively opened up the floodgates. The new application had very simple functionality which I tested for all of about 2 hours before removing LiveMesh as quickly as possible .. a worthy replacement had been found!

The new capabilities of SkyDrive allowed me to sync ANYTHING I wanted with my SkyDrive account online, right up to my maximum storage allowance (25GB .. with options to increase this in increments, up as high as 125GB if needed).

I paid for the full 100GB extra storage (which cost me about £30 per year) and this basically takes care of all my backups. It runs seamlessly in the background, and it backs up all my documents, all my music, my pictures, videos and downloads..

The best part is I can access them through the web, so if I go to my parents’ and want to show them some photos, or pull up something from OneNote, I can hop on their machine and all my files are “just there” ..

Personal Use – Phone to Desktop file sync (free?)
The final piece of the jigsaw fell into place about 3 years ago when Windows Phone 7 was launched. I definitely jumped on this with both feet (my previous smartphones being a Nokia N95 and an HTC HD2). It had SkyDrive integrated right from the get-go .. I could upload photos (automatically as I took them if needed) .. I could read my office files and documents from the built-in Office hub straight off SkyDrive .. and it even used SkyDrive as the storage for images uploaded to Twitter and Facebook if I wanted to share them ..

I’ve kind of marked this as “free” as I was going to buy a phone anyway. It didn’t cost me any extra to get myself a Windows Phone 7 (in many cases cheaper than leading Android and Apple iOS devices) .. so yes, my phone contract cost me money .. but the cloud file sync bit was “free”.

Business Use – Office 365 (£75 per year)
Well, this was a more recent requirement, as I was formerly a full-time employee and let my employer worry about things like hosting email and storing documents .. but in 2010 I went down the increasingly popular contractor route and set myself up with my own company ..

Luckily Microsoft had also recently launched its Office 365 services .. probably the best bang-for-your-buck online service you can get, with fantastic quality, Enterprise-level cover and small-business prices.

For £6.25 per month (I went for the E1 plan) I could get:

  • Exchange Online – 25GB email mailbox with the latest Outlook web access
  • SharePoint Online – My own private SharePoint tenant, with all the standard bells and whistles to play with, and ability to invite external “Microsoft Account” users for free to join in!
  • Lync Online – federated with the MSN Messenger / Live Messenger network and all of the other federated Lync users (i.e. Microsoft / other Office 365 users / most Microsoft partner companies).

This, to be honest, was a no-brainer .. I am a SharePoint professional by trade; I was never going to choose anything else (not to mention that for the price and feature set, Office 365 is simply the best offering out there).

The best bit is I could create “collaborative workspaces” (sorry again!) where I can spin up SharePoint sites (or site collections) to work on stuff with other people.

The typical use case for me is sharing my finance files with my accountant .. and it feels a lot more professional when I can do this on my own branded site and (using my knowledge and experience of SharePoint) offer a customised experience specifically for that need.

Business Use – Team Foundation Service (free)
The final piece of the puzzle was Source Control.

I do quite a lot of personal projects, typically working on code samples for when I’m speaking at community events (or writing up code for blog articles).

I really struggled to work out how I could get easy backup for my source control .. again this stemmed from my experience .. I have spent almost my entire development career using Team Foundation Server (with a few painful years on Visual SourceSafe) .. the problem is that TFS hosting is typically damned expensive!

Then the miracle came .. Microsoft was offering a preview service called “Team Foundation Service” (https://tfs.visualstudio.com/).

This allowed me to create my own TFS projects, with full source control and even access to build agents!! This eventually went live, and Microsoft announced that for small usage (up to 5 TFS projects) it was completely free!

Well .. I made the big leap a while ago, and I am absolutely loving it! I have had to re-install my laptop twice in the past 12 months and I have never had a less painful experience!

The process for getting all of my local files back ended up being:

  • Install Windows
  • Install SkyDrive for Windows
    • Start folder sync
  • Install Office 2013
    • Configure Outlook, start mailbox sync
    • Start SkyDrive Pro sync
  • Leave running overnight

That was it .. next morning all of my files were back. I have never had to run a “backup schedule” or worry about losing my files … stress free and painless computing. It has been so successful, I’ve even gotten my parents running on SkyDrive!

Windows Server 2012, Internet Explorer and missing links in Central Admin

This is one that stumped me for a short while. Anyone who has experienced “missing links in Central Admin” may be aware of this already. You install SharePoint, login to the server and try to navigate to Central Admin from a favourite in Internet Explorer.

Note – I realise you shouldn’t really be using IE locally on production servers. This is typically a “demo/dev box” problem.

Everything is fine until you try to configure some services, and you find that a bunch of links have disappeared! In particular I spent a good 15 minutes working out why “Services on Server” had gone walkabout (for about 10 of those minutes I thought I’d gone mad and forgotten where it was).

You will probably have checked your permissions without any success:

  • Are you logged in as an Administrator? (yes)
  • Are you in the Farm Admin group? (yes)
  • Have you tried rebooting? (yes)

The truth is slightly simpler than that. There are two vital settings required for this to work on a server:

  1. Turn off “Internet Explorer Enhanced Security Configuration” (IE ESC). This is a pretty standard task for most single server demo / development boxes.
  2. When starting IE “run as administrator”

The second one is the kicker. When you run the “Central Administration” link from the Start Menu / Start Screen it automatically kicks into elevated privileges (and if you have User Account Control turned on then you get the normal prompt to “run as administrator”).

If you are running Server 2008 R2 then it is a pretty simple task to just modify the IE shortcut in your taskbar / desktop to “Run as Administrator” and you are good to go.

If you are running Server 2012 though things are not that simple! Sure, you can set the same option, but it won’t work (at least it didn’t for me).

The only way I could get it to work was to browse to the IE 10 install directory (C:\Program Files\Internet Explorer\) and create my own shortcut to the iexplore.exe application.

I then set that shortcut to “Run as Administrator”, pinned it to the taskbar and voila! success!

Just another one of those Windows Server 2012 quirks to get used to I guess…

If you have found an easier / quicker way to do this .. then please let me know in the comments!

Search Core Results Web Part with Dynamic Date and User Profile Tokens

If you just want the goodies then you can get them here:

The Big Fat Disclaimer – This has not been thoroughly tested for a production environment. I have also removed references in my snippets below to caching and error handling to try and keep it brief. The downloadable version uses both caching and error handling, but it is still really just a proof of concept and you should TEST it before you deploy it! I take no responsibility if your production servers blow up!

I must have seen this requirement dozens of times on different projects: search results which either:

  • Filter using a User Profile Property of the current user
  • Filter using a dynamic date range (e.g. using the “TODAY” token)
  • Specify the Sort By (which is normally restricted to either “Relevance” or “Modified Date”)

The requirement for functionality of this nature comes up extremely frequently on Intranet projects. For example:

“Show News Articles from the past 7 days which filter based on the user’s location”
“Show events coming up in the next 3 months”
“Show discussions / wiki entries / blog posts which include the current user’s Ask Me About values”

Example
FixedQuery used in the Web Part:
Author:[UPP-PreferredName] AND Write:[TODAY-180]..[TODAY] AND ContentType:Event

Well .. on my current client project these very requirements came up .. so this time I decided to knock together the basics of the web part in my spare time and then “donate” it to the project … this post describes how I built it and what goes on under the hood, and includes both the source code and a downloadable WSP package with the working Web Part in it.

Step 1 – Extending the Search Core Results Web Part
So .. to get us started, let’s kick off by creating our actual Web Part. I am going to be extending the Search CoreResultsWebPart.
This is easy enough to achieve by simply creating a new Web Part in Visual Studio and inheriting from the CoreResultsWebPart class. This will make sure our web part gets all of the functionality and properties that the normal Search Results web part does without any additional effort.

[ToolboxItemAttribute(false)]
public class ExtendedSearchWebPart : CoreResultsWebPart
{
}

That is the easy bit …

Step 2 – Overriding the Query and SortOrder
Now the next bit to tackle is how to override the actual query that gets executed. Well the best place to do this is to override the ConfigureDataSourceProperties method. This method gets called before the query is actually executed against the Search engine itself.

You can then leverage the CoreResultsWebPart.DataSource property (which is of type CoreResultsDatasource). This is what allows all of the magic to happen.

   1:  protected override void ConfigureDataSourceProperties()
   2:  {
   3:      // only perform actions when we are trying to show search results
   4:      // i.e. not when you're in Design Mode
   5:      if (this.ShowSearchResults)
   6:      {
   7:          // call the base web part method
   8:          base.ConfigureDataSourceProperties();
   9:   
  10:          // get the data source object
  11:          CoreResultsDatasource dataSource = this.DataSource as CoreResultsDatasource;
  12:   
  13:          // override the query being executed
  14:          dataSource.Query = "Author:\"Martin Hatch\"";
  15:   
  16:          // remove the original sort order
  17:          dataSource.SortOrder.Clear();
  18:          dataSource.SortOrder.Add("Title", Microsoft.Office.Server.Search.Query.SortDirection.Ascending);
  19:      }
  20:  }

So let’s talk through the code above.

First off, we want to make sure we are only executing our custom code when we are actually trying to retrieve search results. This is a fail-safe, as in some instances ShowSearchResults will be false (such as when you are editing the web part, or viewing the “Design” view in SharePoint Designer). We also need to call the base method (as you typically would when overriding a method call!).

Then things get interesting. Line 11 casts the local “DataSource” property to a “CoreResultsDatasource” object. This has two properties which we are modifying:

Query (line 14) – This allows us to change or completely override the query which is being executed. This will be the entire query, including the Fixed Query, Appended Query and whatever the user typed into their search box (if you are using this on a Search Results page). In my example above I am simply overriding the query so that it searches for items created by “Martin Hatch” (me!)

SortOrder (lines 17 and 18) – This allows us to override the sort order, using ANY indexed Search Property you want (excluding rich text fields of course, I expect!). In my example, I am sorting by Title in Ascending order.
This can then be easily extended to provide custom Web Part properties to allow the Sort functionality to be specified by the page editor.

Step 3 – Making it re-usable Part 1 – Dynamic Date ranges
So now that we can override the query easily, we can move on to adding some of the good stuff. I decided to go with a relatively simple token replacement function, using a [TODAY] token to represent the current date:

  • [TODAY] (today’s date)
  • [TODAY+7] (today plus 7 days)
  • [TODAY-7] (today minus 7 days)

So .. how do we code this in? Well .. I am quite lazy and don’t really get on with regular expressions (if you are reading this and you are a RegEx guru.. by all means download the source code, refactor it and send it back, cheers!).

So I started off by creating a bunch of class level constants which I would use to recognise the tokens that we are looking for above:

private const string TODAY_PLACEHOLDER = "[TODAY]";
private const string TODAY_ADD_STARTSTRING = "[TODAY+";
private const string TODAY_SUBTRACT_STARTSTRING = "[TODAY-";
private const string TOKEN_ENDSTRING = "]";

Our ConfigureDataSourceProperties method can then be swapped out for the following code.

protected override void ConfigureDataSourceProperties()
{
    // only perform actions when we are trying to show search results
    // i.e. not when you're in Design Mode
    if (this.ShowSearchResults)
    {
        // call the base web part method
        base.ConfigureDataSourceProperties();

        // get the data source object
        CoreResultsDatasource dataSource = this.DataSource as CoreResultsDatasource;

        // get the current Fixed Query value from the web part
        string strQuery = this.FixedQuery;

        // swap out the exact "today" date
        if (strQuery.IndexOf(TODAY_PLACEHOLDER) != -1)
        {
            strQuery = strQuery.Replace(TODAY_PLACEHOLDER, DateTime.UtcNow.ToShortDateString());
        }

        // perform all of the "Add Days" calculations
        while (strQuery.IndexOf(TODAY_ADD_STARTSTRING) != -1)
        {
            strQuery = CalculateQueryDates(strQuery, TODAY_ADD_STARTSTRING, true);
        }

        // perform all of the "Remove Days" calculations
        while (strQuery.IndexOf(TODAY_SUBTRACT_STARTSTRING) != -1)
        {
            strQuery = CalculateQueryDates(strQuery, TODAY_SUBTRACT_STARTSTRING, false);
        }

        // swap out the Fixed Query for our Calculated Query
        dataSource.Query = dataSource.Query.Replace(this.FixedQuery, strQuery);
    }
}

This then calls the CalculateQueryDates support method which I put together:

private static string CalculateQueryDates(string strQuery, string startStringToLookFor, bool AddDays)
{
    try
    {
        // get the index of the first time this string appears
        int firstIndex = strQuery.IndexOf(startStringToLookFor);

        // get the text which appears BEFORE this bit
        string startString = strQuery.Substring(0, firstIndex);

        // get the text which appears AFTER this bit
        string trailingString = strQuery.Substring(firstIndex);
        int endIndex = trailingString.IndexOf(TOKEN_ENDSTRING);
        if (endIndex + 1 == trailingString.Length)
        {
            // there is nothing else after this
            trailingString = "";
        }
        else
        {
            trailingString = trailingString.Substring(endIndex + 1);
        }

        // find the number of days
        string strDays = strQuery.Substring(firstIndex + startStringToLookFor.Length);
        strDays = strDays.Substring(0, strDays.IndexOf(TOKEN_ENDSTRING));
        int days = int.Parse(strDays);

        // re-construct the query afterwards
        if (AddDays)
        {
            strQuery = startString + DateTime.UtcNow.AddDays(days).ToShortDateString() + trailingString;
        }
        else
        {
            // subtract days
            strQuery = startString + DateTime.UtcNow.AddDays(0 - days).ToShortDateString() + trailingString;
        }

        return strQuery;
    }
    catch (FormatException ex)
    {
        throw new FormatException("The format of the [TODAY] string is invalid", ex);
    }
    catch (ArgumentNullException ex)
    {
        throw new FormatException("The format of the [TODAY] string is invalid. Could not convert the days value to an integer.", ex);
    }
}

So you should be able to see we are using simple String.IndexOf() method calls to find out if our Tokens are present.

If they are then we simply calculate the DateTime value based on the static DateTime.UtcNow property and use String.Replace() methods to swap out these into our query text.

When we are using [TODAY+X] or [TODAY-X] we simply use DateTime.UtcNow.AddDays(X) or DateTime.UtcNow.AddDays(0-X) and use the same String.Replace() method.

The search syntax is exactly the same as it was previously, and the Keyword Syntax is very powerful.

Example: Using [TODAY] Token query syntax

Write:[TODAY] – this will return all items that were modified today
Write>[TODAY-7] – this will return all items that were modified in the past week
Write:[TODAY-14]..[TODAY-7] – this will return all items that were modified between 2 weeks ago and 1 week ago

So we already have a powerful and reusable search component .. but there is more!

Step 4 – Making it re-usable Part 2 – Dynamic User Profile Properties
The next one is to allow us to pull in User Profile Properties so that we can start doing searches based on the current user’s profile values.
For this we needed to create new replaceable tokens, for which I decided to use:

  • [UPP-{User Profile Property Internal Name}]
    • [UPP-PreferredName] (swaps out for the user’s name)
  • [UPP-SPS-Responsibility] (swaps out for their “Ask Me About” values)
  • etc ..

So .. we add another class level constant (same as we did for our DateTime tokens)

private const string USER_PROFILE_PROP_STARTSTRING = "[UPP-";

We can then use this in our code, in exactly the way we did before (using String.IndexOf(), String.Substring() and String.Replace() methods).

So we add the following additional code to our ConfigureDataSourceProperties method:

if (dataSource.Query.IndexOf(USER_PROFILE_PROP_STARTSTRING) != -1 &&
    UserProfileManager.IsAvailable(SPServiceContext.Current))
{
    string strQuery = dataSource.Query;

    while (strQuery.IndexOf(USER_PROFILE_PROP_STARTSTRING) != -1)
    {
        strQuery = ReplaceUserProfilePropertyTokens(strQuery);
    }

    if (strQuery != dataSource.Query)
    {
        dataSource.Query = strQuery;
    }
}

This uses the additional method call ReplaceUserProfilePropertyTokens which is shown below:

private static string ReplaceUserProfilePropertyTokens(string strQuery)
{
    // retrieve the current user's Profile
    UserProfileManager upm = new UserProfileManager(SPServiceContext.Current);
    UserProfile profile = upm.GetUserProfile(false);

    if (profile == null)
    {
        throw new ApplicationException("The current user does not have a User Profile");
    }

    // extract the user profile property name from the token
    int startIndex = strQuery.IndexOf(USER_PROFILE_PROP_STARTSTRING);
    string strPropertyName = strQuery.Substring(startIndex + USER_PROFILE_PROP_STARTSTRING.Length);
    strPropertyName = strPropertyName.Substring(0, strPropertyName.IndexOf(TOKEN_ENDSTRING));

    string strToReplace = strQuery.Substring(startIndex);
    strToReplace = strToReplace.Substring(0, strToReplace.IndexOf(TOKEN_ENDSTRING) + 1);

    try
    {
        // get the value
        UserProfileValueCollection propertyValue = profile[strPropertyName];
        string strValues = String.Empty;

        foreach (object propValue in propertyValue)
        {
            if (propValue.ToString().IndexOf(" ") == -1)
            {
                strValues += propValue.ToString() + " OR ";
            }
            else
            {
                strValues += "\"" + propValue.ToString() + "\" OR ";
            }
        }

        if (strValues.Length > 0)
        {
            strValues = strValues.Substring(0, strValues.Length - 4);
        }

        // swap the value out in the query
        strQuery = strQuery.Replace(strToReplace, strValues);
    }
    catch (ArgumentException ex)
    {
        throw new FormatException("The User Profile Property specified in your UPP token does not exist", ex);
    }
    return strQuery;
}

So there are a few things to point out here which might trip you up:

  • We are using the UserProfileManager.IsAvailable() method to find out if we have a user profile service application provisioned and assigned to the current Web Application.
  • At the moment this code throws an error if the current user doesn’t have a User Profile. You may want to handle this differently for your environment.
  • Handling of multi-value fields. At the moment all we do is take the string values and concatenate them with “OR” in between. So if you had “Value1; Value2” as your property value, the token replacement would put “Value1 OR Value2” into the search query.

As long as the content editors are aware of the behaviour this allows us to create quite complex queries.

For example, if we now used the Fixed Query:

([UPP-SPS-Responsibility]) AND Write:[TODAY-14]..[TODAY]

Then for a user whose “Ask Me About” values were “SharePoint” and “IT Administration”, the resulting Search Query would be:

(SharePoint OR “IT Administration”) AND Write:13/05/2012..17/05/2012

If another user comes along whose “Ask Me About” property was just set to “Marketing” then the resulting Search Query would be:

(Marketing) AND Write:13/05/2012..17/05/2012

This is without changing any of the web part properties, and allows us to drive dynamic content from a single web part to our entire user base.

Hopefully you can see that this is incredibly powerful and flexible.

Step 5 – Making it re-usable Part 3 – Controllable Sort By
The final step is to allow our content editors to control the “Sort By” functionality. The default OOTB web part only allows us to sort by “Relevance” or “Last Modified”, which is fine when you are looking at general search results, but when you are building custom components (such as news, links or event feeds) you typically want to control the order by date, title or something a little more useful for the specific component.

So this bolt-on allows you to control the Sort By. First off, we need to add some Web Part properties so that the user can modify their values:

[Personalizable(PersonalizationScope.Shared)]
[WebBrowsable(true)]
[WebDescription("Sort by this managed property")]
[WebDisplayName("Managed Property")]
[Category("Sort Override")]
public string OrderByProperty { get; set; }

[Personalizable(PersonalizationScope.Shared)]
[WebBrowsable(true)]
[WebDescription("Sort direction")]
[Category("Sort Override")]
public Microsoft.Office.Server.Search.Query.SortDirection SortDirection { get; set; }

This will provide the Web Part property editing functionality in the tool pane (under a “Sort Override” category).

Once we have done that, we can add the following code to our ConfigureDataSourceProperties method (yes .. this method really is where all of the grunt work goes on in this web part!)

// if OrderByProperty is not set, use default behavior
if (!string.IsNullOrEmpty(OrderByProperty))
{
    // change the sort order
    dataSource.SortOrder.Clear();
    dataSource.SortOrder.Add(OrderByProperty, SortDirection);
}

And that is all there is to it.

Step 6 – Enjoy!
So congratulations if you made it this far. I know this was a long blog post, but I thought it was worth walking through it properly.

If you have any questions or feedback then please get in touch using the comments, and here are links to the downloads (which are also referenced at the top of this blog post)

Some notes about the “final” version:

  • The code structure is slightly different because the DateTime [TODAY] queries are cached using Web Part Properties for better performance
  • The [TODAY] token is case sensitive!
  • There is an extra “Debug Mode” checkbox in the Web Part Properties which, when enabled, spits out the entire query being executed at the bottom of the search results.
  • The code contains an “Editor Part” .. this just clears out the cache value when the web part properties are modified

Usage Summary:

Tokens you can use are:

  • [TODAY]
  • [TODAY+X] (add X days)
  • [TODAY-X] (remove X days)
  • [UPP-{Internal Name of User Profile Property}]

Example Usage

Sample user has:
Name: Martin Hatch
Ask Me About: SharePoint; Solution Architecture; Code

ContentType:Event AND ([UPP-SPS-Responsibility]) AND Write:[TODAY-7]..[TODAY]
becomes
ContentType:Event AND (SharePoint OR “Solution Architecture” OR Code) AND Write:12/05/2012..17/05/2012
Returns all events which were updated within the past week, and contain the current user’s “Ask Me About” values.

Author:[UPP-PreferredName] IsDocument:1
becomes
Author:”Martin Hatch” IsDocument:1
Returns all documents written by the current user

Author:[UPP-PreferredName] Write>=[TODAY-14]
becomes
Author:”Martin Hatch” Write>=02/05/2012
Returns all content created by the current user and updated within the past 2 weeks

SharePoint Rockstar – a Nickelback Parody

This was inspired by a short twitter conversation with @cimares, @ToddKlindt and @usher about the #SharePoint #Rockstar and the potential for a rip off parody of the Nickelback song “Rockstar“..

Basically I felt like finishing the song off .. so without further ado .. to the tune of Nickelback’s “Rockstar” I give you ..

SharePoint Rockstar..

I’m through with coding in line
And unghosting everything
I’m using SharePoint Designer
And I’m never gonna win
The solution didn’t turn out
Quite the way I want it to be
(Tell me what you want)

I want a brand new blog,
with the comments all filled
And a server room I can play baseball in
And a laptop full of software that
I got for free
(So what you need?)

I’ll need a Skype account that’s got no limit
A huge laptop with an SSD in it
Gonna get my own
parking space at TVP
(Been there, done that)

I want to get an invite to a conference pass
My own seat up in Business Class,
Somewhere between Spence and
Steve Smith is fine for me
(So how you gonna do it?)

I’m gonna tweet like mad, adopt SharePoint zen
I’ll use the hashtag #SP2010

[Chorus:]
‘Cause we all just wanna be big rockstars
Fixing errors in the logs that are just bizarre
I code so much I got RSI, but my User Profile Service gonna start first time!
And we’ll hang out in the SharePint bar
In the VIP with the SharePoint stars
Every ITPro and coders
gonna wind up there
With our free vendor shirts
That we just won’t wear
Hey I wanna be a SharePoint rockstar
Hey I wanna be a SharePoint rockstar

Wanna be great like Eric Shupps but without the hat
Pass every single exam I’ve sat
Talk at the User Group
So I can get my drinks for free
(I’ll have a SharePint on the house!)

I’m gonna get the latest version
Setup on my VM
Get a free Ultimate key to MSDN
Gonna date a designer
who builds all my sites for free
(so how you gonna do it?)

I’m gonna tweet like mad, adopt SharePoint zen
I’ll use the hashtag #SP2010

[Chorus:]
‘Cause we all just wanna be big rockstars
Fixing errors in the logs that are just bizarre
I code so much I got RSI, but my User Profile Service gonna start first time!
And we’ll hang out in the SharePint bar
In the VIP with the SharePoint stars
Every ITPro and coders
gonna wind up there
With our free vendor shirts
That we just won’t wear

And we’ll hang out in the speaker rooms
With all the MVPs and whoever is cool
I’ll build you anything with cascading styles
Everybody’s got a contractor on speed dial

Hey I wanna be a SharePoint rockstar

I’ll annoy QA by writing messy code
I’ll deploy my solutions in debug mode

I’ll get an off-shore team to write all night long
Then I’ll code it again because they’ll get it all wrong ..

[Chorus:]
‘Cause we all just wanna be big rockstars
Fixing errors in the logs that are just bizarre
I code so much I got RSI, but my User Profile Service gonna start first time!
And we’ll hang out in the SharePint bar
In the VIP with the SharePoint stars
Every ITPro and coders
gonna wind up there
With our free vendor shirts
That we just won’t wear

And we’ll hang out in the speaker rooms
With all the MVPs and whoever is cool
I’ll build you anything with cascading styles
Everybody’s got a contractor on speed dial

Hey I wanna be a SharePoint rockstar
Hey I wanna be a SharePoint rockstar

21 Things I would do if I was an evil SharePoint overlord!

  1. All site collections will be deployed with site collection quotas allowing only 1 sandbox resource point
  2. The Site Collection storage limit warning will be set at 1MB for My Sites, with the entire company set as the warning email address
  3. I will insist that all site collections are created with their own host name URL. This will force any BI tools to require new SPNs for Kerberos configuration
  4. Ideally, each of these sites will have their own Web Application, and their own application pool, which will force them to buy new servers so keeping within the “10 application pools per server” guidelines which I will give them
  5. Every web application will have a custom service connection proxy group, so every time a new service application is created they will have to manually add it to each web application’s custom proxy group
  6. All databases will be created through Powershell by concatenating random GUIDS (in addition to the ones SharePoint creates automatically)
  7. While developing, all of my API classes will be public with internal constructors
  8. All default site content will be deployed using HTML-encoded XML, with multiple unnecessary nested divs and empty spans.
  9. Feature Stapling will be banned .. as will Content Types
  10. I will configure all Diagnostics Log categories to “Verbose”, disable flood protection and only keep log files for 1 day, making it a painful and arduous task to troubleshoot issues.
  11. Each SharePoint server will install to a non-default directory. This will be different for each server to keep the admin team on their toes.
  12. I will include a script which adds expiration policies to the “Document” content type in each site collection .. this will bombard the author with emails if they don’t update their documents every 2 weeks, therefore keeping the content fresh
  13. SharePoint Designer will be unblocked, and its usage will be encouraged!
  14. The User Profile import will be configured to run every 2 minutes .. keeping the process continually running so no-one can modify the connections
  15. For contrast, the default Search content source will only index User Profile content every 56 hours .. so no-one can be exactly sure when it will be updated
  16. Each web application will be given different URLs for each department. IIS bindings will be put in place, but no alternate access mappings so they cannot share links or embedded urls with each other.
  17. The default zone will be set as Read Only via a policy so that items found in search results cannot be edited. There will be an alternate URL, but access mappings won’t exist, so users will have to swap it out manually
  18. The reply-to email address for all notifications will be set as the company switchboard.
  19. All custom web parts will, where possible, be deployed as Farm Features .. so that everyone can see them, but will only be configured to work on specific sites.
  20. We will not have specific servers .. all farm servers will run all of the services. I will convince the IT team that this makes their lives easier as they only need 1 server spec when buying new machines.
  21. I will set the quota of the My Site host to 10MB so that only the first few users will be able to upload their profile picture.

Suggestions are welcome in the comments 🙂

Code Solution – Import AD Photos into SharePoint User Profiles

[Update: Code download files updated 18/03/2011 – see below]

This is in relation to a previous post I made last week;

Active Directory Images are not imported by the SP2010 User Profile Import

So, the source code is finally ready and uploaded for your enjoyment 🙂

Now – I must first off give credit to the sources of inspiration. A lot of the code in this solution is copied / borrowed / inspired by the following posts:

All I have done is bring their code and samples together and package it into a WSP that runs from SharePoint Timer Jobs, so you have the convenience of a SharePoint 2010 farm solution 🙂

Also it should be understood that both AD and the User Profile database are quite critical parts of anyone’s SharePoint farm, so sorry, but first I need to make a ….

… Disclaimer – All code and solutions are provided “as is” and should be used at your own risk! It is highly recommended that you test these in an isolated environment, and I confer no responsibility for any loss or damage from using the code, advice or solutions provided on this blog, or any related content.

Ok, now that is out of the way we can get on with business 🙂

[Update – 18/03/2011 – I have updated both sets of files so that it now uses the “distinguishedName” attribute to identify users in AD .. as this is a more reliable method and was in response to a reported bug]

I have uploaded the files to my Sky Drive including:

When you roll out the WSP you will find that it includes the following functionality:

Farm Scoped Feature
The WSP package includes a farm scoped feature called:

Hatch Solutions – Import Photos from AD (Timer Job)

When activated this will automatically identify the default MySite host application and create a custom Timer Job (see below) attached to that web application.

My Site Timer Job
The Timer Job (installed by the Farm Feature) is designed to run on the My Site Host web application, and is pre-configured to run once per hour. It is called:

Hatch Solutions – Import Photos from AD

This will do the following (a rough code sketch follows the list):

  • Automatically identify all AD accounts in the current User Profile Database
  • If the AD account has a “jpegPhoto” attribute, then this is extracted
  • The photo is converted to three thumbnail images, and uploaded to the My Site Host profile photo asset library
  • The photo for that user profile is updated to point at their newly uploaded photo
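
For the curious, the core of that logic is fairly compact. Here’s a rough sketch of the idea (NOT the packaged solution .. the thumbnail generation, iteration over all profiles and error handling are stripped out, and the library/property names are the standard ones I would expect .. treat it as pseudo-code):

using System.DirectoryServices;
using Microsoft.Office.Server.UserProfiles;
using Microsoft.SharePoint;

// Rough sketch only: fetch one user's "jpegPhoto" from AD and point
// their profile at it. The real solution also generates the three
// thumbnail sizes and runs this for every profile from the timer job.
private static void ImportPhoto(UserProfile profile, SPWeb mySiteHostWeb)
{
    // the profile stores the AD distinguishedName, which we use to locate the account
    string dn = (string)profile[PropertyConstants.DistinguishedName].Value;

    using (DirectoryEntry entry = new DirectoryEntry("LDAP://" + dn))
    {
        byte[] photo = entry.Properties["jpegPhoto"].Value as byte[];
        if (photo == null)
        {
            return; // no photo stored in AD for this user
        }

        // upload into the My Site Host profile photo asset library
        SPFolder folder = mySiteHostWeb.GetFolder("User Photos/Profile Pictures");
        string fileName = profile.RecordId + "_MThumb.jpg";
        SPFile file = folder.Files.Add(fileName, photo, true);

        // update the profile to point at the newly uploaded photo
        profile[PropertyConstants.PictureUrl].Value = mySiteHostWeb.Url + "/" + file.Url;
        profile.Commit();
    }
}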

Hope you enjoy, the source code is there for all to see, and good luck!

Tip – CAML Query to retrieve tasks assigned to a user (including both AD and SP Groups)

This is something I have seen so many people struggle with, but it really is very easy, with the help of the “<Membership>” element.

The Membership element allows you to basically check to see if the AssignedTo field is assigned to any group which the current user is a member of.

Of course, you still need to use it in conjunction with a standard FieldRef check against the user’s ID (which you can get using the <UserID/> element).

Below is the CAML query to return All Tasks Assigned to the Current User, including specific assignments, and where the task is assigned to a group that contains the current user (both AD Groups and SharePoint Groups).

I suppose theoretically this should also work with groups in custom Membership Providers .. but haven’t tried it.

<Where>
  <Or>
    <Eq>
      <FieldRef Name='AssignedTo' />
      <Value Type='Integer'><UserID/></Value>
    </Eq>
    <Membership Type='CurrentUserGroups'>
      <FieldRef Name='AssignedTo' />
    </Membership>
  </Or>
</Where>
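
For completeness, here’s a minimal sketch of running that query from the server object model (the site URL and list title below are placeholders .. swap in your own):

using System;
using Microsoft.SharePoint;

// Sketch: run the CAML above against a task list for the current user
using (SPSite site = new SPSite("https://myspsite"))
using (SPWeb web = site.OpenWeb())
{
    SPList tasks = web.Lists["Tasks"]; // placeholder list title

    SPQuery query = new SPQuery();
    query.Query =
        "<Where><Or>" +
            "<Eq>" +
                "<FieldRef Name='AssignedTo' />" +
                "<Value Type='Integer'><UserID/></Value>" +
            "</Eq>" +
            "<Membership Type='CurrentUserGroups'>" +
                "<FieldRef Name='AssignedTo' />" +
            "</Membership>" +
        "</Or></Where>";

    foreach (SPListItem task in tasks.GetItems(query))
    {
        Console.WriteLine(task.Title);
    }
}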

Forays into SharePoint 2010 Performance Testing with Visual Studio 2010

Over the past six months I have increasingly become an evangelist of Performance Testing. It has always been an area I was aware of but never got massively involved in, but recently I’ve seen it become an increasingly important part of my work, especially on larger scale projects with load-balanced web front ends (for performance, not just redundancy) where you start hitting I/O limits on SQL. I suppose this may have been triggered by the SharePoint Conference 2009, and one of my follow-up blog posts “Load Testing SharePoint 2010 with Visual Studio Team Test“.

So in this post I firstly wanted to look at why you should do Performance Testing at all.

It sounds like a bit of a stupid question (with an obvious answer) but it really is surprising how many people don’t do it. How many of you have ever asked the following questions on a project?

“How many users can the production system support?”
“What would be the impact of doubling the number of users?”
“What impact will backups have on performance?”
“How fast will the solution perform during peak hours?”
“What is the most cost-effective way of improving performance?”

All of these are questions that you absolutely HAVE to be able to answer. The client (whether it is your organisation, or another organisation that you are running a project for) deserves to know the answers to these, and without them how can you have any idea whether your solution is going to be fit for purpose?

Sure, you can read up on Estimating Performance and Capacity Planning in SharePoint, but all that gives you is some rough guidelines.. we need to be able to apply some science to the process!

The last question is probably the most compelling. Re-configuring farms and buying new hardware is an expensive process; the consultancy alone can cost thousands of pounds, and you don’t want your client coming back asking why they just spent tens of thousands of pounds on a new state-of-the-art iSCSI SAN array that had zero impact on performance (“hey .. we thought it would help .. but we didn’t really know!”) because the bottleneck was actually the CPU on the Web Front End (WFE).

The story often gets even worse when things do start going wrong. If you have ever been in the unfortunate position where you are troubleshooting a system that is performing badly, these kinds of questions are quite common:

“What is causing the poor performance?”
“How can we fix this?”
“Why did you not notice this during development?”

Again, the last two questions are the killers .. if you don’t do any Performance Testing then you won’t know that you have a problem until it is too late. The earlier you can get some metrics on this, the faster you will be able to react to performance issues (in some cases finding and fixing them before the client even knows about it!)

Equally, without performance testing you won’t know WHY the problems are occurring. If you don’t know why, then you can’t know the best way to fix them!

So the key messages are this:

  • Early Warning .. catch problems early on and they will be easier to fix. There is no point waiting until users are hitting the system to find out the solution can’t cope with the load!
  • Knowledge … what is causing the problems, and how do you fix them?
  • Confidence … not just that you know what you are doing, but you can prove it. This instils confidence in your sales, confidence in your delivery, and confidence from your clients too!

Performance Testing with Visual Studio 2010
I’ve been using Visual Studio 2010 Ultimate edition. It is the only “2010” product that incorporates Web Performance Tests and Load Tests, the two critical pieces that you will use to test the performance on SharePoint 2010 (or any other web based system). It also integrates tightly with Team Foundation Server and provides “Lab Management” capability, but that is out of the scope of this blog post.

In order to do comprehensive testing you really need 4 different software packages:

  1. Visual Studio 2010 Ultimate: This is where you create your tests and control the execution of them.
  2. Visual Studio 2010 Test Controller: Part of the Visual Studio Agents 2010 ISO, this allows you to co-ordinate tests executed by several “agents”, as well as collecting results and storing all of the test results (and performance counters) in a database. The license for this is included in Visual Studio 2010 Ultimate.
  3. Visual Studio 2010 Test Agent: Part of the Visual Studio Agents 2010 ISO, this can be installed on machines that will simulate load and execute tests. They are connected to a “Controller” which gives them instructions. The license for this is included in Visual Studio 2010 Ultimate.
  4. Visual Studio 2010 Virtual User Pack: This is a license that allows you to increase the number of virtual “users” you can simulate by 1,000 (for each pack that you purchase). This is a separate license that must be purchased separately (there is no trial version!)

If you need any help installing these and getting them running then there is a great MSDN article which you should read: Installing and Configuring Visual Studio Agents and Test and Build Controllers or the equally awesome article from Visual Studio Magazine: Load Testing with Visual Studio 2010.

So what about actually creating the tests?

Well, the interface is pretty simple. You can create your “Web Performance Tests” using a simple Browser Recorder (literally a web browser which records all of your actions until you click “stop”). This works great, but there are a few caveats:

  • You might want to use the “Generate Code” option if you are adding documents or list items. This converts your recorded web test into a code file, allowing you to programmatically change document names or field values .. useful to make sure you are not just overwriting the same document over and over again (see the sketch after this list)
  • Web Service tests require a bit more “knowledge” of how they work, needing the SOAP envelope (in XML) and the SOAPAction header.
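
To give you a flavour of the “Generate Code” route, a coded web test is just a class that yields requests .. here’s a minimal hand-written sketch (the URL is a placeholder):

using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Minimal coded web test sketch: request a page, varying a value on
// each run so we aren't just repeating identical requests.
public class HomePageWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // a unique value you could feed into a document name or field value
        string unique = Guid.NewGuid().ToString("N");

        WebTestRequest home = new WebTestRequest("http://myspsite/Pages/Default.aspx?test=" + unique);
        yield return home;
    }
}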

It is worth noting that there is an excellent Code Plex project available: “SharePoint Performance Tests“. Although this was written for Visual Studio 2008 (you can convert it to 2010 if you want) it contains a number of configurable tests (via XML) that allow you to dynamically create tests for generic SharePoint platforms .. well worth a look!

You can then very easily create a “Load Test” which allows you to pick’n’mix tests, and set the distribution of which tests you want to run.

My personal favourite is the “Tests Per User Per Hour” model. For this you would sit down with your client and work out “what would a typical user do in an hour of using the system?” .. one such exercise resulted in this kind of activity distribution:

  • Hit the site home page 50 times
  • Execute 10 searches
  • Upload 5 documents
  • Respond to 20 workflow tasks

This kind of valuable information allows you to build your tests and then distribute them using the Load Test. All you do then is plug in how many users you want to simulate and away you go!

Counting the Counters?
All of this so far is great stuff, but without the performance counters you really aren’t going to get much mileage out of Visual Studio. You might get the WHAT (i.e. do the tests complete quickly?) but you certainly won’t get the WHY, which is the oh-so important bit (i.e. is it the CPU, RAM or Disk?)

For this you need to add Performance Counters … thankfully this is ridiculously simple. You have something called “Counter Sets” which you can configure to collect from the computers that make up your farm.
There are a bunch of pre-defined counter-sets you can choose from:

  • Application
  • ASP.Net (I pick this for my WFE Servers)
  • .Net Application (I pick this for my Application Servers)
  • IIS
  • SQL (I pick this for my SQL Servers)

I won’t go into any more detail than that. A step-by-step walkthrough of the options (including screenshots) can be found at the Load Testing with Visual Studio 2010 article at Visual Studio Magazine.

What about the Results?
Well, there isn’t a really simple answer to this. You really need to have a good understanding on how the different hardware components interact, and what limits you should be looking for.

The big hardware counters (CPU usage, Available Memory) are the obvious ones. Any server which exceeds 80% CPU usage for any sustained period is going to be in trouble and is close to a bottleneck. Equally any server which starts to run out of memory (or more importantly .. slowly loses memory, suggesting a memory leak!) should be identified.

But it’s the deeper, more granular analysis that proves most useful. On a recent client project I was looking at a Proof of Concept environment. We knew that we had a bottleneck in our WFE (CPU was averaging around 90%) and the solution was extremely workflow heavy, but the page performance was far too bad to put down to just the CPU.

On closer inspection we found a direct correlation between the Page Response Time and the Disk Queue Length in SQL Server:

The top-left corner is the Disk Queue Length in SQL Server, and the Top Right is the Page Response Time for the Document Upload operation (bottom right is the overall Test Response time), clearly the spikes happened at the same time.

This is the true power of using Visual Studio. All of the tests and performance counters are time-stamped, allowing you to drill into any specific instance and see exactly what was happening at that moment in time!

Looking closer at the SQL Disk usage, the Write Time (%) and Read Time (%) show us even more interesting results:

The top of the graph shows the Disk Write Usage (%) and the bottom half shows the Disk Read Usage (%). Clearly, the disk is very busy writing (often being at 100%) while it does very little reading. This fits perfectly with our test results as most of the “read” operations (like viewing the home page, or executing a search result) were extremely fast … but most of the “write” operations (like uploading a document) were much slower.

So the WHAT is slow write performance (uploading of documents).
The WHY is now very simple, the disks on the SQL Server need looking at (possibly upgrading to faster disks, or some optimisation in the configuration of the databases).

Conclusion
To be honest I could talk about this subject all day, but hopefully this gives you some indication of just how crucial Performance Testing is .. and how powerful Visual Studio can be as a testing tool.

The ease of creating test scripts, the vast flexibility and power of the enormous range of performance counters available, and the ability to drill into a single second of activity and see (simultaneously) what was going on in all of the other servers .. it’s an awesome combination.

I’ll probably be posting more blog posts on this in the future, but for now good luck, and hope you get as much of a kick out of VS2010 as I have 🙂

How to: Achieve Count(*) on a large SharePoint list

This has been a mission of mine for a while now (before I went on holiday and took a 2 week hiatus from all things SharePoint :)).

One of the clients I’ve been working with has been trying to replicate a pretty simple operation (by normal development standards). They have a SharePoint list with a LOT of items in it (we are talking 200,000 list items and above) which includes some Choice fields.

They want to return a count of how often each choice value is being used. Now, if you were using SQL Server you would simply do the following pseudo-SQL:

select count(*) from myList group by myChoiceField

At first look in SharePoint this is not possible:

  • There is no “count” operation in CAML, nor any other kind of aggregation function
  • SharePoint Search “full text query” does not support the count(*) operator (or anything similar)
  • The only reference to aggregations is in the SPView.Aggregations property .. this is only used by the rendered HTML and the values are not returned in the result set.

Now .. I know that you can get count values on a list: if you create a view with a “Group By” then it shows you the number of items in each group, so it MUST be possible! So my mission started ..

List view with groups
We want to replicate this behaviour,
but programmatically!

First.. we need a test environment
The first thing I did was create a really big list. We are talking about 200,000 list items, so you can’t just pull all the items out in an SPQuery (as it would be far too slow!).

I generated a simple custom list, added a choice field (with optional values of 1-20) and then generated 200,000 list items with a randomly assigned choice value (and a bunch of them without any choice value at all .. just for laughs).

Now I could play with my code

Attempt Number 1 – Retrieve all list and programmatically calculate the counts (fail)
I kinda knew this wouldn’t work .. but I needed a sounding board to know HOW bad it really was. There are 200,000 items after all, so this was never going to be fast.

  • Use SPQuery to retrieve 2 fields (the ID, and my “choice” field).
  • Retrieve the result set and iterate through it, incrementing an integer value to get each “group” count value

This was a definite #fail. To retrieve all 200,000 list items in a single SPQuery took about 25 seconds to execute … FAR too slow.

Attempt Number 2 – Execute separate query for each “group” (fail)
I was a little more positive with this one … smaller queries execute much faster so this had some legs (and this is certainly a viable option if you only want the count for a SINGLE group).

  • Create an SPQuery for each of the “choice” values we want to group by (there are 20 of them!)
  • Execute each query, and use SPListItemCollection.Count to get the value

Unfortunately this was another spectacular #fail. Each query executed in around 2 seconds .. which would be fine if we didn’t have to do it 20 times! 🙁 (i.e. a 40 second page load!!)
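
That said, if you DO only need the count for a single group, the per-group approach is perfectly workable .. a minimal sketch (the field name and value are placeholders, and “list” is your SPList):

// Sketch: count the items with one specific choice value.
// "MyChoiceField" / "Value1" are placeholders for your own field/value.
SPQuery query = new SPQuery();
query.Query = "<Where><Eq>" +
                  "<FieldRef Name='MyChoiceField' />" +
                  "<Value Type='Choice'>Value1</Value>" +
              "</Eq></Where>";
query.ViewFields = "<FieldRef Name='ID' />"; // keep the payload small

SPListItemCollection results = list.GetItems(query);
int count = results.Count; // the count for this "group"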

Attempt Number 3 – Use the SPView object (success!)
Ok .. so I know that the SPView can render extremely fast. With my sample list, and a streamlined “group by” view, it was rendering in about 2 seconds (and that’s on my laptop VM! I’m sure a production box would be much, much quicker).

The main problem is … how do you get these values programmatically?

The SPView class contains a “RenderAsHtml” method which returns the full HTML output of the List View (including all of the group values, JavaScript functions, the lot). My main question was how it actually worked (and how on earth it got those values so quickly!)

I started off poking into the SPView object using Reflector (tsk tsk). The chain I ended up following was this:

  • SPView.RenderAsHtml() –>
    • SPList.RenderAsHtml() (obfuscated … arghhhh)

So that was a dead end .. I did some more poking around and found out that SPContext also has a view render method …

  • SPContext.RenderViewAsHtml() –>
    • SPContextInternalClass.RenderViewAsHtml() –>
      • COM object ! (arghhhh)

Now .. the fact that we just hit a COM object suggests that we are starting to wander towards the SQL queries that get executed to retrieve the view data .. I didn’t want to go anywhere NEAR that one, so I decided to leave it there and try using the output HTML instead (nasty .. but not much of a choice left!).

using (SPSite site = new SPSite("https://myspsite"))
{
    SPList list = site.RootWeb.Lists["TestList"];
    string strViewHtml = list.Views["GroupedView"].RenderAsHtml();
}

Having done this we now have the HTML output of our view (and this code takes about 2-3 seconds to execute … fast enough for my laptop .. we can always cache the value if needed).
 
Looking through the DOM output in the browser, it was possible to identify the “group” element by their attributes. It is a TBody node with both an ID attribute and a “groupString” attribute (the GroupString is the important one, as it tells us the view is configured to “Group By”).
 
What I needed next was a way of getting the actual values out of the HTML. For this I used the extremely awesome “HTML Agility Pack” from Codeplex. This is a set of libraries that allow you to parse DOM elements, including both plain “poorly formed” HTML as well as XHTML, and then use XPath queries to extract any values you want (much in the same way you would normally use the XML namespace for XHTML).
 
This gave me the TBODY node, and from there I could use string manipulation on the “InnerText” to pull out the group name and the count value 🙂

// Using HTML Agility Pack (Codeplex)
// load the HTML into the HtmlDocument object
HtmlDocument doc = new HtmlDocument();
doc.LoadHtml(strViewHtml);

// retrieve all TBODY elements which have both
// an ID and a groupString attribute
HtmlNodeCollection nodes = doc.DocumentNode.SelectNodes("//tbody[@id][@groupstring]");

if (nodes != null)
{
    foreach (HtmlNode node in nodes)
    {
        // extract the Group Name
        string strGroupName = node.InnerText.Substring(node.InnerText.LastIndexOf("&nbsp;") + 6);
        strGroupName = strGroupName.Substring(0, strGroupName.IndexOf("&#") - 1);
        Console.Write("Group: " + strGroupName + ", ");

        // extract the number of items
        string strValueText = node.InnerText.Substring(node.InnerText.LastIndexOf("(") + 1);
        Console.WriteLine("Number of Items: " + strValueText.Substring(0, strValueText.Length - 1));
    }
}

As you can see I’m doing some rather nasty SubString statements.. there may well be a quicker and cleaner way to do this using Regex .. this was more a proof of concept than anything else 🙂
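
If you fancied tidying it up, a single Regex could do the same job as the two SubString calls — a rough sketch only, assuming the group text keeps exactly the same “… &nbsp;Name … (count)” shape that the SubString logic relies on (requires System.Text.RegularExpressions):

// equivalent of the SubString logic above
Match m = Regex.Match(node.InnerText,
    @".*&nbsp;(?<name>[^&]*).&#.*\((?<count>\d+)\)",
    RegexOptions.Singleline);
if (m.Success)
{
    Console.WriteLine("Group: " + m.Groups["name"].Value +
                      ", Number of Items: " + m.Groups["count"].Value);
}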

Result!

[Screenshot: console output, showing the group names and counts]

3 seconds isn’t bad, running on a “single server” laptop VM image 🙂

The end result was a 2-3 second bit of code, retrieving Group By / Count values for a list with 200,000 list items.

Not bad for an afternoon’s work 🙂

Attempt 4 – Do the same thing in jQuery (kudos to Jaap Vossers)
This was actually the original solution. I asked Jaap if he could look at this when he had some spare time, as I knew he had a lot of jQuery experience (and he blew me away by having it all working in under 30 minutes!).

Basically it uses pretty standard jQuery to go off and retrieve the HTML content from another page, scraping it to pull back the values. The same as the C#, it grabs the group TBODY, then walks down the DOM to retrieve the text value that it outputs.

The speed is roughly the same as the actual view itself. I’m sure some more jQuery could be employed to pull out the specific values and do more with them, but the concept appears to be sound:

<script type="text/javascript" src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>

<script type="text/javascript">
$(document).ready(function(){

    // you will need to change this URL
    var url = "https://myspsite/Lists/MyList/GroupedView.aspx";

    var groupings = [];

    $.get(url, function(data) {
        // grab the text from each group header TBODY
        $(data).find("tbody[id^='titl'][groupString] > tr > td").each(
            function(index, value){
                groupings.push($(this).text());
            }
        );

        $("#placeholder").append("<ul></ul>");

        // output each group header as a list item
        $.each(groupings, function(index, value){
            $("#placeholder ul").append("<li>" + value + "</li>");
        });
    });
});
</script>
<div id="placeholder"></div>

tada .. (thanks Jaap!)

[Screenshot: result from the jQuery output, dropped into a Content Editor Web Part]

Summary
Well .. I know doing HTML scraping isn’t pretty, but seeing as this approach is MUCH faster than anything else I’ve seen (and the actual rendering logic is locked away inside a COM object) there didn’t seem to be much choice.

By all means, feel free to let me know if you have any alternatives to this.

RCWP Part 3 – Edit Web Part using a Ribbon modal dialog

This follows on from Part 1 (where we created a “Related Content Web Part”) and Part 2 (where we added a contextual tab to the Ribbon).

This post summarises the final part of building this Web Part (which we completed in the final session of the day at SPRetreat last Saturday).

We wanted to provide a pop-up window, accessed through our new Contextual Tab in the Ribbon, which allowed us to easily modify some web part properties.

The basis of this was quite straightforward, and it certainly starts off easily enough.
We created a new Application Page (RCWP_SetFieldValue.aspx) which would contain the code to update our Web Part properties.

In this file we added a simple ASP.Net Label, Drop Down List and button.

<asp:Content ID="Main" ContentPlaceHolderID="PlaceHolderMain" runat="server">
<p>
    This allows you to set the field value for the <strong>Related Content Web Part</strong>
</p>
<asp:Label runat="server" ID="lblChoice" Text="Select Field:" AssociatedControlID="ddlFields" /><br />
<asp:DropDownList runat="server" ID="ddlFields" /><br />
<asp:Button runat="server" ID="btnNike" Text="Just Do It!" />
</asp:Content>

Back in Part 2 we created a JavaScript file which was used for the “Command” events for our Buttons (yes .. I told you we’d be looking at that again!).

Here we are going to modify one of the Buttons so that it throws up a SharePoint Modal Dialog with our Application Page in it.

The code below is modified from the original MS Blog Article I referenced in Part 2 (called “How to create a Web Part with a Contextual Tab”).

if (commandId === 'CustomContextualTab.GoodbyeWorldCommand') {
    //alert('Good-bye, world!');
    var options = {
        url: '/_layouts/SPR3/RCWP_SetFieldValue.aspx',
        title: 'Set Field',
        allowMaximize: false,
        showClose: true,
        width: 800,
        height: 600
    };
    SP.UI.ModalDialog.showModalDialog(options);
}

I have basically changed the JavaScript for the “GoodbyeWorldCommand” button so that it does something different.

I am using the new SP.UI.ModalDialog namespace in the SharePoint ECMAScript to pop up a modal dialog window.

(Note – I also changed the display text to “Set Field” .. and deleted the other button to clean up the ribbon a bit)

But don’t forget that our Application Page is running from _layouts … it’s in a completely different place to our Web Part so this really isn’t enough for our page to work. In order to do anything else our Layouts page would need the following information:

  • The Page that the web part is on (URL)
  • Which Web Part to update on that page (Web Part ID)

The URL of the current page is easy enough using JavaScript (location.href) but the Web Part ID … this represented a new challenge.

How do you get the server-side Web Part ID through JavaScript?

This problem took the entire final hour of the day (Session 5) and took quite a bit of research and web searching. Eventually (after a few suggestions) we hit upon the answer:

Back in Part 2 we also created a JavaScript file which was used to register our Contextual Tab, and that same file contains a reference to a “PageComponentId”.

getId: function ContextualTabWebPart_CustomPageComponent$getId() {
    return this._webPartPageComponentId;
}

The specific instance of our Web Part had a “PageComponentId” of “WebPartWPQ2”, and after some digging we found it in the source of the page!

<div WebPartID="866ef42d-6626-45e0-af9c-a00467ed2666" WebPartID2="1ad9529a-5e86-4e7c-9d4d-022a1fa6e6c0" HasPers="false" id="WebPartWPQ2" width="100%" class="ms-WPBody noindex ms-wpContentDivSpace" allowRemove="false" allowDelete="false" style="" >

The attribute that REALLY stands out though is the WebPartID:

WebPartID="866ef42d-6626-45e0-af9c-a00467ed2666"

This is clearly a GUID value, referring to the server-side Web Part ID for that instance of the Web Part.
So .. how do we get this to our dialog? Well, good old trusty document.getElementById() (we could have used jQuery, but I didn’t want to have to install the library .. and don’t forget .. I only had 1 hour to get this working at SPRetreat!!)

Using this information, I could modify my JavaScript to retrieve these values, and pass them through to my Modal Dialog.

// get the Web Part DIV element
var element = document.getElementById(this._webPartPageComponentId);
// extract the Web Part ID attribute (its value is the GUID)
var wpID = element.attributes["WebPartId"];
// pass through the URL and Web Part ID
var options = {
    url: '/_layouts/SPR3/RCWP_SetFieldValue.aspx?wpID=' + wpID.nodeValue + '&url=' + location.href,
    title: 'Set Field',
    allowMaximize: false,
    showClose: true,
    width: 800,
    height: 600
};
SP.UI.ModalDialog.showModalDialog(options);

Note – as the Web Part ID is an HTML attribute node, we need to use its “nodeValue” property rather than toString();

So .. first off, in our Application Page we can use the URL to retrieve the fields from the page’s Content Type and populate our Drop Down List.

protected void Page_Load(object sender, EventArgs e)
{
    TargetUrl = Request.QueryString["url"];
    // remove any query strings
    if (TargetUrl.IndexOf("?") != -1)
    {
        TargetUrl = TargetUrl.Substring(0, TargetUrl.IndexOf("?"));
    }
    if (!Page.IsPostBack)
    {
        ddlFields.Items.Clear();
        SPFile file = this.Web.GetFile(TargetUrl);
        foreach (SPField field in file.Item.Fields)
        {
            if (!field.Hidden)
            {
                ListItem item = new ListItem(field.Title, field.StaticName);
                ddlFields.Items.Add(item);
            }
        }
    }
    btnNike.Click += new EventHandler(btnNike_Click);
}

I did a bit of string manipulation on the URL to make sure we trim out any URL query strings, and then use that to retrieve an SPFile object.

We then just iterate through the SPListItem.Fields collection, adding any fields that are not hidden.

Note – we are using an ASP.Net ListItem object in the Drop Down List, so that we can use the Display Name in the drop-down, but store the Static Name as the value .. it’s the Static Name we need to save to our Web Part!

The next bit is under our Click event. We can now use the URL to get the SPLimitedWebPartManager for the page, pass in the Web Part ID, and retrieve the instance of my Web Part (allowing me to set the field value).

protected void btnNike_Click(object sender, EventArgs e)
{
    // get the Web Part ID
    wpID = Request.QueryString["wpID"];
    // retrieve the Web Part Manager for the URL
    SPFile file = this.Web.GetFile(TargetUrl);
    SPLimitedWebPartManager wpm = file.GetLimitedWebPartManager(PersonalizationScope.Shared);
    // get the safely-cast web part object
    RelatedContentWebPart.RelatedContentWebPart wp =
        wpm.WebParts[new Guid(wpID)] as RelatedContentWebPart.RelatedContentWebPart;
    if (wp != null)
    {
        // set the web part property, and save settings
        wp.FieldName = ddlFields.SelectedValue;
        wpm.SaveChanges(wp);
    }
    // close the modal dialog
    this.Context.Response.Write("<script type='text/javascript'>window.frameElement.commitPopup();</script>");
    this.Context.Response.End();
}

So .. we should be done…

Build / Deploy / Test

So .. a long journey: five different 1-hour sessions and a great day at #SPRetreat .. but definitely worthwhile, and a new “Related Content Web Part” to boot!

A massive thanks to Andrew Woodward (21Apps) and Ben Robb (CScape) for organising the event, the venue and the food! (great food!!!)

Source Code

Sorry it took so long for me to get it all online; I was very busy and then went on holiday. You can find all of the source code downloadable from my SkyDrive here:

https://cid-60f12a60288e5607.office.live.com/self.aspx/SPRetreat/SPR3.zip
