Monthly Archives: September 2012

New-SPWebApplication : The IIS Web Site you have selected is in use by SharePoint. You must select another port or hostname.

I have to admit .. I am a bit of a PowerShell n00b. I have only really “converted” and seen the light recently, but I am kinda loving the whole “scripting” thing again (I feel like I’ve gone back in time to the 1990s and everything is driven off batch files again .. in fact it feels like SP2003 development using DDF files and MakeCab commands! :D)

Anyway .. I started building my Web Applications using PowerShell, following a very handy TechNet article. This describes “Create a Web application that uses Windows-claims authentication”, which is required for SharePoint 2013 because “classic” (i.e. non-claims) web applications are deprecated and should not be used anymore.

Unfortunately this has a rather glaring bug. You see, the PowerShell command that it tells you to use is:

$ap = New-SPAuthenticationProvider

$wa = New-SPWebApplication -Name <ClaimsWindowsWebApplication> -ApplicationPool <ClaimsApplicationPool> -ApplicationPoolAccount <ClaimsApplicationPoolAccount> -URL <URL> -Port <Port> -AuthenticationProvider $ap

Unfortunately there are two parameters missing … but how did I find this out? (Because the first time you run it .. it works!)

I basically tried to create a SharePoint 2013 Web Application using the command above. The URL was going to be https://test and I was running it on Port 80 (standard for non SSL traffic). The Web Application created fine, but I was getting 404 errors trying to access it. I tried to create another web application (thinking something went wrong) and I got the following error message:

New-SPWebApplication : The IIS Web Site you have selected is in use by SharePoint.  You must select another port or hostname.

PowerShell Error message when creating a second Web Application

Something was clearly wrong, so I went and checked out IIS. My site was there, but for some reason it didn’t have any of the host name binding information that it should have had (for https://test the “Host Name” should be “test”):

IIS Site created without any Host Name bindings

I then also checked the “Virtual Directories” folder to make sure that the folder had been created correctly and found something a little odd. Instead of creating my site using the web application name (which it normally does) it had created a folder using the number 80 (the port number).

IIS Virtual Directory created using Port Number (80)

This seemed more than a little odd to me, but after some digging it turns out that the original TechNet article (mentioned at the beginning) had some information missing. If you check out the documentation for the New-SPWebApplication PowerShell command then there are two other parameters that you need to specify:

HostHeader – Specifies a valid URL assigned to the Web application that must correlate to the alternate access mapping configuration, in the form server_name. (If no value is specified, the value is left blank.)

Path – Specifies the physical directory for the new Web application in the virtual directories folder. (If no value is specified, the value %wwwroot%\wss\VirtualDirectories\<portnumber> is applied.)

So without a HostHeader value the IIS Binding information had been missed out. And without a Path specified it had used the port number (80) for the IIS folder. This technically “worked” as far as the script goes, but when I tried to create my second web application it was trying to use the same (blank) host header and the same folder name (80) which .. of course .. “is in use by SharePoint” already.

I added these two additional parameters to my script .. and voila! Everything started working.  The full command (at a minimum) should therefore be:

$ap = New-SPAuthenticationProvider

$wa = New-SPWebApplication -Name <ClaimsWindowsWebApplication> -ApplicationPool <ClaimsApplicationPool> -ApplicationPoolAccount <ClaimsApplicationPoolAccount> -URL <URL> -Port <Port> -HostHeader <HostHeader> -Path <IISFolderName> -AuthenticationProvider $ap

I deleted the old (80) web app and recreated it with a proper path and now everything appears to be back to normal.
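For anyone who wants a concrete starting point, here is a sketch of the corrected script using made-up values for a https://test web application on Port 80 (the web application name, application pool, account and path are all hypothetical — substitute your own):

```powershell
# Hypothetical example - all names, accounts and paths are placeholders
$ap = New-SPAuthenticationProvider

$wa = New-SPWebApplication -Name "Test Web Application" `
    -ApplicationPool "SP2013_Test_AppPool" `
    -ApplicationPoolAccount (Get-SPManagedAccount "MYDOMAIN\SPAppPool") `
    -URL "https://test" -Port 80 `
    -HostHeader "test" `
    -Path "C:\inetpub\wwwroot\wss\VirtualDirectories\test80" `
    -AuthenticationProvider $ap
```

Note that the HostHeader matches the host part of the URL — which is exactly the IIS binding that was missing before.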

Is it worth upgrading your laptop to USB 3.0? And if you do, which drive should you pick?

I have long been experimenting with different drives aiming for the utopia of a “high speed” external hard drive.

For many years I have been stuck on USB 2.0 (with its quite poor transfer speeds). I have been through a journey of eSATA (the enclosures all seem to require external power supplies) and finally settled on a USB 3.0 ExpressCard (that’s PCMCIA to you old skool people). I have a relatively old laptop (well .. I bought it two years ago) from before laptops started coming with 3+ hours of battery life (I wish!) and USB 3.0 as standard, so I wanted some way of getting decent speeds while keeping plug-and-play convenience, and an ExpressCard seemed to be the answer.

These ExpressCards have been around for a while and you can generally pick them up for around £30; they claim to offer “plug and play, USB 3.0 transfer speeds”. You can couple this with a whole variety of USB 3.0 external drives (or drive caddies, into which you can fit your own 2.5″ hard drive or solid state drive).

Being the curious person that I am I have decided to benchmark the different speeds I get with various drives I have:

  • Samsung 500GB USB 3.0 External Drive (which is a spindle HDD running at 5400rpm)
  • USB 3.0 2.5″ Caddy with a Seagate Momentus XT 500GB HDD (@ 7200rpm)
  • USB 3.0 2.5″ Caddy with a Crucial M4 512GB Solid State Drive

I will test all of these over both USB2.0 and USB3.0 and also compare them to my own internal Solid State Drive (also a Crucial M4 512GB SSD).

In order to run all of these tests I have been using CrystalDiskMark, which runs a series of tests for both sequential and random read/write behaviour. I tested at both 1GB and 100MB sizes but frankly the results were identical on every single drive, so I have published the 1GB results here (most users I know are typically playing with virtual machine images, ISO images or backups, all of which are quite large).

Test 1 – Internal SSD Drive (SATA II)
First off was my internal drive (Crucial M4 512GB) which is an SSD with a maximum advertised transfer speed of 550MB/sec.

Internal SSD – Crucial M4 512GB

At first glance the throughput of 200 – 250 MB/s looks a little low, but the motherboard on my laptop only supports a SATA II interface, so my SSD is capped to a theoretical maximum of 300MB/s (SATA II signals at 3Gbit/s, which is 2.4Gbit/s of actual data after 8b/10b encoding). If I had a newer SATA III motherboard then it should be almost twice as fast!

Test 2 – USB 2.0
I had 3 different tests to perform on USB 2.0. We have our 5400rpm Samsung drive, a 7200rpm Seagate drive and another Crucial M4 SSD. The results were not terribly surprising.

USB 2.0 SSD – Crucial M4 512GB
USB 2.0 HDD – Seagate Momentus XT 500GB @ 7200rpm

USB 2.0 HDD – Samsung 500GB Drive @ 5400rpm

The results, as you can see, are utterly underwhelming. USB 2.0 signals at 480Mbit/s, but after protocol overhead the effective maximum is around 420Mbit/s (52MB/s), and the half-duplex bus is shared between devices, so the 20 – 28MB/s we are seeing here is pretty much flat out.

The only advantage the SSD has is random read/write performance: for large 512K chunks it is just as quick as sequential read/write, and although the 4K chunks are a paltry 4MB/s this is still around 10x faster than the HDDs can manage!

Test 3 – USB 3.0
This was an identical test to the USB 2.0 tests but this time running on USB 3.0. The spec for USB 3.0 claims a maximum throughput of 5Gbit/s, so it is pretty close to the SATA III maximum of 6Gbit/s (and certainly exceeds my own motherboard’s maximum throughput).

USB 3.0 SSD – Crucial M4 512GB
USB 3.0 HDD – Seagate Momentus XT 500GB @ 7200rpm

USB 3.0 HDD – Samsung 500GB Drive @ 5400rpm

This was quite surprising on two notes.

USB 3.0 – Random Read/Write Performance
The random read/write performance of the SSD over USB 3.0 is vastly quicker, with the 512K chunks showing the same performance as a sequential operation (which obliterates the HDD performance) and the 4K chunks showing a 2x – 5x speed improvement.

The HDDs are showing almost the same results as they were getting before, although again the 512K random read/write is a lot faster (but HDDs really can’t achieve the random speeds that SSDs can).

USB 3.0 – Sequential Read/Write Performance
This was the shock .. pretty much all of the drives get the same performance (between 80MB/s and 95MB/s). This is a vast improvement over USB 2.0 (every single drive shows a 400% increase in transfer speeds), but it is nowhere near what USB 3.0 should be capable of.

For HDDs this is pretty close to their maximum speed, as even internal SATA HDDs rarely get above 90MB/s, purely due to the limits of mechanical magnetic drives. For me the big surprise is that the SSD doesn’t get anywhere near either the drive’s own maximum or the USB 3.0 maximum.

I can only assume this is due to the throughput of the actual ExpressCard itself. The specification describes the maximum throughput you can get between an ExpressCard and the PC as up to 1.06 Gbit/s (which is about 135 MB/s). Once you take encoding overhead into account this drops further, which explains why all of our connections are capping out below 100MB/s.

Well .. I can’t deny that the performance is a big advantage even being well below the USB 3.0 maximum spec!

Even using an ExpressCard / PCMCIA adapter you are still likely to get a massive performance boost. To put this in perspective if you were transferring a 40GB Virtual Machine backup to an external drive then:

  • Using a USB 2.0 port (at 21 MB/s write) it would take 32 minutes
  • Using a USB 3.0 ExpressCard (at 85MB/s write) it would take 8 minutes
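Those timings are easy to sanity-check from a PowerShell console, since PowerShell treats GB/MB suffixes as byte multipliers (the 40GB size and the 21/85 MB/s write speeds are the figures from the tests above):

```powershell
# Back-of-the-envelope transfer times for a 40GB file, in minutes
$fileSize = 40GB

($fileSize / 21MB) / 60   # USB 2.0  -> roughly 32 minutes
($fileSize / 85MB) / 60   # USB 3.0 ExpressCard -> roughly 8 minutes
```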

If that is all you are doing then it really doesn’t matter whether you get an SSD or an HDD as your external device. The HDDs are going to be FAR cheaper, and you can pick up 1TB USB 3.0 external drives these days for under £100 depending on which brand you are looking for.

However, if you are planning on going for a “native” USB 3.0 socket on your next machine then going for an SSD could give you a huge advantage. You could be looking at over 400MB/s with a full-speed USB 3.0 SSD, which would reduce that 40GB transfer down to under 2 minutes!!

The alternative is that you want to use your external drive for every-day storage, reading/writing files (perhaps running multiple virtual machines from the drive), in which case you will definitely benefit from the SSD. The random read/write speed, even over USB 2.0, is blazingly fast compared to even the quickest HDD, and on USB 3.0 this gets even better.

The only question I suppose is .. can you afford it?

Memory Leak in SharePoint 2013 (Preview) Search

Any of you who have set up their own SharePoint 2013 box (and cried at the hardware requirements) will be aware of a process which is chewing up your RAM like nothing else .. noderunner.exe


There will be four of these running (you can see them in Task Manager) and they basically represent the four major topology services for the FAST Search engine which now powers SharePoint 2013 search.

The problem is that the current implementation (the Preview aka “Beta” build) has a memory leak! This was confirmed by a TechNet blog post, which also described two potential workarounds to alleviate the stress that noderunner.exe puts on your system:

Jose Vigenor from MS beta support pointed to two options to contain these processes:

  1. Use Set-SPEnterpriseSearchService -PerformanceLevel Reduced to reduce the CPU impact the search service has on your test environment.
  2. Modify the C:\Program Files\Microsoft Office Servers\15.0\Search\Runtime\1.0\noderunner.exe.config so that it can only consume X amount of RAM.
    Change the value in <nodeRunnerSettings memoryLimitMegabytes="0" /> to any amount of RAM you like to contain the memory leak.
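If you want to script that second workaround rather than editing the file by hand, a rough sketch would be the following (this assumes the config path quoted above, that the nodeRunnerSettings element sits directly under the configuration root, and an example cap of 250MB — pick your own value):

```powershell
# Sketch: cap how much RAM each noderunner.exe instance can consume.
# The 250MB cap is only an example value - don't set it too low!
$configPath = "C:\Program Files\Microsoft Office Servers\15.0\Search\Runtime\1.0\noderunner.exe.config"

[xml]$config = Get-Content $configPath
$config.configuration.nodeRunnerSettings.memoryLimitMegabytes = "250"
$config.Save($configPath)

# Workaround 1: reduce the CPU impact of the search service
Set-SPEnterpriseSearchService -PerformanceLevel Reduced

# Restart the search host for the change to take effect (service name assumed)
Restart-Service SPSearchHostController
```

This obviously needs running from an elevated SharePoint 2013 Management Shell.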

Be careful when you implement this though: Paul Hunt (aka @cimares) has his own blog post (which is where I found the link above, by the way!) where he encountered some “Out of Memory” exceptions when this was configured a little too tightly!

SharePoint 2013 (Preview) PowerShell bug .. sort of ..

This threw me when I got my first few SharePoint 2013 farms up and running. It looked like PowerShell had failed but actually everything was working fine.

When you start up the SharePoint 2013 Management Shell (aka PowerShell) then you get an error:

could not create a CmdletConfiguration for CmdletName Start-BulkOperation, CmldetClass, CmdletHelpFile C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\CONFIG\PowerShell\Help\Microsoft.Office.Education.Institution.dll-Help.xml. Cannot process argument because the value of argument “implementingType” is null. Change the value of argument “implementingType” to a non-null value.

This all looks kinda scary, but it’s actually just a bug in the SharePoint 2013 PowerShell scripts (don’t forget .. we are talking about a beta build here!)

In fact .. you should find all your PowerShell stuff works just fine.

My experience of installing SharePoint 2013 Preview

I finally got some downtime and used it to get my new SharePoint play-pen up and running. I’ve been running an Office 365 Preview (aka SharePoint 2013 Online) account for a while to get my SharePoint vNext goodness .. but thought it was time to get myself up and running with a full server build (and flex those good old IT Pro skills while I was at it).

The good news .. it was damned similar to SharePoint 2010!

I was also kind of pleased to hear (from Spencer Harbar) that if you are installing the (sometimes painful) User Profile Service then there is “absolutely no difference whatsoever” between SharePoint 2013 (Preview) and SharePoint 2010.

This was slightly disappointing (because it can be a pain in the backside to get started for first timers) although good because I know most of the ins and outs of that particular FIM-based product and can typically get it running first time every time these days.

Hardware requirements ..

This has been quite a contentious topic, with various furious discussions on Twitter and blogs. It has mostly stemmed from official Microsoft documentation which suggests that you need a minimum of 24 GB of RAM to run a SharePoint 2013 dev box (including all of the pieces like Visual Studio and SQL).

The main reasons for this are two key pieces of the new infrastructure:

  1. App Fabric is now used extensively to boost performance. You will see DistributedCacheService.exe chewing a lot of RAM (typically between 500MB and 1GB on my machine) but this also means you get cross-server caching and blistering performance.
  2. FAST Search is also now baked in. The good part is you get a massive load of new search features out of the box. The bad news is that the FAST “noderunner.exe” processes (4 of them) will be running and probably chew between 1GB and 2GB of RAM on their own (without even doing very much!)

There have been people on the web (mentioning no names) who have bordered on the offensive mocking others for “ignoring Microsoft advice” and insisting that you are being foolish for running with less than “best practice” kit.

Then there are others (who, to be honest, I respect a lot more) who have been running on the TAP program and happily say that you can get it working on 6GB, but ideally 8-12GB is needed for it to run smoothly (reports of FAST crashing if you have under 8GB have been heard!)

My personal rig is a single virtual machine which runs with 12GB RAM and 4 virtual cores. I am running full Search, User Profile Sync, most of the core services and have 3 web apps up and running. I have AD, SQL and SharePoint all on the same box with Visual Studio 2012 and so far it has been fine! I can imagine putting a lot of test data and running some hefty development scenarios might make it creak a bit at the seams, but so far no real problems.

Personally I liken this a lot to SharePoint 2010 scenarios. If you want to run EVERYTHING on a single box (Office Web Apps, FAST Search, BI services) then you are going to need a LOT of hardware. But for the average single user demo / developer rig you can get away with a lot less (I’ve run SharePoint 2010 farms on 4GB RAM before now .. and once they have “warmed up” they are quite happy for single-user demos).

The other bits ..

I also decided to refresh my environment with the other tools that go side by side with a new platform refresh (also because these have all hit final RTM release through MSDN):

  1. Windows Server 2012
  2. SQL Server 2012
  3. Visual Studio 2012

To be honest the installers for these were very straightforward, and there is really nothing special to mention which is any different to a normal dev box install.

Windows Server 2012 obviously was a bit “different” due to the new “not called Metro anymore” interface but I’ve been running Windows 8 on my laptop for about 6 months now so I was quite used to it. All of the old settings and options are still there buried behind other menus so no massive surprises for people used to using Server 2008 R2 or before.

The only real point of any note is with the service accounts when configuring SQL Server 2012. I am used to creating dedicated service accounts to run the SQL service instances but with this new version the default is baked-in service accounts which it creates for you! This is great for dev boxes as it is a few less accounts for me to setup and maintain!

SharePoint 2013 Preview  ..

This was actually surprisingly straightforward. The “splash screen” installer is identical to SharePoint 2010 (not sure if this is a sign of the “preview” build status .. they might get around to updating it for RTM, but in my opinion “if it’s not broken” rules apply, and this works quite well).

The Pre-Requisites took care of everything that was needed (after 2 reboots) and then I was good to go. The actual install of the binaries went smooth as a whistle and then it was on to PSConfig.exe ..

Yep .. that’s right .. PSCONFIG .. any of you using the Configuration Wizard shame on you! (I only ever use it to get the “State Service” running once my farm is fully configured!)

PSConfig.exe allows you to setup your initial farm, create the Config Database and Central Admin database. The main part is it allows you to specify the database names so you don’t get nasty GUIDs appearing in SQL Server.

This isn’t exactly necessary for a single-server dev box, but I like to keep the old muscle memory in practice and it is good to practice a “clean” environment at all times (so you don’t get into bad habits!)

The syntax is pretty straightforward:

PSCONFIG.EXE -create configdb -server localhost -database SP2013_ConfigDb -admincontentdatabase SP2013_Admin_Content -user MyDomain\SPFarm -password [SPFarmPassword]  -passphrase [FarmPassPhrase]

Just make sure that the farm account has DBCreator and SecurityAdmin permissions in SQL Server and it should run fine!
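If you want to grant those SQL permissions from a script too, here is a hedged sketch using the SQLPS module that ships with SQL Server 2012 (it assumes the MyDomain\SPFarm login from the command above already exists in SQL Server):

```powershell
# Sketch: grant the farm account DBCreator and SecurityAdmin
# (ALTER SERVER ROLE ... ADD MEMBER is SQL Server 2012 syntax)
Import-Module SQLPS -DisableNameChecking

Invoke-Sqlcmd -ServerInstance "localhost" -Query @"
ALTER SERVER ROLE [dbcreator] ADD MEMBER [MyDomain\SPFarm];
ALTER SERVER ROLE [securityadmin] ADD MEMBER [MyDomain\SPFarm];
"@
```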

Once that has finished, run the SharePoint Products Configuration Wizard to setup your Central Administration web app (basically just choosing the Port Number .. which I go for https://localhost:2013/ to keep it simple!).

Setting up the core services ..

Once that has been completed you should be into Central Admin and it’s time to set up the core services. It should be smooth sailing from here, but as a general rule I configure 4 Web Applications and get going with the basic services.

Web Apps:

  1. https://sp/ (Using “Team Site” as the site collection)
  2. https://my/ (Using “My Site Host” as the site collection)
  3. https://ctype/ (default Content Type Hub .. “blank site” with all features turned on)
  4. https://apps/ (the new SharePoint 2013 App hosting site! don’t provision anything)
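Having learned my lesson from the New-SPWebApplication post at the top of this page, these can be scripted with explicit HostHeader and Path values. A rough sketch (the application pool names and the managed account are hypothetical — substitute your own):

```powershell
# Hypothetical sketch - one claims-auth web app per host name
$ap = New-SPAuthenticationProvider
$account = Get-SPManagedAccount "MYDOMAIN\SPAppPool"  # assumed managed account

foreach ($hostName in @("sp", "my", "ctype", "apps")) {
    New-SPWebApplication -Name "SP2013 - $hostName" `
        -ApplicationPool "SP2013_$($hostName)_AppPool" `
        -ApplicationPoolAccount $account `
        -URL "https://$hostName" -Port 80 `
        -HostHeader $hostName `
        -Path "C:\inetpub\wwwroot\wss\VirtualDirectories\$hostName" `
        -AuthenticationProvider $ap
}
```

You still need to create the site collections (Team Site, My Site Host, etc.) afterwards.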

Once that is sorted I go for the core services:

  1. Managed Metadata Service (using https://ctype/ as the syndication hub)
  2. User Profile Service (using https://my/ as the my site URL)
  3. Search Services

The setup for these I found identical to SharePoint 2010 so there shouldn’t be any surprises here.

Obviously search was slightly different (due to the FAST pieces) but nothing to write home about for a simple “single server” rig. There is also a simple “import only” option for User Profiles which is new to 2013.

Now .. I don’t know a great deal about “apps” infrastructure yet so this is basic in the extreme! The only thing I know you need to do initially is create a blank web application, then use the “Manage App Catalog” option in Central Admin to provision the default “app catalog” site collection from which you can manage internal apps distribution and publishing.

Hopefully more to come on this (including developing some custom apps) but at the moment we’re all learning! 😉

That was pretty much it .. basic SharePoint 2013 farm up and running and ready to play!

I haven’t really been delving too much into the details yet, and plenty more still to learn and find out so watch this space, and hope you found this useful!

Windows Server 2012, Internet Explorer and missing links in Central Admin

This is one that stumped me for a short while. Anyone who has experienced “missing links in Central Admin” may be aware of this already. You install SharePoint, login to the server and try to navigate to Central Admin from a favourite in Internet Explorer.

Note – I realise you shouldn’t really be using IE locally on production servers. This is typically a “demo/dev box” problem

Everything is fine until you try to configure some services, and you find that a bunch of links have disappeared! In particular I spent a good 15 minutes working out why “Services on Server” had gone walkabout (for about 10 of those minutes I thought I’d gone mad and forgotten where it was).

You will probably have checked your permissions without any success:

  • Are you logged in as an Administrator? (yes)
  • Are you in the Farm Admin group? (yes)
  • Have you tried rebooting? (yes)

The truth is slightly more simple than that. There are two vital settings which are required for this to work on a server:

  1. Turn off “Internet Explorer Enhanced Security Configuration” (IE ESC). This is a pretty standard task for most single server demo / development boxes.
  2. When starting IE “run as administrator”

The second one is the kicker. When you run the “Central Administration” link from the Start Menu / Start Screen it automatically kicks into elevated privileges (and if you have User Account Control turned on then you get the normal prompt to “run as administrator”).

If you are running Server 2008 R2 then it is a pretty simple task to just modify the IE shortcut in your taskbar / desktop to “Run as Administrator” and you are good to go.

If you are running Server 2012 though things are not that simple! Sure, you can set the same option, but it won’t work (at least it didn’t for me).

The only way I could get it to work was to browse to the IE 10 install directory (C:\Program Files\Internet Explorer\) and create my own shortcut to the iexplore.exe application.

I then set that shortcut to “Run as Administrator”, pinned it to the taskbar and voila! success!
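As a footnote — and this is only lightly tested — you should also be able to skip the shortcut entirely and launch an elevated IE straight from a PowerShell prompt (the URL here is the Central Admin address I used earlier on this page; substitute your own):

```powershell
# Launch IE 10 elevated - this triggers the normal UAC prompt
Start-Process "C:\Program Files\Internet Explorer\iexplore.exe" `
    -Verb RunAs -ArgumentList "https://localhost:2013/"
```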

Just another one of those Windows Server 2012 quirks to get used to I guess…

If you have found an easier / quicker way to do this .. then please let me know in the comments!