Well first off, if you only use the content organizer within a web under normal conditions, you
don't need to read any further.

If you want to do a lot with the content organizer and manipulate data from beginning to end,
please continue.

Giving a list of pitfalls would be going too far, but there are a couple of issues that I do want to mention:

1.       Timer job issues:

a.       One of the issues we encountered was that a drop off library event handler wouldn’t fire if the documents were routed via a timer job. Pushing the documents one by one to the sub site drop off folder wasn’t an issue, but the timer job push disabled the event receiver.

b.      Large files: when a push happens, the entire file is copied first (if you go reflecting in the DLLs you’ll see that two streams are set up), and a final check verifies that the entire file was copied correctly. The problem: to counter the first issue with the event handler, we had moved the parsing logic into a workflow. While the copying process was still running, the workflow had already kicked in, which resulted in a “this item has already been changed” error. So the check to see if the file was copied correctly failed, but only with large files. Small files never gave an issue.

As a result a doubling effect occurred: the file was put in the drop off library of the sub site but also stayed in the root web drop off folder (the delete event on the source file never fired). It took us almost an hour to see where the problem was 🙂

2.       Creating content rules automatically

a.       One issue: don’t fill the “Automatic Folder Creation” field via code. It is a read-only field and will be filled in automatically (however, I’m not 100% sure this is the case; if I’m wrong, please leave me a note and I’ll change it).

b.      The “condition” part also proved to be quite challenging.

The field says that it is multiple lines of text, but if you take a good look, it only saves the column name, not the condition operator or the value.

After doing a lot of reflecting I’ve learned a great deal about SharePoint internals and discovered a bunch of objects that I’d never used before. One DLL you should really look at is Microsoft.Office.Policy.dll 🙂.

At the moment I still don’t understand in depth what happens with the condition field. The only thing I know is that SP converts it into an XML structure, but where and exactly what is saved, I’m still not sure.

The rest of the fields are pretty self-explanatory and can easily be filled with data; create one rule via the GUI and see what the structure is.
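If you'd rather inspect that structure from code, a sketch along these lines dumps what the GUI stored. This is a hedged example: the list display name "Content Organizer Rules" and the site URL are assumptions, so adjust them for your farm.

```csharp
using System;
using Microsoft.SharePoint;

// Sketch: dump internal names and values of existing rule items so you can
// see what the GUI actually stored. The list name is an assumption.
class RuleDumper
{
    static void Main(string[] args)
    {
        using (SPSite site = new SPSite("http://server/sites/root"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList rules = web.Lists["Content Organizer Rules"];
            foreach (SPListItem rule in rules.Items)
            {
                Console.WriteLine("Rule: " + rule.Title);
                foreach (SPField field in rule.Fields)
                {
                    if (!field.Hidden)
                    {
                        Console.WriteLine("  {0} = {1}",
                            field.InternalName, rule[field.Id]);
                    }
                }
            }
        }
    }
}
```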

3.       Creating the connection via code (instead of via central admin) -> I dedicated an entire post to this in one of my previous blog posts.

4.       An additional problem that came up at the last minute.

I always checked the drop off library with site collection admin rights. But it seems that if you don’t have those permissions, you don’t get to see the files. This has to do with a bug in SP: it sets the items to unique permissions, so a site owner can’t see all the files in the drop off library. This of course is problematic. A solution for this one? I’m still looking into it and I’ll come back to you on it in a later blog.

That’s about it; these are my lessons learned (and still learning) concerning use of the content organizer feature in a non-standard environment.

If you have a standard environment, please let me know, you would be the first one 🙂

The wonderful world of content routing is really limitless (well almost).

After turning the entire system almost inside out, a new feature was added to the
scope. The content rules are the same in all the sub sites, so every time a new sub
site is created, all the content rules need to be created manually. Needless to
say, this takes quite some time if you have a lot of content types that rules
need to be set up for.

So of course the question (well, it was more a suggestion from us) was: couldn’t
we automate this process?

Let’s analyze it:

We have a list that cannot be found via the “View All Site Content” link. Euhm, ok, but it’s
still a list, so finding it via the SPWeb object shouldn’t be a problem.
Next, adding items is easy, but we still need to map the values to the fields of the list item:


Internal Name | Display Name | Standard Value

(Most of this table did not survive; the surviving cells show a Yes / No field, several “single line of text” fields, a “multiple lines of text” field whose display name ends in “used in Conditions”, a “single line of text” field whose display name ends in “for Automatic Folder Creation”, and a Yes / No field named “Route To External Location”.)
So now you can use this with an SPListItem object to add content routing rules via code. It’s
best to do it on feature activation (web scope) with an activation dependency
on the Content Organizer feature.
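As a minimal sketch of that approach: the list name and the internal field names below are what reflection suggested on our farm, not a documented API, so verify them against a rule created through the GUI before relying on them.

```csharp
using Microsoft.SharePoint;

// Sketch: add a content organizer rule by writing directly to the hidden
// rules list. Field internal names (RoutingEnabled, RoutingPriority,
// RoutingContentType, RoutingTargetPath) are assumptions; check them
// against a GUI-created rule first.
public static class RoutingRuleCreator
{
    public static void AddRoutingRule(SPWeb web)
    {
        SPList rules = web.Lists["Content Organizer Rules"];
        SPListItem rule = rules.Items.Add();
        rule["Title"] = "Route contracts";
        rule["RoutingEnabled"] = true;
        rule["RoutingPriority"] = 5;
        rule["RoutingContentType"] = "Contract";
        rule["RoutingTargetPath"] = web.ServerRelativeUrl + "/Contracts";
        rule.Update();
    }
}
```

Call this from the FeatureActivated override of the web-scoped feature receiver mentioned above.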

The only tricky part is the routingconditionproperties field. It says it’s just multiple
lines of text, so ok, at first sight no problem here.

But try setting a value on this field and then editing the item via the GUI: you’ll see
that your condition isn’t set. The SharePoint guys really did a disappearing
act on this one. If you set the condition and save the item, you’ll see
only the field name of the condition, not the operator (is equal to, contains, greater than, …) or the value that you specified.

After doing a lot of reflecting (if you really want to get to know SharePoint, you must do
some reflecting on Microsoft.SharePoint.dll 🙂), all I found
out is that an XML structure is built here.

It has the following structure:


                               <Condition ColumnName="" Condition="" Value="" />


But what happens after the creation of this XML structure still eludes me.
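If you want to experiment with writing that structure yourself, a hedged sketch looks like this. The attribute names, the operator string, and the RoutingConditions field name are reconstructions from reflection, not documented, so treat all of them as assumptions.

```csharp
using Microsoft.SharePoint;

// Sketch: set a routing condition by writing the reflected XML structure
// straight into the rule item. "RoutingConditions", the attribute names,
// and the "IsEqual" operator string are all assumptions from reflection.
public static class ConditionWriter
{
    public static void SetCondition(SPListItem rule)
    {
        string conditionXml =
            "<Conditions>" +
            "<Condition ColumnName=\"DocumentCategory\" " +
            "Condition=\"IsEqual\" Value=\"Invoice\" />" +
            "</Conditions>";
        rule["RoutingConditions"] = conditionXml;          // reflected XML blob
        rule["RoutingConditionProperties"] = "DocumentCategory"; // column name only
        rule.Update();
    }
}
```

Whether the router actually honours hand-written XML here is exactly the part that still eludes me, so test this on a throwaway site first.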

So yesterday was final testing; everything came together with all the solutions we made these past few weeks.
The manual process was working fine: all the documents from the first drop off library routed well to the second drop off library on the sub site. After running the PowerShell command, all 1000 test documents were pushed into the first drop off library on the root web. Ok, this seemed to work fine; now it was time for SharePoint’s content processing timer job to do its magic. After the timer job ran, everything seemed to be in order, “seemed” being the key word here.
All the documents were indeed routed to the drop off library of the sub site, but my event handler on “ItemAdded” on the drop off library wasn’t triggered. The only logical conclusion was that the content processing timer job disables event firing while it routes the documents and re-enables it afterwards.
The solution: instead of an event handler on “ItemAdded”, create a workflow that simply does the same thing and executes the code. Keep in mind that you get an additional tracking system with the workflow history mechanism (thanks Tom for the help :-)).
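For reference, this is the mechanism that can silently swallow your receivers: anything deriving from SPEventReceiverBase can toggle event firing for its thread, and a timer job doing the same would explain the behaviour above. A sketch:

```csharp
using Microsoft.SharePoint;

// Sketch: while EventFiringEnabled is false, ItemAdded/ItemUpdated receivers
// simply don't fire for changes made on this thread. This is presumably what
// the content processing timer job does while routing.
public class SilentUpdater : SPItemEventReceiver
{
    public void UpdateWithoutEvents(SPListItem item)
    {
        this.EventFiringEnabled = false;    // receivers won't be triggered
        try
        {
            item["Title"] = "updated silently";
            item.Update();
        }
        finally
        {
            this.EventFiringEnabled = true; // always switch it back on
        }
    }
}
```

A workflow association, by contrast, still starts on item creation, which is why moving the logic there worked.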

This is more of a story with some hints to look out for when using the content organizer in an automated way. If you have read the previous 3-4 blogs, you can stop reading: almost nothing new.

After my last blog I was under the assumption that it would be smooth sailing, but nothing was further from the truth. Some of the issues that came with the automation process were unforeseen and brought a small delay to the project.

I’ll sketch the question that came from the customer and how it maps onto the SharePoint architecture:

The client asked for a migration tool from file shares to SharePoint.
But because it was a POC/pilot, the set-up had to be generic enough that it would be easy to add additional file shares. So far nothing fancy; everything we make has to be generic, so nothing special here.
Next, some form of routing needed to be implemented; the name of the file contained all the info that we needed. Ok, I can hear everyone thinking that’s asking for trouble, parsing the data your system needs for routing out of a filename, but there wasn’t any other option.

So a string handler for retrieving the info was requested; check.

So the “from” was clear, and the way to retrieve the info telling us where the files must be routed to was clear as well.
That made this an easy choice: SharePoint 2010 has a perfect system for this, content processing and routing. So we immediately decided to use this feature instead of writing our own.
This was the question from the client and the “solution” from us.

Ok, let’s pull apart the feature of content routing and we’ll look at it step by step.

First, getting the documents from the file shares into SharePoint (into the drop off library), with some knowledge already of where the files need to go; so we had to do a little bit of parsing up front. PowerShell, without a doubt. We activated the content organizer feature in the site and we had our “drop off library”; check.
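We did this part in PowerShell, but the operation boils down to something like the following server object model sketch. The library name is the out-of-the-box default, “RoutingValue” is a hypothetical field of our own, and the filename convention shown is invented for illustration.

```csharp
using System.IO;
using Microsoft.SharePoint;

// Sketch: push a file from a share into the drop off library and stamp the
// routing value parsed from the filename. "RoutingValue" is our own field,
// not an out-of-the-box one, and the split-on-underscore convention is
// just an example.
public static class DropOffUploader
{
    public static void UploadToDropOff(SPWeb web, string filePath)
    {
        SPList dropOff = web.Lists["Drop Off Library"];
        string fileName = Path.GetFileName(filePath);
        // e.g. "INVOICE_2011_0042.pdf" -> first token is the routing value
        string routingValue = fileName.Split('_')[0];

        byte[] content = File.ReadAllBytes(filePath);
        SPFile file = dropOff.RootFolder.Files.Add(fileName, content, true);
        SPListItem item = file.Item;
        item["RoutingValue"] = routingValue;
        item.Update();
    }
}
```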

A content routing rule needed to be added. Ok, first let’s create a content type based on Document and add a field to hold the routing value. After adding the rule, the first step seemed easy.
The next step was to create the sub site that the content routing needed to send the documents to. So we activated the content organizer feature in this web as well and added the necessary rules for the final document libraries, depending on the type (I almost forgot the additional content types, but they were created as well and used in the rules).
Almost all done; we still needed to create a connection between the root web and the sub site we’d just created. Ok: go to the central administration site, General Application Settings, create a send to connection. There we select the web application and enter a name for the connection and the path of the sub site (keep in mind that the content organizer feature needs to be activated before doing this, or the URL will be rejected).

With the connection created, we went back to the only rule we’d created on the root web, to route the documents to the sub site. First we had to check that the content organizer settings allow routing
documents cross-web. After selecting that setting we could edit the rule and select our new destination.

Now imagine all of the above done automatically 🙂. Yes, it’s possible; a lot of work, but possible.

On the first test everything seemed fine. The automatic upload of the files to the first drop off library (root web) gave us no trouble. But the documents weren’t routed at all.
After checking all the connections and settings, we came across SharePoint versioning (or rather the “manual” way of adding an incremental numeric value to the title of the file). Mmmm, we didn’t actually need the versioning story, so deactivating it was an option. But first a little test: it appears that you need to check in the files before the content processing and routing timer job will route them. If they are checked out, the files are logically in a “locked” state and will not be routed.

So the first solution was found, but deactivating SharePoint versioning wasn’t an option: because of the value added to the name of the file, each document would appear in SharePoint as a new file.
That would give us trouble in the long run, and the target libraries had versioning on anyway. So we went to the library settings and deactivated the setting “files need to be checked out before editing”. Running the manual upload test again, it worked now.
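Since this has to happen on every sub site, the same fix can be scripted instead of clicking through the library settings each time. A sketch, assuming the default “Drop Off Library” name:

```csharp
using Microsoft.SharePoint;

// Sketch: turn off "files need to be checked out before editing" on the
// drop off library, and check in anything still locked so that the routing
// timer job can pick the files up.
public static class DropOffPreparer
{
    public static void PrepareDropOffForRouting(SPWeb web)
    {
        SPList dropOff = web.Lists["Drop Off Library"];
        dropOff.ForceCheckout = false;   // the "require check out" setting
        dropOff.Update();

        foreach (SPListItem item in dropOff.Items)
        {
            if (item.File.CheckOutType != SPFile.SPCheckOutType.None)
            {
                item.File.CheckIn("Checked in for content routing");
            }
        }
    }
}
```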

The documents were now placed correctly in the drop off library without versioning getting in the way, and the timer job picked them up nicely.

From there they went to the second drop off library for some final routing to their document libraries according to their content types.
From here on, normal content routing could be used. But I did mention a couple of lines back that the entire process needed to be as generic as possible. So it wasn’t long before another sub site needed to be created, using the first one as a template.

Ok, creating the new sub site, checking all the settings… but wait: the content organizer rules are pointing to the path of the template site. Mmm, checking again: yup, the place where the content organizer rules are stored is just a list, with the metadata held as string values. So when a web is used as a template, the URLs in those strings remain unchanged. We had to use a web provisioned event to edit these values (the web provisioned event can be used because at that point we have all the necessary values). This event also needed to create the connection between the root web and the newly created sub site. And a set of content type rules had to be created as well, because when the connection between root web and sub site is created, the content rules on the root web must be adjusted so that a new value in the file can be used to route documents to the new location.
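In outline, the provisioning receiver looks something like this. It's a sketch: the template path, list name, and field name are assumptions, and the connection/rule-adjustment steps are stubs for the logic described above.

```csharp
using Microsoft.SharePoint;

// Sketch of the web provisioned event: fix up the rule target paths that the
// template copied verbatim, then register the send-to connection for the new
// sub site. "/templatesite/" and "RoutingTargetPath" are assumptions.
public class RoutingProvisioningReceiver : SPWebEventReceiver
{
    public override void WebProvisioned(SPWebEventProperties properties)
    {
        SPWeb web = properties.Web;

        // 1. Rewrite rule target URLs copied from the template site.
        SPList rules = web.Lists["Content Organizer Rules"];
        foreach (SPListItem rule in rules.Items)
        {
            string target = rule["RoutingTargetPath"] as string;
            if (target != null && target.Contains("/templatesite/"))
            {
                rule["RoutingTargetPath"] =
                    target.Replace("/templatesite/", "/" + web.Name + "/");
                rule.Update();
            }
        }

        // 2. Create the send-to connection on the web application
        //    (see the SPOfficialFileHost code further down).
        // 3. Adjust the root web rules so the new value routes here.
    }
}
```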

When all this is completed you have an almost fully automated content routing process, and administrators will be glad not to have to create a new connection every time a sub site is created.
I’ve turned this feature inside out, so if you have any questions don’t hesitate to send me a mail or leave a comment.

Hey everyone,

Well my quest concerning the content organizer architecture has come to an end.

Got my solution to work without any problems. For those who don’t know what I’m talking about, please read my previous 2 blog posts and all will become clear.

The main issue I had was that the content organizer architecture does not automatically create a connection, so that the root web could send files from its drop-off library to a drop-off library on a subsite.
Instead, Microsoft expects the user to contact a farm administrator who can create the connection (in “General application settings” -> “Configure send to connections”).
That is a nightmare when sites can be created on the fly (even with a well-defined governance document in place), because every time the administrator must add the connection to the web application.

Now, if a site template is used and the content organizer feature is activated in the template, a connection will be made automatically. I’ve put the logic in a WebEventReceiver, on the WebProvisioned event.

I’ll have to rewrite some code, clean up a lot of commented code lines and I’ll post the solution on codeplex, link will be provided later.

Basically it comes down to this:

SPOfficialFileHost tempFile = new SPOfficialFileHost(true); // the boolean indicates whether a unique ID must be created; this happens at constructor level
tempFile.Explanation = "string value";
tempFile.OfficialFileName = web.Title; // or any string value
tempFile.OfficialFileUrl = new Uri(siteUrl + "/_vti_bin/officialfile.asmx"); // siteUrl = url of the target site; I've added the _vti_bin part because central admin requests this as well, but I don't know if it works without
tempFile.Action = SPOfficialFileAction.Move; // or .Link / .Copy: choose what happens when the file leaves the drop folder
tempFile.ShowOnSendToMenu = true; // boolean value

If you compare this with the connection settings page in central admin, it will look familiar 🙂

To add this SPOfficialFileHost object to the web application, you’ll need to add it via (spsite)site.WebApplication.OfficialFileHosts.Add(tempFile);
and then call site.WebApplication.Update(). If you don’t do this, all your connections will be gone when an application pool recycle happens.
When you call the update, all the changes are serialized and propagated throughout the farm.

Now if you go to the central admin page, you’ll see the added connection listed in the listbox. If you didn’t call the web application update, the connection will not be listed.

Now there can be a catch: when you call .Update(), an error may be shown (even when running with elevated privileges) saying that you have been denied access to push the changes.

A small SharePoint PowerShell command needs to be run before this error goes away, although it’s probably not the correct way to bypass it. Got it from here

$contentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$contentService.RemoteAdministratorAccessDenied = $false

Hope it helps someone, because I couldn’t find a single blog about this subject.

My quest into the content organizer brought me to a site template I didn’t even know existed: a Tenant Administration Site. A description of what it does can be found here.

Now, I followed the steps but didn’t do the last PowerShell command. The site will work and you can browse to it, but clicking any link gives you the error: This page cannot render outside of a tenant administration site.

Steps to solve this one:
Do the last powershell step 🙂

Assign the Tenant Administration Site

Set-SPSiteAdministration -Identity http://<site url> -AdministrationSiteType TenantAdministration

And all will be good in SharePoint world.

This past week I’ve been deep diving into the content organizer structure.
How does a document get routed from the drop folder to the destination document library?
All the MSDN examples and blog posts cover the normal out of the box functionality, but I literally couldn’t find one explaining how to do all this via code. (If you have one, send me the link and I’ll add you to my personal hero list 🙂)

Also, very few of the websites that I found handle the content organizer across sites.
After a lot of digging (disassembling and analyzing the SharePoint DLLs with Reflector, again), I can make a well-founded statement that those sites were wise not to talk about it, or to barely touch on it.

I’ll sketch the problem:
Imagine you have a root site with content organizer rules based on content types.
Say you have 5 subsites, and all of these subsites use their own content type (for ease of use we say that the content types are all defined on the rootweb, so no change or additional fields on the content type in the subsite).
Now I want to do some content routing from the rootweb to the subsites. Should be fairly easy but sadly it is not.
A “connection” must first be made at central admin level in order to push the content from the drop folder in the root web to the drop folder in the subsite.
So for a “simple” power user tool you are quickly blocked from using the content organizer if you want to push content outside of the web, because you need a farm admin to create those “connections”, and that is sometimes not possible when managing a big farm.

At the place where I’m working now, I have to create a sort of content routing with site creation on the fly. So I investigated this structure and wanted to list the shortcomings.
I don’t know why there isn’t a nicer way of doing this, or why Microsoft split it between central admin and power user sections.

But the main reason for this blog is that there is absolutely no blog, MSDN article, or TechNet page, you name it; no information is available on how to create a connection manually from inside the site collection.
I’m investigating this issue with Tom van Gaever (fellow SharePoint junior evangelist 😉 link to his blog is here) and we’ll be trying to find a workable solution / sort of automation.
I’ll keep you guys posted on the developments on a weekly basis. Hopefully I can post a codeplex project so that you can use it too.