This is more of a story, with some hints to look out for when using the content organizer in an automated way. If you have read the previous three or four blog posts you can stop reading; there is almost nothing new here.

After my last blog post I was under the assumption that it would be smooth sailing, but nothing was further from the truth. Some of the issues that came with the automation process were unforeseen, and they caused a small delay in the project.

I will sketch the question that came from the customer and how it fit into the SharePoint architecture:

The client asked for a migration tool from file shares to SharePoint.
But because it was a POC/pilot, the set-up had to be generic enough that additional file shares could be added easily. So far nothing fancy; everything we make has to be generic, so nothing special here.
Next, some form of routing needed to be implemented; the name of the file contained all the info we needed. Ok, I can hear everyone thinking that’s asking for trouble, parsing the data your system needs for routing out of a filename, but there wasn’t any other option.

So a string handler for retrieving that info was written, check.
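To give an idea, here is a minimal C# sketch of such a string handler. The real naming convention was customer specific; the “department_doctype_title.ext” pattern below is purely hypothetical.

using System.IO;

// Minimal sketch, assuming a hypothetical naming convention of
// "<department>_<doctype>_<title>.<ext>"; the real pattern differed.
public static class FileNameParser
{
    public static bool TryParse(string fileName, out string department, out string docType)
    {
        department = null;
        docType = null;

        string name = Path.GetFileNameWithoutExtension(fileName);
        string[] parts = name.Split('_');

        // Reject names that don't match the convention instead of guessing.
        if (parts.Length < 3)
        {
            return false;
        }

        department = parts[0];
        docType = parts[1];
        return true;
    }
}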

So the “from” side was clear, and we had a way to retrieve the info that tells us where the files must be routed to.
The “to” side was an easy choice as well: SharePoint 2010 has a perfect system for this, content processing and routing (the content organizer). So we immediately decided to use this feature instead of writing our own.
That was the question from the client and the “solution” from us.

Ok, let’s pull the content routing feature apart and look at it step by step.

First, getting the documents from the file shares into SharePoint (into the drop-off library), already knowing something about where each file needs to go, so we had to do a little parsing up front. PowerShell, without a doubt. We activated the content organizer feature on the site and had our “drop off library”, check.
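The upload itself ran as a PowerShell script; expressed as a C# sketch (the library title is the default one the feature creates, the site URL and the “Department” field are made up), the logic looks like this:

using System.Collections;
using Microsoft.SharePoint;

// Sketch: drop one file into the drop-off library with the routing metadata
// already filled in. "Drop Off Library" is the default library title; the
// "Department" field and the URL are hypothetical.
public static class DropOffUploader
{
    public static void Upload(string fileName, byte[] contents, string department)
    {
        using (SPSite site = new SPSite("http://server/sites/poc"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList dropOff = web.Lists["Drop Off Library"];

            Hashtable properties = new Hashtable();
            properties["Department"] = department; // value parsed from the filename

            dropOff.RootFolder.Files.Add(fileName, contents, properties, true);
        }
    }
}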

A content routing rule needed to be added. Ok, first let’s create a content type based on Document and add a field to hold the routing value. After adding the rule, the first step seemed easy.
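For reference, such a rule can also be created in code through the EcmDocumentRouterRule class (Microsoft.Office.RecordsManagement.RecordsRepository namespace, Microsoft.Office.Policy.dll). A minimal sketch; the rule name and target are examples:

using Microsoft.SharePoint;
using Microsoft.Office.RecordsManagement.RecordsRepository;

// Sketch: create one organizer rule on a web for a given content type.
public static void CreateRule(SPWeb web, SPContentType contentType, string targetLibraryUrl)
{
    EcmDocumentRouterRule rule = new EcmDocumentRouterRule(web);
    rule.Name = "Route " + contentType.Name;
    rule.ContentTypeString = contentType.Name;
    rule.ConditionsString = "<Conditions></Conditions>"; // no extra conditions
    rule.TargetPath = targetLibraryUrl;   // server-relative URL of the target library
    rule.RouteToExternalLocation = false; // true + a connection name for cross-site routing
    rule.Priority = "5";
    rule.Enabled = true;
    rule.Update();
}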
The next step was to create the sub site that the content routing needed to send the documents to. So we activated the content organizer feature on that web as well and added the necessary rules to the final document libraries, depending on the type (I almost forgot the additional content types, but they were created as well and used in the rules).
Almost all done; we still needed to create a connection between the root web and the sub site we’d just created. Ok, go to the central administration site, General Application Settings, “Configure send to connections”. There we select the web application and enter a name for the connection and the path to the sub site (keep in mind that the content organizer feature needs to be activated before doing this, or the URL will be rejected).

With the connection created, we went back to the only rule we’d made on the root web, to route the documents to the sub site. First we had to check that the content organizer settings allow routing documents to another site. After enabling that setting we could edit the rule and select our new destination.

Now imagine all of the above in an automated way 🙂. Yes, it’s possible; a lot of work, but possible.

On the first test everything seemed fine. The automated upload of the files to the first drop-off library (on the root web) gave us no trouble, but the documents weren’t routed at all.
After checking all the connections and settings we came across SharePoint versioning (or the “manual” way of adding an incremental numeric value to the title of the file). Hmm, we didn’t actually need the versioning story, so deactivating it was an option. But first a little test: it appears that the files must be checked in before the content processing and routing timer job will route them. If they are checked out, the files are logically in a “locked” state and will not be routed.
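A quick way to unblock documents stuck in a checked-out state is to check them all in; a minimal sketch:

using Microsoft.SharePoint;

// Sketch: check in any checked-out files in the drop-off library so the
// content organizer timer job can route them.
public static void CheckInDropOffFiles(SPWeb web)
{
    SPList dropOff = web.Lists["Drop Off Library"];
    foreach (SPFile file in dropOff.RootFolder.Files)
    {
        if (file.CheckOutType != SPFile.SPCheckOutType.None)
        {
            file.CheckIn("Checked in so the routing timer job can process the file");
        }
    }
}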

So the first workaround was found, but deactivating SharePoint versioning wasn’t an option: because a value gets added to the name of the file, each upload would appear in SharePoint as a new file. That would give us trouble in the long run, and the target libraries had versioning on anyway. So we went to the library settings and turned off the setting “require documents to be checked out before they can be edited”. Doing the manual upload test again, it worked.
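In code, that setting is the SPList.ForceCheckout flag:

using Microsoft.SharePoint;

// Sketch: turn off "require check out before editing" on the drop-off library.
public static void DisableForceCheckout(SPWeb web)
{
    SPList dropOff = web.Lists["Drop Off Library"];
    dropOff.ForceCheckout = false;
    dropOff.Update();
}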

The documents were now placed correctly in the drop-off library without versioning trouble, and the timer job picked them up nicely.

From there they went to the second drop-off library for the final routing to the document libraries matching their content types.
From here on, normal content routing could be used. But I mentioned a couple of lines back that the entire process had to be as generic as possible, so it wasn’t long before another sub site needed to be created, using the first one as a template.

Ok, we created the new sub site and checked all the settings. But wait: the content organizer rules were pointing to the path of the template site. Hmm, checking again... yup, the place where the content organizer rules are stored is just a list, with some metadata in string values, so when a web is created from a template, the URLs in those strings remain unchanged. We had to use a web provisioned event to edit these values (the web provisioned event can be used because at that point all the necessary values are available); a sketch of that part of the receiver follows below. This event also needed to create the connection between the root web and the newly created sub site. And a list of content type rules had to be created as well, because once the connection between root web and sub site exists, the content rules on the root web must be adjusted so that a new value in the file can route documents to the new location.
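A minimal sketch of the URL-rewriting part. It assumes the rules live in a list titled “Content Organizer Rules” with the target stored in a field with internal name “RoutingTargetPath”, which is what my digging suggested; treat both names, and the template URL, as assumptions to verify on your own farm.

using Microsoft.SharePoint;

// Sketch: after a web is provisioned from the template, rewrite rule targets
// that still point at the template web. List title, field name and template
// URL are assumptions.
public class RoutingFixupReceiver : SPWebEventReceiver
{
    public override void WebProvisioned(SPWebEventProperties properties)
    {
        SPWeb web = properties.Web;
        SPList rules = web.Lists.TryGetList("Content Organizer Rules");
        if (rules == null)
        {
            return; // content organizer feature is not active on this web
        }

        const string templateUrl = "/sites/poc/templateweb"; // hypothetical
        foreach (SPListItem rule in rules.Items)
        {
            string target = rule["RoutingTargetPath"] as string;
            if (!string.IsNullOrEmpty(target) && target.StartsWith(templateUrl))
            {
                rule["RoutingTargetPath"] = web.ServerRelativeUrl + target.Substring(templateUrl.Length);
                rule.Update();
            }
        }
    }
}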

When all this is completed you have an almost fully automated content routing process, and administrators will be glad they don’t have to create a new connection every time a sub site is created.
I’ve turned this feature inside out, so if you have any questions, don’t hesitate to send me a mail or leave a comment.

Hey everyone,

Well, my quest concerning the content organizer architecture has come to an end.

I got my solution to work without any problems. For those who don’t know what I’m talking about: please read my previous two blog posts and all will become clear.

The main issue I had was that the content organizer structure does not automatically create a connection, so that the root web could send files from its drop-off library to a drop-off library on a sub site.
Instead, Microsoft expects the user to contact a farm administrator, who then creates the connection (in “General Application Settings” -> “Configure send to connections”).
That is a nightmare when sites can be created on the fly (even with a well-defined governance document in place), because every time the farm admin must add the connection to the web application.

Now, if a site template is used and the content organizer feature is activated in the template, a connection will be made automatically. I’ve put the logic in a web event receiver, on the WebProvisioned event.
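The skeleton is roughly this (a sketch; CreateSendToConnection is just a helper name I picked, and its body is the SPOfficialFileHost snippet further below):

using Microsoft.SharePoint;

// Sketch: register a send-to connection for the freshly provisioned sub site.
public class ConnectionReceiver : SPWebEventReceiver
{
    public override void WebProvisioned(SPWebEventProperties properties)
    {
        SPWeb web = properties.Web;
        CreateSendToConnection(web.Site, web); // web.Site is owned by SharePoint here, don't dispose it
    }

    private static void CreateSendToConnection(SPSite site, SPWeb web)
    {
        // Build an SPOfficialFileHost for "web" and add it to
        // site.WebApplication.OfficialFileHosts; see the snippet below.
    }
}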

I’ll have to rewrite some code and clean up a lot of commented-out lines, and then I’ll post the solution on CodePlex; a link will be provided later.

Basically it comes down to this:

SPOfficialFileHost tempFile = new SPOfficialFileHost(true); // the boolean indicates whether a unique ID must be created; this happens at constructor level
tempFile.Explanation = "string value";
tempFile.OfficialFileName = web.Title; // or any string value; this becomes the name of the connection
tempFile.OfficialFileUrl = new Uri(web.Url + "/_vti_bin/officialfile.asmx"); // I've added the _vti_bin part because central admin asks for it as well, but I don't know if it works without it
tempFile.Action = SPOfficialFileAction.Move; // or .Link / .Copy: choose what happens when the file leaves the drop-off folder
tempFile.ShowOnSendToMenu = false; // boolean: whether the connection shows up in the Send To menu

As you can see in the image below, this looks familiar 🙂

To add this SPOfficialFileHost object to the web application, add it to the OfficialFileHosts collection (site being an SPSite) and call Update():

site.WebApplication.OfficialFileHosts.Add(tempFile);
site.WebApplication.Update();

If you don’t call Update(), all your connections will be gone when an application pool recycle happens. When you do, the changes are serialized and propagated throughout the farm.

Now, if you go to the central admin page, you’ll see the added connection listed in the list box. If you didn’t call WebApplication.Update(), the connection will not be listed.

Now there can be a catch: when you call Update(), an error may be thrown (even when running with elevated privileges) saying that you have been denied access to push the changes.

A small SharePoint PowerShell snippet solves this, though it’s probably not the proper way to bypass the error. I got it from here:

$contentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
# Allow code running in content web applications to update farm-level settings
$contentService.RemoteAdministratorAccessDenied = $false
$contentService.Update()

Hope it helps someone, because I didn’t find a single blog about this subject.

This past week I’ve been deep-diving into the content organizer structure.
How does a document get routed from the drop-off folder to the destination document library?
All the MSDN examples and blog posts cover the normal out-of-the-box functionality, but I literally didn’t find one that explains how to do all this in code. (If you have one, send me the link and I’ll add you to my personal hero list 🙂)

Also, very few of the websites I found handle the content organizer across sites.
After a lot of digging (disassembling, Reflector, analyzing the SharePoint DLLs (again)) I can make a well-founded statement that those sites had good reason to say little or nothing about it.

I’ll sketch the problem:
Imagine you have a root site with content organizer rules based on content types.
Say you have 5 sub sites, and each of these sub sites uses its own content type (for ease of use, say the content types are all defined on the root web, so no changes or additional fields on the content types in the sub sites).
Now I want to do some content routing from the root web to the sub sites. That should be fairly easy, but sadly it is not.
A “connection” must first be made at central admin level in order to push the content from the drop-off folder in the root web to the drop-off folder in the sub site.
So for a “simple” power-user tool you are blocked fairly quickly if you want to push content outside of the web, because you need a farm admin to create those “connections”, and that is sometimes not possible when managing a big farm.

At the place where I’m working now, I have to create a sort of content routing with site creation on the fly, so I investigated this structure and wanted to list its shortcomings.
I don’t know why there isn’t a nicer way to do this, or why Microsoft split it across central admin and power-user sections.

But the main reason for this blog is that there is absolutely no blog, MSDN article or TechNet page, you name it; no information is available on how to create a connection manually from inside the site collection.
I’m investigating this issue with Tom van Gaever (fellow SharePoint junior evangelist 😉, link to his blog is here) and we’ll be trying to find a capable solution and some sort of automation.
I’ll keep you guys posted on the developments on a weekly basis. Hopefully I can post a CodePlex project so that you can use it too.