Wednesday, December 15, 2010

PowerShell and ONET.XML gotcha

I have been suspecting this for some time now, but as of today I am sure that there is some funny caching going on in the PowerShell New-SPSite cmdlet.

I was working on a custom site definition for SharePoint 2010; specifically, I was messing around in the ONET.XML with web part placement. I had changed the layout of a page and needed to move my web parts to different zones. So I changed the WebPartZoneID attribute as usual, but I could not get it to work. My web parts kept being inserted into the same old zone as before. I eventually gave up and thought I'd solve it another day.

Today after a fresh boot, I redeployed my site again and all was working as expected. Huh?

The way I was working the other day was that I had a PowerShell command-line session open and I was simply issuing two commands to delete and recreate the site based on my definition. Sometimes I would use the quick deploy method from CKSDEV and sometimes I would do a full deployment between these commands:

Remove-SPSite http://myintranet

New-SPSite http://myintranet -OwnerAlias XCOMPLICA\Joe -Template XCOMPLICA#0 

Regardless of what I did, however, the New-SPSite command never picked up any changes I made to the ONET.XML file. I have now confirmed with a few more tests that I need to start up a new PowerShell console for the changes to be picked up. I even tried iisreset and recycling all the SharePoint app pools - no difference. It's cached somewhere in the cmdlet.

I guess I'll put my commands in a script I can call from a batch file, or maybe I'll try doing something with PowerShell scopes to resolve this.

In any case, beware of this caching; it cost me half a day.

Tuesday, December 07, 2010

Nicer Claims Login Page SharePoint 2010

I have been working on a Forms Based Authentication (FBA) solution for an internet site, and one of the requirements was the creation of a custom login page, one that is somewhat more user-friendly than the drop-down page that SharePoint 2010 uses.

Most internet users will have no clue what to do with the "choose your claims provider" drop-down, so we wanted to show a page with a username and password box where most users will enter their credentials. These will be authenticated against the SQL-database-backed FBA provider, which is exactly what needs to happen for internet users. However, we also want to show a small link that will automatically log in our internal users with Windows authentication.



This is not too difficult, but I can see myself needing this again, hence this blog post.

I started with following this blog post on Creating a Custom Login Page for SharePoint 2010 by Kirk Evans, which worked great. Following his instructions gets you to the point of having a login page for your FBA users. All you need now is that link for your internal users.

First I modified the login control to make sure it authenticates against the correct membership provider, by adding the MembershipProvider attribute like so:

<asp:Login ID="signInControl" FailureText="<%$Resources:wss,login_pageFailureText%>" MembershipProvider="FBAMembershipProvider"
        runat="server" Width="100%" DisplayRememberMe="false" />

To be honest I'm not sure if that is needed, it just seemed logical to me.

Next I added a LinkButton right under the login control so I had markup like:

<asp:Login ID="signInControl" FailureText="<%$Resources:wss,login_pageFailureText%>" MembershipProvider="FBAMembershipProvider"
        runat="server" Width="100%" DisplayRememberMe="false" />
<asp:LinkButton ID="hlInternalUsers" Text="Internal Login" runat="server" />

Lastly I added some simple code to the code-behind file (explanation below):

        protected override void OnInit(EventArgs e)
        {
            base.OnInit(e);
            hlInternalUsers.Click += new EventHandler(hlInternalUsers_Click);
        }

        void hlInternalUsers_Click(object sender, EventArgs e)
        {
            if (null != SPContext.Current && null != SPContext.Current.Site)
            {
                SPIisSettings iisSettings = SPContext.Current.Site.WebApplication.IisSettings[SPUrlZone.Default];
                if (null != iisSettings && iisSettings.UseWindowsClaimsAuthenticationProvider)
                {
                    SPAuthenticationProvider provider = iisSettings.WindowsClaimsAuthenticationProvider;
                    RedirectToLoginPage(provider);
                }
            }
        }

        //borrowed from Microsoft.SharePoint.IdentityModel.LogonSelector
        private void RedirectToLoginPage(SPAuthenticationProvider provider)
        {
            string components = HttpContext.Current.Request.Url.GetComponents(UriComponents.Query, UriFormat.SafeUnescaped);
            string url = provider.AuthenticationRedirectionUrl.ToString();
            if (provider is SPWindowsAuthenticationProvider)
            {
                components = EnsureUrlSkipsFormsAuthModuleRedirection(components, true);
            }
            SPUtility.Redirect(url, SPRedirectFlags.Default, this.Context, components);
        }

        //borrowed from Microsoft.SharePoint.Utilities.SPUtility
        private string EnsureUrlSkipsFormsAuthModuleRedirection(string url, bool urlIsQueryStringOnly)
        {
            if (!url.Contains("ReturnUrl="))
            {
                if (urlIsQueryStringOnly)
                {
                    url = url + (string.IsNullOrEmpty(url) ? "" : "&");
                }
                else
                {
                    url = url + ((url.IndexOf('?') == -1) ? "?" : "&");
                }
                url = url + "ReturnUrl=";
            }
            return url;
        }


The link button has an event handler attached to its Click event. The event handler makes sure there is an SPContext to work with and then gets an SPIisSettings object for the Default zone of the current site. You can change the zone here if needed. If the SPIisSettings object is successfully retrieved, we can check whether Windows authentication is being used by this zone. If not, our link makes no sense! Next we get the Windows authentication provider and call a helper method that will redirect the user to the right place.

The RedirectToLoginPage method is actually borrowed from the Microsoft.SharePoint.IdentityModel.LogonSelector class, the control responsible for the drop-down on the out-of-box claims logon page. As far as I can tell it plays with the URLs so that the user is redirected to the correct login page (depending on the provider). Notice that this method uses yet another helper, the EnsureUrlSkipsFormsAuthModuleRedirection method. This one is borrowed from the SPUtility class; sadly it is internal, so I just copied it out of Reflector. You could use reflection instead, but I didn't want the performance hit. It just helps with some URL magic.

So with a few lines of our own code and a few borrowed thanks to Reflector, we have a login page that is very simple for our internet FBA users but still allows our internal users to log in with Windows authentication by clicking a simple link.

Friday, December 03, 2010

Beware Parallels 6

I have been using Parallels virtualization for almost two years now, and I was completely happy with versions 4 and 5. VMs were stable and fast, no issues. Recently version 6 came out, and it has been a total piece of sh*t. It has managed to destroy a number of VMs at this point, from corrupting SharePoint DLLs to making Windows think it is no longer a genuine copy. I thought maybe I'd just create a clean W2K8 install under version 6, since upgraded VMs can sometimes have issues, right? Well, I can't even install a fresh copy of W2K8 x64 with this version of Parallels! I get this lovely screen:


Completely useless to me now. I'm soooo glad I spent the €49.95 on it. If you have Parallels 5, DO NOT UPGRADE!

Friday, November 19, 2010

XComplica is hiring

I have had plans for a long time to grow my business, and the time to start hiring some help has come. See www.xcomplica.com/careers for more details. Spreading the word is appreciated!

DIWUG Slides etc.

Yesterday I did a talk at the Dutch Information Worker User Group (DIWUG) about practical PowerShell for the SharePoint professional. As promised, I am posting my slide deck as well as the scripts used.

I've put all the material in a zip file that you can download. I've added all the files that I used for the demo yesterday plus a few scripts that I didn't get a chance to run but were mentioned in the slides. Note that these are very 'demo' scripts and have lots of stuff hardcoded, no error handling, etc.  The slide deck includes notes that should give you an idea of which commands I was running.

Hope this helps someone out!

Friday, October 29, 2010

Handy PowerShell script for "Data is Null" error

Earlier I blogged about a "Data is Null" error breaking the search in SharePoint 2010. http://jcapka.blogspot.com/2010/08/nasty-data-is-null-error-in-sharepoint.html

It seems that while we were on the right path, the cause of the problem was just a little off. When creating a new site group through code, the parameter that was causing the problem is not the "default user" parameter but rather the "description" parameter. When the description is null, the search breaks.

Microsoft has a fix (http://support.microsoft.com/kb/2323206) on their website for updating the null description values, and while the fix is perfectly valid, it is code and you may not be allowed to run code like that in all cases. (Try getting approval to run a console app on a production machine.) So here is a small PowerShell script to help you fix the NULL descriptions.

Function UpdateEmptyDescriptions($siteUrl)
{
        $site =  Get-SPSite $siteUrl
        
        $needupdate = $site.RootWeb.SiteGroups  | Where-Object {$_.Description -eq $NULL} 
        
        If($needupdate)
        {
            Write-Host "Site Groups that are being updated"
            Write-Output $needupdate | Format-Table Name
            $needupdate | ForEach-Object{$_.Description = $_.Name}
            $site.RootWeb.SiteGroups | % {$_.Update()}
        }
        Else
        {
            Write-Host "No Update Needed"
        }
}

UpdateEmptyDescriptions("http://mysite.local")

A moment of clarity

Today I had a moment of clarity when typing the name of the SharePoint PowerShell snap-in. I was getting PowerShell ready to use with SharePoint and typed:

Add-PSSnapin Microsoft.SharePoint.PowersHell

Something caused me to hit the shift key at just the right moment, and I realized how wide SharePoint adoption is.   :)

Friday, October 15, 2010

Article on LINQ to SharePoint

I recently had an article published in the DIWUG SharePoint eMagazine, introducing LINQ to SharePoint. The magazine is a free download and has some great content so get yourself a copy at http://www.diwug.nl/Pages/downloads.aspx

Wednesday, September 22, 2010

Security Trap

I was working on a staging server today and wanted to create a new User Profile Service Application when I got stumped by a silly security issue.  I didn't have permissions to create any new Service Applications. Strangely though, I am a SharePoint farm administrator as well as a local admin on the web server I was using. So why did I see this?


It took me a few minutes to realize that I had opened the browser and entered the URL of Central Administration by hand, instead of using the Central Admin shortcut in the start menu. This is not a problem, except that I had not chosen to run IE as an administrator, and so I didn't have the necessary rights to create the Service Application. Once I opened another instance of IE using the 'Run as administrator' option, all was well again. So if you are in the habit of not using the start menu shortcut to Central Admin, remember that running IE as a regular user will not work!

Tuesday, September 21, 2010

Metadata Filtered Lookup Field

SharePoint lookup fields are not the most friendly when working with a large list of data. What I mean is, if you have a list of 1000 items that you need to choose from using a lookup, you get a not so fun experience as seen below.

Trying to find the right item here can be a real pain. This is not a new pain however, and people have worked around this. A great example can be found here: http://ilovesharepoint.codeplex.com/ These folks have created a lookup field that allows searching and filtering of the lookup data.

My needs were somewhat different, and so I set out to create a custom field. Specifically, I needed a field that would let the end user filter the lookup items based on metadata tags applied to the lookup items.

Let's use an imaginary scenario: we are a special breed of tourist; we travel the world looking for ice cream. We document our travels using SharePoint. Every trip we take, we document by adding an item to a list of trips. One of the fields in this list is a lookup for an ice cream vendor. This lookup points to a list of ice cream vendors from around the world, and that list has grown to be quite large. We do, however, have metadata on those ice cream vendors, and we would like to see only the relevant ice cream vendors based on the country they are located in and the flavor of ice cream they sell.

Introducing the Metadata Filtered Lookup Field. This field inherits from a regular lookup field, but presents a different user interface. It lets the user enter a few tags which then filter the list of options to choose from. The field looks at the target list, figures out what metadata is being used to tag the items and presents the user with a tag select box for each term set that is in use. In our example, this means that choosing an ice cream vendor will let us enter countries and flavors as filter criteria. See below.



We now enter a term, click refresh, and we see the options that have been tagged with this term. So we see three ice cream vendors that sell banana ice cream.


We can add some more terms, click refresh and see that more options show up. So we see that there are six vendors that sell banana or chocolate ice cream or that are located in Germany. This is important: the tags are ORed together, so the result set grows as more tags are added.


Next we select a few items (in a single-select lookup, only one is selectable at a time) and click save.


Looking at the list item, we can now see that we had selected "Ice Cream Heaven", "The Parlour" and "Das Cream".

Cool no?  Let's look at one more thing: What happens when we edit this item?


If we open up the same item, we will notice that a bunch of tags have been pre-populated, along with some checked and unchecked items. What is going on here? The design decision I made was that an editor should always be able to see the already selected items; otherwise it would be very difficult to un-select them. So what happens is that all the tags of the selected items are used to pre-populate the metadata selection boxes. Since these tags can also be used with other items, the result set that we see also includes some unselected items. This may seem strange at first, but it is actually a useful feature, since the unselected items are likely to be related to the selected items, and thus likely items the user is looking for!

I have packaged this field up as a wsp solution that is available at Codeplex: http://metafilteredfield.codeplex.com/ The source is also available and if someone wants to work on this I will be happy to give you access to the repository.

There is one limitation to the field that I have noticed. The out of box lookups allow you to specify 'additional fields' that are created in the list. This field also has that option, but it doesn't work at the moment.

Lastly, I have tested this to a certain degree, but make sure to do your own tests, especially performance tests before you deploy this into production.

Monday, August 23, 2010

Nasty "Data is Null" error in SharePoint 2010 Search

For the last few weeks my team has been struggling with a nasty error that showed up after we performed a search crawl on our SharePoint site. The site has a number of customizations and along the way we managed to do something that resulted in the following error showing up in the crawl log:

The SharePoint item being crawled returned an error when requesting data from the web service. ( Error from SharePoint site: Data is Null. This method or property cannot be called on Null values. )

Almost as useless as the "Unknown error" message. We got talking to Microsoft PSS; they were stumped too, and are currently poring over our log files and trace dumps. In the meantime I think I have found the code that was causing this error.

In one of our features, we use an event receiver to create a number of security groups. The following line was in there:

Web.SiteGroups.Add(groupName, owner, null, null);

Web is an SPWeb, groupName is a string and owner is an SPPrincipal. These are not relevant here. The third parameter in the Add method is the "default user". It was not very clear to me what this was, and since we just wanted empty security groups that the client can fill when we deliver the product, I thought it would be fine to leave that parameter as null. While the code compiles, runs and as far as I can tell creates the security groups just fine, it is what causes that nasty error in the crawl log. I should mention that the error actually causes the crawl to fail and thus breaks search altogether, so it's a big one.

Once I changed the third parameter to a valid SPUser, all was well again.

If you get this error message, I strongly advise you to check what SharePoint API calls you are making that are passing null as a parameter, and then try to eliminate them one by one.

UPDATE:


It seems that the "default user" parameter is not the culprit here after all, but the fourth parameter is the one that actually causes this error. This is a description parameter, and even an empty string is OK. See this Microsoft Support article http://support.microsoft.com/kb/2323206
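
For reference, here is roughly what the corrected call from the snippet above looks like - just a sketch based on the KB article, keeping the null default user that, per the update, is not the problem:

// The fourth (description) argument must not be null; even an empty string is fine.
Web.SiteGroups.Add(groupName, owner, null, string.Empty);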

Thursday, August 19, 2010

Managed Metadata Countries

I spent some time on this the other day and thought others might find this useful. I needed a list of all the countries in the world as SharePoint Managed Metadata. So I grabbed the list of countries from Wikipedia and cleaned it up in Excel so that it can be imported into the Managed Metadata Service Application in SharePoint 2010. You can download it here. Enjoy!

Wednesday, June 30, 2010

Strange SPANs in SharePoint 2010 Team Site

My colleague and I came across something quite strange yesterday and I think it's worth sharing. We were inspecting the HTML source of a SharePoint 2010 Out-Of-Box Team Site and found this at the end of the file:
</body>
</html>
<span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span><span></span>
For those who are not HTML savvy, this is not valid HTML.

We are trying to look into what the cause of this could be, but so far no luck. If anyone has any idea as to what these spans are for or where they come from, I am very interested!

Update: There is now a fix for this from Wictor Wilén, see his blog

Sunday, June 06, 2010

Connecting XsltListViewWebParts together in code

Last week a client ran into some trouble trying to connect two XsltListViewWebParts (XLV) together using a feature receiver. Google yielded little so I thought I'd blog about it.

Assume that we have two lists. One list is called Categories and the other list is called Items. The Items list has a lookup field to the Categories list Title field, and the name of the lookup field is Category.

First, the two web parts need to be added to a web part page, in our case inside a module.

<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="WebPartModule">
    <File Path="WebPartModule\default.aspx" Url="default.aspx" >
      <View List="$Resources:core,lists_Folder;/Categories" Url="" BaseViewID="0" WebPartZoneID="_RightColumn" WebPartOrder="3" ID="wpCategories" >
        <![CDATA[
              <webParts>
                  <webPart xmlns="http://schemas.microsoft.com/WebPart/v3">
                      <metaData>
                          <type name="Microsoft.SharePoint.WebPartPages.XsltListViewWebPart,Microsoft.SharePoint,Version=14.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c" />
                          <importErrorMessage>Cannot import this Web Part.</importErrorMessage>
                      </metaData>
                      <data>
                          <properties>
                              <property name="AllowConnect" type="bool">True</property>
                              <property name="ChromeType" type="chrometype">Default</property>
                              <property name="AllowClose" type="bool">False</property>
                          </properties>
                      </data>
                  </webPart>
              </webParts>
          ]]>
      </View>
      <View List="$Resources:core,lists_Folder;/Items" Url="" BaseViewID="0" WebPartZoneID="_LeftColumn" WebPartOrder="1" ID="wpItems">
        <![CDATA[
              <webParts>
                  <webPart xmlns="http://schemas.microsoft.com/WebPart/v3">
                      <metaData>
                          <type name="Microsoft.SharePoint.WebPartPages.XsltListViewWebPart,Microsoft.SharePoint,Version=14.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c" />
                          <importErrorMessage>Cannot import this Web Part.</importErrorMessage>
                      </metaData>
                      <data>
                          <properties>
                              <property name="AllowConnect" type="bool">True</property>
                              <property name="ChromeType" type="chrometype">Default</property>
                              <property name="AllowClose" type="bool">False</property>
                          </properties>
                      </data>
                  </webPart>
              </webParts>
          ]]>
      </View>
    </File>
  </Module>
</Elements>

As configured here, there would be two XLV web parts placed on the default.aspx page. What is important to notice here is that the View elements have an ID attribute which we are setting to something recognizable. This will be important later.

The next step in the puzzle is to create the connection between the web parts. I have seen samples that do this in XML, but I am not sure how that can be done for the XLV, so we chose to write code in the feature receiver to accomplish this.

First we get an instance of the limited web part manager class for the page that we added these web parts to. In our case it is the default.aspx page. We then use the IDs from the XML as noted above to retrieve the web parts themselves. Without this ID, we don't have a robust way of fetching the web parts out of the collection.

// Get the web object from the feature properties
SPWeb web = (SPWeb)properties.Feature.Parent;

// Get the webpart manager
SPLimitedWebPartManager webPartManager = web.GetLimitedWebPartManager("default.aspx", PersonalizationScope.Shared);

// Get the webpart by id
System.Web.UI.WebControls.WebParts.WebPart providerWebPart = webPartManager.WebParts["wpCategories"];
System.Web.UI.WebControls.WebParts.WebPart consumerWebPart = webPartManager.WebParts["wpItems"];

Next we need to get the connection points of the web parts. This is where things start to get interesting. The XLV exposes two consumer connection points and two provider connection points. I used PowerShell to have a look at the connection points, and the result is the following:

Provider:

AllowsMultipleConnections : True
ControlType : Microsoft.SharePoint.WebPartPages.XsltListViewWebPart
InterfaceType : System.Web.UI.WebControls.WebParts.IWebPartTable
ID : TableProvider
DisplayName : Table

AllowsMultipleConnections : True
ControlType : Microsoft.SharePoint.WebPartPages.XsltListViewWebPart
InterfaceType : System.Web.UI.WebControls.WebParts.IWebPartRow
ID : DFWP Row Provider ID
DisplayName : Row of Data

Consumer:

AllowsMultipleConnections : True
ControlType : Microsoft.SharePoint.WebPartPages.XsltListViewWebPart
InterfaceType : System.Web.UI.WebControls.WebParts.IWebPartParameters
ID : DFWP Parameter Consumer ID
DisplayName : Parameters

AllowsMultipleConnections : True
ControlType : Microsoft.SharePoint.WebPartPages.XsltListViewWebPart
InterfaceType : System.Web.UI.WebControls.WebParts.IWebPartParameters
ID : DFWP Filter Consumer ID
DisplayName : Filter Values
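
For reference, roughly the same listing can be produced in code from the objects we already retrieved - a quick diagnostic sketch, not something the feature receiver itself needs:

// Dump the provider and consumer connection points of the two XLV web parts.
foreach (ProviderConnectionPoint p in webPartManager.GetProviderConnectionPoints(providerWebPart))
    Console.WriteLine("{0} | {1} | {2}", p.ID, p.DisplayName, p.InterfaceType);

foreach (ConsumerConnectionPoint c in webPartManager.GetConsumerConnectionPoints(consumerWebPart))
    Console.WriteLine("{0} | {1} | {2}", c.ID, c.DisplayName, c.InterfaceType);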

I will be honest and admit that I don't know what all of them are intended to do. However when we connected the web parts by hand in the UI, I noticed that we were using the 'Filter Values' connection point and the 'Row of Data' connection point. Thus I know we need to use these in the code.

// Get the connection point for the consumer.
System.Web.UI.WebControls.WebParts.ConsumerConnectionPointCollection consumerConnections =
 webPartManager.GetConsumerConnectionPoints(consumerWebPart);
ConsumerConnectionPoint consumerConnection = consumerConnections["DFWP Filter Consumer ID"];

// Get the connection point for the provider.
System.Web.UI.WebControls.WebParts.ProviderConnectionPointCollection providerConnections =
 webPartManager.GetProviderConnectionPoints(providerWebPart);
ProviderConnectionPoint providerConnection = providerConnections["DFWP Row Provider ID"];

Now that we have the connection points, we want to use the LimitedWebPartManager to create a connection and save it to the collection of connections. This will not work without one extra step however, since the provider and consumer interfaces are not compatible. The exact error that you will see is:

The provider connection point "Row of Data" on "wpCategories" and the consumer connection point "Filter Values" on "wpItems" do not use the same connection interface.

In order to make this work, we need to create an instance of a RowToParametersTransformer and pass it the field names that we want to map. If you remember, we had a lookup field named Category in the Items list, and it pointed to the Title field in the Categories list. So we want to map these fields in the connection.

// Get a new  RowToParametersTransformer object.
RowToParametersTransformer trans = new RowToParametersTransformer();
trans.ConsumerFieldNames = new string[] {"Category"};
trans.ProviderFieldNames = new string[] {"Title"};

// Get a new connection using the 2 connection points.
Microsoft.SharePoint.WebPartPages.SPWebPartConnection newConnection =
 webPartManager.SPConnectWebParts(
  providerWebPart, providerConnection,
  consumerWebPart, consumerConnection, trans);

// Add the new connection
webPartManager.SPWebPartConnections.Add(newConnection);

It took us a while to figure this one out; the RowToParametersTransformer threw us for a loop. Hope it helps someone out there.

The complete code for easier reading:

// Get the web object from the feature properties
SPWeb web = (SPWeb)properties.Feature.Parent;

// Get the webpart manager
SPLimitedWebPartManager webPartManager = web.GetLimitedWebPartManager("default.aspx", PersonalizationScope.Shared);

// Get the webpart by id
System.Web.UI.WebControls.WebParts.WebPart providerWebPart = webPartManager.WebParts["wpCategories"];
System.Web.UI.WebControls.WebParts.WebPart consumerWebPart = webPartManager.WebParts["wpItems"];

// Get the connection point for the consumer.
System.Web.UI.WebControls.WebParts.ConsumerConnectionPointCollection consumerConnections =
 webPartManager.GetConsumerConnectionPoints(consumerWebPart);
ConsumerConnectionPoint consumerConnection = consumerConnections["DFWP Filter Consumer ID"];

// Get the connection point for the provider.
System.Web.UI.WebControls.WebParts.ProviderConnectionPointCollection providerConnections =
 webPartManager.GetProviderConnectionPoints(providerWebPart);
ProviderConnectionPoint providerConnection = providerConnections["DFWP Row Provider ID"];

// Get a new  RowToParametersTransformer object.
RowToParametersTransformer trans = new RowToParametersTransformer();
trans.ConsumerFieldNames = new string[] {"Category"};
trans.ProviderFieldNames = new string[] {"Title"};

// Get a new connection using the 2 connection points.
Microsoft.SharePoint.WebPartPages.SPWebPartConnection newConnection =
 webPartManager.SPConnectWebParts(
  providerWebPart, providerConnection,
  consumerWebPart, consumerConnection, trans);

// Add the new connection
webPartManager.SPWebPartConnections.Add(newConnection);

Sunday, May 30, 2010

Linq to SharePoint for Anonymous users performance Part 2

Recently I posted about the bad performance of LINQ to SharePoint when using my anonymous users hack. Well, I did some more testing and it turns out that my first test was much too premature. It seems that the statistics I was seeing were due to something entirely different; I am guessing it is the internal implementation of Linq to SharePoint vs. CAML queries, but that is pure speculation.

First I will introduce 2 acronyms because I am a lazy typist:

L2SP - Linq to SharePoint
AL2SP - Anonymous Linq to SharePoint (Uses the workaround I suggested previously)

My recent testing was a bit more structured and thought out, and it showed that there is actually virtually no difference between using L2SP and my AL2SP workaround.

I set up two tests. The first read all the items from a small list of 20 items 100 times consecutively, an attempt at simulating many web requests against a small list. The second test involved a large list of 5000 items, where each item was named with a random string. The query here pulled out the top 100 items when ordered by name, thus simulating a random read of items from a large list.

In each test, the Title property of the first list item is accessed in order to ensure that the retrieved items are actually used by the test code and the query has to run. I noticed that running a CAML query was much too fast before I added this; I suspect there is some clever optimization that skips the query if the results don't get used.
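
To give an idea of the pattern (this is a sketch, not the exact test code; TestDataContext, TestItems and TestItem stand in for the SPMetal-generated context, list and entity class), the plain L2SP version of the large-list measurement looked roughly like this:

// Time the Linq query: top 100 items ordered by Title, then touch a property
// so the query is actually executed.
Stopwatch sw = Stopwatch.StartNew();
using (TestDataContext dctx = new TestDataContext(SPContext.Current.Web.Url))
{
    List<TestItem> top100 = dctx.TestItems.OrderBy(i => i.Title).Take(100).ToList();
    string first = top100[0].Title;
}
sw.Stop();
// Log sw.ElapsedMilliseconds with your favourite logging mechanism.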

I ran these tests for a signed-in user, in which case the code uses L2SP and an old-school CAML query. I then repeated the tests for an anonymous user, which uses AL2SP and the same CAML query.

Here are the results: The user using IE is logged in and the Chrome user is anonymous.

It is interesting to note that the CAML query does a better job in the 100-iteration small-list test; I suspect there is some cost to setting up the Linq data context and translating the Linq query. However, Linq does a much better job in the large-list test.

Most important however is that the test results are pretty much the same for AL2SP and L2SP. This means that performance is NOT a problem when using AL2SP. I am curious if any other issues come up with this technique but it looks like it is back in my bag of tricks!

If you want to run the tests I created yourself, I have put the source code here. If you deploy the solution, you get the two lists created and you just need to place the test web part somewhere in the same site. Note that the code was meant just for this test, so it can easily fail if used otherwise. Also note that the feature activation takes a while since it creates a list with 5000 items.


Wednesday, May 26, 2010

Linq to SharePoint for Anonymous users performance

UPDATE: I have been doing some more testing and I am coming to some results that are very different from my initial quick test here. I will have a post about it soon, but for now I'll say that the performance seems to be more than acceptable.

Not that long ago, I was happy to post that I figured out a way to run Linq to SharePoint for an anonymous user. I have some bad news for those who want to use it. I had a quick chat with Waldek Mastykarz the other day at a DIWUG event and I realized why he wasn't quite as excited about my solution as I was. There is quite a performance hit incurred when switching to a secure context, and my solution requires that this is done for all operations, including read operations. With the old CAML query approach, reads can be done without the context switch.

Today I wanted to get an idea of how big this performance hit really was. So I put together a very quick (and probably not perfect) test. I won't post the code here just yet, but suffice it to say that one section of the code ran the Linq query with my "solution" and the other just ran a CAML query the old-fashioned way. Each section ran 100 times and used a stopwatch to measure execution time.

The results:

First page load - so JIT stuff has to happen here - so perhaps not fair

[4600] 0009: 2010-05-26 18:49:27.466 [CBI] 100 Linq Queries took 6000 milliseconds
[4600] 0009: 2010-05-26 18:49:27.472 [CBI] 100 CAML Queries took 2 milliseconds


Subsequent page refreshes

[4600] 0009: 2010-05-26 18:49:49.303 [CBI] 100 Linq Queries took 4271 milliseconds
[4600] 0009: 2010-05-26 18:49:49.307 [CBI] 100 CAML Queries took 3 milliseconds

[4600] 0009: 2010-05-26 18:50:03.791 [CBI] 100 Linq Queries took 4973 milliseconds
[4600] 0009: 2010-05-26 18:50:03.796 [CBI] 100 CAML Queries took 3 milliseconds

[4600] 0013: 2010-05-26 18:51:03.388 [CBI] 100 Linq Queries took 4445 milliseconds
[4600] 0013: 2010-05-26 18:51:03.392 [CBI] 100 CAML Queries took 3 milliseconds


From this I would say that there is a <cough>significant</cough> performance issue with the approach I hacked up. I would say you should forget about using it unless someone can come up with a way to cache the data context (I have tried this and failed so far) or some other clever solution.

The reason I did not post the code here is that it's a quick hack of code that is part of a client web site that I am working on, and I don't want to breach any contract, etc. I also think that a blog post is coming soon about Linq performance in SharePoint in general, where I will do some more rigorous testing.

Sorry for the bad news.

Tuesday, May 11, 2010

Insert item into a (sub)folder using Linq to SharePoint

Someone had a question on how to insert items into a subfolder of a list in SharePoint 2010 when using Linq. It's not obvious at first, so I thought I'd blog the answer:

Let's say that you create someObject based on a class generated by SPMetal. Let's also say that this object is suitable for passing to the InsertOnSubmit method of the data context. Then you call the following to insert the item into the root of the list:


dataContext.SomeList.InsertOnSubmit(someObject);


Now you want to insert that item into a folder in the list, called Folder1. What you need to do is use the Path property on someObject. This property is present if you use SPMetal to generate the Linq classes.

I created a folder in my SomeList called Folder1, and then I set the Path property like so:


someObject.Path = "/Lists/MyList/Folder1";


Then call the InsertOnSubmit method as usual and your item will end up in the right folder! I am not yet sure how to create the folder through Linq, and do note that exceptions are thrown if the folder is not there. I'll try to solve that one in a later post.
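
Putting it all together, a minimal sketch of the whole flow (MyDataContext and SomeItem are placeholder names for the SPMetal-generated context and entity class; the list and folder are the ones used above):

using (MyDataContext dataContext = new MyDataContext(SPContext.Current.Web.Url))
{
    SomeItem someObject = new SomeItem { Title = "Stored in a folder" };
    someObject.Path = "/Lists/MyList/Folder1";  // the folder must already exist
    dataContext.SomeList.InsertOnSubmit(someObject);
    dataContext.SubmitChanges();                // nothing is written until this call
}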

Making Linq to SharePoint work for Anonymous users

Earlier today I ran into the issue of Linq to SharePoint not working for anonymous users. The issue is discussed at a number of places:


http://blog.mastykarz.nl/sharepoint-2010-linq-doesnt-support-anonymous-users/


http://social.technet.microsoft.com/Forums/en-US/sharepoint2010programming/thread/9b59abcb-6bce-42f1-9eae-ad9561753044


Seeing as 95% of the SharePoint work I do is on public-facing web sites, this was a real disappointment for me. SharePoint 2010 is supposed to be much more usable for internet sites, so this was a bit of a shock. Linq to SharePoint was one of those features we all looked forward to!


Before going back to CAML queries, I thought I would have a look at what potential workarounds or even hacks I could come up with to get this working.


My first try was to just wrap my code in the usual SPSecurity.RunWithElevatedPrivileges method. That didn't work. A number of experiments later (and then finding this post), I arrived at this picture from Reflector:



The class we are looking at here is responsible for creating the data connection for the Microsoft.SharePoint.Linq.DataContext class. The important thing to notice here is the highlighted line. If the SPContext.Current is not null, the code uses the SPSite object from the current SPContext.


As documented all over the web, the RunWithElevatedPrivileges method will be of no help if you do not create new SP* objects. See http://msdn.microsoft.com/en-us/library/bb466220.aspx "...You cannot use the objects available through the Microsoft.SharePoint.SPContext.Current property. That is because those objects were created in the security context of the current user..."


So since the code shown in Reflector does exactly what the above-mentioned article warns against, browsing your site as an anonymous user eventually causes a login prompt, since the current SPContext represents that anonymous user, who rightly should NOT be able to run Linq queries against your database.


Next I was inspired by this post and thought: What if I can somehow mess with the SPContext.Current object? If I could get it to be null, I could force that code in the SPServerDataConnection class to create a new SPSite object!


The SPContext.Current object is read-only. However, it derives in one way or another from HttpContext.Current, and that is writable. So my next attempt was to check if my user was an anonymous user, and if so, set the HttpContext to null. Short story: it worked!


After some cleanup, I created the following helper method:

public static class AnonymousContextSwitch
{
    public static void RunWithElevatedPrivelligesAndContextSwitch(SPSecurity.CodeToRunElevated secureCode)
    {
        try
        {
            // If there is a SPContext.Current object and there is no known user, we need to take action
            bool nullUser = (SPContext.Current != null && SPContext.Current.Web.CurrentUser == null);

            HttpContext backupCtx = HttpContext.Current;
            if (nullUser)
                HttpContext.Current = null;

            SPSecurity.RunWithElevatedPrivileges(secureCode);

            if (nullUser)
                HttpContext.Current = backupCtx;
        }
        catch (Exception ex)
        {
            string errorMessage = "Error running code in null http context";
            // Use your favourite form of logging to log the error message and exception ....
        }
    }
}


The logic is as follows:

  1. Check if the situation requires action: is the user anonymous?

  2. Back up the current HttpContext

  3. Set the current HttpContext to null - thus forcing the creation of new SP* objects

  4. Use the RunWithElevatedPrivileges method to execute the code specified by the caller. Note that I reuse the SPSecurity.CodeToRunElevated delegate to mimic the RunWithElevatedPrivileges method.

  5. Set the current HttpContext back to the backed-up object


Calling this function is identical to the way RunWithElevatedPrivileges is called:

string currentWebUrl = SPContext.Current.Web.Url;
AnonymousContextSwitch.RunWithElevatedPrivelligesAndContextSwitch(
    delegate
    {
        MyDataContext dctx = new MyDataContext(currentWebUrl);
        // ... your code ...
    });


Just remember that the SPContext.Current object should NOT be referenced inside the delegate code; that will throw all sorts of NullReferenceExceptions. If you need data such as the current web URL, put it in a string variable before you switch the context. See the example above.


I have tested this code for retrieving as well as updating data in a list and it worked great. I have also tested this for a logged in as well as an anonymous user and it seems to work for both.


Lastly, I need to say that I just figured this out today and I have NO idea what the long term impact of this will be, or if it will work as expected in all cases. Use at your own risk and all that. Happy coding!

Thursday, May 06, 2010

FieldRef element not working with custom content type

I have been working on SharePoint 2010 for the last few weeks and have found the new development tools quite useful. So far I have been able to avoid any CAML issues like typos that plagued MOSS 2007 projects. Today I ran into something interesting however.

I was creating a custom content type with some custom site columns, but after activating all the necessary features, my content type only had the site columns that came from the parent content type. None of the site columns defined in the FieldRefs section were included in the new content type. The logs showed no errors, and I was quite stuck. I spent a few hours experimenting and finally found the culprit: comments in the XML. Yeah, don't get me started; comments should not influence functionality, but in this case they do.

So basically, this content type definition doesn't include any of the fields added in the FieldRef section:


  <ContentType ID="0x010100C568DB52D9D0A14D9B2FDCC96666E9F2007948130EC3DB064584E219954237AF390058d217b15a4549e79a7dfacfb6577993"
               Name="Generic Page"
               Description="Generic Page"
               Group="My Content Types"
               Sealed="FALSE"
               Inherits="TRUE"
               Version="0">
    <FieldRefs>
      <!-- Comment -->
      <FieldRef ID="{4B9D42FA-8081-49AB-9F89-72FAB3C6609C}"/>
    </FieldRefs>
  </ContentType>

But this one does.


  <ContentType ID="0x010100C568DB52D9D0A14D9B2FDCC96666E9F2007948130EC3DB064584E219954237AF390058d217b15a4549e79a7dfacfb6577993"
               Name="Generic Page"
               Description="Generic Page"
               Group="My Content Types"
               Sealed="FALSE"
               Inherits="TRUE"
               Version="0">
    <FieldRefs>
      <FieldRef ID="{4B9D42FA-8081-49AB-9F89-72FAB3C6609C}"/>
    </FieldRefs>
  </ContentType>

The only difference is the comment. Good job and gold star to the guy who wrote the XML parser for this.




Wednesday, April 28, 2010

My New SharePoint Server

I started working on a SharePoint 2010 project this week, and I needed something to run it on. My MacBook Pro would cut it if I ran it natively and not via Parallels, but I would then have to deal with dual booting, etc. More importantly, my MBP is in the shop because I fried the keyboard when I spilled beer on it. :) Back on the trusty T43p until the MBP is fixed.

So I needed some way to run SharePoint 2010, and I didn't feel like buying another laptop. My solution: a Mac Mini. Purists out there may be groaning, but hey, it's a fantastic little 64-bit machine with plenty of hardware resources to run SharePoint 2010. It is totally portable, and I can either plug a screen at work directly into it or RDP from my laptop. For €750 it was the best solution I could come up with. Now if I could just get a speaking gig at an MS conference so I can pull this out in front of MS developers.....

Microsoft Exam Done

A few weeks ago I decided it would be good for my business if I had some Microsoft exams under my belt. I wrote the 70-536 exam today and got a nice high passing score. It's not a hard exam, but you do have to study quite a bit. There are questions from all parts of the .NET Framework, so you come into contact with many new things. Two more to go for the MCPD.

Friday, April 09, 2010

Office 2007 Crashes on Server 2008 R2

A few days ago after one or another update of my Server 2008 R2 machine, all my MS Office 2007 apps would crash immediately when started, and the only thing in the event log was something like:

Faulting application EXCEL.EXE, version 12.0.6524.5003, time stamp 0x4b4fba46, faulting module unknown, version 0.0.0.0, time stamp 0x00000000, exception code 0xc0000005, fault offset 0x00000000, process id 0xa10, application start time 0x01cad7f1dc0ebffb.

It took me a while to hunt down the fix so I'll post this for googlers with similar issues: http://blogs.technet.com/office_sustained_engineering/archive/2010/03/11/issues-with-office-after-installing-kb977724.aspx

Sunday, March 14, 2010

Free ticket to DevDays 2010 - now gone

UPDATE: Ticket has been given away - Sorry :(

I happen to have a free ticket to DevDays 2010 in The Hague this year and can't go because I am out of the country. I am willing to give my ticket away to someone who can make good use of it, and preferably to someone who doesn't work for a large company that should buy their ticket for them. So if you are a freelancer or a small company that works in Microsoft technology and want to go to DevDays 2010, let me know and I may have your free ticket. It is NOT meant for resale.

Tuesday, March 09, 2010

USA >= World

I love how some American companies feel that the world is no bigger than their country. Today I received the following email from a company called Software Wholesale Intl. (I assumed the Intl meant international):

We unfortunately cannot sell outside of the US. Sorry about that and good luck.
Jess

I think they are trying to look a bit more professional than they really are. :) Website: www.software-intl.com

Monday, March 08, 2010

Good customer service for a change

I am not one to praise a corporation lightly, but I feel I need to in this one instance.

Over the past year I have done some work for American Express and was thus exposed to the various benefits their card holders enjoy. Then a few months ago my credit card from ING was prematurely terminated due to the merger between ING and Postbank, and let's just say that switching the ING Visa to an ING Mastercard was a 4-month ordeal that resulted in me saying goodbye to ING credit card services. I couldn't manage to get approved for the new card, and no one at ING could tell me why, since I was a loyal customer with a platinum card before... A total mess.

So I decided to give Amex a try. I figured they are accepted at 95% of the places I use a credit card (big stores in the US or online) and, after all, they are a client. I have had to call them a few times during the last few weeks with regard to setting up some things and was blown away every time by their flexibility, availability and overall quality of customer service. I know they are a more expensive card for retailers to accept, but wow do they take care of their clients. As someone living in Europe, where customer service is generally awful, I am glad to see that some companies still value being excellent to their customers.

For any North American expats in the Netherlands: if you miss the customer service from home, you may want to get yourself an Amex card.

Friday, March 05, 2010

EPiServer and Image Vault

I have spent the last month working as a developer on an EPiServer project and I have mixed feelings about the product. EPiServer certainly has a lot of good stuff to offer and is fairly easy to get working. It feels a little rough around the edges though, and coming from a SharePoint or ASP.NET MVC environment it can feel a bit "open sourcey". I was not impressed by the lack of page type inheritance and by not having strongly typed properties, but there is the Page Type Builder project that seems to address a lot of this nicely.

We also used an image management solution on this project that EPiServer recommends, and it was not much fun to work with. The product in question is called ImageVault, and it is not even close to production-quality software. We have had so many problems with it that it would have been easier to just build something like it from scratch. As an example, here is what I fixed today:

ImageVault has a UrlBuilder class that helps get the URL of an image from storage. This helper allows us to specify, among other things, an aspect ratio, which is done by setting a decimal property. So I would, for instance, set the aspect ratio to 1.5. The URL would then come back with something like ...&aspect=1.5&... in the query string. The fun starts when the site is running in a language such as Dutch, where the decimal separator is a comma. In that case, the URL that comes back contains ...&aspect=1%2c5&... Basically, the developers never tested this in such an environment: inside their UrlBuilder class they simply do a ToString without specifying the invariant culture. The result is that the URL that points to THEIR web service doesn't produce an image, since the aspect ratio is incorrect.
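
To illustrate the difference - this is just a sketch of the formatting issue, not ImageVault's actual code:

decimal aspect = 1.5m;

// Under a Dutch culture, ToString() uses a comma as the decimal separator,
// which ends up URL-encoded as "...&aspect=1%2c5&..." in the query string.
string broken = aspect.ToString(new CultureInfo("nl-NL"));        // "1,5"

// Formatting with the invariant culture keeps the dot their service expects.
string correct = aspect.ToString(CultureInfo.InvariantCulture);   // "1.5"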

This is not a difficult bug to fix, but a product that is supposed to be at version 3.3 should NOT have bugs like this.

I hope the ImageVault guys get their product polished up since it has potential and they have a good opportunity to be the solution of choice for image management within EPiServer. They will need to do this quickly though because they can truly lose a lot of business with the way their product works right now.

Wednesday, February 10, 2010

Dumb mistake

So I made a rookie mistake today and it took me a while to figure it out. I copied a code snippet from one ASP.NET page to another and got myself into trouble. I should know better than to copy and paste code. :)


I had the following code in my page, and I couldn't figure out why I was getting a stack overflow problem.


protected void Page_Load(object sender, EventArgs e)
{
    base.OnLoad(e);
    // Other code
}


For those who don't quite see the error: the call to the base class raises the Load event again, which calls Page_Load in this class again, and never-ending recursion is guaranteed.

What I meant to have is:

protected override void OnLoad(EventArgs e)
{
    base.OnLoad(e);
    // Other code
}

It's good to make a dumb mistake once in a while, keeps me from thinking too much of myself. :)

Friday, February 05, 2010

Autotune yourself

A site that I helped create went live not that long ago, and while it is a silly site, I am quite proud of the technological achievement it represents. Let's just say that I really came through for my client on this one.

The site can be found at www.u-tune.nl and allows people to sing karaoke. The achievement here is that the vocals are transformed using Autotune and, after being merged with the music, are played back to the user. As far as I know this is the first instance of Autotune being used on the web; it is intended to be a studio tool with a human configuring it. We really took the AUTO part of it seriously. :)

The project was one of the most challenging but also one of the most fun projects I have done lately.

Friday, January 08, 2010

New Office

After a few months of searching, negotiating mortgages and waiting for the bank, I finally got the keys to my new office yesterday. I will need to do some maintenance, but after some new paint and some cleaning, XComplica will have a new home. This is the first step in the expansion planned for this year. Registering as an Ltd is next.