Tuesday, August 18, 2009

Turn your actions into feeds on ASP.net MVC

With this implementation you can reuse parts of your content Controller (the database query, for example) and render it as an RSS and/or Atom feed.

Let's say you have an action like this (extracted from DotNetBurner’s source):

       
       [AcceptVerbs (HttpVerbs.Get | HttpVerbs.Head), CompressFilter]
       public ActionResult Upcoming (int? page, string sort)
       {
           var viewData = GetStoryListViewData<StoryListByCategoryData> (page);
           int sortDays = GetSortDays (sort);

           viewData.Stories = Repository.GetUpcomingStories (CurrentUserId, sortDays, SiteSettings.Settings.UpcomingMaxDays,
                                                             CalculateStartIndex (page), StorySettings.Settings.StoryPerPage);

           viewData.Category = Strings.Upcoming;
           viewData.CategoryDisplayName = Strings.Upcoming;
           viewData.SortDays = sortDays;

           return View ("Upcoming", viewData);
       }

This action returns the upcoming stories. To turn it into a feed, I just implemented this new action:

    [AcceptVerbs(HttpVerbs.Get | HttpVerbs.Head), CompressFilter, OutputCache(CacheProfile = "Feeds")]
    public ActionResult UpcomingFeed(string sort, string feedType, int? feedCount)
    {
        return Content(Feed("Upcoming", null, sort, feedType, feedCount), "application/xml", Encoding.UTF8);
    }


    string Feed(string feed, int? id, string sort, string feedType, int? feedCount)
    {
        int sortDays = GetSortDays(sort);

        if (!id.HasValue)
            id = 0;

        if (!feedCount.HasValue || feedCount == 0)
            feedCount = StorySettings.Settings.StoryPerFeed;

        if (feedCount > 200)
            feedCount = 200;

        var categories = Repository.GetCategories();

        IEnumerable<Story> stories = Repository.GetUpcomingStories(CurrentUserId, sortDays,
                                                                    SiteSettings.Settings.UpcomingMaxDays,
                                                                    1, (int)feedCount);

        FeedType type = "rss".Equals(feedType, StringComparison.InvariantCultureIgnoreCase) ? FeedType.RSS20 : FeedType.ATOM10;

        using (var memoryStream = new MemoryStream())
        {
            FeedHelper.GenerateFeed(stories, categories, memoryStream, type, Url);

            return Encoding.UTF8.GetString(memoryStream.ToArray());
        }
    }
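
For illustration, a hypothetical per-category feed action could reuse the same helper. The CategoryFeed name and the way Feed would dispatch on its first argument are assumptions for this sketch, not DotNetBurner's actual code:

    [AcceptVerbs(HttpVerbs.Get | HttpVerbs.Head), CompressFilter, OutputCache(CacheProfile = "Feeds")]
    public ActionResult CategoryFeed(int id, string sort, string feedType, int? feedCount)
    {
        // Hypothetical: Feed would switch on "Category" and use id to pick the right repository query.
        return Content(Feed("Category", id, sort, feedType, feedCount), "application/xml", Encoding.UTF8);
    }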

The Feed method is simplified here for the sake of the example, but its “feed” parameter and the optional id exist exactly to support that kind of reuse (categories, tags, etc.). To generate the feed itself, I implemented a FeedHelper class (you will probably have to tune it for your needs):

   public enum FeedType
   {
       RSS20,
       ATOM10
   }

   public static class FeedHelper
   {
       public static void GenerateFeed(IEnumerable<Story> stories,
                                       IEnumerable<Category> categories,
                                       Stream output,
                                       FeedType type,
                                       UrlHelper urlHelper)
       {

           // Create an XmlWriter to write the feed into it
           using (XmlWriter writer = XmlWriter.Create(output))
           {
               // Set the feed properties
               SyndicationFeed feed = new SyndicationFeed
                   (SiteSettings.Settings.Name,
                    SiteSettings.Settings.Title,
                    new Uri(SiteSettings.Settings.Address));

               feed.LastUpdatedTime = DateTime.UtcNow;
               feed.Language = SiteSettings.Settings.Language;

               feed.Authors.Add(new SyndicationPerson(SiteSettings.Settings.AdminMail,
                                                      SiteSettings.Settings.Name,
                                                      SiteSettings.Settings.Address));

               // Add categories
               foreach (var category in categories)
                   foreach(var subcat in category.SubCategories)
                       feed.Categories.Add(new SyndicationCategory(subcat.Name));

                // Set generator
                feed.Generator = SiteSettings.Settings.Title;

                string siteUrl = SiteSettings.Settings.Address;


               // Add post items
                List<SyndicationItem> items = new List<SyndicationItem>();

               foreach (var story in stories)
               {
                   string url = String.Concat(siteUrl,
                                              urlHelper.RouteUrl("Detail",
                                                                 new RouteValueDictionary
                                                                     {
                                                                         {"id", story.StoryId},
                                                                         {"title", story.Title.ConvertToUrlPath()}
                                                                     }));

                   string voteButton = "{0}<br>".FormatWith(SwissKnife.GetVoteButtonFor(story.Url));

                   TextSyndicationContent content =
                       SyndicationContent.CreateHtmlContent(String.Concat("<div><p>", story.Description, "</p>",
                                                                          voteButton, "</div>"));

                   SyndicationItem item = new SyndicationItem(story.Title,
                                                              content,
                                                              new Uri(url),
                                                              url,
                                                              new DateTimeOffset(story.PostedOn));
                   item.PublishDate = story.PostedOn;
                   item.Categories.Add(new SyndicationCategory(story.CategoryName));

                   items.Add(item);
               }

               feed.Items = items;

               // Write the feed to output
               if (type == FeedType.RSS20)
               {
                   Rss20FeedFormatter rssFormatter = new Rss20FeedFormatter(feed);
                   rssFormatter.WriteTo(writer);
               }
               else
               {
                   Atom10FeedFormatter atomFormatter = new Atom10FeedFormatter(feed);
                   atomFormatter.WriteTo(writer);
               }

               writer.Flush();
           }
       }
   }

This uses System.ServiceModel.Syndication to create a feed from the same database query that is used to render the site content. With this, I was able to give DotNetBurner feeds in less than 4 hours.
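
If you want a friendlier URL for the feed, you can map a route to the new action in Global.asax. A minimal sketch, assuming a Story controller and an /upcoming/feed URL pattern (both are assumptions for illustration, not DotNetBurner's actual routes):

    // Hypothetical route - adjust the controller name and URL pattern to your project.
    routes.MapRoute(
        "UpcomingFeed",
        "upcoming/feed/{feedType}",
        new { controller = "Story", action = "UpcomingFeed", feedType = "rss" });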

Drop me a comment if you implement this in a similar time :)

Tuesday, August 11, 2009

Lazy loading a database query to improve performance of caching on ASP.net MVC

This article explains an approach to lazy loading a database query in an ASP.net MVC Controller, so that the query doesn't execute in the Controller itself but only when the View asks for the data, and only if the View isn't cached. So, if the View is cached, the database is not queried.

With ASP.net MVC you obviously have a Model, a View and a Controller. But the workflow of a page execution in MVC is not really M-V-C, because things don't happen in that order. Skipping a few steps to keep the explanation simpler, let's start with the Controller. In the Controller it's common to query the database through the Model and fill the view model with the data. In the View, you usually have an Html helper that takes the view model and renders it as Html.

Therein lies a problem: if you use caching in your Html helper, the database is still queried, because the query is issued in the Controller, which runs before the (cached) View.

To understand this, I recommend taking a look at a previous article: http://tsayao.blogspot.com/2009/07/caching-html-helpers-on-aspnet-mvc.html.

The solution is pretty simple: Lazy load the data on the Controller.

This way, the database is not queried in the Controller; the query is only executed when the enumeration is requested by the View. So, if the Html helper in the View is cached, the data is never requested!

First, let's implement a simple LazyCollection class:

using System.Collections;
using System.Collections.Generic;

namespace DotNetBurner
{
    public class LazyCollection<T> : IEnumerable<T>
    {
        public delegate IEnumerable<T> LazyCallback();

        private readonly LazyCallback _callBack;
        private readonly object _syncRoot = new object();
        private IEnumerable<T> _items;
        private bool _wasCalled;

        public LazyCollection(LazyCallback callBack)
        {
            _callBack = callBack;
        }

        IEnumerator IEnumerable.GetEnumerator()
        {
            CallCallBack();
            return _items.GetEnumerator();
        }

        public IEnumerator<T> GetEnumerator()
        {
            CallCallBack();
            return _items.GetEnumerator();
        }

        // Invokes the callback only once, on the first enumeration; later
        // enumerations reuse the already loaded items.
        void CallCallBack()
        {
            lock (_syncRoot)
            {
                if (!_wasCalled)
                    _items = _callBack();

                _wasCalled = true;
            }
        }
    }
}
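
Just to make the deferred behavior explicit, here is a small hypothetical usage sketch: the callback is not invoked when the collection is constructed, only on the first enumeration.

    // Nothing hits the database here...
    var tags = new LazyCollection<Tag>(() => Repository.GetTags(10));

    // ...the callback only runs when the collection is first enumerated.
    foreach (var tag in tags)
        Console.WriteLine(tag.Name);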

The limitation of this example is that your database query method should return a collection that implements IEnumerable<T>. Something like this:

  
public virtual IEnumerable<Tag> GetTags(int top)
{
    List<Tag> tags = new List<Tag>();

    using (SqlConnection conn = new SqlConnection(_connStr))
    {
        using (SqlCommand cmd = new SqlCommand("sp_tag_get_tops", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@top", top);
            conn.Open();

            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Tag tag = new Tag
                                  {
                                      StoryCount = reader.GetInt32(reader.GetOrdinal("Count")),
                                      TagId = reader.GetInt32(reader.GetOrdinal("TagId")),
                                      Name = reader.GetString(reader.GetOrdinal("Name"))
                                  };

                    tags.Add(tag);
                }
            }
        }
    }

    return tags;
}

On the Controller, you can do something like this:

viewData.Tags = new LazyCollection<Tag>(() => Repository.GetTags(StorySettings.Settings.TopTags));

This way, the database is only queried when you use the view model. In this case, if you never enumerate ViewData.Model.Tags (which is what happens when the view part that touches it is cached), no query is issued against the database. This example uses the SqlClient directly, but I think you can extend it to use LINQ to Entities.
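
As a rough sketch of that idea (the db context and Tags entity set names below are assumptions, not real DotNetBurner code), wrapping a LINQ to Entities query would look roughly like this:

    // The query is only built here; ToList() runs when the View enumerates the collection.
    viewData.Tags = new LazyCollection<Tag>(
        () => db.Tags.OrderByDescending(t => t.StoryCount)
                     .Take(StorySettings.Settings.TopTags)
                     .ToList());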

Saturday, August 1, 2009

Using SQL Server 2005+ index included columns to improve performance of an ASP.net resource

This post shows a real scenario of using included columns in SQL Server 2005/2008 to improve the response time of an ASP.net resource.

When developing DotNetBurner, I wanted to make sure the site's performance was the best it could be. With this goal in mind, I used indexes with included columns to improve the response time of the "burn!" count image. The burn count image shows how many "burns" (votes) a particular link has. To retrieve this vote count, the database must be queried, like this:

SELECT VoteCount
FROM Story
WHERE UrlHash = @urlHash

The query result is filtered by the UrlHash column, which is an MD5 hash of the URL. Assuming we have thousands of stories, it's recommended to create an index on the UrlHash column.
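
For context, the ASP.net resource that renders the burn count image runs this query from C#. A minimal sketch, assuming a GetVoteCount helper and plain SqlClient access (the method name and connection handling are illustrative, not DotNetBurner's exact code):

    int GetVoteCount(string urlHash)
    {
        using (var conn = new SqlConnection(_connStr))
        using (var cmd = new SqlCommand("SELECT VoteCount FROM Story WHERE UrlHash = @urlHash", conn))
        {
            cmd.Parameters.AddWithValue("@urlHash", urlHash);
            conn.Open();

            // ExecuteScalar returns the single VoteCount value, or null if the story is unknown.
            object result = cmd.ExecuteScalar();
            return result == null ? 0 : Convert.ToInt32(result);
        }
    }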

The following queries will create a Story table and populate it with some items, so we can query it and get the execution plan:

IF OBJECT_ID('Story') IS NOT NULL
 DROP TABLE Story
GO

CREATE TABLE Story
(
 StoryId INT PRIMARY KEY NOT NULL,
 Description NVARCHAR(2000),
 Url VARCHAR(255),
 UrlHash UNIQUEIDENTIFIER, /* This is not MD5, but can be used for this example */
 VoteCount SMALLINT
)
GO

DECLARE @count INT

-- Inserts the first story, so we know the UrlHash
INSERT INTO Story (StoryId, Description, Url, UrlHash, VoteCount)
VALUES(1, 'This is a story description where the quick brown fox jumped over the lazy dog which had a tattoo written lorem ipsum dolor.',
    'http://www.someurl.com', '5946D715-38FA-42C5-8056-F8F0C86B8AE8', 2)

-- Inserts 9,998 more stories
SET @count = 2
WHILE @count < 10000
BEGIN

 INSERT INTO Story (StoryId, Description, Url, UrlHash, VoteCount)
 VALUES(@count, 'This is a story description where the quick brown fox jumped over the lazy dog which had a tattoo written lorem ipsum dolor.',
     'http://www.someurl.com', NEWID(), @count + 10)

 SET @count = @count + 1
END


CREATE UNIQUE INDEX IDX_Story_1 ON Story(UrlHash)

SELECT VoteCount
FROM Story
WHERE UrlHash = '5946D715-38FA-42C5-8056-F8F0C86B8AE8'

This is the resulting execution plan (with an index on the UrlHash column):

[Image: SQL Server execution plan showing an index seek plus a key lookup]

Looking at the execution plan, you will see a key lookup. I bet you that I can reduce it to a single index seek :)

DROP INDEX IDX_Story_1 ON Story

CREATE UNIQUE INDEX IDX_Story_1 ON Story(UrlHash) INCLUDE (VoteCount)

SELECT VoteCount
FROM Story
WHERE UrlHash = '5946D715-38FA-42C5-8056-F8F0C86B8AE8'

Now we have replaced the index with one that includes the VoteCount column. Let's look at the execution plan:

[Image: SQL Server execution plan showing a single index seek]

Now the query executes faster and uses fewer resources. This happens because, with the included column, all the values needed by the query (UrlHash and VoteCount) can be found in the index, so SQL Server doesn't need to look up the value of VoteCount in the clustered index.

You could, of course, just add the VoteCount directly on the index, like this:

CREATE UNIQUE INDEX IDX_Story_1 ON Story(UrlHash, VoteCount)

This would work, but it would also use more disk space, since the index key would be bigger. When you use the INCLUDE option, SQL Server stores the value only at the leaf level of the index, which is enough here because the value is only returned, never filtered on.