
Sunday, 6 May 2012

Performance comparison of object instantiation methods in .NET

In the previous post, I compared the performance of various methods of code invocation available to us in the .NET Framework.

Last night I was reviewing the ASP.NET Web API source code when I noticed this snippet:
private static Func<object> NewTypeInstance(Type type)
{
    return  Expression.Lambda<Func<object>>(Expression.New(type)).Compile();
}

But surely compiling a lambda expression is really costly (as we saw in the last post), so why shouldn't we simply do this instead (if we are supposed to just return a Func):
private static Func<object> NewTypeInstance(Type type)
{
    var localType = type; // create a local copy to prevent adverse effects of the closure
    Func<object> func = () => Activator.CreateInstance(localType); // capture localType
    return func;
}

Well, I put this very question to Henrik F Nielsen, the ASP.NET Web API team's architect, and he very helpfully got back to me quickly: "compiling an expression is the fastest way to create an instance. Activator.CreateInstance is very slow".

OK, I did not know that, but it seemed like a good topic for a blog post! So here we are, where I compare these scenarios:

  1. Direct use of the constructor
  2. Using Activator.CreateInstance
  3. Using a previously bound reflected ConstructorInfo (see previous post for more info)
  4. Compiling a lambda expression every time and running it
  5. Caching a compiled lambda expression and running it (what ASP.NET Web API does)

Test and code

In this one, I could get a bit more imaginative with my code, since in the last post I already established the overhead and merits of the various code invocation methods. For the object to construct, I use a simple class with a default parameterless constructor. Results will differ for a constructor with parameters, but I think the parameterless constructor is the purer case.

So I have created a few Action extension methods to perform the tedious snippets repeated in the last post. Each method runs 1,000,000 times, which is not a particularly high number, but as we will see (and saw in the last post), compiling a lambda expression every time is really slow, so 10 million iterations would take far too long.
using System;
using System.Diagnostics;
using System.Linq;
using System.Linq.Expressions;
using System.Threading;

public class ConstructorComparison
{
 static void Main()
 {
  const int TotalCount = 1000 * 1000; // 1 million
  Stopwatch stopwatch = new Stopwatch();
  Type type = typeof(ConstructorComparison);
  var constructorInfo = type.GetConstructors()[0];
  var compiled = Expression.Lambda<Func<object>>(Expression.New(type)).Compile();

  Action usingConstructor = 
    () => new ConstructorComparison();
  Action usingActivator = 
    () => Activator.CreateInstance(type);
  Action usingReflection = 
    () => constructorInfo.Invoke(new object[0]);
  Action usingExpressionCompilingEverytime =
   () => Expression.Lambda<Func<object>>(Expression.New(type))
    .Compile();
  Action usingCachedCompiledExpression = 
    () => compiled();
  Action<string> performanceOutput = 
    (message) => Console.WriteLine(message);

  Thread.Sleep(1000);
  Console.WriteLine("Warming up ....");
  Thread.Sleep(1000);

  Console.WriteLine("Constructor");
  usingConstructor
   .Repeat(TotalCount)
   .OutputPerformance(stopwatch, performanceOutput)();

  Console.WriteLine("Activator");
  usingActivator
   .Repeat(TotalCount)
   .OutputPerformance(stopwatch, performanceOutput)();

  Console.WriteLine("Reflection");
  usingReflection
   .Repeat(TotalCount)
   .OutputPerformance(stopwatch, performanceOutput)();

  Console.WriteLine("Compiling expression everytime");
  usingExpressionCompilingEverytime
   .Repeat(TotalCount)
   .OutputPerformance(stopwatch, performanceOutput)();

  Console.WriteLine("Using cached compiled expression");
  usingCachedCompiledExpression
   .Repeat(TotalCount)
   .OutputPerformance(stopwatch, performanceOutput)();

  Console.Read();
 }
}

public static class ActionExtensions
{
 public static Action Wrap(this Action action, Action pre, Action post)
 {
  return () =>
       {
           pre();
           action();
           post();
       };
 }

 public static Action OutputPerformance(this Action action, Stopwatch stopwatch, Action<string> output)
 {
  return action.Wrap(
   () => stopwatch.Start(),
   () => 
    {
     stopwatch.Stop();
     output(stopwatch.Elapsed.ToString());
     stopwatch.Reset();
    }
   );
 }

 public static Action Repeat(this Action action, int times)
 {
  return () =>  Enumerable.Range(1, times).ToList()
   .ForEach(x => action());
 }
}

Results and conclusion

Here is output from the program:

Warming up ....
Constructor
00:00:00.0815479
Activator
00:00:00.1732489
Reflection
00:00:00.4263699
Compiling expression everytime
00:02:11.5762143
Using cached compiled expression
00:00:00.0855387

So as we can see:

  • Using a cached compiled expression is almost as fast as calling the constructor directly
  • Using Activator is ~2x slower
  • Using reflection is ~5x slower
  • Compiling a lambda expression every time is really slow: in this case more than 1000x slower

So indeed, as Henrik said, when all you have is the type of the class, using a cached compiled expression is the fastest method.
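The practical pattern that falls out of these numbers is to pay the compilation cost once per type and reuse the delegate. Here is a minimal sketch of such a per-type cache (the ObjectFactory name is mine, not from the Web API source):

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq.Expressions;

// A minimal sketch: compile the factory expression once per type and cache
// the resulting delegate, so subsequent instantiations run at near-constructor
// speed. Works for reference types with a parameterless constructor.
public static class ObjectFactory
{
    private static readonly ConcurrentDictionary<Type, Func<object>> _cache =
        new ConcurrentDictionary<Type, Func<object>>();

    public static object CreateInstance(Type type)
    {
        // GetOrAdd compiles the expression only on the first request for a type
        Func<object> factory = _cache.GetOrAdd(
            type,
            t => Expression.Lambda<Func<object>>(Expression.New(t)).Compile());
        return factory();
    }
}
```

With this in place, repeated calls such as `ObjectFactory.CreateInstance(typeof(MyDto))` only incur the delegate-invocation cost after the first call for each type.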

Sunday, 22 April 2012

ASP.NET Web API Series - Part 4: Dependency Injection

[Level T2]

In this post, I will review the dependency injection features of the ASP.NET Web API. There are a few posts out there on this, but I felt some areas were still not covered, so I will explore the dependency injection options. I will not try to explain what Dependency Injection (DI), Inversion of Control (IoC) or Service Location are; if you are not familiar with these, you should perhaps first get a grasp of them from the numerous resources out there.

Until ASP.NET MVC 3, none of Microsoft's products had any IoC interoperability. In ASP.NET MVC 2 you had to resort to replacing the default controller factory with your own and serve controllers using service location. In WCF you had to provide your own factories using code or config, which was not great. But ASP.NET MVC 3, for the first time, relinquished its hold on object creation and acknowledged the need for DI as more teams embraced it.

ASP.NET Web API beta has built-in support for dependency injection. Just be careful: there are breaking changes in the ASP.NET Web API source code and the model has changed (it has been simplified). I will briefly explain the beta release and then move on to the source code, which the final release will probably resemble more closely.

ASP.NET Web API beta release

[Remember, this stuff has changed] DI is based on setting a dependency resolver (technically a service locator) on the HttpConfiguration object. This object is available on the static GlobalConfiguration.Configuration property (web hosting) or, in the case of self-hosting, on HttpSelfHostConfiguration.

So in case of web hosting you would use:
GlobalConfiguration.Configuration.ServiceResolver.SetResolver(myResolver);
There are basically 3 options when it comes to registering a dependency resolver:

  1. An instance of an object implementing the IDependencyResolver interface, which has two methods: GetService and GetServices
  2. An instance of an object which has these two public methods: GetInstance and GetAllInstances. No interface needed; Web API will use reflection to call the methods.
  3. Two delegates that return an instance or instances for a particular type

ASP.NET Web API source code release

This has been simplified and unified so there is only one option: implementing IDependencyResolver.

There is one change in IDependencyResolver though: in addition to GetService and GetServices (which now live in the IDependencyScope interface, implemented by IDependencyResolver), there is a BeginScope method.

According to the documentation, BeginScope:

Starts a resolution scope. Objects which are resolved in the given scope will belong to that scope, and when the scope is disposed, those objects are returned to the container. Implementers should return a new instance of IDependencyScope every time this method is called, unless the container does not have any concept of scope or resource release (in which case, it would be okay to return 'this', so long as the calls to are effectively NOOPs).
This is particularly important if you leave responsibility of lifetime management of your objects to your DI container.
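To make the quoted contract concrete, here is a toy, container-free illustration (my own code, not from Web API or any DI framework) of a scope that owns the IDisposable instances it resolves and releases them when the scope itself is disposed:

```csharp
using System;
using System.Collections.Generic;

// A toy scope illustrating the BeginScope contract: objects resolved in a
// scope belong to it, and disposing the scope releases them. A real container
// (AutoFac, Windsor, etc.) would do this via its own lifetime scopes.
public class ToyScope : IDisposable
{
    private readonly List<IDisposable> _owned = new List<IDisposable>();

    public object GetService(Type serviceType)
    {
        object instance = Activator.CreateInstance(serviceType);
        var disposable = instance as IDisposable;
        if (disposable != null)
            _owned.Add(disposable); // the scope now owns this instance
        return instance;
    }

    public void Dispose()
    {
        // releasing the scope releases everything resolved through it
        foreach (var d in _owned)
            d.Dispose();
        _owned.Clear();
    }
}
```

A container with no concept of scope can simply return itself from BeginScope, as the AutoFac resolver below does.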


Developing a simple Dependency Resolver for AutoFac


AutoFac is my DI framework of choice (I have used Windsor and Unity before). So here I will write a few lines of code to hook AutoFac into ASP.NET Web API.

All you need to do is to implement IDependencyResolver and then create an instance of your object and use SetResolver on the configuration (see above) instance.

Your dependency resolver is not supposed to throw exceptions, but it can return null in case the type is not registered.

ASP.NET Web API also tries to use the dependency resolver (and, if you have provided your own, will use yours) to resolve its own dependencies. This could have placed a big burden on such a resolver (knowing all of ASP.NET Web API's internal dependencies), but the implementation is such that Web API will use the objects you return for dependency resolution, and if you return null it will fall back to the defaults.

So here is the code for the simple AutoFac dependency resolver:
public class AutoFacDependencyResolver : IDependencyResolver
{
 private ContainerBuilder _builder = new ContainerBuilder();
 private IContainer _container;

 public ContainerBuilder Builder
 {
  get
  {
   return _builder;
  }
 }

 public void Build()
 {
  _container = _builder.Build();
 }

 public IDependencyScope BeginScope()
 {
  return this;
 }

 public object GetService(Type serviceType)
 {
  object resolved = null;
  if (_container.TryResolve(serviceType, out resolved))
   return resolved;
  else
   return null;
 }

 public IEnumerable<object> GetServices(Type serviceType)
 {
  object resolved = null;
  Type all = typeof(IEnumerable<>).MakeGenericType(serviceType);
  if (_container.TryResolve(all, out resolved))
   return (IEnumerable<object>)resolved;
  else
   return null;
 }

 public void Dispose()
 {
  if (_container != null)
   _container.Dispose();
 }
}
Things to note are:

  • We expose the builder so the client can register the components.
  • We expose a Build() method so that, at the end of registration, Build() is called. Ideally we should hide the builder and expose it through AutoFacDependencyResolver methods so that the client could not call Build directly on the builder.
  • AutoFac inherently does not have a ResolveAll method; instead you may resolve IEnumerable<T>, which works as ResolveAll.
  • Please note that we had to use TryResolve so that no exception is thrown. Alternatively you may use a try/catch block and return null in the catch (but that could mask other exceptions).
OK, let's imagine we have a controller which is dependent on loggers:

public interface ILogger
{
 void Log(string value);
}

public class TraceLogger : ILogger
{
 public void Log(string value)
 {
  Trace.WriteLine(value);
 }
}

public class DebugLogger : ILogger
{
 public void Log(string value)
 {
  Debug.WriteLine(value);
 }
}

public class DiTestController : ApiController
{
 private List<ILogger> _loggers;

 public DiTestController(IEnumerable<ILogger> loggers)
 {
  _loggers = loggers.ToList();
 }

 public string Get()
 {
  _loggers.ForEach(x => x.Log("Get"));
  return "Dependency Injection";
 }
}

So the controller will use ForEach to call Log on all the ILoggers passed to it. In order to register the dependency resolver, all we have to do is this:
AutoFacDependencyResolver resolver = new AutoFacDependencyResolver();
resolver.Builder.RegisterType<TraceLogger>().As<ILogger>();
resolver.Builder.RegisterType<DebugLogger>().As<ILogger>();
resolver.Builder.RegisterType<DiTestController>();
resolver.Build();
config.DependencyResolver = resolver;

The config object here is a reference to GlobalConfiguration.Configuration (in the case of web hosting) or an instance of HttpSelfHostConfiguration (in the case of self-hosting).

Conclusion

Dependency injection has gone through breaking changes from the ASP.NET Web API beta to the source code release. Registering your own dependency injection provider is easy: all you need is to implement IDependencyResolver and then set it on the HttpConfiguration for your hosting scenario (via SetResolver in the beta, or the DependencyResolver property in the source code release). ASP.NET Web API will optionally try using your resolver to resolve its own dependencies, but you do not have to register the defaults. Be careful to catch any dependency resolution errors.



Saturday, 21 April 2012

Web API Governance - Life in a schemaless world (Take 2)

Introduction

[Level C4] While classic web services used WSDL for their schema and description, the emerging Web API movement relies mainly on documentation. This requires human effort, and with the rising popularity of Web APIs it could create problems in the maintenance, change management and extension of Web APIs. The solution is automation, and this article reviews the challenges and possible solutions.

Rise of the Web API


NOTE: This is take 2 of my post, with some updates.

I am very excited at the moment. Microsoft has just released its beta version of its best HTTP implementation to date. ASP.NET Web API has embraced HTTP exposing its full goodness. No more clutter of SOAP-based RPC services. Lean and mean.

So life cannot be any better? Hmmm... it can! Away from all the hustle and bustle and buzz of REST APIs (whether they are really RESTful is another discussion which, to be frank, I am no longer interested in), it is important to be objective and almost clinical about the technology. It is important to be able to look back and see how far we have come, and yet learn lessons from the past and reuse the solutions of the previous generation if they are still relevant (which in IT terms means 5-10 years, depending on the technology).

There has been a movement in the industry to question the basics. This is good; it weeds out unnecessary abstractions. We have had RDBMS for years; now we want NoSQL. We have had XML/SOAP; now we want JSON. We have had complex layered server-side architectures; now we want close-to-the-metal frameworks like node.js or Sinatra/NancyFX. Scripting was bad and type safety was good; now scripting is good and everyone wants to escape type safety.

OK, we have had SOAP services and we have never been happy with them. They were clunky, heavy and complex. Now we have Web API, which is so much better. But let's have a look at the problems that classic web services started tackling and, in some cases, did a good job of solving.

Governance in Web Services

This is an SOA term that revolves around managing and organising web services in a consistent fashion. This protects the clients as well as allowing the vendors to manage the lifecycle of the services.

Some of the aspects of Governance include:
  1. Contract and description: mainly through WSDL
  2. Registry service and discovery: mainly through UDDI and later WS-Discovery
  3. Versioning
  4. Exception handling
  5. Security and access control: mainly through Web Service Extensions and later WS-Security
To be fair, the work in this area produced some useful technologies and standards. One problem is that they are mainly based on RPC-style web services and as such are not quite useful for REST-style Web APIs. But the underlying problems still exist, and the question is whether we have appropriate tools to address them.

What did we lose on the way?

OK, I am trying to imagine 3 years from now, and I am sorry, but I see chaos. A good proportion of existing web services will be moved to pure HTTP Web APIs. I think it is very likely that the RPC mentality will stay with developers and we will end up with RPC-style Web APIs. And then what? We will be even more vulnerable, with no classic governance safety net.

With classic web services (and WCF), all you needed to point to a web service was its WSDL. It would provide the contract, schemata, faults, security requirements, endpoint addresses and some description. If you updated your service, the WSDL (which could be generated on the fly) would be updated too. In fact, in most cases all you needed was the service endpoint address, from which you could get all the information you needed (e.g. by just appending ?wsdl to the address).

We do have some replacements here in HTTP APIs. For example, we do not need SOAP faults to be defined in the WSDL, since we have HTTP error codes. We do not necessarily need WS-Security, as we have OAuth/OpenID/etc token-based federated security. Discovery is not really a big issue, and hypermedia can help in this regard.

OK, the question is: do we need all that stuff? REST (and HTTP as its sole implementation) is Turing-complete. An agent should not need any a priori knowledge to be able to use a service. It can use OPTIONS to discover verbs, use content negotiation to ask for media types it can understand, and use hypermedia to navigate through the resources of a server.

Turing-completeness is great but, to be honest, it still belongs in sci-fi novels. We are not there yet. I believe that for a great development and maintenance experience, we still need schema. Most Web APIs are about interacting with data. Probably somewhere around 80-90% of the Web APIs out there serve JSON. What does the application/json content type really tell us about the actual JSON being served? Are we currently writing agents/applications that can intelligently adapt to and consume whatever JSON is served to them?

I was listening to one of the episodes of the MashThis podcast, and the recurring theme on what makes a good Web API was always good documentation. Nothing beats good documentation, but how many of us developers are good at it? The human effort to create and maintain good documentation covering schema and description is going to be big. I think we need some automation here.


Challenges

I originally suggested using media types so that the client could request not the data but the schema of the data. In this update, I am going to look at other options as well as the semantic challenges of providing schema and description. In fact, I am not going to propose a solution; I will use this post to describe the problem. I hope this will start a fruitful discussion, so please feel free to join in.

HTTP and schema?

A schema for an HTTP API (I am intentionally not using the phrase REST API) might initially seem a little odd. Actually, the word does not appear at all in RFC 2616. As far as HTTP is concerned, the server allows the agent to interact with the resource identified by a URI, using a verb and associated headers. It is up to the agent to engage in content negotiation and interpret the result.

I personally think the problem of schema and description is inherently orthogonal to HTTP. A resource does not necessarily have to have a schema, other than the schema assumed by the content type. For example, when a server responds to a GET request for an XML page at the /foo/api/2 URL, it just serves the page, which ideally is well-formed XML. But even well-formed XML does not explain enough about the structure of the document and its internal schema. An XML instance will not normally carry its schema, and the server (unlike RPC-style web services) might not be able to return an XML Schema, since /foo/api/3 might have a completely different schema.

But let's bear in mind this can simply be an option: the server might support the schema/description and return it, or it can return a 404 error. In fact, the schema can be service-specific or URI-specific, i.e. it could refer to /foo/api/{id} or to /foo/api/690. In the latter case, while the schema may differ between ids, the service description will be the same. Hence any solution must account for this variability.

URIs and URIs routing formats

It is very common in the Web API space to provide URIs containing placeholders to pass data fragments to the API. In our example, /foo/api/690 means id=690. Alternatively, the data could be sent as a query string parameter, such as /foo/api?id=690.

The schema or description could be different based on the parameters passed.

One issue to consider is that the agent may not know about an actual existing resource (for example, it may not know a valid id that would return an actual resource); all it knows is the URI format. The solution must be flexible enough to provide a schema for the API method if a generic schema exists for the URI format (i.e. all documents at /foo/api/{id} have a similar schema).
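One conceivable building block for this (entirely hypothetical; the names are mine) is normalising a concrete URI back to its template, so that a schema keyed by the URI format can be looked up even when the agent only knows a concrete address:

```csharp
using System;
using System.Text.RegularExpressions;

// Hypothetical helper: collapse a concrete URI path to its route template so
// that a generic schema registered against the template (e.g. /foo/api/{id})
// can be found for any concrete id.
public static class UriTemplates
{
    public static string ToTemplate(string path)
    {
        // replace purely numeric segments with an {id} placeholder
        return Regex.Replace(path, @"(?<=/)\d+(?=/|$)", "{id}");
    }
}
```

So both /foo/api/690 and /foo/api/2 would map to the same template key, /foo/api/{id}, and hence to the same generic schema.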

A priori knowledge

One of the main focuses of the HTTP spec is to remove the requirement of a priori knowledge from agents. The agent is encouraged to explore server resources and functionality (e.g. via OPTIONS), and the server guides the agent using hypermedia.

By developing Web APIs that require documentation, we are adding back this a priori knowledge requirement. This is not good. But in the absence of a standard we will end up with nothing much better than a convention, which is itself a priori knowledge. So ideally this process must involve a standard, or at least a standard proposal. A lot of useful technologies out there are still in the proposal phase, namely OData, JSON Schema and HAL.

Security

A schema must ideally provide detail on the security requirements of the service. This is especially important for the authentication schemes supported by the server (e.g. basic authentication, federated identity, etc.).

Schema and description as a single resource?

A decision needs to be made as to whether to serve the description and schema of the service separately or provide them as a single document. The more I think about it, the more I feel it probably has to be a single document.

Possible Solutions


1- Using content type

Since JSON and XML are the two most popular formats being served, for now serving schema can practically be limited to these media types.

Let's look at this request:
GET http://localhost:50459/api/HelloWorld/1.1
User-Agent: Fiddler
Host: localhost:50459
And response is:
HTTP/1.1 200 OK
Server: ASP.NET Development Server/10.0.0.0
Date: Tue, 17 Apr 2012 21:46:38 GMT
Content-Type: application/json; charset=utf-8
Connection: Close 
"This"
So as we can see, the media type returned is JSON and the value is a simple string. Now, if we send a request asking for a JSON schema with an Accept header of application/schema+json, we get back this:

GET http://localhost:50459/api/HelloWorld/1.1
Accept: application/schema+json
Host: localhost:50459 
And response is:
HTTP/1.1 200 OK
Server: ASP.NET Development Server/10.0.0.0
Date: Tue, 17 Apr 2012 21:46:38 GMT
Content-Type: application/schema+json; charset=utf-8
Connection: Close 
{
  "type": "string"
}
It is true that JSON Schema is still in draft status, but with this format becoming more and more popular, it is inevitable that we restore some structure to our Web APIs.

Since XML already has an established schema language, we can use XML Schema, but one problem is that an XML Schema document has a content type of application/xml or text/xml. So either we nominate an alternative content type (such as application/schema+xml) or we use the URI to denote a request for the schema (ideally in the form of an optional query string parameter, although that would still be part of the URI).

The same applies to the description of the service. Ideally the documentation of the service is provided by the service itself. For the description, the agent can use a special content type (for example x-text/docs) or a query string parameter.
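The content negotiation above can be sketched server-side as follows. This is my own illustration, not Web API code; the hard-coded payloads simply mirror the HelloWorld example, and a real implementation would generate the schema from the served type:

```csharp
using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;

// Sketch: branch on the Accept header and serve either the resource
// representation or its (hard-coded, illustrative) JSON schema.
public static class SchemaNegotiation
{
    private const string SchemaMediaType = "application/schema+json";

    public static HttpResponseMessage Respond(HttpRequestMessage request)
    {
        bool wantsSchema = request.Headers.Accept
            .Any(a => a.MediaType == SchemaMediaType);

        var response = new HttpResponseMessage(HttpStatusCode.OK);
        response.Content = wantsSchema
            ? new StringContent("{ \"type\": \"string\" }")
            : new StringContent("\"This\"");
        response.Content.Headers.ContentType = new MediaTypeHeaderValue(
            wantsSchema ? SchemaMediaType : "application/json");
        return response;
    }
}
```

The same URI serves both representations; only the Accept header differs between the two requests shown above.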

2- Using hypermedia

The server can include separate links to the schema and description while serving the actual resource. The payload can be HAL or an alternative hypermedia implementation.

A HAL payload can look something like:

{
  "_links": {
    "description": { "href": "/description" },
    "schema": { "href": "../../schema?type=MyDto" }
  },
  "_embedded": {
    "MyDto": [
...

The server could have an implementation convention to treat the description and schema URL fragments as special paths and serve the appropriate resources. Please note that all the data needed to provide the schema can be located in the URL, e.g. the type of the serialised payload is appended as a query string parameter, so the server does not even have to know about the API method being queried.

This imposes a burden on the agent to understand such complex structures.

Also bear in mind that an ideal solution must be able to describe the input schema as well as the output schema. For the POST and PUT verbs, a JSON payload could be sent to the server to be stored, and the server must be able to provide not only the output schema but also the input schema.

3- Using a URL convention

Basically, in this solution the agent has a priori knowledge of a convention (or a standard) by which it can interrogate the server and ask for the schema and description of the services available on the API server. For example, /foo/api/$list could provide a list of methods (I know, it sounds like RPC, but I have no better word) available on the server, $schema could provide the schema and $description the description of the service.
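A sketch of how a server might recognise these reserved segments (the names and the enum are my own, purely illustrative of the convention):

```csharp
using System;

// Classify an incoming path as a data, list, schema or description request
// based on the reserved $ segments suggested above.
public enum ApiRequestKind { Data, List, Schema, Description }

public static class ConventionRouter
{
    public static ApiRequestKind Classify(string path)
    {
        if (path.EndsWith("/$list", StringComparison.Ordinal))
            return ApiRequestKind.List;
        if (path.EndsWith("/$schema", StringComparison.Ordinal))
            return ApiRequestKind.Schema;
        if (path.EndsWith("/$description", StringComparison.Ordinal))
            return ApiRequestKind.Description;
        return ApiRequestKind.Data;
    }
}
```

Anything not carrying a reserved segment falls through to normal resource handling, so the convention stays an opt-in layer on top of the API.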

Conclusion

Web APIs need to look back and consider solving some of the problems solved by classic web services. It is very likely that with the migration of classic web services to Web APIs we will find ourselves in a chaotic situation. This area needs to be explored and all options considered, hence a debate on the subject is welcome. It is likely that we will need a convention/standard for describing our Web APIs.

Saturday, 7 April 2012

ASP.NET Web API beta Source code: Survival Guide

[Level T2]
NOTE on 01/06/2012: ASP.NET Web API RC was released last night (31/05/2012). This means for a while at least, you do not have to build the nightly releases. You can download it from here.

Just a quick note that as you probably know, ASP.NET Web API beta (along with ASP.NET MVC 4 beta) is out. Scott Guthrie announced the release in his blog on 27th of March and it has been released under Apache 2.0 license.

You may download the source code from here via git or the web. This is really good news for all who wanted a deep dive into the implementation and extensibility points of the framework, including me. The media was also surprised by the release, as it shows perhaps another shift in Microsoft towards community-oriented software development, but I will leave this subject for another occasion.

It is interesting that the source code comes with full unit test coverage, yet it does not use the MS Test framework available in Visual Studio; instead it uses xUnit, a framework among whose contributors is Brad Wilson. This gives us a bit more insight into the way the ASP.NET team works, arguably one of the most popular teams in Microsoft.

Downloading the source is all fine, yet I had a few teething issues getting it working, which I share so that others having similar problems can find their way out.

Building the framework source code

OK, as soon as you have downloaded and unzipped the source, just open the solution. Since this source relies heavily on NuGet packages yet does not contain binaries, it needs to re-download the packages. There are instructions on the page, but I personally could not get them working.

So first thing to do is to right-click on the solution and click on "Enable NuGet Package Restore" as below [UPDATE: You actually can ignore this step, and do not have to set this value. I managed to build without this step]:


Now, if you attempt to build, you will get the error below:

error : NuGet package restore is not currently enabled. For more information, please see: http://aspnetwebstack.codeplex.com/wikipage?title=NuGet+Packages

Solution is to go to Tools -> Options -> Package Manager -> Package source and add the address below:

http://www.myget.org/F/f05dce941ae4485090b04586209c8b08/
Now you need to rebuild the packages as advised in the ASP.NET web stack page:

build RestorePackages

And now finally rebuild the solution; this time it will work. As the link above describes, this NuGet feed is not quite official and is used for out-of-band releases, which this source code certainly is.

You might get this error again if you try rebuilding having added the new NuGet feed. Not to worry: delete the source code, unzip it again, Enable NuGet Package Restore and then build. This time it should work.

Using Web API Source code

One of the best ways to understand a codebase is to see it running and debug it. And that is what I tried to do.

I had already installed the ASP.NET Web API/MVC beta from the installer. So here is what I did: created an ASP.NET Web API project (see Part 1 for more details), added the project to the Web API source code solution, removed references to the ASP.NET Web API/MVC beta DLLs and added project references to the DLLs I needed from the source. Easy! Well, not so fast...

The problem is that if you have installed ASP.NET Web API, all the binaries are already GACed and they take precedence over the project references. You can verify this by looking at the list of DLLs loaded in the output window while debugging. Now, this by itself is not a problem, but considering the source is newer and has breaking changes, it becomes a different matter...

Now, courtesy of Christoph De Baene, this also has a solution. I was lucky to find his post, from just yesterday, as he was quicker to spot the problem and find a solution. He explains how to use an environment variable so the CLR looks in the development path first:

To resolve this problem we can use DEVPATH. It's an environment variable that you can set to a folder (typically your output folder). The runtime will use the DEVPATH folder for probing before looking into the GAC!

Setting DEVPATH environment variable


You also need to add the entry below to your config file (app.config or web.config depending on your project):
 

 <configuration>
   <runtime>
      <developmentMode developerInstallation="true"/>
   </runtime>
 </configuration>
 
OK, now it is time to run the code. Well, I ended up using Christoph's code but still had issues, this time "Method not found":

Method not found: 'Boolean System.Net.Http.Headers.HttpHeaders.TryAddWithoutValidation(System.String, System.String)'.

Well, it turns out that the version of the System.Net.Http DLL has also moved on, and I was using the one that came with the installer. While Microsoft has decided not to include this assembly's source code along with the rest, the assembly has indeed moved on, and you need to use the one in the packages folder underneath the solution.

I did that and it was still not working! I was pulling my hair out; I went through a few things and realised the references were not set to "Copy Local". Once I set that to true, it started working.

Conclusion

The process of building and running the ASP.NET Web API (and MVC) source is a bumpy ride. There are quite a few additional things you have to do to get it working.

In brief, here are the steps you need to take to be able to debug the source code if you have installed the ASP.NET Web API beta installer (see above for details):
  1. Create a DEVPATH environment variable and set it to your debug output folder
  2. Add an entry to your config to flag the runtime to use DEVPATH
  3. Make sure all your references have "Copy Local" set to true
  4. Make sure you reference System.Net.Http.dll using "Manage NuGet Packages" and not from the GAC