
DefaultHttpControllerSelector and HttpRequestMessage inner workings

Topics: ASP.NET Web API
Aug 19, 2012 at 8:11 PM


I tried overriding HttpControllerDescriptor SelectController(HttpRequestMessage request) in DefaultHttpControllerSelector with my own code, and noticed that the request headers passed into that method behave very strangely in VS2012. For example, request.Headers.Accept evaluates to an empty collection every single time, even though I have been sending Accept headers.

Is there a reason why the headers can't be read at this point, or is there perhaps a bug in something that is instantiating that class?

What I'm trying to do is read the Accept headers of the incoming request to determine which controller to instantiate, based on which version the client wants. That isn't easy, because I'm not sure where to read the Accept headers from when they evaluate to nothing.
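For reference, that kind of version dispatch can be sketched by overriding GetControllerName on DefaultHttpControllerSelector. This is only a sketch, not the poster's actual code: the "V2" controller naming convention and the "version" Accept parameter are assumptions.

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;
using System.Web.Http.Dispatcher;

// Hypothetical selector: routes to "V2Products" instead of "Products" when the
// client sends an Accept value such as "application/json; version=2".
public class VersionedControllerSelector : DefaultHttpControllerSelector
{
    public VersionedControllerSelector(HttpConfiguration config) : base(config) { }

    public override string GetControllerName(HttpRequestMessage request)
    {
        string name = base.GetControllerName(request);
        foreach (MediaTypeWithQualityHeaderValue accept in request.Headers.Accept)
        {
            foreach (NameValueHeaderValue parameter in accept.Parameters)
            {
                if (parameter.Name == "version" && parameter.Value == "2")
                {
                    // Assumes a controller named V2ProductsController exists.
                    return "V2" + name;
                }
            }
        }
        return name;
    }
}
```

It would be registered by replacing the default service, e.g. `config.Services.Replace(typeof(IHttpControllerSelector), new VersionedControllerSelector(config));`.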

Best regards,

Lari Tuomisto

Aug 21, 2012 at 4:51 PM

You should be able to read the request headers in the SelectController method. Could you please confirm that you can read the headers in your action method at least, to confirm that the issue is not something else? Meanwhile I will try to repro your scenario and see if there is any issue.




Aug 22, 2012 at 2:47 PM
Edited Aug 22, 2012 at 2:50 PM

Hello RaghuRam,

I tried it now with just the DefaultHttpControllerSelector, trying to read the request headers inside an action method.

I took a peek inside Request.Headers with the debugger, and it is doing (IMO) weird things.

Here's a picture to show you what I mean:

I also took a peek inside the base HttpHeaders collection behind Request.Headers, and my Accept header was definitely in there. Accessing it through Request.Headers.Accept just doesn't seem to work like I assumed it would, or maybe it isn't supposed to do what I think it should?
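This split between "the debugger shows it" and "the typed property is empty" can be reproduced outside Web API. A small sketch (the value "myversionheader" is a made-up stand-in for whatever custom Accept value was sent):

```csharp
using System;
using System.Net.Http;

class Program
{
    static void Main()
    {
        // Simulate a request whose Accept header is not a valid media type.
        // TryAddWithoutValidation stores the raw string the same way a header
        // received over the wire is stored.
        var request = new HttpRequestMessage();
        request.Headers.TryAddWithoutValidation("Accept", "myversionheader");

        // The typed collection only surfaces values that parse as
        // "type/subtype", so the unparseable value does not show up here:
        Console.WriteLine(request.Headers.Accept.Count);

        // The raw value is still stored (this is what the debugger shows)
        // and can be read directly:
        foreach (string value in request.Headers.GetValues("Accept"))
        {
            Console.WriteLine(value);
        }
    }
}
```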

Best regards,


P.S.: I'm not sure how it is supposed to work, but maybe that collection is only supposed to show Accept headers with a certain kind of value, which is why mine might not be showing. But then I'm left wondering how I'm supposed to access it, and why VS is showing me my custom Accept header in the debugger.

Aug 22, 2012 at 4:47 PM

Hi Lari,

This is most likely happening because the value for the Accept header that you are passing in is not valid according to the HTTP specification. The Accept header should contain a MIME media type of the form "type/subtype", e.g. "application/json".

For your purposes you could either specify the version in your own custom header, or as an Accept header parameter, e.g. "vnd-mycompany/mytype;version=1.0".
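With a syntactically valid media type like that, the version parameter becomes accessible through the typed header classes. A minimal sketch:

```csharp
using System;
using System.Linq;
using System.Net.Http.Headers;

class Program
{
    static void Main()
    {
        // A valid "type/subtype" media type can carry the version as a parameter.
        var accept = MediaTypeWithQualityHeaderValue.Parse("vnd-mycompany/mytype; version=1.0");

        // Pull the version parameter out by name.
        string version = accept.Parameters
            .Where(p => p.Name == "version")
            .Select(p => p.Value)
            .FirstOrDefault();

        Console.WriteLine(accept.MediaType); // vnd-mycompany/mytype
        Console.WriteLine(version);          // 1.0
    }
}
```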

Aug 23, 2012 at 2:54 PM
Edited Aug 23, 2012 at 2:57 PM

This was indeed the cause, thank you for helping me out!

It's still odd that the VS debugger would show the header in the collection if it really isn't there.

Aug 23, 2012 at 4:46 PM
Edited Aug 23, 2012 at 4:48 PM

Here is a blog post I made about this, for those who are interested.