Empty enumerables initialized to null by DefaultModelBinder

As I mentioned before, ASP.NET MVC5's DefaultModelBinder has some quirks. The latest one I ran into a few days ago is the following.

public class MyViewModel
{
    public IEnumerable<int> IntList { get; set; }
}

What happens when You call Your action method with this JSON request?

{ "IntList":[] }

I would expect to find an empty IEnumerable<int> instance in IntList, but I will find null there.
Why? Because the DefaultModelBinder initializes my empty collection to null.

What You can do to avoid this is to write a custom model binder:

public class EmptyEnumerableCapableDefaultModelBinder : DefaultModelBinder
{
    public override object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
    {
        object ret = base.BindModel(controllerContext, bindingContext);

        WorkaroundEmptyEnumerablesInitializedToNullByDefaultModelBinder(ret, controllerContext, bindingContext);

        return ret;
    }

    private void WorkaroundEmptyEnumerablesInitializedToNullByDefaultModelBinder(object model, ControllerContext controllerContext, ModelBindingContext bindingContext)
    {
        if (model != null)
        {
            // workaround for the case when there is an IEnumerable<> member and a "member":[] in the request
            // but the default binder inits the member to null

            var candidateList = bindingContext.PropertyMetadata
                .Where(kvp => bindingContext.PropertyFilter(kvp.Key))
                .Where(kvp => TypeHelper.IsSubclassOf(kvp.Value.ModelType, typeof(IEnumerable<>)))
                .Where(kvp => !bindingContext.ValueProvider.ContainsPrefix(kvp.Key))
                .ToArray();
            if (candidateList.Any())
            {
                if (!controllerContext.HttpContext.Request.ContentType.StartsWith("application/json"))
                {
                    throw new NotImplementedException(controllerContext.HttpContext.Request.ContentType);
                }

                var json = GetJSONRequestBody(controllerContext);

                foreach (var candidate in candidateList)
                {
                    var emptyEnumerablePattern = String.Format("\"{0}\":[]", candidate.Key);
                    if (json.Contains(emptyEnumerablePattern))
                    {
                        var pd = bindingContext.ModelType.GetProperty(candidate.Key);
                        var anEmptyArray = Array.CreateInstance(pd.PropertyType.GetGenericArguments()[0], 0);
                        pd.SetValue(model, anEmptyArray);
                    }
                }
            }
        }
    }

    private string GetJSONRequestBody(ControllerContext controllerContext)
    {
        string ret = null;

        var inputStream = controllerContext.HttpContext.Request.InputStream;
        inputStream.Position = 0;

        using (var sr = new StreamReader(inputStream, controllerContext.HttpContext.Request.ContentEncoding, false, 1024, true))
        {
            ret = sr.ReadToEnd();
        }

        return ret;
    }
}
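
TypeHelper.IsSubclassOf above is a small helper of mine that is not shown here; a minimal sketch of it, as an assumption rather than the original implementation, could be this open-generic-aware check:

internal static class TypeHelper
{
    // Hypothetical sketch: returns true when 'type' is, implements or derives from
    // the open generic 'openGeneric' (e.g. IEnumerable<> or KeyValuePair<,>).
    public static bool IsSubclassOf(Type type, Type openGeneric)
    {
        if (type.IsGenericType && type.GetGenericTypeDefinition() == openGeneric)
        {
            return true;
        }

        foreach (var itf in type.GetInterfaces())
        {
            if (itf.IsGenericType && itf.GetGenericTypeDefinition() == openGeneric)
            {
                return true;
            }
        }

        return type.BaseType != null && IsSubclassOf(type.BaseType, openGeneric);
    }
}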

The point is to check, after DefaultModelBinder has produced its result, whether it missed an empty enumerable during binding.
The value providers available in the context are useless because they simply don't contain our IntList value. Instead, I check the
target viewmodel for possible candidates and look up their values directly in the request body. If an empty value is found in the request, I
patch the binder's result with an empty array instance, which fits into the IEnumerable<T>-typed property.
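
To make this work the binder has to be registered; a minimal sketch, assuming You replace the global default binder in Global.asax.cs:

protected void Application_Start()
{
    // ... routes, bundles and other startup code ...

    ModelBinders.Binders.DefaultBinder = new EmptyEnumerableCapableDefaultModelBinder();
}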

KeyValuePair<,> capable ASP.NET MVC5 model binder

Once upon a day I created a viewmodel class:

public class MyViewModel
{
    public List<KeyValuePair<string, int>> MyList { get; set; }
}

I wanted to use it as a parameter in my MVC action method. The wonderful model binding feature of MVC allows me to do that, and it seemed to work without error.
I got the exact number of key-value pairs in my list property, but the Key and Value props were always null and 0. I repeat: without any error!

After checking DefaultModelBinder's source I realized that it will never work: KeyValuePair<,> is a struct, so assigning it to a variable means a copy, and its members are read-only, so they can be set only during construction. The logic in DefaultModelBinder is different: it creates the model objects, hands them over via variable assignments, evaluates their member values and then assigns those values to the members. There is a workaround implemented inside for Dictionary<,>, but its logic isn't reusable for my situation, because the programmer didn't intend to allow that (private methods), and the logic there smells a bit to me.

There are solutions on the net, but those I found suffer from one common problem: they evaluate Key and Value on their own, which skips some of the goodies of model binding, e.g. validation and model state propagation. Not too good.

Here comes my solution. 🙂

First I created a new default model binder which, in the case of a KeyValuePair<,> model type, calls my BindModelViaKeyValuePairSubstitute from the BindModel method but leaves everything else to the original implementation.

public class KeyValuePairCapableDefaultModelBinder : DefaultModelBinder
{
    public override object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
    {
       object ret = null;

       if (TypeHelper.IsSubclassOf(bindingContext.ModelType, typeof(KeyValuePair<,>)))
       {
           ret = BindModelViaKeyValuePairSubstitute(controllerContext, bindingContext);
       }
       else
       {
           ret = base.BindModel(controllerContext, bindingContext);
       }

       return ret;
    }
}

I created a substitute class which overcomes the limitations: it is not a struct and has writable members.
To make the trick transparent to model binding, the substitute class must contain members of the same name and type as the KeyValuePair<,> we want to handle.


        /// <summary>
        /// KeyValuePair substitute.
        /// </summary>
        private class KeyValuePairSubstitute<TKey, TValue> : KeyValuePairSubstituteBase
        {
            public TKey Key { get { return (TKey)KeyAsObject; } set { KeyAsObject = value; } }
            public TValue Value { get { return (TValue)ValueAsObject; } set { ValueAsObject = value; } }

            public KeyValuePairSubstitute()
            {
                // set values to defaults to avoid NullReferenceExceptions when trying to get
                // an uninitialized null value from a generic type which cannot stand that (e.g. int).
                this.Key = default(TKey);
                this.Value = default(TValue);
            }
        }
        
        /// <summary>
        /// Base class for KeyValuePair substitute to allow access to generic values in handy way.
        /// </summary>
        private class KeyValuePairSubstituteBase
        {
            public object KeyAsObject { get; set; }
            public object ValueAsObject { get; set; }
        }

Now my BindModelViaKeyValuePairSubstitute is trivial.
The logic here is to let DefaultModelBinder bind our substitute object instead of a KeyValuePair<,> and then instantiate a KeyValuePair<,> from that object’s content.

        /// <summary>
        /// Default BindModel call doesn't handle KeyValuePair well, because it is a struct and has readonly props.
        /// It will return an instance with default values without any error!
        /// </summary>
        private object BindModelViaKeyValuePairSubstitute(ControllerContext controllerContext, ModelBindingContext bindingContext)
        {
            object ret = null;

            var keyValuePairSubstituteGeneric = typeof(KeyValuePairSubstitute<,>).MakeGenericType(bindingContext.ModelType.GetGenericArguments());

            var kvpBindingContext = new ModelBindingContext()
            {
                ModelMetadata = ModelMetadataProviders.Current.GetMetadataForType(null, keyValuePairSubstituteGeneric),
                ModelName = bindingContext.ModelName,
                ModelState = bindingContext.ModelState,
                ValueProvider = bindingContext.ValueProvider
            };

            var keyValuePairSubstitute = (KeyValuePairSubstituteBase)base.BindModel(controllerContext, kvpBindingContext);

            ret = Activator.CreateInstance(bindingContext.ModelType, keyValuePairSubstitute.KeyAsObject, keyValuePairSubstitute.ValueAsObject);
            return ret;
        }

The last step: the new model binder should be registered in Application_Start as usual:

            ModelBinders.Binders.DefaultBinder = new KeyValuePairCapableDefaultModelBinder();

That's all. You have bindable and validatable KeyValuePair<,>s now!
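
For completeness, a hedged sketch of an action consuming such a viewmodel (the controller and action names are made up for illustration):

public class MyController : Controller
{
    [HttpPost]
    public ActionResult Save(MyViewModel model) // MyViewModel from the beginning of this post
    {
        // with the custom binder registered, MyList items carry the posted
        // Key/Value data, and validation errors land in ModelState as usual
        if (!ModelState.IsValid)
        {
            return new HttpStatusCodeResult(400);
        }

        return Json(new { count = model.MyList.Count });
    }
}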

Danger of IEnumerables

IEnumerable and IEnumerable<T> are a good thing:

  • They allow returning a set of values with a minimal contract and behaviour promise. You may later alter the underlying data structure to virtually anything, because nobody was able to use Your retval in a way You didn't intend. For example, if You had used a List instead, Your retval consumers might add or remove items and become coupled to that retval type. See the next point too.
  • They allow returning unchangeable “lists”. Did You ever hunt a bug where Your retval instance contained values which weren't created by Your method?
  • They may be lazy. You shouldn't have to know how Your retval consumers want to use Your data. You may have a resource-eating mapping process to run on thousands of items, but the consumer may only want the First() item!
  • LINQ. The whole thing builds on IEnumerable's features and returns something of this type.
  • Etc. There could be a lot of things.

So I tend to use these as a retval in every place where the only thing I want is to return multiple instances of something.

Now the 50-cent question: will this test run green?

[TestMethod]
public void MyTestMethod()
{
    IEnumerable<MyClass> result = GetMyClassList();
    Assert.AreSame(result.First(), result.First());
}
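
(MyClass itself is not shown in this post; a minimal stand-in that makes the snippets compile could be the following, where the Value property is my assumption.)

public class MyClass
{
    public int Value { get; private set; }

    public MyClass(int value)
    {
        Value = value;
    }
}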

Yes? Are You sure? Sure! The first item of an IEnumerable will always be the same!
Or not?
Let's see GetMyClassList's implementation:

public IEnumerable<MyClass> GetMyClassList()
{
    IEnumerable<MyClass> ret = new MyClass[] { new MyClass(1) };
    return ret;
}

Yes, in the case of this implementation the test becomes green.
But how about this one:


public IEnumerable<MyClass> GetMyClassList()
{
    IEnumerable<MyClass> ret = null;
 
    var someSourceList = new int[] { 1 };
    ret = someSourceList.Select(i => new MyClass(i));
 
    return ret;
}

Now the test becomes red!

Why?

Because IEnumerable promises only sequential access to items.

To the items it contains.

In the first case these are the items of a fixed array.

But in the second case the items are the values returned by a LINQ projection, which contains a mapping body.

When I call First() twice, the IEnumerable's enumerator can only be recreated or Reset() and the evaluation starts over. So new MyClass(i) will run again and again, resulting in different instances and a failing test. And the resource-friendly lazy evaluation may quickly turn really bad too…

There is nothing new in the above, but in my head the parts of the explanation didn't connect to each other before.

But wait a minute! Does this mean that when I use an IEnumerable I should know about its creation method?!?! This would break the basics of OOP!

No, I shouldn't know anything about it, just remember: IEnumerable only promises sequential access to items!

When I consume it in a way that doesn't require Reset() or enumerator recreation, I need no extra steps:


var firstItem = result.First();

But when the consumption method results in multiple enumeration, I should “fix” its items via explicit enumeration, for example:


var fixedResult = result.ToArray();

That allows You to use IEnumerable in the way it was designed for and saves some ugly moments of Your valuable life. 🙂

Misleading message

What would You do if You got the following result after running a unit test?

Assert.AreEqual failed. Expected:<2016.04.27. 8:22:52>. Actual:<2016.04.27. 8:22:52>.

My head was full of abstractions waiting to be coded, but the above result brought me into an
unexpected state. Everything got suspended and my eyes were scanning the two values character by
character, repeatedly, to find out what the difference was. Nothing!
A quick debug session revealed that the values differ in milliseconds, which are not shown in the message.
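
A hedged reconstruction of the situation (the concrete values are made up):

// the two values differ only in the milliseconds, which the default
// Assert.AreEqual failure message does not show
var expected = new DateTime(2016, 4, 27, 8, 22, 52, 123);
var actual = new DateTime(2016, 4, 27, 8, 22, 52, 456);

// Assert.AreEqual(expected, actual); // fails with two seemingly identical values

// comparing the round-trip ("o") formatted strings makes the difference visible:
Assert.AreEqual(expected.ToString("o"), actual.ToString("o"));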

But what a misleading message! The difference should be emphasized somehow!
Because I lost my concentration, my flow, etc.
It is the same as when You cannot work quietly because somebody always comes to You and asks
something. Anything. Applying the 8th point of the great article I found on Joel's blog years ago
to this situation: such messages break down productivity and should be avoided.

A bug hunting story

Today I found a bug. It was so interesting that I decided to write a longer post about it here.
I created a stripped-down solution with only the classes and methods needed to demonstrate the bug. That is why the story may not seem too realistic.

A long long time ago I needed a dictionary to store some integers with a key which was based on a string but had some other features (not shown here). So I created the MyKey class for this:

[Serializable]
public class MyKey
{
    private string key = null;

    public MyKey(string key)
    {
        if (key == null)
        {
            throw new ArgumentNullException("key");
        }

        this.key = key;
    }

    private int? hashCode = null;
    public override int GetHashCode()
    {
        int ret = 0;

        if (hashCode == null)
        {
            hashCode = this.key.GetHashCode();
        }

        ret = hashCode.Value;

        return ret;
    }

    public override bool Equals(object obj)
    {
        bool ret = false;

        MyKey other = obj as MyKey;
        if (other != null)
        {
            ret = Equals(other);
        }

        return ret;
    }

    public bool Equals(MyKey other)
    {
        bool ret = false;

        if (other != null)
        {
            if (this.hashCode == other.hashCode)
            {
                if (this.key == other.key)
                {
                    ret = true;
                }
            }
        }

        return ret;
    }

    public override string ToString()
    {
        string ret = String.Concat("\"", key, "\"");
        return ret;
    }
}

It was used happily like this:

// create data
var data = new Dictionary<MyKey, int>();
data[new MyKey("alma")] = 1;

Later I wrote some code to persist these data via serialization.
Everything was working like a charm.

// serialize and save it
var serializedData = Serializer.Serialize(data);
SaveToFile(serializedData);

...

// load and deserialize data
var serializedData = LoadFromFile();
var data = Serializer.Deserialize(serializedData);
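
(Serializer, SaveToFile and LoadFromFile are my own helpers and are not shown here; a minimal BinaryFormatter-based sketch of Serializer, as an assumption, could be the following. It needs System.IO and System.Runtime.Serialization.Formatters.Binary.)

public static class Serializer
{
    // hypothetical sketch; the original implementation is not shown in this post
    public static byte[] Serialize(Dictionary<MyKey, int> data)
    {
        using (var ms = new MemoryStream())
        {
            new BinaryFormatter().Serialize(ms, data);
            return ms.ToArray();
        }
    }

    public static Dictionary<MyKey, int> Deserialize(byte[] serializedData)
    {
        using (var ms = new MemoryStream(serializedData))
        {
            return (Dictionary<MyKey, int>)new BinaryFormatter().Deserialize(ms);
        }
    }
}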

There was a use case where, after deserialization, some of the values in data had to be changed:

// as in deserialized data
var specificKey = new MyKey("alma");
if (data[specificKey] == 1) // a KeyNotFoundException occurs here!
{
    data[specificKey] = 2;
}

KeyNotFoundException? I was sure that there should be a value in all of the data instances with the given key! Let's see it in QuickView:

There is an “alma” key!
Let’s comment out the line causing the exception and check data after the expected value modification to “2”:

Much more interesting, isn't it?
I quickly put all the data creation, serialization and deserialization code into one unit test to have a working chunk of code I could use for bug hunting:

[TestMethod]
public void TestMethod1()
{
    var d = new Dictionary<MyKey, int>();
    d[new MyKey("alma")] = 1;

    var serialized = Serializer.Serialize(d);

    var data = Serializer.Deserialize(serialized);

    var specificKey = new MyKey("alma");
    if (data[specificKey] == 1)
    {
        data[specificKey] = 2;
    }
}

But in the unit test everything was working! I simply couldn't reproduce the bug that way.
But when running App1, which created and serialized the data, and then App2, which deserialized and modified it, the bug always presented itself.
How can there be a duplicate key in a Dictionary<,>? MyKey's implementation, especially the Equals() override, is so trivial that it cannot allow two instances created from the
same string to be unequal.

But wait a minute!

How can the hashCodes differ?!?!?!

Yes. A quick search on the net answered everything. MSDN clearly describes it in a big “Important” box:

The hash code itself is not guaranteed to be stable. Hash codes for identical strings can differ across versions of the .NET Framework and across platforms (such as 32-bit and 64-bit) for a single version of the .NET Framework. In some cases, they can even differ by application domain.

As a result, hash codes should never be used outside of the application domain in which they were created, they should never be used as key fields in a collection, and they should never be persisted.

App1 was running in an x86 and App2 in an x64 environment. That's why the string hash codes differed.

The fix is really easy. Just turn off the hashCode caching optimization for serialization:

[Serializable]
public class MyKey
{
   ...

   [NonSerialized]
   private int? hashCode = null;
   ...
}

Now hashCode will be recalculated (once) in every runtime environment.

I had never thought about the possibility of unstable hash codes.
I hope I am not the only man in the world with such a wasteful brain.

CLR20r3 FileLoadException and Why Keep Settings from .config?

Today one of my colleagues tried to start a newer version of one of our .NET tools on her Win7 computer; the tool had just been copied from the deployment share.

The app didn't start, only Windows Error Reporting was doing something in the systray. In the event log there was an error 22 with CLR20r3, mentioning a FileLoadException. P4's value: PresentationFramework. A short check of the installed frameworks, etc.: everything seemed to be fine. Nothing useful was found in the generated WER file either.

The tool was running well on other machines. What happened with hers?

In the app's .config file there were custom app settings in the appSettings section; that's why we kept the original file from the previous installation. The problem was that the .config also contained an assemblyBinding section with a bindingRedirect:

<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="AnAssembly" publicKeyToken="123123123123" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-1.0.33334.0" newVersion="1.0.33334.0" />
      </dependentAssembly>
     ...

The new build had a newer version of AnAssembly, which wasn't used because of the bindingRedirect above! The new .config had the updated version numbers, but we overwrote it with the previous version of the file because we wanted to keep the correct appSettings values. It is very handy to use the appSettings section in the .config, but it is a bad idea, because the framework configuration is kept in the same place. Keeping them separate seems a better idea.
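
One way to keep them separate is the file attribute of appSettings: the user-editable values live in their own file which survives upgrades, while the main .config always ships fresh with the release. A sketch (the file name is an assumption):

<configuration>
  <!-- user-edited values are kept in a separate file between installations -->
  <appSettings file="user.appSettings.config" />
  <runtime>
    <!-- assemblyBinding etc. always comes from the freshly deployed .config -->
    ...
  </runtime>
</configuration>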

PS: What about PresentationFramework in the P4 value? Completely misleading info…

LINQ to Object: Skip() performance

Assume You have a List of 5,000,000 items. You want to do some paging, so You need e.g. 100 items from offset 300,000. Check these two implementations:

        public List<T> GetItems1<T>(List<T> source, int offset, int count)
        {
            List<T> ret = new List<T>();

            for (int i = offset; i < offset + count; i++)
            {
                ret.Add(source[i]);
            }

            return ret;
        }

        public List<T> GetItems2<T>(List<T> source, int offset, int count)
        {
            List<T> ret = source.Skip(offset).Take(count).ToList();

            return ret;
        }

What do You think, which one performs better? You may say the second one, of course. The first one indexes into the source list and calls the Add() method count times. The second one simply enumerates once up to offset, then returns count items as a new List, with the possibility of internal item-addition optimization. At least that was what I thought.

But I was wrong!

The second implementation is always slower. The magnitude depends on the offset value, but it is always slower!

offset    GetItems1  GetItems2   (average Stopwatch ticks)
0         43         65
10000     59         729
100000    44         5162
1000000   42         52057
3000000   44         147608

The reason lies in the implementation details of List and Enumerable.Skip(). The first one knows where to find the nth item; the second one has to enumerate its way to it. The conclusion, as one of my colleagues pointed out: use tools as specialized as You can.
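
If You still want a LINQ-shaped helper that keeps the indexer advantage, a hedged sketch could look like this (PagingHelper and FastSkipTake are made-up names, and it assumes LINQ to Objects' Skip() does not special-case IList<T>, as in the .NET Framework used for this test):

public static class PagingHelper
{
    // hypothetical helper: uses the indexer when the source supports random access,
    // otherwise falls back to plain LINQ
    public static IEnumerable<T> FastSkipTake<T>(this IEnumerable<T> source, int offset, int count)
    {
        var list = source as IList<T>;
        if (list != null)
        {
            // random access: jump directly to the offset instead of enumerating to it
            var max = Math.Min(offset + count, list.Count);
            for (int i = offset; i < max; i++)
            {
                yield return list[i];
            }
        }
        else
        {
            // no random access: fall back to plain enumeration
            foreach (var item in source.Skip(offset).Take(count))
            {
                yield return item;
            }
        }
    }
}

Usage would then be e.g. var page = source.FastSkipTake(offset, count).ToList();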

The code I used for the results above:

    [TestClass]
    public class Test
    {
        private int cycles = 100;
        private int offset = 0;
        private int count = 100;

        [TestMethod]
        public void perftest1()
        {
            var l = GetTestData();

            var sw = new Stopwatch();

            double r = 0;
            for (int i = 0; i < cycles; i++)
            {
                sw.Reset();
                sw.Start();
                var result = GetItems1(l, offset, count);
                sw.Stop();
                r += sw.ElapsedTicks;
            }

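            // Assert.Fail is (ab)used here only to report the measured average (Stopwatch ticks) in the test output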
            Assert.Fail((r / cycles).ToString());
        }


        [TestMethod]
        public void perftest2()
        {
            var l = GetTestData();

            var sw = new Stopwatch();

            double r = 0;
            for (int i = 0; i < cycles; i++)
            {
                sw.Reset();
                sw.Start();
                var result = GetItems2(l, offset, count);
                sw.Stop();
                r += sw.ElapsedTicks;
            }

            Assert.Fail((r / cycles).ToString());
        }


        public List<T> GetItems1<T>(List<T> source, int offset, int count)
        {
            ...
        }


        public List<T> GetItems2<T>(List<T> source, int offset, int count)
        {
            ...
        }

        private List<int> GetTestData()
        {
            List<int> ret = new List<int>();

            for (int i = 0; i < 5000000; i++)
            {
                ret.Add(i);
            }

            return ret;
        }
    }

NuGet silently overwrites some files

On package update NuGet silently overwrites some files. Consider You have something like this in Your nuspec file:

    <file src="MyPackage\App.xaml.cs.pp" target="content\App.xaml.cs.pp" />
    <file src="MyPackage\App.xaml.pp" target="content\App.xaml.pp" />

Consider the user changed App.xaml.cs but didn't change App.xaml. In this case, when the user updates Your package, the App.xaml.cs file will be silently overwritten!

After a short search session on the net I found the same problem reported without any solution. So here is my install.ps1 with a workaround:

param($installPath, $toolsPath, $package, $project)
$item = $project.ProjectItems.Item("App.xaml")
$appXamlFileName = $item.Properties.Item("FullPath").Value
Add-Content $appXamlFileName "<!-- this comment added by nuget package's install script to modify this file to work around the bug which overwrites the App.xaml.cs file on the next package update if this one isn't modified -->"

On update, NuGet will detect that App.xaml has changed and will skip it AND the App.xaml.cs file too!

Howto debug NuGet package’s install.ps1

While searching for a solution around install.ps1 I quickly realised that the edit file -> create package -> publish -> update target cycle isn't very handy. There are step-by-step instructions on the net on how to debug install.ps1, but when I tried to use them I ran into new problems with incompatibilities between my packages and the NuGet PowerShell cmdlets.

So I used a poor man's solution instead:

  1. Add some dummy content file to Your package.
  2. Install Your package.
  3. Edit that dummy file (each subsequent package install cycle will ask You about overwriting it, and that is exactly what we need!)
  4. Open the Package Manager Console (PMC) in VS.
  5. Execute “Set-PsDebug -trace 2”. It will help You later.
  6. Open install.ps1 from the target project's packages directory in some editor (I used PowerShell ISE).
  7. In PMC run “update-package _yourpackagename_ -reinstall”. The installation will stop at the file overwrite question.
  8. In the PS editor edit and save install.ps1, but don't close it.
  9. Back in PMC, answer the overwrite question (with ‘L’ or ‘N’ to keep the modified file).
  10. See the results.
  11. Go to 7.

Not too smart, it doesn't use tools and libraries, but it works with an acceptable cycle time.

Setting project item’s BuildAction from NuGet package

I created a package for internal use which has an App.xaml file in its content. Naturally, I would like to find it in the target project with BuildAction set to “ApplicationDefinition”, but Visual Studio treats it as “Page”, because that is the default for the .xaml extension.

I found a hopeful solution here promising a fast track. My first version of install.ps1 was this:

param($installPath, $toolsPath, $package, $project)

$item = $project.ProjectItems.Item("App.xaml")
$item.Properties.Item("BuildAction").Value = ???

The problem arose when I couldn't find the “ApplicationDefinition” value in the prjBuildAction enumeration on MSDN…

I found some similar examples on the net. Some of them have a comment with a question: what to do if someone wants to set a value that is not in the enumeration? None of those comments has an answer, which distressed me a bit.

Here I found a clue, so I tried to enumerate those undefined prjBuildAction values with this code:

param($installPath, $toolsPath, $package, $project)

Add-Type -AssemblyName 'Microsoft.Build.Engine'
$msbuildproject = new-object Microsoft.Build.BuildEngine.Project
$msbuildproject.Load($project.FullName)

[System.Collections.ArrayList]$buildActions = "None", "Compile", "Content","EmbeddedResource"

$msbuildproject.ItemGroups | Where-Object { $_.Name -eq "AvailableItemName" } | Select-Object -Property "Include" | ForEach-Object {
  $act = $_
  $buildActions.Add($act)
}

$item = $project.ProjectItems.Item("App.xaml")
$item.Properties.Item("BuildAction").Value = [int]$buildActions.IndexOf("ApplicationDefinition")

I don't know why, but the enumeration of the ItemGroups didn't work. When I did a

Write-Host ($msbuildproject.ItemGroups | Format-List | Out-String)

it showed me a nice list of BuildItems, but when I ran a Where-Object against it I found nothing. The problem must be in the PS syntax or the object instances. I don't maintain my PS knowledge, which is based on my .NET and Linux scripting practice combined with looking for snippets in code examples around the net. I simply don't want to go deeper into it, because I feel PS is something “created” and not “born”, if You understand what I mean.

I rewrote the script to get the available BuildAction values as follows:

...
$msbuildproject.ItemGroups | ForEach-Object {   
    $ig = $_
    @($ig.GetEnumerator()) | ForEach-Object { 
        $i = $_
        if ($i.Name -eq "AvailableItemName")
        {
            $buildActions.Add($i.Include);
        }
    }
}
...

And voilà, I got a nice list of values in $buildActions:

None
Compile
Content
EmbeddedResource
CodeAnalysisDictionary
ApplicationDefinition
Page
Resource
SplashScreen
DesignData
DesignDataWithDesignTimeCreatableTypes
EntityDeploy
XamlAppDef

My $idx became 5. I checked the value in Solution Explorer… and found “CodeAnalysisDictionary” there. No problem, it must be some 0/1-based indexing problem, let's set $idx+1, run, check, value okay! Let's try something else just to be sure… again a bad value! Unfortunately, it seems the order of values collected with this algorithm isn't as good as the link I mentioned above implies. Back to the start line.

While looking for solutions I found something somewhere else about ProjectItem's “ItemType” property, so I tried to play with it. And suddenly the Sun rose, the sky became blue, etc.:

param($installPath, $toolsPath, $package, $project)
$item = $project.ProjectItems.Item("App.xaml")
$item.Properties.Item("ItemType").Value = "ApplicationDefinition"

So simple, and it works!