Random Ramblings

VS 2019 Project upgrade

About a year ago my team inherited a somewhat neglected solution: a VS 2010 solution with VS 2008 projects, a "Dependencies" folder containing reference assemblies such as Newtonsoft.Json.dll v.6.5 - the usual. The previous owner couldn't let us use their build system either, so naturally we had to do something about that too.

A little googling led me to CsprojToVs2017, which also lets you upgrade to VS 2019. Only about 100 projects in the solution, so how long could it take? Just over a month on a separate branch - which sucks, because every little change on "develop" needs to be merged back, and when your csproj file has changed, you are out of luck. You have to diff against develop to find the actual change, translate it to the new 2019 format and merge it by hand. Another sucky thing is how msbuild behaves when confronted with a mixed set of csproj versions: phantom cyclic project dependencies are one such symptom, and they don't go away until you convert every single project in your dependency graph.

Lesson 1: Don't do a prolonged solution conversion - assign the whole team to the conversion and finish it in one sitting. Failing that, make sure to merge develop back into your branch frequently.

Which takes us to the next hard-earned lesson: with the VS 2019 project format, the default NuGet restore puts a "project.assets.json" file in your /obj folder. VS 2008-VS 2015 projects do something entirely different, so every time a project was upgraded, everybody needed to clean and restore the affected project. The build system (in our case Jenkins) also needs to be instructed to clean (naturally) - and the restore that runs as part of the build target is not sufficient, so I inserted a separate msbuild /t:Restore step in the build pipeline.
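In the Jenkins job that boiled down to something like this (solution name invented for illustration):

```
msbuild OurSolution.sln /t:Clean
msbuild OurSolution.sln /t:Restore
msbuild OurSolution.sln /t:Build /p:Configuration=Release
```

The point is that the restore runs as its own msbuild invocation before the build target, rather than relying on restore-during-build.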

Also, as part of the build pipeline, we're building a WebApplication (NetFX) project. These are not upgradable to the new format, so of course building this requires us to nuget restore this project on its own.
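That extra step is just a plain package restore against the web project (path invented for illustration):

```
nuget restore src\MyWebApplication\packages.config -PackagesDirectory packages
```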

Lesson 2: NuGet behavior changes, and solutions with mixed project formats need a little tender love and care. "ls obj -Recurse | rm -Force -Recurse" was heavily used.

MSBuild has an option to specify an alternative /obj folder - "/p:BaseIntermediateOutputPath=c:\foo" - which is useful if you're building multiple projects in your pipeline and want to make sure that dependent assets aren't reused. Unfortunately this does not play nice with

nuget restore / dotnet restore

The reason is that packages are restored to the project's own /obj folder and not to BaseIntermediateOutputPath - which means that your compilation will be missing its dependencies. The symptom is a missing project.assets.json file reported by msbuild. There are many hints and hacks if you google it. Ultimately I had to abandon BaseIntermediateOutputPath altogether.
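For reference, this is roughly the shape of the invocation that bit us (project name invented): the restore writes its output under the project's own /obj, while the build then looks for it under the redirected path.

```
msbuild MyProject.csproj /t:Restore
msbuild MyProject.csproj /t:Build /p:BaseIntermediateOutputPath=c:\foo\
```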

Lesson 3: VS 2019 csproj and dotnet restore do not play nice with MSBuild customizations - go for simplicity in your build pipeline.

XmlSerializer namespaces

XML is a data language - it allows you to describe and carry data in one file, which is awesome for inter-system communication. Another nice feature is the ability to version your data. This is done using namespaces.

<myRoot xmlns="http://porse.org/data/2019/01/11">
    <myElement />
    <yourElement />
    <herElement />
</myRoot>
See, all the data belongs to the namespace http://porse.org/data/2019/01/11 so I know what to expect - especially if I also produce a schema definition.

So in a month's time or so, I'll probably release v2, which introduces the field <ourElement />. The other elements are unaltered, so I could do something like the following:

<myRoot xmlns="http://porse.org/data/2019/02/03" xmlns:v1="http://porse.org/data/2019/01/11">
    <v1:myElement />
    <v1:yourElement />
    <v1:herElement />
    <ourElement />
</myRoot>

Notice that the "default" namespace changed to /2019/02/03, v1 was explicitly named, and I reused the my-/your-/her- elements from v1 in my "new" myRoot type. And here's the point of my post: it's not obvious how to make System.Xml.Serialization.XmlSerializer produce the output above.

Let's assume we write these classes and have XmlSerializer serialize an instance:


[XmlType(Namespace = "http://porse.org/data/2019/01/11")]
public class Data { ... }

[XmlType(Namespace = "http://porse.org/data/2019/02/03")]
public class Data2 { ... }

[XmlRoot("myRoot", Namespace = "http://porse.org/data/2019/02/03")]
public class MyRoot
{
    [XmlElement("myElement")]
    public Data My { get; set; }

    [XmlElement("yourElement")]
    public Data Your { get; set; }

    [XmlElement("herElement")]
    public Data Her { get; set; }

    [XmlElement("ourElement")]
    public Data2 Our { get; set; }
}

// outStream and instanceOfMyRoot declared elsewhere
XmlSerializer ser = new XmlSerializer(typeof(MyRoot));
ser.Serialize(outStream, instanceOfMyRoot);

The result would be something like this:

<myRoot xmlns="http://porse.org/data/2019/02/03">
    <myElement xmlns="http://porse.org/data/2019/01/11" />
    <yourElement xmlns="http://porse.org/data/2019/01/11" />
    <herElement xmlns="http://porse.org/data/2019/01/11" />
    <ourElement />
</myRoot>

Which is technically correct, but it's hard on the eyes and the bandwidth. To make it behave you need to tell the serializer that you have more namespaces in the mix. For ad-hoc serialization you can pass an XmlSerializerNamespaces instance to the Serialize overload that accepts one. But if, for instance, you are calling web services using svcutil-generated classes, they will call the serializer without it.

Luckily, you can also declare the namespaces in code, on the type itself:

[XmlRoot("myRoot", Namespace = "http://porse.org/data/2019/02/03")]
public class MyRoot
{
    [XmlNamespaceDeclarations]
    public XmlSerializerNamespaces MyCustomNamespaces;

    public MyRoot()
    {
        MyCustomNamespaces = new XmlSerializerNamespaces();
        MyCustomNamespaces.Add("v1", "http://porse.org/data/2019/01/11");
    }

    // ... etc
}

When the serializer happens upon a MyRoot instance, it knows to look for a field or property marked with the XmlNamespaceDeclarationsAttribute and emits those declarations on the root element. The MyCustomNamespaces field itself will not be serialized to the output, and we'll save 114 bytes in the transmission and gain a lot of readability.
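To see the attribute in action in isolation, here is a minimal, self-contained sketch - the types and namespace URIs are invented for illustration, not the classes from the post:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// A root type that carries its own prefix declarations via
// [XmlNamespaceDeclarations], so every serialization - including ones
// we don't control - emits the compact, prefixed form.
[XmlRoot("demoRoot", Namespace = "http://example.org/v2")]
public class DemoRoot
{
    [XmlNamespaceDeclarations]
    public XmlSerializerNamespaces Namespaces;

    public DemoRoot()
    {
        Namespaces = new XmlSerializerNamespaces();
        Namespaces.Add("v1", "http://example.org/v1"); // declare the old namespace up front
    }

    // An element kept in the v1 namespace across versions.
    [XmlElement("oldElement", Namespace = "http://example.org/v1")]
    public string Old { get; set; }

    // A new element in the current (default) namespace.
    [XmlElement("newElement")]
    public string New { get; set; }
}

public static class Demo
{
    public static string Serialize()
    {
        var serializer = new XmlSerializer(typeof(DemoRoot));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, new DemoRoot { Old = "a", New = "b" });
            return writer.ToString();
        }
    }
}
```

The resulting document declares xmlns:v1 once on the root and writes the old element as v1:oldElement, instead of repeating the full namespace URI on every child.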





Getting to grips with RX

Reactive Extensions - Rx for short - is a pretty neat little framework by Erik Meijer. I've sort of been avoiding it - don't really know why - but Copenhagen .Net User Group (CNUG) recently had Tamir Dresher do an introduction to the tech (and blatantly plug his upcoming book on the subject), and it was a bit of an eye-opener, so I've decided to dedicate some time to getting to the bottom of this - and blog about it.

The Limitations of IEnumerable

We all know IEnumerable and its trusty sidekicks, foreach(var foo in bar) and .Select/.Where/.Any/.All/etc. It's wonderfully convenient to have an interface where going through a collection/set/array/list is a matter of letting the compiler do its magic.

foreach (var foo in bar)
{
    // do stuff with foo
}

But what if bar contains something exotic, like a bunch of Task<T>s that are currently crunching away on healthy portions of data? We can't really be sure when any one of them will finish, and we certainly can't count on them finishing in the order we enumerate them, so we will be wasting time waiting - even if we employ Parallel.ForEach.

Ideally we would like to be able to respond to the tasks completing as events, such that when the first Task completes, we immediately react to it. We would also like to know when the last Task completes, in order to shut down gracefully.
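Without Rx, one way to get "react as they complete" behavior is a Task.WhenAny loop - a minimal sketch using only the TPL, with the method name invented for illustration:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

public static class CompletionOrder
{
    // Await a batch of tasks in the order they actually complete,
    // not the order they were started in. Returns results in completion order.
    public static async Task<List<int>> AwaitAsCompleted(List<Task<int>> tasks)
    {
        var pending = new List<Task<int>>(tasks);
        var results = new List<int>();
        while (pending.Count > 0)
        {
            // WhenAny hands us whichever task finishes first...
            Task<int> done = await Task.WhenAny(pending);
            pending.Remove(done);
            // ...so we can react to its result immediately.
            results.Add(await done);
        }
        // When the loop exits, the last task has completed - a natural
        // place to shut down gracefully.
        return results;
    }
}
```

It works, but the bookkeeping (pending list, loop, removal) is exactly the kind of plumbing Rx makes disappear.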

Another scenario: what if bar is in fact not a collection with a fixed number of elements? Think a mailbox or maybe a performance counter. New elements keep popping into it, which makes foreach-ing over the collection sort of impossible. We're forced to employ a different strategy, probably involving queues and OnNew events we need to subscribe to - and suddenly we have to keep track of which threads are accessing which parts of the application.

IObservable to the Rescue

Rx is an implementation of the Observer pattern (hence Reactive) with Linq-style operators (the eXtensions). IObservable is mathematically dual to IEnumerable, which could be translated into something like 'the same, but seen from the other side': instead of pulling foos out of bar, you let bar push foos out to you.

In the case of an IObservable of Task<T>, the result would be to get the Tasks in the order they complete. In the case of the mailbox or performance counter, the IObservable implementation would simply respond to additions by emitting them to anyone listening without further notice.

bar.Subscribe(foo => /* do stuff with foo */);

Oh, and don't get me wrong - it's not really that simple. But the gist of it is this: Rx provides a pattern for solving concurrency issues in a short and elegant way.

I'll be examining Rx in detail over the next couple of weeks.



The NUnit upgrade

Recently we decided that an upgrade was long overdue - our unit-test project was using NUnit 2.6.4, and v3.4.1 had been around for a couple of weeks.

So, we did what anyone would and let NuGet do the heavy lifting. Of course there were a few breaking changes, but nothing we couldn't handle. 30 minutes and 8 commits later, we were officially back in 2016. But TeamCity didn't agree. It seemed that after an NUnit run, the nunit-agent.exe process was never killed off, keeping pesky references to the files in the bin folder and thus preventing the next build from succeeding.

So, a bit of digging, and we found this: https://github.com/nunit/nunit-console/issues/43. TL;DR: the above happens if nunit-console output is redirected to anything but stdout - which TeamCity does. The fix will probably land in NUnit 3.5, but they're taking it seriously.

Screengrab from github.com/nunit/nunit-console/issues/43

Oh well. Back to 2.6.4 - and now I've written this to remind me why our unit-test project was left behind in 2015.

Update 27 Nov 2016:
NUnit released v3.5 recently, and we've tried it out - it still doesn't work for us. We get intermittent "AppDomainUnloadExceptions", presumably because something in our test fixtures is being kept alive longer than the unload timeout.


Copyright © 2020 - Design by Francis