Tuesday, May 27, 2014

Nerd Food: Using Mono In Anger - Part III


In which we discuss the various libraries and tools used.

This is the third part of a series of posts on my experiences using Mono for a fairly demanding project. For more context please read part 1 and part 2.

In this instalment we shall focus more on the libraries, tools and technologies that I ended up using.


Castle

I've mentioned Castle a few times already. It appears to be the de facto IoC container for .Net, so it's very important to have a good story around it. As I explained in the previous post, I added NuGet references to Castle Core and Castle Windsor and after that it was pretty much smooth sailing. I set up Windsor Installers as described by Mark Seemann in his post IWindsorInstaller, and that worked as described. My main program does exactly the same as Mark's:

var container = new WindsorContainer();
container.Install(new MyModule.WindsorInstaller(), new OtherModule.WindsorInstaller());
return container.Resolve<IEntryPoint>();

Basically, I have a number of IWindsorInstallers (e.g. MyModule.WindsorInstaller) that get installed, and then all that needs to be done is to resolve the "entry point" for the app - i.e. whatever your main workflow is.

All of this worked out of the box without any tweaking from my part.
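For completeness, here is a minimal sketch of what one of those installers looks like; IMyService and MyService are placeholder names for illustration, not types from my actual code:

```csharp
using Castle.MicroKernel.Registration;
using Castle.MicroKernel.SubSystems.Configuration;
using Castle.Windsor;

namespace MyModule
{
    // Placeholder component, purely for illustration.
    public interface IMyService { }
    public class MyService : IMyService { }

    // Each module exposes one of these; the main program just Install()s them.
    public class WindsorInstaller : IWindsorInstaller
    {
        public void Install(IWindsorContainer container, IConfigurationStore store)
        {
            container.Register(
                Component.For<IMyService>()
                         .ImplementedBy<MyService>()
                         .LifestyleSingleton());
        }
    }
}
```

The nice property of this arrangement is that each module owns its own registrations, and the main program never needs to know about concrete types.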


MongoDB

I've used MongoDB as the store for my test tool; I'll give a bit of context before I get into the Mono aspects. Mentally, I place MongoDB somewhere in between PostgreSQL and Coherence / Memcached. That is, it's obviously not a relational database but one of those NoSQL specials: a schemaless, persistent, document database. You can do a lot of this stuff using hstore, of course, and Postgres now even sports something similar-but-not-quite-the-same-as BSON - JSONB, in the usual humorous Postgres way. MongoDB's setup is somewhat easier than Postgres's, in both replicated and non-replicated scenarios. It also offers JavaScript-based querying which, to be fair, Postgres also does. I'd say that, if you have to choose between the two, go for MongoDB if you need a quick setup (replication included), if you don't care too much about security and if you do not need any RDBMS support. Otherwise, use the latest Postgres. And RTM. A lot.

MongoDB is obviously also much easier to set up than Coherence. Of course, if you go for the trivial setup, Coherence is easy; but once you get into proper distributed setups I found it to be an absolute nightmare, requiring a lot of expertise just to understand why your data has been evicted. That's excluding the more complex scenarios such as invocation services, backing maps and so on. Sure, you can get the performance and the scalability, but you really need to know what you are doing. And let's not mention the licence costs. Basically, for the plain in-memory cache job with an easy setup, just use Memcached.

But let's get back to MongoDB. Regrettably, there are no packages for it in Debian Testing, but the wiki has a rather straightforward set of instructions under Install MongoDB on Debian. It boils down to:

# apt-key adv --keyserver keyserver.ubuntu.com --recv 7F0CEB10
# echo 'deb http://downloads-distro.mongodb.org/repo/debian-sysvinit dist 10gen' | tee /etc/apt/sources.list.d/mongodb.list
# apt-get update
# apt-get install mongodb-org

Since I'm using systemd I was a bit apprehensive about their control scripts. As it turns out, they worked out of the box without any problems. I did find the installation to vary between machines: on some I got journaling by default, but on my really low-end netbook it was disabled. The other thing to bear in mind is that if you have a small root or var partition - i.e. the one storing /var/lib/mongodb - you may run into trouble. I ended up symlinking this directory to a drive that had more space just to avoid problems.
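The symlinking itself is the usual dance - stop the daemon, move the data, link it back. The target path here is illustrative, and the service name may differ on your setup:

```shell
# Stop mongod, relocate its data directory to a roomier drive,
# and leave a symlink behind so the configuration needs no changes.
sudo service mongodb stop
sudo mv /var/lib/mongodb /srv/bigdisk/mongodb
sudo ln -s /srv/bigdisk/mongodb /var/lib/mongodb
sudo service mongodb start
```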

Once MongoDB was up and running, it was time to find a management UI. Unfortunately, MongoVue - the UI that all the Windows cool kids use - is not available on Linux. This is a bit disappointing because it seems rather full-featured and well funded and - just to rub salt in the wounds - it's a .Net application. The old lack of cross-platform mentality surfaces yet again. Undeterred once more, I settled on Robomongo instead. Not quite as mature, but it seemed good enough for my needs. Simple to set up too:

$ wget -O robomongo-0.8.4-x86_64.deb http://robomongo.org/files/linux/robomongo-0.8.4-x86_64.deb
$ gdebi-gtk robomongo-0.8.4-x86_64.deb

If you don't have gdebi-gtk, any other Debian package installer would do, including dpkg -i robomongo-0.8.4-x86_64.deb.

If you are an Emacs user, be sure to install the inferior mode for Mongo. It works well on Linux but has the usual strange input-consumption problems one always gets on Windows.

Going back to Mono, all one needs to do is use NuGet to install the CSharp Mongo Driver. Once that was done, reading, writing, updating and so on all worked out of the box.
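As a quick sanity check, something along these lines exercises the basics with the 1.x driver of the time; the database and collection names are made up:

```csharp
using MongoDB.Bson;
using MongoDB.Driver;
using MongoDB.Driver.Builders;

public static class MongoSmokeTest
{
    public static void Run()
    {
        // GetServer() is the 1.x-era entry point into the driver.
        var client = new MongoClient("mongodb://localhost:27017");
        var server = client.GetServer();
        var database = server.GetDatabase("test");
        var collection = database.GetCollection<BsonDocument>("things");

        // Round-trip a trivial document.
        collection.Insert(new BsonDocument { { "name", "widget" }, { "count", 1 } });
        var found = collection.FindOne(Query.EQ("name", "widget"));
    }
}
```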


log4net

Paradoxically, where I thought I would have the least amount of trouble ended up being the most troublesome of all of my dependencies. Getting log4net to work was initially really easy - the usual NuGet install. But then, not happy with such easy success, I decided I needed a single log4net.config file for all my projects. This is understandable since all that differed amongst them was the log file name; it seemed a bit silly to have lots of copy-and-paste XML lying around. So I decided to use Dynamic Properties, as explained in this blog post: Log4net Dynamic Properties in XML Configuration. This failed miserably.

As everyone knows, log4net is a pain in the backside to debug. For the longest time I didn't have the right configuration; eventually I figured out what I was doing wrong. It turns out the magic incantation is this (I had missed the type bit):

        <appender name="RollingFileAppender" type="log4net.Appender.RollingFileAppender">
            <file type="log4net.Util.PatternString" value="APrefix.%property{ApplicationId}.log" />
            ...
        </appender>
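For the %property{ApplicationId} token to resolve, the property has to be set before log4net is configured. A minimal sketch, assuming the property name matches the one used in the config file:

```csharp
using System.IO;
using log4net;
using log4net.Config;

public static class LoggingBootstrap
{
    public static void Configure(string applicationId)
    {
        // Must happen before Configure(), or the PatternString will not
        // see the property when it expands the log file name.
        GlobalContext.Properties["ApplicationId"] = applicationId;
        XmlConfigurator.Configure(new FileInfo("log4net.config"));
    }
}
```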

Just when I thought I was out of the woods, I hit a Mono limitation: CallContext.LogicalGetData is not yet implemented in Mono 3.0. It is available in later versions of Mono, but these are not yet in Debian Testing. Undeterred, I decided to try compiling Mono from scratch. It turned out to be rather straightforward:

$ git clone https://github.com/mono/mono
$ cd mono
$ ./autogen.sh --prefix=${YOUR_INSTALL_LOCATION}
$ make -j${NUMBER_OF_CORES}
$ make install

Replace (or set) ${YOUR_INSTALL_LOCATION} and ${NUMBER_OF_CORES} as required. Once you have it installed, you need to tell MonoDevelop about the new runtime: go to Edit, Preferences, choose .Net Runtimes and click Add. Point it at the top-level directory containing your installation (i.e. ${YOUR_INSTALL_LOCATION}) and it should find the newly built Mono. I then set that as my default. Incredibly enough, from then on it all just worked.


Runtimes in MonoDevelop


NUnit

As mentioned in the previous post, you should replace the NUnit references you get from MonoDevelop with NuGet ones. This is because you may be using some of the newer features of NUnit, which are not available in the version that ships with Mono. At any rate, it gives you more confidence in the dependency rather than depending on the environment.

Another problem I found was disabling shadow copying. This does not seem to be an option in the MonoDevelop UI or in the solution settings. It is rather annoying if you need to have log4net config files in the test directory - as I did, due to the Dynamic Properties mentioned above.
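If you drive the tests from the command line instead, the NUnit 2.x console runner at least lets you switch shadow copying off (assembly name here is illustrative):

```shell
# Disable shadow copying so config files are picked up
# from the test assembly's own directory.
nunit-console /noshadow MyProject.Tests.dll
```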

Other than that, NUnit worked very well.

Libraries Overview

Compiling Mono from source is obviously not ideal, but perhaps the main thing to worry about is how to get the latest Mono packages. As with MongoDB, it would perhaps be better to have a repository supported by the Mono community that offers more up-to-date packages, at least for the more intrepid users. Although some of these existed in the past (particularly Ubuntu PPAs), they all seem to have gone stale.

Having said that, there are still no showstoppers - the code is working on both Visual Studio 2013 and Mono.

Date: 2014-05-27 22:29:38 BST



Scott Findlater said...

Food for thought: rather than having your main project wire up every installer amongst all projects, I have what I like to call the WindsorContainerInitialiser. This is called from the main project and loads all project-related DLLs containing an IWindsorInstaller implementation.
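Castle ships with helpers for exactly this sort of scanning; a sketch of the idea, with an illustrative directory filter:

```csharp
using System;
using Castle.Windsor;
using Castle.Windsor.Installer;

public static class WindsorContainerInitialiser
{
    public static IWindsorContainer Create()
    {
        // Run every IWindsorInstaller found in assemblies located in
        // the application's base directory.
        var container = new WindsorContainer();
        container.Install(
            FromAssembly.InDirectory(
                new AssemblyFilter(AppDomain.CurrentDomain.BaseDirectory)));
        return container;
    }
}
```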

With your log4net issue: Windsor has a Logging Facility, also available as a NuGet package ("Castle Windsor logging facility"), for which log4net support is provided ("Castle.Core log4net integration 3.0")... so this achieves the same as your log4net dynamic properties but without the pain. Wire it up to Windsor like so:

container.AddFacility<LoggingFacility>(f => f.UseLog4Net());

And hey presto, any IoC-managed class can have the magic ILogger which uses the single log4net config file:

private ILogger _logger = NullLogger.Instance;

public ILogger Logger
{
    get { return _logger; }
    set { _logger = value; }
}


Marco Craveiro said...

Cheers Scott, will update the code with this!