ASP.NET Core Gotchas – No. 3 Why don't my hyperlinks work on my Razor Pages in ASP.NET Core 2.0?

Recently I decided to take Razor Pages, a new feature in ASP.NET Core 2.0, out for a spin. I’ll be blogging about that experience and how I migrated some basic views of an existing application over to Razor Pages in a future post.

In this blog post I want to concentrate on a simple gotcha which left me confused for a little while. You are only likely to hit this issue if you are migrating some or all of the Views in an existing MVC based web application over to Razor Pages, or if you are starting from the Empty web application template. New Razor Pages projects created from the “Web Application” template will be set up correctly by default.

The symptoms I experienced were that after moving some of my Views and making them Razor Pages, the hyperlinks were not working. This seemed rather strange. They rendered on the screen as hyperlinks (picking up the styling of other anchor tags) however, they did nothing when clicked.

My links were defined as follows in my Razor Page…

<a asp-page="/Index">Home</a>

When viewing the source in Chrome of the rendered page I saw the following…

<a asp-page="/Index">Home</a>

That’s odd! For some mystifying reason the link was not being processed as expected. This missing rendering was the first clue, though. ASP.NET Core uses the concept of Tag Helpers, which provide a mechanism to write Razor code that gets rendered at runtime into the final HTML output. Tag Helpers provide a convenient syntax that keeps the code in our cshtml files much cleaner. Before Tag Helpers, we’d have used the @Html.ActionLink helper method to render a link onto a Razor view based on a controller name and action name, for example.

With Tag Helpers we can write code which is much closer to HTML, using the traditional anchor tag <a>. The Tag Helper provides the magic which looks for certain tags and attributes and then renders out the necessary HTML. In the case above, asp-page should look up the route for the designated Razor Page and render the appropriate href attribute in its place in the final HTML output.
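When the Tag Helper does run, the final HTML contains a real href instead of the asp-page attribute. For the page above you’d expect output along these lines (the exact path depends on your routing; the Index page typically maps to the site root):

```html
<a href="/">Home</a>
```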

So, it appeared that the Tag Helpers were not working on my Razor Pages. At first I couldn’t think of a reason why this would be the case, until I suddenly realised that with Razor we need to make Tag Helpers available using the @addTagHelper directive. This directive controls which Tag Helpers are available. In MVC Views it is generally placed in a _ViewImports.cshtml file at the root of the Views folder, from where it is inherited by all Views within that folder and its subdirectories.

I’d wrongly assumed that Pages would also pick up this existing _ViewImports.cshtml from the Views folder. However, Razor Pages expect their own _ViewImports.cshtml within the Pages folder. To solve my problem, I simply copied the _ViewImports.cshtml from the Views folder into the Pages folder and ran my application again. Sure enough, my anchor links were now rendering correctly. In mixed projects using traditional MVC Views as well as Razor Pages, you’ll need to include a _ViewImports.cshtml in both locations.
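For reference, the important line in Pages/_ViewImports.cshtml is the @addTagHelper directive that registers the built-in ASP.NET Core Tag Helpers. A minimal file can be as short as this (the @namespace value is illustrative):

```cshtml
@namespace MyWebApp.Pages
@addTagHelper *, Microsoft.AspNetCore.Mvc.TagHelpers
```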

You can also run into a related issue with missing layout and styling if you do not include a _ViewStart.cshtml file pointing to your shared layout page within your Pages directory too. Note that you can actually use Layouts from within the Views/Shared folder and you do not need to copy these into the Pages folder. Only the _ViewStart.cshtml and _ViewImports.cshtml need to be replicated.
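Since layouts resolve from Views/Shared as normal, a minimal Pages/_ViewStart.cshtml simply sets the layout by name:

```cshtml
@{
    Layout = "_Layout";
}
```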


Other posts in the ASP.NET Core Gotchas Series:

No. 1 – Why aren’t my Environment Variables loaded into ASP.NET Core 2.0 configuration on Linux?

No. 2 – Why are settings from launchSettings.json now picked up when using dotnet run?

ASP.NET Core Gotchas – No. 2 Why are settings from launchSettings.json now picked up when using dotnet run?

Following on from my first gotcha that we hit yesterday, here is another one which caught us out at the same time.

TL;DR: when using dotnet run (.NET Core 2.0), expect it to set some properties (including the port and environment) from launchSettings.json if that file is included in the Properties folder!

The issue we faced was that when running a new API using ASP.NET Core 2.0 inside a container, it was starting on the wrong port; not port 80 as we’d expected. The scenario where this can show itself is unlikely to be a common one, so hopefully only a few people will run into this exact gotcha.

In our case, prototyping the new API needed some front end involvement so we added a quick and dirty dockerfile and docker-compose.yml file to enable it to be spun up inside a container. Normally our production containers are all built on images which are based on the ASP.NET Core runtime image. We then copy in our published dlls and run the application inside the container. However, in this particular case we’d cheated a little and were using the SDK based image to allow us to build and then run the source inside a container.

In our case our dockerfile looked like this (note: do not use this in production!)

FROM microsoft/aspnetcore-build:2.0

WORKDIR /app

COPY . .

RUN dotnet restore --verbosity quiet && dotnet build -c Release --verbosity quiet

WORKDIR /app/src/OurNewApi

ENTRYPOINT dotnet run

The details of what it does are not too important. If you’re interested, you can read my Docker for .NET Developers series to learn more. What is important here is that we are copying the entire solution folder contents into the container and later using dotnet run to start the application. Seems safe enough – right?

As stated earlier, we noticed that instead of using the default ASP.NET Core environment variable (ASPNETCORE_URLS=http://+:80), which defines the default port of 80, the application was using a different port. We also noticed that the environment was showing as “Development” and not “Production”.

We then examined the console output and noticed the following information:

Using launch settings from C:\Projects\OurApi\OurApi\Properties\launchSettings.json…

We checked and indeed, the port being used was the one defined in launchSettings.json. This was a bit of a surprise since in the past that file has only been used by Visual Studio. We scratched our heads as we’d previously done something similar with an ASP.NET Core 1.0 project without hitting any issues. I started investigating and soon found a closed GitHub issue titled “Add support for launchSettings.json to dotnet run”. Reading through it, it seems that since 2.0, dotnet run will load up some settings from launchSettings.json if it finds one. This makes some sense from a developer tooling point of view as I guess there are cases where it could be useful. In our case, the fact we were blindly copying in our entire solution folder (including the launchSettings.json file) as well as starting our API using dotnet run, meant that we experienced this behaviour inside the container. It’s not something we normally face, but in this quick prototype, it showed itself.
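For context, a minimal launchSettings.json looks something like this (the profile name, port and values here are illustrative); when dotnet run finds the file, it applies the applicationUrl and the environmentVariables from the project profile:

```json
{
  "profiles": {
    "OurNewApi": {
      "commandName": "Project",
      "applicationUrl": "http://localhost:57096",
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      }
    }
  }
}
```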

The quick solution in our case was to include the Properties folder in our .dockerignore file, which specifies any folders/files that you do not want included in the build context. This prevents them being copied into your image by the COPY command.
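For anyone wanting to replicate the fix, the .dockerignore entry is a one-liner (the glob pattern here assumes you want to exclude every Properties folder in the context; narrow it to a specific path if you prefer):

```shell
# Exclude any Properties folder (and therefore launchSettings.json)
# from the Docker build context, so COPY never brings it into the image.
echo "**/Properties/" >> .dockerignore
cat .dockerignore
```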

For the curious among you, this functionality is implemented inside ProjectLaunchSettingsProvider.cs in the .NET CLI repository. If the file is found then the properties from the environmentVariables section are used as well as the value of the applicationUrl property.

Long story short, be aware that since .NET Core 2.0 the “dotnet run” CLI command will look for and use a launchSettings.json file if one is available. If you suspect this may be happening in your case, check the console output to see whether settings have been loaded from launchSettings.json.


Other posts in the ASP.NET Core Gotchas Series:

No. 1 – Why aren’t my Environment Variables loaded into ASP.NET Core 2.0 configuration on Linux?

ASP.NET Core Gotchas – No. 1 Why aren't my Environment Variables loaded into ASP.NET Core 2.0 configuration on Linux?

We’ve been using ASP.NET Core 1.0 for some time as part of a microservices architecture we’ve developed. We run the services in production as Docker containers on Amazon ECS. We recently created a new API based on ASP.NET Core 2.0 and ran into some issues with configuration.

The first of the two issues we encountered is in cases where we use environment variables in our Docker containers that we expect to override ASP.NET Core configuration values. The ASP.NET Core configuration system allows many sources for configuration values to be defined. This includes loading from json files and environment variables. When loading configuration, each of the providers is checked in turn for configuration values. Any which define the same key as a previous configuration item are overridden with the new value. This works nicely as we can define common configuration in JSON files and optionally override this in production using environment variables.
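As a sketch of that ordering (using the Microsoft.Extensions.Configuration API; the file name and keys are illustrative), the last provider to supply a given key wins:

```csharp
using Microsoft.Extensions.Configuration;

var config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: true) // common/default values
    .AddEnvironmentVariables()                       // checked later, so overrides JSON
    .Build();

// If the container was started with DistributedCacheConfig:Enabled=true,
// this returns "true" even when appsettings.json sets it to false.
var enabled = config["DistributedCacheConfig:Enabled"];
```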

This is exactly how we run our APIs currently. In ASP.NET Core 1.0 we could pass in environment variables to containers (in our case, using docker-compose files locally and AWS ECS TaskDefinitions in production). Configuration in ASP.NET Core supports a hierarchy of settings which allows us to define “sets” of values. For example, in our case we have a top level section called DistributedCacheConfig and within that there are three settings to control various things all related to caching.

When overriding these settings using environment variables we previously used the colon separator to define the layer of the hierarchy the value targets. One such environment variable would look like this…

DistributedCacheConfig:Enabled=true
When read and mapped to the ASP.NET Core configuration system this would override the enabled value for the DistributedCacheConfig even if a previous JSON file had set it to false. This even worked when deployed to production where we could configure AWS to start our containers with the necessary environment variables when launching new instances.

When setting up a new API using the latest ASP.NET Core 2.0 version we noticed an issue when deploying to AWS. The settings we had defined in the TaskDefinition (which controls the environment variables containers start with) were not being applied and the settings from the JSON files were still being used. We then tested this locally by starting up a container from the ASP.NET Core 2.0 docker image and again noted that the environment variables were not overriding the JSON values as expected.

I spent some time investigating this and eventually found a GitHub issue on the Configuration repository which mentions the possibility of using a double underscore (__) as the separator between the layers on Linux. I made the change to our environment definition and immediately it was working again. I’d personally never been aware of the option to use this separator.
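One relevant detail, though it may not be the whole story behind the behaviour change described below, is that a colon is not legal in a shell variable name, so bash’s export rejects such names outright; env(1) performs no validation and can still pass them into a child process, which is why the colon form ever worked at all. The double-underscore form is a valid identifier in any shell:

```shell
# `export DistributedCacheConfig:Enabled=true` fails in bash
# with "not a valid identifier". env(1) bypasses that check and
# passes the pair straight into the child's environment:
env "DistributedCacheConfig:Enabled=true" printenv "DistributedCacheConfig:Enabled"

# The double-underscore form works everywhere:
export DistributedCacheConfig__Enabled=true
printenv DistributedCacheConfig__Enabled
```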

With the problem hopefully solved I set about investigating what had changed. Certainly the colon separator was working fine on our older projects. Finally I noticed that by default the ASP.NET Core 2.0 Docker image returned when asking for the “2.0” tag is based on Debian Stretch. With ASP.NET Core 1.x it was based on the older Debian Jessie version. I started to wonder if this might explain the change in behaviour. I quickly modified the dockerfile we were using to target the “2.0-jessie” tag, changing the environment variable back to the colon separated version as well. When I ran that as a container, the value was once again set using the environment variable as expected.

My guess (although I’ve not dug any deeper) is that between the two Debian versions, something has changed in how the colon separator is handled for environment variables. To validate this assumption I modified my application to spit out the environment variables at startup.

When running on Stretch – Environment.GetEnvironmentVariables() returns the following console output:

web_1 | key = HOME - value = /root
web_1 | key = TestSetting101 - value = Something
web_1 | key = ASPNETCORE_PKG_VERSION - value = 2.0.3
web_1 | key = NODE_VERSION - value = 6.11.3
web_1 | key = DOTNET_SDK_DOWNLOAD_SHA - value = 74A0741D4261D6769F29A5F1BA3E8FF44C79F17BBFED5E240C59C0AA104F92E93F5E76B1A262BDFAB3769F3366E33EA47603D9D725617A75CAD839274EBC5F2B
web_1 | key = NUGET_XMLDOC_MODE - value = skip
web_1 | key = PWD - value = /app/TestingConfiguration
web_1 | key = DOTNET_SKIP_FIRST_TIME_EXPERIENCE - value = true
web_1 | key = ASPNETCORE_URLS - value = http://+:80
web_1 | key = HOSTNAME - value = 44d9a86fba25
web_1 | key = DOTNET_SDK_DOWNLOAD_URL - value =
web_1 | key = PATH - value = /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
web_1 | key = DOTNET_SDK_VERSION - value = 2.0.3

When running on Jessie – Environment.GetEnvironmentVariables() returns

web_1 | key = PWD - value = /app/TestingConfiguration
web_1 | key = TestSetting101 - value = Something
web_1 | key = DOTNET_SDK_VERSION - value = 2.0.3
web_1 | key = PATH - value = /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
web_1 | key = NUGET_XMLDOC_MODE - value = skip
web_1 | key = DOTNET_SDK_DOWNLOAD_SHA - value = 74A0741D4261D6769F29A5F1BA3E8FF44C79F17BBFED5E240C59C0AA104F92E93F5E76B1A262BDFAB3769F3366E33EA47603D9D725617A75CAD839274EBC5F2B
web_1 | key = MySettings:Setting1 - value = From_DockerCompose
web_1 | key = HOME - value = /root
web_1 | key = ASPNETCORE_URLS - value = http://+:80
web_1 | key = HOSTNAME - value = 0262ce3069ae
web_1 | key = ASPNETCORE_PKG_VERSION - value = 2.0.3
web_1 | key = DOTNET_SDK_DOWNLOAD_URL - value =
web_1 | key = DOTNET_SKIP_FIRST_TIME_EXPERIENCE - value = true
web_1 | key = NODE_VERSION - value = 6.11.3

You can see here that on Stretch the variable with the key MySettings:Setting1 is not even returned. So this explains why it’s not available in the configuration.

While I’d like to know what actually changed that affected the behaviour between the two OS versions, I’ll have to leave that as a mystery. However, the advice here is that if you plan to run on Linux, it’s probably safest to use the double underscore separator, which works on either OS version, when defining any environment variables. Perhaps we were simply lucky that the colon worked in 1.0 and we should have been using the double underscore all along.


Other posts in the ASP.NET Core Gotchas Series:

No. 2 – Why are settings from launchSettings.json now picked up when using dotnet run?

My Microsoft MVP Journey – Becoming a Visual Studio and Developer Technologies MVP

On November 1st I got a very nice surprise in my inbox from Microsoft. I was being awarded an MVP in the ‘Visual Studio and Development Technologies’ category. This is an achievement that I am extremely proud of and very excited about. It’s an honour and honestly quite daunting to join such an amazing group of talented community leaders and experts.

A few people have since asked me about the process of becoming an MVP, so I thought it might be helpful to share my experience here.

My Background

Before diving into the specifics of the MVP process I thought it would be worth giving a little background of my career. I’ll try to keep this short by skipping to the salient points, however I think it’s helpful to share a little of my background, which has not always been in development.

I’ve worked in the IT industry since around 2000/2001 when I got my first job as a desktop technician for the NHS in my college (high school) holidays. Even before that I was “into computers”, building my own PC from components ordered online and dabbling in programming with QuickBasic and later VB6. I continued my IT career with a job as a desktop and server engineer for a manufacturing company. Whilst doing that role I suggested building an intranet for some of the internal documentation. I had some limited self-taught experience with HTML at the time so I learned what I could from books and online. Later I taught myself classic ASP in order to build a spare parts ordering system for use within the manufacturing company and its partners. Then ASP.NET was released and I started playing around and learning to work with that from books and online articles. Around the ASP.NET 2.0 timeframe my employer needed to improve an in-house Access database system for managing hydraulic test results for their products. I rebuilt that system using SQL Server with a web (WebForms) frontend. I think at this point I was still using VB.NET!

Later my role was outsourced and I was transferred to work for the outsource provider. There I continued as a desktop and server engineer but due to the stricter role boundaries I was no longer able to perform any development work. I kept my skills up with my own side projects, focusing on the current releases of ASP.NET at the time and starting to teach myself C#. I also took on a little bit of side work producing a bespoke website and CRM for a consulting firm in my own time.

I developed my career at the outsource provider over the years there, becoming a UK engineer lead and eventually moving into a role as a Service Delivery Executive, managing support in Europe and Africa for one of our large customers. This was a move away from the technical aspects and into management and customer relations. I enjoyed the challenges but as time went on the resources were stretched and the role became extremely stressful and unrewarding. I stopped enjoying what I was doing and found the 50+ hour work week was absorbing any spare time I had to relax and dabble with development.

I decided to make a change. I realised I was enjoying the experience of building sites with ASP.NET and learning as much as possible, and so I applied for a few developer roles. I was very conscious that I had no idea how my self-taught developer skills would stack up in the real world. However, I interviewed at Madgex (my current employer) who clearly found my answers suitable, as I was offered a job there! I started at the level of “developer”, between junior and senior, which fairly reflected my skill level at the time. I immediately loved the role and found my days were far more enjoyable. I was no longer stressed and was able to enjoy having some proper free time outside of work. It was a change in direction that I am very pleased I made.

I have now worked at Madgex for over 2 years and in that time have been constantly learning. For sure, there were some elements of my developer skill set that were lacking, having been self-taught, but I quickly worked to fill those gaps in my knowledge. The team I joined was amazing and I learned so much from the experienced developers there. In my personal time I spent a lot of my free hours learning more, watching videos and keeping up with the developments around the new ASP.NET Core framework. About a year or so ago I was lucky enough to be able to put that into practice as we started developing a new product at Madgex. I’ve since been promoted to senior developer and within the last month have taken on some additional responsibilities as a developer community lead.


The journey to becoming an MVP begins with a nomination. In my case I received my first nomination in January 2017 from MVP James Chambers, whom I had worked with quite closely on the Humanitarian Toolbox allReady project. James was aware of my blog and my activities contributing to OSS and felt I should be put forward for the MVP award.

As a nominee, the first you hear about this is via an email from Microsoft, stating that you’ve been nominated. You are given a link to complete a profile in their system.

Recent contributions

As part of the profile completion process you are asked to provide a detailed list of your community contributions. This includes things like any events you have spoken at, blogging, videos you’ve shared and OSS contributions. Retrospectively working out what I’d done and when was no easy task. Had I been expecting a nomination it would have been helpful for me to have kept a record for the last year of activities. If you’re involved in community and hope to one day be nominated I would recommend that you keep a list of key dates for your community activities.

Once you have completed your profile and contributions you submit the form and get a confirmation email that it will go into a review process at Microsoft.

The long wait

In my case, after completing my profile it was a long wait before anything happened, many months in fact. During that time I assumed that the nomination and profile must be under scrutiny within Microsoft. Every now and again I went in to update the contributions against my profile. The process is a bit of a black box and I’m not really sure whether my nomination was forgotten about or whether it simply takes that long to get a response.

In August I happened to be at a Microsoft community event being run by the UK Community Program Manager / MVP Lead, Claire Smyth. Since I had the opportunity I asked about the MVP process and mentioned my nomination. We followed up by email and Claire was able to locate the nomination in the system. At this point we arranged a brief call to discuss the MVP along with support Microsoft were offering to new user groups. I happened to have just set up my new user group, .NET South East, at this time, so Claire had kindly offered me some support and advice.

This was a useful chat and Claire was able to explain a little more about the things that the MVP team are looking for in nominees. It’s mostly that there is a varied and comprehensive range of contributions that show a positive community benefit. I was able to discuss my community activities in a little more detail. Claire also explained that after the nomination is approved it usually also goes to the respective product teams for them to review the contributions and give their feedback. In my case I’d had a little contact with some of the ASP.NET team and I let Claire know some of the people I’d previously interacted with.

At the end of the call Claire confirmed she would look over the information and see if the nomination should go forward.

Further nominations

During the weeks after our call I was fortunate enough to receive two further nominations for the MVP award, including one from Jon Galloway at Microsoft. This last one, I feel, had particular weight in tipping me over to get the award. Since the review process includes some input from the product teams as to which nominees are ready for the award, having a nomination from someone on the ASP.NET team was likely very helpful! If you receive additional nominations you seem to get a whole new profile to be completed. I dropped an email to Claire, who I believe was able to link the nominations together in the system. Just to be sure, I completed the profile on the most recent nomination link. By this point I’d done some more talks so I was able to add some extra contributions to my list.

After this, things went quiet again while the process continued behind the scenes. It was a busy period and I mostly forgot about the nominations for a while. Then, on the 1st of November, after an evening meal with some friends, I happened to glance at my phone when heading to the car. I had a few emails and I scanned over the list. One particular subject line jumped out at me – “Congratulations 2018-2019 Microsoft MVP!” My excitement mounted further as I opened the email and read the first paragraph!

MVP Award Email

I was honestly shocked and very excited to read this email. I immediately showed it to my wife who was also very excited. I’d explained the MVP and mentioned my nominations for it to her earlier in the year and she was aware of what a big thing it was to be awarded the MVP.

What do you get?

In the congratulations email you are sent details of how to start accessing some of your MVP benefits via the MVP site. There’s a lot to take in. I started by ensuring my MVP profile was correct so that it would appear on the MVP site. I then looked at things such as the MSDN Visual Studio Enterprise subscription, which is a very handy benefit to receive! After agreeing to a Microsoft NDA you are also able to access special mailing lists that include members of the product teams. I joined the ASP.NET mailing list and the Azure one, as those are most relevant to what I do day to day. You can also sign up for a Microsoft Yammer group that gives you access to chat with other MVPs and Microsoft personnel. There are still lots of things I need to look into, as MVPs have access to various licenses and products as part of their benefits package. For me though, the access to the wider product teams and fellow MVPs is one of the best benefits.

As well as the access and licenses mentioned above the other thing you can expect as a new MVP is an award pack posted to you from Microsoft. I got an email stating that mine had been posted and would be with me in about one week. I’d seen photographs of the award pack from other MVPs via Twitter so knew roughly what to expect. Even so, on the day of its arrival to my home it was exciting to unbox the contents. Inside the award pack you get a very solid physical trophy made of glass. It’s much more substantial than I’d expected and looks really great. You also get a certificate, MVP ID card, lapel pin and some MVP stickers. It’s a really nice pack and it’s nice to have something physical to represent the award.

Microsoft MVP Award Pack

MVP Summit

The other very exciting opportunity for MVPs is the chance to register to attend the MVP Global Summit, which is run annually at the Microsoft campus in Redmond, near Seattle. My MVP award was perfectly timed, just before the registration for this event opened. On the day of registration I was eager to sign up to attend. It’s a hugely valuable opportunity to visit the Microsoft campus and to meet and interact with the Microsoft product teams directly. The registration process itself was a bit of a nightmare; despite jumping on it the minute it opened, the system was clearly unable to handle the load and was crashing. I spent nearly two hours trying to complete the hotel registration phase. During that time a number of MVPs were experiencing problems and tweeting about the issue. One MVP provided a couple of phone numbers we could try to get support. Not surprisingly the phone system also struggled under the load, but eventually I was able to get through to someone. They were very helpful and in very little time managed to get me registered for the event and the hotel of my choice. I’m looking forward to my first trip to the US and the chance to meet many of the MVPs and Microsoft staff I follow on Twitter.

Next Steps

Being awarded the MVP is a very exciting development in my career. Over the last two years I have been very focused on blogging and sharing as much as I can with the amazing developer community. I learn huge amounts from other community contributors and it’s wonderful to play my small part in that and to be recognised with this prestigious award for that work. It’s a little intimidating too, to be joining such an expert group of fellow MVPs. I can’t help but feel a degree of impostor syndrome at the thought. I hope to live up to the award by continuing to contribute within the community. I want to keep up blogging, speaking, running my user group and contributing to OSS as much as possible. There are also a few other projects and ideas that I hope to find the time for, which should contribute further to the ASP.NET community in particular. I owe a huge thanks to the community and to the various people along my journey who have supported what I’m doing, offered advice and nominated me for this award. A big thanks to everyone who has helped me along the way!

.NET South East November 2017 Meetup With guest Michael Newton

Last night we held our November .NET South East meetup at Madgex HQ with the amazing Michael Newton speaking. Here’s a brief summary…

Intro and news

At 7pm I opened the evening with my introduction, including thanking our fantastic sponsors. I then went on to discuss some of the news items I had gathered. I was a bit short on details for the items I’d picked, due to being out sick (cough) for the week prior to the event. I managed to put together notes on two headlines an hour or two before the event started.

Visual Studio Live Share

This first item was announced at the recent Microsoft Connect() event in New York. It’s an early announcement of a new feature being worked on for Visual Studio 2017 and VS Code.

The idea with Live Share is that teams of developers will be able to interactively collaborate on code directly from their editor/IDE. A developer can start a shared session on their machine, which will generate a link that provides access to a shared, private workspace. They can send this link to their collaborators, who will then be able to connect and see the code in the shared workspace. There is no need to explicitly clone the code or install any dependencies.

Once connected, your partner will be able to view your code and even see where your cursor is. You can select code and the selection will be reflected in your partner’s editor. Any changes being made will be visible to both parties.

Taking it a step further, you can even share a debug session that your partner can participate in. Your partner will be attached and will see a live stack trace and the values of locals, and can even hover over variables to see their current values.

This looks like a really exciting way to enable remote pair programming and perhaps as a way for people to assist contributors of open source projects for example. No release date has been given but it’s certainly one to watch.

Nullable Reference Types in C# 8.0

This item came from a recent MSDN blog post stating that a trial version of the planned C# 8.0 feature to provide non-nullable reference types is now available. Microsoft want to capture feedback on the current proposed design of this language feature to help finalise on something that works well for the developer community.

The ideas being proposed are to add the concept of non-nullable reference types to the language with a view to removing bugs and making it easier to express intent in the code. Nulls should not exist except in cases where the domain design makes them reasonable.

In C# 8.0 the plan is to introduce this and as a result, all current reference types will be assumed to be non-nullable as the default. Any cases where a reference type is assigned a null will then be marked as warnings by the compiler.

To enable developers to use nullable reference types, we’ll also be able to add a question mark (?) as we can currently with value types to identify any reference types which will then be considered nullable. In those cases it will likely result in many warnings from the compiler where null checks are missing before dereferencing those objects. This is expected to help catch and resolve potential bugs that could result in NullReferenceExceptions being thrown in your code.
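Based on the blog post’s description, the behaviour would look something like this (syntax from the preview, so the details may change before C# 8.0 ships):

```csharp
string notNull = null;    // warning: assigning null to a non-nullable reference
string? maybeNull = null; // fine: the '?' marks the reference as nullable

// warning: possible dereference of null without a preceding check
var length = maybeNull.Length;

if (maybeNull != null)
{
    length = maybeNull.Length; // OK: flow analysis has seen the null check
}
```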

The compiler warnings can be disabled if you do not wish to be warned of these issues, and the IL code produced by the compiler will not change.

You can read more about this announcement here:

Michael Newton – Making Distributed Systems in .NET Easier

Michael Newton presenting at .NET South East

This month we were joined by local consultant and trainer, Michael Newton. Michael started by talking a little about good API design. He explored a few examples where the API design is critical to providing a clear way to work with a library. Just as importantly these APIs need to aim to help protect the user from making bad / incorrect decisions.

He cited the example of the NodaTime API, which has a complex API surface but is also tackling a complex problem. However, as one example showed, it does ensure that it’s hard to shoot yourself in the foot. In that case you’d have to ignore the fact that you had made calls to three methods you did not understand in order to get things wrong. The API therefore forces a reasonable level of understanding of the problem before you can use it. Once you do have that understanding, it makes sense and gives fine control over your intent when working with the complexities of date and time across time zones.

Michael went on to show us EasyNetQ, a simple .NET library for working with RabbitMQ. This library was originally conceived by Mike Hadlow when he was working for 15Below in Brighton. The original .NET client library from RabbitMQ was extremely complex and required a lot of knowledge before you could consume it. The EasyNetQ library sits on top of it and abstracts away the complexity behind a much simpler API. This comes at the cost of some functionality, but for the majority of general use cases it provides everything people need. It favours simplicity over a complex feature set.
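To give a flavour of that simplicity, the classic EasyNetQ usage (as the API looked at the time) boils down to a publish and a subscribe call; the message type, subscription id and connection string here are illustrative:

```csharp
// Illustrative message type - any plain class will do
public class TextMessage
{
    public string Text { get; set; }
}

// Publisher
using (var bus = RabbitHutch.CreateBus("host=localhost"))
{
    bus.Publish(new TextMessage { Text = "Hello from EasyNetQ" });
}

// Subscriber - the subscription id allows multiple consumers to share a queue
using (var bus = RabbitHutch.CreateBus("host=localhost"))
{
    bus.Subscribe<TextMessage>("example", msg => Console.WriteLine(msg.Text));
    Console.ReadLine(); // keep the process alive while consuming
}
```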

EasyNetQ worked well for general scenarios, but as the need for more complex workflows began to appear it was clear that more was needed. This led to the development of an EasyNetQProcessManager library by Michael while working for 15Below. This came after Michael read Enterprise Integration Patterns by Gregor Hohpe and Bobby Woolf. Michael set about designing a library to sit over EasyNetQ.

November audience for Michael Newton at .NET South East

It was interesting to hear these thoughts and to set the scene for the next section of the talk, where Michael shared his alpha code for a new version of a process manager called RouteMaster. Here Michael is working to develop a slick API that helps protect the user of the library from making mistakes, where possible making use of the .NET type system to do so. It’s very functional in nature and the main code is written in F#, with a C# API having been completed on the afternoon prior to this meetup (so very, very alpha)! This library aims to make developing distributed workflows a relatively simple affair.

Along the way Michael shared some useful tools and info. For example, he showed us briefly that he uses a Docker image for a product called Adminer to provide a management UI over the underlying Postgres database. I’ll be checking this one out for sure. Another piece of advice was to use FsCheck for property-based testing, which can be a great way to test your code more fully using many randomly generated inputs.

It was a really interesting evening so a huge thanks to Michael for presenting his content to us.

Prize Draws

With the end of the evening closing in, before heading off to the pub we drew the winners of the prizes from our fantastic sponsors for the event. The prizes we had to offer were:

JetBrains – One year individual subscription to any single JetBrains Toolbox product

Manning – Free eBook – 6 month Small Business license

PostSharp – License to PostSharp

Again I used the WPF app created by Dan Clarke, who organises the .NET Oxford meetup. This time, however, I trialled pre-drawing names just before the event started to speed up the process of prize winner selection. The rules, as with the last event, were:

a) names are added from the RSVP list (as at about 1-2 hours before the event)
b) if the name drawn is not in attendance, we redraw.

Next events

We have some great speakers lined up for 2018. Our December meetup is currently under consideration. Our planned speaker has unfortunately had to withdraw so I’ll be assessing fall-back options.

For January I am working on a plan which is a long shot but will be announced if I can align a few puzzle pieces in time.

February 2018 – Ian Cooper

March 2018 – Joe Stead

April 2018 – Jon Smith