Migrating to .NET Core: the future of .NET.

More than 20 years ago, Microsoft introduced their .NET Framework: a reaction to Java and its use of a virtual machine with dynamic (re)compilation of code for applications and services. Unlike Java, where the virtual machine was tied to a single programming language (also named Java), Microsoft opened up their .NET virtual machine to a variety of languages. They also introduced a new language of their own: C#, a modern language similar to C++ and Java, which allowed Microsoft to introduce new features easily, because they controlled the standard.

Then, in 2016, Microsoft introduced .NET Core, a sign of things to come (and a sign of confusion, because we used to have only one .NET, and now we had to distinguish between ‘Framework’ and ‘Core’). Where the original .NET Framework was mainly targeted at Windows on Intel x86 or compatible machines, .NET Core was aimed at multiple platforms and architectures, like Java before it. Microsoft also moved to an open-source approach.

This .NET Core was not a drop-in replacement, but a rewrite/redesign. It had some similarities to the classic .NET Framework, but was also different in various ways, and would be developed alongside the classic .NET Framework for the time being, as a more efficient, more portable reimplementation.

On April 18th 2019, Microsoft released version 4.8 of the .NET Framework, which would be the last version of the Framework product line. On November 10th 2020, Microsoft released .NET 5. This is where the Framework and Core branches would be joined. Technically .NET 5 is a continuation of the Core branch, but Microsoft now considered it mature enough to replace .NET 4.8 for new applications.

As you may know from my earlier blogs, I always say you should keep an eye on new and developing products and technologies, so this was an excellent cue to start looking at .NET Core seriously. In my case, I had already used an early version of .NET Core for a newly developed web-based application sometime in late 2016 to early 2017. I had also done some development for Windows Phone/UWP, which was also done with an early variant of the .NET Core environment, rather than the regular .NET Framework.

My early experience with .NET Core-based environments was that it was .NET, but different. You could develop in the same C# language, but the environment was different. Some libraries were not available at all, and others were similar to the ones you know from the .NET Framework, but not quite the same, so you may have to use slightly different namespaces, objects, methods or parameters to achieve the same results.

However, with .NET 5, Microsoft claims that it is now ready for prime time, also on the desktop, supporting Windows Forms, WPF and whatnot, with the APIs nearly entirely overlapping and interchangeable. Combined with that, there is backward compatibility with existing code targeting older versions of the .NET Framework. So I figured I would try my hand at converting my existing code.

I was getting on reasonably well, when Microsoft launched .NET 6 in November, together with Visual Studio 2022. This basically makes .NET 5 obsolete. Support for .NET 5 will end in May 2022. .NET 6 on the other hand is an LTS (Long-Term Support) version, so it will probably be supported for at least 5 or 6 years, knowing Microsoft. So, before I could even write this blog on my experiences with .NET 5, I was overtaken by .NET 6. As it turns out, moving from .NET 5 to .NET 6 was as simple as just adjusting the target in the project settings, as .NET 6 just picks up where .NET 5 left off. And that is exactly what I did as well, so we can go straight from .NET 4.8 to .NET 6.

You will need at least Visual Studio 2019 for .NET 5 support, and at least Visual Studio 2022 for .NET 6 support. For the remainder of this blog, I will assume that you are using Visual Studio 2022.

But will it run Cry… I mean .NET?

In terms of support, there are no practical limitations. With .NET 4.7, Microsoft moved the minimum OS support to Windows 7 with SP1, and that is still the same for .NET 6. Likewise, .NET Framework supports both x86 and x64, and .NET 6 does the same. On top of that, .NET 6 offers support for ARM32 and ARM64.

Sure, technically .NET 4 also supported IA64 (although with certain limitations, such as no WPF support), whereas .NET 6 does not. But since Windows XP was the last regular desktop version to be released for Itanium, you could not run the later updates of the framework there anyway. If you really wanted, you could get Windows Server 2008 R2 SP1 on your Itanium, as the latest possible OS. Technically that is the minimum OS for .NET 4.8, but I don’t think Itanium is actually supported; I’ve only ever seen an x86/x64 installer for it. That would make sense, as Microsoft also dropped native support for Itanium after Visual Studio 2010.

So assuming you were targeting a reasonably modern version of Windows with .NET 4.8, either server (Server 2012 or newer) or desktop (Windows 7 SP1 or newer), and targeting either x86 or x64, then your target platforms will run .NET 6 without issue.

Hierarchy of .NET Core wrapping

Probably the first thing you will want to understand about .NET Core is how it handles its backward compatibility. It is possible to mix legacy assemblies with .NET Core assemblies. The .NET 6 environment contains wrapper functionality which can load legacy assemblies and automatically redirect their references to the legacy .NET Framework to the new .NET environment. However, there are strict limitations. There is a strict hierarchy, where .NET Core assemblies can reference legacy assemblies, but not vice versa. So the compatibility only goes one way.

As you probably know, the executable assembly (the .exe file) contains metadata which determines the .NET virtual machine that will be used to load the application. This means that a very trivial conversion to .NET 6 can be done by only converting the project of your solution that generates this executable. This will then mean the application will be run by the .NET 6 environment, and all referenced assemblies will be run via the wrapper for .NET Framework to .NET 6.

In most cases, that will work fine. There are some corner cases however, where legacy applications may reference .NET Framework objects that do not exist in .NET 6, or use third-party libraries that are not compatible with .NET 6. In that case, you may need to look for alternative libraries. In some cases you may find that there are separate NuGet packages for classic .NET Framework and .NET Core (such as with CefSharp, which has separate CefSharp.*.NETCore packages). Sometimes there are conversions of an old library done by another publisher.

And in the rare case where you can not find a working alternative, there is a workaround, which we will get into later. But in most cases, you will be fine with the standard .NET 6 environment and NuGet packages. So let’s look at how to convert our projects. Microsoft has put up a Migration Guide that gives a high-level overview, and also provides some crude command-line tools to assist you with converting. But I prefer to dig into the actual differences of project files and things under the hood, so we have a proper understanding, and can detect and solve problems by hand.

Converting projects

The most important change is that project files now use an entirely different XML layout, known as “SDK-style projects”. Projects now use ‘sensible defaults’, and you opt-out of things, rather than opt-in. So your most basic project file can look as simple as this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Library</OutputType>
    <TargetFramework>net6.0</TargetFramework>
  </PropertyGroup>
</Project>

So you merely need to tell Visual Studio what kind of project it is (e.g. “Library” or “Exe”) and which framework you want to target. This new project type can also be used for .NET 4.8 or older frameworks, so you can convert your projects to the new format first, and worry about the .NET 6-specific issues later.
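For example, an SDK-style project that still targets .NET Framework 4.8 could look like this (a minimal sketch; the output type is just an example):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- 'net48' targets the classic .NET Framework 4.8 -->
    <TargetFramework>net48</TargetFramework>
  </PropertyGroup>
</Project>
```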

What happens here is that by default, the project will include all files in the project directory and any subdirectories, and will automatically recognize standard files such as .cs and .resx, and interpret them as the correct type. While it is possible to set the EnableDefaultItems property to false, and go back to the old behaviour of having explicit definitions for all included files, I would advise against it, for at least two reasons:

  • Your project files remain simple and clean when all your files are included automatically.
  • When files and folders are included automatically, you are more or less forced to keep your folders clean: no leftover files that are not relevant to the project, or that should be stored elsewhere rather than in the folder containing the code.

So this type of project will force you to exclude files and folders that should not be used in the project, rather than including all files you need.
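As a sketch of what such exclusions can look like in the project file (the file and folder names here are hypothetical):

```xml
<ItemGroup>
  <!-- Keep an experimental source file out of the build -->
  <Compile Remove="Experimental\OldCode.cs" />
  <!-- Keep a documentation folder out of the project entirely -->
  <None Remove="docs\**" />
</ItemGroup>
```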

I would recommend just backing up your old project files, replacing them with this new ‘empty’ project file, and loading it in Visual Studio (though not right away: you may want to read about some possible issues first, like with NuGet packages, below). You will immediately see which files it already includes automatically. If your projects are clean enough (merely consisting of .cs and .resx files), they should be more or less correct automatically. From there on, you simply need to add the references back: to other projects, to other assemblies, and to NuGet packages. And you may need to set ‘copy to output’ settings for non-code files that should also be in the application folder.

As mentioned above, you probably want to start by just converting the project for your EXE, and get the project building and running that way, with all the other projects running via the .NET 4-to-6 compatibility wrapping layer. Then you will want to work your way back, via the references. A good help is to display the project build order, and work from the bottom to the top of the list, converting the projects one by one, and creating a working state of the application at every step. Right-click your project in the Solution Explorer, and choose “Build Dependencies->Project Build Order…”.

The solution format has not been modified, so you do not need to do anything there. As long as your new projects have the same path/filename as the old ones, they will be picked up by the solution as-is.

Now to get to some of the details you may run into.

NuGet issues

NuGet packages were originally more or less separate from the project file, and stored in a separate packages.config file. The project would reference them as normal references. NuGet was a separate process that had to be run in advance, to download the packages into the packages folder, so that the references in the project would resolve correctly.

Not anymore: NuGet packages are now referenced directly in the project, with a PackageReference tag. MSBuild can now also import the NuGet packages itself, so no separate tool is required anymore.

This functionality was also added to the old project format. So you can already convert your NuGet packages to PackageReference entries in your project, getting rid of the packages.config file, before moving to the new project format.
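For illustration, a PackageReference entry in the project file looks like this (the package and version are just examples):

```xml
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="13.0.1" />
</ItemGroup>
```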

This also implies that if you build your application not from Visual Studio itself, but via an automated build process with MSBuild, such as on a build server (Jenkins, Bamboo, TeamCity or whatnot), you may need to modify your build process: replacing a NuGet stage with an MSBuild stage that restores the packages (running MSBuild with the -t:restore switch).
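As a sketch, such a build stage could boil down to something like this (the solution name is a placeholder):

```
rem Restore NuGet packages, then build
msbuild MySolution.sln -t:restore
msbuild MySolution.sln -p:Configuration=Release
```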

So I would recommend first converting your projects from packages.config to PackageReference, and getting your build process in order, before converting the projects to the new format. Visual Studio can help you with this: in the Solution Explorer, expand the tree view of your project, go to the References node, right-click, and choose “Migrate packages.config to PackageReference…”.

AssemblyInfo issues

Another major change in the new project format is that, by default, the AssemblyInfo is generated from the project file, rather than from an included AssemblyInfo.cs file. This will result in compile errors when you also have an AssemblyInfo.cs file, because a number of attributes will be defined twice.

Again, you have the choice of either deleting your AssemblyInfo.cs file (or at least removing the conflicting attributes), and moving the info into the project file, or you can change the project to restore the old behaviour.

For the latter, you can add the GenerateAssemblyInfo setting to your project, and set it to false, like this:


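In project-file terms, that fragment looks like this:

```xml
<PropertyGroup>
  <GenerateAssemblyInfo>false</GenerateAssemblyInfo>
</PropertyGroup>
```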
Limitations of .NET Core

So, .NET is now supported on platforms other than Windows, such as Linux and macOS? Well, yes and no. It’s not like Java, where your entire application can be written in a platform-agnostic way. Rather, there is a lowest common denominator for the .NET 6 environment, which is supported everywhere, while various additional frameworks/APIs/NuGet packages are only available on some platforms.

In the project example above, I used “net6.0” as the target framework. This is actually that ‘lowest common denominator’. There are various OS-specific targets. You will need to use those when you want to use OS-specific frameworks, such as WinForms or WPF. In that case, you need to set it to “net6.0-windows”. Note that this target framework will also affect your NuGet packages. You can only install packages that match your target.
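For example, a sketch of the property group for a Windows Forms application (the output type is just one option):

```xml
<PropertyGroup>
  <OutputType>WinExe</OutputType>
  <!-- Windows-specific target, required for WinForms/WPF -->
  <TargetFramework>net6.0-windows</TargetFramework>
  <UseWindowsForms>true</UseWindowsForms>
</PropertyGroup>
```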

There is also a hierarchy for target frameworks: the framework “bubbles up”. So a “net6.0” project can only import projects and NuGet packages that are also “net6.0”. As soon as there is an OS-specific component somewhere, like “net6.0-windows”, then all projects that reference it, also need to be “net6.0-windows”.

This can be made even more restrictive by also adding an OS version at the end. In “net6.0-windows”, version 7 or higher is implied, so it is actually equivalent to “net6.0-windows7.0”. You can also use “net6.0-windows10.0” for example, to target Windows 10 or higher only.

In practice this means that if you want to reuse your code across platforms, you may need to define a platform-agnostic interface layer with “net6.0”, to abstract the platform differences away. Then you can implement these interfaces in separate projects, targeting Windows, Linux and macOS.

Separate x86 and x64 runtimes

Another difference between .NET 4.8 and .NET 6 is that the runtimes are now separated into two different installers, where .NET 4.8 would just install both the x86 and x64 environments on x64 platforms.

This implies that a 64-bit machine may not have a 32-bit runtime installed, and as such can only run code that specifically targets x64 (or AnyCPU). That may not matter for you if you already had separate builds for 32-bit and 64-bit releases (or had dropped 32-bit already, and target 64-bit exclusively, as we should eventually do). But if you had a mix of 32-bit and 64-bit applications, because you assumed that 64-bit environments could run the 32-bit code anyway, then you may need to go back to the drawing board.

Of course you could just ask the user to install both runtimes, or install both automatically. But I think it’s better to try and keep it clean, and not rely on any x86 code at all for an x64 release.

Use of AnyCPU

While on the subject of CPU architectures, there is another difference with .NET 6, and that relates to the AnyCPU target. In general it still means the same as before: the code is compiled for a neutral target, and can be run on any type of CPU.

There is just one catch, and I’m not sure what the reasoning is behind it. That is, for some reason you cannot run an Exe built for AnyCPU on an x86 installation. The runtime will complain that the binary is not compatible. The same binary will run fine on an x64 installation.

I have found that a simple workaround is to build an Exe that is specifically configured to build for x86. Any assemblies that you include, can be built with AnyCPU, and will work as expected.
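In the project file for the Exe, that workaround boils down to a fragment like this:

```xml
<PropertyGroup>
  <!-- Build the Exe specifically for x86; referenced assemblies can remain AnyCPU -->
  <PlatformTarget>x86</PlatformTarget>
</PropertyGroup>
```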

It is a small glitch, but easy enough to get around.

Detecting and installing .NET Core

Another problem I ran into, as .NET Core is still quite a fresh platform, is that it may not be supported by the environment that you create your installers with. In my case I had installers built with the WiX toolset. This does not come with out-of-the-box detection and installation of any .NET Core runtimes yet. What’s worse, the installer itself relies on .NET in order to run, and custom code is written against the .NET Framework 4.5.

This means that you would need to install the .NET Framework just to run your installer, while your application needs the .NET 6 runtime, and the .NET Framework is not required at all once the application is installed. So that is somewhat sloppy.

Mind you, Microsoft includes a stub in every .NET Core binary that shows a popup to the user, and directs them to the download page automatically.

So, for a simple interactive desktop application, that may be good enough. However, for a clean, automated installation, you will want to take care of the installation yourself.

I have not found a lot of information on how to detect a .NET Core installation programmatically. What I have found is that Microsoft recommends using the dotnet command-line tool, which has a --list-runtimes switch to report all runtimes installed on the system. Alternatively, they say you can scan the installation folders directly.

As you may know, the .NET Framework could be detected by looking at the relevant registry key. With .NET Core I have not found any relevant registry keys. I suppose Microsoft deliberately chose not to use the registry, in order to have a more platform-agnostic interface. The dotnet tool is available on all platforms.

Also, a quick experiment showed me that the dotnet tool apparently just scans the installation folders as well: if you rename the folder that lists the version, e.g. changing 6.0.1 to 6.0.2, then the tool will report that version 6.0.2 is installed.

So apparently that is the preferred way to check for an installation. I decided to write a simple routine that executes dotnet --list-runtimes and then parses the output into the names of the runtimes and their versions. I wrapped that up in a simple statically linked C++ program (compiled to x86), so it can be executed on a bare-bones Windows installation, with no .NET on it at all, neither Framework nor Core. It will then check and install/upgrade the .NET 6 desktop runtime. I also added a simple check via GetNativeSystemInfo() to see if we are on an x86 or x64 system, so it selects the proper runtime for the target OS.
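The parsing itself is straightforward. Below is a sketch of the core routine of such a C++ helper, assuming the output of `dotnet --list-runtimes` has already been captured (e.g. via `_popen()`); the struct and function names are my own, not from any actual tool:

```cpp
#include <cassert>
#include <string>

// A line of `dotnet --list-runtimes` output looks like:
//   Microsoft.WindowsDesktop.App 6.0.1 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App]
// i.e. runtime name, version, and install path, separated by spaces.
struct RuntimeInfo {
    std::string name;
    std::string version;
};

// Split one output line into runtime name and version.
// Returns false if the line does not have the expected shape.
bool ParseRuntimeLine(const std::string& line, RuntimeInfo& out) {
    const size_t firstSpace = line.find(' ');
    if (firstSpace == std::string::npos)
        return false;
    const size_t secondSpace = line.find(' ', firstSpace + 1);
    if (secondSpace == std::string::npos)
        return false;
    out.name = line.substr(0, firstSpace);
    out.version = line.substr(firstSpace + 1, secondSpace - firstSpace - 1);
    return true;
}
```

From there, the caller can compare the parsed versions against the minimum it needs, and trigger the runtime installer if no match is found.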

Workarounds with DllExport/DllImport

Lastly, I want to get into some more detail on interfacing with legacy .NET assemblies, which are not directly compatible with .NET 6. I ran into one such library, which I believe made use of the System.Net.Http.HttpClient class. At any rate, it was one of the rare cases where the compatibility wrapper failed, because it could not map the calls of the legacy code onto the equivalent .NET 6 code, since it is not available.

This means that this assembly could really only be run in an actual .NET Framework environment. Since the assembly was a closed-source third-party product, there was no way to modify it. But there are ways around this. What you need is some way to run the assembly inside a .NET Framework environment, and have it communicate with your .NET 6 application.

The first idea I had was to create a .NET Framework command-line tool, which I could execute with some command-line arguments, and parse back its output from stdout. It’s a rather crude interface, but it works.

Then I thought about the UnmanagedExports project by Robert Giesecke, that I had used in the past. It allows you to add [DllExport] attributes to static methods in C# to create DLL exports, basically the opposite of using [DllImport] to import methods from native code. You can use this to call C# code from applications written in non-.NET environments such as C++ or Delphi. The result is that when the assembly is loaded, the proper .NET environment is instantiated, regardless of whether the calling environment is native code or .NET code.

Mind you, that project is no longer maintained, and there’s a similar project, known as DllExport, by Denis Kuzmin, which is up-to-date (and also supports .NET Core), so I ended up using that instead.

So I figured that if this works when you call a .NET Framework 4.8 assembly from native C++ code, it may also work if you call it from .NET 6 code. And indeed it does. It’s still a bit messy, because you still need a .NET 4.8 installation on the machine, and you will be instantiating two virtual machines, one for the Core code and one for the Framework code. But the interfacing is slightly less clumsy than with a command-line tool.

So in the .NET 4.8 code you will need to write some static functions to export the functionality you need:

public class Test
{
    [DllExport]
    public static int TestExport(int left, int right) { return left + right; }
}

And in the .NET 6 code you will then import these functions, so you can call them directly:

[DllImport("Legacy48.dll")] // name of the legacy assembly; just an example here
public static extern int TestExport(int left, int right);

public static void Main()
{
    Console.WriteLine($"2 + 3 = {TestExport(2, 3)}");
}


That should be enough to get you off to a good start with .NET 6. Let me know how you get on in the comments. Please share if you find other problems when converting. Or even better, if you find solutions to problems.
