Recap: One Year Using C# 9 with .NET Framework

TL;DR: You can use C# 9 language features with .NET Framework. Compiler features work out-of-the-box. Features that need runtime support can be enabled with tricks.

“Yeah, I knew that. What’s your conclusion?”

By the time I finally get around to publishing this, it may be old news, but in early 2021 this seemed to not have been well-known at all:

Just as you can use SDK-style projects with .NET Framework, you can use C# 8, 9 and maybe even later language versions.
This means if you are stuck with .NET Framework or netstandard2.0, you can still get a lot of the benefits of .NET 5/6/x, in both the code world and the SDK world.

The one prerequisite is that you use SDK-style projects.
The other is that you use a C#-9-capable compiler.
You probably already have the latter in your SDK installation.

As others have written better, setting <LangVersion>9.0</LangVersion> in your project file enables C# 9 for your project, independent of its TFM (e.g. net48 or netstandard2.0).
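For reference, a minimal SDK-style project file with that setting might look like this (the TFM is just an example):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net48</TargetFramework>
    <!-- Opt in to the C# 9 language version explicitly. -->
    <LangVersion>9.0</LangVersion>
  </PropertyGroup>
</Project>
```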
You can now immediately use pure language features like switch expressions.
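For example, relational and combined patterns in a switch expression are C# 9 features that need no runtime support, so something like this compiles against net48 as-is (the method is just an illustration):

```csharp
public static class Sizes
{
    // Relational patterns (< 0, > 0) and the `and` combinator are C# 9.
    public static string Describe(int n) => n switch
    {
        < 0          => "negative",
        0            => "zero",
        > 0 and < 10 => "small",
        _            => "large",
    };
}
```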
To use things like nullable reference types or records, you need to work a little harder.
There are a couple of classes that need to be present during compilation for them to work.
If the compiler does not find those classes, it will tell you explicitly and show an error.
There is no risk of the build silently succeeding with broken output and you subsequently facing runtime errors.
If code using the features compiles, the features will work at runtime.

You could add the missing attributes yourself, or just use one NuGet package for getting NRTs and another one for getting records.
In my current project I had already added them myself to a “LanguageBackports” folder that was then included in all projects, before I discovered that there were ready-to-use NuGet packages available.
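If you prefer the manual route, the types are small. A sketch of what such a “LanguageBackports” file can contain (the namespaces must match exactly, because the compiler looks these types up by their full name; only two representatives are shown):

```csharp
namespace System.Runtime.CompilerServices
{
    // Lets the compiler emit init-only setters, i.e. records and `init` properties.
    internal static class IsExternalInit { }
}

namespace System.Diagnostics.CodeAnalysis
{
    // One of the NRT flow-analysis attributes; the others
    // (MaybeNull, NotNull, DoesNotReturn, ...) follow the same pattern.
    [AttributeUsage(AttributeTargets.Parameter)]
    internal sealed class NotNullWhenAttribute : Attribute
    {
        public NotNullWhenAttribute(bool returnValue) => ReturnValue = returnValue;
        public bool ReturnValue { get; }
    }
}
```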

So far, so good.

But what about the framework APIs?

The one thing you have to look out for is the state of nullability annotation of the libraries you use.
At the time of publishing this post, a lot of netstandard2.0 libraries have been annotated.
That was not the case back then, and it will never be the case for .NET Framework assemblies.

One way of having annotations would be to use an IL weaver that simulates an annotated .NET Framework, like ReferenceAssemblyAnnotator.
I have not tried that out though.

Another way would be to use multi-targeting with the one unannotated TFM you really want and one annotated TFM that is as close as possible to the “real” one.
Instead of declaring <TargetFramework>netstandard2.0</TargetFramework> you would do <TargetFrameworks>netstandard2.0;net5.0</TargetFrameworks>.
You would then view the files as net5.0 in VS to see the annotations and get the NRT warnings.
When compiling, you would still only use netstandard2.0 as the target and avoid the disadvantages of multi-targeting like increased package size.
That whole technique relies on the used API surface being the same, so you do not need preprocessor directives for a TFM you are not even really using.
It also relies on the matching APIs behaving the same with regards to nullability.
This may not always be the case, especially considering that some APIs have moved from the runtime’s core library into independently developed NuGet packages.
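Sketched as project-file fragments, the setup could look like this (how you restrict the actual build to netstandard2.0 depends on your build setup; the Release-only condition below is just one assumption):

```xml
<!-- Both TFMs, so the IDE can show the net5.0 annotations and NRT warnings. -->
<PropertyGroup>
  <TargetFrameworks>netstandard2.0;net5.0</TargetFrameworks>
</PropertyGroup>

<!-- Hypothetical switch: ship only netstandard2.0 to keep the package small. -->
<PropertyGroup Condition="'$(Configuration)' == 'Release'">
  <TargetFrameworks>netstandard2.0</TargetFrameworks>
</PropertyGroup>
```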

Using framework APIs as MS made ’em

NRT flow analysis is pretty good at telling what can and cannot be null.
It even considers implicit conversions and other less obvious code paths.
This means more discovered potential bugs.
But it can also mean you start to rely on it.

When using unannotated libraries, you cannot rely on it, but have to be extra careful.
I did not have that problem a lot, because the code calling into framework APIs already did null checks where appropriate (at least most of the time), and flow analysis takes those checks into account as well: if you check something for null, the compiler assumes it can be null, even without an annotation, and even when the annotation says otherwise.
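A small illustration of that behavior. The `LegacyConfig` class is a made-up stand-in for an unannotated framework API; the `#nullable disable` region simulates its obliviousness:

```csharp
#nullable disable
// Stand-in for an unannotated (nullable-oblivious) framework API.
static class LegacyConfig
{
    public static string GetSetting(string key) => key == "known" ? "value" : null;
}
#nullable restore

static class Demo
{
    public static void Run()
    {
        string value = LegacyConfig.GetSetting("known"); // oblivious API: no warning here

        if (value != null) // this check tells flow analysis that value can be null
        {
            System.Console.WriteLine(value.Length); // fine: value is non-null in this branch
        }

        // After the if, value is considered maybe-null again:
        // System.Console.WriteLine(value.Length); // would warn (CS8602)
    }
}
```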

In general for unannotated libraries, you do not lose anything (in C# 7.3 the compiler did not warn you about nulls either), as long as you stay alert and do not rely on the compiler to do your null checking for you.
“If it compiles it runs” is not true here, at least not when using .NET Framework 4.8 or netstandard2.0.

Having NRTs available in my own code was a booster to the expressiveness of my APIs and the simplicity of my code.
I do not want to go back.

The lack of annotations led to some annoying false positives, though.
Fortunately the culprits have been few.

string.IsNullOrEmpty() and string.IsNullOrWhiteSpace() were the most common ones.
An easy remedy is to create an annotated extension method and use that everywhere instead of the built-in methods:

public static bool IsNullOrEmpty([NotNullWhen(false)] this string? str) => string.IsNullOrEmpty(str);
public static bool IsNullOrWhiteSpace([NotNullWhen(false)] this string? str) => string.IsNullOrWhiteSpace(str);

You can add the cs file containing those methods to every nullability-enabled project with something like <Compile Include="..\StringExtensions.cs" />.

Another annoyance is HasValue on nullable value types.
This could be solved by an extension method as well.
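A sketch of such an extension (the name `IsSome` is made up; any name that differs from the `HasValue` property works, and on older TFMs the `NotNullWhen` attribute itself has to be available, e.g. via a backport):

```csharp
using System.Diagnostics.CodeAnalysis;

public static class NullableValueExtensions
{
    // Annotated replacement for the unannotated HasValue property.
    public static bool IsSome<T>([NotNullWhen(true)] this T? value) where T : struct
        => value.HasValue;
}
```

Usage is then `if (x.IsSome()) Use(x.Value);`, and flow analysis knows `x` holds a value inside the branch.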

Of course you need to remember to use the extension methods instead of the built-in ones.
Or write an analyzer enforcing that you do.

Conclusion

What does it feel like using C# 9 in .NET Framework and netstandard2.0 projects?
After more than a year, I can say: it feels really good.

Yes, you have to work a bit once to enable C# 9 language features for your .NET Framework and netstandard2.0 projects.
Yes, there are some annoyances and gotchas.
Yes, you should do it anyway.

Having C# 9 language features available can make your code more concise, more readable and overall more expressive.
NRTs and pattern matching make the compiler work for you more.

But maybe most importantly in the long term, it keeps you as a developer up-to-date on new language features, even if you are stuck on legacy target runtimes.