Fixing an old .NET Core native library loading issue on Alpine

by ingve - andrewlock.net


SQLite is pretty much the only remaining native dependency in my C# codebases, and as much as I love the engine, I wish that could go away.

Replacing System.Data.SQLite with Microsoft.Data.Sqlite already helped with Apple ARM builds (despite all the small differences that only showed up in actual use), but pretty much the only native debugging I do these days is related to the "batteries" -- the linked article outlines the general strategy pretty well.
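
For anyone who hasn't seen the general strategy: since .NET Core 3.0 you can hook native library resolution yourself. A minimal sketch of that mechanism (the library name and path layout here are illustrative, not the article's exact code):

    using System;
    using System.IO;
    using System.Runtime.InteropServices;

    static class NativeResolver
    {
        public static void Register()
        {
            // Called whenever a [DllImport] in this assembly needs resolving.
            NativeLibrary.SetDllImportResolver(typeof(NativeResolver).Assembly,
                (libraryName, assembly, searchPath) =>
                {
                    if (libraryName == "e_sqlite3")
                    {
                        // Hypothetical musl-specific location, e.g. for Alpine.
                        var candidate = Path.Combine(AppContext.BaseDirectory,
                            "runtimes", "linux-musl-x64", "native", "libe_sqlite3.so");
                        if (File.Exists(candidate))
                            return NativeLibrary.Load(candidate);
                    }
                    return IntPtr.Zero; // fall back to the default search
                });
        }
    }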

On the one hand, I feel bad about turning into a "pure-Java only" kind of developer (I mean, limiting yourself to H2, the horror...), but on the other hand, I'm increasingly starting to see their point. Oh, well, if AI actually works, I'm sure Microsoft.Native.Data.Sqlite is just around the corner (and, later edit, to prevent confusion: the abuse of 'Native' here is mostly making fun of Microsoft naming conventions -- they'd call it 'Interop' if it were like truly native).

I guess you would need to switch to a dotnet-native database, like LiteDB. Even if you used Postgres, there would still be native code involved, though decoupled from the dotnet application.

It would be interesting, though, whether it's possible to run WebAssembly inside the CLR (the dotnet runtime).
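
It is, via third-party bindings at least. A minimal sketch assuming the Wasmtime NuGet package (bytecodealliance/wasmtime-dotnet); exact API names shift between versions:

    using System;
    using Wasmtime;

    using var engine = new Engine();
    // A tiny module in WAT form that exports an "add" function.
    using var module = Module.FromText(engine, "adder",
        "(module (func (export \"add\") (param i32 i32) (result i32) " +
        "local.get 0 local.get 1 i32.add))");
    using var linker = new Linker(engine);
    using var store = new Store(engine);
    var instance = linker.Instantiate(store, module);
    var add = instance.GetFunction<int, int, int>("add"); // null if not exported
    Console.WriteLine(add(2, 3)); // 5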

But I don't really get the issue with native code inside a dotnet application. In the end, everything you do in dotnet ends up being executed as native code. Even a simple Console.WriteLine() is implemented in native code.
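
And calling native code yourself is one attribute away. A trivial P/Invoke sketch (Linux-only, since it targets libc):

    using System;
    using System.Runtime.InteropServices;

    class Demo
    {
        // P/Invoke: the runtime loads libc and binds getpid for us.
        [DllImport("libc", EntryPoint = "getpid")]
        static extern int GetPid();

        static void Main() => Console.WriteLine($"pid: {GetPid()}");
    }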

I've really tried to like LiteDB (mostly because it can use an IO.Stream as the database backing store, which enables lots of fun scenarios), but even light usage mostly resulted in data corruption and inconsistent result sets, something I've literally never seen with SQLite. Plus, I think the project is pretty much dead?

And yes, of course everything ultimately runs as native code, but deployment is a major issue. As long as you only deploy IL (or, possibly at some point, WASM), you only need to worry about the relatively lightweight CLR (the dotnet executable and its direct dependencies) -- it does get a lot more complex once you go beyond that, unfortunately.
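
For concreteness, these are the two deployment shapes being contrasted, using the stock dotnet CLI (the runtime identifier is just an example):

    # IL-only, framework-dependent: small output, needs a runtime on the box
    dotnet publish -c Release

    # Self-contained for one runtime identifier: bundles the CLR plus any
    # RID-specific native assets (linux-musl-x64 being the Alpine flavor)
    dotnet publish -c Release -r linux-musl-x64 --self-contained true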

Exactly because of these benefits, there is the meme "cgo is not Go": any package with native dependencies kills the ability to cross-compile with the plain Go toolchain.

Same applies to the pain of using native dependencies in Perl, Python, Ruby,...

Many .NET developers are only now slowly discovering the pain of having so many dependencies on C++ DLLs, COM and C++/CLI.

One of the reasons why we still do so many .NET Framework projects at the agency where I work.

> One of the reasons why we still do so many .NET Framework projects at the agency where I work.

What's the issue with porting them to .NET (Core)? Most of that stuff is still supported. If you only have native Windows DLLs you would still be constrained to Windows, but that's still better than staying on the ancient .NET Framework.

If you are stuck on Windows and an upgrade to Core requires more than 2 hours of dev time, it's likely not worth it. Core's biggest feature is running on Linux. If you can't, who cares? Framework is not going anywhere; it will be supported until 2035 for now.

> Core's biggest feature is running on Linux

There are so many features that .NET 5+ brings to the table. Even if the features aren't important, the performance improvements you get with the newer versions should be enough to justify moving to it.

I agree the support side is annoying, but honestly it's really just "security" fixes, and "security" is a very loosely defined term here, which gives MS a lot of wiggle room to not actually support it.

No, Linux support is not the only new feature. Read the changelogs; there are thousands of big improvements everywhere.

As someone who deals with this: Framework -> Core on Windows is a small-percentage performance improvement. Framework on Windows -> Core on Linux is huge, and most of that comes from leaving Windows, not from Core itself.

Yes, there are other nice language features, but a 15-year-old Framework codebase has probably already put up guard rails around those sharp edges.

My point still stands: I can't imagine most companies green-lighting a .NET Framework -> Core conversion if they can't switch to Linux. If you are stuck on Windows, you have probably already developed all the tooling to deal with Windows, so it's all sunk costs.

Linux was rarely part of the conversation when I was doing these conversions.

Getting access to things like Kestrel (breaking out of the IIS jail) is way more critical. Also, self-contained deployments mean you can stop shipping magical blessed machine images around. It's not even about the performance; it's about having technology that doesn't actively hate you.
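
For anyone who hasn't made the jump: the whole web server is now just code you own, with no IIS registration anywhere. This is the standard minimal-API shape (assumes the .NET 6+ Web SDK with implicit usings):

    // Program.cs: Kestrel is the server; there is no external host to configure.
    var builder = WebApplication.CreateBuilder(args);
    var app = builder.Build();
    app.MapGet("/", () => "Hello from Kestrel");
    app.Run();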

We solved that headache with Windows containers, but yeah that is literally shipping a blessed machine in disguise.

Adding to this, IME, doco for .NET Framework is increasingly hard to find because the .NET Core stuff bubbles up in its place - possibly also because Google is absolute trash these days. Not a big deal if you have some hardcore .NET lifers who know Framework inside out... but I'd say those are heavily outnumbered, and eventually going to move on to goose farming anyway.

In that regard, I can tell you that having been around .NET since day zero really helps with actually finding what I want.

Also, they aren't the only ones; the same applies to the Apple ecosystem, where only those who were around in the early Mac OS and OS X days manage to find sensible documentation.

Someone has to pay for the work.

Would the WebAssembly version of SQLite be OK for you?

I haven't really kept up with the state of WebAssembly in .NET (mostly because I'm entirely uninterested in Blazor), but I don't think we're at a point yet where we can simply reference a .wasm file and invoke code in it regardless of the underlying platform, right? Until that is the case: no, not really.

Nice walkthrough. You come across a lot of these old .NET-based architectures in consulting. Not just older .NET Core projects, but still a lot of .NET Framework too.

It’s a hard sell to a client that they should use their budget for an upgrade to the core of their software without any visible feature additions. Hence they tend to live on in this limbo state.

Beneficial for consultants however.

This is the heart of the problem with outsourcing that the powers that be rarely recognize. Yes, they might pay less by offshoring the initial development costs (CAPEX), but the long-term support and maintenance costs (OPEX) are then way higher. It also gets harder and harder to find developers who want to work in this aging tech.

I've seen the struggle with not upgrading frameworks and staying on an old version. Usually the cost/risk of upgrading is perceived as much higher than it actually is, while the cost/risk of technical debt is mostly ignored. I've seen this issue so often: "we can't use this external component, because we would need to upgrade the framework. Let's just work around it, it will only take 2 days" (and 50 more days to maintain it over a decade).

I fail to understand why they feel the need to test their setup with the latest Alpine while at the same time using out-of-date, unsupported versions of .NET.

On the flip side, good debugging!

I really don't get why people still bother with unsupported dotnet versions. There might be a few edge cases that prevent upgrading, but in 99% of cases an upgrade from dotnet 3.1 to dotnet 10 is completely smooth.

Running on an unsupported dotnet version also means that there won't be any security patches. Not great.

Because in many companies that isn't a 5-second job of changing a csproj file.

It requires clearance from management to spend actual money, measured as the hours of everyone involved times their hourly rate: updating every single configuration file and CI/CD build script, doing a QA round on the staging environment to validate everything still works as before, doing the production delivery, and finally telling everyone the new version is greenlit for development.

Naturally, a security assessment concluding that an upgrade is required is a good way to make that budget come to fruition.

I know that. But if you never upgrade, your project drowns in technical debt to the point where it only eats up money and no changes can be implemented anymore.

Well, there is "netstandard 2.0", which lets you target both .NET Framework 4.6.1+ and .NET Core 2.0+ with the same code.
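
i.e. a library csproj as simple as this sketch can be consumed from both runtimes:

    <Project Sdk="Microsoft.NET.Sdk">
      <PropertyGroup>
        <!-- one compiled assembly, usable from Framework 4.6.1+ and Core 2.0+ -->
        <TargetFramework>netstandard2.0</TargetFramework>
      </PropertyGroup>
    </Project>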

That usually involves some serious work to get done. Just upgrading to dotnet 8/9/10 is much easier.

> in 99% an upgrade from dotnet 3.1 to dotnet 10 is completely smooth.

> Running in an unsupported dotnet version is not great

Uh, dotnet 10 is currently versioned "10.0.0-preview.7". It won't be released until November 2025. It's therefore 1) not guaranteed to be smooth and 2) unsupported. Source: https://dotnet.microsoft.com/en-us/download/dotnet/10.0

Perhaps you mean .NET 9.

Yes, it's a smooth update in many scenarios.

.NET 10 is the next LTS version. It's going to be released around November and is already quite stable. Now is the time to upgrade to 10, do some testing, and then deploy once 10 is released. 9 is a non-LTS version and 8 (LTS) is already 2 years old (you've lost 2 years of support already).

Edit: Microsoft officially supports .NET release candidates for production use; the first RC is probably going to be released within the next few weeks.

> Running in an unsupported dotnet version is not great

They meant netcoreapp3.1 and net5.0.

Also, did non-Alpine not work? The size difference between the two is 200MB, which is probably insignificant for 99% of .NET users.

Because often, somebody wrote something a few years ago and there isn't a business case to constantly upgrade every single dependency.