I can understand why they'd complain about version mismatches when installing dependencies, since in those circumstances failing fast prevents me from running into deprecated functions down the road, as would happen with PHP. However, the fact that different versions of Ruby/Rails ship in different OS distros basically mandates that I either use containers or change the contents of my Gemfile to reflect the version I'll be using during build, which carries the aforementioned risks.
That said, Ruby and Rails are both far more stable than the current npm or pip ecosystems: Rails development has slowed down somewhat, and the framework isn't broken every week by some package introducing breaking changes. That's not to say it's better in most conceivable ways (in regards to scalability, for example), but as far as batteries-included solutions go, it's pretty okay.
> If you only need sqlite as a db you could easily compile a static binary that will run forever (as in foreseeable future) and serve content via unencrypted HTTP.
Actually, static binaries are perhaps one of the better ways to ship software, especially with static linking, as long as you're ready to accept certain security risks in the name of long-term stability. That doesn't prevent you from building new versions in an automated fashion either, at least until something breaks down the line and (hopefully) your tests alert you that manual intervention is needed.
It feels like Java sort of tried to be this, as did .NET, but so much functionality out there depends on reflection and standard library classes that many projects are stuck on JDK 8, and the whole .NET/Mono --> .NET Core --> .NET cycle is as promising as it is problematic to deal with. As for actually workable options nowadays, I'm not too sure: most ways to encapsulate scripts in static binaries fail miserably (containers can mitigate this, but don't address the root issue), and otherwise there aren't too many technologies out there that are good for this.
If I wanted to go down that route, I'd probably go with Go, since it doesn't have the problem of needing a JDK (and GraalVM is still brittle, for example with Spring Boot, an issue that Go doesn't have). Any other options that you can think of? I really like the idea behind Lazarus/FreePascal, though their web server offerings are really lacking, which is sad.
As for HTTP, I largely agree: read-only sites don't necessarily have to be encrypted, even if that can hurt SEO.
> If you then use a reverse proxy that can be automated with certbot you will have a system where all the maintenance work is done by the EFF, the reverse proxy developers and your distro's packaging team.
I am already doing this, but as the Caddy v1 --> v2 migration showed, even web servers and their integrations are subject to churn and change. I'd say it's only a question of time until Apache/Nginx/Traefik + Certbot run into similar issues, whether through new methods of getting certificates becoming necessary, or through something changing elsewhere in the supply chain. And even then, your OS's root CA bundle might need to change, which may or may not cause problems. Old Android phones have essentially been cut off from the internet for this very reason; the fact that I can't (easily) install Linux on those devices and use them as small monitoring nodes for my homelab disappoints me greatly, especially since custom ROMs can brick devices due to lacking driver support.
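For reference, the setup I'm describing is roughly the standard distro-packaged one; a fragment along these lines (hypothetical domain and upstream port), with certbot managing the certificate paths:

```nginx
server {
    listen 443 ssl;
    server_name example.com;  # hypothetical domain

    # Managed by certbot; these paths change if the ACME tooling does.
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}
```

Every line of that is a dependency on someone else's upkeep: nginx's config syntax, certbot's renewal hooks, Let's Encrypt's ACME endpoints.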
So, sadly, if I get hit by a bus tomorrow, it's only a matter of time until my homepage stops functioning and the memory of me disappears forever. Of course, that's just a silly thought experiment; I recall another article on Hacker News that pondered how someone could keep code running for centuries. It didn't look too doable.
> It feels like Java sort of tried to be this, as did .NET
I meant compiling your whole logic and libraries into one (big) binary, so you won't depend on any runtime that might ever change, except for your operating system's syscalls.
> most ways to encapsulate scripts in static binaries fail miserably [...] and otherwise there aren't too many technologies that are good for this out there.
Lua is about as stable as it gets, and (minimal) WASM runtimes will also probably live forever (or are, most likely, interchangeable if not). For both of them you'll need to build the interface yourself, so any breaking change will at least be your own fault.
> Any other options that you can think of?
Rust, or if you are a bit masochistic, C or even C++. Rocket 0.5 (Rust) looks really nice (as in ergonomic) as a web server, and IIRC you can statically link with musl instead of glibc.
> I'd say that it's only a question of time until Apache/Nginx/Traefik + Certbot run into similar issues
But then you'll at least have HTTP as a fallback.
> So sadly if i get hit by a bus tomorrow, it's only a matter of time until my homepage stops functioning and the memory of me disappears forever.