Most of the time, self-hosting sites focus on installing the tools, but most of an admin's time is not the initial setup: it's the long-term maintenance, the updates, the fixes when something doesn't work as expected, the full disk, the failed disk, ... and most of the time I find that those nice websites don't help enough. How will you deal with your self-hosted service when your friends or family rely on it and it is down for an obscure reason on an obscure component you never heard of?
I would very much discourage people reading "how to self-host" articles from hosting for anyone but themselves. As soon as you involve other people, your stress grows immensely; there are all kinds of expectations and possible disappointments. It's one thing to share a few photo folders, but providing online services is a dev job, and you don't want that relationship with them.
Note that I specifically mean the situation where you share your own self-hosted setup with some close people so they can also benefit. If you want to self-host to specifically serve a community, go for it, you're awesome.
FWIW, I'd go a bit stronger and say that, if you are running a service for other people, you aren't "self-hosting" anymore, you're just regular-old "hosting" at that point ;P.
It makes much more sense to form a little co-op, pay some pros to look after it (managed hosting), and contribute to the core, imo. Get an SLA from the managed service that covers restoring service. It doesn't feel as cool, but it is much less stressful as well as fairer to the nontechnical folk.
If it's production grade even just for you, the most boring and reliable way for systems to update themselves is mandatory. Luckily there's way more out there now than there used to be.
I grew up hosting phpBB, game servers, and ratio'd FTPs, among many other things, in the '90s. Dozens of friends used my "services". It didn't contribute to anxiety. It taught me a lot of skills because I wanted the things we used collectively to be available. Were they always? Nope. That was the contract.
Fast forward to today and it's no different. The complexity has increased, yes. But in many ways some of those complexities make things easier. The hosted solutions I provide friends and family with are rarely offline. Do I hit 5 nines annually? The real question is - does anyone care? No. But if I were to guess my downtime is cumulatively less than a few hours per year.
If you're drawn to it, don't be put off by the opinions of others who project these sorts of narratives. For some it may create anxiety, in which case don't do it; others may find a path in their career through what they learn along the self-hosting path.
And if the people you're providing ancillary services for are jerks about a bit of downtime here and there, and those folks are not contributing to your endeavors, then tell them to go fly a kite. Because nobody I've shared self-hosting with has ever truly complained. I've had years of fun banter, especially in the earlier days, around uptime. But those people weren't trying to sow discord. Those people are still friends.
I enjoy following the self-host community, contributing in minimal ways, and self-hosting useful things for friends and family. It doesn't keep me up at night. I'm constantly improving because of it.
Depending on what you are hosting, I recommend hosting just for yourself for perhaps a year. After you have the redundancies, backups, processes and other knowledge, you can share it with your community. I would never expect my family and friends to have the skill or desire to self-host, but I do not like to see them abused by platforms. It is a labor of love for me.
I host a bunch of stuff for friends completely free with the expectation that if there is an outage I'll do my best, on my own schedule, to fix it. If I plan on sunsetting something I'll let them know a bit of time in advance, and that they're responsible for backing up the data they can't afford to lose.
So far everyone is happy with that arrangement, and I don't mind sharing the extra resources I have with people I care for.
Also if you do host for other people they need to know that shit might go wrong and they should have their own backups (which is a good thing to teach them with any service).
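As a concrete example of what "their own backups" can look like for the slightly technical ones, restic pointed at an external drive is plenty (repo path and folder names are placeholders):

    restic -r /mnt/backup-drive/restic init              # one time: create an encrypted repo
    restic -r /mnt/backup-drive/restic backup ~/exports  # snapshot whatever they pulled down
    restic -r /mnt/backup-drive/restic snapshots         # check the snapshots are actually there

For the less technical ones, even a periodic copy onto a USB drive beats your server being their only copy.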
I use Cloudron for this. While I agree that there is _some work_, technology and services have improved to the extent that a reasonably good developer can host apps for themselves, their family, and others.
Yunohost is not a site, it's a Linux distribution focused on making popular self-hosting apps more accessible. It tries to solve exactly what you're talking about - long-term maintenance and excessive complexity of small scale/personal self-hosting.
All of what you mention is something you consider during install, which arguably is what makes the install take most of the time.
The same should be true for the cloud. Surely you are not just jumping into something without analyzing what you are locking yourself into, etc.?
Yes, it sucks if your PSU goes up in smoke at an inopportune time. So you don't put critical services on it, just as you don't on the cloud either. AWS has much worse uptime than my home server. GitHub, Slack, Netflix, etc. too.
Also, you get to do maintenance when you know no one will be bothered by it.
I absolutely do not. I see AWS outages and issues with provisioning all the time at work. Last time my home server went down was because the power went out and we don't have a generator.
i think what gets lost here is how we’re defining uptime. if you’re regularly experiencing ec2 outages that result in your apps and services being unavailable i find it very hard to believe you’re using ec2 correctly.
running a bare one-off ec2 instance on aws is a strange choice if you care about uptime.
the statement doesn’t even make sense. “aws” isn’t a single service and measuring uptime across every aws service is nonsensical. obviously they mean ec2, but even then lumping all regions and azs together is weird.
I would be interested in hearing examples of software that is a breeze to self-host and software that is terrible to self-host. And most importantly, why? What can software devs do to make something a breeze? And what should they avoid? Thanks.
Yup, add a bit of automation to the mix, and you also won't need to repeat much in the rare case where you need to reinstall something from scratch or add a new machine.
> There is terrible software to self-host (both as a beginner and as an expert)
This is the first time I've heard of Yunohost, but their documentation instills a lot of confidence in me.
I've been thinking of setting up Nextcloud or similar to start self-hosting our HOA's email server and file host, but it felt like a huge lift for someone who has never done it before.
Seeing that this is a full-on OS distro (Debian based), has been used and improved for over a decade, has docs that address backups, restores, and various attack vectors, has all app users integrated via LDAP, I feel comfortable at least giving it a try.
IP reputation matters. If you self host, for the love of god don't use DO/AWS/Linode/etc. Small, established hosts are best, in my experience - shoutout to Mythic Beasts and (in times gone by) Bytemark for that.
Updates can be handled by stuff like the unattended-upgrades package in Debian (and derivatives). Then you're left with the once-every-two-years distro version upgrade, which on Debian is just a few commands and 5-15 minutes of updating, plus the occasional config fix if some app changed enough that config options you used are deprecated. Adding non-distro repositories can make it a bit iffy though, as some packages are just made badly and have problems upgrading, but it will generally not break your system.
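For reference, on Debian the setup is roughly the following (package name and config paths from memory, so double-check on your release):

    apt install unattended-upgrades apt-listchanges
    dpkg-reconfigure -plow unattended-upgrades   # writes /etc/apt/apt.conf.d/20auto-upgrades
    # the stock policy in /etc/apt/apt.conf.d/50unattended-upgrades stays within the
    # current stable release (essentially security fixes), so nothing breaking should come in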
If you're using Ubuntu or an Ubuntu derivative, the story is the same except it's Russian roulette whether the upgrade will actually work. In my experience, just fucking don't, or prepare for a full reinstall every few years.
Then there is also a slew of "appliance"-type OSes, like FreeNAS, that also let you run containers. That's probably the best choice if you don't want to play Linux admin once every month or two.
There is a definite lack of an "all-in-one panel" for someone who wants to have "their own Linux server" but doesn't want to set up monitoring and a bunch of little stuff around it. When you want some alert generation, some metric monitoring, plus a few automated things like "resize this partition when it fills up, up to a limit", it's usually a bunch of scripts plus disparate apps.
Then again, it's the "different 20%" problem; 20% of the features of those commonly used stacks (Grafana/ELK/Icinga/etc.) is enough for 80% of the users, but which 20% differs from person to person.
For example, some might only care to have some rough stats to draw a line and see "okay, my hard drive will be full in 2 months, gotta do something about it". But someone else will want the same metrics solution to also ingest a bunch of stuff from their IoT gadgets.
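To make that concrete, the kind of one-off script I mean is roughly this (threshold, mount point and address are made up, and "mail" assumes a working MTA):

    #!/bin/sh
    # crude cron-able check: complain when the root filesystem passes 90% usage
    usage=$(df --output=pcent / | tail -n 1 | tr -dc '0-9')
    if [ "$usage" -gt 90 ]; then
        echo "/ is at ${usage}% on $(hostname)" | mail -s "disk space alert" admin@example.com
    fi

Multiply that by alerting, metrics, certificate expiry, backups, etc., and you end up with the pile of disparate scripts I'm describing.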
> How will you deal with your self-hosted service when your friends or family rely on it and it is down for an obscure reason on an obscure component you never heard of?
... the same thing happens with "all in one" stuff that automates everything you'd do manually on a plain Linux box. Maybe rarer, but also harder to find a solution for. In the end, if you want it to not be a problem, you need to pay to make it someone else's problem (a managed/hosted solution).
> Updates can be handled by stuff like the unattended-upgrades package in Debian
That won't keep up with the breaking changes this type of software usually has.
> If you're using Ubuntu or an Ubuntu derivative, the story is the same except it's Russian roulette whether the upgrade will actually work. In my experience, just fucking don't, or prepare for a full reinstall every few years.
When was the last time you did that? That hasn't been true for literally years.
>> Updates can be handled by stuff like the unattended-upgrades package in Debian
> That won't keep up with the breaking changes this type of software usually has.
This only updates within the current stable release, so there will be no breaking changes, as those are essentially security updates.
The "breaking changes" part is the once-every-two-years distro upgrade, but it's generally very little, although that heavily depends on the software you use, of course. That's from experience with a few hundred machines at work and half a dozen private ones.
But if you wrote a custom config, you will have to change it if the format changed; you won't get away from that no matter how much automation you throw at the thing.
>> If you're using Ubuntu or an Ubuntu derivative, the story is the same except it's Russian roulette whether the upgrade will actually work. In my experience, just fucking don't, or prepare for a full reinstall every few years.
> When was the last time you did that? That hasn't been true for literally years.
Just recently we fixed a user's machine where they tried to upgrade and some random shit broke. It also filled /boot with a bunch of kernels (it did not remove the old ones), which made the machine unbootable because it ran out of space during a kernel update. It could still be booted on an old kernel, but of course the user didn't know that. We've actually migrated a few people over the years to Debian precisely because they somehow broke their Ubuntu during an update.
I imagine it happens much less with actually competent users, and on servers rather than desktops.
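For anyone who hits the same /boot-full situation, the usual recovery is something like this (from memory, and be careful never to remove the kernel you're running):

    df -h /boot                       # confirm it's actually out of space
    uname -r                          # the kernel you're booted into; keep this one
    dpkg -l 'linux-image-*'           # see which old kernels are still installed
    sudo apt-get -f install           # let apt finish the half-done kernel upgrade
    sudo apt-get autoremove --purge   # then drop the kernels nothing needs anymore
    # if apt itself can't run for lack of space, deleting one old initrd from /boot
    # by hand first usually unblocks it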
>> If you're using Ubuntu or an Ubuntu derivative, the story is the same except it's Russian roulette whether the upgrade will actually work. In my experience, just fucking don't, or prepare for a full reinstall every few years.
> When was the last time you did that? That hasn't been true for literally years.
20.04 to 22.04 dist-upgrade... I kept getting weird errors from apt post-upgrade; it was just easier to blow the VM away and start fresh.
Not GP, but 20.04 to 22.04 fucked something up (I can't remember what offhand), and it ended up being quicker and easier to just format and reinstall Ubuntu.
Admittedly I did it right after 22.04 dropped rather than waiting for 22.04.1 (even waiting a week would have been fine), so that's a little on me.
Self-hosting really should be understood as "hosting for myself" rather than "doing the hosting myself". Hence I'd never host for anyone else but me. The stress of having to make services reliable, even if only for my wife and kids, is a no in my book.
This is probably a matter of perspective. I wouldn't recommend it to everybody.
I don’t find hosting services for a couple of friends and my family stressful.
I’m also a full time system admin so it is really a more fun, more freedom, less pressure, less bullshit version of the work I’d love to be doing instead of whatever management and the team came up with.
Another problem with Yunohost at least is that it has no focus at all on security. There are others, like Sandstorm, that at least try to have some basic security.
Can you expand on that a bit? I don't think it's true.
Yunohost has a firewall, fail2ban, user management, access management for its installed apps and documentation on the topic: https://yunohost.org/en/security
I am not the poster, but the comparison with Sandstorm leads me to believe that they meant security as in service isolation, e.g. running potentially untrusted services on one system.
IIUC, Yunohost helps to deploy and manage services in a more traditional way and assumes that each service is trusted; they aren't that rigorously isolated from each other.
Yes, that's what I was referring to. With Yunohost if a single service is compromised, then they're all compromised. That makes for a large attack surface that grows with each app you install.
I used to use this, but I've grown disillusioned with these sorts of self-hosting admin tools that try to be user friendly.
It's great when everything works, but under the hood Yunohost is running a bunch of scripts to e.g. manage distro packages. When that fails, in my experience you often can't recover through the UI, and you then have to work out what the fancy tools were trying to do under the hood so you can log in and fix it yourself. After a few iterations of that I decided it wasn't worth the hassle.
The only one I still use is Sandstorm, because it's more of a container-based system and doesn't rely on scripts that can fail and leave the system in a half-finished state. Its userbase is unfortunately much smaller though, so the package selection isn't nearly as good. For the packages that are there though it's been pretty rock solid.
It can be, but one of the biggest draws to things like Sandstorm and Yunohost is that they handle auth for you. I run a bit of Dockerized infra too but managing users for a bunch of services separately is kind of a pain. You can set up central auth yourself but as a lone sysadmin that's even more of a pain :)
Fair. Couldn't an auth solution for Proxmox exist? Or Sandstorm/Yunohost run within Proxmox?
I remember seeing a GitHub repo where someone had baked all the Proxmox install and maintenance scripts using Rocky Linux, including auth. Much cleaner than where I would organically end up without reinstalling :)
Mostly TTRSS with a bit of DokuWiki, Etherpad, and Davros. I was using Radicale but wanted more email/calendar integration, and I was using the WordPress app for my local hackerspace but its static site generation took too long for our very large website.
I've been looking at the WordPress package lately and I definitely think there is room for improvement. For one, publishing seems to create a complete duplicate copy of all of the media, which is super painful for large sites.
Yeah - that and the lack of feedback when generating the static files. It's fine when it's a small site and generation finishes ~immediately, but when it's going to take a while it'd be helpful to have some indication that things are happening and when they finish.
> YunoHost is an operating system aiming for the simplest administration of a server, and therefore democratize self-hosting, while making sure it stays reliable, secure, ethical and lightweight. It is a copylefted libre software project maintained exclusively by volunteers. Technically, it can be seen as a distribution based on Debian GNU/Linux and can be installed on many kinds of hardware.
Features
Based on Debian;
Administer your server through a friendly web interface;
Deploy apps in just a few clicks;
Manage users (based on LDAP);
Manage domain names;
Create and restore backups;
Connect to all apps simultaneously through the user portal (NGINX, SSOwat);
Includes a full e-mail stack (Postfix, Dovecot, Rspamd, DKIM);
... as well as an instant messaging server (XMPP);
Manages SSL certificates (based on Let's Encrypt);
... and security systems (Fail2ban, yunohost-firewall);
While I get much the same with Debian/Ubuntu and Docker containers for most of the software (and maybe Portainer for graphical administration), I appreciate the effort! While you can't get rid of the need to do some configuration, updates, and other maintenance, I still think it's reasonable to work on making the process more streamlined and easier. Even though I use Ansible for some things, graphical interfaces are also usually a bit more relaxing to browse through (though in my experience the CLI is still needed for non-trivial actions anyway).
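For anyone curious what that baseline looks like, the Portainer part is roughly the upstream quick-start commands (image tag and ports as of when I set it up, so check their docs):

    docker volume create portainer_data
    docker run -d --name portainer --restart=always \
      -p 8000:8000 -p 9443:9443 \
      -v /var/run/docker.sock:/var/run/docker.sock \
      -v portainer_data:/data \
      portainer/portainer-ce:latest

Everything else is just plain containers, which is exactly the kind of per-service plumbing Yunohost is trying to hide.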
Wow, I recognize the name and must have seen it countless times - only now did I realize it's a piece of software and not another PaaS provider... And suddenly the edginess of the name becomes apparent.
I switched over to YNH from a self-rolled stack a few months ago. I've sent the odd patch (dovecot configs, a couple of apps) which tend to be accepted. Anyone who is bored of the rugged individualism of self hosting might find some welcome respite in the YNH community. By focusing on stuff other than "configure the email server" people can do more valuable stuff.
In my case, that means working on a signup form and invite-a-friend functionality.
Yunohost is a gem. I can't imagine not having it anymore. The built-in SSO, letting you stand up any kind of site in seconds with built-in user management, is particularly tremendous.
Regular updates, security taken seriously, extremely helpful community. All around fantastic.
I think I already get a lot of the value from dokku. Dokku has an incredible and incredibly simple backup and recovery plan. Plugins for everything. And given a choice between a GUI and command line, I'll always choose git push. For all those reasons, at a quick glance, I would stick with dokku.
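To illustrate why "git push" wins for me, the whole deploy flow is roughly this (app and host names made up):

    # on the server
    dokku apps:create myblog
    # on my laptop, inside the project's git repo
    git remote add dokku dokku@server.example.com:myblog
    git push dokku main    # or main:master, depending on the configured deploy branch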
But the user management stuff is interesting. Does this feature carry across apps? With dokku you generally have users isolated into each app, and you would have to integrate them yourself across apps if you need that.
Am I wrong about my assumptions here? Anyone using dokku with a good user management system? Or is yunohost more interesting than what I am asserting?
Dokku Pro[1] has team-based management (and user-specific password auth is coming soon). This might fit your needs around user/team management.
Having never used yunohost, from the docs it seems more aimed towards folks who want some sort of homelab or to self-host things, and not necessarily app developers who want a workflow around code releases. My initial thought is that the userbases are different, and that's okay! Use what works for your use case.
I haven't heard of Dokku until now but yunohost does have SSO. However, it's up to the packagers to actually configure the SSO to be used in the applications by default, and some applications are just inherently not built with SSO support to begin with, so it's hit or miss on a per-app basis. It is nice for the apps it works with though.
(not to bash yunohost since I haven't personally used their service)
At my workplace, we use cloudron.io, as their app store feels well maintained and up-to-date. I've never encountered any issues with app updates or instability, so they must be doing something right when it comes to quality control, which is one of the hard parts of self-hosting.
Yunohost is not a service, it's just a project (more like a distro) that makes installing apps easier. There is no paid version or other support options.
I tried this out and it clearly has had a lot of effort put in and was pretty smooth, but I had concerns about security. Last I saw, nothing is containerised or sandboxed. I’d be concerned about hosting a load of loosely maintained services along with my important data like Nextcloud.
I've been using this for a few years, mainly for Nextcloud, Matrix and a few other services. It's been a great experience; I can't say I've run into any major issues so far. Updates can be slow sometimes but nothing to be really concerned about.
I’ve used it for some time, it’s overall good, but it has some issues:
- User privileges can be tricky sometimes; in some cases I wanted to give some users admin access to specific installed apps, and it didn't work.
- Customization wasn't that straightforward, and even if you go the extra mile and change things manually, the next update will bring back the default ugly logos and favicon.
- Didn't try it long enough to check the security, but at the time the admin user still used a password instead of a key pair like my AWS setup, and I didn't see any 2FA for the admin panel, unlike others like Webmin.
So it’s overall great for home use but wouldn’t recommend for any enterprise use.
I think it is worth revisiting, as it has greatly improved since you probably last used it. Within the admin panel I switched to using a key pair and changed my SSH port. There is still no MFA on the main admin panel, but many of the apps, such as Nextcloud and Mastodon, have it. Fail2ban is included and mitigates brute-force attempts.
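For anyone doing the same on a plain box instead of through the panel, the equivalent manual steps are roughly these (port number and host name are just examples):

    # on your own machine: generate a key and copy it to the server
    ssh-keygen -t ed25519
    ssh-copy-id admin@yunohost.example.com
    # on the server, in /etc/ssh/sshd_config:
    #   Port 2222
    #   PasswordAuthentication no
    sudo systemctl restart ssh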
Yunohost is awesome and I've been using it since Debian 9. I moved the configuration from DigitalOcean to Hetzner to OCI, with even an architecture change (amd64 to arm64). It never let me down, crashed, or was left in an unmanageable state. It's "production ready" for private use, non-profits or very small businesses.
For a bit of context, for people to compare with eg sandstorm or cloudron: this is a community project, mostly from the french local/community ISP scene. See https://internetcu.be/ or https://www.chatons.org/en
Yunohost is broader, with more selection of packages, and no cost. Cloudron is probably slightly more polished. I've used both and was much happier sticking with yunohost.