The homelab and self-hosted movements are thriving on the internet. People have various reasons for taking part, from self-education all the way through to keeping their data out of the hands of Google, Amazon or any other big player.
My personal reasons for running a home server environment are mostly about learning, with the minor perk of running my own systems.
Over the years my home setup has taken many forms and many guises, and with each iteration the list of software and services I use grows longer.
The purpose of this list is to share that, as some of it might be useful to others. Chances are you'll staunchly disagree with some of it; that's fine, as it's a matter of choice. I'm open to other suggestions, though a good reason why should be put forward.
I'm also not averse to using cloud services, or paying for them, if it makes my workflow easier, so this isn't a list of only free software.
Virtualisation first: there are a few decent choices out there depending on your needs. Microsoft offers the bare-metal Hyper-V for free and VMware offers ESXi for free, both of which scale up well to enterprise from a learning perspective. There's also OpenStack, which is well documented for single- or multi-host installs, and if you were so inclined you could even do a bare-metal VirtualBox setup maintained by scripts and a PHP front end.
I've personally gone with Proxmox.
The original reason was that, back when ESXi lacked a web front end and relied on a Windows-only client, managing it was a pain. As time has moved forward, Proxmox's bare-metal installer, a Debian spin with an Ubuntu-derived kernel, has gained features such as clustering, LXC support, centralised storage and the like, all within the free tier.
Proxmox is a bare-metal install in as much as it installs as the OS, not on top of one. The base install has a tiny RAM and CPU draw in comparison to ESXi and Hyper-V, so it also suits lower-powered systems.
Hardware platform support is also better than ESXi and Hyper-V (and yes, I understand enterprise supportability): because Proxmox is based on Debian, it will install on almost anything. I currently run a four-node Proxmox cluster on an HP ProLiant Gen8, a Gen10, a ThinkPad 240 and a MacBook Pro, each with 16 GB of RAM and around 1 TB of disk.
Note: my next upgrade is centralised storage via a NAS, to provide failover in the cluster and a backup partition to back VMs up to, a feature built into Proxmox.
One issue if you're migrating from VMware ESXi is disk format: because Proxmox uses KVM under the hood, converting vmdk or ova files to qcow2 can prove to be a chore.
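The conversion itself boils down to a couple of commands with qemu-img, which is already present on a Proxmox host as part of the QEMU tooling; the file names, VM ID and storage name below are placeholders:

```shell
# An OVA is just a tarball, so extract it first to get at the vmdk inside.
tar -xvf appliance.ova

# Convert the VMware disk to qcow2 ("vm-disk.vmdk" is a placeholder name).
qemu-img convert -p -f vmdk -O qcow2 vm-disk.vmdk vm-disk.qcow2

# On the Proxmox host, import the converted disk into an existing VM
# (VM ID 100 and storage "local-lvm" are example values).
qm importdisk 100 vm-disk.qcow2 local-lvm
```

The chore is mostly in shuffling the large image files around and then attaching the imported disk to the right bus in the VM's hardware tab.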
Some other useful features make this software easy to use right out of the box, such as console access: it's built right into the web interface using the browser-based noVNC client, which means anything with a browser, including ChromeOS, gives you management and console access to your VM environment.
There is a (paid) app called Aprox which not only lets you start/stop and view what's happening with your Proxmox servers or cluster, but also provides console access to the servers, which can be handy when working remotely over a VPN, though it's possibly more useful on a bigger-screened tablet.
I've only ever used Proxmox on my own home systems, so I can't comment on it as a business solution. However, if you run VMs at home, or are looking to, and want a solution which runs on just about any hardware you throw at it, offers enterprise-level features and only asks that you're willing to do a bit of reading, I'd fully recommend Proxmox.
Papertrail - https://www.papertrail.com/
Log management is important: aside from monitoring, it is the thing that will provide you with the most information about your systems.
There are plenty of options for this, from Graylog to ELK, and again what you choose really comes down to the level at which you want to learn and maintain your systems.
I chose Papertrail, a SolarWinds product, because it ticks a few boxes for me, the main one being that I pay for 2 GB of logs each month and I'm happy having them stored off my servers.
Setup is pretty easy: there is a copy-and-paste line of code for Windows, Linux, BSD, OSX and so on which will push the syslog files to the Papertrail servers, and if it's application logs you're after, it utilises NXLog to do this.
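On Linux, for instance, that one-liner typically drops a single rsyslog forwarding rule; a hand-rolled version looks something like this (the hostname and port are placeholders for the values Papertrail assigns your account):

```
# /etc/rsyslog.d/95-papertrail.conf
# Forward all syslog messages over UDP to Papertrail.
# Replace the host and port with the endpoint shown in your dashboard.
*.* @logs2.papertrailapp.com:12345
```

Restart rsyslog afterwards and the host should start appearing in the Papertrail event stream within a minute or so.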
Once set up, you can group the servers however you want in the dashboard.
The true power of Papertrail for me, however, was being able to search my logs in real time via the web browser.
Much like ELK and similar software you may host yourself, you can also create alerts on specific log events, pushed by email, Slack or other channels when those events occur or when aggregated.
The search bar is also pretty quick when searching back through the logs.
There are, as I said, many self-hosted options for doing this; the cost is what you have to weigh up.
Netdata is server monitoring on steroids. To start off, it's a local install with a one-liner found on the Netdata site.
Once installed (*nix-based systems only), you're directed to a URL on port 19999 which opens up this near nirvana of monitoring.
The seemingly never-ending downward scroll presents graphical, real-time breakdowns of systems, applications and everything else running on the host.
Each release provides a greater set of plugins, delving into more far-reaching information on applications.
You can run Netdata locally, and should you so wish there are plenty of online instructions explaining how to use InfluxDB and Prometheus to pull data from Netdata and display the output in Grafana.
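The Prometheus side of that pipeline is just a scrape job pointed at Netdata's built-in exporter endpoint; a minimal sketch for prometheus.yml, assuming Netdata is on its default port and the target address is a placeholder:

```yaml
scrape_configs:
  - job_name: 'netdata'
    # Netdata exposes its metrics in Prometheus text format here.
    metrics_path: '/api/v1/allmetrics'
    params:
      format: [prometheus]
    static_configs:
      # Replace with the host(s) actually running Netdata.
      - targets: ['192.168.1.10:19999']
```

From there, Grafana reads the Prometheus data source as usual, giving you the longer-term, time-based view Netdata alone doesn't keep.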
If this sounds like hard work, netdata.cloud launched recently: you "claim" your Netdata installs and pull the data into a centralised Netdata Cloud instance.
It's early days for the netdata.cloud interface, however it's slowly improving and is a useful interface for data aggregation.
While many items on this list have open-source alternatives, Netdata stands out as a really useful service for real-time system monitoring and, if you use Grafana or the cloud service, time-based snapshots too. The fact it looks nice, and that the Netdata community is highly active (especially on Twitter), helps as well.
While I've already included Papertrail above and covered that ELK can become unwieldy on a home lab, Loki is the opposite. Log data is collected with Promtail, pushed into Loki, and from there can be presented in Grafana.
Treating log data this way, with what the Loki project describes as potentially petabyte-level horizontal scaling, means Loki fits in well if your home server setup is based around DevOps tools like Prometheus and containers, as the toolset is (or should be) pretty much in place to extract the raw logs and then present the log data in tools you're used to.
It is still early doors for the project, however it works well and is low maintenance, and thus easy to run on a home server setup.
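As a sketch of that pipeline, a minimal Promtail config that tails /var/log and ships it to a Loki instance might look like this (the Loki URL and labels are placeholders):

```yaml
# promtail-config.yml: tail /var/log/*.log and push entries to Loki.
server:
  http_listen_port: 9080
positions:
  filename: /tmp/positions.yaml   # where Promtail records how far it has read
clients:
  - url: http://loki.home.lan:3100/loki/api/v1/push
scrape_configs:
  - job_name: system
    static_configs:
      - targets: [localhost]
        labels:
          job: varlogs
          __path__: /var/log/*.log
```

Point Grafana at the same Loki instance as a data source and the logs become queryable alongside your Prometheus metrics.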
Moving away from services and onto tooling: Termius. Termius is an SSH (and Mosh) client which runs on Windows, Linux, Mac, iOS and Android.
Termius has both a free and a paid tier, and its best feature for me is this: once you create a Termius login and add hosts to connect to using key- or password-based SSH, opening the Termius client on any other platform you're logged into gives you the same hosts, automagically set up with the credentials and accounts ready to use.
I love this feature, as I'm always adding and removing SSH endpoints and I work across Linux, Android, ChromeOS and Mac, so it makes one part of my connectivity workflow so much easier.
Termius offers proper end-to-end encryption and a couple of really neat features, like the in-app keyboard having all the keys you'd need in an SSH session, which is useful as keyboards on mobile devices might not have them by default. The keyboard also has a nifty auto-complete feature.
This fits in well with Snippets: small command lines you use often, saved so you can run them as macros within a session.
There is also an SFTP client built into Termius, and while it's functional, it's not the prettiest or most intuitive solution at present.
The free version is very basic and offers ad-free SSH and Mosh connectivity. The premium version is worth it if you make heavy use of SSH across multiple platforms, as the sync feature comes in handy.
There is also a Teams version (nothing to do with Microsoft's chat tool) which provides the ability to manage accounts across teams with RBAC control using Termius.
For a very long time I was looking for a replacement for TeamViewer, and for a very long time that replacement was RealVNC. Then the team who brought the world iDrive ran a deal: with 90% off, and proof I was leaving another system, I ended up with a very cheap remotepc.com setup.
Anyone who's used TeamViewer will understand the basic setup here. Everything is controlled by a central dashboard.
Each client you want to attach runs a local Windows, Mac or Linux (RPM/DEB) agent. The local install then provides the user with an interface to enable remote connection, using a password for access.
I've been using this for about six months now, and while I've found the RPM on openSUSE a bit flaky, a recent 1.1.1 update to the Linux client will hopefully solve this issue. Ubuntu, Windows and Mac, however, seem pretty solid.
Remote access is controlled via the internet and is accessible using Windows, Mac, Linux, iOS or Android clients, or a web interface, the latter being really useful on the Chromebook.
The remote desktop speed, even on 4G, seems really solid; connecting from my mobile on a train over 4G, I was able to work on a Windows server remotely with no issues.
The session, as you'd expect, is fully TLS 1.2/AES-256 encrypted, and within that bubble file transfer to and from the remote device is also an option.
Even if you only run this on family PCs, saving the heartache of trying to fix issues remotely makes it worth the cost.
Where to start with Ansible? There's a learning curve and there are plenty of examples; once you get your head around it, you'll wonder why you didn't build your servers and desktops with it before.
I've run the gamut of Chef, Puppet and now Ansible over the last five years, and even as the hardened "a bash script can do this" person I am, Ansible is the right direction if you're building Linux systems.
However, it's important, as with all the tools above and many others, to use Ansible for what it is designed to do. It's amazing at deploying applications, running updates, setting up services and so on. And while it's 100% possible to use it to ensure an Ansible-driven configuration stays the same over time, there are better tools for keeping immutable config files honest; personally, I feel Puppet is better there. Which is good, because Puppet sucks at installing applications.
What makes Ansible such a good option for configuring Linux systems is the lack of an agent. Communication is (in my setup) controlled from a small SSH host, which reaches out using passwordless SSH (via keys) to the external hosts. I'm then able to update either the Ansible hosts file or the inventory to reach the boxes as needed, either across all hosts or in groups (I've grouped mine by OS).
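As a sketch of that layout, an inventory grouped by OS plus a small playbook might look like this; the hostnames, group names and file names are all hypothetical:

```yaml
# inventory.ini (INI format), grouped by OS:
#   [debian]
#   web01.home.lan
#   pihole.home.lan
#   [suse]
#   monitor01.home.lan
#
# update-debian.yml: apply pending updates to the debian group over SSH.
- hosts: debian
  become: true
  tasks:
    - name: Update the apt cache and upgrade all packages
      apt:
        update_cache: yes
        upgrade: dist
```

Run it from the SSH host with `ansible-playbook -i inventory.ini update-debian.yml`; a quick `ansible -i inventory.ini all -m ping` first confirms the passwordless keys are in place on every box.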
I wrote an introduction article on getting Ansible set up which may be of help, and if the command line isn't your thing, Ansible AWX, the feeder project for Ansible Tower, provides a web interface which may also help.
If you run multiple services and servers and are looking for a consistent build and setup, then Ansible is the tool.
If you run any external-facing services on your home setup, then knowing whether they have gone down, and what long-term connectivity to them looks like, is covered by tools like Pingdom and ThousandEyes. The tool I prefer, however, is Uptime Robot, mainly because it's simple to set up and get working, and for what I want/need it's free.
Providing everything from basic ping checks through to HTTPS and login-type checks, and reporting by email or via your favourite chat/collaboration app, this is a simple tool that does what it says on the tin.
The free tier at the time of writing provides 50 checks at five-minute intervals from 200+ locations around the world.
Although I update my Let's Encrypt SSL certs automatically, it's also nice that Uptime Robot can check SSL cert validity and send a notification 30 days before expiry.
Earlier I covered Ansible, which is a fantastic tool for pushing consistent, code-driven builds out to servers and Linux nodes. Rundeck is the tool to do the pushing.
Running scheduled tasks has for so long been the remit of decentralised cron jobs on Linux servers. As your home server environment grows, managing these jobs becomes more and more of a pain. I went down the route of Git-managed cron jobs, which worked for a while; I even had Ansible deploying these timed jobs from Git onto servers.
What I was really after was a centralised way of doing this, and for a while that was Jenkins, running commands (rather than Groovy scripts) on remote servers which all had the Jenkins agent installed.
Jenkins is a cool bit of software; however, I class it as software written by developers, for developers, run by developers. If you really want to get the best out of it you need to get into the code, and even with the pretty new interface, it's a difficult beast to manage once you start using complex pipelines to do things on your systems.
Rundeck is a more polished, lower-footprint take on the same idea. No remote Rundeck agents are needed; just like Ansible, you can run it over passwordless SSH.
The installation is not the easiest, as I found the documentation a little presumptive about what you did and didn't know about setting up Rundeck. As such, I wrote a guide to getting the service installed and set up with some basic nodes and jobs.
The simplest method of running Rundeck is to point it at your Ansible jumpbox and have it execute jobs from there.
Out of the box there is support for running basic bash commands, Ansible playbooks, SCM with Git, and pretty much everything you'd want in order to run remote jobs on schedules with notifications and an audit trail.
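To give a flavour, a scheduled job exported in Rundeck's YAML job-definition format looks roughly like this; the job name, schedule, inventory path and playbook name are all placeholders, and I've trimmed it to the interesting fields:

```yaml
- name: nightly-updates
  description: Run the Ansible update playbook from the jumpbox
  schedule:
    time:
      hour: '03'
      minute: '00'
      seconds: '0'
  sequence:
    keepgoing: false
    commands:
      - exec: ansible-playbook -i /home/ansible/inventory.ini update.yml
```

The same job can be built entirely in the web interface; the YAML export is just handy for keeping job definitions in Git.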
And all this is for free.
Portainer - https://www.portainer.io/
Docker is a great tool, and there is no argument that its power comes from the command line. Learning it, however, takes a little time, and while you're doing that, there is Portainer.
Portainer is a web-based interface for managing your Docker environment, local or remote, including Docker Swarms. It's a great tool for putting a web GUI over your Docker environment and quickly seeing what's happening in the logs or jumping into containers.
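Getting Portainer up is itself a nice first Docker exercise; at the time of writing the quick start is a two-liner along these lines (container name and port mapping are the defaults, tweak to taste):

```shell
# Create a named volume for Portainer's own data, then run the container
# with access to the local Docker socket so it can manage this host.
docker volume create portainer_data
docker run -d --name portainer --restart=always \
  -p 9000:9000 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer
```

The web interface is then available on port 9000, where the first run asks you to set an admin password.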
It has a well-thought-out interface, is well supported with updates, and is very low maintenance.
These are my tools, my toolchain. Some you might agree with, some not; that's fine. Maybe there's something new here, maybe not. These are all good tools which help with running a home system.