Self-hosting | #1 | Docker & Docker Compose

In this mini-series, you will go from installing Docker to deploying and managing your containers with an easy-to-use UI, and on to providing your services through a reverse proxy with authentication and automatic TLS certificates.

📄
I am not sponsored by any of the brands, software, or manufacturers pictured or mentioned in my posts. In these posts, I am simply sharing my private home-lab journey, my opinions, and experiences about running a home-lab environment.

Introduction

My home-tech journey has evolved to the point where most of what I do aims to somehow improve my smart-home platform's capabilities, security, and stability, and to make it work more and more independently.

I've been coding, building computers/servers, and doing smart-home and home-automation stuff for a long time. My first contact with computers was in the 1980s, when I got a Commodore 64; later on, I got an Amiga 500. Although these were both mainly used for gaming, I did get my first experience in coding and doing some graphics with them. I built my first PC in 1990 and my first water-cooled, dual-processor machine in the early 2000s. Yes, full nerd here. :)

Somehow this ...

[Image: filling in my 2nd liquid-cooled PC, somewhere around 2008?]

... has led to this.

[Image: the bliss of the flashing lights]

I know my setup is still really small-scale, as I have seen what some of you are running. Sure, I have extensions and replacements happening and planned, but I try to keep things within the rack cabinet I now have, as the rest of my family is not that enthusiastic about all the new stuff I drag into the house :-)

As a kid, I used to dream about the type of gadgets, games, virtual reality, and automation stuff that we have nowadays. Now I actually get to live in that dream ... perhaps you can understand why I am so enthusiastic about all of this.

You can get started small: basically, anything you can install a Linux distribution on will do, be it an old computer, a Raspberry Pi, or something similar. I am mainly covering these setups from a standalone-environment point of view, so it should be relatively easy to get started. Later on, it should be easy to scale to Swarm or Kubernetes setups if you need to.

Getting started

Keep documentation of what you do. You will thank yourself later.

What you will need is a system running some distribution of Linux. I won't go into how to install your flavor of Linux distribution, as there are plenty of guides on the internet on how to do that.

I run one system with Ubuntu Server 20.04.x LTS, and a couple of Pis with Ubuntu 21.10. If you are running a Pi and planning to use Ubuntu, then 21.10 is most likely the version you want, as it officially supports the Pi and you can easily make it boot from an SSD instead of an SD card.

Installing Docker

The official guide: Install Docker Engine on Ubuntu

Docker is reasonably quick and easy to install, and installing it is the first step towards self-hosting in this mini-series.

We'll install using Docker's apt repository: set the repository up once, then install and update Docker from it.

# update the apt package index and install packages to allow apt to use a repository over HTTPS:
sudo apt-get update

sudo apt-get install \
    ca-certificates \
    curl \
    gnupg \
    lsb-release

# add Docker’s official GPG key:
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg

# setup the stable repository
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu \
  $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

# refresh the apt package index now that the Docker repository is added:
sudo apt-get update

# install the latest version of Docker Engine and containerd
sudo apt-get install docker-ce docker-ce-cli containerd.io

# run hello-world to verify that Docker Engine was installed correctly
sudo docker run hello-world
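
A common optional post-installation step (from Docker's official docs) is adding your user to the docker group so you can run docker commands without sudo. Be aware that membership in the docker group effectively grants root-level privileges on the host, so weigh that trade-off.

# add your user to the docker group (created by the package installation)
sudo usermod -aG docker $USER

# log out and back in (or run 'newgrp docker') for the change to take effect, then test
docker run hello-world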

Install Docker Compose

The official guide: Install Docker Compose. If you go with that, be sure to select the correct OS tab to match your system (Linux).

Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application services. Then, with a single command, you create and start all the services from your configuration.

# download the current stable release of Docker Compose (1.29.2 at the time of writing) | replace the version number to download a specific release
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose

# apply executable permissions to the binary
sudo chmod +x /usr/local/bin/docker-compose

# test the installation
docker-compose --version

You should now be ready to deploy containers using docker run or Docker Compose.
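
If you want to try Compose right away, here is a minimal sketch: a docker-compose.yml with a single nginx web service. The directory name, service name, and port mapping are just placeholders for illustration, so adapt them to your needs.

# create a working directory for the stack
mkdir -p ~/compose-test && cd ~/compose-test

# write a minimal docker-compose.yml with a single web service
cat > docker-compose.yml <<'EOF'
version: "3"
services:
  web:
    image: nginx:alpine
    ports:
      - "8080:80"            # host port 8080 -> container port 80
    restart: unless-stopped  # restart automatically, e.g., after reboots
EOF

# create and start the service in the background (prefix with sudo if your user is not in the docker group)
docker-compose up -d

# verify it responds, then stop and remove the stack
curl http://localhost:8080
docker-compose down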

Check out part 2 below; it will ease the deployment and management of your containers considerably. It covers installing Portainer Server and Portainer Agents; Portainer provides an easy-to-use UI for managing your containers. It makes deploying apps and troubleshooting problems almost so easy that my mother's 107-year-old aunt could do it ... yes, she is a pretty cool person.
Self-hosting | Part 2 | Portainer Server & Portainer Agent
Part 2 of this self-hosting mini-series shows how to install Portainer Server and, optionally, Portainer Agents to quickly and easily deploy and manage your Docker containers.

Thoughts on self-hosting

My guiding principles

With my home-lab stuff, I have a few principles that I try to follow as much as I can; yours may be different. Mine have molded over time and will keep molding, as adaptability is also something I value. I have come to follow them through my previous experiences, and each has some reasoning behind it.

1️⃣
Learn, apply, and be creative.

When I do something, I try to really understand why it works the way it works, breaking it apart piece by piece. If you understand it, then you get to push it to its limits and maybe learn some more. It's a great way to keep up with technology and trends.

2️⃣
Use open-source (whenever possible)

Open-source systems can give you better possibilities to push your setup further and integrate it more tightly into your environment. Much depends on what you are trying to achieve; sometimes, however, there just might not be any viable option other than a closed system.

3️⃣
Stay cloud-independent (where possible)

I have been a happy cloud user, but I've also learned that there are limits, risks, and privacy and continuity considerations. In some cases, it might be as simple as latency: if your home automation lags for even a few seconds, or worse yet lags randomly, things can get annoying really fast. Sometimes it's a continuity issue: the vendor decides to dump the whole ecosystem, or a dispute between vendors means you cannot make the systems talk to each other no matter what.

When it comes to my home-lab, data, and home automation, I like to keep it local, and as much as possible, in my control.

To consider

Before moving forward, here are a couple of topics that you might want to be aware of. They may not be relevant right from the start, but they can become very relevant really quickly, hence the reason for bringing them up (I wish I had considered them).
📦
Sources: When deploying containers, be mindful of the guides and sources you use. You wouldn't want to end up installing or hosting something malicious.

Don't assume anything: verify the sources you use! Few things would be easier for an attacker than publishing an installation guide with a repository and letting people blindly follow the directions, pull the images onto their servers and internal networks, and then run the code. So verify!
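
As a starting point, here are a few basic checks you can run on an image before using it; nginx:alpine is just a stand-in for whatever image you are considering.

# pull the image without running it
docker pull nginx:alpine

# review the layers it was built from
docker history nginx:alpine

# review its metadata (entrypoint, environment, exposed ports, etc.)
docker image inspect nginx:alpine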

💾
Backups: If and when you end up running anything important or meaningful to you, make sure you have backups, and make it part of your deployment plans (database backups, file backups, configuration backups, VM backups, full system backups).

Have a separate, secure place to store your backups. Remember: if you haven't done a recovery test, you cannot assume a working backup! So test the backups that you take regularly.
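
On the container side, a common pattern for backing up a named Docker volume is to mount it read-only into a throwaway container and archive its contents; my_volume here is a hypothetical volume name, and you should stop the containers using it first to get a consistent copy.

# archive the volume's contents into a dated tarball in the current directory
docker run --rm \
  -v my_volume:/data:ro \
  -v "$(pwd)":/backup \
  alpine tar czf "/backup/my_volume-$(date +%F).tar.gz" -C /data .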

🚀
Redundancy/Recoverability: You might want to keep using some of the software that you have tested. When you keep running something like a production system (e.g., relying on a home automation platform and its integrations), you should also prepare for hardware/software failures and plan for redundancy.
Documentation should be a major part of your recovery plan, and you should keep a backup of your documentation, too.

While recoverability might not be a problem when you start, be aware that becoming dependent on something in a home-lab type of environment can happen sneakily over time. That one tiny piece of software that you were "just testing" a year ago, and just kept using, has become a central piece in a chain of automation. If you keep testing on the same system where you run your production-type applications, chances are you will mess something up at some point. Also, software and hardware can, and eventually will, fail. If all your documentation were on that failed storage device, recovery, even a partial one, would be significantly more painful. Think about how long you could live without different parts of your systems if they failed, and plan for redundancy accordingly when and where needed.

While redundancy can provide continuity when systems fail, it is still not the same as a backup. Redundancy is more of a real-time fail-safe; backups don't offer real-time protection, but depending on how they are done, they can protect against a broader set of problems, be it something as simple as accidental deletion.
🔒
Security: When self-hosting, you may end up wanting to use some of the systems you have set up from outside your own network.

First of all, you should never just blindly open ports through your firewall. Each opened port is a hole in your system that someone WILL take advantage of; it's just a matter of time. If possible, I'd rather not open any ports at all; depending on your needs, a secure VPN connection to your network (although not bulletproof either) could be the solution.

If you do decide to share something more widely with friends, or publicly, like a blog or a photo album, find out how to make it as secure as possible and take all possible precautions to secure your systems.

If you do open something up, have monitoring in place and set up notifications; that way, you can at least try to catch it if something malicious is about to happen.
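
One small, concrete example of limiting exposure: when a published container port only needs to be reached from the host itself (for example, behind a local reverse proxy), you can bind it to the loopback interface instead of all interfaces. The image and ports here are just placeholders.

# instead of publishing on all interfaces ...
#   docker run -d -p 8080:80 nginx:alpine
# ... bind only to the loopback interface, so the service is reachable
# only from the host itself (e.g., through a local reverse proxy):
docker run -d -p 127.0.0.1:8080:80 nginx:alpine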