Learning Linux and virtualization while setting up a homelab was the perfect excuse to learn how to create a static website for my blog and portfolio. I'd been wanting to move my photography website away from Squarespace for a while, as I felt it was too expensive for my simple portfolio needs and minimal web traffic (about 16-20 visitors a month). As I learned more about Linux and servers, I realized it wouldn't be as difficult as it had once seemed.
HTML & CSS
I was worried that knowing very little HTML & CSS beyond basic "Hello World" examples would hold me back, but once I started researching I quickly learned that most modern websites aren't coded from scratch; they use existing frameworks. For a simple static website like the one I needed there are plenty of options, including Ghost and Hugo, which are admittedly the only two I researched. I chose Hugo because it's open source, free, and had a couple of themes I liked that were really easy to set up. While I'm still not super skilled with HTML and CSS, I was able to tweak some settings to fit my needs and get a bit more acquainted with the languages along the way.
git & GitHub
I still had knowledge gaps at this point, so I'm sure I made some mistakes I have yet to rectify. At the time I knew very little about git and how it works with GitHub beyond cloning, and I didn't understand submodules. I made changes to the theme from a standard clone, when it would have been better to create a git repository from the start and add the theme as a submodule. Now I don't think I can pull changes from upstream without breaking my site. I'm sure I can figure something out, but truthfully I'm just amazed at how much there is still to learn, and how much I've learned so far.
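For reference, the submodule approach I wish I'd used looks roughly like this (the theme URL here is a placeholder, not the actual theme I use):

```shell
# Inside the site's root directory, start with a fresh repository
git init

# Add the theme as a submodule instead of a plain clone
# (URL is a placeholder -- substitute your theme's repository)
git submodule add https://github.com/example/hugo-theme.git themes/hugo-theme

# Later, pull in upstream theme updates without disturbing your own history
git submodule update --remote --merge
```

Site-specific tweaks then live in your own repo or config overrides, so pulling upstream theme changes doesn't collide with them.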
Arch Linux Server
Something else I ran into was not being able to install the latest version of Hugo from the Ubuntu or Debian repos. I didn't think to check Snap or Flatpak at the time, so I decided to try putting it on an Arch Linux container instead, since I'm so used to pacman and I know Arch is bleeding edge. It worked, and I haven't had any issues running it on a virtualized Arch Linux server so far. I kept it that way when I moved to the cloud, since my provider offers an Arch server, and I take snapshots so I can always restore if it breaks. No issues in the cloud instance so far either.
Cloud hosting
I had the web server hosted locally, but anxiety over security and open ports got the best of me, and now I host it in the cloud. It was really easy to migrate:

1. Spin up the cloud server.
2. Ensure the same versions of Hugo and Go are installed on it.
3. Copy the website folder (excluding the public folder) to the same path.
4. Copy the nginx config from /etc/.
5. Change the A records for my DNS to the new IP.

And that's it!
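For context, serving a static Hugo site from nginx only takes a basic server block; mine looks roughly like this (paths shown are illustrative, and TLS is omitted for brevity):

```nginx
server {
    listen 80;
    server_name philcifone.com;

    # Hugo writes the generated site into the public/ directory
    root /var/www/philcifone.com/public;
    index index.html;

    location / {
        try_files $uri $uri/ =404;
    }
}
```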
I make changes locally and push to the cloud with rsync:
rsync -avP --exclude=public * phil@0.0.0.0:/var/www/philcifone.com/
I’m using the -a for archive (preserves file attributes), -v for verbose, and -P for progress.
The --exclude=public excludes copying any potential public directory to the cloud server. I also do a --dry-run first to make sure only the files I want are being sent.
And then test in the Hugo server:
hugo server --baseURL 0.0.0.0 --bind 0.0.0.0
Note: you can add the -D flag to include draft posts.
Finally, I push it live with:
sudo hugo
As always, be sure to have a very strong password and disable root login. I personally disable password authentication entirely and log in with SSH keys only, as it's more secure.
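A sketch of the relevant directives, assuming the stock OpenSSH config location (restart sshd after editing):

```
# /etc/ssh/sshd_config -- the hardening settings mentioned above
PermitRootLogin no
PasswordAuthentication no
PubkeyAuthentication yes
```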
Install CrowdSec and a bouncer so you can keep an eye on people trying to break into the system.
That’s how I self host my own site!