— beauty in simplicity

Blog stack

My first ever blog posts were published on Medium. It was fine in the beginning, but soon I started to get annoyed by certain things there. Like there being no way to include a code block on a page: you have to create a bunch of gists and link them. You can’t back up or export your stories, your replies are also stories, and you can’t personalize your posts much. And the paywall they have there is stupid.

So I switched to WordPress hosted on AWS, which was also fine for a while. But eventually it felt like it had lots of features I don’t need (like user registration and comments, which are mostly used by spammers) and lacked features I do need (simple backups and tracking changes in a version-control-y style).

That’s when I decided to run a blog in asciidoc. That was a perfect fit for me! I’ve seen it a couple of times across the internet but had never considered doing such a setup myself until now.

In fact, you are reading this post in my new asciiblog!

What is asciidoc

AsciiDoc is a text document format for writing notes, documentation, articles and so on. AsciiDoc files can be translated to many formats including HTML, PDF and EPUB.

Or to put it another way - asciidoc is a markup language that can be used to generate all sorts of static pages.

For example, you just open up a plain text file with a .adoc extension and write the following:

[NOTE]
====
Here is why I like asciidoc:

* text based;
* generates lightweight HTML5 pages;
* supports styling and customization.
====

Put it through an AsciiDoc processor and receive the following output:

Here is why I like asciidoc:

  • text based;
  • generates lightweight HTML5 pages;
  • supports styling and customization.

Pretty cool! It has a rich set of capabilities, so if you are interested already, check out the markup documentation.

Setup

Problems to solve

So I’ve picked a markup language. But I have some other problems to solve before I can actually post my markup files.

First, they have to be stored somewhere. Ideally - not locally, because in that case losing your local copy would mean losing all your posts.

I could store them somewhere in a cloud, but since my posts are now just plain text files, I can use Git with a hosting service like GitHub. It will host my files, make them accessible from anywhere and provide version control for them, making it easy to track changes. I can also use branches to start new blog posts and keep the published version of the blog in the master branch.
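For illustration, the flow for a new post might look roughly like this (the branch and file names are just placeholders):

# draft a new post on its own branch
git checkout -b new-post
git add posts/new-post.adoc
git commit -m "Start a new post"

# once it is ready, merge it into master to publish
git checkout master
git merge new-post
git push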

First one down.

Second, I need to render HTML pages from .adoc files. That is an easy one. Asciidoctor does just that and can be installed on any operating system.
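For example, on Ubuntu installing it and rendering a page boils down to something like this (the package manager command will differ on other systems, and the file name is just a placeholder):

# install the asciidoctor CLI
sudo apt install -y asciidoctor

# render a single page; produces my-post.html next to my-post.adoc
asciidoctor posts/my-post.adoc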

Another down.

Last but not least, I need to serve my pages. Any web server can handle such an easy task. I picked nginx because I had some experience with it already and it’s super easy to set up.
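To give an idea of how little is needed, a minimal sketch of the server block could look something like this (the domain is a placeholder, and it assumes the rendered pages live in /var/www/html):

server {
    listen 80;
    server_name blog.example.com;

    root /var/www/html;
    index index.html;

    location / {
        try_files $uri $uri/ =404;
    }
}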

Last one down!

Tools and technologies used

My current setup is the following.

I have Ubuntu 20.04 running on an AWS EC2 instance. It has nginx installed, with certbot used to acquire a Let’s Encrypt certificate.
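Getting the certificate is a couple of commands. Roughly, assuming the nginx plugin and a placeholder domain:

# install certbot with the nginx plugin
sudo apt install -y certbot python3-certbot-nginx

# obtain a certificate and let certbot adjust the nginx config
sudo certbot --nginx -d blog.example.com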

My sources for the blog are located in a public GitHub repository.

And finally I have asciidoctor installed on my local machine to render these sources into some neat HTML pages.

Posting to the blog

When I have an idea for a new blog post, I start writing it in plain text in a new .adoc file in the posts folder. I check in my changes to track progress and render the page locally from time to time to see what I am getting so far.

As soon as I feel that it’s ready to be posted, I render the page locally and run an rsync command to synchronize my local render with the one I have in AWS.

To automate this manual work I’ve created two scripts. The first one just builds a local copy of the blog:

#!/bin/bash

# render every .adoc file under src into target, then copy static images along
asciidoctor -R src -D target '**/*.adoc' && \
cp -r src/images target/images

One thing to note here is that I need to take care of static content and copy it along with the generated pages with cp -r src/images target/images.

The second one builds and deploys my blog to AWS:

#!/bin/bash

# render locally, then push images and generated pages to the server over ssh
asciidoctor -R src -D target '**/*.adoc' && \
rsync -av -e "ssh -l blog" src/images blog:/var/www/html && \
rsync -av -e "ssh -l blog" target/ blog:/var/www/html

Note the use of rsync here: I am running it over ssh. For that I’ve added the following to my .ssh/config file:

Host blog
  Hostname <my-aws-hostname>.compute.amazonaws.com
  IdentityFile <path to my pem file>

This way I can push my deploy script to GitHub without any worries of exposing info about my AWS instance.

If you are not working on Linux/Unix, the whole process can be done on your VPS instead. Just clone the repo, render the posts with asciidoctor and copy the rendered pages to /var/www/html for nginx to serve them.
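In that case the pipeline might look roughly like this on the server itself (the repository URL is a placeholder, and it assumes your user can write to /var/www/html):

# build right on the server instead of syncing from a local machine
git clone https://github.com/<user>/<blog-repo>.git
cd <blog-repo>
asciidoctor -R src -D /var/www/html '**/*.adoc'
cp -r src/images /var/www/html/images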

Conclusion

Overall I am very happy with the setup! It combines ease of use and deployment with ease of hosting. I am currently running it on a t3.nano instance and still feel like that is too much for it (I am thinking about getting a static IP from my ISP, installing Ubuntu on my old laptop and running the blog on it). You should be able to easily run this setup on a free tier instance for a year.

On top of that, there is one thing that I haven’t mentioned in the post - customization. You can either play around with stylesheets yourself, or use one of the community themes available. I am currently running the one that mimics the GitHub style.
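As a rough sketch, pointing asciidoctor at a custom stylesheet is a single attribute (the css file name here is a placeholder, and this assumes the file sits next to the .adoc source):

# render with a custom stylesheet instead of the default one
asciidoctor -a stylesheet=my-theme.css posts/my-post.adoc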