Capping off the year with some more personal work, I decided to rebuild my website. A couple of years back I put the last iteration together, which included some fancy three.js models from both Unstoppable and Interloper. The idea was cool, but I was never quite happy with the execution. I might revisit it in the future, but for now I'm taking things in a more minimalist, easier-to-maintain direction.

If you're viewing this directly in the browser, then, welcome to my corner of the internet.

Design

I wanted to keep this all super lo-fi, very clean and minimal, so I opted to use the default browser font. It's easy to read, looks fine, and does the job. I chose a charcoal and orange theme, applying the key colour only to major headings. Images are centred, and some extra styling was added for code blocks. Nothing fancy, but it should be easy to read.

Code

I'd like to preface all of this with a couple of points. First, I've got limited experience building web content: I've made a couple of Angular sites, some of which have been turned into apps, and I've made my own personal content. Second, I don't know a damn thing about shell scripts; this is the first time I've dived deep into them. I'm likely making a ton of inefficient, or even outright wrong, choices here, and that's a-ok!

This is the part I really wanted to be minimalistic. Very limited use of JS, no site generators running in the background, nothing but static HTML hosted directly on a shared hosting web server. Basically, if this thing can render on a Win95 machine, I'll be happy. Ideally it'd load super fast too, but I'm still working out some kinks with GIF loading. The real kicker is that the pipeline is built around Bear, my main note-taking and writing tool: export HTML from Bear, run a bash script, and boom, blog updated.

Content in Bear is written in Markdown, but I've found its text pack export has issues providing the relative link paths to image files, so I opted to use the HTML export instead. This also removes the need for a Markdown parser… everything has already been parsed. The downside of this approach is that HTML exported from Bear comes pre-styled and can't easily be pulled into a single-page feed (more on that later) without fudging up the relatively minimal CSS I'm using, so part one of the bash script is to cut out the first couple of hundred lines of inline CSS that come with the exported HTML file.
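For a rough idea of what that looks like, assuming the export keeps all of its inline CSS in a single <style> block and using a placeholder file name, the cut can be a one-line sed range delete (macOS's BSD sed needs the empty '' after -i):

```bash
# Delete everything from the opening <style> tag to the closing </style> tag, inclusive.
# Assumes Bear's export bundles all of its CSS into one <style> block in the head.
sed -i '' '/<style>/,/<\/style>/d' post.html
```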

The second part is cleaning up some other bullshit from the Bear HTML exporter: removing all the <br> tags, fixing the instances of apostrophes using the incorrect Unicode character, making sure the relative links point to absolute paths (again, more on this later), and rebuilding the h1. This last part is needed because Bear uses the first h1 as both a title and a file name proxy. I could always change the file name at export, but it's an extra step I'd prefer to avoid. All of this is achieved with a ton of sed, iterating through the file and changing instances where required.
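A few representative one-liners in the same spirit as what's in the script; the file name, domain, and title are all placeholders, and the exact patterns will differ from mine:

```bash
# Strip the <br> tags Bear scatters through the export
sed -i '' 's/<br>//g' post.html

# Swap the curly apostrophe for a plain one
sed -i '' "s/’/'/g" post.html

# Point relative image paths at the site root so they survive being pulled into the feed
sed -i '' 's|src="images/|src="https://example.com/posts/images/|g' post.html

# Rebuild the h1 (Bear treats the first h1 as both title and file name proxy);
# assumes the post only contains one h1
sed -i '' 's|<h1>.*</h1>|<h1>My Actual Title</h1>|' post.html
```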

Once the exported HTML is cleaned up, I move on to compressing images, using sips and its aggressive JPEG compression to bring them into line. This is the first place where I've still got work to do, as I'm not covering the rather large GIFs I generate to show off gameplay. I'm not sure these can be compressed any further than they already are. (I use macOS's native video recording and the desktop GIF conversion app Gifski.)
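For reference, the sips step looks roughly like this; the quality value, folder, and naming are just examples:

```bash
# Re-encode each image in the post's folder as a JPEG at a fairly aggressive quality.
# sips ships with macOS; formatOptions takes a 0-100 quality percentage.
for img in images/*.png images/*.jpg; do
  [ -e "$img" ] || continue            # skip the literal glob if nothing matches
  sips -s format jpeg -s formatOptions 60 "$img" --out "${img%.*}.jpg"
done
```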

After the images are cleaned up, I start propagating the new file throughout the website. First stop is the homepage, where I've got some incredibly basic jQuery to load the five most recently uploaded posts into the main index.html. This acts as a kind of feed and limits the effect of images on load time. In the future I might improve this to auto-load more posts as the user scrolls, but for now this works fine. This approach does lead to some limitations, primarily that relative load paths for images get messed up, hence the need for absolute pathing. The main job here is simply changing the HTML to load the most recent pages, again with more sed manipulation.
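The shape of that update is roughly the following, assuming the homepage keeps the feed as a list of placeholder divs that the jQuery fills in; the marker comment, selectors, and file names are all illustrative:

```bash
# index.html holds placeholders like:
#   <div class="post" data-src="/posts/some-post.html"></div>
# Insert the new post right after the feed marker comment...
NEW='<div class="post" data-src="/posts/new-post.html"></div>'
sed -i '' "s|<!-- feed -->|<!-- feed -->$NEW|" index.html

# ...then keep only the five newest placeholders (assumes one placeholder per line).
awk '/<div class="post"/ { n++; if (n > 5) next } { print }' index.html > index.tmp \
  && mv index.tmp index.html
```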

Second, I add the page's link to the post index. Easy enough: more sed.
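Same trick, different file; the marker and paths are again placeholders:

```bash
# Add a link for the new post after a marker comment in the post index page.
sed -i '' 's|<!-- posts -->|<!-- posts --><li><a href="/posts/new-post.html">New post</a></li>|' posts.html
```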

Third, I create a new entry in the site's rss.xml. The implementation here is as basic as it comes, and sometime in the future I'd like to change it to include the full article text.
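A minimal sketch of that insertion, assuming rss.xml has a single <channel> and the entry only carries a title, link, and date; the domain and file names are placeholders:

```bash
# RSS wants RFC 822 dates; this format string works with both BSD and GNU date.
PUBDATE="$(date -u '+%a, %d %b %Y %H:%M:%S GMT')"

# Build a bare-bones <item> and drop it in right after the opening <channel> tag.
ITEM="<item><title>New post</title><link>https://example.com/posts/new-post.html</link><pubDate>${PUBDATE}</pubDate></item>"
sed -i '' "s|<channel>|<channel>${ITEM}|" rss.xml
```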

For safety reasons I then commit and push my changes to a repo. One of the benefits of already being in a terminal is that this can be done pretty seamlessly.
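Nothing fancy here, just the usual few commands:

```bash
# Back everything up before touching the live site.
git add -A
git commit -m "Publish new post"
git push
```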

Then, finally, I SSH into the web server and upload the new and updated content.
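A sketch of the upload, assuming key-based SSH access to the shared host; the user, host, and remote paths are placeholders, and rsync over SSH would do the job just as well:

```bash
# Upload the cleaned-up post and its images, then the regenerated index, post list, and feed.
scp post.html images/*.jpg user@example.com:public_html/posts/
scp index.html posts.html rss.xml user@example.com:public_html/
```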

Conclusions

For now this all works pretty nicely, and I'm generally happy with the outcome, though I feel like it could be made faster somehow, maybe by moving from bash over to Shortcuts? While I hate Shortcuts' interface and clunkiness, it would open up the opportunity for a very seamless export and upload from any of my devices.

A huge stretch goal would be to host the website myself, have a machine running in my own home, and do something a little bit like Low Tech Magazine's solar website.

If anything, it's nice to own the process of writing content for my own website end to end. As the years go on, I trust companies and third parties less and less with the content I create, and being able to have some slice of the internet completely written by me is reassuring (if not exactly the most sensible approach).