CJ Eller

Classical Guitar by Training, Cloud Engineer by Accident

Being out of your comfort zone can make you try new things. Since I'm away from my laptop running errands, I have to type out this post on my phone. For one it feels vastly different than typing at an actual keyboard. Not just physically but mentally as well. I wonder if synapses fire off differently when you're using only your thumbs instead, if you have different trails of thought because of that.

It costs one cup of coffee a month for me to publish to Write.as every weekday until the end of time. That translates to “until the end of AWS,” because I use a Python function running in Lambda that is triggered on a regular schedule by EventBridge. A simple serverless architecture that helps me worry about servers “less” and more about what I want the code to do — hold me accountable to publishing something on my blog every weekday.
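The function itself can be sketched roughly like this — a minimal handler, assuming the Write.as posts endpoint and made-up post content (the real script would pull a draft from somewhere and authenticate):

```python
import json
import urllib.request

# Assumed endpoint — check the Write.as API docs before relying on this.
API_URL = "https://write.as/api/posts"

def build_payload(title, body):
    """Build the JSON body for a post."""
    return json.dumps({"title": title, "body": body}).encode()

def lambda_handler(event, context):
    """Entry point that EventBridge invokes on its weekday cron schedule."""
    req = urllib.request.Request(
        API_URL,
        data=build_payload("Daily post", "Placeholder body"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return {"status": resp.status}
```

EventBridge just calls `lambda_handler` on schedule; everything else is plain Python.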

A cup of coffee a month isn't much (depending on where you go), but I wonder why I have to pay for a cup of coffee at all. Why can't I have this event-based, serverless function run every weekday for free somewhere else?

This is where I came across GitHub Actions. I've written before about using it to create blog posts whenever an issue is created in a GitHub repo. Recently I discovered that you can run a workflow as a scheduled event. That means you can have a workflow trigger a Python script in your repo every weekday. Like a Python script that publishes to your blog?
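A scheduled workflow is just a cron expression in the workflow file. A sketch, with made-up file names and times:

```yaml
# .github/workflows/publish.yml — illustrative only
name: daily-post
on:
  schedule:
    - cron: "0 13 * * 1-5"   # 13:00 UTC, Monday through Friday
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: python publish.py   # hypothetical script name
```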

What excites me about this is that the Python code can be simplified even further. No Lambda function handler nonsense, just Python scripting. Even if I wanted to write functions, I could have GitHub Actions invoke them like I would if I were testing locally. Thanks, containers!

Hoping to get something together to share later. It might be a kludge, something GitHub Actions isn't supposed to be used for, but then again that's what I enjoy most about technology — finding ways to put it to use that are on the fringes of personal preference.

Been participating on a forum lately. There's something about writing there that invites a different kind of energy than writing on a blog — a sort of enthusiastic generosity mixed with informal concision that I cherish so much. Why is that?

Maybe it has to do with one's identity on a forum — starting topics and responding to them, being in a chorus of many voices. What you write there is like a series of musical notes that feed off of what was previously said and that others complement in kind. A blog post can feel like it has to be a contained musical composition.

Of course it doesn't have to be, and that's where the great blogging happens. Reminds me of why I love playing classical guitar — to make a single instrument sound like an orchestra.

There was a context switch for me when I went from predominantly playing an electric guitar to a classical nylon string guitar. A sense of portability — no amp to lug around, no cables to untangle. Of course it wouldn't be able to sonically project like an electric guitar, but that wasn't the point. I could play guitar wherever I wanted to. Power outlets have no hold on my musical expression.

I'm starting to learn more about containers thanks to Bret Fisher's Docker course, and it's surfacing the thoughts of portability I had about the classical guitar. One exercise had me test for the version of curl on a CentOS container. A single-line Docker command accomplishes something I would've created a whole VM for back when I had no clue about containers. Even using something more straightforward like Vagrant would've taken more time & resources than that Docker command. I don't have to be tied down by VMs like I was to power outlets with the electric guitar.
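That exercise boils down to a one-liner (assuming Docker is installed and a CentOS tag like this one):

```shell
# Spin up a throwaway CentOS container, print curl's version,
# and remove the container as soon as the command exits.
docker run --rm centos:7 curl --version
```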

But all of this to say that the classical guitar doesn't take away from my enjoyment of playing electric, rather it serves as a complementary experience. Containers might be the same way for me. Learning about containers doesn't diminish the joy of using VMs. Hell, I still use them on a daily basis for work.

I interpret it more as adding something new to my toolkit of technological expression. That I want to be ever expanding anyhow.

I got into a car accident today. Nobody got injured thankfully. How thin the line between cars, between potential collisions.

Handling the insurance claim for the accident? All virtual. Even the phone number led me to an automatic responder which led me to a hyperlink. Another thin line of infrastructure that allows me to swiftly carry out life altering activities while on the thin line of Ethernet that is hosted in data centers, all with thin lines jutting out between the servers housed therein.

The night before, I read a startling fact about our stomachs from Bill Bryson's The Body:

A natural question is, Why don't all our ferocious digestive juices eat through our own gut lining? The answer is that the alimentary canal is lined with a single layer of protective cells called the epithelium. These vigilant cells, and the gooey mucus they produce, are all that stand between you and digesting your own flesh. If that tissue is breached and the gut contents get into another part of the body, death without immediate medical treatment inevitably follows. Yet this only rarely happens.

A thin line preventing us from eating ourselves. I'm starting to realize how that can be said for many things.

The fascinating (and frustrating) thing I'm finding about working with Infrastructure as Code (IaC) is that it can sometimes create different sources of truth. In my case I notice two. There's the code running live in your cloud provider and then there's the code up to date on your main branch.

In an ideal world, these two would be in sync all of the time, but life gets in the way. Drift ensues and you're scrambling to get both your repo code and the live code in sync. This has happened to me before and lately it happened again. I beat myself up about making the same mistake twice. I wonder if it's a rite of passage to fumble with version control in some form or another.
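Assuming Terraform as the IaC tool here (the post doesn't say which one is in play), the drift itself can at least be surfaced mechanically:

```shell
# Exit code 0: live infrastructure matches the code.
# Exit code 2: it doesn't — drift has crept in.
terraform plan -detailed-exitcode
```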

In a way, writing about it here is my attempt to depersonalize and de-stigmatize the experience. This happens all the time. I talk to others about this and they laugh about their git disasters. It's just happening to me for the first time or two. The novelty makes it sting that much more. It feels like a personal foible. Realizing that this experience is far from unique takes away from that hurt. You live and you learn.

So perhaps it's a rite of passage to fumble with version control in some form or another.

Reading Bill Bryson's The Body: A Guide for Occupants is like microdosing on partial wonder. There are countless passages where I sit in awe at the complexity of our own flesh & blood. Here's a particular passage that struck me today:

In breathing, as in everything in life, the numbers are staggering — indeed fantastical. Every time you breathe, you exhale some 25 sextillion molecules of oxygen — so many that with a day's breathing you will in all likelihood inhale at least one molecule from the breaths of every person who has ever lived. And every person who lives from now until the sun burns out will from time to time breathe in a bit of you. At the atomic level, we are in a sense eternal.

I might have heard such a fact before, but the conclusion that Bryson draws — that we are almost eternal at the atomic level — is staggering.

Perhaps most of this wonder comes from an ignorance of the human anatomy prior to opening this book. Maybe that's where wonder lives — in between ignorance and complete knowledge. Who knows...

Sometimes a video game will show you a part of a level that you need to get to. Unfortunately it's out of reach. How do you get there? Well, you go on a detour and receive an item. This item gives you the ability to reach the place where you need to go.

About a year ago I ran into this issue with a Python project that attempted to import QuoteBacks quotations into Arena. Most of the quotations are formatted as nested dictionaries, and iterating through nested dictionaries wasn't in my toolset at the time — I simply got stuck and skipped it. Only after encountering nested dictionaries recently in a work scenario did I learn how to deal with them. It's like gaining that item in the video game and remembering where it would help get me to where I needed to go — the QuoteBacks/Arena project. We'll see what happens.
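For the curious, walking a nested dictionary is a natural fit for recursion. A minimal sketch — the quotation structure below is made up, not the actual QuoteBacks format:

```python
def walk(d, path=()):
    """Yield a (path, value) pair for every leaf in a nested dictionary."""
    for key, value in d.items():
        if isinstance(value, dict):
            yield from walk(value, path + (key,))  # recurse into nested dicts
        else:
            yield path + (key,), value

# Hypothetical quotation shape, for illustration only.
quote = {"text": "…", "source": {"title": "Some Blog", "url": "https://example.com"}}
for path, value in walk(quote):
    print("/".join(path), "->", value)
```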

Makes you wonder about progression as being able to go to “places” that you couldn't before. Guillermo Rauch has this great post on “Code Golf” and other interactive coding challenges that treated learning programming as a means of progression. Rauch explains,

The idea was simple: you start with a simple hacking challenge at Level 1, and your goal is to hack the made-up system you are given to progress onto the subsequent levels. There was a forum associated with it, where you could meet others and get hints. The level you were in was your flair, and there was a public leaderboard of everyone who was playing.


This experience was unlike any code tutorial or book I was consuming at the time. It was thrilling, competitive, frustrating, exhilarating. It went from easy (viewing the HTML source) to harder (disassembling binaries and writing key generators).

It taught me skills that are essential to problem solving, rather than coding itself, like reverse engineering a system and unusually effective debugging with very little information. And it did so very quickly.

By now, these tournaments have long been wiped from the internet, and if you google you'll find some that have carried on the torch. I would recommend beginners to play with them, and content creators to experiment with this model for new educational materials.

These kinds of games were where I cut my teeth on basic shell skills like maneuvering through directories and using grep. The “wargames” on OverTheWire were especially transformative. There's something to the progression of those games that made it feel like you were genuinely learning about the shell in the way Rauch talks about with the games he played — “thrilling, competitive, frustrating, exhilarating.” I even remember my first exposure to the AWS CLI being through Scott Piper's flAWS, a game that taught you common AWS security vulnerabilities through progressing levels.

Perhaps there's something to this kind of format that could be more widely adopted. Massively Multiplayer Online Programming Games?

Don't get me wrong, copy & paste is useful. The question lies in how much of it needs to be done at any given time. After dozens of copy & pastes for an Infrastructure as Code project, my focus drained with each keystroke. Copy & paste turned into copy pasta real quick — a genuine mess.

This raises a benefit of automation that eluded me until this very moment — a script automating copy & paste processes is supposed to prevent human error. Such is the case with Ctrl + C & Ctrl + V. Copy & paste is convenient until it isn't. When it teeters towards disaster, then it's time to take the time to script the process out.
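In my case the script amounted to little more than templating. A toy sketch, with entirely made-up resource names, of replacing N copy & pastes with one template:

```python
# Render the same resource block once per environment instead of
# pasting and hand-editing it N times (all names are hypothetical).
TEMPLATE = """resource "aws_s3_bucket" "{name}" {{
  bucket = "myapp-{name}"
}}
"""

def render_blocks(names):
    """One rendered block per name; fix a typo once and it's fixed everywhere."""
    return "\n".join(TEMPLATE.format(name=n) for n in names)

print(render_blocks(["dev", "staging", "prod"]))
```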

Writing out that automation can bring its own headaches, but I'm finding that I prefer the headache of getting the script right to the headache of causing a kerfuffle in production.

The cloud is just someone else's computer. Part of that adage I find fascinating is understanding how that “computer” is set up. Sometimes it's not configured how you expect it to be.

I've recently been looking at AWS Systems Manager Run Command. It's supposed to work exactly as it sounds — letting you run commands on your managed instances. Well, sort of. I wanted to use Run Command to run a Python program. What appeared simple in execution turned into complications and hours of troubleshooting.

The snag? Run Command couldn't pick up a configuration file on the instance that the Python program needed. Turns out that the root user's Python configuration is different from that of the user I set the program up under. And what user does Run Command execute scripts as? Root.

What to do? Thankfully M. Gleria on Stack Overflow led me to the wonderful runuser utility, which, well, lets you run a command as a particular user. This let me get around root's Python configuration, since I was now running the program as the user whose Python setup I wanted to use. That did the trick.
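The fix looked something like this (user name and path are made up here):

```shell
# Run Command executes as root, so switch to the user whose
# Python environment actually has the needed configuration.
runuser -u appuser -- python3 /home/appuser/program.py
```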

A workaround I didn't expect to need in the first place. It's that part of the cloud being someone else's computer that I take as more of a puzzle than an inconvenience. Of course the puzzle aspect of using cloud providers can easily sway to inconvenience, but it's a balancing act I find intriguing.