CJ Eller

Classical Guitar by Training, Cloud Engineer by Accident

We complain about information overload, and yet we also get an almost eschatological thrill from the glittering glut, as if the acceleration of communication and the bandwidth bursting density of the datastream can somehow amplify the self and its capacities.

There's something about this passage from Erik Davis' Techgnosis: Myth, Magic, & Mysticism in the Age of Information — how engaging in information overload is compared to a thrill. An acknowledgement of the riches we get from the web and, at the same time, an acknowledgement of its gravity upon our psyche. Both delight and caution in the cup that runneth over. This feeling is out there. I found this post from Alejandro called “The Weight of the Clipboard” (source) that articulates it so well:

It feels more like an invisible weight that can be felt through every idea and keystroke. Through every executed action, like something you can lose, something you need, something missing from the stream of data you’re writing.

Tom Critchlow also gets at the feeling in this post:

I spend all day long slinging URLs around. Mostly, when I copy and paste a URL it’s treated as a string of characters. But you and I know that a URL is heavy. A URL is a representation of a blog post, or a product I want to buy, or a hike I want to go on, or an Airbnb I’m going to book.

The invisible weight that Alejandro and Tom describe reminds me of the bandwidth bursting density of the datastream Davis mentioned earlier. How do we manage the deluge of digital detritus in a way that both respects its weight and acknowledges this innate desire for information foraging? To quote Davis again,

Information gathering defines civilization as much as food gathering defines the nomadic cultures that preceded the rise of urban communities, agricultural surplus, and stratified social hierarchies. From the moment the first scribe took up a reed and scratched a database into the cool clay of Sumer, information flow has been an instrument of human power and control [...]

From cool clay to svelte silicon, it's information gathering all the way down. “Opening tabs and browsing the web is essential to task completion,” Tom mentions in his post above. “Tab sprawl is a symptom of a basic task: web foraging.”

Speaking of tables, I've been looking into Tom Critchlow's Electric Tables lately as a way to deal with everything mentioned above. It's interesting how Tom uses similar language to describe this project — Electric Tables as a way to respect the gravity of a URL while allowing for more nimble foraging of them. Such a description parallels my experience so far.

The cup still runneth over, but at least the runover is being caught somewhere this time.

There's a striking passage from a collection of nonfiction called Between Eternities by Javier Marías. It's from a piece called “Air-ships,” which explores the act of anthropomorphizing objects:

We live in an age that tends to depersonalize even people, and which is, in principle, averse to anthropomorphism. Indeed, such a tendency is often criticized, erroneously and foolishly in my view, since that “rapprochement” between the human and the nonhuman is quite natural and spontaneous, and far from being an attempt to deprive animals, plants and objects of their respective selves, it places them in the category of the “humanizable,” which is, for us, the highest and most respectable of categories. I know people who talk to, question, spoil, threaten or even quarrel with their computers, saying things like: “Right, now, behave yourself” or thanking them for their help. There's nothing wrong in that, it's perfectly understandable. In fact, given how often we travel in planes, the odd thing about our relationship with them — those complex machines endowed with movement, to which we surrender ourselves, and that transport us through the air — is that it isn't more “personal” or more “animal” or more “sailor-like,” if you prefer [...] That's what I would like to see, less cool efficiency and more affection [.]

I find it curious that Marías mentions computers as an example of anthropomorphism, because there's another type of computer that defies such characterization — the cloud, or, as Robin Sloan calls it (and what I prefer), the slab.

The slab makes saying things like “Right, now, behave yourself” feel strange. We're not talking about a laptop at your desk. It's a data center in a discrete location you access from a laptop. Who knows what part of the data center you access. The slab is an amorphous thing.

Could the slab be anthropomorphized?

The apps that are lovingly crafted with Glitch are powered by the slab. The hand crafted blogging software I use is powered by the slab. Many things on the web I cherish are powered by the slab. Does that make them depersonalized all of a sudden? No. The slab's cool efficiency is imbued with human affection.

It makes you wonder though. Earlier I used “the web” and the slab in the same sentence. Is the web our way of anthropomorphizing the Internet, which is technically a broad network of computers?

The great thing about external brainstorming is that in addition to capturing your original ideas, it can help generate many new ones that might not have occurred to you if you didn't have a mechanism to hold your thoughts and continually reflect them back to you. It's as if your mind were to say, “Look, I'm only going to give you as many ideas as you feel you can effectively use. If you're not collecting them in some trusted way, I won't give you that many. But if you're actually doing something with the ideas — even if it's just recording them for later evaluation — then here, have a bunch! And, oh wow! That reminds me of another one, and another,” etc.

This passage from David Allen's Getting Things Done makes me think of blogging as such a mechanism for holding your thoughts and continually reflecting them back to you.

When I took a break from posting here, I understated blogging's function as an external brainstorming system. After only a week and a half of blogging again, my mind is racing with ideas that wouldn't have occurred to me if I didn't have a place to hold my thoughts.

I think about Tom Critchlow's question in his January 2022 Map of Inquiry — “How do we get more people blogging?”

Networked writing relies on… the network! I have a variety of friends and contacts that I wish blogged more. How to encourage / support and nurture more people writing online.

Could blogging as an external brainstorming system be one part of the puzzle for encouraging people to blog? Makes me wonder about David Allen's above characterization of the mind but geared towards blogging:

“Look, I'm only going to give you as many ideas as you feel you can effectively use. If you're not collecting them in some trusted way like on a blog, I won't give you that many. But if you're actually doing something with the ideas — like posting them on a blog — then here, have a bunch!”

Omar Rizwan's TabFS is always humming in my mind — browser tabs as a file system, the state of a tab always reflected in the files inside a folder on your laptop. What does that give you? Omar answers this:

[N]ow you can apply all the existing tools on your computer that already know how to deal with files — terminal commands, scripting languages, point-and-click explorers, etc — and use them to control and communicate with your browser.

Now you don't need to code up a browser extension from scratch every time you want to do anything. You can write a script that talks to your browser in, like, a melange of Python and bash, and you can save it as a single ordinary file that you can run whenever, and it's no different from scripting any other part of your computer.

A script that talks to your browser in a melange. All I can think of is digital bricolage when I read this, and honestly that's what scripting with TabFS is like.

I can relay an example from the other day. For a while I've wanted a website blocker of sorts in order to focus on writing. But here's the thing — I'd either need to download a browser extension for a SaaS solution like Freedom or stand up a web proxy like Squid on my laptop or a Raspberry Pi. Neither option gels with me. Where's the middle ground? Kludging together some Bash that sorta does the same thing? That's at least where I'd have the most fun.

It's fascinating how the TabFS paradigm of files & folders changes how you approach the problem of a website blocker. Instead of matching a URL to block a site, you key in on the titles of files. Instead of blocking a website, you remove the file for that tab. Instead of having a browser extension or server running, you have a script in the background running an infinite “while” loop.

Below is a first pass on this website “blocker” idea:


#!/bin/bash
# Website "blocker" using TabFS

# TabFS folder — using the by-title subfolder
# (adjust this path to wherever TabFS is mounted on your machine)
dir="$HOME/tabfs/fs/tabs/by-title"

# List of websites to block
block_list="Twitter YouTube"

while true; do
  # Go through the tab files in the by-title folder
  for tab in "$dir"/*; do
    # Go through the sites in the block list
    for site in $block_list; do
      # If there's a match with the site and the tab, close the tab
      if [[ "$(basename "$tab")" == *"$site"* ]]; then
        rm "$tab"
      fi
    done
  done
  # Wait 10 seconds before going through the process again & again & again...
  sleep 10
done

It could definitely be refined, but the script works just fine for me in its current state. Hell, I'm using it right now so that I don't get sidetracked while writing this. But that's the beauty of something like TabFS, where embracing the melange on your computer can go a long way.

Recently I had a draft of further thoughts on digital bricolage accidentally cross-post to Twitter. In any other circumstance I'd delete the tweet, continue to refine the draft, publish it to my blog, and then cross-post it. Not this time. There's some digital bricolage here.

Going through iterations of drafts until a “final draft” is what we learn in school. That line of thinking extends to how I write on the web. Only when a piece is in a “final draft” state do I share it broadly (i.e., published as a post to my blog, that post shared to Twitter).

In this case, however, the mistake of sharing a draft broadly creates a moment for rethinking things, for a different approach to emerge. Maybe loose thoughts could be published as these one-off anonymous posts and shared via social media? Who knows, but it gets me thinking.

Seymour Papert & Sherry Turkle allude to how bricolage scrambles the natural order of epistemology:

The bricoleur scientist does not move abstractly and hierarchically from axiom to theorem to corollary. Bricoleurs construct theories by arranging and rearranging, by negotiating and renegotiating with a set of well-known materials.

Digital bricolage is the swift rearranging and renegotiating of materials and practices, the speed of which can come before any clear thought of what it is that you're doing emerges. That happened to me with sharing my draft. Before I knew what I was doing, the post got shared. A different perspective on sharing my writing on the web began to emerge.

This is what Papert & Turkle are referring to — digital bricoleurs construct theory through play, not the other way around.

Tugging on the thread of digital bricolage brought me to a wonderful paper by Seymour Papert & Sherry Turkle called “Epistemological Pluralism.” (source)

Papert & Turkle tug on the anthropological origins of bricolage, extending it to the digital.

[Anthropologist Claude] Levi-Strauss used the idea of bricolage to contrast the analytic methodology of Western science with what he called a “science of the concrete” in primitive societies. The bricoleur scientist does not move abstractly and hierarchically from axiom to theorem to corollary. Bricoleurs construct theories by arranging and rearranging, by negotiating and renegotiating with a set of well-known materials.

If we take Levi-Strauss's description of the two scientific approaches as ideal types and divest them of his efforts to localize them culturally, we can see both in how people program computers. For some people, what is exciting about computers is working within a rule-driven system that can be mastered in a top-down, divide-and-conquer way. Their structured “planner's” approach, the approach being taught in the Harvard programming course, is validated by industry and the academy. It decrees that the “right way” to solve a programming problem is to dissect it into separate parts and design a set of modular solutions that will fit the parts into an intended whole. Some programmers work this way because their teachers or employers insist that they do. But for others, it is a preferred approach; to them, it seems natural to make a plan, divide the task, use modules and subprocedures.

On the other end? The digital bricoleur:

The bricoleur resembles the painter who stands back between brushstrokes, looks at the canvas, and only after this contemplation, decides what to do next. Bricoleurs use a mastery of associations and interactions. For planners, mistakes are missteps; bricoleurs use a navigation of midcourse corrections. For planners, a program is an instrument for premeditated control; bricoleurs have goals but set out to realize them in the spirit of a collaborative venture with the machine. For planners, getting a program to work is like “saying one’s piece”; for bricoleurs, it is more like a conversation than a monologue.

Programming as a conversation full of midcourse corrections, associations, and interactions. This strikes such a chord with me. I wonder if it has something to do with a musical background that inclines one towards the bricoleur approach. Papert & Turkle got me thinking as much when their paper delves into a student named Robin and her own approach to computers.

A classmate, Robin, is a pianist. Robin explains that she masters her music by perfecting the smallest “little bits of pieces” and then building up. She cannot progress until she understands the details of each small part. Robin is happiest when she uses this tried and true method with the computer, playing with small computational elements as though they were notes or musical phrases. [...] [S]he is frustrated with black-boxing or using prepackaged programs. She too was told her way was wrong: “I told my teaching fellow I wanted to take it all apart, and he laughed at me. He said it was a waste of time, that you should just black box, that you shouldn't confuse yourself with what was going on at that low level.”

Robin and I are one and the same in this regard. My experience too has been one of taking music apart into smaller phrases and then stitching them together into a piece I perform. This way of thinking bleeds from music into coding — breaking a script into smaller executable phrases and then stitching them together into something I use on my laptop.

While I didn't have people pestering me about my approach like Robin did, I did plenty of that myself. As someone who only started working with technology in their late 20s, with zero experience in programming and only a degree in classical guitar to their name, I worried that my background would be antithetical to learning how to program and manipulate technology. It sure as hell wasn't a mathematical background or the analytical approach that a computer science degree would cultivate. How would I fare? How would I survive?

And you know what? It turns out that I could do just fine. Not in spite of my background but because of it. Papert & Turkle make this crucial point:

The computer can be a partner in a great diversity of relationships. The computer is an expressive medium that different people can make their own in their own way.

This bears repeating — the computer can be a partner in a great diversity of relationships, whether you come at the computer from a mathematical perspective or musical one, whether you approach programming as a digital bricoleur or a meticulous planner.

That can be the freeing flexibility of the computer.

Sometimes a phrase someone uses can strike at the core of your own identity. It touches on a piece of yourself that you couldn't articulate before. Nothing could articulate it. Then lo and behold — the phrase appears. Your identity suddenly opens up before you, things start to make sense.

Tom Critchlow uses such a phrase in this post to describe one of his “open avenues for inquiry.”

Digital Bricolage:

I’ve been obsessed for a long time with the web as a texture – malleable programming, mini hacks and flexible personal scripting. From using importxml in Google Docs to writing little apps on Glitch/Replit.

Why does this resonate with me?

Maybe it's because I came to technology from a music background. I think more in musical terms when I work with computers — timbres, improv, orchestration, arrangements, sampling, remixes, jamming, melodies — than with anything computer science related.

“What happens if these three notes sound together? And what about these two?” translates to something like, “What if my blog could connect to Are.na? What would that look like?”

The musical question doesn't beget a fleshed out musical composition. I'm usually just plucking the strings of my guitar. In a similar way, the computer question doesn't beget a fleshed out web application. What comes out of it? The computer equivalent of plucking strings — mini hacks and flexible personal scripting that Tom writes about. This is why platforms like Glitch have served me well throughout my learnings — they're perfect for this musical form of programming, this digital bricolage.

Working with computers and the web this way has captivated my attention and continues to. I just couldn't put my finger on it before. Having “digital bricolage” in my pocket scratches a 3+ year itch.

Thanks Tom.

Reading through Sir Gawain and the Green Knight, you encounter these passages where the poet lingers on details. Perhaps it feels like lingering to me. But then sometimes the poet admits as much. Take for example when Sir Gawain puts on his armor. We get a beautiful description, finally falling upon the pentangle star painted on his shield.

A pentangle star, painted pure gold, Shone at its center. He swings it by the belt, Then tosses it across his neck. And the sign Of that star, its perfect points, fitted That prince, and I'll tell you how, though it hold up This tale.

The poet proceeds to explain the significance of the pentangle for 40 lines — how it is a “symbol of truth”, how “each of its angles enfolds the other,” how it is called “the infinite knot,” how it relates to the five senses, Christ's “five wounds on the cross.” These details wash over you. The tangent becomes the meat of the poem. But then the poet throws us back into the story as Sir Gawain gets on his horse.

I don't know how to describe such moments. They at once baffle and delight me, not feeling like tangential speed bumps but like a beautiful view you want to linger on. Reading such passages slowly allows me to take more in. Such lingering creates a richer reading experience. The speed bumps of a work become what makes the work special.

Umberto Eco brings this up in Chronicles of a Liquid Society in an essay titled “The pleasure of lingering”:

Lingering was something a certain Monsieur Humblot didn't approve of when he rejected Proust's [work] for the publisher Ollendorff: “I may be slow on the uptake,” he wrote, “but I just can't believe that someone can take thirty pages to describe how you toss and turn in bed before falling asleep.” A denial of the pleasures of lingering would thus prevent us from reading Proust.

And maybe Sir Gawain too.

Git is something I haven't worked with much in a work context. That lack of knowledge has led to frustration lately, and I'm hoping to change that.

I discovered this great post/talk from Rake Routes called Deliberate Git. Here's a great excerpt about the power of using Git responsibly:

Many teams see Git as a source of frustration. A painful reminder that the rubber needs to meet the road in order to make ends meet and keep our customers happy. They see Git as just a mechanism to transport the code from development machines to production servers and keep everyone in sync.

But I want Git to do more. Being distributed means Git gives us the opportunity to do something really amazing. It allows us to make quick commits locally without breaking flow and then allows us to rewrite those commits into a cohesive story that we share with our team.

If you focus on putting more information into your repo now, you can see amazing returns when you have questions later.
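That workflow — quick local commits made without breaking flow, then rewritten into a cohesive story before sharing — can be sketched in a few lines of Git. This is only a sketch: the scratch repo, file names, and commit messages are all hypothetical, and the `GIT_SEQUENCE_EDITOR` trick just stands in non-interactively for the usual hand-editing of the `git rebase -i` todo list.

```shell
# Sketch of the "commit quickly, rewrite deliberately" workflow.
# Everything happens in a throwaway repo so it won't touch real work.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "you@example.com"
git config user.name "You"

# 1. Quick WIP commits made without breaking flow
echo "draft"  > notes.txt
git add notes.txt && git commit -qm "wip: start notes"
echo "more"  >> notes.txt
git add notes.txt && git commit -qm "wip: flesh out notes"

# 2. Rewrite the WIP commits into one cohesive commit before sharing.
#    Normally you'd run `git rebase -i` and edit the todo list by hand;
#    here sed changes the second "pick" to "squash" non-interactively,
#    and GIT_EDITOR=true accepts the combined commit message as-is.
GIT_SEQUENCE_EDITOR='sed -i.bak "2s/^pick/squash/"' \
GIT_EDITOR=true git rebase -i --root

git log --oneline   # history now tells one story, not two half-steps
```

The point isn't the sed incantation — it's that the messy local history and the clean shared history are two different artifacts, and Git lets you produce both.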

There's a great story of how Tu Youyou came to the work that led to her 2015 Nobel Prize in Medicine. The roots of her research have alchemical origins. David Epstein gives an account in Range:

Tu is known as the “professor of the three no's”: no membership in the Chinese Academy of Sciences, no research experience outside of China, and no postgraduate degree. Before Tu, other scientists had reportedly tested 240,000 compounds searching for a malaria cure. Tu was interested in both modern medicine and history, and was inspired by a clue in a recipe for medication made from sweet wormwood, written by a fourth-century Chinese alchemist. Technology doesn't get much more withered than that. It led her to experiment (at first on herself) with a sweet wormwood extract known as artemisinin. Artemisinin is now regarded as one of the most profound drug discoveries in medicine. A study on the decline of malaria in Africa attributed 146 million averted cases to artemisinin-based therapies between 2000 and 2015. Tu had a lot of disadvantages, but she had an outsider advantage as well that made it easier for her to look in places others would not dare.

The fourth-century alchemist Epstein glosses over is Ge Hong, who has other discoveries worthy of note:

In his most famous book, “Manual of Clinical Practice and Emergency Remedies,” he recorded a strange epidemic disease, which made patients suffer a serious fever while experiencing white pustules on their skin. The disease was later discovered to be smallpox. Ge's record was 500 years earlier than the Arabic physician Muhammad ibn Zakariyā Rāzī's.

Ge also mentioned scrub typhus in his text, finding that the disease at that time was prevalent in China's Fujian and Guangdong provinces, and was caused by an intracellular parasite orientia tsutsugamushi. His record was roughly 1,500 years earlier than the first English report made by Dr. Theobald Palm in 1878.

In hindsight, Ge is a noteworthy figure ahead of his time. I wonder how much of that is hiding behind alchemical subtexts that many wouldn't bother with. Perhaps they think it backward? No matter. Tu bothered, and we have her to thank for her breakthrough in finding a cure for malaria.

There's a lot of history that we throw aside; history that is supposedly antithetical to what we're working on. I wonder what else could be found if we fight against the tendency to label such history as anachronistic to current & future ventures both large & small?

I am again reminded of a passage from the second volume of Lewis Mumford's The Myth of the Machine, imagining a society that fully integrated history into its practices:

Had Leonardo [DaVinci]'s example in fact been followed, naturalization, mechanization, organization, and humanization might have proceeded together. Thus one method could have influenced and sustained the other, maintaining continuity with the past, yet alertly absorbing useful or significant novelty, constantly reviewing and correcting past errors, and seeking a wider selection of possibilities; introducing new values, not to destroy but to enrich and fortify those already achieved by other ages and other cultures. Such a practical syncretism of technologies and ideologies would have been an open one, open indeed at both sides, to past and future — constantly absorbing and refining more of the past while projecting and remodeling in a richer design ever larger tracts of the future.