A hand-colored print of a gentleman in a top hat and tailcoat riding an early two-wheeled hobbyhorse, with another rider visible in the distance. “Johnson, the First Rider on the Pedestrian Hobbyhorse” — published 1819 by R. Ackermann, London. Public domain.

Last summer I spent about a week or two (or three? Or four? I can't remember, but the git history knows, and it turns out I iterated on it for a few months; funny how the brain does that) creating new dev machine setup scripts and approaches based on a practice I've followed for years in my career, one I wanted to modernize with AI assistance and put out there for others to use (and, selfishly, for myself to use as well).

Quick story time: at my old company, every summer 15 to 20 interns would show up in research (maybe I'm exaggerating; it's hard to recall, and again, brains are weird!), and what I noticed that first summer was that they would each take a couple of days to set up their freshly re-imaged Alienware dev boxes.

Now, when you’re in an internship, every day is precious, and interns spending two days on this mundane task always felt a bit silly to me.

PowerShell to the rescue.

At the time I coded something by hand (raw PowerShell, no frills) that would automate the setup of their machines. As a practice, when interns joined I'd walk around with a thumb drive (yes, you heard that right), plug it into each intern's machine, copy the script over, and run it while we had coffee and got to know the newest group of folks coming in that summer.

We got a lot of leverage out of that little script, and the practice stayed with me throughout my career: I wrote a new (and enhanced) version at my next company (again, by hand!), and it has endured to this day.

Now, I wanted a version of this that I could release as open source, not just for me but for others out there. There were also things that could be improved, like a YAML config file to better parameterize the setup. But inertia always held me back.
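To give a flavor of what I mean by parameterizing the setup with YAML, here's a minimal sketch; the key names, packages, and structure are hypothetical, not the actual config from my repo:

```yaml
# Hypothetical dev-setup.yaml; keys and package names are illustrative only.
packages:
  common:
    - git
    - jq
  macos:
    - coreutils
  ubuntu:
    - build-essential
dotfiles_repo: "<your-dotfiles-repo>"   # placeholder, fill in your own
default_shell: zsh
```

The point of a file like this is that the setup script itself stays generic, and everything machine- or team-specific lives in data you can diff and review.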

However, once AI hit and I got access to GitHub Copilot, it was time to try my hand at this.

Now keep in mind, last summer, 2025, even with GitHub Copilot in hand, I still had to iterate on this for a while. In hindsight I should have set up CI from the get-go, but in the initial pass I really needed to be "close to the metal," running things in virtual machines to work out repeatable setups for macOS, Windows, and Ubuntu and get a feel for what the developer experience is really like when running these dev machine setups from scratch multiple times. (Pro tip for folks in enterprise who design developer experiences: make sure you dogfood these processes yourself to build true empathy.)
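The heart of any cross-OS setup script is a little dispatcher that figures out where it's running and hands off to the right per-platform steps. Here's a minimal sketch of that idea; the function and script names are assumptions for illustration, not my actual repo layout:

```shell
#!/usr/bin/env bash
# Hypothetical dispatcher: map the kernel name from `uname -s` to a platform
# label, then hand off to a per-platform setup script (names illustrative).
detect_platform() {
  case "$1" in
    Darwin)        echo "macos" ;;
    Linux)         echo "linux" ;;
    MINGW*|MSYS*)  echo "windows" ;;   # Git Bash / MSYS2 on Windows
    *)             echo "unknown" ;;
  esac
}

platform="$(detect_platform "$(uname -s)")"
echo "Would run: setup-${platform}.sh"
```

Keeping the detection in one small function like this is what makes it cheap to bolt on another distro later, which is part of why adding Fedora and Debian went so quickly.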

Contrast that with now: with Claude Code I was able to wrap the whole thing in CI with GitHub Actions in an hour and a half, and then add Fedora support in an hour.
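For the curious, wrapping a setup script like this in CI mostly means a matrix job that runs it from scratch on each OS. A rough sketch of such a workflow (the workflow name, script path, and job layout here are assumptions, not my actual file):

```yaml
# Hypothetical GitHub Actions workflow; names and paths are illustrative.
name: setup-smoke-test
on: [push, pull_request]
jobs:
  setup:
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - name: Run setup from a clean machine
        run: ./setup.sh
        shell: bash
```

Every push then exercises the exact "fresh machine" path I used to test by hand in VMs, which is the feedback loop that made the later distro additions so fast.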

This last week, I added Debian support in an hour (or less; who's really counting at this point?).

Now, one might say:

“Well Ryan, you’ve been compounding your engineering practices over time.”

And yes, yes I have.

I think this is one of those things that makes developer productivity gains so hard to measure. Depending on where you come in and start measuring, you may not have a good reference point of having done the thing in the past. And as you go along, you build compounding wins beyond just using AI for coding: techniques and practices like rigorous testing and linting let you move so much faster.

It turns out the people advocating for more linting, testing, TDD, and more over the years were right. The difference now is that it's easier than ever to get these practices and workflows going.

In fact, compound engineering practices are so powerful that they have been codified into a Claude Code plugin that I encourage people to check out. This is certainly not the only way to experience these benefits; there are thousands of ways to approach this, and they will vary by the type of workload you're building. But the important thing is to start.

The tools and the models have gotten better, but it’s hard to account for practices that an AI tool helps you put in place to go even faster.

It’s not all just about code throughput—more often than not, it’s about the practices you set up around the code.

I will add this as well: you really need a platform like GitHub, where CI is right there, easy to get going for fast iteration, and where the pit of success is easy to fall into. The virtuous cycle of feedback, with these tools able to read results from builds in real time, is real. If your CI is clunky, disjointed, cumbersome, slow, unreliable, or unreadable by local tools, and doesn't allow for this virtuous cycle, it's easy to see why it's less ideal: you're stretching the developer loop out into adjacent tools that don't provide as seamless an experience. That's not to say you can't accomplish great DX with adjacent tools; you just have to work harder to give developers the means to consume them in fast, iterative loops.


It truly feels like the future—like when Doc Brown said, “I’m sure in 1985 plutonium is available at every corner drug store.”

I feel like I have plutonium and I’m blowing up all of my backlog. I’m going to have no retirement projects when I’m old now. I might have to pick up fishing.