Jeremy Keith

Making websites. Writing books. Hosting a podcast. Speaking at events. Living in Brighton. Working at Clearleft. Playing music. Taking photos. Answering email.


Thursday, February 19th, 2026

A considered approach to generative AI in front-end… | Clearleft

A thoughtful approach from Sam:

  1. Use AI only for tasks you already know how to do, on occasions when the time that would be spent completing the task can be better spent on other problems.
  2. When using AI, provide the chosen tool with something you’ve made as an input along with a specific prompt.
  3. Always comprehensively review the output from an AI tool for quality.

A programmer’s loss of identity - ratfactor

We value learning. We value the merits of language design, type systems, software maintenance, levels of abstraction, and yeah, if I’m honest, minute syntactical differences, the color of the bike shed, and the best way to get that perfectly smooth shave on a yak. I’m not sure what we’re called now, “heirloom programmers”?

Do I sound like a machine code programmer in the 1950s refusing to learn structured programming and compiled languages? I reject that comparison. I love a beautiful abstraction just as much as I love a good low-level trick.

If the problem is that we’ve painted our development environments into a corner that requires tons of boilerplate, then that is the problem. We should have been chopping the cruft away and replacing it with deterministic abstractions like we’ve always done. That’s what that Larry Wall quote about good programmers being lazy was about. It did not mean that we would be okay with pulling a damn slot machine lever a couple times to generate the boilerplate.

Wednesday, February 18th, 2026

10 Thoughts On “AI,” February 2026 Edition | Whatever

  1. I don’t and won’t use “AI” in the text of any of my published work.
  2. I’m not worried about “AI” replacing me as a novelist.
  3. People in general are burning out on “AI.”
  4. I’m supporting human artists, including as they relate to my own work.
  5. “AI” is Probably Sticking Around In Some Form.
  6. “AI” is a marketing term, not a technical one, and encompasses different technologies.
  7. There were and are ethical ways to have trained generative “AI” but because they weren’t done, the entire field is suspect.
  8. The various processes lumped into “AI” are likely to be integrated into programs and applications that are in business and creative workflows.
  9. It’s all right to be informed about the state of the art when it comes to “AI.”
  10. Some people are being made to use “AI” as a condition of their jobs. Maybe don’t give them too much shit for it.

JS-heavy approaches are not compatible with long-term performance goals

Frameworks like React are often perceived as accelerators, or even as the only sensible way to do web development. There’s this notion that a more “modern” stack (read: JS-heavy, where the JS ends up running on the user’s browser) allows you to be more agile, release more often with fewer bugs, make code more maintainable, and ultimately launch better sites. In short, the claim is that this approach will offer huge improvements to developer experience, and that these DevEx benefits will trickle down to the user.

But over the years, this narrative has proven to be unrealistic, at best. In reality, for any decently sized JS-heavy project, you should expect that what you build will be slower than advertised, it will keep getting slower over time while it sees ongoing work, and it will take more effort to develop and especially to maintain than what you were led to believe, with as many bugs as any other approach.

When it comes to performance, the important thing to note is that a JS-heavy approach (and particularly one based on React & friends) will most likely not be a good starting point; in fact, it will probably prove to be a performance minefield that you will need to keep revisiting, risking a detonation with every new commit.

Counting down to Web Day Out

Not long now till Web Day Out — just three weeks!

It’s also not that long until the start of a new financial year so if you’ve got training budget that needs to be used this year, send your team to Web Day Out. Not only is it excellent value for money, it’s also going to have an incredibly high density of knowledge bombs per talk.

CSS! Progressive web apps! Web typography! Browser support! And much more.

If you like the sound of Web Day Out, you’ll also like State Of The Browser, which is just ten days away. In-person tickets for that event are now sold out, but online streaming tickets are still available.

Better yet, if you buy a ticket to Web Day Out, you automatically get a free online streaming ticket for State Of The Browser!

So get your ticket in the next ten days, enjoy State Of The Browser from the comfort of your own home, and then enjoy a trip to Brighton for Web Day Out on Thursday, 12 March. See you there!

Tuesday, February 17th, 2026

Magic

I don’t like magic.

I’m not talking about acts of prestidigitation and illusion. I mean the kind of magic that’s used to market technologies. It’s magic. It just works. Don’t think about it.

I’ve written about seamless and seamful design before. Seamlessness is often touted as the ultimate goal of UX—“don’t make me think!”—but it comes with a price. That price is the reduction of agency.

When it comes to front-end development, my distrust of magic tips over into being a complete control freak.

I don’t like using code that I haven’t written and understood myself. Sometimes it’s unavoidable. I use two JavaScript libraries on The Session. One for displaying interactive maps and another for generating sheet music. As dependencies go, they’re very good but I still don’t like the feeling of being dependent on anything I don’t fully understand.

I can’t stomach the idea of using npm to install client-side JavaScript (which then installs more JavaScript, which in turn is dependent on even more JavaScript). It gives me the heebie-jeebies. I’m kind of astonished that most front-end developers have normalised doing daily trust falls with their codebases.

While I’m mistrustful of libraries, I’m completely allergic to frameworks.

Often I don’t distinguish between libraries and frameworks but the distinction matters here. Libraries are bits of other people’s code that I call from my code. Frameworks are other people’s code that call bits of my code.

Think of React. In order to use it, you basically have to adopt its idioms, its approach, its syntax. It’s a deeper level of dependency than just dropping in a regular piece of JavaScript.
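To make that distinction concrete, here’s a minimal sketch. It’s purely illustrative, not code from this site or The Session, using Leaflet and React as stand-ins: with a library, my code decides when to call it; with a framework, the framework decides when to call my code.

```javascript
// Purely illustrative sketch, not code from this site or The Session.
import L from 'leaflet';
import { createElement } from 'react';
import { createRoot } from 'react-dom/client';

// A library: my code is in charge and calls Leaflet when it wants a map.
const map = L.map('map').setView([50.82, -0.14], 13); // roughly Brighton

// A framework: React is in charge and calls my component whenever it decides to render.
function Greeting() {
  return createElement('p', null, 'Hello'); // React invokes this, not me
}
createRoot(document.getElementById('root')).render(createElement(Greeting));
```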

I’ve always avoided client-side React because of its direct harm to end users (over-engineered bloated sites that take way longer to load than they need to). But the truth is that I also really dislike the extra layer of abstraction it puts between me and the browser.

Now, whenever there’s any talk about abstractions someone inevitably points out that, when it comes to computers, there’s always some layer of abstraction. If you’re not writing in binary, you don’t get to complain about an extra layer of abstraction making you uncomfortable.

I get that. But I still draw a line. When it comes to front-end development, that line is for me to stay as close as I can to raw HTML, CSS, and JavaScript. After all, that’s what users are going to get in their browsers.

My control freakery is not typical. It’s also not a very commercial or pragmatic attitude.

Over the years, I’ve stopped doing front-end development for client projects at work. Partly that’s because I’m pretty slow; it makes more sense to give the work to a better, faster developer. But it’s also because of my aversion to React. Projects came in where usage of React was a foregone conclusion. I wouldn’t work on those projects.

I mention this to point out that you probably shouldn’t adopt my inflexible mistrustful attitude if you want a career in front-end development.

Fortunately for me, front-end development still exists outside of client work. I get to have fun with my own website and with The Session. Heck, they even let me build the occasional hand-crafted website for a Clearleft event. I get to do all that the long, hard, stupid way.

Meanwhile in the real world, the abstractions are piling up. Developers can now use large language models to generate code. Sometimes the code is good. Sometimes it’s not. You should probably check it before using it. But some developers just YOLO it straight to production.

That gives me the heebie-jeebies, but then again, so did npm. Is it really all that different? With npm you dialled up other people’s code directly. With large language models, they first slurp up everyone’s code (like, the whole World Wide Web), run a computationally expensive process of tokenisation, and then give you the bit you need when you need it. In a way, large language model coding tools are like a turbo-charged npm with even more layers of abstraction.

It’s not for me but I absolutely understand why it can work in a pragmatic commercial environment. Like Alice said:

Knitting is the future of coding. Nobody knits because they want a quick or cheap jumper, they knit because they love the craft. This is the future of writing code by hand. You will do it because you find it satisfying but it will be neither the cheapest or quickest way to write software.

But as Dave points out:

And so now we have these “magic words” in our codebases. Spells, essentially. Spells that work sometimes. Spells that we cast with no practical way to measure their effectiveness. They are prayers as much as they are instructions.

I shudder!

But again, this too is nothing new. We’ve all seen those codebases that contain mysterious arcane parts that nobody dares touch. coughWebpackcough. The issue isn’t with the code itself, but with the understanding of the code. If the understanding of the code was in one developer’s head, and that person has since left, the code is dangerous and best left untouched.

This, as you can imagine, is a maintenance nightmare. That’s where I’ve seen the real cost of abstractions. Abstractions often really do speed up production, but you pay the price in maintenance later on. If you want to understand the codebase, you must first understand the abstractions used in the codebase. That’s a lot to document, and let’s face it, documentation is the first casualty of almost every project.

So perhaps my aversion to abstraction in general—and large language models in particular—is because I tend to work on long-term projects. This website and The Session have lifespans measured in decades. For these kinds of projects, maintenance is a top priority.

Large language model coding tools truly are magic.

I don’t like magic.

Sunday, February 15th, 2026

Knitting is the future of coding. Nobody knits because they want a quick or cheap jumper, they knit because they love the craft. This is the future of writing code by hand.

Alice Bartlett

Progress Without Disruption - Christopher Butler

We’ve been taught that technological change must be chaotic, uncontrolled, and socially destructive — that anything less isn’t real innovation.

The conflation of progress with disruption serves specific interests. It benefits those who profit from rapid, uncontrolled deployment. “You can’t stop progress” is a very convenient argument when you’re the one profiting from the chaos, when your business model depends on moving fast and breaking things before anyone can evaluate whether those things should be broken.

We’ve internalized technological determinism so completely that choosing not to adopt something — or choosing to adopt it slowly, carefully, with conditions — feels like naive resistance to inevitable progress. But “inevitable” is doing a lot of work in that sentence. Inevitable for whom? Inevitable according to whom?

Thursday, February 12th, 2026

The Morrigan by Kim Curran

Every culture has its myths and legends. Greece has its gods and warriors. England has its stories of Arthur. Ireland has the Tuatha Dé Danann, The Ulster Cycle, and more.

But while the Arthurian legends and the Greek myths have been retold many times, the stories of ancient Ireland have remained largely untouched.

Kim Curran’s book The Morrigan takes on this challenge.

The blurb for the book compares it to Madeline Miller’s Circe, which is a bold comparison. The writing in The Morrigan isn’t in the same league as Circe, but then again, very little is.

Structurally, the comparison makes complete sense.

Circe starts with the titular nymph in the world of the gods of Olympus before moving on to more mortal affairs, coming to a head with the events of The Odyssey, when Odysseus’s story dominates.

The Morrigan starts with the titular goddess in the world of the gods of the Túatha Dé before moving on to more mortal affairs, coming to a head with the events of The Táin, when Cú Chulainn’s story dominates.

It took me a little while to adjust to the tone, but once I did, I thoroughly enjoyed this retelling. It manages to simultaneously capture the bloody, over-the-top feeling of The Táin while also having a distinctly modern twist. By the last third, I was completely engrossed.

After finishing Circe I went on a spree of reading many, many modern retellings of Greek myths. Now that I’ve finished The Morrigan I want to do the same for the Irish legends.

But I can’t. Apart from re-reading a translation of The Táin, there’s not much else out there for me.

Kim Curran does have another book that’s just been released: Brigid (the goddess? the saint? both?). If it’s anything like The Morrigan, it’s going to be a must-read.

I hope these books are the first of many.

Buy this book
