I’ve been working on upgrades to the NUKEMAP for several years now, but I actually would like to get them implemented this year. I figured that writing about it on here would be one way to make this “invisible” work visible to the rest of the world, and that it would also, perhaps, encourage me to Get It Done.

NUKEMAP as it looks in Chrome on a Mac in February 2026. Sorry, New York. Note the use of Protomaps and not Mapbox or Google Maps, both of which became prohibitively expensive.
Over the past few years, I’ve done some major “behind the scenes” upgrades to NUKEMAP that are basically invisible unless you are paying very close attention. Some of these are basic bug fixes and quality-of-life improvements, but the more substantial ones involve migrating a lot of the site’s functionality away from Amazon AWS, which is relatively expensive, and over to Cloudflare. The site still requires AWS for its PHP and RDS (database) functions, since migrating those to Cloudflare’s Workers and database systems would both be a lot of work and not necessarily be cost-effective (Cloudflare and AWS RDS have very different pricing models, and for some things Cloudflare is better, while for others it is much worse), but I’ve managed to cut the monthly costs down considerably.
Most importantly, the map UI now runs off of a self-hosted (Cloudflare-hosted) Protomaps instance, which is much, much, much cheaper than Google Maps, Mapbox, whatever. If you are a developer or organization that uses maps, does not require them to be constantly up-to-the-minute, and is looking to shave a lot of cost off of your bottom line, check out Protomaps. It is a pain to set up the full “stack” of things you need for it to work (I will write up how I did it at some point, but 80% of the difficulty is just getting the style sheets migrated correctly to Protomaps’ format), but once you do, it costs an almost insignificant amount compared to those other systems, even with very high map usage. It’s not perfect, in part because it relies on OpenStreetMap data for tagging (and sometimes someone will make a mistake that, like, eliminates Pennsylvania, and that ends up in the “build”), and in part because I’ve made some tweaks so that it will render even on older (non-WebGL-supporting) browsers, which prevents me from using the latest versions of Protomaps’ data builds, but the cost savings are so significant that it makes up for it.1
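For anyone curious what the self-hosted setup looks like from the code side, here is a minimal sketch of a MapLibre-style map style pointed at a PMTiles archive rather than a commercial tile API. The hostname and the single layer shown are illustrative placeholders, not my actual configuration; in the browser, the `pmtiles://` protocol is handled by the PMTiles Javascript library.

```javascript
// Sketch only: the URL below is a placeholder, not my real tile host.
// A Protomaps/PMTiles setup replaces per-request tile API calls with
// static range requests against a single hosted .pmtiles archive.
const style = {
  version: 8,
  sources: {
    basemap: {
      type: "vector",
      // "pmtiles://" is resolved by the PMTiles protocol handler in MapLibre;
      // the archive itself is just a static file on Cloudflare (or anywhere).
      url: "pmtiles://https://tiles.example.com/basemap.pmtiles", // hypothetical
    },
  },
  layers: [
    // One fill layer as an example; a real basemap style has dozens,
    // which is the "80% of the difficulty" part mentioned above.
    {
      id: "land",
      type: "fill",
      source: "basemap",
      "source-layer": "earth", // layer name per the Protomaps basemap schema
      paint: { "fill-color": "#e0e0d8" },
    },
  ],
};
```

In the page itself you would hand this object to `maplibregl.Map` after registering the PMTiles protocol; the hard part, as noted, is porting a full style sheet’s worth of layers.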
So what lies ahead? Here are the major things for NUKEMAP I would like to implement in 2026:
- Enhanced casualty calculations and updated population data. The NUKEMAP’s casualty calculations work on LandScan population data, and that underlying dataset is by now somewhat out of date. I have migrated the new data into a database, but am experimenting with alternative modes of casualty calculation before I roll it out. These include alternative calculation models, e.g., not just pegging casualties to blast as a proxy for mortality, but also looking at thermal and conflagration models, and allowing the user to change which models they use. I have also integrated an urban land-use dataset into my casualty dataset, which opens up the possibility of using building data for these calculations (e.g., being able to take into account whether a given grid square is “high-density urban,” “medium-density urban,” etc., which can affect various estimations). I would say this is 90% done, and just waiting on some tweaks to the underlying effects model (covered below) that I am implementing.
- Humanitarian impacts. A very long time ago, when Google Maps was actually cheap/free/affordable, I used to have a “Humanitarian impact” button that would look up information about schools, churches, and hospitals that were within the area of heavy blast damage. I have been meaning to bring back similar functionality, and have put together a database of OpenStreetMap’s tagged metadata (basically an extracted database of these kinds of institutions, along with a “cultural” category that includes libraries, theaters, and museums). I have also added, as part of my enhanced casualty calculations, estimates of what proportion of the casualties are specifically children, which I think is an effective way to make these numbers a bit more impactful. This is basically ready to go and just waiting on my updates to the casualty calculations above, and on integrating the output into the user interface.
- Upgrading the effects models. I have for some time now been creating a new Javascript framework that I call the Atomic Weapons Effects Library (AWEL.js, you see what I did there?), which is a “universal” nuclear effects library that allows me to integrate different effects models and data into one place and use them in a way that is much easier than the current NUKEMAP effects library. While doing this I have been going over every aspect of the modeling and improving it where I can, either by drawing on additional or better sources when they are available, or by adding new functionality when I find it. So, for example, I am not only using Glasstone and Dolan’s The Effects of Nuclear Weapons (1977), but also bringing in data from the previous (and formerly classified) Capabilities of Nuclear Weapons (1960 and 1972 editions) where it is available. This means that the new effects library will not only be able to do a number of things the current one cannot (like go into the “very low” overpressure region, down to 0.25 psi), but will also be open-source and on Github for people to use and critique and add to as they see fit. It is also a very useful modular framework that is unit-agnostic and will make the life of anyone who does these kinds of calculations much easier, I think (it has made my life easier, anyway). Anyway, while this is undoubtedly a quixotic, never-ending project (since it is meant to be able to accommodate as many effects models as one wants to include in it), I will have a “basic” version done and implemented in the new NUKEMAP fairly soon. I would say this is 80% done, with just a few things to finesse and fix before it is “production ready.”
- Adding more fallout capabilities. Aside from wanting to allow the user to more easily understand the impacts of fallout (like being able to instantly generate “exposure over time” graphs for places downwind of a detonation), I have also been working on new implementations of WSEG-10 and, possibly, DELFIC, that can run in the browser, including as WebGL shaders, which are shockingly fast. So I am planning to allow the user to choose between fallout models. An advantage of WSEG-10, for example, is that you can easily do things with it that are not so possible with the existing Miller model, like showing the changing dose-rate over time after the detonation. This is maybe 70% done for WSEG-10 and maybe 20% done for DELFIC. (I have not gotten far enough with DELFIC to confirm that it can render in a reasonable amount of time, even as a web shader, so it may be impossible for this implementation. With WSEG-10, the web shader version renders maybe 100X faster than a normal in-browser version, and feels very plausible.)
- A total UI overhaul and recode. This is a big job, but the NUKEMAP interface and underlying code were originally made almost a decade and a half ago and have been added to incrementally since then. My own Javascript programming experience has expanded a lot in that time, as has what a web browser is capable of. Moreover, the modes in which people use websites have changed as well: mobile users represent a significant component of NUKEMAP users, and NUKEMAP is not really built around non-desktop modes (I made a half-assed mobile stylesheet port a few years ago, just so it isn’t impossible to use on a mobile phone, but it is very half-assed). I have been experimenting with this for several years now, unwilling to commit to one framework or another (they all have their pros and many cons, and a major “con” for all of them seems to be that once you commit to one, you’re committed to it), but I’m in a place now where I’m pretty ready to do this. This will involve rewriting basically all of the site in a much smarter way than it is currently written. Lest you worry, the essential aesthetics are not going to change that much (I am, if anything, stuck in my ways, so it is not going to turn into something “trendy”), but I think there are ways to streamline it without removing any existing “advanced” functionality. This will also involve making it more accessible for those with visual impairments, and building in a native framework for translating its components into other languages, something I have wanted to do for a long time now.
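To make the land-use idea in the casualty item above concrete, here is a toy sketch of a grid-based casualty pass with a density modifier. Every number in it is a placeholder for illustration; these are not NUKEMAP’s actual model parameters, and the real models use continuous effects curves rather than a step function.

```javascript
// Illustrative only: mortality bands and land-use factors are made up.
function estimateCasualties(cells) {
  // Hypothetical mortality fraction by peak overpressure band (psi).
  const mortalityAt = (psi) =>
    psi >= 12 ? 0.9 : psi >= 5 ? 0.5 : psi >= 2 ? 0.05 : 0;
  // Hypothetical modifiers: denser construction changes sheltering
  // and collapse behavior, so the same overpressure kills differently.
  const landUseFactor = {
    "high-density urban": 1.2,
    "medium-density urban": 1.0,
    "suburban": 0.8,
  };
  let dead = 0;
  for (const c of cells) {
    const f = landUseFactor[c.landUse] ?? 1.0;
    // Clamp so a modifier can never push per-person mortality above 100%.
    dead += c.population * Math.min(1, mortalityAt(c.overpressurePsi) * f);
  }
  return Math.round(dead);
}
```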
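The lookup behind the humanitarian-impacts item above can be sketched as a radius filter over a pre-extracted list of OpenStreetMap-tagged points. The field names here are my own illustrative ones, not the actual database schema:

```javascript
// Sketch: given a detonation point and a radius (e.g., the heavy blast
// damage ring), pull matching institutions out of a pre-extracted list
// of OpenStreetMap-tagged points.
const EARTH_RADIUS_M = 6371000;

// Great-circle distance between two lat/lon points (haversine formula).
function haversineMeters(lat1, lon1, lat2, lon2) {
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

function institutionsWithin(points, lat, lon, radiusM, categories) {
  return points.filter(
    (p) =>
      categories.includes(p.category) &&
      haversineMeters(lat, lon, p.lat, p.lon) <= radiusM
  );
}
```

A real version would query the database spatially rather than scanning an array, but the geometry is the same.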
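As one small example of the kind of relation a library like the effects library above wraps: blast effect distances follow the classic cube-root scaling law from Glasstone and Dolan, in which the range at which a given overpressure occurs scales as yield to the one-third power. A minimal, unit-agnostic sketch (this is the textbook relation, not AWEL’s actual interface):

```javascript
// Cube-root scaling: if a given overpressure occurs at range r for a
// 1 kt burst, it occurs at roughly r * W^(1/3) for a W-kiloton burst.
// Unit-agnostic: the output is in whatever unit the reference range uses.
function scaleRange(rangeAt1kt, yieldKt) {
  return rangeAt1kt * Math.cbrt(yieldKt);
}
```

So an effect that reaches 1 km at 1 kt reaches about 2 km at 8 kt, not 8 km, which is why yield increases are less dramatic on a map than people expect.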
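The “changing dose-rate over time” capability mentioned in the fallout item rests on a standard approximation, the t^-1.2 (Way-Wigner) decay rule, which is easy to sketch. This is the generic textbook rule, not the WSEG-10 implementation itself:

```javascript
// Way-Wigner approximation: fallout dose rate decays roughly as t^-1.2,
// with t in hours after detonation and rateAt1Hour as the reference
// ("H+1") dose rate. Output units match the units of the reference rate.
function doseRateAt(hours, rateAt1Hour) {
  if (hours <= 0) throw new RangeError("time after detonation must be positive");
  return rateAt1Hour * Math.pow(hours, -1.2);
}
```

The familiar “7-10 rule” (dose rate drops by roughly 10x for every 7x increase in time) is a rough consequence of this exponent.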
OK, so that’s a lot for 2026, but most of those tasks, as you can see, are already pretty advanced, as I have been working on them for some time now. The UI recode is the one that is the most nebulous in terms of “how much has been done, how much is there to do,” because I’ve started and re-started this many times over the years, and had students do mockups and proofs-of-concept, and so on, but I probably am going to just restart it again from scratch, because it’s that kind of project. But I’ve been thinking about it for so long that my mind has definitely formed some very firm opinions about how to go about it.
So that’s the game plan! I’ll be posting updates on here as bits of it get completed. At the moment I am thinking that the order of completion will be: effects model, casualties, humanitarian impact, fallout, UI overhaul. But we’ll see how that shakes out in practice.2
I would be remiss not to acknowledge that this upgrading work, and NUKEMAP’s general costs, have been for the past few years supported by a grant from Ploughshares, and some aspects of this work have been supported by a grant from the Future of Life Institute.
- It is a general theme for me that NUKEMAP has had to change many times because of the ways in which map API companies “squeeze” their users over time with very high costs for a website as popular as NUKEMAP. For a while Mapbox gave me a steep discount because of the not-for-profit/educational nature of NUKEMAP, but a) in periods of very high use (like the Russian invasion of Ukraine) that still was very expensive, and b) they started requiring me to essentially re-apply for that discount on a regular basis and started wanting me to “do things” for them, like give them positive reviews on certain websites, and I just really don’t appreciate that kind of thing. Right now I am of course at the whim of the AWS and Cloudflare pricing schemes, which undoubtedly will change over the next decade or so, but at least my core “stack” is one that is inherently more portable because it is self-hosted. Protomaps for the win. [↩]
- Note: before someone chimes in (and someone always does these days) and suggests that I let AI do all of this work for me: a) how dare you, have some pride for your craft and labor; b) while I am generally pretty skeptical of AI making life easier for anyone other than the oligarchs who are trying to push it onto us, I am actually not totally opposed to using it in the same way I would Stack Overflow, which is to say, to produce well-defined functions that have long been “solved” (and are largely uninteresting to me) but are not at my fingertips (“write me a function in Javascript that inputs an array of bounding boxes and outputs a common bounding box for them all in Web Mercator”), but even there, my experience is that you have to be very careful because the first thing they put out often has bugs or incorrect assumptions (if you don’t specify “Web Mercator” it can give you a function that gives results that are quite incorrect when used in a web mapping API, for example), and I certainly am not interested in “vibe coding” an entire web framework (I have seen some other NUKEMAP-like knockoffs that were clearly “vibe coded” and they sometimes are very strange in their implementations). So, yeah, this isn’t what I am interested in doing, except sparingly. “Convert this function from Python to Javascript” is something I am happy to let a bot do for me, although even then I feel ethically uneasy about using these products, and one still needs to bulletproof-check the output. [↩]
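For concreteness, here is a hand-written sketch of the kind of long-solved bounding-box function described in that footnote, assuming input boxes as [west, south, east, north] in degrees, spherical Web Mercator (EPSG:3857), and no boxes crossing the antimeridian:

```javascript
// Union an array of [west, south, east, north] lon/lat boxes, then project
// the result to Web Mercator (EPSG:3857) meters. Does NOT handle boxes
// that cross the antimeridian.
const R = 6378137; // Web Mercator sphere radius, meters

// Spherical Mercator forward projection for a single lon/lat point.
function toMercator(lon, lat) {
  const x = (R * lon * Math.PI) / 180;
  const y = R * Math.log(Math.tan(Math.PI / 4 + (lat * Math.PI) / 360));
  return [x, y];
}

function unionBBoxMercator(boxes) {
  let [w, s, e, n] = boxes[0];
  for (const [bw, bs, be, bn] of boxes.slice(1)) {
    w = Math.min(w, bw);
    s = Math.min(s, bs);
    e = Math.max(e, be);
    n = Math.max(n, bn);
  }
  // Mercator is monotonic in both axes, so projecting the two corners
  // of the lon/lat union gives the union in projected coordinates.
  return [...toMercator(w, s), ...toMercator(e, n)];
}
```

The footnote’s point holds here too: leave out the projection step and the result looks fine until it is fed to a web mapping API expecting meters.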





























