People are valuable because they exist

[P]eople are valuable because they exist, not because they are productive. You are valuable because you carry the collective hopes of humanity, because you are your ancestors’ wildest dreams come true, because you breathe and eat and take up space.

You are valuable because you are, not because you do.

Hugh Hollowell – Life is so Beautiful, April 6th 2023

MySQL’s OLD_PASSWORD() uses bytes, not characters

This is all ancient history, but sometimes you have to deal with ancient systems.

Way back in the 4.x days, MySQL had a PASSWORD() function that was used to set MySQL-managed user credentials. You gave it a string and it returned a hex string. It was never intended for clients to hash passwords for their own use (indeed, the 5.7 docs tell you not to) but nothing prevented it.

Later, MySQL changed the hashing algorithm and added a way to toggle whether PASSWORD() used the old or the new algorithm via the old_passwords system variable. Somewhere along the way they also added an OLD_PASSWORD() function that always used the old algorithm.

Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 6
Server version: 5.5.62-0ubuntu0.14.04.1 (Ubuntu)

Copyright (c) 2000, 2018, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> select PASSWORD('123456');
+-------------------------------------------+
| PASSWORD('123456')                        |
+-------------------------------------------+
| *6BB4837EB74329105EE4568DDA7DC67ED2CA2AD9 |
+-------------------------------------------+
1 row in set (0.00 sec)

mysql> select OLD_PASSWORD('123456');
+------------------------+
| OLD_PASSWORD('123456') |
+------------------------+
| 565491d704013245       |
+------------------------+
1 row in set (0.00 sec)

Then MySQL 8.0 came along and all of those functions were removed.

But users had done Bad Things: they used those functions to generate hashes, stored those hashes in their own databases, and needed ways to replicate the function. And the internet provided solutions in various languages, including Python, PHP, and a replacement SQL function.

The devil is in the details, though: depending on the language, it matters whether the password is 7-bit ASCII or contains multibyte Unicode characters.

The PASSWORD() and OLD_PASSWORD() functions both treat their input as a string of bytes, not a string of Unicode characters. If the input is 7-bit ASCII those are the same and it doesn’t matter. If it’s Unicode, however, multibyte characters are hashed as individual bytes rather than as characters.
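Concretely, a character like 'ô' is one character but two bytes in UTF-8, so a byte-oriented hash consumes two values where a character-oriented port consumes only one. A quick Python 3 illustration:

```python
# One Unicode character can be several bytes in UTF-8.
s = "ô"
print([ord(c) for c in s])     # code points: [244]
print(list(s.encode("utf8")))  # bytes: [195, 180]
```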

With a little Python 3 code and a handy Dockerfile (Ubuntu 14.04, which ships MySQL 5.5) we can demonstrate this:

# old_password.py
import sys


def mysql_old_password_chars(password):
    """Treat the password as a string of Unicode characters -- WRONG"""
    password = password.replace(" ", "").replace("\t", "")

    # build the old password in nr and nr2
    nr = 1345345333
    add = 7
    nr2 = 0x12345671

    for c in (ord(x) for x in password):
        nr ^= (((nr & 63)+add)*c) + (nr << 8) & 0xFFFFFFFF
        nr2 = (nr2 + ((nr2 << 8) ^ nr)) & 0xFFFFFFFF
        add = (add + c) & 0xFFFFFFFF

    return "%08x%08x" % (nr & 0x7FFFFFFF, nr2 & 0x7FFFFFFF)


def mysql_old_password_bytes(password):
    """Treat the password as a string of bytes -- CORRECT"""
    password = password.replace(" ", "").replace("\t", "")

    # build the old password in nr and nr2
    nr = 1345345333
    add = 7
    nr2 = 0x12345671

    for c in password.encode('utf8'):
        nr ^= (((nr & 63)+add)*c) + (nr << 8) & 0xFFFFFFFF
        nr2 = (nr2 + ((nr2 << 8) ^ nr)) & 0xFFFFFFFF
        add = (add + c) & 0xFFFFFFFF

    return "%08x%08x" % (nr & 0x7FFFFFFF, nr2 & 0x7FFFFFFF)


if __name__ == '__main__':
    password = sys.argv[1]
    print("chars: " + mysql_old_password_chars(password))
    print("bytes: " + mysql_old_password_bytes(password))

# Dockerfile
FROM ubuntu:14.04

RUN apt-get update && apt-get install -y mysql-server

# start mysqld in the foreground
CMD mysqld

Let’s get our MySQL server going:

docker build --tag mysql-5.7 .
docker run --rm -d --name=mysql57 mysql-5.7

Now run some tests. Start with some simple 7-bit ASCII strings:

$ docker exec mysql57 mysql --default-character-set=utf8 --skip-column-names -e 'select old_password("123456");'
565491d704013245
$ python3 old_password.py '123456'
chars: 565491d704013245
bytes: 565491d704013245

$ docker exec mysql57 mysql --default-character-set=utf8 --skip-column-names -e 'select old_password("Pa$$ W0rD");'
69d9eae853c7ddf5
$ python3 old_password.py 'Pa$$ W0rD'
chars: 69d9eae853c7ddf5
bytes: 69d9eae853c7ddf5

So far so good. Now throw in something above 7-bit ASCII, like a simple ‘ô’

$ docker exec mysql57 mysql --default-character-set=utf8 --skip-column-names -e 'select old_password("Allô");'
4ae9b3f6595c3f70
$ python3 old_password.py 'Allô'
chars: 3ff55a3d63bf3485
bytes: 4ae9b3f6595c3f70

The well-known Python solution that uses characters fails here. To be fair, it probably worked in Python 2, where strings were bytestrings, not Unicode strings.

The “nice” thing about PHP here is that because it doesn’t understand Unicode natively at all, a simple port of the Python version just works.

<?php

function mysql_old_password($password)
{
    # build the old password in nr and nr2
    $nr = 1345345333;
    $add = 7;
    $nr2 = 0x12345671;

    # strip spaces and tabs, as MySQL does
    $password = str_replace([' ', "\t"], '', $password);

    for ($index = 0; $index < strlen($password); $index++) {
        $c = ord($password[$index]);

        $nr ^= ((($nr & 63) + $add) * $c) + ($nr << 8) & 0xFFFFFFFF;
        $nr2 = ($nr2 + (($nr2 << 8) ^ $nr)) & 0xFFFFFFFF;
        $add = ($add + $c) & 0xFFFFFFFF;
    }
    return sprintf("%08x%08x", $nr & 0x7FFFFFFF, $nr2 & 0x7FFFFFFF);
}

echo mysql_old_password($argv[1]) . "\n";

$ php --version
PHP 8.1.32 (cli) (built: May 21 2025 23:22:09) (NTS)
Copyright (c) The PHP Group
Zend Engine v4.1.32, Copyright (c) Zend Technologies
$ php old_password.php '123456'
565491d704013245
$ php old_password.php 'Pa$$ W0rD'
69d9eae853c7ddf5
$ php old_password.php 'Allô'
4ae9b3f6595c3f70

The well-known SQL replacement for a user-defined OLD_PASSWORD() function also breaks with non-7-bit-ASCII input because it uses LENGTH(), which counts bytes, to drive a loop over MID(), which extracts characters. Any attempt to use the function with a multibyte string will fail.
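The mismatch is easy to see from Python, since len() on a str counts characters while len() on its UTF-8 encoding counts bytes, which is what LENGTH() returns:

```python
# LENGTH() counts bytes; MID() indexes characters.
s = "Allô"
print(len(s.encode("utf8")))  # 5 bytes, what LENGTH() sees
print(len(s))                 # 4 characters, what MID() walks
```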

All of this is decades-old technology and hopefully you never have to encounter it. But if you’ve come across this blog post you are probably working with an ancient system, and I hope this helps you.

Steps for successfully migrating email for a domain

I recently moved 8 mailboxes across 4 domains from Rackspace to Runbox after the former jacked up their mailbox prices by 3x. Here are the steps I used with great success in hopes that it helps others.

This post assumes that you are moving providers for emails that use a custom domain. This won’t work if you’re coming from gmail.com, etc.

The whole process is much simpler if users don’t have access to their old or new email until the entire migration is complete. In today’s stupid “2FA by email” world that can be challenging, but if your users can handle an overnight downtime it’s the way to go.

  1. Coordinate a time with your users on when to do the move. Email hosting is moved a domain at a time so you need to get all users with addresses on the domain on-board with a downtime window.
  2. In your new mail provider, create new accounts (mailboxes) for receiving email. For each mailbox:
    • Use the same username as their old provider if possible.
    • Save a temporary copy of their new password (you did create secure passwords, right?) in a secure location like a password manager note.
    • Add any aliases used at the old provider.
  3. Send an email to account holders with their new server information and tell them to save the email to their computer for later. Include things like:
    • A reminder of the agreed-upon migration time and that they won’t have access to their old email during the downtime.
    • Ask them to delete any spam in their spam folder, delete any mail in their Deleted Messages folder, and to empty the trash. This will save time later in the sync.
    • The webmail address for their new provider.
    • The username of their new mailbox.
    • How you intend to get them their new password (a text with a privnote link, something they already know, etc.).
  4. One day before the agreed-upon time, you need to update DNS for the domain in preparation.
    • Reduce the MX TTL on the domain to 30m or 1h, as low as you can get it.
    • Add DKIM records from the new hosting provider. You should keep the records from your existing provider for now.
    • Update the SPF record to include your new provider. You should keep the include for your existing provider for now.
  5. At the agreed-upon time, start the migration.
    1. Reset passwords on the old provider to prevent users from altering their email. Save this password in a secure location.
    2. Update the domain DNS and change the MX records to the new provider, keeping the short TTL for now. Be sure to remove any MX records for the old provider.
    3. Wait the full TTL time on the MX record before proceeding. This ensures (or attempts to, it is DNS) that all new mail will come into the new provider before continuing.
    4. Confirm mail is delivered at new accounts by sending an email to one of the mailboxes and confirming it is delivered. Care must be taken where you send this email from. If you send it from a mailbox at the same provider as the new mailboxes, it may shortcut the MX record. Best to send the confirmation email from one of the big providers, like Gmail.
    5. Sync the email from the old provider. Most email providers provide a way to do this, such as Retrieve on Runbox or Easy Switch from Proton.
      • All of these will be doing some method of IMAP sync under the covers, and for this they need the password from the old provider which is why you saved it.
      • Some providers require this be done from within the individual mailbox rather than at the administrative account. So you may need to log into the user’s new mailbox to initiate the sync.
      • This can take hours, possibly more than a day, depending on the number of emails, the size of the mailbox, if the old provider throttles the connection from the new provider, and more. This is the most nerve wracking part — the waiting.
    6. Confirm the sync completed successfully. Hopefully the provider provides a status update (Runbox does). If there were errors, just run it again. All decent IMAP sync tools are idempotent.
  6. Update DNS one more time.
    • Remove old DKIM records from the old provider.
    • Update the SPF record to remove the include from the old provider.
    • Increase the MX TTL to something longer, like a day.
  7. Notify users that their mailboxes are ready by the agreed-upon mechanism. Consider:
    • Suggesting that they change their passwords (presumably via the web interface) before adding the email to their desktop and mobile clients.
    • Telling them to delete the old account and add the new one on their desktop and mobile clients. This is more robust than hoping the client will figure things out if they just update the connection details.
    • Reminding them to use IMAP, not POP.
  8. If you want to go above and beyond for your users, send them an email with some details about their new provider. You can include information like how to report spam, add addresses to the allow or block lists, and add server-side filtering rules.
  9. You can now delete the old mailboxes. I waited a week after the migration and checked in with my users before doing so.
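To make the DNS changes in steps 4 and 6 concrete, the transitional records during the migration window might look something like the sketch below. All names, selectors, and providers here are placeholders, not real values:

```text
; during migration: short MX TTL, both providers in SPF, both DKIM selectors
example.com.                 1800 IN MX    10 mx.newprovider.example.
example.com.                 1800 IN TXT   "v=spf1 include:spf.newprovider.example include:spf.oldprovider.example ~all"
new._domainkey.example.com.  3600 IN CNAME new.dkim.newprovider.example.
old._domainkey.example.com.  3600 IN TXT   "v=DKIM1; k=rsa; p=..."
```

After the sync completes, the old provider's SPF include and DKIM selector come out and the MX TTL goes back up.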

That’s the happy path, let’s talk about the less-than-happy path where users can’t be without new email during the downtime window. In this scenario you give users access to their new mailboxes immediately after changing passwords at their old one.

The problem comes during the sync as emails are brought over into their new mailbox. If the user deletes folders that are being synced, the sync can stop with an error and you’ll have to restart it. If they delete emails that were already synced and you have to start the sync again, those will be synced a second time and take more time to complete. Then after all of that you have to look at the sync report and try to understand whether everything came over successfully. It’s certainly doable but can be much more difficult.

Moving email providers can be daunting but it’s eminently doable with some care and a little patience from your users.

Plex and Garmin watches: SubMusic for Plex

The music support available on Garmin watches is horribly lacking: Spotify, Deezer, and Amazon Music, but no Tidal or Qobuz. But if you have your own music library in Plex, there’s SubMusic for Plex, which makes syncing music to your watch (mostly) a breeze.

It’s a paid app and I know a lot of people don’t love that, but I believe people should get paid for their work and I’d rather be giving money to an independent developer than Spotify or Amazon.

Tips

To connect SubMusic for Plex to your Plex system, the watch uses the Garmin Connect app on your phone. To do this the Connect app needs to have notifications enabled for the initial SubMusic for Plex pairing. This will let the watch trigger a notification that will bring up a web browser to link the app with Plex. You can disable notifications again after if you want.

By default, the next and previous buttons on your headphones are probably going to make the music jump forward or backward by 30 seconds instead of to the next or previous song. You can disable this by going into the Garmin ConnectIQ app on your phone -> Device -> My Music -> SubMusic for Plex -> Settings and disabling @SubMusic.Strings.skip_30s. IMHO, this is a terrible default for a music app but at least you can turn it off.

When SubMusic for Plex downloads music to your watch it relies on Plex to transcode the music to a format the watch supports. You can see the transcoding, playlist fetches, and more during the sync by going to Plex -> Settings -> Console. This can be useful if you are troubleshooting problems.

And finally, the developer is really responsive if you are having issues. They worked with me to fix a bug in the app where the watch wasn’t correctly accepting successfully-transcoded files from Plex.

Casey’s 2025 Playlist

Another year, another playlist — 14 years and still going!

I collect songs that speak to me throughout the year into a playlist, then massage them into an order that feels good. I think of it as an audio journal for the year. These started as sharing songs with friends at an annual holiday party in early December, so the playlist’s “music year” runs from December through November. Take that, water year!

Prior lists get a lot of playtime and remind me of good memories, trips, and friends.

  1. Ozdust Duet – The Wicked Orchestra
  2. Stutter – Marianas Trench
  3. Suddenly I See – KT Tunstall
  4. Music for a Sushi Restaurant – Harry Styles
  5. About Damn Time – Lizzo
  6. Booty – Meghan Trainor
  7. Special – Lizzo
  8. Head & Heart – Joel Corry, MNEK
  9. Tahitian Skies – Caro Emerald
  10. Who You Are – Jessie J
  11. All I Know So Far – Pink
  12. I’m Not That Girl – Cynthia Erivo
  13. Love, Maybe – MeloMance
  14. Unhoused Residents – Carlos Rafael Rivera
  15. Moonlit Dreams Reprise – Marc Enfroy

You can listen to it on Qobuz, Spotify, Tidal, Apple Music (thanks to Casey C for making this one!) or presumably recreate it in another streaming service of your choice. I recommend listening to it in order for the full experience.

You can find prior playlists under the archaic mix cd tag. Enjoy!

Queer Books for Pride – 2025 Edition

What better way to celebrate Pride for readers than sharing a book a day that features queers?

This year’s #queerBooksForPride has 30 books, mostly but not exclusively scifi/fantasy because that’s my jam, that include a queer protagonist or a strong queer secondary character, or where queers are just a natural, accepted part of the world — because that’s the world that I want to live in.

  1. The Watchmaker of Filigree Street – Natasha Pulley
  2. Gideon the Ninth – Tamsyn Muir
  3. Space Opera – Catherynne M. Valente
  4. Welcome to Forever – Nathan Tavares
  5. Silver in the Wood – Emily Tesh
  6. A Sorceress Comes to Call – T. Kingfisher
  7. Village Fool – Nathan Burgoine
  8. Teller of Small Fortunes – Julie Leong
  9. Foxes in Love – Toivo Kaartinen
  10. Godkiller – Hannah Kaner
  11. Darkness Outside Us – Eliot Schrefer
  12. Floating Hotel – Grace Curtis
  13. Witness for the Dead – Katherine Addison
  14. The Jasmine Throne – Tasha Suri
  15. The Emperor and the Endless Palace – Justinian Huang
  16. MicroSFF – O. Westin
  17. Somewhere Beyond the Sea – T.J. Klune
  18. Machine – Elizabeth Bear
  19. The Empress of Salt and Fortune – Nghi Vo
  20. Some Desperate Glory – Emily Tesh
  21. The Black Coast – Mike Brooks
  22. Ink Blood Sister Scribe – Emma Törzs
  23. Last Night in Nuuk – Niviaq Korneliussen
  24. The Summer Prince – Alaya Dawn Johnson
  25. The Space Between Worlds – Micaiah Johnson
  26. Lord Mouse – Mason Thomas
  27. The Water That Falls on You from Nowhere – John Chu
  28. City of Stairs – Robert Jackson Bennett
  29. The Enchantment Emporium – Tanya Huff
  30. A Memory Called Empire – Arkady Martine

This is the third year I’ve done this. And because each year includes new books there are more available at:

I hope you find something that makes you laugh, maybe something that makes you cry, but most importantly something that makes you feel loved, included, and seen.

Happy Pride! 🏳️‍🌈

Faster clamav scans for archives

clamav got much slower in 0.105, which we discovered at DProofreaders after upgrading from Ubuntu 20.04 to 24.04. It became so slow that the scans for our new content uploads and post-processing artifacts — all zip files — timed out, resulting in failed uploads since the AV check is a gate. No amount of futzing with configuration options, RAM disks, --fdpass, and other things would make this faster.

clamav has a “multiscan” mode that will make the clamd service scan multiple files concurrently, which is great for modern systems with multiple processors. Except that mode does not work with files inside archives.

We solved this for our needs by creating a wrapper script that detects if the scanned file is a zip file and if so, extracts a copy and scans it with --multiscan instead. We went one step further and if the zip file contains other top-level zip files, or epubs which are effectively zip files, we extract those as well. With this we’re able to successfully scan large archives within our system timeout.

However, if an upload does still time out, it’s likely to succeed without a timeout the second time the user attempts to upload the file. This is because clamav caches the results of the last 65536 files (see CacheSize) based on the file’s hash and if it has passed before, clamav doesn’t need to scan it again. In this way content scans of extracted archives can make incremental progress on retries.

The following is an example for how this might be done:

#!/usr/bin/bash
# To use multiple threads to scan archives, we need to extract them
# first. We also extract included epubs to get them parallelized too.

if [ "$1" = "--" ]; then
    shift
fi

if [ $# -ne 1 ]; then
    echo "Script only takes a single argument: filename";
    exit 255
fi

FILENAME=$1

if [ ! -f "$FILENAME" ]; then
    echo "Filename '$FILENAME' is not a valid filename";
    exit 255
fi

TEMPDIR=$(mktemp --dry-run)
if ! unzip -q -d "$TEMPDIR" "$FILENAME" >/dev/null 2>&1; then
    # if it didn't extract, just try to scan it
    rm -rf "$TEMPDIR"
    echo "Error extracting file '$FILENAME', will try to just scan it"
    clamdscan --fdpass "$FILENAME"
    exit $?
fi

# now try to extract any top-level zip-compressed files into subfolders
# if extraction of one of them fails, keep the original for scanning
for ext in epub zip; do
    NUM_COMP=$(ls "$TEMPDIR" | grep -c -e "$ext\$")
    if [ "$NUM_COMP" -gt 0 ]; then
        for compfile in "$TEMPDIR"/*"$ext"; do
            COMPDIR="$TEMPDIR/$(basename "$compfile")-extract"
            mkdir "$COMPDIR"
            if ! unzip -q -d "$COMPDIR" "$compfile"; then
                rm -rf "$COMPDIR"
            else
                rm "$compfile"
            fi
        done
    fi
done

clamdscan --fdpass --multiscan "$TEMPDIR"
RESULT=$?

rm -rf "$TEMPDIR"

exit $RESULT

Note that decompressing archives can be fraught with problems like zip bombs, so you must account for them by, for example, extracting to a fixed-size temporary partition, limiting the extraction runtime, etc.

Cooling the house with outside, filtered air

My husband Daniel and I live in the US Pacific Northwest which in the summer has warm days but cool nights. Indeed many places do not have AC. But I have terrible allergies so just opening up the windows at night to cool the house off makes me miserable.

So last year my brilliant husband created this filtered air box from a Clean Air Kit and rigged it into the living room window so we can bring filtered air in when it’s cooler outside. I’m told there are two fancy HVAC terms for this: “economizer” and “energy recovery ventilator” (ERV).

But this year he got cleverer.

After buying some 3Reality plugs and a temperature sensor he now has it wired up to our Home Assistant. The temperature sensor is outside above the fan. We have another one inside the living room (part of our Ecobee). When the temperature inside is above a set temp, and the temp outside is 2 degrees cooler than inside for more than 5 minutes, the fan turns on. When the temp inside is cool enough or outside warms up, it turns off.

But the past few years we have to worry about wildfire smoke here in the PNW. So in addition to the temp, he’s incorporated the air quality index from the PurpleAir sensor in the neighborhood. If the air quality is poor the fan won’t come on. If the fan is on and air quality gets too bad, the fan turns off.
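As a sketch, the fan-on rule in Home Assistant might look something like the automation below. All entity names and thresholds here are made up; ours differ:

```yaml
# Hypothetical automation: fan on when it's cooler outside and the air is clean
automation:
  - alias: "Window fan on"
    trigger:
      - platform: template
        value_template: >
          {{ states('sensor.outdoor_temp') | float(0)
             < states('sensor.living_room_temp') | float(0) - 2 }}
        for: "00:05:00"
    condition:
      - condition: numeric_state
        entity_id: sensor.living_room_temp
        above: 74
      - condition: numeric_state
        entity_id: sensor.purpleair_aqi
        below: 50
    action:
      - service: switch.turn_on
        target:
          entity_id: switch.window_fan
```

A mirror-image automation turns the fan off when any of those conditions stops holding.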

And this is all working brilliantly. The house has big West facing windows, but the porch is on the East. Around 5p the air on the porch is cooler than the house so the fan turns on to start cooling the house down. The fan will help cool the house off at night without letting it get too cold. All without turning on the AC or pulling warm air or wildfire smoke inside.

This is the kind of home automation that feels like you’re living in the future, honestly.

This was originally posted to Mastodon on 2025-05-29.

PHP tunings in the age of AI scrapers

Distributed Proofreaders (pgdp.net) is a PHP-based site that has been around since 2000. And while we’ve had our fair share of high traffic days — including a Slashdotting and a huge influx during the start of the pandemic — the increased traffic from scrapers over the past year has been unbelievable. These are some things we’ve done that have drastically improved our ability to serve traffic without changing our serving hardware.

These are listed from most to least impactful.

Decrease your surface area

The most important thing you can do is to decrease the surface area for crawlers to see your content. Whenever possible, put things behind a login. The vast, vast majority of crawlers are just blindly following links and not targeted attacks that will bother creating an account — this is an easy first step.

If you have public forums in phpBB, for instance, lock those down to logged-in users. This won’t prevent crawlers from requesting the pages but they’ll get a less useful response with more links to crawl.

Apache: php-fpm and the event MPM

The second most important thing you can do, if you are using Apache, is to move to php-fpm and the event MPM. This combination has allowed us to serve 100x the amount of concurrent traffic with half the memory. It’s the easiest way to do more with less.

The default way PHP is wired into Apache is mod_php, which requires the prefork MPM because PHP is not thread safe. This is a very resource-intensive, particularly memory-heavy, way to serve PHP content because each worker allocates up to the maximum amount of memory for PHP. And every Apache worker does double duty, serving both dynamic PHP content and static content.

The event MPM allows Apache to serve static content directly and efficiently and only PHP pages are routed to php-fpm.
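On a Debian/Ubuntu system the wiring looks roughly like this; the module names and socket path depend on your PHP version and distro, so treat them as assumptions to adapt:

```apache
# Disable mod_php/prefork, enable event + proxy_fcgi:
#   a2dismod php8.1 mpm_prefork && a2enmod mpm_event proxy_fcgi
# Then route .php requests to the php-fpm socket:
<FilesMatch "\.php$">
    SetHandler "proxy:unix:/run/php/php8.1-fpm.sock|fcgi://localhost"
</FilesMatch>
```

Static files never touch php-fpm with this setup; only .php requests are proxied.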

PHP OpCache

The PHP OpCache is not new (it has been built into PHP since 5.5) and caches compiled PHP bytecode in memory, saving CPU as well as disk IO. You want to allocate enough RAM to the OpCache to get a very high hit ratio. It probably requires less RAM than you think to have a big impact: pgdp.net uses just 128MB and has a 100% hit ratio. The opcache-gui tool is a really great way to see the status of your opcache.
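A starting point for php.ini might look like the following; the values are only examples (the 128MB figure is the one number from our experience above) and opcache-gui will tell you whether they are sufficient for your site:

```ini
; opcache tuning starting points - adjust based on your hit ratio
opcache.enable=1
opcache.memory_consumption=128      ; MB of shared bytecode cache
opcache.interned_strings_buffer=16  ; MB for deduplicated strings
opcache.max_accelerated_files=10000 ; should exceed your PHP file count
```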

You might be tempted to futz with the JIT settings introduced in PHP 8.0. While we’ve enabled JIT we haven’t seen a huge performance benefit to this for our use-cases so don’t waste too much time on this at the beginning.

Use static 403 and 404 pages

Crawlers are stupid and will happily load URLs they don’t have access to and will get a 4xx error. And if you’ve created fancy 403 and 404 pages that use dynamic content those could bring your site to its knees. Make sure your 403 and 404 pages are as low-resource as possible, ideally completely static pages.
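In Apache this can be as simple as pointing ErrorDocument at flat HTML files (the paths here are examples):

```apache
# Serve static error pages; no PHP involved
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
```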

Tell crawlers to fuck off

Today’s crawlers do not respect robots.txt. Many of them don’t even present a proper User Agent. But for those that do present a User Agent you can tell them to directly fuck off by an Apache rule like the following:

RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^facebookexternalhit.*
RewriteRule . - [L,F]

You can chain multiple of them together for different crawlers like so:

RewriteCond %{HTTP_USER_AGENT} ^facebookexternalhit.* [OR]
RewriteCond %{HTTP_USER_AGENT} ^meta-externalagent.*
RewriteRule . - [L,F]

This returns a 403 Forbidden HTTP response, which stops the request at your web server so it doesn’t even get to PHP. Note that a 403 generally won’t stop a crawler from trying again and again for other pages.

Malicious crawlers may present changing user agents and you might have to get creative with RewriteConds against HTTP_USER_AGENT, HTTP_REFERER, and others. This whackamole arms race is exhausting. Welcome to the internet in 2025.

MediaWiki caching

If you are running a MediaWiki instance, you want to ensure $wgMainCacheType = CACHE_ACCEL; to use PHP’s in-memory accelerator cache, APCu (or CACHE_MEMCACHED to use memcached), so it’s not hitting either your filesystem or the database for oft-used resources. You might want to set $wgSessionCacheType = CACHE_DB; as well to ensure sessions are persisted. This should be the calculated value when you change $wgMainCacheType, but I find it better to be explicit.

You also want to use $wgUseFileCache = true; and set $wgFileCacheDirectory to something. This causes MediaWiki to cache page content to disk for non-authenticated users — which is all of the crawlers — which can decrease load on your database and PHP. This shouldn’t require a lot of disk space and can have a big impact.
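Put together, the LocalSettings.php fragment might look like this sketch; the cache directory is an example, and anywhere writable by the web server works:

```php
// Caching fragment for LocalSettings.php (sketch)
$wgMainCacheType = CACHE_ACCEL;      // APCu / PHP shared-memory cache
$wgSessionCacheType = CACHE_DB;      // persist sessions explicitly
$wgUseFileCache = true;              // cache rendered pages for anonymous users
$wgFileCacheDirectory = "$IP/cache"; // $IP is MediaWiki's install path
```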

Reducing MediaWiki dynamic content

The MediaWiki file cache is great for serving page contents, which are largely static. Generating dynamic content — like edit history, linked pages, etc — can be very resource intensive though and crawlers are often blindly following links that do just that. The MediaWiki entry Handling web crawlers has some great suggestions for mitigating that.

We installed the Lockdown extension and configured it thusly to prevent unauthenticated users from accessing these resource-intensive pages:

wfLoadExtension( 'Lockdown' );
$wgSpecialPageLockdown['Specialpages'] = [ 'user' ];
$wgSpecialPageLockdown['Activeusers'] = [ 'user' ];
$wgSpecialPageLockdown['Listusers'] = [ 'user' ];
$wgSpecialPageLockdown['Recentchangeslinked'] = [ 'user' ];
$wgSpecialPageLockdown['Recentchanges'] = [ 'user' ];
$wgActionLockdown['history'] = [ 'user' ];
$wgActionLockdown['edit'] = [ 'user' ];

Application and database optimizations

You might be tempted to focus on application and database optimizations. Those are always good things to improve performance for your users but focus first on the pages that are accessible to unauthenticated users because that’s probably the biggest percentage of your traffic in today’s internet.

Unfortunately if you’re down at this level it’s probably going to be a lot of work for a little return for your real users, good luck.

Fitness Tribal Knowledge

In December I started posting some bits of “tribal fitness knowledge” to my account on Mastodon — things that I’ve learned over the past 24.5 years of working out. This blog post is all of them collected together.

My hope is that maybe someone will learn something new that helps them along their fitness journey or makes some of it less scary. To be very clear I am not a certified personal trainer, dietician, or other health professional which is why I’m not going to be talking much about any of those things.

My creds, such as they are, are just from working out very consistently since the summer of 2000. You learn a lot of things doing something for a couple of decades. You can also read more about my fitness journey that I wrote in 2018.

Working out: getting started

  • The most important thing you can do in your workout is to actually do it. Getting started, and being consistent at it — regardless of what it is — is the hardest and yet the most important part. It’s ok to start small. If all you have the energy and spoons for on a given day is to go for a walk around the block, that’s still something. This gets easier (yay forming habits) but is never easy.
  • The best time to start working out is now. It’s tempting to wait until some other milestone — after the holidays, the start of the year, after that vacation, etc — but there will always be something. It doesn’t have to be getting a gym membership and hiring a personal trainer (although it can). Get out for a walk. Put on some music and wiggle your butt in the privacy of your own home. Starting is hard, why not start now?
  • Figure out what your goals are and write them down. Maybe you want to be big and beefy. Maybe you simply want to move more. Maybe you want to lose a little weight. There are no wrong answers but your goals should help guide your direction. If you want to focus on moving more, maybe going for walks is simpler than getting a gym membership. It’s certainly a cheaper and easier way to start.
  • Sometimes workout goals conflict and you can’t achieve both at the same time. You can’t really bulk to gain weight while also working to get definition (you have to do those in series — bulk then cut — not in parallel). Most of us can’t regularly run long distances, for example train for and run marathons, while also adding lots of muscle mass. When thinking about your goals, make sure they align.
  • Figure out if you’re a class person or not. For some people having a scheduled, group class can be very motivating and can provide some accountability and structure. For others (like me) it’s intimidating and can be uncomfortable (also: eww, people!). If you don’t know which you are, try out a class! They’re often a good way to get started and can be great for many people.
  • You could be in the most consistent workout habit possible — at some point you’ll fall out of it. Maybe it’s an injury. Maybe it’s a major life event (new kid, death in the family, global pandemic). If you’re like me you’ll invariably beat yourself up over it: you were doing so well! But that’s life, it happens. Give yourself time and grace to get through it. Rest. Then challenge yourself to get back into your old routine, or into a new one.

Workout progress

  • You’re only competing with yourself: cheer everyone on. One of the things I love most about working out is that it’s an individual sport — it’s you vs you. And the important corollary is that we get to be positive and encouraging to everyone on their fitness journey. Just getting started? Awesome, you can do it! Been lifting a while and hitting some personal records (PRs)? Woohoo, way to go! This time of the year there are a lot of new people in the gyms; be kind to them.
  • All bodies are built differently. It’s cliché but true. The musculature illustration you see in textbooks is so generic it’s laughable. Beyond the physical structure, how your muscles are shaped, grow, and look under fat, skin, etc varies person to person. Have goals, have something to aspire to, but don’t try to fit into a mold that won’t work with your body. I’ll never be a 250lb muscle daddy with square shoulders — that’s just not my lot in life.
  • The limiting factor for your workouts will shift over time. I find mine moving between cardio, energy, and the actual muscles. When you hit your “wall” take a moment to figure out which one you hit. If it’s energy maybe you need to eat more before your workouts. If it’s cardio maybe you need to add some focused cardio days into the mix. We often just assume it’s the muscles we’re working out and miss signals for the others.
  • Don’t be surprised if you’re incredibly sore not the day of your workout, or even the next day, but two days later. I don’t know why but that’s been pretty consistent for me and I’ve heard it from others.
  • Take progress pics. You don’t ever have to share them with anyone, but having them can be helpful to feel like you’re getting somewhere. It’s like watching a small plant (or human) grow: it’s hard to see the changes day-to-day but it’s more obvious in pictures week to week or month to month. You’ll probably hate them. Taking good workout pics is a skill — and a stupid one — and loving your body is hard. Take them anyway; future you will appreciate it.

    I would be remiss in any post where I suggest taking selfies if I didn’t clearly state that I struggle with body image issues. All of us do. Especially gay men. Hate your body? I understand that feeling — you are not alone. It wasn’t until 2016 after working out for 16 years that I finally started to not hate every picture of me. And it wasn’t because of the workouts.

Gyms

  • Stepping into a gym the first few times can be scary as fuck. I’ve been going to gyms for over 20 years and I still get a little anxious when I start going to a new one until I settle into it. Figuring out where the equipment is, how busy it is at different times of the day, when to avoid the lunkheads, all of that just takes time and repetition. Cut yourself some slack while you figure it out. Consider going with a friend if that makes it less intimidating!
  • Wear whatever you feel comfortable in, although you may find specific clothes to be better for some activities (running with unconstrained bouncy bits can be painful). T-shirt and sweats? Short-shorts and muscle Ts? Pajama bottoms!? I’ve seen it all and it all works. Some humans dress up to the 9s before coming to the gym (looking at you fashion gays) but you don’t have to do your hair before you go. But for the love of pete please don’t bathe in cologne or perfume.
  • If you bring stuff into the gym (coat, etc) consider using a locker rather than schlepping it around with you. And if you use the lockers at the gym, put a lock on your locker. I can’t tell you the number of times I’ve known gym goers who have had stuff stolen out of their lockers. Wallets. Phones. Watches. Jackets. Sadly, just because someone needs a membership to get in the door doesn’t mean they aren’t going to steal your stuff.
  • Unless you’re really diligent about not being on your phone, leave it in a locked locker or car. It’s too easy to get stuck on your phone between exercises, which wastes your workout time and the time of whoever wants to use the equipment next. This is annoying AF for others. If you really want your phone for music or to track workouts, use a timer for your rest breaks. As retro as it sounds, consider a pad and pen if you want to track your activity.
  • Here are some tips to avoid violating the implied social contract at the gym:
    • If it’s busy, don’t be an ass and hog two (or more!) different machines or benches. If you’re supersetting, try to find ones that only use one piece of equipment or be willing to let others work in with you.
    • Don’t hog 6 different pairs of dumbbells. Take a pair, maybe even 3 pairs, if you’re supersetting and working out different stuff, but more is just being rude.
    • Dropping weights (to keep from hurting yourself) should be the exception, not the rule. In general if you can’t safely manage them they’re probably too heavy for you.
    • Don’t leave your weights around. Put them back where they belong (not necessarily where you found them) when you’re done.
    • Don’t take phone calls at the gym outside of the lobby, much less while sitting on a piece of equipment.
    • Never take pictures of someone without their explicit permission.

Exercises

  • If you’re new to working out / lifting, I generally recommend weight machines as a good starting point. They are usually isolation exercises and have pictures to guide you on the movements. And while it’s certainly possible to injure yourself on them, I feel it’s less likely than using dumbbells/barbells if you’re not sure of how to use them safely.
  • Have some idea of what you want to do at the gym before you get there. Walking in the door and then figuring it out can lead to paralysis. At the beginning I recommend a well-defined set of things. Later that can morph to “arms and back” or “legs and core” or some other generalized set that makes sense for you. Being flexible on the order of things, though, can help if the gym is busy and the piece of equipment you want is being used.
  • Warm up before you lift! Not doing this probably contributed to me rupturing my distal bicep tendon in early 2019 — would not recommend. Before arm/chest day I do some arm raises, opens, rotations, shoulder shrugs, etc. For leg day I do some high-knees, ass kicks, jumping jacks, and squats. It doesn’t have to be a lot, but get those muscles warm and joints moving before you put them under load.
  • Stretch after your workout (not before). You want to stretch after your muscles are warmed up. Stretching is good for increasing (and maintaining) range of motion and overall joint happiness. My shoulder workouts have been better (read: shoulder has been less bitchy) after incorporating some stretching into my upper-body days — but in general I’m not good at remembering to do it.
  • You might be surprised at how stability exercises can help your lifting, both in safety (supporting related joints) and in power. My leg game really leveled up when I started adding regular stability exercises to my leg days. Squats on the bosu ball, single-leg balances, etc. And my shoulder workouts have been better (read: less bitchy shoulders) after incorporating high plank circles, plank taps, and similar.
  • Give yourself rest days. Most of our bodies weren’t meant to go-go-go nonstop. Taking a day off from exercising, particularly lifting, can help you recover for the next time. This can oddly get hard after you’ve developed a solid habit, but resting is part and parcel of gains. Going non-stop can also lead to injuries which will just set you back further. If a day off sounds like too much, go for a walk. Do some easy stretching. Just give your body a break.
  • Seeing what exercises others are doing can be a great way to find new ones to freshen up your routine! But don’t assume it’s necessarily the right/safe way to do it. I’ve seen people using the cable or fly machine and hyperextending their shoulders (ouch). Most gyms have personal trainers who are often happy to answer basic questions if you ask. As are many gymgoers that look like they know what they’re doing (they might not though, it’s a tossup).

Casey’s 2024 Playlist

Back in 2011 I started creating a yearly playlist of songs that spoke to me (back then we called them “mix cds”) and sharing them with friends. Here I am 13 years later and still at it.

I collect songs throughout the year, adding some, removing some, and then I put them in an order that speaks to me. I think of it as an audio journal for the year. I often go back and listen to them and am reminded of the year I met my husband, trips with my BFF, and more.

I was as surprised as anyone when three(!) of the songs were from a single artist — Maren Morris — but I’m not sad about it.

  1. Letter from Yokosuka – Nujabes
  2. Fly Me to the Moon – Nichelle Nichols
  3. A Thousand Miles – David Archuleta
  4. Turbulence – Pink
  5. Detour – Maren Morris
  6. Good Friends – Maren Morris
  7. A Song for Everything – Maren Morris
  8. All American Queen – Ben Platt
  9. Mama Wanna Mambo – Meghan Trainor feat. Natti Natasha & Arturo Sandoval
  10. All I Know So Far – Pink
  11. cut – Tori Kelly
  12. Truth or Dare – Marianas Trench
  13. For Her – The Chicks
  14. #88 – Lo-Fang
  15. Ice Cream – Sarah McLachlan
  16. Lost Woods – Patti Rudisill

You can listen to it on Spotify or presumably recreate it in another streaming service of your choice. The list has been curated to be played in order, which may require a paid account on some services.

You can find prior playlists under the archaic mix cd tag. Enjoy!