My computer hoard, or why I can't throw away stuff

󰃭 2025-02-03 (Updated: 2025-12-03) | 🗏 2614 words

'I'm a dragon so I must hoard' meets 'I'm neurodivergent so I have weird special interests'

Disclaimer

Okay, this blogpost will be mostly personal rambling, half-serious stuff. Again, if you’re here for strictly technical blogposts, you’ll have to skip this one too…

Expect: rants about some specific hardware manufacturers, personal viewpoints, hot takes…

IMPORTANT NOTE: This blogpost exposes my own viewpoint about all this, and isn’t related or representative of my employer(s), past, present or future.

Oh, and happy new year!

A bit of history

Born in the previous century, raised among electronics

So, I’m not going to give too many details, but since I came into this world, I’ve swum among computers and various electronic devices. As a child, I got PC parts as toys. I had my first personal computer at 13, and my first real computer around 16. In the meantime, I was helping people I knew repair old computers and build new ones from the carcasses of others.

Later, during my computer science studies, I realized that MANY people working in computer science have no idea how a computer works on the inside, nor have ever seen the insides of a desktop or a laptop.

It’s pretty scary if you want my opinion.

Give me a PC with a processor at least as recent as an i386, and I should be able to use it, disassemble and reassemble it, and install an OS.

That is not a lot. Like, seriously, there’s nothing magic about plugging in an IDE cable, or putting a floppy disk in the drive. There’s nothing magic about burning a Debian ISO to a CD-ROM and installing it in “text” mode (read: ncurses interface). It’s not that hard to know the difference between architectures and sockets, or what a northbridge/southbridge is, and I don’t understand why people get triggered when I talk about AGP (Accelerated Graphics Port).

The latter is the ancestor of PCIe, specialized for graphics cards.

To me, what I can do is nothing compared to people who handle truly old hardware (mainframes, old Apple stuff, microcomputers, C64 and alike…) or more exotic architectures.

However, the fact that I still respect old hardware seems to be especially rare among my peers. I’ve lost count of the people I know who would throw their whole rig in the trash if they had the chance to trade it for a newer one for free.

I mean, I’m attached to my computers. I feel bad when they are broken. I feel bad when I see someone physically abuse a laptop or a computer.

I still remember the names of all the computers I’ve handled, and I keep a list of all my personal computers, along with their hardware configurations and their quirks.

Maybe I’m neurodivergent?

Until I got to university, I always had a computer that was obsolete for its time. I did my whole school curriculum before university on either a Pentium III (1999-2003) or a Pentium 4 (2000-2008).

Oh, and no smartphone! Only a brick-style phone like the Nokia 3310. Indestructible. It even had a clone of Worms, with 2x2 characters and two types of weapons!

Imagine that: while everyone my age was playing Dofus or watching cringe YouTubers, I was stuck on a Linux rig that couldn’t even run Flash! (No SSE3 instruction set.)

===

Anyway, all this to say that I’ve been used to running older hardware since childhood…

… then I discovered the joy of having a more recent rig, a somewhat acceptable personal computer.

Prisoner of its own frame: Topaze

So, let’s fast-forward to that infamous laptop, the one I kept until last year. I’d been using it for 8 years (2016-2024). It was the first time I had real access to a modern computer…

… I hate it as much as I appreciate it.

It is an HP laptop, featuring an Intel 4th-gen i7 processor, with 4 DDR3 SODIMM slots and a soldered Nvidia Quadro K1100M.

This is a laptop, and by definition a devil in a box: I thought that opening it would make it impossible to close again, given how tightly everything must be packed to fit as much as a traditional desktop holds.

I opened it for the first time in 2023. Out of desperation. I wanted to try and upgrade it, but then I realized a few things:

  • The Nvidia card being soldered, it was impossible to remove or upgrade it
  • The screen output, as well as the DisplayPort/Thunderbolt ports, were hardwired to go through this card, meaning I couldn’t disable it
  • The CPU was buried under so much stuff that I was sure removing it would make it impossible to put back
  • 2 of the memory slots were under the keyboard, requiring nearly a full disassembly to reach them

All in all, I hated the concept of this laptop, stuck in a single configuration, impossible to fully upgrade.

Yes, I did upgrade the Wi-Fi card, added an SSD, and added 3 sticks of RAM, but still.

As a wise man once said: “Fuck you Nvidia”

That Nvidia card is the main reason why I wanted to make this blogpost: planned obsolescence.

This Quadro K1100M is last supported by the 418.XX proprietary drivers, which in reality means that the last truly supported version is 390.XX.

For some reason, the 470.XX drivers still worked, but Debian only packaged the “Tesla” variant of them. Anyway, that driver series has been End-of-Life since the end of 2024, which means this graphics card no longer gets any driver updates.

Have you ever heard about Nouveau? 🙂 And what about the open-source Nvidia drivers?

Well, let’s talk quickly about Nouveau.

In itself, I have nothing against the project; on the contrary, I would love for it to have more volunteers, and more people helping make everything work. Sadly, they don’t have enough people.

One of the main issues I have with this laptop is that I can’t disable the graphics card, so I wanted something to put it in the lowest possible energy state. Sadly, Nouveau doesn’t support this card properly yet. Furthermore, it requires you to extract a proprietary firmware blob from the 325.15 drivers, which is… suboptimal, but at least it’s something.
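For what it’s worth, Nouveau does expose a runtime power-management knob as a module option. A minimal sketch, assuming a standard modprobe.d setup (the filename is my choice, and whether the card actually reaches a low-power state depends on the GPU generation and how well Nouveau supports it):

```
# /etc/modprobe.d/nouveau.conf  (hypothetical filename)
# Ask nouveau for runtime power management:
#   -1 = auto (default), 0 = off, 1 = force on
options nouveau runpm=1
```

On Optimus-style laptops, you can then peek at the discrete card’s state through vga_switcheroo: `cat /sys/kernel/debug/vgaswitcheroo/switch` (as root, with debugfs mounted).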

Again, kudos to the project, but they are understaffed. Having to recreate the code for these firmwares from scratch, when the original company could simply open-source firmware for GPUs that are 10+ years old… Yeah, that sucks.

===

The open-source drivers from Nvidia only start at the 515.XX series, so it’s basically a “fuck you” to older cards, as always.

Why am I especially angry at Nvidia? There are other GPU manufacturers

Yup, there are.

At least, AMD/ATI GPUs work out of the box, even on older hardware.

But I can’t switch that card out of my laptop, so I’m stuck with it. Even if I don’t want to use it, I must use it.

Then just throw away the laptop?

That, my friends, is the exact point of this post.

Why should I throw away my computers if they work?

All is in the title…

I have a few perfectly working computers. They are, most of the time, perfectly fine for what I’m usually doing:

  • Indie games, usually small ones
  • Media playback, whether video or music
  • TTY stuff
  • Programming stuff

For example, that laptop I mentioned earlier still works perfectly fine for all of these tasks, which is even more frustrating considering that I can’t have a “standard” setup anymore.

Corporate greed, planned obsolescence.

The longer hardware exists, the harder it becomes to keep it running.

Nvidia is a perfect example of this, as you won’t get updates for the drivers for more than, like, 5 years.

I’m not talking about FPGAs (Especially Xilinx, thank you for the 2 fancy paper presses) or mobile phones, as both of these are even worse, but… yeah.

Why would you buy some new stuff if the one you have works perfectly fine? So, if the hardware doesn’t fail before, let’s make the software impossible to run, right?

Special section for HP, fuck you too

So, I just want to dedicate this paragraph to insulting HP for one of the worst things a company can possibly do: deleting content about their previous models.

If you try to look online for official documentation about my laptop, you won’t find it. As soon as they decide to drop support for a given model, they ERASE everything about it, including (but not limited to) manuals, FAQs, software…

So, when you need information about a particular thing from your specific model of computer, either it is archived somewhere else (Thank you, Internet Archive), or you’re fucked!

So, from the bottom of my heart, fuck you.

Intel Wi-Fi cards, and proprietary blobs

Graphics cards are not the only case of “blobs required to run”. In fact, Wi-Fi cards may be even worse.

To properly use Intel Wi-Fi cards, you need a proprietary blob (usually shipped under the name iwlwifi on Linux). Without it, sometimes the card works in degraded mode, sometimes it doesn’t work at all [citation needed].

Intel supports its cards for a given amount of time. For example, my old 7260 was supported until the end of 2022. They even removed the matching drivers from their website.

Again, what would it cost them to open source their firmware once they stop supporting it?

When hype is enough: “AI (derogatory)”

First, a quick disclaimer. This section talks about LLM/Generative AI, not about the whole research field of Artificial Intelligence.

I’m talking about all this hype around generating bad-but-great-looking content using your computer…

So, nowadays, nearly all PC part manufacturers are advertising AI-generation aids, software… everywhere.

Simple keyboards or mice shouldn’t need to embed dedicated “AI accelerators”… what the hell?

Anyway, this is a pretext to encourage everyone to throw away their previous rig for something suited for AI…

And even if you don’t want it, you’ll have it!

(I won’t throw bricks at Microsoft, they would actually use it to build a thermal power plant to feed their datacenter).

Legitimate concerns

There are, of course, legitimate reasons why you would want newer hardware. Don’t hesitate to send me your comments if you see some more!

Security

It would be pretty hard for manufacturers (or volunteers) to maintain all the hardware that has ever existed, even setting aside all capitalist concerns.

When we discover security flaws in old hardware, the answer is often “buy a more recent one to get an efficient patch”. That’s logical: once your bike’s inner tube is only made of patches, you may want to change it.

When I say “efficient”, I mean that security flaws can often be “mitigated” using microcode updates, but with a heavy performance impact.

If you disable said mitigations (as some people recommended doing for Spectre/Meltdown), you may not feel the need for newer hardware as soon as the people who keep them enabled and suffer the heavy performance drop.
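On Linux, this trade-off is controlled at boot time. A minimal sketch, assuming a GRUB-based setup (disabling mitigations is obviously at your own risk; you can first check what applies to your CPU under /sys/devices/system/cpu/vulnerabilities/):

```
# /etc/default/grub -- turn off all CPU vulnerability mitigations
GRUB_CMDLINE_LINUX_DEFAULT="quiet mitigations=off"

# Then regenerate the GRUB config and reboot:
#   update-grub        (Debian and derivatives)
#   grub2-mkconfig -o /boot/grub2/grub.cfg   (some other distros)
```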

Energy Efficiency

Here, I have two nice examples.

First, I have a Raspberry Pi 1B+ and a 4B, and even if they use nearly the same amount of energy, I can’t do the same things with both of them.

Second, I retrieved a “gaming” desktop from 2015, and I have my own desktop from 2022. The former draws more power with 4 graphical terminals running yes (180W) than my desktop running Elite: Dangerous on High settings at 1920x1080@75Hz (160W).

Hardware tends to become more energy-efficient as time passes [citation needed]. For your energy bill and the environment, it is interesting to replace old, inefficient computers with newer models. You still have to weigh the pros and cons of actually buying new items, given the environmental cost of their construction.

Performance / New capabilities

You wouldn’t run Doom Eternal on a Potato PC, right?

RIGHT?

Okay, sometimes an upgrade can be necessary. For example, sometimes you need newer features (better DirectX/OpenGL/Vulkan versions, hardware decoding, instruction sets…). Sometimes, you just need more power (it can be nice to finally run your favourite game at more than 15FPS, or to compile your new Rust project in less than 15 minutes).

A fancy paper press

My main problem isn’t that hardware sometimes dies. It isn’t even that hardware sometimes becomes simply obsolete for regular everyday use.

My main problem is that even legitimately good hardware becomes unusable just because some people decided to make it impossible to use, to force you to buy new, expensive gear.

I’m still a bit angry that my old laptop and my old Raspberry Pi, while still working fine, are doomed to become paper presses: the former due to its unusable, obsolete graphics card, the latter due to all modern distributions dropping support for such an old architecture.

All that rambling for nothing?

I guess this blogpost didn’t teach you anything? Like, everything I explained is obvious stuff that we’ve known about for years now.

Should we drop security to allow people to keep their old rigs? Absolutely not!

Should we stop making more efficient components? Of course not!

Should we forbid people to buy newer setups while their old ones still work? Not at all!

BUT, in an ideal world, we should be able to use our hardware until it dies or until we decide to stop using it, without getting limited by software.

Just stop using it for anything that requires internet communication?

That’s what some people recommended I do with Topaze, which would make it basically useless given what I work on.

For ancient hardware that is good for nothing modern, I understand that stance. I wouldn’t use my Raspberry Pi 1B rev1 to run VR or compile heavy Rust projects.

Obsolete architectures, missing instruction sets

In some not-so-far future, the i386 architecture will get dropped. Computers which aren’t 64-bit won’t get anything new running on them.

That is understandable; we had the same transition from 16-bit to 32-bit, and this one has been ongoing for years now.

It also means that software will move on, and old computers will remain on old software.

Again, that is understandable.

Yet…

What isn’t understandable is that companies would rather disappear than release old proprietary code that would allow hobbyists and fans to maintain these old rigs. What isn’t understandable is that people use this “you must move with the times” argument to make perfectly fine setups unusable after a few years, for no other reason than pecuniary gain.

Quick word about the gaming world - Nintendo’s case

As a closing word, I would like to talk a bit about Nintendo.

Among their many faults, they never release anything about their network services.

There are numerous fans who work on alternative implementations, especially network services (to name a few: wiimmfi, Pretendo, NetPass). They have to rely on reverse-engineering, and have only a limited amount of time for this.

If you reverse the services too aggressively while they are still active, you risk bans and bricking your expensive console. If you start too late, the services aren’t available anymore, and you’re stuck trying to understand the server with only a client.

What would it cost companies like Nintendo to release the sources of their decades-old game servers? Does it really outweigh the popularity they would gain?

I’m talking here about Nintendo, but LOTS of game developers, studios, console manufacturers… do this too.

All that knowledge lost due to corporate greed

El problema es el capitalismo.

Thanks

Thank you to proofreaders for their pertinent corrections:


