While moving from one nest to another (we're lemmings here; RP it a bit), I realized I still have all the computers I've ever bought or assembled, except for those that literally broke beyond any hope of repair.
Some are no longer used daily, but they all work, and being at a point in life where everything and anything in the nest needs to have a purpose or a function led me to think about what actually renders a computer useless or truly obsolete.
I was made even more aware of this as I'm in the market to assemble a new machine, and I'm seeing used ones - 3 or 4 years old - being sold at what could be considered store price, with specs capable of running newly released games.
Meanwhile, I'm looking at two LGA 775 motherboards I have and considering how hard I can push them before they spontaneously combust, to make some use of them, even if only as typewriters.
So, per the title, what makes a computer obsolete or simply unusable to you?
Addition
So I felt it necessary to update the post and list the main reasons surfacing for rendering a machine obsolete/unusable.
I have a 12yo MacBook Pro at home with some Linux installed that runs perfectly.
Still, I have absolutely no use for it. There's not much it can be used for other than browsing the web, and for that I have lighter devices with much better screens, so I prefer those anytime.
I have one of these and swapped the old HDD for an SSD, and it's like a brand new machine. It's still stuck on 10.13, but as a netbook and N64 emulator it's great.
When the space, time, or power it requires is no longer a good trade in exchange for the task it completes.
I live in Asia, so the space something physically takes up is often the biggest cost. The footprint of my house is like 25 square meters, so if I want to keep a bunch of older computers around, I'm going to need to rent a bigger house.
My time has also grown more expensive over the years.
Your argument is correct on its own, but it misses a lot of possible variations. Using locally generated renewables mostly defeats it, and not discarding the machine means less e-waste. If you're trying to be environmentally friendly, remember:
Reduce
Reuse
Recycle
In that order. Since I cannot reduce the number of computers I have already obtained, the next best thing is to reuse them. When that is no longer sensible, recycling is the third best thing.
If the cost of running a 2500k with a 790 eats up the cost of switching to a cheap newer one (not necessarily a brand new one) over some two to three years, then that's already a sign the old PC is dead.
How long are you running such a machine realistically, though?
I'm in Germany, so power is very expensive, and my single-person household costs me about 600€/year in power for everything. And I'm working from home, so about 100 W of base load for 8-10 h a day.
Unless a machine runs for really long stretches and does something significantly more than idling, power usage is almost irrelevant.
I have this "rule", which might be a bit dated, that 1 watt of continuous draw costs roughly 1€ a year (and it's only getting worse).
So over, say, 5 years (a somewhat reasonable lifespan today, I think), your 180-watt PC used 8 h/day would cost 300€ in power.
An older PC with a power hungry GPU could use 400 watts => 666€.
A ThinkPad (OK, it doesn't have a gaming GPU) would be more like 50€, and a good used one can be had for 200-300€.
You can also get a 4th- to 8th-gen Dell tower for 40-140€; add a cheap GPU and you'll have a Roblox or even Minecraft PC.
If you buy a brand new PC, then yes, it (most probably) won't be an economical investment as far as power use goes. But old PCs suck (draw) power, and one day it's probably economically viable to swap one for a more recent machine.
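The arithmetic behind the rule of thumb above can be sketched like this (a quick illustration; the 0.114 €/kWh price is an assumption chosen so that 1 W of continuous draw works out to roughly 1€ per year - at today's higher German prices the numbers scale up accordingly):

```python
def power_cost(watts, hours_per_day, years, eur_per_kwh=0.114):
    """Electricity cost of running a device over its lifetime, in euros."""
    kwh = watts * hours_per_day * 365 * years / 1000
    return kwh * eur_per_kwh

print(round(power_cost(180, 8, 5)))  # 180 W PC, 8 h/day, 5 years -> 300
print(round(power_cost(400, 8, 5)))  # old PC with power-hungry GPU -> 666
```

Those two results match the 300€ and 666€ figures above.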
Power usage is a massive one for me. I go by £1/W/Year for consumption of always-on devices. (I think it's more like £3/W/Year now!)
If the 20w new server can do the same work as the 100w server, and will cost me less over 2 years including the purchase price, then the old server is obsolete.
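That break-even test can be sketched the same way (the purchase price here is hypothetical; the £1/W/year constant is the rule of thumb above for always-on devices):

```python
def breakeven_years(old_watts, new_watts, price, gbp_per_watt_year=1.0):
    """Years until a more efficient always-on replacement pays for itself."""
    yearly_savings = (old_watts - new_watts) * gbp_per_watt_year
    return price / yearly_savings

# Hypothetical: a 20 W mini server bought for 150 GBP replacing a 100 W one
print(breakeven_years(100, 20, 150))  # 1.875 -> pays off in under 2 years
```

If the result comes in under your planning horizon, the old server is obsolete by this measure.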
IMO a computer is obsolete when it can no longer run any desired programs. My laptop, for example, has outlived my much beefier desktop, since the laptop is basically just used for web stuff while the desktop is used for gaming, development, and the like. Gaming in particular has seen its demands increase significantly over the years, so a gaming PC might be rendered obsolete much faster than something used for the web. My old gaming PC that was rendered obsolete I repurposed as a server, and it works well for its new purpose and will probably live on for a couple of years still.
So there isn't any concrete limit on which you can say a computer has become obsolete. It is more of a subjective assessment of whether the computer can fulfill its tasks to a satisfactory degree.
You should only get rid of computers when your home, your parents' home, and your parents' garage have all run out of space. My parents' garage used to be an industrial building and is about as big as the house, so can fit many ancient computers.
oh boy... i ask myself this a lot. i frequent the puppylinux community and dudes are out there slinging 32-bit computers with sub-1GB RAM all the time... much like others have echoed, the answer seems to be when the computer dies.
It's all very arbitrary and depends on the definition of computer for the individual.
Ultimately it does, I think, come down to practicality. Can I still use this thing to get what I need to do done, and can I still do it securely?
The security part can be more or less important depending on computer, as well. If you're a Mac person, your machine may be obsolete as soon as Apple decides to stop giving you security updates. If you're a Linux person, you can probably maintain a secure system easily on 10-15 year old hardware.
Pretty much the software you run on it and the support behind it. And for now, energy consumption, but I can imagine that 100 years from now that won't be a factor anymore.
But that probably falls under "no practical use".
I mean, with the proper software, you can still automate your house with a Commodore 64, or browse the web with an Amiga.
I moved to a laptop for my main system for portability, and I'm really enjoying the reduction in my power bill from my previous threadripper 1950x build.
For me, it's the hardware failure. If it's damaged enough to be uncomfortable to use, it's done. Similarly, if it can't run a modern browser decently.
I just ditched a >10-year-old laptop that I used as a server. The display was off most of the time, and the battery offered some backup power. In its last months I couldn't even use the power button; I had to take the mobo battery out and connect the machine without it in order to turn it on. The touchpad wasn't working either. The OS hard drive was failing, but that was replaced. I'm sure the thing otherwise works fine, but I can't find the right flex cables to connect the power button and touchpad to the mobo. Guess it's going to the trash soon.
I would say when it becomes too slow for even basic tasks like browsing the web, or running an up-to-date operating system.
Today, I would say the bar is around 3000-4000 points on cpubenchmark for the CPU, 8 GB of RAM, and an SSD.
You could definitely get a usable computer with less. I have a Pentium II PC that works great and can even connect to the Internet. But software today is far more bloated and inefficient than it used to be; such an old machine is useful only if you don't do anything computationally intensive and don't need to run any modern software.
But something I forgot to mention about old hardware is that it allows you to run old software, old games... and there's also the nostalgia of Windows XP, or Windows 98, the early web. They remind me of a simpler time...
But seriously, I question the "practical use" bit, not because it's wrong but because it's so completely situational. If you want it for a business, you probably want to beat AWS prices; but if you are just goofing off, a replica of the Zuse Z1 is actually a substantial upgrade over an old XP desktop, just because of the huge cool factor. If you have some sort of basic but fairly practical personal need, the cutoff for usable will be somewhere in between.
In your situation, I'd figure out how many you want, and then keep the n best ones by your reckoning.
The weird thing is that we're currently at a point where even very old machines are perfectly usable if you're not playing modern games.
My main computer is an i5 4670 (or something like that). It's almost 10 years old, but for my Firefox / VS Code / Docker workload, it's pretty much as good as my M1 MacBook. Sure, some tasks take a second longer, but not annoyingly long.
When it's slow, and when it locks up and shuts down when I open recent programs. I've changed two computers in 25 years for that reason. I think the first one was for GTA III. Fortunately, there are no more games worth changing a PC for.
I would say that it depends on a lot of different factors.
A computer can be obsolete because you require available spare parts to quickly repair it when something breaks and those are no longer available.
A computer can be obsolete because it is physically much larger, much heavier, much louder or less power-efficient than a more modern computer performing the same task for you.
A computer can be obsolete because of some external change, e.g. when Apple moved from x86_64 to ARM or when some new encryption algorithm or codec is not supported in hardware on that system and the software implementations are lacking in performance or power efficiency.
A computer can be obsolete when its hardware is no longer supported by drivers in modern operating systems with security updates.
When it no longer reliably functions. Older hardware still has a lot of uses: just dump Lubuntu on it and you have a functional desktop you can play older games on and use open-source productivity suites with. However, once parts start to fail that you can no longer replace (those old laptop HDDs, for example), it becomes obsolete to you.
As someone with a dual Opteron 6386 SE with 512 GB of RAM sitting in a closet somewhere... It was fun for a few weeks, until I saw my energy bill. It was great in winter though, as I never needed a heater on. Ever.
The pros of working in a tech company where they decommission shit and just ask who wants it
You could think of anything that's not the newest as obsolete, or anything that's no longer supported, but in either case, it doesn't mean the machine can't still be used for its original intent or for something else (a Plex server or something).
There are a lot of good suggestions in the replies here, aren't there?
I was going to say that I've been doing a lot of self-hosting and home automation recently, and it's had me doing things like spending a lot of time finding out if I can run Linux on an old Apple TV, to make it yet another home server running containers. I went through a phase where I was considering disassembling old laptops to re-use their LCD panels as mounted control access points around the house.
However, the LCD thing never went anywhere, because I'm not handy with a soldering iron, but also because I've found that those laptops are usually newer than the ones people in my family tend to have (me being in software and having cycled through laptops frequently), and I've been re-installing friendlier Linuxes on them and giving them away to friends and family.
I wonder about the other devices, though. Many are certainly not low-power-use, and what's the impact of me continuing to use them? Headless, most are certainly capable of running at least one containerized service, but a newer ARM or RISCV board will almost certainly sip less. What's the environmental trade-off?
I have, though, only one tower. I built it in 1993 and have simply upgraded it with new MBs and components over time. Its main feature turned out to be its usefulness as a RAID5 container, again upgraded with increasingly larger HDDs over the decades, until the point where I started predominantly using docked laptops. One move, I simply never set it up again. That one is a power-hungry monster, and I feel bad about having it powered on 24/7. But I still keep it because, sentiment.
Anecdotes aside, my answer to your question is: most computers can run Linux, and therefore most computers could find a use in self-hosting. For me it's become more a question of whether I have, or can find, a use for it. Often, a conversation with family results in finding a use; setting up a self-hosted media server for mom, maybe. If not, it becomes e-waste, and I feel bad for a bit. But my devices have tended to be small form factor, like Vera or AppleTV; it sounds like yours are larger, and maybe the form factor makes them less desirable to reuse.
If, when you start up a brand new game you just got on Steam, the game on minimum settings grinds down to 10 FPS in the first 5 minutes, or refuses to even start? That's when you know it's time to put the old girl to rest.
I'd say that, for me, a computer gets moved down the chain: from daily driver, to something I use more sporadically, and then on to become a server of some kind hosting lightweight stuff on my LAN. And then eventually it becomes a question of whether it's worth the electricity bill to have such inefficient old hardware running 24/7.
At the physical level: capacitors age and blow up, batteries stop charging.
At the efficiency level: when the work you want to do uses more energy on an older platform than on a newer platform.
At the convenience level: when the newer device is so convenient you never use the old device, telephone versus desktop as an example for most people.
At the reliability level: if you're constantly replacing things on a unit, to the point where it becomes your part-time job.
The longest-used devices tend to be embedded industrial devices. They have a job, they keep doing that job, and until they break they're going to keep doing that job forever. And that's application-specific computing.
Most home users are general computer users, so they have a mix of different requirements, support needs, and use cases. I for one still use a 10-year-old laptop. And it's totally fine.