Introduction
Around the start of the pandemic I began to see a lot of videos about buying older hardware, and how old hardware sometimes gives you more performance for the money than new hardware. In 2024 this is less true. A processor like the Ryzen 5 5500 isn’t great compared to a lot of what’s currently on the market, but it blows away even my 10-core, 20-thread Xeon E5-2690 v2 at tasks like encoding video, and at $110 CDN when on sale it’s an excellent deal. But what if your budget for the entire system is $100? I thought I’d look at a system from Computer Recycling in that range, see what upgrades we could make, and then look at performance again.
The base system
The computer I started the build with is an Acer Aspire M5811 desktop. When Computer Recycling got the computer, it had its original Intel Core i5 650 CPU running at 3.2GHz, 4GB of DDR3 RAM, no hard drive, and an NVidia GeForce GT320 video card.
Initial Upgrades & Installation
Computer Recycling tries not to build any computers with less than 8GB of RAM. Even though Xubuntu is lightweight enough to run on 4GB without issues, 8GB gives a little overhead for opening Firefox or Chromium browsers with multiple tabs. I also added a 500GB Seagate ST500DM002 hard drive. SSDs are preferable, but our project has very few, so we tend to save them for the best systems we build. Because we have so few SSDs, we actually value a 120GB SSD more than a 500GB hard drive. Adding an SSD would bump what we charge for the system. And while the SSD is probably the best upgrade the system could get (even over doubling the RAM), there just are not enough at the project for us to be putting them in systems that are close to the minimum specification we would build…
But we can still make this better.
Initial Testing
One of the things I’ve heard repeatedly in videos on YouTube is that cards like the NVidia GT310 are complete garbage, and that even onboard video is better. So I thought I’d put that to the test with this system.
Using phoronix-test-suite I benchmarked the system with and without the NVidia GT310. The test I used was Xonotic at 1024×768 with “high” details on. Why only 1024×768, why not 1080p (1920×1080)? Because the project also builds laptops that cannot necessarily do 1920×1080 on their built-in screens, we standardized on a “minimum” resolution. We decided to up the ante on the test by selecting “High” details to make up for the lower resolution. If the game wasn’t at max resolution, at least we could make it look a little better. Here are the results:
The NVidia GT310 is significantly better than the graphics integrated into this first generation “Clarkdale” processor. It makes sense to keep the card in.
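For anyone who wants to reproduce the comparison, this is roughly how I run the test under Xubuntu 22.04. It assumes the phoronix-test-suite package from the Ubuntu repositories; the suite prompts for resolution and effects quality when the test starts:

# Install the Phoronix Test Suite from the Ubuntu repositories
sudo apt install phoronix-test-suite

# Run the Xonotic test profile; when prompted, pick 1024x768 and "High" effects
phoronix-test-suite benchmark pts/xonotic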
I also tested the onboard graphics of a 4th generation Haswell processor in a system with an SSD and 16GB of RAM, and Xonotic blew both of these numbers away at 118.44 FPS. But it left me wondering if we could find a better graphics card that would still work with this system. As spec’d, the 4th generation system with the SSD would have been about $80 CDN over the $100 budget. We might be able to get better FPS for less… I should also mention that the NVidia GeForce GT310 is using the Nouveau open source driver. I checked for a proprietary driver, but none seemed to be available for this particular card (at least without some tweaking).
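If you want to see which driver a card is actually using, and whether Ubuntu offers a packaged proprietary driver for it, this is the sort of check I mean (a sketch, not an exact transcript):

# Show the video card and which kernel driver is currently bound to it
lspci -k | grep -A 3 -i vga

# List any packaged proprietary drivers Ubuntu knows about for this hardware
ubuntu-drivers devices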
Graphics Card Upgrade
About a year ago I dropped off my old Zotac NVidia GeForce GTX650Ti Boost card. It was a great card for the games I normally played up until a year ago. I wondered if this card could do better than the 4th generation’s onboard video. Unlike the GT310, the GTX650Ti Boost does have proprietary drivers, but I figured I’d test both the open source and proprietary drivers to get an idea of the status of each.
I ran into a stumbling block: power management and the screen saver kicked in near the end of the run and appeared to crash the test. I decided to install the proprietary driver, run the test, remove the driver, and then rerun the test with the Nouveau driver.
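For reference, the sequence I’m describing looks roughly like this (package names as they appear in Ubuntu 22.04; on Xfce the power manager has its own settings that may also need adjusting):

# Keep the screen saver and display power management from interrupting a long run
xset s off
xset -dpms

# Install the packaged proprietary driver, then reboot before re-testing
sudo apt install nvidia-driver-470

# Later: remove it again to fall back to Nouveau (and reboot once more)
sudo apt purge nvidia-driver-470
sudo apt autoremove --purge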
As expected, the proprietary driver looked significantly faster. With nvidia-driver-470 installed, Xonotic at 1024×768 with high details on scored 181.57 FPS, with a maximum of 290 FPS and a minimum of 112 FPS.
With the proprietary driver on, the GTX650Ti ends up being a worthwhile boost to the system. Now to see if we can squeeze a bit more performance out of it.
Special Note [Re: Zotac NVidia GeForce GTX650 Ti Boost]: while testing the performance of the GTX650 Ti Boost, the system I was testing on froze several times. This only happened using the Nouveau open source driver. I looked into the problem and saw several complaints from people about this card, many suggesting issues with power management, but here are my two cents:
- The Nouveau driver seemed to be okay for a bit, until I decided to run the Xonotic benchmark. Then it froze. I thought the issue might be overheating, but the card was clean, and once I had the proprietary driver installed I ran the nvidia-smi tool (see the sketch after this list) and saw the card was sitting at about 30 degrees Celsius (86°F).
- When I was installing Xubuntu 22.04 from our PXE network environment, the GTX650Ti Boost would let me get to different stages of the install, but then glitch out. Our PXE Xubuntu image is based on 22.04.2, which I mention since 22.04.3 is out.
- After I ran updates on Xubuntu 22.04.2, the Nouveau driver seemed to crash less… but the proprietary driver performed better, with no crashing issues at all.
- Before installing the card, I’d had an NVidia GT310 using Nouveau, and I was able to remove that card, test the onboard video, and put the card back in. After removing the GTX650Ti Boost, the system wouldn’t even POST (power-on self-test) until I added a card back into the slot. No BIOS changes were made, so the card seemed to have switched the BIOS to a setting forcing the PCI slot to look for a card. Putting the card back in, or another card (which is what I did for the initial installation to SSD – more on the SSD later), fixed the POST issue.
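For the temperature check mentioned in the first point above, this is roughly what I ran; nvidia-smi only works with the proprietary driver loaded:

# Report the card's name, temperature and power draw
nvidia-smi --query-gpu=name,temperature.gpu,power.draw --format=csv

# Or just watch the default summary refresh every couple of seconds
watch -n 2 nvidia-smi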
To be clear, I don’t blame open source developers at all for this issue. The issue seems to exist on Windows as well, with this particular card locking systems up; users reported that the problem resolved itself with a newer driver. AFAIK NVidia still doesn’t help the open source community with drivers, which is why I tend to prefer AMD cards for my own desktop at home.
Processor Upgrade
The Intel Core i5 650 is a first generation “Core i-Series” processor. Early generation Core processors can be confusing. The Core i5 650 is a 2 core, 4 thread CPU, while the same generation Core i5 750 is a 4 core, 4 thread CPU. I remember when our project first started getting Core i-Series desktops. When they started to come in, we were blessed enough to get almost all 4 core, Core i5 750-based systems. At the time I was under the mistaken impression that all i5s were 4 core processors. That quickly changed thanks to tech friends pointing out their i5 processors were only 2 core (like the i5 650).
Whether Xonotic would make use of the extra cores of the i5 750, I had no idea. But we also benchmark all systems with Geekbench 6, which gives both a single-core and a multi-core result. The Core i5 650 in this particular machine had a single core score of 481 and a multi-core score of 1019. To give a sense of how bad this is, an i5-4590 (4th generation, 4 core) desktop system we tested had a single core score of 1187 and a multi-core score of 3546. But it’s actually not all that bad if you compare similar-generation processors like the AMD Athlon II X4 645. That system had a single core score of 358, while the multi-core score was similar to the i5 650 at 1012 (a bit less, though the X4 is a 4 core part). The X4-based machine actually sells for significantly less at Computer Recycling, only $50 (though with just 4GB of RAM), so it’s perhaps something to build on if you want to run an older system. I put this here just for a sense of comparison.
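For anyone curious how we get these numbers: Geekbench 6 for Linux ships as a tarball with a command-line runner, roughly like this (the file name depends on whichever version you download):

# Unpack the Geekbench 6 Linux tarball (version number will vary) and run the CPU benchmark;
# single-core and multi-core scores are printed and uploaded to the Geekbench Browser
tar xf Geekbench-6.*-Linux.tar.gz
cd Geekbench-6.*-Linux
./geekbench6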
I’m happy to report that adding the i5 750 did make a difference to Xonotic, by almost 19 FPS. The average for Xonotic at 1024×768 with High details on was 200.40 FPS, with a low of 120 FPS and a high of 334 FPS. Adding the extra cores definitely helped.
On to the Geekbench (CPU) score comparison. The single core performance of the i5 750 (422) was actually quite a bit worse than the i5 650’s (481). This is probably explained by the i5 750’s lower base clock of 2.66GHz and max turbo of 3.2GHz; the i5 650 has a base clock of 3.2GHz (the i5 750’s max) and a maximum boost clock of 3.46GHz. The multi-core performance is really where the i5 750 shines, at least next to the i5 650: 1358 versus the 650’s 1019. This aligns with the almost 19 FPS boost Xonotic got when the i5 750 went into the system, and it suggests Xonotic is multi-core aware.
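After swapping a processor it’s worth a quick sanity check that the system actually sees the new chip and its clocks; lscpu is usually enough for that:

# Confirm the installed processor model and its reported clock speeds
lscpu | grep -E 'Model name|MHz'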
Does adding an SSD boost Xonotic (Graphic) performance? What about (CPU) Geekbench?
I mentioned earlier that the Computer Recycling Project wouldn’t normally sell a 1st gen i-Series computer with an SSD (in fact, we’re mostly trying to build a minimum of 2nd or 3rd gen systems), but I pretended this was a system someone had bought and then fitted with an SSD they had at home, to see what difference the solid state drive would make.
The first and most obvious difference was the boot time. Xubuntu boots noticeably quicker with an SSD in it. No numbers here, because the BIOS was actually set to RAID during the installation when it should have been AHCI. My bad for not noticing this before the install, but to keep things more apples to apples I left it that way for the remaining tests.
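If you do want to put numbers on boot time, systemd keeps track of it, so something like this works on Xubuntu 22.04:

# Time spent in firmware, boot loader, kernel and userspace on the last boot
systemd-analyze

# The slowest services, if you want to dig deeper
systemd-analyze blame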
Geekbench runs first during our install process, and the results were… almost exactly the same, 422 for the i5 750 single core, and 1359 for the multi-core. The SSD seemed to make no difference to the Geekbench CPU benchmark.
I both love and hate the Xonotic (graphics/game) benchmark. It’s a nice little benchmark, but it seems to take forever to download (despite its small size). Once it finally installed I ran the benchmark, and the results were almost exactly the same (slightly lower): an average of 199.23 FPS (vs 200 FPS before), a low of 118 FPS, and a high of 332 FPS, suggesting the SSD doesn’t do anything for this particular game’s performance. But I suspect games that are disk-heavy, loading video scenes from the drive, would see a boost in performance; it just didn’t happen for this title because it likely runs entirely from RAM.
So what happens if we up the RAM to 16GB?
To make a long story short, Xonotic showed virtually no difference, with almost identical lows, highs and average performance (198 FPS). It was lower again, despite adding faster and more RAM. I chalk this up to averaging, and I’m betting that if I ran the test a bunch of times in each scenario we’d see a similar result. The biggest boost for Xonotic was the video card upgrade; the CPU upgrade also squeezed out a noticeable FPS difference, just not nearly as noticeable as the graphics card upgrade.
I ran Geekbench next to see if the RAM affected the CPU’s performance in this test. Again the answer was no, with the single core score still 422 and the multi-core score dropping several points to 1347 (from 1359 with 8GB and the SSD).
When Windows is better than Linux
It’s rare that you’ll find me stating that Windows is better than Linux, but there’s a case to be made here. The issue is the “new/old” graphics card I added to the system, my old Zotac NVidia GTX650Ti Boost. This is an old video card, but it was once moderately capable. I spent about a year using it to play a number of games under Windows 10, and according to TechPowerUp this particular card is capable of DirectX 12 on Windows. See: https://www.techpowerup.com/gpu-specs/zotac-gtx-650-ti-boost-oc.b1655. When I tested the card under Xubuntu in this system, it produced a number of errors when trying to load games through Steam/Proton that needed DirectX 11. Worth noting that I had the latest NVidia driver available for the card installed (nvidia-driver-470).
The 470 driver apparently “used to” be supported by older versions of DXVK, but it is no longer… wait, I thought Linux was better on old hardware than Windows? Yes, in general this is true, but not always. And this is one of the cases where Windows might be the better option.
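My understanding is that current DXVK releases want a Vulkan 1.3 capable driver, while the 470 series tops out at Vulkan 1.2. You can see what the installed driver actually exposes with the vulkan-tools package (shown here as a rough check, not something from my original notes):

# vulkaninfo reports the Vulkan API version the driver exposes for each GPU
sudo apt install vulkan-tools
vulkaninfo --summary | grep -i apiversion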
It’s not that the NVidia driver doesn’t help; it does improve things for Linux-native games like SuperTuxKart and Xonotic. But if you want to play some of those Windows games through Steam that need Proton, you’re out of luck for now; better to get a newer card. And this is sad. Linux should be better. A Mesa developer on Reddit noted:
Mesa driver developer here.
Basically what happened is NVidia just did the absolute bare minimum to allow the community to develop an open source driver, but they themselves are not really participating in any of that.
Fortunately we have very talented people such as Faith (formerly Jason) working on it, so I’m pretty sure there will be a working driver eventually. However, this will take time. How much time, is hard to guess. I would assume that it will be similar to how long it took for RADV to become a competent driver.
That being said, it’s hard to predict whether NVidia will keep helping the community the same way AMD is helping us. I recommend cautious optimism that the new driver stack may be ready in a few years but I wouldn’t recommend a hardware purchase based on this yet.
TimurHu – Mesa driver developer on Reddit: https://www.reddit.com/r/linux_gaming/comments/119v169/with_the_nvidia_drivers_now_being_open_source_are/
So the story for NVidia cards is still mixed. As a refurbisher I hate to see cards like this, which could be useful for low-end gaming, not get used to their full potential. It may not be Linux developers’ fault, but it’s certainly frustrating for users and refurbishers like us. Our project rarely sees systems with decent graphics cards; a GTX 1650 is a once-a-year kind of find for us. That’s slowly starting to change, but in the meantime, it would be nice to use these older cards.