Linux Mint 22.1 XFCE shows AppImages with embedded icons.

What's the deal with AppImage icons on Linux?

When Yoyo Games/Opera released a beta version of their GameMaker Studio 2 game engine for Linux, I thought it would be a good opportunity to promote Linux in an unusual way: by making simple games entirely under Linux. I use Xubuntu at home (and still do). I love Xubuntu and have become accustomed to some of its unique key bindings. Handbrake, a program I use heavily, also has some quirks on Linux Mint that don’t exist on Xubuntu, so Xubuntu continues to be the Linux distribution I prefer to use…

May 22, 2025 · 2 min
Running Ollama Artificial Intelligence on a Lenovo ThinkPad T430s under Xubuntu Linux

Running Ollama artificial intelligence on a Lenovo ThinkPad T430s

I tried AI on my Lenovo ThinkPad T430s under Xubuntu Linux and it did something shocking! Yes, the heading above makes this sound like the zoo of bad artificial intelligence videos on YouTube, but I was genuinely shocked at what happened when I tried to install Ollama on my ThinkPad T430s running Xubuntu Linux 24.04.2. Entertain me for a second and let me briefly explain why I was surprised: I’ve installed Ollama a number of times on hardware several years newer than my laptop. Ollama will run even if you don’t have a modern graphics card (which sounds a bit strange since it’s mostly text based, but a decent, modern Nvidia graphics card can really boost Ollama’s performance). ...

April 16, 2025 · 3 min
Llama 3.2 running in Ollama

Simple Artificial Intelligence on Linux

Initial look at artificial intelligence We might be a bit late to the game, but I’ve started looking at what we need at the Computer Recycling Project to run artificial intelligence models locally (on the machine, rather than on a server) under Linux. The Ollama project makes it very easy to install a number of artificial intelligence models under Linux. Once Ollama is installed you can download a model by typing: ollama pull <model name> So if you wanted to pull Llama 3.2 you would type: ...
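As a quick sketch of the workflow described above (assuming Ollama is already installed and `llama3.2` is the model tag you want; substitute any tag from Ollama's model library):

```shell
# Download a model's weights by its tag
ollama pull llama3.2

# Start an interactive chat session with that model
ollama run llama3.2

# List the models installed locally
ollama list
```

Model downloads can be several gigabytes, so the first `pull` may take a while depending on your connection.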

November 19, 2024 · 2 min