I spent an evening writing a CloudFormation template for a Counter-Strike: Global Offensive linux server. No, I don’t have a life. Yes, you will thank me next time you play with your friends and the laptop cannot handle more than 5 players. (An AWS t2.micro handles 6 players easily, and you can always throw a c4.large at the problem, which is still only about $0.13/hr and handles, well, just about anything.)

The template sets up a single EC2 instance (t2.micro by default) in the default VPC and runs the server in “Arms Race” free-for-all mode. Consult Valve’s documentation page if you want to run other game modes or reconfigure the server in any way. The template also sets up a CNAME record pointing to the instance’s public DNS name, so comment the last section out if you don’t have a public hosted zone in Route53.
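For the curious, the game selection the template bakes in boils down to the game_type/game_mode pair passed to the dedicated server on startup. A launch line for an Arms Race server looks roughly like this (a sketch based on Valve’s documented conventions; the mapgroup and map names are just examples):

```shell
# game_type 1 with game_mode 0 selects Arms Race;
# ar_shoots is one of the stock Arms Race maps
./srcds_run -game csgo -console -usercon \
    +game_type 1 +game_mode 0 \
    +mapgroup mg_armsrace +map ar_shoots
```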

Happy shooting!

Dell announces another in its series of “developer laptops” with Ubuntu pre-installed. This time it’s the überpowerful M3800 mobile workstation, available with everything from an i7 CPU, through a Quadro K1100M graphics board, to a 3840×2160 display. I remember ArsTechnica’s review of the XPS 13 developer edition, in which they basically said the best thing about the laptop was that it was “unremarkable”, which by today’s standards is the best compliment. Dell managed to deliver a premium-quality linux laptop that just worked, Cupertino style. If they manage to do the same with the powerful 15” mobile workstation and, as they announce in the blogpost linked above, with the upcoming XPS 13”, we’ll have Linux-powered alternatives to both the Retina Macbook Pro and the Macbook Air. Which would be brilliant.

You seem to be doing a great job, Dell.

update 01.01.2015: Andy Turfer has a review of this laptop on his blog.

Every couple years I get the urge to peek out of my Apple-furnished hole and survey the landscape of alternative devices and operating systems. I call this urge switching season […] I figure that the least I can do when the urge to switch strikes me is to share what I’ve learned in the hopes that it saves other people some time.

via Alex Payne — Switching Season Report, 2013 Edition.

It’s exactly the same with me – I’ve been living in the Apple ecosystem for the last 3 years, and I am sorry to admit that the 2010 13” Macbook Pro is hands-down the best computer I have ever used. It’s fast (especially after an SSD upgrade last summer), silent, portable, has a great keyboard, and its software is boring as hell – it doesn’t crash and I don’t have to tinker with it to make wifi work after resuming from suspend. Despite all that, whenever I see a nice Thinkpad,1 I’m immediately browsing the best second-hand computer store in the world searching for a used X201 or X220 in good condition. It’s partly nostalgia, partly the love of XMonad, and to a small extent a dislike of Apple.2 And every time it happens, I perform an analysis similar to the one Alex Payne did, arriving at mostly the same conclusions: Android sucks, Linux on the desktop mostly sucks, Windows is not considered because it’s not unix-based, and Apple sucks least on all fronts.

It’s all rather sad.


  1. A friend just recently bought a used X201 Tablet. He’s running Ubuntu on it and says everything’s fine and dandy. I love the way this machine looks.
  2. However, I am not pro-Dell, pro-HP or pro-Google either. All corporations are evil, it’s their duty.

How Would You Fix the Linux Desktop?

aussersterne:

The culture of Linux remains the culture of 1993 mid-range computing—but we no longer live in a world in which CS students can’t afford the hardware/software they use at school and mainstream OSes can’t do the fun stuff. Quite the opposite. It’s funny to think back at how thrilled I was to have X11 on the desktop (compared to Windows 3.1) versus how I feel now, twenty years on, comparing KDE or GNOME on Fedora or Ubuntu to OS X 10.8. The tables have been exactly turned. Linux is still essentially the same in architecture and philosophy, while the rest of the world has moved to a completely different paradigm in which computing is essentially appliance-driven. In 1993 Linux was ahead of its time. In 2013 Linux is a decade behind.

These days, I want a complete, polished, turnkey appliance at low cost and with no labor time investment, not a set of building blocks. Today’s appliances are fast, intuitive, stable, durable, powerful, and integrated like the iPad (which I do, yes, use for serious work about 5-6 hours a day). For most users (which is where I have always ultimately fallen), Linux is a solution in search of a problem that no longer exists.

Ask Slashdot has an interesting discussion about the current state of the linux desktop, which has (again) become a heated debate after Miguel de Icaza’s blogpost and Linus Torvalds’ reply. There are some very insightful comments, like the one by aussersterne above, but more importantly the discussion gives a good picture of the linux/FLOSS community, with different views on what the linux desktop is or should be, and different backgrounds, ideas and problems. The first comment sums up the problem – or the meta-problem – for me:

Hatta:

I’ve been using Linux on my desktop for 13 years now. It works just fine for me.

Right.

A eulogy for Maemo/MeeGo

A long, long time ago, when I was still very enthusiastic about desktop linux and free software in general, the idea of a linux-based cellphone, or a ‘palmtop’ as they were called back in the day, was something the FLOSS community dreamed of. There were numerous software and hardware projects (does anyone still remember OpenMoko?), and one of them, Android, was acquired by Google in 2005 and later became one of the most popular operating systems for mobile devices in the world.

I’ve already mentioned my linux-related laptop problems some time ago. Some things have changed since then. Canonical released a new version of Ubuntu, 10.04, which in my opinion is a huge improvement over 9.10, and I got a new laptop (kindly provided by HiB), an HP EliteBook 6930p. The old problems are gone, but new ones have arisen.

At first it all seemed ok. I installed Ubuntu 10.04 and didn’t have to tweak anything. Wireless, bluetooth, suspend/resume – all worked automagically. Except graphics.

My laptop has a built-in Intel 4500 graphics board. Most of the time it works fine, hardware acceleration and dual-display mode included. It’s also quite fast for my needs – flash videos play with no frame-dropping, HD films as well, and Quakelive runs smoothly (the last one being especially relevant to my research). But from time to time, completely randomly, weird annoying things happen.

The first weird annoying thing is the Non-Existing Display problem (which we shall henceforth refer to as NED). It goes like this: I power on the machine, the kernel boots, plymouth loads, and the GDM screen… well, it also loads, because I can hear the bongos, but it’s not visible. The thing is, the GDM login window is being displayed on the NED. If I simply press enter and type in my password, it will log me in, and in most cases I will see my desktop. If not, I can use Gnome-Do to evoke the display configuration panel. Once it opens, the screen usually flickers and realizes that there’s only one display connected, or, ontologically speaking, that the NED indeed does not exist (as the name clearly suggests). If it can’t realize the obvious truth immediately, it will do so after a couple of clicks in the display configuration panel. If not after a couple, then after a couple more, but in any case, after some time the display works correctly. That doesn’t mean Ubuntu won’t have any further doubts as to the NED’s ontological status, and that’s the most annoying part.

The second weird annoying thing is the Proper Resolution Holding problem (which we shall henceforth refer to as PRH). Imagine a situation like this: you have your Emacs open and you’re working on some non-trivial piece of code. As any programmer will tell you, this requires utmost concentration. I often have situations like this, perhaps even more often than other people, because most programming problems are non-trivial for me, since I’m a bad programmer. Anyway, I have my Emacs open, and I’m thinking deeply about some problem. If I keep thinking for more than 9:59 without touching the mouse or keyboard, e.g. reading an algorithm description from a printed article, the screensaver kicks in. And then, boom, the displays go crazy. Oh right, did I mention I’m using a dual-display configuration? I guess I didn’t, but I don’t think having an external display connected to your laptop is anything fancy in 2010. Anyway, when the screensaver wants to switch on, X.Org turns on mirroring mode for my displays and sets a 1024x768 resolution on both screens. Unfortunately, the only way out of this situation is to save whatever work I have, switch to virtual console 1 with Alt+F1, execute sudo service gdm restart and hope for the best.

Display problems also occur randomly when changing the laptop screen’s brightness. And of course after resuming from suspend, but thankfully this happens really rarely, maybe once in fifty suspends. When it does, however, the only way to bring the display back to life is to forcibly reboot the laptop.

There are more problems, like the silent microphone (no matter what I do, it’s just too quiet for most people to hear me over voip), short battery life (i.e. much shorter than on windows), and terribly bad trackpoint/touchpad support (i.e. much worse than on windows), but these I could live with. The graphics-related annoyances are just too much. And the worst thing is that I no longer know which graphics board I should recommend to people who want the best linux experience. Nvidia? Yes, but only with the closed-source drivers, although some people claim that nouveau works well. ATI? Well, I remember both mine and Karolina’s problems with our Radeons, so that’s a no. Intel? As far as I recall, everyone always told me that Intel chipset-based products (graphics, wifi, ethernet) are the best bet for linux, but ever since the introduction of KMS-enabled drivers this is apparently no longer true. By the way, my case is nothing compared to that of a new PhD student in our department. He’s new, so he’s got a newer laptop. Good for him? Not quite. Not a single linux distro supports his brand-new Intel HD graphics.

Now the obvious question is: did I try to fix my problems? Yes and no. Yes, I’ve searched the forums, and yes, I’ve tried some solutions. None of them worked. There probably are some new tips, new kernel releases I could compile, new patches I could apply, but no – thanks, I don’t want to. I’m too old for this. With all my previous computers it was tuning and tweaking all the time. My Thinkpad T40 needed a custom TuxOnIce-patched kernel for suspend-to-disk to work (suspend-to-ram made no sense, the battery was too old and I was losing too much power even during suspend). My Dell Latitude D430 had huge problems with newer Intel graphics drivers. There was always something that needed tuning. It’s like the joke about Lancia owners – they like tinkering with their cars in the garage, which is just a nice way of saying that their cars won’t work unless tinkered with.

I’ve been a linux user since about 1999, and during that time I’ve used linux exclusively on all my computers. First it was SuSE 6.0 and RedHat Manhattan (was it 5.1?). Then different RedHat versions for a short while, then Slackware for a long time, then Debian, Gentoo, Arch, and finally Ubuntu, since somewhere around 2005. In fall 2010 I’ll stop using a linux-based operating system on my home computer, and I don’t mean switching to some BSD (huh, been there!). I need a second computer, so I won’t have to carry my laptop in a rucksack all the time (I moved to Fyllingsdalen and I work at HiB – everyone who knows Bergen sees the reason). It will be some Windows 7-based computer, or a product of a certain Cupertino-based company.

Either way, I feel sad.

How I stopped being a desktop linux enthusiast

It’s actually more like “how I’m stopping being a desktop linux enthusiast”, because I’m still using linux on my desktop/laptop, and I still think it’s a much better solution than any Windows OS. It’s just that I’ve been using various linux distributions for many, many years (since 1998, I guess) on every computer I’ve owned, and I used to think it was a nearly flawless system. It’s not, and in fact it’s getting on my nerves.

I’m paranoid about backups and I have good reasons for that. I’ve tested many open source tools for automatic backup available for linux, but none of them fulfilled all my requirements.

I liked Déjà-Dup a lot, but it wasn’t able to abort a backup when the destination directory (a portable hdd) wasn’t present (or rather: it did abort, but tried to prepare a backup anyway, consuming some cpu along the way). The second thing I didn’t like about Déjà-Dup is that it divides backup files into 5-megabyte archives — opening a directory containing 20 gigabytes of such archives takes a while (I understand the reason for such small volumes is handling Amazon S3, but for local backups it makes no sense). And finally, Déjà-Dup can’t make automatic backups more often than once a day (did I mention I’m a bit paranoid?). However, Déjà-Dup integrates with Gnome very nicely, and since it uses duplicity as a backend, I was able to come up with a simple script fixing all these problems in a couple of minutes.

#!/bin/bash

BACKUP_SCRIPT=${0##*/}
DUPLICITY="/usr/bin/duplicity"
DATE="/bin/date +%R-%d-%m-%y"

# Order matters: duplicity uses the first matching include/exclude rule,
# so the catch-all --exclude=** must come last. (Bash leaves the unmatched
# ** glob literal when $BACKUP_COMMAND is expanded below.)
BACKUP_COMMAND="$DUPLICITY --exclude=/media/backups --exclude=/home/piotr/.cache --include=/home/piotr --exclude=** --no-encryption / file:///media/backups --volsize=250 --archive-dir=/home/piotr/.cache/deja-dup"

# Sanity checks
if test -z "$BASH" ; then
   printf "`$DATE` \n$BACKUP_SCRIPT:$LINENO: please run this script with the BASH shell\n" >&2
   exit 192
fi
if test ! -x "$DUPLICITY" ; then
   printf "`$DATE` \n$BACKUP_SCRIPT:$LINENO: the command $DUPLICITY is not available - aborting\n" >&2
   exit 192
fi

# Create an incremental backup if the portable drive is connected
if test -d /media/backups ; then
    printf "\n\n\n\n\n`$DATE` \nBacking up!\n-------------------------------------------------\n"
    $BACKUP_COMMAND
else
    printf "`$DATE` \n$BACKUP_SCRIPT:$LINENO: portable drive not connected - aborting\n\n" >&2
    exit 192
fi

# Cleanup
printf "`$DATE` Files backed up successfully\n-------------------------------------------------\n"
exit 0   # all is well

The best way is to first configure Déjà-Dup according to your needs, then copy the duplicity command it uses while backing up (it’s printed to stdout once you set the DEJA_DUP_DEBUG=1 environment variable), paste it into the script, tune it (I’ve changed the volume size), and simply put the script in your crontab — this way it’s easy to control how often your backups run.
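As a sketch, a crontab entry that runs such a script every half hour and keeps a log could look like this (the script and log paths are made-up examples; adjust them to your setup):

```shell
# m   h  dom mon dow  command
*/30  *  *   *   *    /home/piotr/bin/backup.sh >> /home/piotr/.backup-log 2>&1
```

Edit your crontab with crontab -e; stderr is redirected into the same log, so aborted runs (e.g. when the portable drive isn’t connected) get recorded too.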

Feel free to use the script if you need it, and if you’re better at unix scripting than I am (and I believe you are), send me any improvements and/or comments.