Expected Number Of Coin Tosses Until You Get Tail

Because of my interest in BitCoin I was confronted with the question of the expected number of attempts to achieve a specific result, if the probability of achieving the result per attempt is fixed. In BitCoin mining, the miners test nonces, trying to find one so that the SHA-256 hash of the nonce combined with a certain value is a number below a certain threshold. Every test of a nonce has a fixed probability of succeeding, so it is exactly this classic problem of the expected number of attempts.

A simpler variant of the problem is: what is the expected number of coin tosses until you get heads (or tails)? Or how many dice rolls it takes on average until you roll a six?

Anyway, my maths being a little rusty, I did not immediately know how to calculate it. I remember how to calculate the expected “value” of some event: it is the sum of the values of the possible outcomes, each multiplied by the probability of that outcome. But for the expected number of attempts until something happens, this yields an infinite sum: p*1+(1-p)*p*2+(1-p)^2*p*3+…+(1-p)^(n-1)*p*n+… if the probability of the desired result is p. That is because the probability of getting the result for the first time on the n-th attempt is the probability of not getting it in the first n-1 attempts, (1-p)^(n-1), multiplied by p. And we want to sum n times that probability over all n: one attempt, two attempts, and so on up to infinity.
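In standard notation (just restating the sum above, nothing new), that is:

E = \sum_{n=1}^{\infty} n \, p \, (1-p)^{n-1}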

On the internet I found some claims that the expected number of attempts is simply 1/p, but I could not find an explanation for it. In the meantime I have found a really elegant and much more versatile solution to the problem, but by then I had already decided that I needed to tackle that “nasty” infinite sum. So while the solution mentioned above removes the need, I still want to show how to compute the infinite sum, as I could find no other explanation on the internet.

To compute the infinite sum, we’ll look at the partial sums p*1+(1-p)*p*2+(1-p)^2*p*3+…+(1-p)^(n-1)*p*n. Thinking about progressions, one famous progression immediately comes to mind: the geometric progression 1+p+p^2+…+p^n. It is memorable for the neat trick for computing its value: multiply the whole thing by (1-p)/(1-p); then most terms in the numerator cancel each other out and we are left with (1-p^(n+1))/(1-p).

Our sum almost looks like the geometric progression, if only we could use the same trick to compute it. Luckily we can. First, let’s set q = 1-p and observe that we can factor out the p from the sum. So we are left with the problem of computing p(1+2q+3q^2+…+nq^(n-1)). If we multiply this by (1-q)/(1-q) we get

p(1+(2q-q)+(3q^2-2q^2)+…+(nq^(n-1)-(n-1)q^(n-1))-nq^n)/(1-q) = (since p = 1-q)

= 1+q+q^2+…+q^(n-1)-nq^n

The first part of that is a geometric progression, so we know it is equal to

(1-q^n)/(1-q) - nq^n

Still not that pretty, but we are making progress. Rather than trying to make that partial sum prettier, let’s see what happens for n -> infinity: since q < 1 we have q^n -> 0, and I claim that nq^n -> 0 too (I will show this below). Therefore lim_{n -> infinity} ((1-q^n)/(1-q) - nq^n) = 1/(1-q) = 1/p. So there we have it: the expected number of attempts until a result of probability p happens is 1/p.
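For reference, here is the whole telescoping computation in one place, in standard notation – just a restatement of the steps above, with q = 1-p:

p \sum_{k=1}^{n} k q^{k-1} = \frac{p}{1-q} \left( \sum_{k=1}^{n} k q^{k-1} - \sum_{k=1}^{n} k q^{k} \right) = \sum_{k=0}^{n-1} q^{k} - n q^{n} = \frac{1-q^{n}}{1-q} - n q^{n} \longrightarrow \frac{1}{1-q} = \frac{1}{p} \quad (n \to \infty)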

For coin tosses p = 0.5, so the expected number of attempts is 2. For rolling a six with a die, it is 6.

As for the remaining step, nq^n -> 0 for q < 1, I admit I also had to Google for a hint. The trick (or one trick) is to write q as 1/(1+a) for some a > 0. Then the expression becomes n/(1+a)^n. Now we look at the binomial expansion of (1+a)^n, which is

1+(n choose 1)a+(n choose 2)a^2+…+a^n >= (n choose 2)a^2 = n(n-1)*a^2/2 (since (n choose 2) = n(n-1)/2, and all the terms are positive).

Therefore n/(1+a)^n <= n/(n(n-1)*a^2/2) = 2/((n-1)*a^2), which obviously goes to 0 for n -> infinity.
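In standard notation, the whole estimate (valid for n >= 2, so that the binomial expansion actually contains the a^2 term) reads:

n q^{n} = \frac{n}{(1+a)^{n}} \le \frac{n}{\binom{n}{2} a^{2}} = \frac{2}{(n-1) a^{2}} \longrightarrow 0 \quad (n \to \infty)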

I am not sure who the audience for this blog post could be, but anyway, I am glad I found a small puzzle and managed to learn some simple new tricks while trying to solve it. I am excited about the solution I linked to above, which simply sets up an equation for the expected value: E = 1*p+(1-p)*(E+1). It is much simpler, and also more versatile, as you can use the same approach to answer questions like “how many coin tosses until you get heads three times in a row”. Somehow I had never encountered that approach before, as far as I remember.
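Solving that equation is a one-liner and reproduces the result from above:

E = p \cdot 1 + (1-p)(E+1) = 1 + (1-p)E \;\Longrightarrow\; pE = 1 \;\Longrightarrow\; E = \frac{1}{p}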

Also, I must admit I was a bit disappointed by the internet, namely by the apparent willingness to accept the formula E(number of attempts) = 1/p without questioning how to derive it. Surely this is explained somewhere else already, but if not, maybe I did my small part to plug an information hole in the internet. :-)


Accepting BitCoin Payments For Programming Work

I am fascinated by the new electronic currency BitCoin, so I have decided to accept BitCoin in exchange for freelance coding work.

Obviously I can not bet my whole livelihood on it, so for the time being I would be interested in doing short tasks or projects in exchange for BitCoin. For example, I recently did some web site scraping jobs which took between a couple of hours and a day to put together. Something like that might work well.

In general I specialize in Ruby on Rails, JavaScript (including Node.js) and Android (Java) programming. Obviously some more exotic stuff would be OK, too (examples: Clojure, Scheme, Erlang, Linden Scripting Language). It is always great to get paid to experiment…

Also, I am asking for market rates (their equivalent in BitCoin, that is).

Incidentally, if you just want to donate some BitCoins to this blog out of the goodness of your heart, you could send them to my BitCoin address 1HtiNZTMuCy47Hwj9d2Kvd5BYSRPxqMZuF


Installing Ubuntu 10.10 On An Acer Aspire 4820T TimelineX

UPDATE: Not sure if it worked in the beginning and got messed up somehow by other people fiddling with the computer, or if it never worked. Anyway, I was unable to get Skype to work on the notebook under Ubuntu (no microphone input). I also tried Google Talk, which seemed a little better, but suffered from whining feedback loops. Not sure if the problem was on my side or the callee’s side. Anyway, bottom line: I was unable to get video chat working under Ubuntu. The microphone works in Sound Recorder, so it doesn’t seem to be a problem of the mic fundamentally not working.

The old backup notebook at my mum’s place is failing, so I was looking for a new computer capable of running Ubuntu Linux. Looking through the selection at a renowned local computer store (Cyberport), all notebooks seemed really ugly. The only one I liked was the Acer Aspire 4820T TimelineX. I was skeptical about Acer before, but shop employees told me that it is OK.

I had actually ordered a Dell Inspiron 15R before I bought the Acer, as it is the only notebook left in Dell’s Ubuntu initiative, but I found it far too ugly as well and therefore decided to return it. Sorry, Dell, but it is hard to tell from the pictures on your web site what to expect. Especially if instead of actual photographs you use 3D-rendered images.

I am really pleased with the Acer, though. It comes in different variants, and I ordered one without an extra graphics card, as that apparently can lead to problems under Linux. It has a Core i5 CPU and 4GB of RAM. My only worry at the moment is that the glossy display might be annoying, but while doing all the installation stuff, the display never got in the way. So I hope it is OK. Another concern might be the quality of the webcam; I haven’t really looked into it, though. My MacBook’s webcam does not seem significantly better, so maybe that is just the current state of integrated webcams. I wish I could have found a notebook with a 2MP cam instead of the 1.3MP of the TimelineX.

From the internet I had learned that I should update the BIOS before installing Linux. So I booted into the preinstalled Windows 7 just to execute Acer’s flash installer, bringing the BIOS to version 1.22. I suppose that should I ever have to do another BIOS update, I could try the DOS updater that Acer also provides, combined with a boot CD made from one of the free DOS clones (FreeDOS or DR-DOS?). Let’s hope it won’t come to that, though.

Next I used the standard installation CD of the 64-bit desktop version of Ubuntu 10.10 to install Ubuntu on the notebook. I am pleased to say that I did not experience any issues at all, as far as I recall. Everything works out of the box. I had a network cable attached during installation, so that the installer could get the latest updates. Also, I enabled the proprietary drivers that are available for the notebook (only for the Broadcom WLAN chip, apparently).

Things that work out of the box: WLAN, Webcam, volume and brightness keys, sound (Skype), hibernate and suspend to RAM – did I forget to check anything? I haven’t checked Bluetooth, mainly because I couldn’t think of a use case. I also didn’t try to set up the touchpad for multitouch, because it has a dedicated stripe on the side for scrolling, and it will be used with a mouse most of the time anyway. There are instructions on how to enable it, though.

Still, there are a few things to be done on a clean Ubuntu install before an unsuspecting user can use it: DVD and MP3 support, Thunderbird, Google Earth, Flash, remote access.

To enable MP3, I installed the package ubuntu-restricted-extras via the Synaptic Package Manager. This also installs msttcorefonts (I guess Microsoft fonts like Arial, so that web sites look the same as on other computers) and presumably a couple of other things. Then in the RhythmBox settings I changed the default encoding for CDs from Ogg to MP3. Now that I check it, I see there is also an option for AAC – maybe it would have been a better choice? Also, in the settings of the MP3 format, I changed the quality value from 6 to 2. I have no idea what I am doing there, though. It seems that since I last encoded MP3s, it has become customary to use variable bitrates. So when I first tried it, I was surprised by the seemingly low bitrates of the resulting files. Some googling led me to assume that the variable encoding has a quality setting between 0 (best) and 9 (worst), and I saw 2 recommended. But as I said, no idea – probably I should have left the defaults alone…
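That quality scale comes from the LAME encoder. As far as I know RhythmBox actually encodes via GStreamer, so this is just an illustrative command-line equivalent (file names are placeholders):

sudo apt-get install ubuntu-restricted-extras
# encode with variable bitrate at quality 2 (0 = best, 9 = worst):
lame -V2 input.wav output.mp3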

For DVD support I followed the instructions at Ubuntu Help on Restricted Formats, that is, I installed the package libdvdread4 and executed sudo /usr/share/doc/libdvdread4/install-css.sh on the command line. Unfortunately the command line step is necessary, but I guess we are lucky to get DVD support for free at all. I also installed the VLC player for good measure. Sometimes the default movie player of Ubuntu fails, and VLC is often a good replacement.
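In condensed form, that is (the same commands as above, plus the vlc package):

sudo apt-get install libdvdread4 vlc
sudo /usr/share/doc/libdvdread4/install-css.sh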

Flash player: my memory is a bit fuzzy; maybe it was already installed, or the Ubuntu installer asked me during setup if it should install stuff like that. If not, it can be installed via Synaptic or the Ubuntu Software Center. Again, it seems to work flawlessly out of the box in Firefox. After some thought I decided to also install Flashblock and Adblock Plus, also via the Software Center. Flashblock is a bit inconvenient, but being spied upon by evil Flash cookies and being spammed with Flash ads is even more inconvenient. Another alternative might have been the route suggested by Daring Fireball: remove Flash from Firefox and use Google’s Chrome with its built-in Flash support as a fallback.

I installed Google Earth via the make-google-earth package process, as detailed in this German Ubuntu wiki. Contrary to what it and also the official Ubuntu wiki state, installing the ia32-libs package was apparently not necessary – in fact, I could not find that package in the standard repositories anyway. I missed that Google now offers Debian packages for Google Earth, which I could have tried.

I haven’t used Evolution in years, and the last time I tried it, it wasn’t really good, so I always immediately install Thunderbird as the mail program.

Another thing I always do on a fresh Ubuntu install is remove one of the panels. That means I have to add the stuff from the bottom panel to the top panel: the “show desktop” icon and the open windows selector. For the first time I didn’t switch off the Gnome effects (semi-transparent windows, closing/opening animations and stuff like that). Somehow, on the fast notebook and the glossy screen, I actually liked them.

Lastly I installed the openssh-server, so that I can remote ssh into the notebook sometimes. Of course the DSL router needs to be configured to forward some port to the ssh port on the notebook.
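The setup amounts to one package plus the router configuration; a sketch, where the external port and host name are placeholders you would replace with your own:

# on the notebook:
sudo apt-get install openssh-server
# from elsewhere, assuming the router forwards external port 2222 to the notebook’s port 22:
ssh -p 2222 username@my-home-ip.example.org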

I have to admit I haven’t really looked into remote administration of the notebook. I figure ssh access should be enough to also get some graphical interface to run (like remote desktop or VNC), but I don’t know how yet. Might be another post some day.
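One common approach, which I have not tried here, is tunneling VNC through ssh; a sketch with placeholder names (5901 is the usual port of VNC display :1):

# on the notebook: run a VNC server, e.g. from the x11vnc or vino packages
# on the remote machine: forward the VNC port through the ssh connection
ssh -L 5901:localhost:5901 -p 2222 username@my-home-ip.example.org
# then point a VNC viewer at localhost:5901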

Did I miss anything? Overall, as I said, I am extremely pleased with the notebook. A minor thing: the fan seems to be spinning most of the time, but its volume is very low. You can’t have everything, I suppose.


Switching from MacPorts to Homebrew

Since my last entry was about MacPorts, I just wanted to note that the last time I had to reinstall OS X (because of replacing the hard drive with an SSD), I decided to switch to Homebrew instead. It seems to be what all the cool kids are using. So far it has worked well, but I haven’t installed that many packages yet. It seems a lot faster than MacPorts, too.

Also, I decided to use RVM to manage my Ruby installation, which has also worked out extremely well so far. RVM allows you to switch between different versions of Ruby on the fly. The only thing I don’t like is that it seems to have added a variable whose value is an ultra-long script to my environment variables. So if I want to look at my environment variables, I have to scroll past this script.
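For the curious, typical RVM usage looks roughly like this (the version number is just an example):

rvm install 1.9.2         # install a Ruby version
rvm use 1.9.2 --default   # switch to it and make it the default
rvm list                  # show all installed Rubies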


MacPorts upgrade

Just a short snippet, because I keep forgetting this: to upgrade installed ports in MacPorts, use “sudo port -u upgrade outdated”. The -u flag tells port to uninstall outdated versions of ports (the ones for which newer versions are being installed through the upgrade).
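Together with refreshing the ports tree first (selfupdate is the standard first step, not something specific to my setup), the full routine is:

sudo port selfupdate           # update MacPorts itself and the ports tree
sudo port -u upgrade outdated  # upgrade all outdated ports, removing the old versions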

Before using that flag, I kept running into issues with obsolete port versions all the time. It boggles my mind that the -u behaviour isn’t the default.


Fixing WLAN Connectivity Issues On My MacBook

It is almost too simple to blog about, but on the other hand, it caused me some grief.

Short summary: changing the WLAN channel fixed the WLAN performance. It had never occurred to me to try switching the channel, because the signal strength had always been shown as excellent. However, apparently the external monitor somehow disturbed the connection on the default channel.

I had been experiencing really bad network problems on my home network for a while. It had gotten so bad that I even used Tethering on my Android phone as a fallback at times, and I did not even dare to play my beloved Carcassonne games on Brettspielwelt anymore, because they would be so painfully slow.

Disconnecting my MacBook from the external monitor and going next to the WLAN router with it seemed to yield big improvements. So it looked as if the signal strength simply was not good enough. I already thought about buying another router or even plugs for networking via the power grid. However, the signal strength indicator for the WLAN was always at 100%, which seemed strange.

As a last test, I remembered the command line tool ping. Pinging my router, ping told me I had 20 to 40% packet loss. Really bad. But then I disconnected the monitor and tried ping again – suddenly the packet loss was 0%. After some more trials, like connecting the monitor to another computer, it became clear that it is really the external monitor being connected to my MacBook that makes the WLAN connection go bad. Apart from ping, using some of the speed detection web sites was also informative. Download speeds quadrupled without the external monitor plugged in. By googling I found that others have experienced this, too.
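For reference, sending a fixed number of pings produces a handy loss summary (the router address is a placeholder for your own):

# send 50 pings and read the loss percentage off the final summary line:
ping -c 50 192.168.0.1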

At first I was frustrated, expecting costly repairs or hardware replacements, but then I read somewhere that changing the WLAN channel might help. So I switched my router from automatic channel selection to channel 5 (before, it was using 11), and since then, the WLAN connection seems to be fine. (Knock on wood – it’s only been 30 minutes since I changed it.)

So that’s it – switching the channel should have been the first thing to try when I experienced connectivity problems. However, it didn’t occur to me because the network strength indicator always showed a strong signal. Only with ping was I able to see the packet loss despite the good signal.

It’s such a relief to have a snappy network connection again. Especially as I didn’t have high hopes for finding a solution by myself, as I know next to nothing about optimizing WLAN networks.


Mac Mini G4 Homeserver With Ubuntu Linux 10.04, WPA2

I finally got WPA2 to work on my old Mac Mini G4, which is running the Ubuntu Linux 10.04 server edition for PowerPC. (Update: WLAN worked for a while, but it seems to be very unstable. Could be the location where it sits, or the software – Update 2: running it at another location, it seems to work fine and stable.)

I had wanted to use the old Mini as a home server for a long time, but my girlfriend had complained about the (faint) noise it makes. Without a wireless connection, I had to place it next to the router, which in turn is placed next to her room.

Now, with wireless, I put it on the fridge in the kitchen, which is already quite noisy. Unfortunately, my girlfriend still complains. But I hope she’ll either get used to it, or I can still find a better place. With WLAN, there are more options.

Since I installed Ubuntu Server and Samba for serving Windows shares months ago, I have forgotten the steps and can not describe them now. I remember that the Ubuntu installation was really simple. Also, I had apparently already configured the driver for the Mini’s wireless card, so I am not sure how I got that working. For a long time I was unable to get wireless to actually work, especially not with WPA_SUPPLICANT providing access to my WPA2-encrypted network.

Hopefully, information on installing the correct drivers for the Mac Mini wireless card can be found reasonably easily, as I can not retrace the steps anymore. My Mini required the b43 driver, which requires downloading the firmware by installing the b43-fwcutter package (sudo aptitude install b43-fwcutter).

So, assuming your driver is working: I eventually found a WPAHowTo for an old version of Ubuntu that describes most of the steps for configuring WPA2 (I only read the WPA_SUPPLICANT parts of that HowTo). All the newer how-tos seem to assume a graphical user interface and only describe how to use network-manager.

All instructions say to shut down eth0 before trying to start wlan0 (that’s what the interfaces are called on my system). So I grudgingly connected the Mini to a monitor and a keyboard again to complete the configuration. I also tested the WLAN without encryption, which worked.
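With the classic ifupdown tools that Ubuntu Server uses, that switch looks like this (interface names as on my system):

sudo ifdown eth0   # shut down the wired interface
sudo ifup wlan0    # bring up the wireless interface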

Next install wpa_supplicant if not already done (sudo aptitude install wpasupplicant).

My WLAN uses a preshared key, so I used wpa_passphrase to generate the basis for a config file:

wpa_passphrase NetworkEssid passphrase

(replace NetworkEssid and passphrase according to your network’s settings).

This resulted in output like:

network={
ssid="NetworkEssid"
#psk="TextPassphrase"
psk=somerandomnumbersandletters
}

Then create or edit /etc/wpa_supplicant.conf (on my system the file did not exist yet). Since it needs the output of wpa_passphrase, I actually piped the output of wpa_passphrase into a file and copied that to /etc/wpa_supplicant.conf (somehow piping there directly didn’t work). (All operations in /etc require root privileges, so sudo accordingly.) Also, change the owner and group of the conf file back to root, in case it became owned by your “normal” user through copying or creating it (chgrp root thefile and chown root thefile).
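Incidentally, the direct pipe probably failed because the shell performs the redirect with normal user privileges, even when the command itself runs under sudo. The usual workaround is to pipe through sudo tee (ESSID and passphrase are placeholders, as above):

wpa_passphrase NetworkEssid passphrase | sudo tee /etc/wpa_supplicant.conf > /dev/null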

After some searching around, I found an example wpa_supplicant.conf for a WPA2 WLAN network using a preshared key and CCMP/AES encryption here. They say they need a weird “double configuration” for it to work, but it actually also worked for me when I removed the TKIP stuff. So my final wpa_supplicant.conf file looks like this:

network={
ssid="dummy"
proto=WPA2
key_mgmt=WPA-PSK
pairwise=CCMP
group=CCMP
#psk="dummydummy"
psk=somerandomnumbersandletters
}

(Except of course with other values for dummy and psk.) It is probably safe to delete the line with the clear text password, too.

Now the instructions from the WPAHowTo said to test wpa_supplicant like this (already with my parameters, not the ones from the HowTo):

sudo wpa_supplicant -iwlan0 -c/etc/wpa_supplicant.conf -Dwext

(I omitted the -w parameter from the HowTo; it doesn’t seem to exist anymore.)

The problem here was the -D parameter, as this is where you are supposed to state the correct driver (also, apparently, your WLAN device might have a name other than wlan0). On the homepage of the Linux driver for the b43 I found the information that “wext” should be used with wpa_supplicant.

So, again following the old HowTo, I put the following into my /etc/network/interfaces:

auto lo
iface lo inet loopback
address 127.0.0.1
netmask 255.0.0.0

#auto eth0
iface eth0 inet dhcp

auto wlan0
iface wlan0 inet dhcp
wpa-driver wext
wpa-conf /etc/wpa_supplicant.conf

Now when I power up the server, it automatically connects to my WPA2-encrypted WLAN. I was very happy about this and put the server on the fridge. However, while I started writing up this summary, I experienced several connection losses. Now I feel like giving up on connecting the server via WLAN and trying one of those powerline networking things instead, which enable networking through the electric power lines (like this one). On the other hand, I just moved the Mac Mini server into my room to connect it to the monitor again, and here the WLAN seems to work fine. So maybe it was just the location on top of the fridge that doesn’t work – it is closer to the router, but at another angle and with different walls in between. I might yet try some other places for the server.

One candidate might be the bathroom, but I am a bit worried about the occasional high humidity.

There are yet more issues to be solved before my home server is ready: backups, what kind of file systems to share, an iTunes server (?) or something else.

One small thing I haven’t found a solution for yet: it would be nice if the Mini would shut down when I press the power button. At the moment, the power button simply seems to be ignored by the Ubuntu Server installation. I could not find a solution for this yet – on the net there are some instructions for making the power button initiate the shutdown sequence, but they are all for normal PCs, not for PowerPC Minis. If anybody knows of a solution, please let me know!

That way, my girlfriend could also power the server on and off without having to learn about ssh and Linux shells, and it wouldn’t have to run all the time.


Why sending e-cards is rather rude

Valentine’s day is upon us, and like on many other special days throughout the year, a pesky aspect of the internet rears its head again: e-cards.

For one thing, spammers jump on the opportunity and send around intriguing e-card announcements of the form “somebody sent you an e-card” or “n people want to meet you on network x”. It’s a good idea to never click on such links in emails, as most likely they’ll just lead to a scam.

So that is one reason not to send e-cards: it endangers the recipients, because they get used to clicking on e-card links, which might be hazardous. The better e-card sites are capable of including the sender’s name in the email (“xyz sent you an e-card”), but even that is no guarantee that the e-card is legitimate. It is easy to harvest the names of a person’s friends from various social networks (Facebook and others) and simply pretend to send e-cards in their name.

One way to at least send a reasonably safe e-card would be to use a very well known and trustworthy company, like Yahoo, Google or Amazon (and that list is probably complete; I can’t think of any other sufficiently well known sites). But even then it is still rude: by sending somebody an e-card, you are giving away that person’s private information, namely their email address. Chances are high that in addition to your lovingly selected e-card, that person will also receive a never-ending stream of spam mails for the rest of their life.

If you absolutely want to send a greeting card electronically, just attach a regular JPEG image to an email sent from your normal email program. Otherwise, why not invest in a real postcard? I know there are a lot of funny Flash movies available for special occasions, but really, JPEG is preferable. The chances of security holes in JPEG viewers are very small (although not zero), and the same can not be said for most other formats like Flash or PowerPoint.

That said, I admit that the occasional e-card from a friend has given me a warm fuzzy feeling. I don’t really expect most people to understand the issues with e-cards, or with giving my email address to a 3rd party. But I’d prefer it if they did, hence this article.


Exchanging a hard drive with Ubuntu LiveCD and gparted, disk image created with dd

The S.M.A.R.T. monitor has been warning about imminent failure of the hard drive in my mother’s laptop, so it was time to exchange it. Since it is running Ubuntu Linux (version 9.10, Karmic Koala), I was looking for ways to create an image of the old hard disk and transfer it to the new hard disk.

My initial googling didn’t immediately yield definite results, even though I found some comments mentioning “dd”. Therefore I wanted to quickly summarize the steps I have taken in case anybody else looks for something similar.

While I found forum threads recommending a variety of tools, they were usually several years old, so I wasn’t sure whether the recommended tools are still state of the art. Also, I preferred a disk image over using the recovery mechanisms of the backup software (sbackup or rsync), as I wasn’t 100% sure that permissions and everything would work out OK on a fresh install of Ubuntu (probably, but a disk image just seemed cleaner).

Then I found this blog article about copying a disk with dd and decided to stick with that. Unlike in that article, since I didn’t have a way to connect the new hd without installing it in the notebook, I first copied the image to an external hd. Then I exchanged the internal disk and copied the image back onto the new internal disk.

To do the copying, first boot the notebook from the Ubuntu CD (“Ubuntu LiveCD”), to run Ubuntu from the CD and not from the internal hd. That way, the contents of the hd don’t change during the copy process. Booted into the Ubuntu LiveCD, I quickly changed the keyboard layout in the Settings -> Keyboard menu (it defaults to the US layout, but I have German). Then I mounted the external USB disk by selecting it in “Places” (or clicking on it in Nautilus, the Ubuntu file explorer).

Then open a shell, and create an image of the internal hd by executing

sudo dd if=/dev/sda of=/media/name_of_external_disk/image_name

The sudo might be optional; in my case I needed it because the external hd was only writable for root. If the external hd doesn’t have a name yet, you can assign one with GParted or Disk Utility (I forgot which).

This might take a while, depending on the size of the internal hd. The resulting image will be as big as the capacity of the internal hd: dd copies the whole disk, no matter how much of it is actually used. Also, dd does not give any progress reports, so just be patient.
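One trick I did not know at the time: GNU dd prints its statistics so far when it receives the USR1 signal, so you can coax a progress report out of it from a second shell:

# in a second terminal, ask the running dd for a progress report:
sudo kill -USR1 $(pgrep '^dd$')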

As the article I linked to mentions, it might be a good idea to check with sudo fdisk -l /dev/sda that /dev/sda really is the right hd (I recognized it by its size).

Now power down and exchange the internal hd, then boot up with the Ubuntu LiveCD again. Again, change the keyboard layout (if necessary) and mount the external hd. (I actually rebooted once because the first time there was a hiccup mounting the external hd. After the reboot it worked.)

Then write the disk image back using

sudo dd if=/media/name_of_external_disk/image_name of=/dev/sda

(again, checking that sda is the right target with fdisk might be good).

If the new disk is the same size as the old one, that’s it. Otherwise, the partitions on the hd can be resized with GParted to make them use the whole disk. GParted can be started from the Ubuntu Administration Menu.

I had only one problem: the hd had a “normal” partition containing the main file system, followed by an extended partition that contained the swap partition. Somehow I couldn’t move the extended partition or the swap partition, and therefore I could not resize the main partition either. Eventually I figured out that I should first resize the extended partition to fill all the remaining space. Then I could move the swap partition (which is inside the extended partition) to the end of the available space. That done, I resized the extended partition again to only be as big as the swap partition. After that I could finally resize the main partition to use all the remaining space (OK, except for 8MB that were left over because of alignment with the hd’s “cylinders”; not sure if that was necessary or not). Before resizing/moving the swap partition it might be necessary to select “swapoff” for that partition in GParted, if the Ubuntu Live system has decided to use that swap space.

That’s it – again a scarily long text to describe a simple procedure.

A downside might be that this copies the whole disk, not just the used parts. Also, there has to be enough space left on the external disk. Not sure if copying less could be achieved with some dd magic. I am pretty sure one could copy individual partitions with dd, but I am not sure how one would then copy the disk’s partition table, master boot record and whatnot.
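For what it’s worth, one approach I have read about but not tried: dd the individual partitions, and separately save the first 512 bytes of the disk, which hold the master boot record including the primary partition table (extended partitions keep extra metadata elsewhere on the disk, so this sketch only covers the simple case):

# copy a single partition instead of the whole disk:
sudo dd if=/dev/sda1 of=/media/name_of_external_disk/sda1.img
# save the MBR (boot code + primary partition table), i.e. the first 512 bytes:
sudo dd if=/dev/sda of=/media/name_of_external_disk/mbr.img bs=512 count=1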


APT, the app store for geeks

Whether they loved it or hated it, all reviewers of the iPad agreed that the usability of “normal” PCs for average users is atrocious. And I have to agree: whenever I take a look at the PC of a friend or relative who is not a “computer freak”, it is riddled with spyware and malware, or at least laden with useless software that drains the user’s time and energy (examples are “toolbars” like the Google Toolbar or the Yahoo Toolbar). This is not only because they might have downloaded or installed bad software from questionable sources, but because even vendors and seemingly trustworthy businesses have no qualms about selling out their customers. Usually a new PC is already messed up by the software the vendor has preinstalled. If not that, then the new gadget (camera, navigation system, whatever) might come with crappy software.

But I don’t want to rant about the various ways today’s PC software and hardware vendors mess up the PC experience. The point is, the iPad has been hailed by many reviewers as the savior from this hell of malware and overly complicated software. What I want to mention is that the “geeks” (computer-savvy people) have actually been aware of this problem for a long time, and they invented a solution long before Apple’s App Store. It is called APT.

APT is a front end to the package managers of some Linux distributions, most notably Debian and its derivative Ubuntu. By using it you can install software from a trusted repository of open source applications (trusted because it is open to peer review). It is not the only way to install software on these Linux systems, but usually if you opt to install software from another source, you end up feeling slightly icky and dirty, as you should.

To avoid icky spyware, malware and so on, just stick to the official repositories of your Linux distribution. It is as simple as that – no debilitating iPad required.

Now I have to go ahead and admit that I am not even that well versed in Linux and apt. I know how to find, install and remove programs, and some other internals that are not really important. But isn’t that kind of the point: you can use Linux and apt even if you are NOT a “computer freak”. There are simple front ends that enable you to use it without touching the command line. The main difference to Apple’s App Store is that it is still open – using apt is entirely optional, but recommended.
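And even the command line part is small; the basics boil down to three commands (the package name is just an example):

apt-cache search photo editor   # find packages in the repositories
sudo apt-get install gimp       # install a program
sudo apt-get remove gimp        # remove it again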

Of course, things on Linux don’t always run as smoothly as on a Mac (although I have a whole lot of things to complain about with Macs, too). Not all the software in the repositories is very polished or even bug-free. But neither is the software in the App Store.

As for stability, it helps to look at the hardware Apple has on offer: essentially they only sell three or four different kinds of computers (a laptop – which includes the iMac and the Mac Mini, as they are also based on laptop internals – the Mac Pro, and the iPhone/iPad). Most Linux distributions try to support a far wider range of hardware and are therefore less optimized for any specific piece of hardware. But it would be possible even today to launch computers with a Linux distribution optimized just for those machines. They should have no trouble achieving adequate stability.

Anyway, maybe you get the idea, maybe you don’t; all I want to say is this: the App Store model is NOT our only salvation.
