State of Linux Address
Five years ago, the Linux operating system was really a pain to use. It looked and felt "ugly" and inconsistent, but more importantly, it was often very difficult and time-consuming to install new software. You might be willing to put up with an ugly interface, but you’re probably not willing to use a system that eats an entire weekend every time you want to install something.
The heart of the problem is dependencies. Almost every program you install will depend on other programs/libraries being installed first, because your new program needs to use those libraries. So you try to install your new program, and the first thing it does is check whether your system has the dependencies it needs. If not, it says "couldn’t find xyz" and the installation stops; you then need to find and download and install xyz. Any given program might have 3 or 4 or 10 or 20 dependencies, and each of those has its own dependencies which your system may or may not already have.
To make matters worse, a dependency is usually for a certain version of a program. You might have version 1.2 of package foobar installed, but your new package requires foobar version 1.4. So you install the new version of foobar. But then it turns out that some other program you’ve got, which also depends on foobar, doesn’t work properly with version 1.4. Now you need to upgrade that other package to (hopefully) make it work again, and possibly upgrade some of its dependencies too.
This turns into a frustrating nightmare very quickly, and it was not the exception, it was the rule. And the chances of hitting some error while installing your program are much greater when you’re installing 5 or 10 different packages. Plus, installing any given package requires at least 5 steps (download, uncompress/unzip, then running the commands configure, make, and make install), so installing more than a couple packages is a pain even if you don’t run into any errors.
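Those 5 steps look something like this in practice. The package name foobar-1.4 is made up, and the tarball is faked up locally here so that the whole sequence can actually be run end to end:

```shell
# Fake up a tiny source tarball so the classic five steps can be demonstrated.
# ("foobar-1.4" is a made-up package name.)
mkdir -p /tmp/demo/foobar-1.4
cd /tmp/demo
printf '#!/bin/sh\necho "checking dependencies... ok"\n' > foobar-1.4/configure
chmod +x foobar-1.4/configure
printf 'all:\n\techo built > foobar\ninstall:\n\tcp foobar /tmp/demo/installed-foobar\n' > foobar-1.4/Makefile
tar -czf foobar-1.4.tar.gz foobar-1.4   # stand-in for step 1: download

tar -xzf foobar-1.4.tar.gz              # step 2: uncompress/unzip
cd foobar-1.4
./configure                             # step 3: check dependencies, configure for this system
make                                    # step 4: compile
make install                            # step 5: install
```

If `configure` can’t find a dependency, the process stops right there, and you start the whole five-step dance over again for the missing package.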
Red Hat Linux attempted to make things easier with its RPM package management system. Once you’d downloaded the RPM file, a single command (rpm -i packagename.rpm) replaced the last 4 steps of an installation. But there isn’t always an RPM available for the package you want to install, and in any case RPM doesn’t solve the dependency problem: it will tell you what’s missing, but it won’t go get it for you.
Debian Linux attempted to address all 5 steps with their APT package management system. With a single command (something like apt-get install packagename) the system would locate, download, unpack, and install the program. Even more importantly, it would first determine which dependencies your program has, and download and install those too. If you’ve spent any amount of time administering a Linux system, you understand that this idea is nothing short of miraculous.
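In practice, installing a program on Debian looks something like this (mutt is just an example package name here; the commands need root and a network connection, so treat this as a transcript rather than something to paste blindly):

```shell
# Refresh the local package index, then install a program.
# APT resolves and installs every dependency automatically.
# ("mutt" is just an example package name; run as root.)
apt-get update
apt-get install mutt
```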
I spent a month last year using Debian, but found the APT system to be less than ideal. (In fairness, given more time I might have come to understand and use the system properly, which might have eliminated my reservations about it.) It’s not actually as simple as apt-get install whatever. Debian maintains three different branches at any given time: stable, testing, and unstable. The packages in "stable" are outdated, sometimes by a year. Testing packages are newer, but usually at least a month or two old. Unstable packages can be up-to-date, but they’re also the most likely to have bugs that prevent them from installing or running.
All that is confusing and annoying, but if you could mix packages from the different branches, it would be ok. You can’t do that cleanly: if you try, you’re likely to break your whole system. If you want to upgrade from stable to testing or unstable, you have to do it all at once, which could take hours or days. And eventually unstable becomes testing, and testing becomes stable, so you will have to go through this on a regular (albeit infrequent) basis. Basically, a Debian system is either stable, testing, or unstable, and it can only use packages that are available within the matching branch. And as I said, stable is usually very outdated.
Aside from those conceptual problems, I frequently ran into practical problems just trying to use the APT system at all. In the process of installing packages, there would be compilation errors, or configuration problems, or other problems typical of the manual configure/make/make install process. The concept is great, but if it doesn’t work, it’s no better than the manual method.
Enter Gentoo. Gentoo Linux is based on the same idea as Debian -- a package management system that resolves dependencies intelligently. In Gentoo, this system is called Portage, and you use the emerge command to access it. With the command emerge packagename the system locates, downloads, unpacks, compiles, and installs your package, and it also installs any required dependencies. The difference (in my experience) is that Gentoo does this a lot better than Debian does. In Gentoo, there aren’t separate package trees for "stable" or "unstable" (etc) versions; there’s just one package repository. When new packages (or new versions of packages) are introduced, they are "masked" meaning they’re undergoing testing. It’s a similar idea to Debian, but it’s implemented much more simply. To install a masked package, you don’t have to convert your whole system to a "testing" distribution; instead, you just set a special variable when running the emerge command: ACCEPT_KEYWORDS="~x86" emerge packagename will install your masked package.
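Collected in one place, the commands above look like this (xmms is just an example package name; emerge has to be run as root on an actual Gentoo system, so this is a transcript rather than something to try on any machine):

```shell
# Install a package plus all of its dependencies with Portage.
# ("xmms" is just an example package name; run as root on Gentoo.)
emerge xmms

# Install a masked (still-in-testing) package by accepting the ~x86
# keyword for this one command only, without converting the whole system:
ACCEPT_KEYWORDS="~x86" emerge xmms
```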
I said that I often ran into errors using Debian’s APT system. In contrast, Portage has "just worked" for me consistently. I’ve probably installed 200 packages including dependencies, and have only had problems with 4 of them. And they weren’t difficult, frustrating problems that took forever to solve -- manually installing the packages solved the problems, so that probably means whoever created the Portage version of the package just messed something up.
With the advent of such intelligent package management systems as Gentoo’s, I think the last significant problem with Linux on the desktop has been solved. The two other main problems were that the graphical interfaces available for Linux were ugly and hard to use, and that there weren’t as many/as good programs available for Linux compared to Windows.
The GUI problem has been largely solved in the past year or two; a modern Gnome or KDE Linux desktop is just as nice-looking and easy to use as a Windows desktop (and can be configured to look exactly like Windows, if you’re really addicted). The lack-of-programs problem is solved depending on what you need, with a few exceptions. For web browsing, email, instant messaging, office apps (word processing, spreadsheets, etc), watching movies, listening to music, image editing, and every other mainstream usage, there are (usually multiple) Linux programs that are as good as or better than their Windows counterparts. I do all that on my system, plus watch TV with my TV-tuner card, and record music with a free multitrack hard-disk recorder program (ardour). There’s literally nothing I want to do that I can’t do in Linux, and that’s saying something since I’m a geek and use my computer for everything. The few apps that I hear people really complain about being missing are the Intuit apps (Quicken, TurboTax), and those you can run in Linux with Crossover Office (which I think costs $50).
Back to the topic at hand, systems like Portage solve the final important problem with Linux -- the fact that installing programs used to be a nightmare. It’s even arguable that this is now easier on Linux than on Windows, since with a single command, your programs are downloaded and installed for you. The initial installation of the operating system is still somewhat difficult, but the average user never does an OS install anyway; the OS comes pre-installed, or they have a friend/shop do it, because the average user would have trouble even with a Windows OS install.
Switching from defense to offense, what are the strengths of Linux?
♠ price. The OS and all applications are free. (Well, there are programs that you can pay for, but I’ve never needed a program that wasn’t free.)
♠ security. Since programs need special privileges to modify or delete system files, viruses and social engineering rarely cause damage on Linux systems. And since most programs are open-source and actively developed by multiple people, security bugs are fixed very quickly.
♠ community. Again, most programs are open-source and actively developed, and have mailing lists that you can read and ask questions on. If you need help or want to request a feature, mailing lists are your best friend.
♠ remote access. With simple secure tools (ssh and scp) that are standard on virtually every Linux distribution, you can log in to your system from anywhere, run commands, transfer files, and generally do anything you’d normally do if you were sitting in front of it. With vnc and x11vnc, you can even use your GUI (mouse and all) remotely, similar to the Remote Desktop feature in Windows XP (except that it’s been around for about 10 years on Linux).
♠ backup and system transfer/copy. With a single simple command (rsync -a --delete / /mnt/backup) you can make an exact copy of your system to another disk (or partition or directory). Do it regularly and keep it as a backup, or do it to move your system to a bigger disk, or just to clone a system.
There are lots of others, but those are some of the standouts.
In conclusion... I don’t actually have a conclusion, but this post got to be really big, so I feel like a conclusion is in order. I conclude that if you know a geek who can help you get past the initial installation, you should run Linux instead of Windows. The end.