Wednesday 30 August 2006

Transcript of a talk with Richard M. Stallman on the sidelines of the 4th International GPLv3 Conference held at Bangalore

Various governments in the developing world are slowly but surely waking up to the advantages that free software can provide, both monetary and otherwise. One example of this new way of thinking is the Kerala state government's decision to use Linux in all the government-run schools in the state. In fact, this trend is so prominent that even national dailies have started dedicating valuable column space to covering the latest goings-on in the free software community.

One example of such a leading Indian English language newspaper is "The Hindu". This Indian national newspaper boasts a readership of over 4.5 million and has cultivated an image of providing balanced and unbiased news. It recently concluded a series of surveys about the social aspects and aspirations of the Indian populace in association with the CNN-IBN television channel (the Indian subsidiary of CNN). I am an avid reader of this leading Indian English daily.
Today, when I came across the interview with Richard M. Stallman in The Hindu, it left me with a warm and fuzzy feeling. So without further ado, and completely pleading ignorance about any copyright issues, here is the transcript of the interview with Richard M. Stallman as published in The Hindu newspaper dated August 31st 2006.

Transcript of the interview with RMS
Richard Stallman is famous for his brushes with authority. In the 1970s at the Massachusetts Institute of Technology Artificial Intelligence Laboratory, he reset the system passwords to null strings because he didn't like restricted computer access. He went on to found the free software movement and the GNU Project, which saw him author the General Public Licence (GPL) that defined the four basic rights of computer users. A million programmers now reputedly contribute to free software. And now, after 15 years, the GPL is ready for its next version, GPLv3. Stallman has drafted it with legal counsel from Eben Moglen, professor of law and history of law at Columbia University. Excerpts from an interview with Anand Sankar on the sidelines of the 4th International GPLv3 Conference held at Bangalore.

Q. What are the main changes in GPLv3 and why are they necessary?

RMS: Various changes are proposed for various reasons, so there is no general reason. There are different kinds of reasons. First of all, some changes have to do with fighting against software patents. GPLv3 has an explicit patent licence and a limited kind of patent retaliation. Consider: Company A is running a version of a GPL-covered program on which it has made improvements, and it gets a software patent on the technique those improvements feature. Then, if someone else, B, makes similar improvements on that program, A can't sue B. If they do, they lose the right to continue maintaining that program. So it is a way in which we can prevent treachery to the community.

We have made the results of the GPL as uniform as possible, independent of national copyright law variations. We have defined two new terms, propagate and convey, instead of copy and distribute.

Q. The new version takes DRM (Digital Rights Management) head-on. India might also go ahead with an amendment to the Copyright Act of 1957. What consequences will it have?

RMS: The media companies are trying to take total power over the public. They want to publish books, movies and music in formats that are encrypted and that are designed for the sole purpose of controlling the public. So their idea is that nobody should be allowed to or able to make a player, except with their approval.

The companies want to modify GPL-covered free software to restrict the user. The next thing they want to do is ensure we users cannot change the program again.

And this brings them in direct conflict with GNU GPL, which says you are free to change the program and redistribute it and the next person too has freedom to make the changes he wants to make.

Q. Do you feel that the Kerala (an Indian state) Government's decision to start using free software in schools is something that the rest of the country will follow?

RMS: If you teach students to use proprietary software you are teaching them to be helplessly dependent on a particular company. And that is not good for society as a whole. So, the schools should not do it. What the Kerala government is doing is the right thing and all other states in India should be doing this.

Q. There are a lot of misconceptions about free software. What kind of an economic model does an entrepreneur look at when he starts out with free software?

RMS: I want to ask you why that question is worth asking. First of all, there are many people who don't have to make money. More importantly, even if a person has to make a living, he doesn't have to make a living from everything he does.

Lots of people develop free software in their free time and there are people who have to make a living and they do make a living.

To jump from "this person is not rich and therefore has to work" to "this person can't write free software because he is not paid to write it" is an error.

There are over a million contributors to free software; a substantial fraction are getting paid, and the majority are volunteers.

I suspect the reason people bring up this question of economics as a secondary detail is that they are labouring under the misconception that the free software community is impossible unless the developers are getting paid.

Q. Catering to local needs is a stated goal of free software but GPL itself has not been officially translated into local languages. Your comments.

RMS: We are trying to write the text in such a way that its results are as uniform as possible in all countries. And for the same reason, uniformity of results, we are not translating it. Every translation would be an opportunity to make a mistake. And any mistake could be a disaster. Free software must be written in English and the reason is it is the language understood by programmers around the world. Obviously to have that in other languages is a good thing. So, we have encouraged others to publish translations that are clearly marked unofficial and we link to them from our site. You can use it as a guide if you don't read English.

For more information on the GNU project visit gnu.org.

Update (2nd September 2006): A reader of this blog has kindly pointed out the online version of the interview (which I couldn't find earlier, hence this transcript) as well as another interview with RMS carried in the Financial Express (an Indian newspaper).
Ciaran O'Riordan points to the transcript of the entire presentation (not the interview) given by Richard M. Stallman at the 4th International GPLv3 Conference held on 23rd August at Bangalore.

Sunday 27 August 2006

A Tryst with Debian Etch Beta 3

When a Linux enthusiast hears the name Debian, it never fails to instill in him some awe and respect. After all, this is the one and only not-for-profit Linux distribution which has single-handedly built up a name synonymous with security, stability and freedom.

I recently downloaded the latest offering of Debian, which goes by the name Debian Etch Beta 3. Ironically, I came across the correct link through a comment inserted in a prominent news site (I will come back to that later). There are 19 ISO images to download if you want all the packages, which run to more than 17,000, along with their sources. But Debian also provides a net install CD image roughly 140 MB in size, which is the preferred option for people who have a very fast Internet connection. With it, it is possible to install the distribution entirely over the net.

Since I was severely constrained in the Internet speed department, I decided to download just the first CD image of 643 MB, praying that it would be enough to install a full-fledged desktop environment. After a couple of hours, I had the ISO image successfully downloaded and burned onto a CD, and I was ready to install Debian on my machine.
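
Incidentally, it is a good habit to verify the downloaded image against the MD5SUMS file published alongside it on the mirror before burning. A minimal sketch - the image file name here is hypothetical (yours will differ), and the burner device name is an assumption (use cdrecord -scanbus to find yours):
Compute the checksum and compare it by eye against the corresponding entry in MD5SUMS
$ md5sum debian-testing-i386-binary-1.iso
$ grep binary-1 MD5SUMS
Burn the verified image to CD
$ cdrecord -v dev=/dev/cdrom debian-testing-i386-binary-1.iso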

The main reason for trying out this version of Debian was to check out the new GUI installer, which is considered a huge improvement over the previous versions. I booted my PC using the CD and was shown a boot prompt. Here I had the choice of installing using either the text based installer or the new graphical installer. It is also possible to install the distribution in either expert mode or normal mode. If you opt for the normal mode, the installer makes most of the decisions for you, at the cost of your relinquishing fine-grained control. Since I like having more control over the installation process, I opted for the expert GUI mode and diligently typed the command:
boot: expertgui
In fact, the installer provides a plethora of choices which one can learn about by navigating using the function keys F1 to F8 prior to entering the command.

Fig: New GUI installer of Debian

In a short time, I was face to face with the new GUI installer, which has a clean, spartan look to it. The installer itself has only three buttons, namely "Continue", "Back" and "Screenshot". The "Screenshot" button grabs the current screen and saves it as a PNG image in /var/log/installer/, which can be accessed after the installation is complete. This imparts a nice touch to the whole process, as it can later aid in providing a visual walk-through of the installation steps to a first time user of Linux.

Debian Installation Walkthrough
The installation steps I had to go through are as follows:
  1. Choose language
    • Choose a country or region
    • Choose locale
  2. Select a keyboard layout
    • Type of keyboard
    • Keymap to use
  3. Detect & mount the CD-ROM
    • Modules to load - like usb-storage, floppy and so on.
    • Start PCMCIA? Yes/No
  4. Load installer components from CD
  5. Detect network hardware - automatic detection
  6. Configure the network
    • DHCP or static IP
    • Set hostname
    • Set domain name
  7. Choose a mirror of the Debian archive
    • Choose the country
    • Choose ftp location
    • Enter any HTTP proxy information
  8. Detect disks - Automatic detection
  9. Partition Disks - Would be nice to have a help button here for first time users. For others the whole process is intuitive.
  10. Configure time zone
  11. Configure clock
  12. Setup users and passwords
    • Enable shadow passwords? Yes/No
    • Allow login as root? Yes/No
    • Set root password
    • Create normal user account
  13. Install the base system
    • Select the kernel image - There are 20 images including those with SMP support for 2.4 and 2.6 kernels.
    • Select tools used to generate boot initrd
  14. Configure the package manager
    • Should the network mirror be chosen? Yes/No
    • Use non-free software? Yes/No - Really interesting!
  15. Select and install the software
    • Participate in the Debian popularity contest? Yes/No - Cool!
    • Select your choice of packages
      • Desktop environment
      • Web Server
      • Print Server
      • DNS Server
      • File Server
      • Mail Server
      • SQL Server
      • Laptop
      • Standard System
  16. Configure Exim v4 - Mail Transport Agent
  17. Choose dictionaries to use (US, GB...)
  18. Install Grub/Lilo Boot loader or continue without boot loader
  19. Finish the installation
Even though there are a lot of steps involved in the expert mode of installation (the number of steps is far smaller if you choose the normal mode), each of them is easy to understand. I found the installer to provide fine-grained control over which modules to enable.

Usually, most Linux distributions enable all the available modules even though many of them are not needed for the particular machine's hardware. But Debian allows one to enable just those modules that are suited to the machine. For example, my machine has no infrared or wireless hardware, so I can choose not to load the kernel modules related to these.

Another aspect which endeared me to the Debian way of installing Linux was the sheer number of kernel images available to choose from. The installer provided a choice of no fewer than 20 kernel images, from the 2.4 kernel fit for installing on a 486 machine to the 2.6 kernel with SMP support for Pentium class x86 machines.

Fig: Partitioning the hard disk

The GRUB boot loader also correctly detected the Windows XP and Ubuntu 6.06 OSes residing on the other partitions of my machine, though it failed to detect the FreeBSD OS. But then, none of the other Linux distributions I have installed till now has correctly detected the FreeBSD OS while installing the boot loader.
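
The workaround is to add an entry for FreeBSD to /boot/grub/menu.lst by hand after the installation. A minimal sketch, assuming - purely for illustration - that FreeBSD lives on the third primary partition of the first disk; adjust the (hd0,2) part to your own layout:
title FreeBSD
rootnoverify (hd0,2)
chainloader +1
GRUB then simply hands control over to the boot loader inside the FreeBSD slice, which boots the OS as usual.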

If one has a net connection, I recommend choosing the standard system install (when selecting the software), which installs a base system sans X server. Then it is only a matter of installing just the software you need using apt-get, as sketched below. I installed the standard system and then, with an additional 150 MB download, I was able to set up a desktop system with an editor (GVim 7.0), a web browser (Firefox 1.5.0), the lightweight Xfce 4 desktop (a very good alternative to the more common heavyweights GNOME and KDE) and two graphics applications, GIMP 2.2 and Inkscape 0.44. It is clear that the Debian team has upgraded the Etch repositories to mirror the most recent versions of the software. For example, the Inkscape build was compiled as recently as July 2006.
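
For the curious, the sequence boils down to refreshing the package lists and pulling in the desired packages as root. A rough sketch - the package names are from memory and may differ slightly in your release:
# apt-get update
# apt-get install xorg xfce4 firefox vim-gtk gimp inkscape
apt-get resolves and downloads all the dependencies automatically, which is where the 150 MB figure above comes from.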

And after all this, my Debian partition utilized only 1.1 GB of space, which includes the space used by apt-get for storing the packages it downloaded during installation. I have started liking the Debian way of installing only what is necessary, which does away with a lot of bloat. And I am sure this will be the preferred way of installing Linux in the future, when the majority of people have access to high speed Internet.

I found the new graphical installer similar to the text installer of FreeBSD in that it is possible to jump back and forth between the different categories of the main menu. And once a particular task is finished, you are placed back in the main menu.

Bootup times of Debian
The boot-up times are significantly faster than those of any of the other Linux distributions I have used, barring, say, Vector Linux or Gentoo, which are equally quick to boot up. I dare say I found Debian Etch to boot up quicker than Ubuntu :).

The Pros of Debian Etch beta 3
  • Comes with an easy to navigate graphical installer
  • Has a choice of 20 Linux kernels suitable for a wide variety of x86 machines.
  • Gives fine grained control over the installation with options to enable/disable specific modules.
  • Installs the latest versions of software. A real surprise!!
  • Quick boot up times.
  • Stable and secure - you get the same unadulterated operating system which powers many of the high traffic servers around the world.
  • Support for setting up encrypted partitions during installation.
The Cons of Debian Etch beta 3
  • Internet access is mandatory to install Debian unless you are willing to shell out money to buy a set of CDs.
  • No out-of-the-box support for Flash, Sun's Java, proprietary audio formats, closed source graphics drivers and so on. These have to be installed by the users themselves.
Now for some rants ... ;-)
I had earlier noted that I got hold of the link to Debian Etch Beta 3 from a prominent news site, where a reader had graciously included it in a comment. Would I have succeeded in downloading the correct ISO (in my case, the most recent one) if I had tried looking on the Debian website? In all probability, I would have ended up downloading the wrong version of Debian.

Agreed, Debian supports lots of architectures other than x86, and all these need to be given equal representation. But how much effort would it take to provide a download link to the latest version of Debian, simultaneously recommending a specific version for desktop users (even if it is in the beta stage), on the main page of the debian.org site? I would guess not much. The download link provided at present takes the visitor to Debian Sarge, which is too outdated for use as a desktop.

On this note, and with due respect, I feel the Debian team seriously needs to acquire some lessons in the ABCs of marketing. Just because it is a not-for-profit organisation doesn't mean that it has to refrain from marketing itself and rely exclusively on well wishers and enthusiasts to spread the word. It is high time the Debian site was overhauled and made more user friendly (read: less cluttered). For one, a new Linux user will not be able to make head or tail of Sarge, Etch or Sid. But if he is told which ISO image is most suited for a specific purpose, that will go a long way in improving the end-user experience of Debian.

I would love to see a forum hosted on the official site where users can post queries and help each other with Debian-specific problems. If Debian has to regain lost ground on the popularity front, then it has to take desktop users (newbies?) more seriously. While a system administrator is capable of taking care of the problems he faces with Debian himself, an ordinary user will need some direction. And Debian's capability in handling this section of users will decide how popular Debian becomes in the coming years.

Having said all this, I eagerly await the final release of Debian Etch.

Friday 25 August 2006

One Laptop Per Child gets a new name and a slightly higher price tag

From the time of its inception, the One Laptop Per Child project has managed to catch my imagination. It is my belief that a computer can bring out the best in a child and fuel his or her imagination. The only hurdle to providing a computer to every child is the obvious cost factor. And this exciting project spearheaded by MIT could be the cornerstone of a change in the way children see learning.

Ars Technica writes that MIT's One Laptop Per Child project has finally got a name and will be known as the Children's Machine 1 (CM1). But the name is not the only aspect that has changed. The laptop has been scaled up to provide a 400 MHz AMD Geode processor, 128 MB of DRAM, built-in wireless support and 512 MB of flash memory storage. The downside is that it is no longer the $100 laptop project, as the price has climbed slightly to $140.

Those who keep track of this illustrious project will recall that the laptop is available for sale only to governments, and only in minimum orders of one million units. The OLPC team runs a wiki giving all the latest news about the project and the progress it is making towards its goal.

Wednesday 23 August 2006

How to setup a home web server - the Red Hat way

In previous posts, I have explained how to set up the Apache server to serve webpages on one's personal machine. But there I explained how to do it the Debian way. Obviously there are other ways of doing the same thing. For instance, if you put the same query (about setting up the Apache webserver) to a person running Red Hat, the steps he lists will be somewhat different from what I had covered. By now you will realise what I am getting at: there is a Red Hat way of doing things and a Debian way of doing things.

In Debian-based distributions, the Apache webserver uses a modular structure for storing configuration information, with a separate file for each site, which makes it less cluttered; you can see a sketch of this below. Whereas in Red Hat, you have a single flat file called httpd.conf to which you add all the configuration details of each site.
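
To illustrate the Debian scheme: each site gets its own file under /etc/apache2/sites-available/, and enabling a site merely creates a symlink to it in /etc/apache2/sites-enabled/. A minimal sketch, with a hypothetical site name:
Create the configuration file for the site
$ sudo vi /etc/apache2/sites-available/mysite
Enable the site (this creates the symlink) and reload Apache
$ sudo a2ensite mysite
$ sudo /etc/init.d/apache2 reload
Disabling the site later is just a matter of running a2dissite mysite followed by another reload.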

Jeff Goldin has written an excellent article which explains the Red Hat way of setting up a home web server, and it makes a very interesting read. Minor differences apart, the article contains lots of information, like the type of hardware to select, the partition scheme to use, increasing the level of security of the webserver and so on, which I believe are important concepts that can be applied while hosting a web server on other Linux distributions too.

Monday 21 August 2006

Shouldn't this be reason enough to opt for an Open Source development model?

In a traditional software development model, the client asks for one thing and is delivered something else, to say nothing of the galloping costs involved, as is rightly illustrated by the cartoon below.

Fig: Traditional software development model (original link)

Shouldn't it be reason enough to take a long hard look at the development model of open source software instead of sticking to the traditional model of software development with its all too evident fallacies?

Disk Encryption Tools for Linux and benchmark result of a couple of them

Consider this scenario... Your computer running Linux somehow ran into a hardware glitch and had to be hauled to the neighborhood computer service center. You are asked to leave the machine at the service center and come back after a couple of days so that the technician can have a good look at it. But you are a bit worried, because the hard disk contains the blueprints of the most secret project you are currently working on ;-) .

Ever been faced with such a situation, where you have to contend with wringing your hands in despair? This is where the use of an encrypted file system gains prominence. If you had created an encrypted volume on your hard disk and were in the habit of saving all your sensitive data onto it, you could have slept soundly while your computer was being repaired at the service center.

In Linux there are a number of solutions for creating encrypted volumes and encrypting and decrypting data on the fly. Some of them are as follows:

Qryptix - Qryptix consists of a PAM object and utilities for session and key management for encrypted home directories using the International Kernel (CryptoAPI) patches for Linux. It simplifies login/logout, mounting/unmounting, and key generation and changing. Unfortunately, it needs SELinux to work properly. One OS which has SELinux installed is Red Hat/Fedora.

eCryptfs - An enterprise-class cryptographic filesystem for Linux. The kernel module component of eCryptfs is upstream in the -mm tree of the Linux kernel.

TrueCrypt - One of the best and most easily available disk encryption solutions for both the Windows and Linux platforms.

Encfs - EncFS provides an encrypted filesystem in user-space. It runs without any special permissions and uses the FUSE library and Linux kernel module to provide the filesystem interface.

LUKS - LUKS is the upcoming standard for Linux hard disk encryption. By providing a standard on-disk format, it not only facilitates compatibility among distributions but also provides secure management of multiple user passwords. In contrast to existing solutions, LUKS stores all the necessary setup information in the partition header, enabling the user to transport or migrate his data seamlessly.

dm-crypt - Dm-crypt is a device mapper target which provides transparent encryption of block devices using the new Linux 2.6 CryptoAPI. The user can specify one of the symmetric ciphers, a key (of any allowed size) and an IV generation mode, and can then create a new block device in /dev. Writes to this device will be encrypted and reads decrypted. You can mount your filesystem on it as usual, but without the key you can't access your data.

CryptoFS - CryptoFS is an encrypted filesystem for Filesystem in Userspace (FUSE) and the Linux Userland FileSystem (LUFS).
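
To give a feel for the dm-crypt/LUKS combination, here is a minimal sketch using the cryptsetup tool. It assumes a spare partition /dev/sda5 - an assumption, so substitute your own - and note that the format step destroys everything on it:
Set up the LUKS header and passphrase on the partition
$ sudo cryptsetup luksFormat /dev/sda5
Open the encrypted volume under a name of our choosing
$ sudo cryptsetup luksOpen /dev/sda5 secret
Create a filesystem on the mapped device and mount it
$ sudo mkfs.ext3 /dev/mapper/secret
$ sudo mount /dev/mapper/secret /mnt/secret
When done, unmount and close the volume
$ sudo umount /mnt/secret
$ sudo cryptsetup luksClose secret
Anything written to /mnt/secret while the volume is open is transparently encrypted before it touches the disk.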

Justin Korelc and Ed Tittel have done an interesting benchmark of three of the above encryption tools, namely LUKS, EncFS and CryptoFS, and have posted their findings online. Their verdict is that LUKS shines over the other two in the ease of use department, because of better integration with the GNOME desktop and PGP keyring management facilities.

Thursday 17 August 2006

Interview with Fedora Project Leader Max Spevack

Fedora project leader Max Spevack gave an interview to Slashdot, where he answers questions about Fedora and the significant part it will play in the future of Linux. He is asked about the worst aspect of Fedora, enabling NTFS support in future versions of Fedora, comparisons of Fedora with Ubuntu and many other interesting questions, which, surprisingly, he tackles and answers in a very honest and clear manner. And when confronted with the problem Linux faces with closed source (proprietary) drivers, his solution is to vote with your wallet: buy only hardware for which open source drivers are readily available.

One grouse I have about Fedora is its crappy way of updating/installing software. In the past, whenever I tried installing software in Fedora using Yum, I have found it to take triple the time it takes apt-get to install the same software on a Debian-based distro. Of course, it could have been a case of using a repository in the wrong geographic location. But still, Yum cannot be compared with apt-get in the efficient management of packages on one's system. Fortunately, there is a version of apt-get available for Fedora too.

Monday 14 August 2006

Is the word Ubuntu in "Ubuntu Linux" over-hyped? Decide for yourselves.

Recently I read an article lambasting, or rather dissecting, the role that Ubuntu has played in the Linux community. According to the author of the article, titled "Ubuntu vs. Debian: What Canonical Doesn't Want You To Know", it boils down to clever marketing on the part of the Ubuntu folks - in other words, marketing gimmicks. In fact, I see this growing dissent towards the distribution in different quarters - mind you, not from the actual users of Ubuntu, who love the distribution, but from people who are affiliated with or use other Linux distributions.

So, as a long time Ubuntu user, I asked myself some very important questions. Is the word Ubuntu over-hyped? Is Ubuntu Linux riding on a wave of goodwill, over and above what the word actually means? And finally, what makes Ubuntu Linux stand apart from the rest of the Linux distributions?

Ubuntu is a word from the Bantu languages of southern Africa which has many meanings associated with it. And the word originated long before the birth of Linux. When the South African space tourist Mark Shuttleworth adopted it for his Linux distribution, the word came to be synonymous with the distribution, and it gained worldwide popularity.

I believe the word Ubuntu has gained as much popularity from being associated with this Linux distribution as the Linux distribution has gained from the word. In other words it is a symbiotic relationship as far as the Ubuntu word and Ubuntu Linux are concerned.

Even though I have been using Ubuntu as my main Linux distribution, I have used and still use other Linux distributions too. If you look around this site, you will find reviews of Linux distributions as varied as PCLinuxOS, DSL, Vector Linux, Gentoo... why just Linux, I have also reviewed a couple of non-Linux OSes such as FreeBSD and PC-BSD. So my intention in raising this sensitive topic is not so much to throw dirt at other Linux distributions as to see why Ubuntu Linux has gained so much popularity, and whether it is worthy of it.

The number one incentive for me to try out Ubuntu in the first place (earlier I used to be a Fedora user) was that they were shipping (and still do ship) free CDs of Ubuntu Linux to anyone around the world. This at a time when I had to make do entirely with a dial-up connection to log on to the net. Even now, in most third world countries, the term broadband Internet is synonymous with just 256 Kbps speeds, and many people do not even have that luxury. So the vision of Mark Shuttleworth in handing out free CDs of Ubuntu to anybody who asked for them became a major hit worldwide. In fact, by doing so he made Linux free not only as in freedom but free as in beer too.

But handing out free CDs alone would not have brought Ubuntu the popularity it now enjoys. After all, if that were the case, all software (proprietary or otherwise) which we receive free of cost would have been just as popular. So there are more reasons...

Any long time Linux user will agree with me that Debian holds a special place in their hearts. It has more packages in its repository than any other Linux distribution can boast of. It is stable, and its uptime can be compared with that of the other Unices. But one area where the Debian developers lagged behind was in creating a separate Linux distribution - or should I say a separate entity - which the ordinary user could easily install and use on his desktop. Debian was, and is, more server oriented. I had the fortune of installing Debian on my machine a couple of months back. Even though I was able to successfully install, use and enjoy it, I had to struggle a bit to figure out how to install it. The installation is far from newbie friendly. I would go so far as to say I found installing FreeBSD to be much simpler than installing Debian. This is one area the Debian developers have to direct their efforts towards, and I believe something is being done to address this shortcoming. But that is not all: since it is a Linux distribution oriented first and foremost towards running on servers, it comes with a lot of additional packages which a normal user will not have a clue whether to install or not. So someone who needed a newbie friendly Linux distribution based on Debian had to make do with the Knoppix Live CD.

Ubuntu Linux, I believe, addressed this need. They took all the good things that Debian has to offer - its excellent repository of packages, its superior package management, its stability - and then took it one step further by fine-tuning it to be newbie friendly, making it possible to install it onto the hard disk in a mere six steps.

If you look closely, Ubuntu has made a number of refinements which make it an ideal OS for the home user. For one, it comes with all ports closed by default. As a home user whose only interaction with the Internet is web browsing, checking mail and playing a few online games, the last thing I need is unnecessary services running on my machine. And the Ubuntu desktop has made that possible. As a comparison, I found a default Fedora desktop installation to run more than 10 services, including Avahi (whatever that means). In Ubuntu, the only service listening on the TCP/UDP ports is the CUPS printing service. Run the following command to see all the services that are listening on your TCP and UDP ports, to get an idea.

$ netstat -taup
Or better still, run nmap to find the open ports on your machine running your favourite Linux distribution and then see how many are open in Ubuntu.
# nmap -sS <your machine IP address>
The Ubuntu desktop does not ship with even a compiler. So it is that much harder for a person who has access to your machine to write some malicious code, compile it and then make it run during startup, compromising your machine in the process.

And for those who want such facilities as a compiler or an ssh daemon running on their machine, it is a simple case of using apt-get to download and install the necessary packages, in which case you are expected to know what you are doing and its consequences.

The point I wish to make is that the Ubuntu team has been successful in addressing the need for a truly user friendly, secure Linux desktop, which the other major players failed to address. And potential Linux users were quick to grab at what was offered and to form a community around it.

And while on the topic of community, one of the biggest strengths that any Linux distribution can aspire to is a large user community revolving around it. It is the community which imparts life to a Linux distribution and aids in spreading the word about it. Ubuntu has been a success on this front. This might bring up a question in someone's mind: can a community be bought? Not at all. It is something that happens when passion is involved. If not for passion, why would some people spend many man hours developing scripts such as Automatix or EasyUbuntu and then share them freely with the rest of the Ubuntu users? Why would anyone go out of their way to help in getting things working right in Ubuntu? Are these people on the payroll of Ubuntu? No. It all filters down to passion. And passion, my friend, cannot be bought. It is something that comes from within. In my opinion, Ubuntu has taken all the right steps in stirring this passion in its users, which is why it enjoys an ever growing community.

That doesn't mean Debian doesn't have a strong community following. But the community revolving around Debian consists of an entirely different set of users - those who are more system administrators than ordinary users. And it is anybody's guess that normal users far outnumber the system administrators. I myself am a fan of Debian, and I am sure that any day they bring out a newbie friendly distribution on par with what Ubuntu is now, a community will form around it and start promoting it.

Lastly, by choosing to use the packages in Debian Sid, Ubuntu has made sure that its users enjoy the latest versions of the software they have grown to love. I may also add that the hardware detection of Ubuntu is above par; it is known to detect the widest range of hardware out of the box.

To sum up, these are the factors that have catapulted Ubuntu to the number one spot among Linux distributions in terms of popularity:
  • Distributing free Ubuntu CDs to anybody who asks for it.
  • Designed with the home user in mind.
  • Excellent hardware detection - a quality I believe is shared by other Linux distributions.
  • A strong user community backing its growth.
  • Containing the latest versions of the software packages.
  • Managing to grab the average user's imagination with its association with the Bantu word Ubuntu.
Update (Aug 16 2006): My intention in writing this article was not to disparage any Linux distribution, be it Debian, Fedora, SuSE or any other, each of which I am sure enjoys its rightful place in the Linux community. But I strongly believe that if any Linux distribution is to excite passion in the general public at the same level as Ubuntu has, it should discard the one-size-fits-all principle. That means delinking the distribution for the ordinary masses from the one for running on servers. I believe that if these mainstream (and other) Linux distributions take a leaf out of Ubuntu's book and bring out a truly newbie friendly distribution - sans sshd and other unnecessary services, and without the compilers and such - they will be well received by the common public (average Joes) who wish to run Linux on their machines.

Sunday 13 August 2006

Book Review: Web Design in a Nutshell - A Desktop Quick Reference (3rd Edition)

Anybody who has had anything to do with the World Wide Web would surely, at one time or another, have paused to marvel at its rapid growth and the technologies that have spurred that growth. The most abundant and widely used part of the World Wide Web is the website, which allows anybody with access to a web browser and an Internet connection to view and share information.

The fundamental building block of a website has been HTML - HyperText Markup Language. But owing to its obvious limitations and semantic faults, a conscious effort was made by the standards group W3C to not only develop an alternative markup language based on XML, called XHTML, but also separate the markup from the style through the use of CSS, which imparted much more power and flexibility to the people designing websites.

I found the book "Web Design in a Nutshell - A Desktop Quick Reference", authored by Jennifer Niederst Robbins and published by O'Reilly, to be a one of a kind book in that it covers all the important concepts right from HTML 4.01 to XHTML, CSS, JavaScript and much more. In fact, I found the book to encompass all that one needs to know about the semantics of markup languages, with stress given to the latest developments in XHTML and CSS.

At 800 pages, divided into a whopping 36 chapters, I found this book to be a valuable reference for programming and designing solutions for the web.

The 36 chapters are further divided into six parts, with the first part of the book (all of six chapters) giving an introduction to web standards and their advantages. Web design and development is divided into different layers - the structural layer, the presentation layer and the behavioural layer - and in the first part of the book, the author gives a short explanation of each of them. This part also contains tips to keep in mind while designing for a variety of web browsers as well as for various displays, which makes interesting reading. A separate chapter explains the things to keep in mind while creating websites that can also be used by people with disabilities. In the chapter titled "Internationalization" in the same part, one gets to know the various character sets and encodings, such as Unicode, and how they can be specified in XHTML.

The second part of the book deals with the structural layer, which pertains to XML, XHTML and HTML. In this part, spread over nine chapters, all the tags, be they HTML or XHTML, are listed with explanations. Chapter 7, titled "Introduction to XML", gives a sound introduction to the eXtensible Markup Language, which forms the backbone of XHTML. I found this chapter to give a good understanding of XML, the various tags, as well as the XML Document Type Definition file which imparts meaning to the tags. The chapter does not cover XML in depth but succeeds in giving all that a web developer needs to know.

Chapter 8, titled "HTML and XHTML Overview", concentrates on explaining the fundamental differences between the two. This chapter also lists the different flavours of HTML 4.01 and XHTML. The ninth chapter, titled "Document Structure", is very important in the sense that it imparts a good understanding of the document structure, which obviously has slight variations depending on the standards the web developer wishes to support.

The third part of the book concentrates on cascading style sheets, which fall in the presentation layer. Given the furore created in the web arena over the flexibility and freedom to experiment that CSS provides to web developers, this part of the book gains a lot of prominence. The author starts the narration with a chapter on the fundamentals of CSS, where the advantages of using CSS are impressed on the reader. The author then explains the different ways in which styles can be added to documents, the XHTML document hierarchy and how CSS styles are affected by it.

Over a span of 10 chapters, the third part of the book covers each and every aspect of CSS, be it selectors or properties. And wherever possible, the author has given pictures which show how the text will display in the browser when a particular style is applied. I found this section to be an invaluable resource for coming up to date with the latest developments in CSS. What is interesting is that, over and above listing and explaining each and every CSS property, the author has inserted, at regular intervals, tips as well as pitfalls to look out for while designing websites that are to be viewed using different web browsers.

In the 25th chapter titled "Managing Browser Bugs: Workarounds, Hacks, and Filters", the author addresses the most common web browser bugs that one will encounter when designing with CSS and how best to work around them so as to give a consistent view of the website on all the web browsers.

The fourth part of the book deals with the behavioural layer and contains two chapters, on JavaScript and the Document Object Model. In the first of the two chapters, the author starts with the basics of JavaScript, its syntax, the different JavaScript objects and, finally, event handlers.

The Document Object Model (DOM) is an application programming interface which is used to access and manipulate the contents of an HTML or XML file. The succeeding chapter, titled "DOM Scripting", takes a good look at this topic. Here the author explains how to manipulate documents using DOM functions. But the section which excited me the most was the part where the author gives an introduction to Ajax, which, by the way, has been included in the chapter as a supplement.

Writing code and designing beautiful websites using CSS is not enough to create an efficient website. It is also important to have a fine understanding of the different types of image formats that can be used, as well as their pros and cons. And reducing the file size of an image is part of the job of optimizing a website. On this note, the fifth part, titled "Web Graphics", gains prominence. Spread over five chapters, this part gives an in-depth understanding of the main image formats which can be used on the web, such as JPEG, (animated) GIF and PNG, and tips on picking the right format for different situations.

The final part, titled "Media" and spanning four chapters, dwells on the part that multimedia plays in the aesthetics and functionality of a website. More specifically, the author discusses the different audio formats and their usefulness in providing audio on the web. On this subject, the author is pretty detailed in her explanation of the different ways in which audio can be shared with the end user, such as streaming audio. A section titled "Preparing your own audio" points at the different avenues a web developer has when faced with integrating audio solutions into his website. I especially liked the part where the author, while dwelling on each audio format, explains with the aid of a table what that particular format is good for, what tools can be used to create it, as well as the players which support it. There is also a table giving the author's suggestions for the ideal audio formats for different audio needs.

In the subsequent chapter, titled "Video on the Web", the reader gets a taste of the video formats and the code which helps in embedding video in a web page, especially when it is meant to be streamed. Considering the important part that Flash plays in the richness of websites, a separate chapter has been included giving an overview of Flash, the tools used to create it and the code used to embed it in a website.

The book also contains five appendices which stand out as excellent resources for reference, such as HTML elements and attributes, CSS 2.1 properties, character entities, non-standard color names and their numerical values, and so on.

Book Specifications
Name: Web Design in a Nutshell - A Desktop Quick Reference (3rd Edition)
ISBN: 0-596-00987-9
Author: Jennifer Niederst Robbins
Publisher: O'Reilly
Number of Pages: 800
Price: Check at Amazon.com
Rating: Excellent

End Note
I found this book to be an excellent reference for designing websites, be it on Linux or any other platform. Considering that the book is well into its third edition, with reportedly over 200,000 copies in print, I believe a lot of people share my view of it. The author has done a splendid job of covering all the concepts, and over the years the book has been significantly revised and updated to keep up with the rapid advances in technology related to web design; the latest edition was released as recently as February 2006. Obviously the author considers this excellent book a continuous work in progress, and as the technology matures, we can hope to see more updates and edits which will preserve the relevance of this book.

Thursday 10 August 2006

Simplifying data extraction using Linux text utilities

I remember, the first time I was introduced to Unix - yes, my first experience with a POSIX OS was with Unix, more specifically SCO Unix, and not Linux - the instructor told us that the real power of Unix lay in its accomplishing complex tasks by splitting them into smaller tasks, which in turn are split into even smaller tasks and then assigned to different utilities. The output from these utilities is then combined to get the desired solution. In management speak, it is this efficient delegation of duties which makes Unix/Linux a winner. Compared to this, in Windows you have monolithic software doing all the tasks by itself, which leads to unnecessary duplication and waste of resources.

As an example, take spell checking. In Linux, you have a utility called aspell which does the spell checking, regardless of which application you use. Be it AbiWord, Vim or OpenOffice.org, when you select the menu entry to spell check (assuming there is one), the application passes the task on to aspell, and aspell in turn passes the result back to the application. In Windows, by contrast, each application has its own spell check code built into it.
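
You can also invoke aspell directly from the command line. A small sketch, using a hypothetical file name:
List the misspelled words found in the file
$ aspell list < essay.txt
Or check and correct the file interactively
$ aspell check essay.txt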

I find combining different utilities to achieve complex tasks in Linux/Unix really fascinating. My favorite one is using a combination of 'find' and 'grep' to search for a particular string in files on my hard disk and list the files which contain the string. This I achieve as follows:
$ find . -iname \*.txt -exec grep -s -l "Linux" {} \;
The above command will search for and list all the text files with a .txt extension containing the word "Linux". Try the command on your Linux machine and see the output. For more information on using the find utility, read this article.

Now try accomplishing the same task in Windows and you will understand what I mean. If you ask me, these little utilities which are bundled with all *nixes are the work horses which impart the sheer power to a POSIX operating system in the first place.
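
Another classic illustration of this delegation is chaining the small text utilities into a pipeline. As a sketch (the file name is hypothetical), the following one-liner prints the ten most frequently used words in a text file; tr splits the text into one word per line and lowercases it, sort groups the words, uniq -c counts them, and the final sort and head pick the winners:
$ tr -cs 'a-zA-Z' '\n' < essay.txt | tr 'A-Z' 'a-z' | sort | uniq -c | sort -rn | head
Five small utilities, none of which knows anything about word frequency, combine to solve the problem.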

Today I came across a very informative introductory level article written by Harsha S. Adiga which explains how to use some of the most common utilities found in Linux to accomplish numerous day to day tasks. The author explains each tool - nine of them are covered - with the aid of examples. Reading the article made me really nostalgic, because nowadays, with the beautiful desktops we have in Linux, even I feel a bit spoilt and do not use the command line as frequently as I used to.

Saturday 5 August 2006

Building a Personal Firewall in FreeBSD using FirewallBuilder

Whether it is a production grade server or a home computer, as long as it is connected to the outside world, running a firewall is unavoidable. GNU/Linux comes with its own firewall called iptables (netfilter), and there are a number of front-ends available, such as Firestarter, which make maintaining a firewall child's play, especially on the desktop. In the past, I have published articles explaining how to set up a firewall in GNU/Linux.

Dru Lavigne, an instructor and FreeBSD advocate, has written an excellent tutorial explaining how to set up and use Firewall Builder - an object oriented GUI and a collection of policy compilers for various firewalls, including pf (packet filter), the firewall used in FreeBSD. The tutorial takes the reader from setting up (installing) the Firewall Builder software in FreeBSD, through creating a firewall ruleset, to controlling the firewall and, lastly, fine-tuning the rules.
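
For those who have not met pf before, the rulesets that Firewall Builder generates ultimately boil down to entries in /etc/pf.conf. A minimal hand-written sketch for a desktop, assuming - purely for illustration - that the external interface is fxp0:
ext_if = "fxp0"
pass quick on lo0 all
block in on $ext_if all
pass out on $ext_if all keep state
Loading and enabling the ruleset is then a matter of two pfctl commands run as root:
# pfctl -f /etc/pf.conf
# pfctl -e
This default-deny policy drops all unsolicited incoming packets while letting outgoing connections (and their replies, thanks to keep state) through.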

Friday 4 August 2006

An Interview with the KDevelop Team Members

Ask me which is the most user friendly programming IDE available for the Linux platform, and I will tell you without a pause that it is KDevelop. True, there is a plethora of IDEs and editors, from the ubiquitous Vi (and its clones) to those meant for coding in a particular language, such as Bluefish. And then there is the GTK+ user interface builder, Glade. But when it comes to seamless integration between code and design, KDevelop leaves its competition far behind. The last time I checked, it supported 12 programming languages.

The latest version of KDevelop is 3.3.4, and version 4.0 is in the making. The yet to be released version 4.0 is expected to have support for Qt 4.0 based projects, improved code completion and an improved UI. The site dot.kde.org has a very nice interview with three KDevelop developers, Matt Rogers (lead maintainer), Adam Treat (programmer) and Alexander Dymo (maintainer), where they talk about KDevelop 4.0 and the features it will have when it is finally released.

Thursday 3 August 2006

Host a personal diary on your PC using WordPress

A couple of months back, I did something really interesting. I downloaded the WordPress content management suite from the wordpress.org site and installed it on my machine. The installation as such was a simple affair of unpacking the WordPress files in the desired location - I unpacked them in '/var/www/' - and in no time I had a robust blog hosted on my PC.

The prerequisites for getting WordPress up and running are PHP, MySQL and the Apache web server running on your machine. In an earlier post, I explained how I configured MySQL and hosted webpages using the Apache webserver. The steps are the same for WordPress too.
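
On a Debian-based system, pulling in the whole stack is a single apt-get invocation. A sketch - the package names below are the ones I recall from current Debian/Ubuntu repositories and may differ on your release:
$ sudo apt-get install apache2 mysql-server php5 libapache2-mod-php5 php5-mysql
Here libapache2-mod-php5 lets Apache execute the PHP files that make up WordPress, and php5-mysql lets PHP talk to the MySQL database.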

So why did I do such a thing? Well, I am in the habit of maintaining a diary containing my day to day experiences, as well as jotting down my thoughts on topics close to my heart. After installing and trying out WordPress, I decided to write on the blog rather than in the diary. And nowadays, I write my day's thoughts on the WordPress blog I have hosted on my machine. Already I see a lot of advantages to this form of documenting. For one, my family can pull up my blog and read about my day, as well as get to know about things which I may have failed to reveal to them. Not only that, they can post comments on the blog sharing their viewpoints.

And in the event that I do not want even my family to read a post, WordPress has a feature for password protecting individual posts, which comes in handy. This project of hosting a WordPress blog on my machine has become such a hit with my family that I have created accounts for the whole family, and each of us puts into words our thoughts and tribulations, as well as documenting interesting stuff like recipes, jokes that one came across and so on.

Fig: My (Wordpress) Diary hosted on my own machine

I find the search feature of WordPress really useful. For example, the blog (my diary) now has over 20 posts, and if I want to find a particular post, all I have to do is search for it. Since WordPress uses a MySQL database as the back-end, the search is very fast and accurate.

Editing and managing a WordPress blog is a dream come true for any person who is into writing. For one, you have a WYSIWYG kind of editor which, though it does not have all the features of a word processor, has sufficient formatting functions, like bold, italic, text alignment and image insertion, to make it a joy to write and publish content. And the interesting thing is that it produces correct XHTML code.

Another feature which I really like in WordPress is that one can edit the comments made by others. For example, this blog (All about Linux) has seen its fair share of spam in the comment section, and one feature lacking in a blog hosted on the Blogspot domain is the ability to edit the comments that others insert in your posts. Nowadays, when a good blog can easily be bogged down by spam and flame comments, this feature is a godsend. What is more, one can track the IP addresses of the comments made, and the blog author also has the choice of approving or unapproving a comment, which goes a long way in maintaining the sanity of the blog.

Categories in WordPress
WordPress allows one to create categories. For example, this post, if it were published on a WordPress blog, could have been tagged wordpress or content management, and that makes it easy to navigate. I believe any content management software worth its name should support categories. In WordPress, if one clicks on a particular category, all the posts related to that category are displayed. What is more, it is possible to associate each post with multiple categories. So if I write an article on programming in Linux, I can tag it under both the Linux and the Programming categories.

Themes in WordPress
One of the most alluring aspects of WordPress is the ease with which one can create themes. This simple way of creating themes has helped spawn a humongous collection of themes which are free for use by anybody using WordPress. There is a theme to suit any purpose, from the simple two column theme to the complex three or more column theme with heavy graphics. And some very popular themes, like K2, have an inbuilt user interface which allows one to make minor changes to the theme layout without touching the underlying code whatsoever. I dare say that WordPress has the largest number of themes compared to the other content management suites.

Plugins bring more functionality to WordPress
The last time I checked, there were hundreds of plugins available for WordPress. Plugins are pieces of PHP code which can easily be used to add one or more features to the default WordPress setup. And it is easy to install plugins, as sketched below. Just download the plugin archive and unpack it in the plugins directory of your WordPress blog. After that, all that is needed is to enable the plugin, which is done by navigating to the Plugins section in the administrator panel of your blog and activating it.
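
In terms of commands, the whole thing looks something like this - the plugin archive name is hypothetical, and the blog location matches the /var/www/myblog setup described below:
$ cd /var/www/myblog/wp-content/plugins
$ sudo unzip ~/some-plugin.zip
Then visit the Plugins page in the admin panel and click Activate next to the new entry.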

Steps for installing WordPress on one's machine
Before installing WordPress, it is important to have a running MySQL database server, a web server (preferably Apache) and the necessary PHP modules installed. In previous posts, I explained how to configure the Apache webserver to host websites on one's machine, as well as how to configure the MySQL database. Once these prerequisites are met, navigate to the official WordPress website and download the latest version of the software. As of this writing, the latest stable version of WordPress is 2.x. Once it is downloaded and unpacked, copy the directory into the Apache webserver document root - in a default setup, /var/www. On my machine, I placed it in the '/var/www/myblog/' folder and then opened it in the web browser by typing the following address: 'http://localhost/myblog/'. The exact steps for unpacking the WordPress blog are as follows:
$ tar -xvzf wordpress-2.0.3.tar.gz
$ sudo cp -R wordpress/ /var/www/.
$ cd /var/www/
$ sudo mv wordpress myblog
Since this is the first time the blog is being opened, it opens in setup mode and initiates the installation process. In particular, it looks for the file 'wp-config.php'. This file should contain all the details about the database, such as the username and password needed to connect to the MySQL database, the database name and so on. I found a sample file by the name 'wp-config-sample.php' in the blog folder. I just renamed it to wp-config.php and edited the file to mirror my database, username and password. WordPress itself will offer to do this for you, but if the blog folder does not have sufficient permissions, that causes problems. So it is always fail safe to edit the file by hand.

Of course, it is understood that prior to the above steps you need to have created a database with the same name entered in the wp-config.php file, and it should be accessible using the username and password entered there. I created the MySQL user and the database as follows:
$ sudo mysql
Create the user ravi
mysql> Create user 'ravi' identified by 'mypassword';
Create the database by name db_myblog
mysql> create database db_myblog;
Grant the rights to the user ravi for the database db_myblog
mysql> GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, INDEX, ALTER, CREATE TEMPORARY TABLES, LOCK TABLES ON db_myblog.* TO 'ravi'@'localhost' IDENTIFIED BY 'mypassword';

mysql> quit
The contents of my wp-config.php file after editing are as follows:
<?php

// ** MySQL settings ** //
define('DB_NAME', 'db_myblog'); // The name of the database
define('DB_USER', 'ravi'); // Your MySQL username
define('DB_PASSWORD', 'mypassword'); // ...and password
define('DB_HOST', 'localhost'); // 99% chance you won't need to change this value

... //lines removed for brevity
?>
Now I refreshed the page http://localhost/myblog in the web browser. WordPress populated the database db_myblog with the necessary tables and generated a unique password for the username 'admin', which, by the way, has administrator privileges. It is recommended to log in as the administrator and change the password to one that is easier to remember, lest you forget the WordPress generated password.

Fig: First step of the two step WordPress installation

Fig: Second and final step.

A simpler way of accomplishing the above tasks
If you feel that editing the wp-config.php file by hand is too much of a chore, you can let WordPress walk you through the process via the web interface. The catch is that your blog directory must have the correct permissions. I found that if the blog directory is owned by your username, with group ownership set to that of the Apache webserver, then WordPress is able to edit the files in the blog directory without any problems.

To check the user account used by Apache web server, I ran the command as follows:
$ ps aux|grep apache2|cut -d" " -f1 |head -n 2
root
www-data
The above output tells me that the first Apache process (the parent) is owned by root, and that the Apache child processes spawned by the parent run with the user id www-data. So, for successful editing of the files, I need to change the group ownership of the blog directory to www-data and give the group write permissions. This I achieved as follows:
$ sudo chown -R ravi.www-data myblog
$ sudo chmod -R g+w myblog
After executing the above steps, a long listing of the blog directory gives the following output:
$ ls -ld /var/www/myblog
drwxrwxr-x 5 ravi www-data 4096 2006-08-04 07:47 myblog
Now I can go back to the web interface (http://localhost/myblog/ ) and finish the installation.

Important: I have seen some content management systems suggest giving the blog directory write permissions for everyone (777) for ease of installation. But I believe that is a security issue, especially when you are hosting your blog on a shared hosting plan where each website is just a directory. It is easy to see that if you give your blog directory write permission for everyone, then others who are hosting websites on the same server as yours have free access to your files and can easily compromise them.

Once I finished setting up WordPress, it became the blogging platform of choice for my whole family. In fact, we use it to document just about anything, from contact information to recipes to... in fact, any data that needs to be remembered. And it is much more fun than writing in a book.