Wednesday, 28 June 2006

Check if your computer is under a DDoS attack

DDoS - short for Distributed Denial of Service - is an attack in which multiple compromised systems flood the bandwidth or resources of a targeted system, usually one or more web servers. The usual symptom of a DDoS attack is a sudden sharp increase in processor and network activity, which you notice as your computer becoming sluggish.

If you are running Linux or any Unix variant, there is a simple way to find out whether your computer is under a DDoS attack. It uses a combination of tools such as netstat, grep, awk, cut, sort and uniq to filter out the unnecessary output and keep only the relevant parts: the IP address of each machine connecting to yours and the number of connections from each of them.

$ netstat -an | grep 'tcp\|udp' | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
In the output of the above command, the more connections you see for a given IP address, the greater the probability that the IP address is being used in a DDoS attack. Note that if you are browsing multiple pages of the same website when you run this command, that site's server will also show up with several connections; this should not be mistaken for a DDoS attack.
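If you want the command to flag suspicious addresses for you, the same pipeline can be wrapped in a small script. This is a minimal sketch; the cut-off of 50 connections is an arbitrary assumption that you should tune to your own workload:

FILE: flag_ips.sh
#!/bin/bash
# List every remote IP address holding more than THRESHOLD connections
# (the cut-off of 50 is an arbitrary assumption - tune it for your machine)
THRESHOLD=50

netstat -an | grep 'tcp\|udp' | awk '{print $5}' | cut -d: -f1 | \
  sort | uniq -c | \
  awk -v limit="$THRESHOLD" '$1 > limit { print $2 " has " $1 " connections" }'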

Usually a DDoS attack targets servers running web services and aims at bringing the server down. But it could happen to your personal machine too if it is infected by malware.


You can read a more detailed explanation of the above command at pingpros.com.

Monday, 26 June 2006

Lesser known Drag and Drop tips in Gnome

Recently, while I was reading an article in the Firefox web browser, I happened to accidentally click on a link and drag it outside the browser. The result was rather interesting: Gnome obediently created a shortcut on the desktop to the webpage the link pointed to. With my curiosity piqued, I decided to try a few other things and, to put it mildly, I was surprised and pleased with the results. These are my findings...

Create a shortcut of a webpage or a link in a website on the Gnome desktop

Click on a web link in the web browser and drag it onto the desktop. Gnome will automatically create a shortcut (link) to the remote location pointed to by the link you just dragged.

Note:
It also works if you select the URL in the Firefox address bar and drag it to the desktop.
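Under the hood, the shortcut Gnome drops on the desktop is a plain .desktop file of type Link, so you can also create one by hand. A minimal sketch - the name, URL and icon here are just placeholders:

[Desktop Entry]
Encoding=UTF-8
Name=Example Page
Type=Link
URL=http://www.example.com/
Icon=gnome-fs-bookmark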

Save a block of text from a webpage to a file

Select the text in the webpage and drag it onto the desktop. Gnome will auto-magically create a text file and save the selected text into it. You will also be prompted to rename the file to a name of your choice.

Fig: Showing the drag'n drop possibilities in Gnome

Copy an image from a remote location

While the image is displayed in the web browser, just click on it and drag it to the desktop. Gnome will copy the image from the remote location. Take care to drag the image itself rather than a link the image points to.

Note: These tips work from any application, not just the web browser. In fact, I was able to copy a block of text to another file from Gedit, and even from Evince - the PDF viewer.

These features only go to show that Gnome has very advanced drag and drop functionality built into it.

Book reviews - Linux, programming, web development, troubleshooting

This page collects all the book reviews you will find on this blog. Some of my book reviews have also been published on Slashdot.org. The books are listed in no particular order and cover a mix of Linux system administration, programming, web development, web design and Linux troubleshooting titles brought out by a diverse group of publishers. All the books are quite popular and were released in or after late 2005. Click on an image to read the review of the respective book.


Saturday, 24 June 2006

Making a list of URLs from an ftp site to download using wget

Recently, I decided to download the Debian distribution and visited the official Debian website. Debian allows one to download the distribution in a variety of ways, among them torrents, Jigdo and, of course, CD images (ISOs).

I prefer downloading the ISOs because it is much faster than torrents (especially if there are not many people seeding the torrent). Also, if you have a 256 Kbps or slower internet connection, your best choice is to download the ISOs.

So I navigated to the Debian download page and encountered a bunch of links pointing to the ISOs. The full Debian distribution takes up as many as 22 CDs.

I wanted to copy all the links into a text file so that they could easily be passed to a downloading program such as wget. In Linux this is easily achieved using a combination of lynx (the console web browser), grep, awk and head. This is how it is done:

$ lynx -dump http://cdimage.debian.org/debian-cd/4.0_r0/i386/iso-cd/ | grep 'iso$' | awk '{ print $2 }' | head -n 21 > my_url_file.txt

The above command dumps the rendered page using lynx's -dump option, keeps only the lines ending in 'iso', picks out just the URL from each line, limits the list to the first 21 entries and saves it to the file 'my_url_file.txt'. A cursory glance at 'my_url_file.txt' shows that all the URLs of the CD ISOs are there, one URL per line, which is just what I needed. Now all I had to do was edit the file as needed and use it with a small script to download each of the 21 or so Debian ISOs as follows:
FILE: debian_downloader.sh
#!/bin/bash
# Download the Debian ISOs one after the other using wget

for url in $(cat my_url_file.txt)
do
    wget -c "$url"    # -c resumes a partially downloaded file
done
Make the script executable, then run it to start downloading the files one by one.
$ chmod +x debian_downloader.sh
$ ./debian_downloader.sh
Note: You can also run this script as a cron job.
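Incidentally, wget can read a list of URLs directly from a file with its -i option, so the same job can be done without a loop at all:

$ wget -c -i my_url_file.txt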

Friday, 23 June 2006

Open source advocacy - the Open SuSE way

For a couple of days now I have been keeping track of a WordPress blog maintained by Ted Haeger. He is an open source evangelist who (in his own words) is working to motivate the Novell user community. Not surprisingly, his blog is full of news related to SuSE Linux and how Novell is working with the open source community to bring a more polished operating system to users. But what caught my fancy were the interesting mockup screenshots he has posted of the "Computer Menu" of the upcoming SuSE Enterprise Desktop 10, which showcases a unique design deviating from the normal menu found in Gnome.

Fig: Mockup screenshot of the Computer menu in the upcoming SuSE Enterprise Desktop 10

Novell has been hard at work putting the finishing touches on the desktop so that the user is presented with a desktop that is clean, attractive and free from clutter. And looking at the mockups of the computer menu, the results already look really interesting. Apart from the usual productivity applications bundled by default in most Linux distributions, SuSE Enterprise Desktop 10 will also have well integrated Beagle desktop search, the Tomboy notes application, the Banshee music player and F-Spot - the photo management software.

The enterprise desktop is at present in a closed beta program and the final version is expected to be released this coming July. In redesigning the desktop and the menus, Novell has gone the Red Hat way in providing an integrated look and feel for many of the apps. The desktop doesn't look anything like Gnome even though it really is Gnome. It will be interesting to see the final result when it is released in July.

Does that mean I will switch from Ubuntu to the latest SuSE desktop when it is released? Well, I am open to trying and testing any Linux distribution. But what attracts me to Ubuntu is its unique philosophy and the way it is practiced. For instance, will Novell hand out free CDs to anyone who asks for them?

Being a corporation whose first and foremost aim is to make a profit, it is doubtful they will go the Ubuntu way. I also believe Ubuntu has strong community support behind it - I dare say rivaling even that of its granddaddy Debian.

But then Novell is gunning for enterprise customers who have the deep pockets to shell out the money Novell charges for its Linux distribution. In return they get a fully polished OS without the loopholes found in Windows, one which contains all the (proprietary) drivers necessary to make their hardware work flawlessly.

Thursday, 22 June 2006

Book Review: Building Online Stores with osCommerce - Professional Edition

PHP has grown into an all-encompassing language which is now the preferred choice for developing web based projects, including many popular content management systems, database configuration front-ends and even e-commerce applications which can be configured to run out of the box. One such project is osCommerce, an open source online shop e-commerce solution that is available for free under the GNU General Public License. It features a rich set of out-of-the-box online shopping cart functionality that allows store owners to set up, run and maintain their online stores with minimum effort and with no costs, fees or limitations involved.

osCommerce has attracted the largest community of any e-commerce solution, consisting of over 99,900 store owners and developers worldwide, with add-ons being contributed on a daily basis. To date there are over 3,400 add-ons available that have been created by the community to extend the features of an osCommerce online store.

So it is no wonder that entire books have been written explaining the configuration aspects of osCommerce. One book I found really interesting is "Building Online Stores with osCommerce: Professional Edition", authored by David Mercer and brought out by Packt Publishing. This book is divided into 12 chapters spanning over 370 pages and aims to give the reader a firm introduction to setting up an online e-commerce store using osCommerce.

The book is aimed more at people running businesses who have limited knowledge of web based technologies than at the hard-core techie. Going by this principle, the author starts the narration with a fly-by overview of e-commerce, since people who run businesses may not be conversant with the different aspects of the technology. This chapter gives a peek into the process of designing an e-commerce site from scratch. The author explains in simple terms the various issues that need to be sorted out, such as gathering the business requirements, meeting business related needs, deciding how functional one's site needs to be, and the development, testing and debugging of the site.

To run osCommerce on one's machine, one first needs to install and configure three software packages - the Apache web server, PHP and the MySQL database. The second chapter of this easily read book walks the reader through setting up these packages as well as installing osCommerce itself. The chapter is replete with screenshots of the steps, which makes it much easier for the average person to follow what is being explained.
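For the curious, on a Debian or Ubuntu system the same three packages (plus the glue between them) can be pulled in from the command line. The package names below are assumptions based on the repositories of the day and may differ on your distribution:

# apt-get install apache2 php5 libapache2-mod-php5 mysql-server php5-mysql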

From here the author dives into giving the reader an insight into the underlying technology used in osCommerce. Here those uninitiated in computers get to know about HTML tags, PHP tags, the snippets of code in osCommerce that interact with the MySQL database, the osCommerce directory structure and so on. The chapter does not cover these topics in depth; rather, the author explains them on a need-to-know basis.

Chapter 4, titled "Basic Configuration", is an important chapter in the sense that it explains the various configuration parameters of osCommerce. Here one is introduced to the well designed and easy to use administration panel of osCommerce. This is the place where one enters the business details, details of the products to be showcased and customer details, configures the stock and more.

The primary job of an online store is to tout its products to potential customers. So once the osCommerce suite has been installed and configured, the next step is to add the products one intends to sell online. The next chapter, titled "Managing Data", explains how to add, remove and update business related information in the database. The finer nuances of grouping data - creating a catalog and categories, setting product attributes aimed at increasing the appeal of the products, and keeping track of customers - are explained in this chapter. For example, the author explains how to import data from an Excel spreadsheet into osCommerce using the Easy Populate module, which comes in handy when one is faced with entering details of products running into the thousands.

The 6th chapter, titled "Customization", is a large one and walks the reader through the ways in which one can customize the online store - and rightly so, since an online store should be unique to attract customers. This chapter explains which sections of code should be modified to get the desired results.

PayPal is one of the most popular online payment options and is offered by many online businesses. The 7th chapter, titled "Taxes, Payments and Shipping", lists the steps needed to integrate PayPal into osCommerce so that customers can be offered the choice of paying via PayPal. Various payment options like credit card payments and other alternative forms of payment are also covered. The power of osCommerce lies in its modular architecture, and that power is amply evident when the author explains the use of two modules - the credit card module and the psigate module.

Chapter 8, titled "Securing your store", goes into the security aspects of the online store, and one gets to know about SSL and database security. This chapter is very important in the sense that only if secure payment options are included will customers pay for products online.

The next three chapters are rather advanced and explain, for example, how to integrate gift vouchers and promotional codes, how to include an RSS feed, and various tips and tricks one can use to fine-tune osCommerce to accomplish out of the ordinary tasks, as well as deployment and database maintenance.

The final chapter, titled "Building your Business", gives further tips on attracting customers to one's newly formed osCommerce store. This includes marketing, research and advertising, as well as making money by displaying advertisements on the site.

About the Author
David Mercer is a programmer and professional writer with over seven years of experience in his chosen field. As a consultant for his own technical and editorial consultancy, David balances his time between programming, reviewing, writing and furthering his studies in Applied Mathematics.

Book Specifications:
Name: Building Online Stores with osCommerce - Professional Edition
ISBN No: 1-904811-14-0
Author: David Mercer
No of Pages: 372
Publisher: Packt Publishing
Price: Check at Amazon.com
Rating: Good

End Note:
osCommerce is a very popular e-commerce suite which is used by tens of thousands of businesses. And as I said earlier, this book is more attuned to people who are new to the technology and who wish to set up an osCommerce store to do business online. The book will not only help one set up osCommerce with ease but also bring one up to date with the latest technologies used in the project. All in all, a very useful book for anyone interested in doing business online by setting up an e-commerce store.

Monday, 19 June 2006

10 Top Goofs That Interns Make

Every once in a while, I come across an interesting article which is not directly related to Linux in any way but which I find really informative and hard to pass up. This is one such post. In any institution, whether big or small, there are certain protocols to be followed, and working there only as an intern is not excuse enough to go slack on the rules.

Kerry Miller describes 10 mistakes that interns normally make while working at a firm. To summarize, these are the 10 don'ts (s)he talks about.
  1. Don't be late to work. Be right on time every time.
  2. Don't go to work dressed in casuals. Wear formal clothes.
  3. Don't make unnecessary use of cell phones, music players or other distracting objects during office hours.
  4. Shrug off your shyness. Developing good interpersonal skills and maintaining eye contact while speaking are a definite plus for landing that permanent job.
  5. Don't skip the office party. Socializing with the full time employees will send the right message about your work-life balance.
  6. Be mentally prepared to do mundane tasks. It is expected of you.
  7. Don't be afraid of venturing outside your allotted space. Networking with people in other departments will stand you in good stead.
  8. Don't be afraid to clarify your doubts about your work or ask for help from your superiors.
  9. Have a positive view of criticism. And learn to accept it in the right spirit.
  10. And finally... don't waste time.

Sunday, 18 June 2006

20 ways to secure your Apache configuration

In an earlier post I explained how to host websites on one's personal machine using the Apache web server, as well as how to password protect a website using .htaccess and .htpasswd files.

But there is much more to Apache than the configuration features I described. For instance, there is the mod_rewrite module, which is heavily used by most content management systems to provide easy to remember permanent links to individual web pages; an in-depth introduction to mod_rewrite alone would take up the better part of a big chapter.
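To give a flavor of what mod_rewrite does, here is a minimal sketch of the kind of rule a content management system might put in a .htaccess file - the paths are made-up examples:

RewriteEngine On
# Map the pretty permalink /article/42 onto the real URL /index.php?id=42
RewriteRule ^article/([0-9]+)$ index.php?id=$1 [L]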

Pete Freitag has written a very good article which lists the steps one can take to secure an Apache web server running on one's machine. What I like most about his article is the simple manner in which he explains the various configuration parameters, aided by bits of code. A very informative read indeed.
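As a small taste of this kind of advice, two of the simplest hardening directives stop Apache from advertising its exact version and modules; they go in httpd.conf:

# Send a bare "Server: Apache" header instead of the full version string
ServerTokens Prod
# Drop the server version footer from error pages and directory listings
ServerSignature Off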

Friday, 16 June 2006

Cream for Vim - Making Vim more user friendly

Learning to use the Vi editor can be a real pain for most people as it has a relatively steep learning curve. But once the most common commands are mastered, one gets to enjoy the sheer power of this editor at one's fingertips. And there is an ever growing group of Vi enthusiasts (myself included) who swear by it over other text editors. The modern avatar of Vi is Vim (Vi IMproved), created by Bram Moolenaar and released as an open source software package. Vim is an excellent Vi clone with a slew of additional features, and it can be further enhanced and modified using scripts. At last count there were 1557 scripts available at the Vim.org website which enhance the editor for a variety of uses.

Fig: This article displayed in Cream editor

One project which has gained a lot of popularity in the Vim community is Cream. Cream consists of a collection of scripts and plug-ins which aims to make it much easier for a new user to cut his teeth on Vim; just by navigating the menus, the user can reach most of the features that have made Vim the popular editor it is. All it takes to install Cream in Debian based distributions is to run the command:
# apt-get install cream
... which will install the collection of scripts in the necessary places as well as provide an entry in the Gnome Applications menu. The only prerequisite is that you should already have GVim installed on your machine. Once Cream is installed, one can click the menu entry in the Gnome Applications menu and GVim starts with the Cream scripts in place.

At this point one might ask what is unique about this project. Well, for one, it redesigns GVim to offer the same ease of use as any ordinary text editor (read Kate, KWrite, Nano...), where the user need not be bothered by the different Vi command modes; rather, he/she can just start typing. Secondly, it brings a whole lot of power to the user by providing menu entries for most of the special things one can achieve in GVim. Take, for instance, creating folds in one's text. In GVim, one does it by pressing the key combination "v}zf" in command mode - something a new user trying Vi will in most cases not know about. Cream has a menu entry for folding the paragraphs in one's text: just select the paragraph to be folded and press F9, or click on the menu Tools --> Folding --> Set fold (selection). And voila! a fold is created. But this is only a small part of the features provided by Cream.

Fig: One can create folds in Cream with ease

The developers of Cream have paid a great deal of attention to providing the same key bindings found in Windows editors for the most common tasks like cut/copy/paste, undo/redo and opening/closing a file. For example, to undo the most recent change, one can press Ctrl+Z instead of the usual way of moving into command mode by pressing 'Esc' and then pressing 'u'. By remapping the shortcuts, the developers of Cream have reduced the learning curve to almost nothing.
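Incidentally, plain Vim ships with a script that performs a small subset of this remapping on its own; putting something like the following in one's ~/.vimrc enables the Windows-style bindings, which gives a taste of what Cream sets up far more thoroughly:

" Windows-style selection and clipboard behaviour
behave mswin
" Stock mappings shipped with Vim: Ctrl+Z undo, Ctrl+C copy and so on
source $VIMRUNTIME/mswin.vim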

Another thing I found really useful is the "Settings" menu, which contains options for setting various parameters like the tabstop width and toggling auto-indent, as well as preferences like the font used, bracket flashing and so on. What is to be noted is that changed settings are automatically made permanent until the user changes them again.

This is different from the original GVim, where changed settings apply only to that instance of the editor; to make a setting permanent, you have to save it in the gvimrc configuration file residing in the /etc/vim/ directory (or in your own ~/.gvimrc).
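For example, a few typical lines one might make permanent in that file - the values are purely illustrative:

set tabstop=4               " width of a tab character
set autoindent              " carry the indent over to new lines
set guifont=Monospace\ 10   " font used by the GUI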
Fig: Calendar plugin

Cream also installs a few additional plugins. One which took my fancy is the calendar plugin, which embeds a calendar inside the editor; one can view it by pressing Ctrl+F11. And why stop at a calendar... there is also a typing tutor game which can be played inside the editor to while away one's time and improve one's typing speed at the same time. All in all, this is a very interesting project which reconfigures the Vim editor to make it as easy to use as an ordinary text editor without sacrificing any of its underlying power.

Tuesday, 13 June 2006

Watch Soccer World Cup Live in ASCII

With World Cup fever catching on, millions of soccer-crazy people all over the world are glued to the TV to see the outcome of each match. Frankly, I find soccer a much more fast-paced and enjoyable game than most - what with the constant roar of the crowds and the rush of adrenaline... And enterprising people who can't watch the game live on TV are finding ways to keep track of it through alternate means.

One such project, which lets one watch the Soccer World Cup live on one's computer, is Ascii-wm.net. The site is streaming a live feed of the game in ASCII format which anyone can view by telnetting to their server. In *nix, you can view the stream in two ways: via telnet or via netcat, as follows:
$ telnet ascii-wm.net 2006

OR
$ nc ascii-wm.net 2006

Fig: Ascii football

More power to them for their love of soccer and for making this feed available.

Friday, 9 June 2006

An Interview with Jeff Dike - The creator of User Mode Linux

Jeff Dike is the creator and maintainer of User Mode Linux (UML) - a virtual machine which runs on Linux. In recent times, UML has gained a lot of significance after Linus Torvalds incorporated the UML patch into the official Linux kernel source tree. Nowadays Jeff works full time for Intel, devoting his time to the further development of UML. He has also authored a book titled "User Mode Linux", published by Prentice Hall. After reading his book on the subject and running UML on my machine, I wanted to ask him a few questions about UML and how it fares when compared with other virtualization technologies. Jeff very kindly agreed to take time off from his busy work schedule to answer my queries. Without further ado, here are the questions I posed to Jeff Dike along with his replies.

Question: There are a lot of virtualization technologies other than UML, such as VMware, Xen and QEMU. What are the relative strengths of UML which would urge a person to choose it over its counterparts?

Jeff Dike: The reason varies according to the technology that you're comparing UML to. With qemu and other instruction emulators, the attraction is speed. These let you boot a kernel on a machine with a different architecture, i.e. a ppc kernel on an i386 host. When the architecture of the virtual machine is the same as the host, there are few reasons to take the overhead of instruction emulation, even if the emulator is optimized in this case to just virtualize instructions.

With hypervisor-based technologies such as VMWare ESX or Xen, the advantage of UML is simplicity. There are two aspects of this. The less important one is that you can have UML up and running by downloading a UML kernel and a filesystem, and running a shell command. This makes it very easy to be up and running with UML quickly.
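For the curious, that shell command really is a one-liner. Assuming the UML kernel binary is called 'linux' (the conventional name) and sits next to a root filesystem image, starting an instance looks something like this, with the memory size being illustrative:

$ ./linux ubd0=root_fs mem=128M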

The more important aspect of simplicity is that UML is conceptually simple. That is, for the host's system administrator, UML introduces relatively few new concepts. You don't have to learn how to administer a hypervisor, since, with UML, the hypervisor is Linux. A UML instance is a set of Linux processes, which every Linux admin knows how to examine and control. All of the Linux diagnostic tools, such as ps, top, and everything in /proc work as well for diagnosing problematic UML instances as they do for any other process on the system. When something goes wrong with a virtual machine and it has to be fixed quickly, UML allows all of your Linux tools, techniques, and experience to be applied to the problem. There's no need to introduce another OS, with which you have limited experience, in order to run some virtual machines.

Question: Are there any drawbacks of UML?

Jeff Dike: The main complaint about UML with respect to other technologies is speed. There is a noticeable amount of overhead with common workloads running under UML. This is a combination of Linux not being a perfect hypervisor and UML not being as well optimized as it should be. These are both being fixed. A number of hypervisor-related improvements have gone into Linux recently, including PTRACE_SYSEMU, which greatly improves system call virtualization performance, and MADV_REMOVE, which allows UML to implement hotplug memory.

On the UML side, a relatively recent change made enabling and disabling interrupts much more efficient. This made a surprising performance difference, with a kernel build inside UML on my laptop being 25% faster than before.

The ongoing container effort also promises to bring UML performance much closer to native. This project is adding virtualization infrastructure to the kernel in order to support lightweight containers such as OpenVZ and vserver. It turns out that UML can use this support in order to allow its process system calls to run directly on the host rather than being intercepted and emulated by the UML kernel. I implemented a container for the system time and used it to bring UML's gettimeofday performance to around 99% of native. LWN has provided excellent coverage of this. Other containers will do the same for many other common system calls.

Question: In the book you have written on User Mode Linux, you describe the difficulties you faced in getting Linus to merge the UML code into the official Linux kernel source tree. Do you feel that the process of getting new features incorporated into the official Linux kernel source is too tiresome? Should Linus simplify the procedure to some extent? What are your thoughts on this?

Jeff Dike: I don't see that UML makes a good case for changing how easy it is to get new things into the Linux kernel. There should be some reluctance to incorporate new code. It should be fairly well-understood, especially when it affects other parts of the kernel. It should also be maintained and have an identifiable user base. All of these things take time to demonstrate, so any new project should spend some time being maintained outside of Linus' tree.

UML spent its initial life being maintained out-of-tree before being incorporated for the first time. I also describe a period after that in which it was difficult to get UML patches into mainline, and UML more or less went back to being maintained out-of-tree. This wasn't entirely Linus' fault, although my changes affect only the UML part of the kernel tree and I am the maintainer of that portion of the tree, so he should have just waved them through. It didn't take too long for my accumulated changes to essentially merge into a small number of very large patches. Submitting patches such as these is contrary to normal kernel practice, which is to have each patch contain a discrete identifiable change.

This deadlock was broken by Paolo Giarrusso, who recognized that UML was better off in-tree than out-of-tree, and sent a large catch-up patch to Andrew Morton, who forwarded that to Linus. This large patch contained all of the changes that I had accumulated in my own tree, and getting that into mainline synchronized Linus' tree with mine.

Any procedure can be improved, and getting code into the kernel is no exception. However, I don't see a case for anything drastic. Maybe it should be slightly easier to submit new code, or maybe it should be slightly harder, I dunno.

Question: There are Linux OSes which run from within Windows. CoLinux (www.colinux.org) comes to mind; it runs cooperatively alongside Windows on a single machine. Is it possible to run UML inside such Linux distributions that run from within Windows?

Jeff Dike: If the Linux-inside-Windows is a complete and reasonably bug-free Linux, then UML should run fine. However, while UML is a completely normal process, it is a demanding one, and tends to expose kernel bugs that aren't seen anywhere else. So, UML should run inside something like CoLinux, but I wouldn't be surprised if it hit bugs when that is first tried.

UML is known to run inside VMWare, which isn't much of a surprise considering that VMWare virtualizes the hardware and runs the same Linux kernel as the host.

There is also the possibility of porting UML directly to Windows, or some other OS. There was a Windows port done a number of years ago (in part by the author of CoLinux) and it was very nearly completely working. There were screenshots on the project's web site (umlwin32.sourceforge.net) of UML/Windows running X, but they seem to have disappeared.

Question: With the increase in processor speed and the fall in memory prices, virtualization technology has come within reach of the average computer user. Naturally this has opened avenues which were not available in the past, and many OS companies are taking a keen interest in providing virtualization. For example, Apple has already released software (Boot Camp) which is used to run other OSes from inside OS X. (Update: Boot Camp is not a virtualization technology, but Apple is rumored to be building virtualization into its upcoming OS, code named Leopard.) What, in your opinion, is the future of virtualization, and what significant role will UML play in it?

Jeff Dike: I see huge potential in application-level virtualization, in which applications gain some of the attributes of an operating system. In the final chapter of the book, I use the example of clusterized applications, in which an application, by incorporating a clustering technology, essentially becomes a cluster. By doing so, it allows multiple users to share a single instance of the application and to simultaneously work on whatever the application lets them work on.

For example, a clusterized word processor would allow many people to work on different parts of a large document at the same time, with the cluster technology within the word processor making sure that everyone sees the same data. The users would all be working on an up-to-date copy of the document, seeing real-time updates of changes made by other users. In a case like this, a cluster filesystem is likely to be the basis of the clustering. So, the rest of the filesystem infrastructure will have needed to be incorporated into the application. This provides our word processor with a full internal filesystem, with a permission system, that can be used to store a large document in a directory hierarchy which reflects the organization of the document. This is only a matter of how the document is stored within the application and would not affect how it appears to the user. However, this representation does make it possible to use the permission system that the word processor has incorporated to assign parts of the document to individuals or groups and to enforce those assignments by setting ownerships and permissions on the internal files into which the document has been divided.

Clusterizing an application would make it possible for many people to work on a document, spreadsheet, presentation, or almost anything else as though it were a wiki. The question is where this application-level clustering will come from. Here's where UML comes in. There is a fair amount of kernel-level clustering available now. UML makes that technology available in userspace, by virtue of the fact that UML is a userspace port of the Linux kernel.

Almost everything in the Linux kernel is available in userspace via UML. A filesystem internal to the application is also interesting because it provides some consistency guarantees about the data stored within it, providing some crash-resiliency to the application. The SMP scaling work that has gone into the Linux kernel is the equivalent of threading support in a process.

Applications are coming under increasing pressure to become threaded as CPUs are built with an increasing number of cores. UML offers all of these things already running in a process. There will be work needed in order to incorporate any of this into an application, but that is likely to be easier than writing it from scratch.

Question: You have authored the book User Mode Linux (read the review of the book), which I found a really interesting and informative read. It is usually very difficult to find someone who has created a popular piece of software and also sat down to author a book on it, but you have excelled in both fields. On this note, how difficult is it to write a book? Did you find writing a book easier than writing code, or vice versa?

Jeff Dike: For me, writing the book was much harder than writing code. Writing prose comes much less naturally to me than writing code. On top of that, writing a book comes with other constraints such as meeting a schedule and making sure that everything you write is well-structured at all levels, from correct spelling and grammar to the manuscript being a consistent and coherent whole.

My less-than-optimal work habits contributed somewhat to the problem. Generally, I had a chapter due every 3-4 weeks. The actual writing of a chapter tended to be done in the week before the deadline, and in some cases, the day or two beforehand. This led to the year 2005 being a cycle consisting of relaxation and good feeling immediately after completing a chapter, followed by two weeks or so of working on other things while an increasingly loud voice in the back of my head reminded me that I wasn't writing. This, of course, was followed by the aforementioned writeathon.

The result of this was that most of the time, I was racked by guilt over needing to write a chapter, but not doing so. Better work habits would have had me writing one chapter immediately after sending in the previous one, and polishing it in a leisurely manner until its deadline.

This situation was further complicated by mishaps such as losing about half of chapter 7 (which owners of this fine book will immediately recognize as being The Long One) during a laptop theft in France. In a classic case of closing the barn door after the horses have fled, I did institute a more careful backup procedure after this.

Question: Can you give a few examples of where UML has been put to use in a production setup ?

Jeff Dike: You can rent a UML instance from a number of ISPs; linode.com is one that I am reasonably familiar with. A completely different area is embedded development - a number of companies use UML internally to simulate devices so that development can proceed before hardware is available. These companies tend to keep quiet about their activities - an exception is accenia.com, which sells an embedded development toolkit, one part of which is UML.

Question: When a person - especially one with a programming background - comes across the acronym UML, he immediately associates it with "Unified Modeling Language". Why did you opt for the name UML for this project, and do you foresee a name change?

Jeff Dike: I opted for the name because of a complete lack of imagination. If I had had any imagination, it would have been called Zeus or Willow or something equally spiffy-sounding and undescriptive. As for the acronym, I consider this to be similar to trademarks - collisions are OK as long as you're not confusing anyone. UML (the VM) and UML (the language), despite both being computer-related, are so dissimilar that no one is going to be confused by the clash. No one is going to go looking for a virtualization technology and get side-tracked by the language, or vice-versa.

Question: Are you entirely responsible for UML?

Jeff Dike: No! I am the principal maintainer of UML and therefore get the credit for it, but many other people have contributed to the project. Paolo Giarrusso, a college student in Italy, has been my second-in-command for a while, making a large number of contributions to UML, in the form of code, support on the UML mailing lists, and documentation. The UML user base has been most supportive, with many UML features owing their existence to requests, and occasionally to patches, from users. I would like to single out Bill Stearns for his support for the project in many ways since almost the beginning. Last, but not least, Intel has contributed greatly to the project since 2004 when they hired me to work full-time on UML.

Thursday, 8 June 2006

Tag your files and folders with an emblem in Gnome

I like all things simple. And I like the Gnome desktop very much because of its developers' obsession with making this popular desktop as simple and intuitive to use as possible. Yes, I know Linus Torvalds has in the past gone on record stating his preference for KDE over Gnome for just this reason. Nevertheless, my distaste for complexity goes back to the times I used to work in Windows, when I had to put up with hard to understand jargon and syntax just to maintain it. So when I switched to Linux, the Gnome desktop was a breath of fresh air with its simple but functional design.

Fig: Shows the folders and files tagged with unique icons

In Gnome, it is possible to tag files and folders with small icons called emblems. Tagging files with visual cues like these helps a person find a particular file much more easily. Here is how it is done:

  1. First choose the file you want to tag with an emblem.
  2. Now right click on it and select Properties from the popup menu.
  3. The Properties dialog will have multiple tabs. From them, select the tab named 'Emblems'. Here you can choose from a collection of icons; Gnome allows one to choose multiple emblems. Once the emblems are chosen, press the 'Close' button. And you are done.
I have been using this feature to keep track of files which contain important data. For example, I have a file containing contact information of people I interact with. I have tagged it with the icon showing 'two people', which gives me a cue as to what is in the file.

Operating System Reviews

Which is the best Linux distribution? How does one Linux distro rate against another? These Linux OS reviews will help you make up your mind as to which is the best Linux OS for you. Many of these Linux distributions are live CDs which can be tried out without installing anything on your machine. This page also features a few other OS comparisons as well - for instance, Solaris, BSD, DOS and so on.

Clicking on any image will take you to the respective operating system review.

For a web hosting comparison based on Linux, refer to the guide chart.

Happy reading.

Tuesday, 6 June 2006

How to install anything in Ubuntu

Consider this scenario... You are a Linux neophyte and until a few days back were tied down to your favorite proprietary OS. But after hearing so much buzz about Linux, you ultimately decided to take the plunge and give Linux a try by installing it on your machine. Going by the popular trend, you zeroed in on (what else?) Ubuntu as your Linux distribution of choice. Now, after getting help from one of your geek friends to install it (you could really have done it yourself), you boot into Ubuntu and you feel... disoriented. You have that jittery feeling all of us have experienced at some time in our lives when faced with a change. In short, you feel lost and are looking for some hand holding in working with Ubuntu.

This is a common feeling that each of us initially goes through, especially when we make the switch to a newer and better operating system which also comes at an unbeatable price. But then, one of the chief strengths of GNU/Linux is the strong community support that no closed source software company could hope to provide.

On this note, Simon Gray has written an excellent article titled "How to install anything in Ubuntu" which should put a new Ubuntu user at ease and hand hold him/her through the fine art of installing software packages in Ubuntu. The article is replete with screenshots and explains how to install software packaged in a variety of formats, including deb, rpm, tgz, bin and exe. He also dwells on searching for packages in Ubuntu as well as enabling extra repositories to get software packages not found in the default official repositories. In short, a very nice article worth spending one's time on.
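To give a taste of the most common cases the article covers, these are the usual commands for the repository, .deb and .rpm routes - the package names here are placeholders:

$ sudo apt-get install some-package    # from the Ubuntu repositories
$ sudo dpkg -i some-package.deb        # a .deb downloaded by hand
$ sudo alien -i some-package.rpm       # convert an .rpm and install it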

Sunday, 4 June 2006

Six steps to installing Ubuntu Dapper Drake

If you ask me, I would say that installing Linux is much easier than installing Windows. For one, you can skip the licence agreement. Second, the installers bundled with most Linux distributions give a clearer idea of what is going on and of the steps the user needs to take to finish installing Linux on one's machine. But the latest version of Ubuntu, named Dapper Drake (Desktop edition), clearly takes the cake in the installation department with its live CD graphical install method. The Ubuntu team has fine-tuned the installation of Ubuntu down to a mere six steps. Just for once, I will let the images below speak for themselves:

Fig: Step 1 - Choose your language

Fig: Step 2 - Set your timezone

Fig: Step 3 - Select your keyboard layout

Fig: Step 4 - Create a user account

Fig: Step 5 - Partition your disk (if necessary)

Fig: Step 6 - Finally, start the installation

Fig: Installation in progress

Fig: Installation complete dialog

As you can see, it doesn't get any simpler than this. It takes only a little comparison with installing GNU/Linux a few years back to realize how much GNU/Linux has advanced over the years.

Friday, 2 June 2006

Useful site to learn shell scripting

In the past, I wrote an article on this blog titled "10 seconds guide to Bash Scripting" which explained the finer nuances of learning to write scripts for the bash shell. But there is much more to it than scripting for one particular shell, as there are numerous shells available for the Linux/Unix environment.
The site Shelldorado positions itself as a site which explains all things related to shell scripting. It contains, among other things, articles explaining good coding practices for writing scripts, tips and tricks related to scripting, a good collection of ready made scripts and much more. I found the site really informative in its depth of coverage of the art of shell scripting. In fact, the tips and tricks section is categorized into beginner, intermediate, script programmer and advanced sections and is well worth a look.
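As a tiny example of the kind of good practice such sites preach, here is a sketch showing two of the most common recommendations - failing early and quoting variables. The file names are purely illustrative:

FILE: safe_backup.sh
#!/bin/bash
# Abort on the first failing command and on any use of an unset variable
set -eu

backup_dir="$HOME/my backups"   # a path with a space in it...
mkdir -p "$backup_dir"          # ...handled safely because it is quoted
cp -a "$1" "$backup_dir/"       # back up the file named as the first argument
echo "Backed up $1 to $backup_dir"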