Help on creating a Makefile

December 12, 2005
Whenever you download the source package of a piece of software from the net and decide to compile it on your platform, you usually start by moving into its directory and issuing a make command.

When you execute the 'make' command, it reads a file named Makefile and, according to the directions listed in this file, the source - which may be hundreds of files of code - is automagically compiled (provided the necessary compilers are installed on your machine).

So one of the necessary ingredients for successfully compiling code with the make command is the Makefile. If you are a developer, or aspire to create your own programs, then it is imperative that you know how to create a Makefile, because using make is much more efficient and time-saving than compiling the source by hand.

Guy Keren has written a very clear tutorial on how to create a Makefile to aid the compilation of code. The author takes the reader through all the steps and syntax of the Makefile and explains them with the help of simple examples.

A fly by view of KDE 3.5

December 09, 2005

Recently, the KDE team released the latest stable version of their flagship product, the KDE desktop for Linux and other Unices. Within a short time, the good Samaritans at Ubuntu (or should I say Kubuntu) made the KDE 3.5 packages available in their repository, with instructions on how to upgrade all the packages on your machine to KDE 3.5, provided you were using Breezy Badger v5.10.
The upgrade to KDE 3.5 was a simple affair consisting of a few steps:
  1. Add the digitally signed key to your machine.
    $ wget http://people.ubuntu.com/~jriddell/kubuntu-packages-jriddell-key.gpg
    $ sudo apt-key add kubuntu-packages-jriddell-key.gpg
  2. Include the kubuntu repository in your /etc/apt/sources.list .
    # FILE: /etc/apt/sources.list
    ...
    deb http://kubuntu.org/packages/kde35 breezy main
  3. Update your local database
    $ sudo apt-get update
  4. Lastly do a distribution upgrade
    $ sudo apt-get dist-upgrade
Over 130 MB of packages were downloaded from the repository after which the whole system was upgraded to use KDE 3.5.

KDE 3.5 has many marked improvements over its older sibling (3.4). Some of them, like storage media notification, are visible to the user, while many others are under the hood.

Jure Repinc has written a wonderful review of the improvements that one can find in the new KDE 3.5. His review is an ongoing series, of which part 1 and part 2 have been released. Reading it gives a user a good idea of what to expect from the new version of this ever popular desktop.

But this is just the beginning. KDE 4.0, the next major release, is expected to be even more fantastic with respect to usability and eye candy. We can wait with our fingers crossed till its release next year.

Essential housekeeping in Ubuntu

December 07, 2005
I started using Ubuntu Breezy (v5.10) on my machine a month back. Prior to that I was exclusively into Fedora. What drew me to Ubuntu was the huge number of packages in its repositories, including software I find useful on a day-to-day basis, like Tomboy, which I had to compile from source in Fedora. But the Ubuntu CD comes with only the base packages, which support only open file formats. So if you want support for proprietary file formats like MP3 and QuickTime, or to install software not included on the CD, you have to do some work.

I call it essential housekeeping because it is not exactly a problem, but only a matter of finding out how to get the necessary support. Here I share my experiences in putting the Ubuntu house in order on my machine.

Adding the universe and multiverse repositories
The first thing to do in Ubuntu is add the universe and multiverse repositories to the /etc/apt/sources.list file. Usually you only need to uncomment the sections which are commented out. But I found that even the universe repository did not contain some packages, like mplayer. So I had to search the net for a repository which contained the mplayer package and add it to my sources.list file.
deb cdrom:[Kubuntu 5.10 _Breezy Badger_ - Release i386 (20051012)]/ breezy main restricted
deb cdrom:[Ubuntu 5.10 _Breezy Badger_ - Release i386 (20051012)]/ breezy main restricted

deb-src http://in.archive.ubuntu.com/ubuntu breezy main restricted

deb http://in.archive.ubuntu.com/ubuntu breezy-updates main restricted
deb-src http://in.archive.ubuntu.com/ubuntu breezy-updates main restricted

deb http://in.archive.ubuntu.com/ubuntu breezy universe main restricted
deb-src http://in.archive.ubuntu.com/ubuntu breezy universe

deb http://in.archive.ubuntu.com/ubuntu breezy-backports main restricted universe multiverse
deb-src http://in.archive.ubuntu.com/ubuntu breezy-backports main restricted universe multiverse

deb http://security.ubuntu.com/ubuntu breezy-security main restricted
deb-src http://security.ubuntu.com/ubuntu breezy-security main restricted

deb http://security.ubuntu.com/ubuntu breezy-security universe
deb-src http://security.ubuntu.com/ubuntu breezy-security universe

deb http://ca.archive.ubuntu.com/ubuntu breezy universe
deb http://si.archive.ubuntu.com/ubuntu breezy multiverse
The last two lines are the ones I added separately. The final line, for example, points to the repository which has the mplayer package; I couldn't find that package in any of the other repositories.

Once I added the necessary repositories, the next step was to update the package database in Ubuntu. This I did by running the following command:
$ sudo apt-get update
Once updating was completed (took about 10 minutes), I got ready to install all the software that I found necessary. I had already prepared a list of the software I wanted and so it was only a matter of firing up synaptic (the GUI front end to apt-get ) and installing the software. The software list I prepared was as follows:
  • Flash plugin for firefox web browser
  • mplayer with quicktime and wmv support
  • libdvdcss support for playing encrypted DVDs
  • fluxbox - a lightweight window manager. I find GNOME and KDE (with all the eye candy) distracting for doing serious work.
  • GNUCash - A personal finance package
  • Tomboy - A very good note taking application based on Mono.
  • Sun's JDK - For java support
  • Gkrellm - A GUI which gives a real-time graphical display of system status - CPU load, memory usage, processes loaded in memory, network traffic and so on - which I find really useful.
  • Compiler tools - Ubuntu is a distribution targeted at the ordinary user. So it does not bundle a compiler on the CD. But compiler tools are handy because you never know when you might need to compile a package from source and install on your machine.
  • NVIDIA glx drivers - I have an NVIDIA GeForce 2 MX graphics card. And even though Ubuntu contains the nvidia drivers, I had to install the nvidia-glx package manually.
  • Install firestarter - the front end for the iptables firewall - and configure it. This is desirable because I found that the default installation of Ubuntu leaves the system wide open. Check by running the command:
    $ sudo iptables -L
    With an open system, your machine is just sitting there waiting to be cracked by crackers and script kiddies, or to be roped into a DoS attack.
Other than the flash plugin and mplayer, I had no trouble installing any of the software. It was only a matter of selecting the necessary packages in synaptic and installing them. You can also do it on the command line as follows:
$ sudo apt-get install fluxbox tomboy gnucash

Installing NVIDIA Graphics card drivers
First I installed the glx driver package as follows:
$ sudo apt-get install nvidia-glx
Then I had to enable the driver to be used by X server.
$ sudo nvidia-glx-config enable

Installing Flash plugin
At first I couldn't find the flash player package in synaptic. The problem solved itself when I accessed a web page which contained a flash component. Firefox informed me that a plugin needed to view the component was missing and prompted me to install it by directing me to the Macromedia website. Once I installed the flash plugin for Firefox, I was able to view flash based websites.

Installing the compiler tools
I installed the compiler tools, which include among others gcc 4.0 and g++, as follows:
$ sudo apt-get install build-essential
$ sudo apt-get install manpages-dev autoconf automake libtool
$ sudo apt-get install flex bison gcc-doc g++
Now I could easily compile a source package and install it whenever the need arose.

Installing mplayer

Here I had to do a little more work than above, but not as much as you might think. First I downloaded the essential codecs from the mplayer website. It was a bzip2 compressed tar file around 9 MB in size. I unpacked it and copied the contents into the directory /usr/lib/codecs .
$ tar xvjf essential-20050412.tar.bz2
$ cd essential-20050412
$ sudo mkdir -p /usr/lib/codecs
$ sudo cp -R ./* /usr/lib/codecs/.
Now that the codecs were installed, the next step was the installation of mplayer itself. This was achieved with these simple commands:
$ sudo apt-get install mplayer-586
$ sudo apt-get install mplayer-fonts
Support for playing encrypted DVDs - libdvdcss
If you want to play encrypted DVDs, it is important to have the libdvdcss library installed on your machine. There is a very good site called ubuntuguide.org which lists a method of using apt-get to install this library. But somehow, it did not work for me; I got an error message saying that the package was not available in the repository. So I had to use a more roundabout way to install this library on my machine.
  • First I downloaded the source file libdvdcss-1.2.8.tar.bz2 using wget from the videolan.org website.
  • Then I unpacked it in my home directory and then compiled and installed it in the /usr directory as follows:
    $ tar -xvjf libdvdcss-1.2.8.tar.bz2
    $ cd libdvdcss-1.2.8
    $ ./configure --prefix=/usr && make && sudo make install
The installation went normally and I was able to play encrypted DVDs on ubuntu.

Housekeeping is a necessary evil on all OSes, whether Windows, Linux or OS X. On Linux, it is a little more work because of the licence restrictions imposed by the owners of the popular proprietary file formats. But Ubuntu has done a good job of reducing this work as much as possible.

Book Review : Self Service Linux - Mastering the Art of Problem Determination

December 05, 2005
Self-Service Linux - Mastering the Art of Problem Determination is part of the "Bruce Perens Open Source Series". This is the very same Bruce Perens who gave the computing world the Open Source Definition, which has since been embraced by such popular projects as OpenOffice.org and Mozilla. He is one of the leaders of the Open Source movement and, as a Debian GNU/Linux Project leader, was instrumental in getting the system onto two U.S. Space Shuttle flights. Prentice Hall and Bruce Perens have come together to publish a series of books under this banner, and this book falls in that category.

The Contents of the book at a glance
The book is divided into nine independent chapters and two appendices, each of which covers a distinct topic invaluable to any Linux programmer or system administrator.
The first chapter deals with the four phases of investigation which should precede any problem solving, including the steps for an effective technical investigation. By using the guidelines described in this chapter, anyone will be able to proceed according to a plan of action, which will help save valuable time when troubleshooting a problem. These tips also help the programmer stay focused on the problem at hand.

All Unices, including Linux, have a very powerful tool in strace, a utility used to trace the system calls that applications make to the kernel. The second chapter of this book takes an in-depth look at the ways in which a programmer can leverage this tool for effective troubleshooting.

Any book covering problem solving in Linux would be incomplete without a chapter dedicated to the /proc filesystem. The third chapter of this book gives a broad analysis of the /proc filesystem, including all the important files and directories residing in it. In fact, this chapter is spread across 30 pages, which should give you a fair idea of the depth of coverage of this very important virtual filesystem.
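For readers who have never poked around in /proc, a quick illustrative session (this is generic /proc usage, not one of the book's examples):

```shell
# /proc entries are generated by the kernel on the fly; they occupy
# no disk space and always reflect the current state of the system.
cat /proc/version            # the running kernel's version string
grep MemTotal /proc/meminfo  # total physical memory known to the kernel
ls /proc/self                # the /proc entry of the process reading it
```

Every running process gets its own numbered directory under /proc, which is where tools like ps and top get their information.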

The fourth chapter, titled Compiling, concentrates on defending against many compilation-related problems, the potential pitfalls to look out for, and compiler optimisation. Mind you, this chapter does not just list the commands for compiling, but takes the reader through solving an error message generated while compiling the kernel. The authors also explain the various errors that a user can come face to face with while compiling the kernel.

One of the most fundamental parts of a computer is the stack. A stack is basically used as temporary storage for data and plays an important role in the working of an OS. The fifth chapter of this book pursues this topic with vigor. After reading this chapter, a person will have a very good knowledge of stacks, the role they play and their inner workings on varied architectures like x86 and x86-64.

The sixth chapter gives a broad review of the GNU debugger. Even though there are other, proprietary debuggers, GDB has the advantage of being one of the few available across the myriad of hardware platforms running Linux. Without a debugger, problems like memory corruption and code logic errors are very hard to pinpoint. The authors have done an excellent job of explaining the art of debugging with GDB for effective problem determination.

The seventh chapter is a short one and deals with Linux system crashes and hangs. Among other things, it shows how you can use an Oops report to find the causes of system crashes.

Kernel debugging using KDB forms the basis of the eighth chapter of this well structured book. KDB is a kernel debugger which can be used to debug a live kernel and it can be used from the machine running the kernel itself.

ELF: Executable and Linking Format is the ninth and final chapter of this book. All Linux shared libraries and executables use the ELF format. A good knowledge of ELF improves one's overall understanding of how an operating system works, and knowing the ELF format helps improve the diagnostic skills of a Linux user. Considering the importance of the topic with respect to problem determination, the authors have paid special attention to this chapter, which happens to be the second largest in the book.
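A quick way to see ELF firsthand (a generic illustration, not an example from the book):

```shell
# Every ELF file begins with the four magic bytes 0x7f 'E' 'L' 'F'.
# Dump the first four bytes of a known binary to confirm:
head -c 4 /bin/ls | od -An -tx1
# For the full header (class, endianness, entry point and more), run
# readelf -h /bin/ls, part of binutils.
```

The same magic bytes open every shared library under /usr/lib as well, which is what the chapter's discussion of linking builds on.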

Lastly, this book has two good appendices. The first, named The Toolbox, gives a listing of all the problem determination tools available for Linux. The second lists a Data Collection Script, an invaluable aid for collecting the right information quickly when a problem occurs.

A word about the Authors
This book has been co-authored by two experienced developers, Mark Wilding and Dan Behman.
Mark Wilding is a senior developer at IBM with over 15 years of experience writing software. He is an expert in operating systems, networks, C/C++ development, serviceability, quality engineering and computer hardware.

Dan Behman is a member of the DB2 UDB for Linux platform exploitation development team at the Toronto IBM Software Lab. He has over 10 years of experience with Linux, and has been involved in porting and enabling DB2 UDB on the latest architectures that Linux supports, including x86-64, zSeries and POWER platforms.

Book Specifications
Name : Self-Service Linux - Mastering the Art of Problem Determination
Publisher : Prentice Hall Professional Technical Reference
ISBN No: 0-13-147751-X
Price : $39.99 US / $55.99 CANADA
No of Pages : 460
Target Audience : Intermediate to expert programmers and administrators.
Website : phptr.com

Things I found interesting in this book
  • This book takes a practical approach to problem determination and solving.
  • It is peppered with sample code and walkthroughs which I found useful in understanding a problem. I especially liked the presentation of strace and gdb.
  • Concepts are explained in a simple and lucid manner keeping complex jargon to a minimum.
  • Though entry-level programmers will find this book a bit beyond them, more experienced people can look forward to gaining a lot of knowledge from it.
All in all, a good book worthy of a place on the reference bookshelf of any programmer or system administrator.

Kompose - The MacOS 'Expose' on Linux

December 04, 2005
People always say: why can't Linux have some of the same functionality as OS X? Look at Apple, and the path-breaking innovations they bring out with respect to usability. It is anybody's dream machine. And then they go for the final kill - have you seen Expose at work on a Mac? It's awesome!

I personally like OS X too, for the importance Apple gives to usability and aesthetics. And I really like Expose on OS X, which is the real reason for this post. Now Linux can also claim Expose-like functionality in an application called Kompose. Kompose is an efficient full screen task manager which gives a graphical representation of the open windows on your desktop(s) at any given time.

When you start Kompose, it resides discreetly in your system tray, taking a screenshot of all the applications open on your desktop(s) at set intervals. When you want to switch from one window to another, pressing the key combination [Ctrl+Shift+J] or [Ctrl+Shift+I] (you can remap the key combinations in the configuration settings) will neatly tile all the applications on your desktop (see picture below). Kompose creates a full screen view where every window is represented by a scaled screenshot of itself. Clicking on the image of the application of your choice then gives the focus to that application.

Fig: Kompose at work on a Linux desktop

Kompose makes use of Imlib2 to render the images. The minimum requirements for running this nifty application are KDE 3.2, Qt 3.2 and Imlib2. Of course, I need not tell you that for such graphics intensive operations you also need a fairly fast machine. It ran smoothly on my Pentium 4 2.0 GHz machine with 256 MB of RAM.

At present, most users have to manually download and install it. Debian users can simply run:
# apt-get install kompose
However, on my Fedora Core 2 machine, I had to download and compile it from source (another reason to switch to a Debian based distribution).

But hopefully, by the time KDE 4.0 is released, we can look forward to it being included as a standard feature of KDE.

Related Posts:
3dDesktop - The desktop switcher on steroids

Adding Windows Fonts in Linux

December 02, 2005
Unlike in times past, Linux does come with good fonts, and the font rendering can be made better by choosing to antialias the fonts. But at times you come across a website which has been designed with the Windows user in mind. Such websites are best viewed with one of the Windows fonts. If you have Windows installed on your machine, you can copy the essential fonts from the Windows partition to Linux and use them to get a better web experience. Here is how you do it.
Method 1:
Copy the TrueType (TTF) fonts Arial, Tahoma, Verdana, Courier New and Times New Roman from the Windows partition to the fonts:// location in Nautilus.

Method 2:
Some people using Ubuntu have said that they can't do this as a normal user. And since Ubuntu does not have a root account by default, they have difficulty using su to copy the fonts either. Such people can do the following:
Create a '.fonts' folder in your home directory and copy the necessary fonts into it. Now you have access to the fonts on a per user basis.
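In shell terms, this method boils down to something like the following; the Windows mount point shown is just an example path, so adjust it for your machine:

```shell
# Per-user font installation: no root access required.
mkdir -p ~/.fonts
# Copy the TTF files from the mounted Windows partition (example path).
cp /media/hda1/windows/Fonts/*.ttf ~/.fonts/ 2>/dev/null || true
# Rebuild the per-user font cache so applications pick up the new fonts.
fc-cache ~/.fonts 2>/dev/null || true
```

The fonts are then available only to that user, which is exactly what you want on a shared machine where you cannot become root.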

Method 3:
This method can be used to install the fonts system wide if the above two methods do not give satisfactory results.
First find out where Linux has installed the TrueType fonts. It is usually '/usr/share/fonts/truetype/', but you can also search for it as follows:
# find /usr -iname \*.ttf |head -n 5
Once you know the path of the fonts directory, move to this directory and create a folder there (it can be any name).
# cd /usr/share/fonts/truetype
# mkdir windowsfonts
Note: You need to be logged in as root while doing this.
Next copy all the windows ttf fonts to the windowsfonts directory that was just created.
# cp /media/hda1/windows/Fonts/*.ttf .
Now change the ownership of the fonts and make sure they have permissions of 644.
# chown root.root *.ttf
# chmod 644 *.ttf
Now run the command mkfontdir while in the windowsfonts directory.
# mkfontdir
This will create an index of the fonts in the directory. It will also create two files fonts.dir and fonts.cache-1 .
Now move to the parent directory and edit the file fonts.cache-1 with your favourite editor, appending the following line to it.
#File: /usr/share/fonts/truetype/fonts.cache-1
...
"windowsfonts" 0 ".dir"
Lastly run the command fc-cache.
# fc-cache
This command will scan the font directories on the system and build font information cache files for applications using fontconfig for their font handling.

That's it. Now you have access to the Windows fonts in all your X applications, including Firefox and OpenOffice.org.

Book Review : Linux Debugging and Performance Tuning (Tips and Techniques)

December 01, 2005
Recently, I got hold of a book named Linux Debugging and Performance Tuning (Tips and Techniques) by Steve Best. The book positions itself as a programmer's guide to debugging and to increasing the performance of the programs you write. It explores the various tools a programmer can use to iron out the bugs that crop up in his programs.

The author starts the narration with a chapter on profiling. Profiling is the art of measuring the time taken by a program, or a part thereof, during execution. A wide variety of tools which the programmer can use for this purpose are covered in this chapter. For instance, on reading it, I came to know how you can use the ubiquitous 'date' utility for profiling your programs.
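The 'date' trick amounts to bracketing a run with two timestamps and subtracting; a minimal sketch, with a sleep standing in for the program being profiled:

```shell
# Crude wall-clock profiling with date: record the seconds-since-epoch
# before and after the run, then take the difference.
start=$(date +%s)
sleep 1                      # stand-in for the program being profiled
end=$(date +%s)
echo "elapsed: $((end - start))s"
```

It only gives second-level resolution, of course, which is why the book moves on to proper profilers for anything finer-grained.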

The next chapter dwells on code coverage analysis - which is the process of finding and targeting dead or unexercised code.

There is a special chapter on the versatile GNU debugger (gdb), which should be in the toolbox of any serious Linux programmer. In fact, the author, with the aid of example code, hand-holds the reader through the steps to be taken to debug a program. A special section of the chapter also lists helpful hints for debugging an application more quickly.

If you are a programmer, then I need not tell you the importance of keeping the memory footprint of your program to a minimum. You also have the unenviable job of hunting down and doing away with the memory leaks hogging your program. The fourth chapter of this book deals with ways to curb memory leaks by using various free memory leak detection programs like MEMWATCH, YAMD, Electric Fence and Valgrind.

The fifth chapter explores the /proc filesystem. Any Linux user will know that /proc is a virtual filesystem residing in memory which, among other things, provides an easy view of kernel resources and components. This chapter deals with the various tools a user can use to view system resources like CPU load, memory and so on. The tools covered include, but are not limited to, vmstat, diskstats, iostat and mpstat.

The next chapter deals with the various system tools that a programmer or system administrator can use to find which component of the system is having a problem. The author also covers the network debugging tools in this chapter.

Chapter seven deals with the subject of system error messages. As you know, an OS generates a lot of messages which give the user or programmer a fair idea of what went wrong and where. This chapter pursues the analysis of an Oops message to find the failing line of code.

If you are a regular visitor to this blog, then you might have read my previous posts on using logging in Linux. Chapter 8 covers this and much more and explains it from a programmers point of view.

Any programmer or system administrator will know the importance of tracing. By tracing, you are able to collect system data in real time (that is, while the process is running). A trace can be used to isolate and understand system problems and to find where in the system you are faced with bottlenecks. Chapter 9, "Linux Trace Toolkit", looks at just that.

As I explained earlier, a profiler is a program which helps the programmer analyse the performance of applications and the kernel, thereby helping him optimise his programs. Whereas in the first chapter the author covered how one can profile using simple tools like 'date', here (chapter 10) he explores the use of a more sophisticated profiler called oprofile.

The next chapter (chapter 11) is my favourite. It gives a very detailed (but easy to understand) description of UML (User-Mode Linux). For those in the dark, UML is a complete Linux kernel which has its own scheduler and virtual memory system. UML has gained the fascination of programmers around the world mainly because it lets the programmer test his applications in a completely isolated Linux environment within the parent OS. The author takes the reader right from an introduction to UML, through patching and building a kernel, to booting a UML and its utilities. I really liked the tips section in this chapter, which revealed new insights into the workings of UML.

Chapter 12 deals with the topic of Dynamic Probes. They help a programmer in acquiring diagnostic information without including special blocks of code in the program. Dynamic probes can also be used as a tracing mechanism for both user and kernel space.

If chapter 3 dealt with GNU debugger (gdb), chapter 13 deals with kernel level debuggers. This chapter covers two kernel level debuggers kgdb and kdb.

The final chapter of this book (chapter 14) deals with the topic of Crash Dump. A crash dump is designed to meet the needs of end users, support personnel and system administrators who need a reliable method of detecting, saving and examining system problems. The author covers this topic with elan.

Things I liked about this book
  • Each chapter contains complete (but simple) programs which are used by the author to explain how to use a set of tools to tackle a problem.
  • This book takes a hands-on approach to tackling problems that a programmer or system administrator might face in his work.
  • No complex jargon is used in explaining the concepts, which brings the book within the reach of even fresh programmers.
  • There is something for everybody - from a newbie in Linux programming to an expert programmer.
  • At the end of each chapter, there is a section which lists the web resources where the reader can find more information about the topic covered.
  • The tools covered in this book range from the ubiquitous to the specialist ones. In fact the author throws light on each and every tool covered in this book with the aid of examples.
Meet the Author
This wonderful book has been authored by Steve Best. He works in the Linux Technology Center of IBM in Austin, Texas. He is currently working on Linux storage-related products. Some of the achievements of Steve are as follows:
  • Has led the Journaled File System (JFS) for Linux project.
  • Has worked on Linux-related projects since 1999 and has done extensive work in operating system development focusing on file systems, internationalization, and security.
  • He is the author of numerous magazine articles, many presentations, and the file system chapters in Performance Tuning Linux Servers (Prentice Hall PTR 2005).
Book Specifications
Publisher Name : Prentice Hall (Professional Technical Reference)
ISBN No: 0-13-149247-0
No of Pages : 430
Price : Check at Amazon
Intended Audience : Beginners, Intermediates and Experts alike

Final Word
Believe me, writing a book is a very tough job. Here, Steve has done a wonderful job of compiling a book which can be an asset in the hands of any Linux programmer. There are even chapters which will help system administrators tackle, with ease, hard to understand problems they encounter in their work. Reading this book has definitely enriched my knowledge of Linux with respect to debugging and optimization techniques.