Category Archives: Linux

Ubuntu Server 7.10: Installing Proftpd and ‘warning: unable to determine IP address of’

I had some trouble installing proftpd today on a Linode.com VPS. For anyone who hasn’t used Linode, it’s a colocation service that leases various grades of Linux VPS servers at reasonable prices. I’ve been using them for 4 months with few problems.

Installing proftpd in Ubuntu is easy. All you have to do is open a Terminal window or SSH into your server and type:

sudo apt-get install proftpd

However, for some reason the Ubuntu images mounted on Linode’s servers have a quirk or two that require a few tweaks to get the server working. The first is whether to run the server in standalone or xinetd mode. The server won’t be able to access port 21 unless it’s operating in standalone mode, so set it to standalone mode.
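On Ubuntu, the mode is controlled by the ServerType directive in proftpd’s configuration file. A minimal sketch of the relevant line (the file path below is the Ubuntu 7.10 default; check your own install):

```
# /etc/proftpd/proftpd.conf (excerpt)
ServerType    standalone
```

After changing it, restart the server with ‘sudo /etc/init.d/proftpd restart’.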

The second quirk, which is not unique to Linode VPS servers, concerns resolving the host’s address. You must add the server’s static IP address to the hosts file. For example, suppose your machine’s name is ‘johnny’ and its static IP address is 79.221.23.12.

To edit your hosts file, type

sudo vi /etc/hosts

Your hosts file should look like this:

127.0.0.1       localhost
79.221.23.12 johnny

# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
ff02::3 ip6-allhosts

You can determine your machine’s host name by opening the file ‘/etc/hostname’.

Afterwards, check your proftpd configuration. To do this, type:

sudo proftpd -td5

If you would like more information on setting up proftpd or Ubuntu’s network configuration, consult the ProFTPD documentation and the Ubuntu community forums.

When Not to Run on an Open Source Platform

The other day I had a nice chat with a colleague about the benefits and pitfalls of using an open source platform. I know this subject has been discussed at length on various forums, articles, and blogs, and I’ve personally done research on it for Alcatel-Lucent and the University of Texas at Dallas. For businesses, the bottom line is cost. There are certain scenarios where switching platforms, whether to open source or not, does not make sense. The scenario I discussed with my colleague revolved around the cost of a business switching its custom server applications from a Microsoft platform to a Linux platform.

My colleague’s business faces two main hurdles before considering a migration: the time it will take to migrate custom applications and the expenses incurred during the migration. If the current system is meeting customer needs and does not need further changes, then migrating to an open source platform may not be the best choice, especially if the system is not close to the end of its lifetime. Open source software is best used when continuous scaling is demanded: because the licensing fees incurred when scaling can become astronomical, planning needs to weigh the benefits of a proprietary solution against an open source one. For example, Microsoft’s Visual Studio has excellent tools for building .NET applications. If the time saved developing on this platform is justifiable and the scaling needs do not supersede the cost saved by using an open source platform, then a Microsoft solution may be warranted.

Despite my overall excellent experience developing on the Microsoft platform, I still have a hard time recommending it as the platform of choice, mainly because of the cost and time savings of the alternative. These savings are not obvious to people who have never used an open platform before. One way to explain them is the notion of barriers, or hurdles, on the way to completing a goal. When developing or deploying a system, one of the hurdles is paying for licenses, which costs both time and money. If the entire application stack is free, you eliminate that expense altogether for the rest of the system’s lifetime. The complete application stack can then be cloned or deployed any number of times, whether as a test, development, or production system, without the time and money you would spend on a purchased product.

Removing this hurdle has changed the way entire systems are deployed, in both open source and commercial projects. One simple example is Debian’s software distribution tool, ‘apt’: a user can script out the default software configuration for a server in a single command line, installing a web server, office suite, browser, a couple of games, a couple of compilers, and an IDE with one shell command. Removing the purchasing barrier also paves the way toward completely automating the scaling of a system’s infrastructure. For the small software vendor, you just can’t do that on a Windows platform.
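That one-command setup can be sketched concretely. The package names below are illustrative examples from the Ubuntu 7.x archive, not a recommendation; substitute whatever your server actually needs:

```
sudo apt-get install apache2 openoffice.org firefox gnome-games gcc g++ eclipse
```

Because the whole line is scriptable, the same command can provision a fresh test or production box identically every time.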

Ubuntu Desktop 7.10: Setting Up an HP 1200 Printer

After a short search through the Ubuntu forums, I ran into this post that went into detail about setting up an HP printing device. After briefly reading through the instructions, I ran a utility called ‘HPLIP’, a program that automatically downloads and compiles all the necessary files to activate your printer. The program will ask you a few questions about your computer and ask you to replug your printer at the end of the installation. After doing this, I printed a test page and golly… it actually worked.
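For reference, on Ubuntu the utility ships in the ‘hplip’ package, and its interactive installer is the ‘hp-setup’ command. A sketch of the steps, assuming the package and command names on your release match:

```
sudo apt-get install hplip
hp-setup
```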

Ubuntu Linux: Syncing Documents between Different Computers Using NFS and Unison

The other day I successfully made a full transition from my laptop to my desktop as my primary development environment. The biggest hurdle before completing this transition was transferring and syncing documents between the two machines. For quick file transfers, I created a network share on my desktop using NFS (Network File System) and mounted it on my laptop. For a quick overview of how to set up and mount NFS, consult this thread on the Ubuntu forums.
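As a sketch of the server side, an NFS export is one line in ‘/etc/exports’ on the machine sharing the files (the paths and the 192.168.1.0/24 subnet here are hypothetical; adjust to your own network):

```
# /etc/exports on the desktop: share /home/alex/docs read-write with the LAN
/home/alex/docs  192.168.1.0/24(rw,sync)
```

After editing the file, run ‘sudo exportfs -ra’ on the desktop, then mount from the laptop with ‘sudo mount desktop:/home/alex/docs /mnt/docs’.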

I also wanted to sync documents against a centralized server and have the ability to compare differences between a client and a centralized ‘master’ copy (think Subversion, but without all the permissions and change logging). After a quick Google search, I found a wonderful program called ‘Unison’. It lets a user define a master directory on a server and slave directories on clients: the master is the directory against which all clients compare their files, and the slaves are the client copies that send new files to, or receive files copied from other clients via, the master directory. For directions on installing Unison, consult this article on howtoforge.com.
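A minimal Unison invocation looks like the line below, assuming Unison is installed on both machines and SSH access works; the hostname and paths are hypothetical:

```
# compare and sync a local replica against the master copy over SSH
unison /home/alex/docs ssh://desktop//home/alex/docs
```

Unison walks both trees, shows the differences, and propagates changes in whichever direction you confirm.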

Developing on an Open Source Platform

Lately I’ve been tasked with writing a very client-heavy administration interface for a web application. It uses the Extjs framework and its widgets for building the GUI, Django + Python for the application tier, and PostgreSQL for the database end. We’re using Apache and Ubuntu Server as our platform. The entire application stack is open source, so the acquisition cost of starting development is nil.

In the past few weeks, I’ve developed more insight into the advantages of developing on a completely open source stack. The newest pro I’ve discovered is the documentation and active communities behind the larger, more popular projects. I know a lot of MS developers moan about the lack of adequate support for some open source projects, but it’s not true of all of them. When choosing components for your system, strong community support is almost a given as a requirement. Fortunately, in OSS projects the utility of a project and the general following behind it go hand in hand.

I can recommend with confidence that the community support behind our application stack (Extjs, Django, Python, PostgreSQL) is strong and adequate for any web application project you may want to pursue.

Ubuntu Desktop: Unplugging/Replugging Your Laptop’s Network Cable and Requesting a New Address from DHCP

Sometimes when working on my laptop, which runs Ubuntu Desktop, I have to move it around, unplugging the network cable and switching to wireless or vice versa. Unplugging and replugging the laptop sometimes leaves it unable to renew its IP address or re-establish a connection with the internet. After a quick Google search, I found this post about the problem.

In order to issue the equivalent of Windows’ ‘ipconfig /renew’, open your shell and type the following:

sudo ifdown eth0
sudo ifup eth0

These two commands will renew your IP address and should fix the connection problem. Alternatively, a program called ‘ifplugd’ monitors your network connection and automatically renews your address when this problem occurs. To install it, open your shell and type:

sudo apt-get install ifplugd

Ubuntu Desktop: How to Find Your Application Files that Store Your Personal Preferences

Today I was trying to find the location of my gaim chat logs in Ubuntu and noticed that none of my logs were being found through the File Search program (‘Places’ -> ‘Search for Files’). After digging through a few Google queries, I ran into a blurb about hidden folders prefixed with a period. It turns out all user-specific preference files for your applications are stored in directories following the notation ‘/home/user_name/.program_name’. So all my personal files and settings related to gaim are stored in ‘/home/alex/.gaim’.

The File Search program ignores hidden files by default. In order to search hidden files, click on the ‘Available Options’ drop down list and select ‘Show hidden and backup files’ and press the ‘Add’ button. This should include all hidden files in your search.

To see a list of the hidden directories in your home folder, open the shell and type the following:

ls -a

By default, your command prompt opens in your home folder, so you shouldn’t need to navigate to ‘/home/user_name/’ first.

Mirroring an FTP site in Ubuntu Server

The other day I was tasked with mirroring an FTP site, about 5+ GB of files, on our local server. Mirroring directories is a fairly common task when administering servers; the main variables are the protocols available, whether the job is bi-directional or one-way, and how fast the mirroring needs to occur.

Lucky for me, this job did not demand an instantaneous sync, and it was one-way only, meaning changes were mirrored from the remote server to the local one. The biggest constraint was that the job was limited to the FTP protocol. This immediately removed rsync, a popular server/client for syncing directories remotely, as an option. After a quick search through the Ubuntu forums, I stumbled upon a post that detailed several programs for mirroring a site using the FTP protocol only. I chose a program called ‘ftpmirror’.

Ftpmirror is a program that lets a user define ‘packages’, which are configuration details for mirroring an FTP site, and schedule those ‘packages’ to run daily, weekly, or monthly. To install it, I typed

sudo apt-get install ftpmirror

If you’re using Ubuntu Server, the configuration files reside in the ‘/etc/ftpmirror/’ directory. Browsing through the directory, you will find a file called ‘ftpmirror.cf-sample’, which contains a few example ‘packages’ that can be used as templates. Active ‘packages’ go in the ‘ftpmirror.cf’ file. My ‘ftpmirror.cf’ file looks like this:

package = alexkuo_media
ftp-server = master.alexkuo.info
ftp-user = mirror
ftp-pass = password
remote-directory = /media/pics/
local-directory = /home/deploy/media/pics

This package uses ‘/media/pics/’ as the root directory on the FTP server ‘master.alexkuo.info’ and logs into the remote server with the username/password pair mirror/password. Once the package is activated, all files, directories, and subdirectories found under ‘/media/pics/’ are downloaded into ‘/home/deploy/media/pics’ on the local machine.

I decided to run this job once a week, so I added the package to the ‘/etc/ftpmirror/list.weekly’ file. To do this, open the ‘list.weekly’ file with a text editor. Mine looks like this:

alexkuo_media

Pretty plain, huh? I removed the comments that originally came with the file, so it looks pretty bare. Adding another mirror involves defining another package in the ftpmirror.cf file and appending the package name on a new line in one of the ‘list.*’ files.

Ubuntu Server 7.04 Feisty: Mounting or Accessing a USB Drive in the Bash Shell

I ran into a problem today accessing a USB drive in Ubuntu Server. In Windows and in Ubuntu Desktop, plug and play works fine: my laptop, which has Ubuntu installed, immediately recognized the USB drive once it was plugged in, and I accessed its contents in the Bash shell by typing:

ls -l /media
ls -l /media/ExternalHDD

Typing the same thing in Ubuntu Server will not yield the same results, because Ubuntu Server 7.04 does not come with automatic mounting. After plugging in your drive, you need to run a few commands before you can access it. The first step is to install a program called ‘pmount’. Type:

sudo apt-get install pmount

After installing pmount, you need to figure out what the drive is called on your system. Type:

sudo fdisk -l

You should see output that looks like this:

Disk /dev/sdf: 300.0 GB, 300069052416 bytes
255 heads, 63 sectors/track, 36481 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes

   Device Boot      Start     End      Blocks   Id  System
/dev/sdf4   *           1   36481   293033601    7  HPFS/NTFS

The USB drive I am using is called ‘/dev/sdf4’. To mount the drive, hand pmount the device and a label; pmount creates the mount point under ‘/media’ for you, so there is no need to make the directory yourself. To do this I type:

pmount /dev/sdf4 usbdrive

Afterwards, I copied a bunch of files from the USB drive and changed their ownership from root to myself. I typed:

sudo cp -R /media/usbdrive/filesToCopy /home/alex/
cd /home/alex
sudo chown -R alex:alex filesToCopy
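When you’re done with the drive, it’s worth unmounting it before unplugging; the pmount package ships a matching ‘pumount’ command (shown here with the device name from the example above):

```
pumount /dev/sdf4
```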
