The Cat Fancier's Handbook

Saturday, December 20, 2008

Timing is everything

My file synchroniser needs to know when a file was last modified, in order to decide if this version is newer than the version it's being compared with. Perl provides a very convenient stat() function that includes this, along with a variety of other information.

All we need do is...

use File::stat;

then...

my $StatusBlock = stat($Entry)
or die "Couldn't stat $Entry: $!";
my $LastModified = $StatusBlock->mtime;
print MASTERFILE "$Entry | $LastModified\n";

The first line calls stat() and keeps the status object it returns (File::stat overrides the built-in stat() to return an object rather than a list), the second is a crude handler for any errors, the third extracts the number of seconds since the epoch at which the file was last modified and the last prints the file name and the timestamp to my work file.
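
Putting those pieces together, here's a minimal self-contained sketch of the idea. The file name and work-file name are hypothetical, and MASTERFILE is opened explicitly so the snippet runs on its own; the original script opens it elsewhere...

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::stat;

# Hypothetical names, for illustration only.
my $Entry    = 'example.txt';
my $WorkFile = 'master.lst';

# Make sure there's something to stat.
open my $fh, '>', $Entry or die "Couldn't create $Entry: $!";
print $fh "test data\n";
close $fh;

# stat() from File::stat returns an object with named accessors.
my $StatusBlock = stat($Entry)
    or die "Couldn't stat $Entry: $!";
my $LastModified = $StatusBlock->mtime;

# Record the name and timestamp in the work file.
open MASTERFILE, '>', $WorkFile or die "Couldn't open $WorkFile: $!";
print MASTERFILE "$Entry | $LastModified\n";
close MASTERFILE;
```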

This is a nice example of Perl's power to simplify something that would otherwise require a lot of code. What's more, it's easy to understand, which is a massive advantage when maintenance is required. Having spent far too long puzzling over obscure code in the wee small hours with frantic managers breathing heavily in my ear, 'easy to understand' seems very good to me.

Friday, December 19, 2008

Progress in Perl

I always find 'silent' programmes especially irritating. There's nothing worse than starting a process and then watching a frozen screen in the hope of seeing some sign of progress. This is especially irritating with CLI programmes - you just don't know if it's busy, crashed or plain confused.

I've been working on a directory synchroniser recently and this very issue, of letting the user know that something is happening, came up. There are all sorts of ways to solve it. Simple but inelegant solutions include filling the screen with lines of detail showing what's going on, printing a single dot for each pass through the programme's loop, even writing a copious log and sneeringly advising the user to tail -f the file. Then there are the pretty, but complicated solutions, all of which tend to be a variation on the progress bar theme. Besides taking a lot of effort, these tend to impose a significant run-time overhead and, for a simple little job such as my synchroniser project, seem like massive overkill.

There is, however, one very neat and simple CLI solution: use a spinner. Although GUI spinners can be complicated, in text mode Perl, they are almost laughably trivial, although I don't recall seeing this approach described anywhere. The slight spin (ouch!) on my approach is that I needed the progress to be shown in a sub-routine that calls itself recursively, but that's quite easy with some global variables.

I only needed three variables to implement the spinner. I defined them with the vars pragma, as the routines that will use them are in packages outside the main script...

use vars qw/
@Spinner
$ProgressCount
$BackSpace
/;

@Spinner is a simple array for the characters that make up the spinning wheel, $ProgressCount is an integer that decides which character to display and $BackSpace is, as its name suggests, the delete backwards character.

Then, still in the main script, I only had to initialise them...

@Spinner = ('!', '/', '-', '\\', '!', '/', '-', '\\');
$ProgressCount = 0;
$BackSpace = "\010";

The spinner characters are pretty obvious, except that you need to remember to escape the backslash, to avoid any parsing problems. The progress counter is set to 0 when we begin, although it could be any valid index into the spinner array. The backspace value should work equally well for Windows and Unix systems, although I've only tested it on Windows XP and Apple OS-X so far.

Finally, to use the spinner, you only need three lines, at the point where you're showing the progress...

print $::BackSpace . $::Spinner[$::ProgressCount];
if( $::ProgressCount++ > 6 ) { $::ProgressCount = 0; }
$| = 1;

The first line backspaces over whatever was there, then shows the currently selected character. The second line increments the progress counter and checks it hasn't got too big. If it has, it resets it. The third line sets $| to turn on autoflush, thus making sure that we see the progress as it occurs. If you don't do this, you could end up with nothing apparently happening, which rather negates the point of this exercise.

By the way, the syntax '$::varname' is shorthand for '$main::varname' - it refers to 'varname' in the main package, in this case, our control script.
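
For completeness, here's the whole thing gathered into one runnable sketch. The loop and its short delay are stand-ins for real work, not part of the original scripts, and the ShowProgress name is mine...

```perl
#!/usr/bin/perl
use strict;
use warnings;

use vars qw/ @Spinner $ProgressCount $BackSpace /;

@Spinner       = ('!', '/', '-', '\\', '!', '/', '-', '\\');
$ProgressCount = 0;
$BackSpace     = "\010";

$| = 1;    # autoflush, so each character appears immediately

sub ShowProgress {
    print $::BackSpace . $::Spinner[$::ProgressCount];
    if ( $::ProgressCount++ > 6 ) { $::ProgressCount = 0; }
}

print ' ';    # give the first backspace something to rub out
for my $Pass ( 1 .. 20 ) {    # stand-in for the real work loop
    ShowProgress();
    select( undef, undef, undef, 0.05 );    # brief pause so the spin is visible
}
print $BackSpace . "done\n";
```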

Tuesday, December 16, 2008

It lives, Igor, it lives!

My spare parts, twin screen PC is now running smoothly under 'obsolete' Windows 2000. I've got Vmware Server running guests created in other hosts, provided the guests have been shut down correctly.

I've also come across the dreaded '511 vmware-serverd service is not running' message. This seems to be a general purpose grumble, along the lines of Apache's 404 message.

In my own case, I cleared it by removing all the 'Datastore' lines in

\Documents and Settings\All Users\Application Data\VMware Server\config.ini

Then I went to the Control Panel, ran the Administrative Tools application, selected Computer Management and, from the Services and Applications list, selected the Services item. I restarted the VMware Registration Service and the whole thing sprang to life.

VMware Server is a useful tool but you do need to treat it with care.

Sunday, December 14, 2008

Just when you thought it was safe to go back in the water.

I rebuilt my twin screen PC under Windows 2000 and, as expected, it was a simple, straightforward job. Then I loaded VMware Server.

First of all, VMware asked if I wanted to download the new, improved version and I said "yes". As others have discovered before me, this was not a good idea. The first clue was that v2.0.0 is now a whopping 590MByte download, against 1.0.6's sprightly 149MByte. Still, there must be good things to come from all these extra bytes. Perhaps, but not for someone who wants to run on a paltry little AMD XP 2200+ with a minuscule 1GByte of RAM. The good news was that I could keep the Task Manager's performance pane open on one screen, as my outdated box struggled manfully to cope with the demands of the new software. Performance, as the saying goes, sucked. Badly.

So I uninstalled V2 and re-installed V1.0.6.

This should have been the end of the story except, that I discovered something I hadn't known before. All our other PCs are Pentium based, this is the one and only AMD box we have. So all our virtual machines were created on Pentium kit. Well, here's a thing: VMware won't run a guest built under one chipset on a host running a different one.

To be fair, the console will tell you that this is a bad idea but I thought "everything that runs on Pentium runs on AMD XP". No it won't, not if it's a VMware virtual machine. If you ignore the warning and start up the guest, the first time you try to do anything you'll get an error. Living in my closeted Intel bubble I hadn't realised there were major compatibility issues, even though other, wiser heads clearly knew there were.

Apparently, all is not lost. It's alleged that this problem only affects 'Suspended' guests. The theory is that you can migrate a 'Switched Off' guest quite happily. I need to try this and see.

Saturday, December 13, 2008

and now for something completely the same

Having failed to get my twin head 'nix system up under Centos and Mandriva, my thoughts turned to things Sun. I have a nice shiny OpenSolaris CD in my disk wallet and it seemed like that might solve my problem.

Two hours later and the answer is 'no'.

Things turned iffy when the installer failed to recognise my network card, a Via Technologies Rhine II that all the other installers had no problem with. It claimed, however, to understand the Nvidia video card and so it did, kind of. It knew enough to recognise the second monitor and even work out its maximum resolution but was not so obliging as to turn it on. This was something of a problem as it decided that the first screen has a maximum resolution of 640 x 480.

I couldn't even activate the second screen, although it was quite clearly visible to the OS, because someone decided that the Nvidia X control panel dialog should have a minimum size of something larger than 640 x 480, which is sort of reasonable given the screens that most people will be working with. The only problem was that I couldn't get to the controls at the bottom of the dialog. Enough of this sort of thing and you start to lose the will to live.

Finally I tried Kubuntu. That didn't even get as far as installing, as it couldn't live with the Nvidia card for some reason.

I like Open Source. In my own small way I've tried to contribute to it. My commercial experience shows that Open Source is displacing Closed Source in the server room, ever more rapidly. The problem is that the server room is the province of geeks, who will spend the time to solve the problems, as I would solve this problem if I was being paid to do so. But I'm not and nor are most people who will be trying to put 'nix on the desktop.

Open Source software can work wonderfully well, when people who know what they're doing take a hand. Look at Apple's OS-X, which is, at heart, FreeBSD. We use several Apple machines and they're amazingly stable and incredibly versatile. Those who don't want to know what's going on under the covers, need never even think about it, while we techie types can go on-site with a laptop that's impressive even in the board room, yet is running one of the best 'nixes in the business.

Linux too can be stable and useful. The Asus Eee portables show what can be done when a commercial organisation takes charge, nailing down all the problems and quality engineering the rough edges. I've worked on a couple of projects myself where Linux has been engineered into specialist hardware with great success. The common point here is that the companies have applied commercial savvy to the process, specified to the last degree what hardware the OS must run on, then tested the bugs out of the combination. That is something that large corporations can do, which loose federations of enthusiasts will always have a problem with.

It's no harsh criticism to say the Open Source movement needs to handle expectations somewhat more carefully than they do. It's just a case of, "look, this is a list of what we know the software can handle. if your stuff isn't on here, you'll need to proceed with caution". Of course, a geek can find that out easily enough, without being told, and quite possibly sort it out, but most of us aren't geeks.

This was intended to be a quick fix and it turned into a saga, so I cut the whole thing short by zapping the partitions and re-installing Windows 2000 across the whole disk (this is the machine that owns the licence, in any case). For the time being, I'll go back to my usual practice of running Linux on the desktop as a guest, under the free Vmware Server system and admit that, until further notice, Windows owns the desktop.

Friday, December 12, 2008

Close, but still no cigar

Linux is often proclaimed as the saviour of the benighted computer user. Moved to righteous anger by the beast of Redmond and the clear mastery of Linux, a mighty wave of Unix based systems will sweep away the evil empire of Microsoft and there will be much cheering throughout the galaxy.

Perhaps, but not just yet.

I am, it must be said, a friend of all things 'nix. I've earned a good living from the old bag of bolts for many years and consider myself to have some understanding of what's going on inside a running system. So, when I decided to recycle my wife's old LCD screen, by connecting it to my 'spare' PC, I didn't anticipate much in the way of trouble. Well, as Evelyn Waugh was wont to say, 'Up to a point, Lord Copper'.

I already had a couple of small Windows 2000 partitions on the disk, so I plugged the monitor into the spare socket of my old but perfectly serviceable Nvidia FX5500 and right clicked on the desktop. Select 'Properties', click the 'Settings' tab, pull down the 'Display' list and select 'use both monitors'; whereupon, I have to say, Robert was clearly your mother's brother. Total elapsed time: less than 5 minutes, including playing around with both monitors' settings, to get the best balance.

Ah, I thought, now let's put Linux on the unused disk space and build a twin head 'nix system. As I do a lot of work with Redhat servers, it seemed an obvious step to stick Centos 5.2, the free version of Redhat Enterprise Server, on the disk. This was not a success. As expected, Centos installed happily and soon I was looking at a KDE desktop. On one screen.

Nothing I could do would bring the second screen up as anything other than a psychedelic explosion in a neon tube factory. A few minutes searching turned up several comments, to the effect that the 'community' version of the Nvidia driver can't handle twin screens and you need to download Nvidia's own, proprietary, driver. So I did. It didn't. I could not get that driver to install, it just kept complaining that there was a problem with the installation and falling out.

All right, I thought to myself, we'll have another go. Out came the Mandriva 2009 disk and 20 minutes or so later I was looking at, you guessed, one working monitor plus a piece of art guaranteed to get into the finals of the Turner Prize. Sigh.

So I rebooted, only to find that Grub had not got the message about the change in targets, even though Mandriva had claimed to have told it. So I had to boot from my Windows 2000 CD and invoke the recovery console, in order to invoke fixmbr. This is not your everyday DOS command but, when required, there's very little else that will do. So now I have to scrap the Linux partitions and start again.

Well, it's one way to use up a slow Friday.

Friday, September 21, 2007

Installing wxWidgets under Apple OS-X

wxWidgets (formerly known as wxWindows) is a very, very useful cross platform GUI development library. In plain English, you can write an application once on any platform, then deploy it easily on a completely different platform. It's Java Swing without the processing overhead!

You'll find that the installation of wxWidgets on an OS-X system is quite straight-forward but the documentation can confuse the newcomer a little. This may help...

Essentially, you first need to have installed the Apple Developer's Toolkit which you can download, free of charge, from the Apple Developer site. Follow the instructions for installing and testing the tools.

With the tool kit installed, and tested, you can then download the wxWidgets distribution for OS-X from the organisation's home page. Unzipping the single download file will create a substantial file tree with a root directory name of the form wxMac-n.n.n, where 'n.n.n' is the version number. I would recommend that you move this tree to the /Developer path created when you installed the Developer's Tool Kit. This just keeps everything together.

The next thing we have to do is make sure that the installation is correctly set up for your system. We do this by running configure, which has been installed with wxWidgets. Before we do this, we need to create a directory into which the wxWidgets libraries will be compiled. It's strongly suggested that you make this directory under your installation directory.

Let's start by running Terminal. Then the first thing we do when the terminal window is open, is to change to the installation directory...

$> cd ../../Developer/wxMac-2.6.1

Now create the directory that we're going to build the library in and then get into it...

$> mkdir osx-build
$> cd osx-build

We're ready to configure the installation for our needs. We call configure from the parent directory and tell it that we want a stand-alone library...

$> ../configure --disable-shared

This just sets up the makefile which will control the installation. You'll see a nice selection of messages as the process proceeds. When it's done, we're ready to create the library with which we'll link our application programmes. All that's required is to invoke make...

$> make

There will be a long wait while the compilation proceeds, there's a lot for it to do. Finally, if all is well, it will come to a natural halt. Now we can test it.

If we do the following...

$> cd samples/minimal
$> make

...we should see the compiler running. Provided it completes normally we do...

$> open minimal.app

A window should open on the desktop and we should see a sample programme with the Aqua look and feel...

[Screenshot: the minimal sample programme running with the Aqua look and feel]

There's not a lot this basic programme does other than display menus but that should be enough to prove that the installation has worked.

The reason for using open is that wxWidgets on the Mac creates bundled applications. Open is a Darwin executable which emulates, in a Terminal window, the process of double clicking on an application in Aqua.

You can see the effect of the bundling if you look at the samples/minimal directory from the Finder...

[Screenshot: the samples/minimal directory viewed in the Finder]

You'll notice that there are two 'minimal' applications shown. One is for the Classic environment (up to OS 9) and the other for OS-X. In practice, you'd distribute these in different directories, for the convenience of your users.

Who is this Sejanus character anyway?

I'm a British freelance Analyst Programmer who has spent the last 25 years working on everything from microcontrollers to mainframes. I use a wide variety of languages at work but try to stick to C and Perl for my own projects.