South African Ruby and Ruby on Rails job portal

Earlier this week, in collaboration with Shuntyard Technologies, I launched the rubysa.co.za domain. This is an idea I’ve had floating around for a while and I’m glad it has finally seen the light of day. The portal serves the local Ruby and Ruby on Rails community, and it is my hope that job seekers and businesses needing work done will use the site and help it grow into something useful.

If there are any missing features, just pop me a mail and we can look at adding them.

rubysa.co.za

PostgreSQL: Installing on windows using alternate service username

I came across this scenario the other day. I had to install PostgreSQL 9.0 on a Windows box that already had an old PostgreSQL install, and hence a local postgres account already set up. Using the same postgres account was not an option because a) I didn’t know the password, and b) if the password ever changed, nobody would update the PostgreSQL 9.0 service entry with the new password.

Fortunately there is an easy way to do this. Launch the one-click installer from the command line with the following arguments:

postgresql-9.0.01-1-windows.exe --serviceaccount username

This lets you complete the installation as usual, but the service account username will be the value passed. You can confirm this on the screen that asks you for the password. You can also complete the whole installation unattended; just launch the one-click installer with the --help argument for a full list of options.
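As an example, an unattended install with an alternate service account could look something like the line below. Treat this as a sketch: the exact flags (--mode, --servicepassword, --superpassword and so on) and their availability vary between installer versions, so confirm against the --help output first.

postgresql-9.0.01-1-windows.exe --mode unattended --serviceaccount username --servicepassword <service-password> --superpassword <postgres-password>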

Pluggable Excel processing engine

How nice would it be if you could design fancy-schmancy stuff in Excel and plug that into your SQL processing engine? Just imagine the possibilities: all those nasty algorithms you could come up with would be easily available to end users. Basically, you would design what you wanted using a sample dataset in Excel with the same column definitions as your real dataset. Somehow you could then drive your result set through the Excel engine and use the output. If you wanted to get fancy for larger datasets, you could implement a mechanism where data gets streamed in chunks just large enough to surface the required calcs in Excel. For example, if you are doing some time-based calc that requires a full week’s worth of data to operate, you could stream a year’s worth of data to the Excel engine in week-sized chunks. I bet this would beat the pants off most complicated native SQL-based solutions, because unless you’re well versed in SQL and have a decent model within your control, things start getting very slow very quickly for the average SQL developer.
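To make the idea a bit more concrete, here is a rough C# sketch of the chunked-streaming part. Everything in it (IExcelEngine, EvaluateChunk, Row) is hypothetical; it only illustrates feeding a time-ordered dataset through such an engine one week at a time.

using System;
using System.Collections.Generic;

// Hypothetical interface to an Excel-backed calculation engine; not a real API,
// it only stands in for "push rows through the workbook, read the calculated output".
public interface IExcelEngine
{
    IEnumerable<Row> EvaluateChunk(IEnumerable<Row> rows);
}

public class Row
{
    public DateTime Timestamp { get; set; }
    public decimal Value { get; set; }
}

public static class ChunkedRunner
{
    // Streams a large, time-ordered dataset through the engine one week at a time,
    // so the workbook only ever holds enough data for its weekly calculation.
    public static IEnumerable<Row> RunByWeek(IEnumerable<Row> orderedRows, IExcelEngine engine)
    {
        var chunk = new List<Row>();
        DateTime? weekStart = null;

        foreach (var row in orderedRows)
        {
            if (weekStart == null)
                weekStart = row.Timestamp.Date;

            // Once we move past the current seven-day window, evaluate it and start a new chunk.
            if (row.Timestamp >= weekStart.Value.AddDays(7))
            {
                foreach (var result in engine.EvaluateChunk(chunk))
                    yield return result;

                chunk.Clear();
                weekStart = row.Timestamp.Date;
            }

            chunk.Add(row);
        }

        // Flush the final, possibly partial, week.
        if (chunk.Count > 0)
            foreach (var result in engine.EvaluateChunk(chunk))
                yield return result;
    }
}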

Compressing a file using GZipStream

 string file = @"C:\windows\temp\file.txt";

using (FileStream inFile = File.OpenRead(file))
{
    using (FileStream outFile = File.Create(
        Path.GetFullPath(file) + ".gz"))
    {
        using (GZipStream Compress = new GZipStream(outFile,
                CompressionMode.Compress))
        {
            byte[] buffer = new byte[4096];
            int numRead;
            while ((numRead = inFile.Read(buffer, 0, buffer.Length)) != 0)
            {
                Compress.Write(buffer, 0, numRead);
            }
        }
    }
}
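Going the other way uses the same pattern with CompressionMode.Decompress. Here is an untested sketch that restores the original file from the .gz copy:

string gzFile = @"C:\windows\temp\file.txt.gz";

using (FileStream inFile = File.OpenRead(gzFile))
{
    // Wrap the compressed input in a GZipStream set to decompress.
    using (GZipStream decompress = new GZipStream(inFile, CompressionMode.Decompress))
    {
        // Strip the trailing ".gz" to get the output file name.
        using (FileStream outFile = File.Create(
            gzFile.Substring(0, gzFile.Length - 3)))
        {
            byte[] buffer = new byte[4096];
            int numRead;
            while ((numRead = decompress.Read(buffer, 0, buffer.Length)) != 0)
            {
                outFile.Write(buffer, 0, numRead);
            }
        }
    }
}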