Jordan Brock

Slide

Apr 19, 2009

Slide from Jordan Brock on Vimeo.

Using git to sync configurations between two computers

Apr 13, 2009

I have two machines that I use for my development work: a 24” iMac and a MacBook Air (It is so choice. If you have the means, I highly recommend picking one up. Bonus points for quote identification). Naturally, doing any work spread over two machines leads to all sorts of problems: inconsistencies in development platform, files, data, code and just the general configuration of the machine. Thankfully, there exist a number of solutions to these problems.

Dropbox

Dropbox is simply file synchronisation done right. You install it, drag the files you want synced between two computers into the Dropbox folder, and BOOM, it copies everything up to their server, and when you set it up on the second computer, copies everything back down again. After the initial sync, which can take quite a while if you’ve got lots of stuff, it’s pretty much seamless. Create, or make a change to, a file on one computer, and before you know it, it’s available on the other computer.

Need access to those files from another computer, or even from your phone? No problem, they have a web interface that gives you full access to all of your “synced” files. Simple.

To add a lovely cherry to the ice cream sundae that is Dropbox, they also have versioning. That means if you make a series of dunderhead changes to a document, you can just roll back to the last good copy. Awesome.

Git and GitHub

For the longest time (mostly while I was still working on Windows) “Source Control” was some mystical mumbo jumbo that only people in large companies needed to worry about. Of course, that was mainly because at the time the options for Visual Studio only included Microsoft Visual SourceSafe, which is perhaps some of the crappiest, most unusable software ever foisted on the unsuspecting public by a company with a long history of foisting crap on the unsuspecting public.

Once I switched to Mac OS X and started working with Rails, it became clear that not all source control systems were created equal. Subversion seemed to be the choice of the community, so I loaded everything up onto my own SVN server. A world was opened up to me, and I could code with wild abandon, safe in the knowledge that I could roll back with no danger. Of course, that was the ideal world, but the basic approach worked. I created branches, tagged releases and deployed sites from SVN.

As with most things, people started grumbling about SVN and some of its shortcomings: difficulties in merging different code branches, no distributed repositories and some general discontent. Git was the solution, and the Rails community just jumped ship in the middle of the night. Before you knew it, everyone was using Git, and also GitHub. I won’t go into the specifics of why this combo is so good, except to say that if you’re not using git, well, good day to you sir.

git and config_files

In addition to using git to store your development work, you can also use it to manage system configuration files. I found the wonderful config_files repository. It’s a collection of shell, git, vim and irb configuration files. So? How does that help?

A great feature of git, and github in particular, is that you can “fork” a repository. You effectively copy someone else’s repository, and you can then make all the changes to it you want, without affecting the original. If you want to let other people know about your changes, you can send the original developer a “pull request”, and they can grab your changes, and incorporate them, or not.

So, fork the config_files repository, and then clone it to your own system. Run the “install.sh” script, which basically just sets up a series of symlinks in your user directory. These symlinks point to the files in the git repository. You make any changes you want to the repository, commit them, and push them up to GitHub. Then do the same on your other computer, and bingo, you’re synchronising your configuration between machines.
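In practice the whole round trip looks something like this (a sketch, with “yourname” standing in for your GitHub username):

# clone your fork ("yourname" is a placeholder for your GitHub username)
git clone git@github.com:yourname/config_files.git
cd config_files
sh install.sh                    # sets up the symlinks in your home directory

# ...tweak .vimrc, .bashrc and friends, then:
git commit -a -m "my local tweaks"
git push origin master

# on the other machine, after the same clone and install.sh:
git pull origin master           # and your changes appear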

And why would you want config_files? Well, you get some great all-round system setup tidbits, but my favourite bit is how it prints the current git branch as part of your prompt, saving you from having to work out what you’re in the middle of doing.
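The idea, boiled down, is just a shell function in your prompt (a minimal sketch, not config_files’ actual implementation):

# print the current branch, or nothing if we're not in a git repository
parse_git_branch() {
  git branch 2>/dev/null | sed -n 's/^\* //p'
}
PS1='\w $(parse_git_branch)\$ '

The single quotes around PS1 matter: they make bash re-run the function every time the prompt is drawn.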

Tasty.

Online backups: How much should you back up?

Mar 03, 2009

Over the past few months, I have been slowly putting together a backup system that uses both local and online storage systems to provide a level of security and peace of mind.

Backup Overview

Using a combination of SuperDuper, a couple of external backup HDDs, Dropbox, Jungle Disk and Amazon S3, I have built up what I think is a relatively comprehensive and reliable backup system.

Local Backups

SuperDuper is a wonderful program that creates bootable mirrors of a hard drive. Effectively what this means is that if a hard drive fails, you can just replace it with the backup copy. You can schedule the backups to run as often as you want, which ensures that your backup copy is fresh and usable in case of a disaster.

And what am I backing up? I have two main drives that I use: The system disk (the built in drive in my iMac which contains all of my work files, personal documents, applications and the Operating System), and an external FireWire800 drive that stores photographs, videos, movies and TV shows.

Amazon S3

Amazon S3 is basically an online storage system that can be accessed via an API to upload, manage and retrieve data. Amazon charges for both the uploading of data (US$0.10 per Gb), and then for the actual storage of that data. The uploading charge is a one-off cost, and the storage costs are charged monthly (US$0.15 per Gb per month).
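Just to make “accessed via an API” concrete, here’s what pushing a file up looks like with a command-line client such as s3cmd (purely illustrative: the bucket name is made up, and Jungle Disk, below, hides all of this from you):

s3cmd mb s3://my-backup-bucket            # create a bucket (name is hypothetical)
s3cmd put family-photos.tar s3://my-backup-bucket/
s3cmd ls s3://my-backup-bucket/           # check the file arrived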

Jungle Disk

One downfall of S3 is that it’s not actually set up to be used without additional software to manage the mechanics of the backups. Step up Jungle Disk. It’s pretty simple software: you give it your S3 account details, tell it what to back up and when to back it up, and it will go ahead and take care of it for you. Simple. And then you get your monthly bill from S3. Nothing else to do.

One big downside of the whole online backup setup is the time it takes to actually back up any large amount of data, and that’s a limitation of my internet connection more than anything else. When you’re uploading 100+Gb, be prepared for a bit of a wait :)

The cost of Amazon S3

While the monthly cost of the Amazon S3 service makes it a perfect online backup solution for data that you would class as priceless (photos, videos of the kids, work files, personal documents), for stuff that can be easily replaced at a relatively low cost, such as iTunes TV shows and movies, there soon comes a point where online storage isn’t actually economically feasible.

iTunes TV Shows

The average 1 hour (42 minutes of network TV) HD episode of a TV show on iTunes is about 1.4Gb. In addition to this, you also get an iPod/iPhone compatible SD version which is generally 600Mb. So, a single TV show is effectively 2Gb of data that needs to be backed up.

What is the cost of backing these files up? Well, there’s a US$0.20 charge for uploading them to Amazon S3 initially (2Gb at US$0.10 per Gb), and then a US$0.30 charge per month (2Gb at US$0.15 per Gb). At that rate, it only takes 10 months of storing the data online until you’ve spent roughly the US$2.99 purchase price of an HD episode again; at that point you’ve effectively paid for the file twice. This means that if you lost the original file 12 months after you first bought it, you’d actually be better off buying the file again.

This effectively renders online backups for iTunes TV Shows pointless, considering how often you’re actually likely to watch a TV show, and also how cheap the cost of having a local HDD mirror is.

iTunes Movies

Currently iTunes Movies are only available in SD (boo to the movie studios and Apple for this one) so the files aren’t as large as they could be, but they’re still pretty sizable, weighing in at 1.67Gb for the recent “The Dark Knight”, which cost US$14.99. And how long until it’s not feasible to store this on Amazon S3?

At the US$14.99 purchase price, it would take 4.99 years until you’ve paid twice (1.67Gb costs about US$0.25 a month to store). However, Apple drops the price of new release films to $9.99 after about 4-6 months, so the replacement cost is greatly reduced. At this price the cut-off becomes 3.33 years. Obviously this timeframe requires a judgement call as to whether or not it’s worth it. Personally, I’d rather just trust my local HDD backups.

iTunes/Amazon Music

Music files are obviously considerably smaller than video files, and as such are going to incur a greatly reduced monthly fee for storage. Your average iTunes album costs approximately US$9.99 and is generally around 110Mb. This small size means that it will actually take about 49.95 years until it’s been paid for twice, by which point you’ll either be A) dead, or B) listening to music on your personal bone implant that plays whatever music you want that’s being broadcast by SkyNet.
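All of these figures fall out of the same back-of-the-envelope sum: months until storage has cost the purchase price = price ÷ (size in Gb × US$0.15). As a quick shell sketch (the US$2.99 HD episode price is my assumption; the other numbers are from above):

# months of S3 storage (US$0.15 per Gb per month) until you've spent
# the purchase price all over again
breakeven () {
  echo "scale=1; $1 / ($2 * 0.15)" | bc
}

breakeven 2.99 2       # HD TV episode, 2Gb:      ~10 months
breakeven 14.99 1.67   # The Dark Knight, 1.67Gb: ~59.8 months (4.99 years)
breakeven 9.99 0.11    # average album, 110Mb:    ~605 months (~50 years)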

So, music is one area where it’s probably economical to maintain an online backup of your files, particularly considering how annoying it would be to go and re-purchase the 800+ albums you’ve got in your library in the first place.

Other possible solutions and problems

Whenever you’re talking about local HDD backups, it’s always worth considering a Drobo, which is a redundant array of HDDs that theoretically keeps your data happy and safe. I don’t have a Drobo, but I know that users who have one swear by them.

The one problem with local backups is that, of course, they are susceptible to local threats, e.g. fire and theft. There’s no point in having duplicate hard drives that slavishly mirror each other if some reprobate comes along and pilfers them both. Which means you need a third mirror that you store off-site. Which in turn means you need a fourth drive that you store off-site in rotation with the third one. An endless cycle.

Singapore - Little India

Jan 10, 2009

Thanks to the generosity of my rocking parents, we recently spent 10 days in Singapore, staying at The Sentosa Resort on Sentosa Island. While the idea of staying in Singapore for 10 days might not seem to be particularly relaxing, Sentosa is quite removed from all of the hustle of Singapore, and could be a tropical resort island anywhere. If you wanted to, you could quite happily spend all of your time on Sentosa, and not venture into the city at all.

However, we did spend a bit of time in Singapore, and one of the best places we went was to Little India. There’s a large Indian population in Singapore, and they seem to be largely centered around the Little India area.

One of the things you notice about Singapore is the massive amount of construction that’s been going on, and the fact that almost none of the “old Singapore” remains, having been bulldozed and rebuilt with modern buildings. Little India, by contrast, is almost in original condition. It doesn’t have any of the sanitised vibe that exists elsewhere in Singapore. As a result, it’s certainly the most vibrant and lively area that we went to.

Another great thing about Little India is the food. Eating out in Singapore can be quite expensive if you head to the “western” restaurants, but if you want to eat some of the “local” foods, then it’s quite remarkable how cheaply you can eat. We ate (twice, it turns out) in a wonderful tandoori restaurant, stunned at how little everything cost. A chicken tikka, aloo methi, naan, raita, rice and samosas for $30. Gold.

Little India from Jordan Brock on Vimeo.

And here are some photos I took that are up on Flickr.


2nd Gen Accelerators, Rails and attachment_fu

Oct 17, 2008

File this one under WTF.

I’ve been working on an updated version of the Soil Quality website for a little while now, and recently needed to deploy the site to a staging server for testing. I ordered up a 1/4Gb Accelerator from Joyent, configured it, and deployed the app, just like I’ve done with 20+ other sites, and BOOM, straight into a brick wall.

I opened up the log files and saw this error:

** Daemonized, any open files are closed.  Look at /tmp/soilquality-mongrel.8200.pid and log/mongrel.8200.log for info.
** Starting Mongrel listening at 127.0.0.1:8200
** Define INLINEDIR or HOME in your environment and try again

Wonderfully descriptive, I know, but something I’d never run into before. So, fire up the Google and it turns out it’s a common error, with a common fix: just put

ENV['INLINEDIR'] = RAILS_ROOT + "/tmp"

into your config/environment.rb file, make sure the directory exists and that it’s writable by the mongrel, restart, and away you go.
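The supporting steps look something like this (the path and the user name are illustrative; adjust for your own deployment):

cd /path/to/your/app    # wherever RAILS_ROOT lives on the server
mkdir -p tmp            # make sure the directory actually exists
chown mongrel tmp       # writable by whichever user mongrel runs as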

But of course, that didn’t fix it, did it.

Much hair pulling ensued. I finally enlisted the help of Darcy Laycock and together we managed to track the problem down to the attachment_fu plugin, which was causing the problem as the mongrel process booted. OK, so now we knew where the problem lay, but what was causing it?

It turned out to be the ImageScience image processor, or more specifically the way the attachment_fu plugin and ImageScience work together ON A 2ND GENERATION JOYENT ACCELERATOR, i.e. the ones that use pkgsrc. It didn’t seem to cause the problem on an Ubuntu machine, nor on one of the older Blastwave-based Accelerators. I’m not sure why yet, as I was more worried about getting some sleep last night when we managed to fix the problem.

And the fix? Basically, strip ImageScience out of attachment_fu. Remove

vendor/plugins/attachment_fu/lib/technoweenie/attachment_fu/processors/image_science_processor.rb

and remove “ImageScience” from this line

default_processors = %w(ImageScience Rmagick MiniMagick Gd2 CoreImage)

in this file

vendor/plugins/attachment_fu/lib/technoweenie/attachment_fu/attachment_fu.rb
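If you’d rather script the change, something like this would do it (a sketch; GNU sed assumed for the in-place edit):

# remove the ImageScience processor entirely
rm vendor/plugins/attachment_fu/lib/technoweenie/attachment_fu/processors/image_science_processor.rb

# drop ImageScience from the default processor list, leaving:
#   default_processors = %w(Rmagick MiniMagick Gd2 CoreImage)
sed -i 's/%w(ImageScience /%w(/' \
  vendor/plugins/attachment_fu/lib/technoweenie/attachment_fu/attachment_fu.rb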

Like I said, I have no real idea why this is happening at the moment, but I’ll try and work it out and update this post if I get anywhere.

Once again, thanks to Darcy for A) being a sounding board, B) helping fix the problem and C) being up and available when I needed him :)