meso·pixel    

Late May

Making the Kindle 3 more like a book

Despite being a remarkable little device, one thing that annoys me about the Kindle is how the screensaver comes on whenever I leave it for more than a minute. One can only look at a portrait of Emily Dickinson so many times, you know? As far as I understand, not much happens in the background on the Kindle while you're reading and the page is static, which is why it has such extraordinary battery life. And since my Kindle rarely leaves my desk, I don't ever need the screensaver to kick in. Without it, the Kindle is more like a book that I can put down and pick up right where I left off.

Since there are no settings to disable this, you can resort to a bit of hackery to turn off the Kindle 3's screensaver:

  • Go to the Home screen
  • Press the Enter key (it looks like a return key with a little arrow)
  • Type ';debugOn' (note the capital 'O' and the leading semicolon)
  • Type '~disableScreensaver' (note the capital 'S' and the leading tilde)
  • Type ';debugOff' (again, note the capital 'O' and the leading semicolon)
  • To re-enable the screensaver later, use '~resumeScreensaver' in place of '~disableScreensaver'

I can't speak as to whether this works on other devices, but it works like a charm for the Kindle 3.
Happy reading!

Edit: On further testing, this appears to reduce battery life quite a bit, so use at your own discretion :(

 

Mid May

Slow down and back that thing up

Losing data sucks Big Time. I’ve been bitten by the data-loss bug numerous times, both on my website (due to goofs/bad configuration) and on my personal computers (due to hardware failures). And now that I’m working on EC2, it’s becoming more and more apparent that having a good backup strategy is really important.

For example, Amazon’s S3 service has a reported durability of 99.999999999%[1], which means your chances of losing data are lower than your chances of winning the lottery. Awesome. However, EBS-backed EC2 instances like the one running this site now aren’t so lucky, with what Amazon describes as an annual EBS volume failure rate of up to 0.5%[2] and an EC2 uptime guarantee of at least 99.95%[3]. Not so great. (But still much better than the failure rates of consumer hardware.)

Since I never gave it much thought when building the original site, my previous strategy, or lack thereof, was embarrassingly poor – mainly consisting of me manually downloading the site and its database once a year. The worst part was that all changes were made directly to the live site, which meant that anyone browsing it would be affected every time I made a mistake or typo in PHP. All this despite me using best practices in my day-to-day work... for shame!

Right now, my strategy for backing up this new site is as follows, starting at the AWS layer:

  • EBS-backed AMIs (Amazon Machine Images) created on major server configuration changes – this ensures that we have a properly configured server instance (with slightly old data and MySQL database) that can be spooled up at any time
  • S3/local backup of the MySQL database, website, and server config files created on every change – this ensures that we have the most recent data, which can be restored onto an old AMI with a little bit of manual work (a rough sketch of this backup step follows the list)
  • Git repository of website and server config files – this allows me to pull down a copy of the website for local editing when I don’t have internet access, along with all the other benefits of having a version control system
  • Development/Production environments in git, and on the server – this ensures that I can properly test new features before pushing them out for people to see (which is done through another script). The production environment is a separate branch from the dev environment, which means that we can revert any faulty pushes if necessary
  • Occasionally running a full restore process by taking the last AMI, and pulling the latest website/server config to ensure that the backups are actually valid – your backups are only good if they can actually be restored!
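
For the curious, here’s a rough sketch of what that backup step could look like in Python – the bucket name, paths, and the use of boto3/mysqldump are placeholders for illustration rather than my actual setup:

    #!/usr/bin/env python3
    """Rough sketch of a MySQL + site backup to S3 (names and paths are placeholders)."""
    import datetime
    import subprocess
    import tarfile

    import boto3  # assumes AWS credentials are already configured on the instance

    BUCKET = "example-site-backups"              # placeholder bucket name
    SITE_ROOT = "/var/www/example-site"          # placeholder web root
    CONFIG_DIRS = ["/etc/nginx", "/etc/mysql"]   # placeholder server config locations

    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")

    # 1. Dump the database (mysqldump reads credentials from ~/.my.cnf here).
    dump_path = f"/tmp/db-{stamp}.sql"
    with open(dump_path, "w") as fh:
        subprocess.run(["mysqldump", "--all-databases"], stdout=fh, check=True)

    # 2. Bundle the site, server config files, and the dump into one archive.
    archive_path = f"/tmp/backup-{stamp}.tar.gz"
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(SITE_ROOT)
        for config_dir in CONFIG_DIRS:
            tar.add(config_dir)
        tar.add(dump_path)

    # 3. Ship the archive off to S3.
    boto3.client("s3").upload_file(archive_path, BUCKET, f"backups/backup-{stamp}.tar.gz")
    print(f"Uploaded {archive_path} to s3://{BUCKET}/backups/")

Something like this can then be run from cron or a git hook so that the backup actually happens on every change.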

The biggest gap in this strategy is that the data sitting in S3 itself is not backed up at all, so the question becomes how comfortable I am with S3’s 99.999999999% durability, or whether it is worth duplicating that data in another S3 region (the probability of loss then becomes (1 − p)², where p is the durability) or saving it locally. And to be honest, those are pretty good numbers to begin with, so I’m actually OK with leaving data on S3 as-is and performing semi-annual or annual backups. Otherwise, for the more frequent (and expected) failures of EC2 and EBS, I am fairly confident that I could restore the server in the event of a catastrophe. Now as for the personal files on my computers at home, well, that’s another strategy that I’m going to have to come up with soon! :)
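
To put rough numbers on that trade-off, here’s the back-of-the-envelope arithmetic (this assumes failures are independent, which is a simplification):

    # Back-of-the-envelope durability math, assuming independent failures.
    s3_durability = 0.99999999999    # eleven nines, per Amazon's stated figure
    ebs_annual_failure = 0.005       # up to 0.5% annual EBS volume failure rate

    single_copy_loss = 1 - s3_durability          # ~1e-11
    duplicated_loss = (1 - s3_durability) ** 2    # ~1e-22 with a second, independent copy

    print(f"Chance of losing a single S3 copy:    {single_copy_loss:.0e}")
    print(f"Chance of losing both S3 copies:      {duplicated_loss:.0e}")
    print(f"Chance an EBS volume fails this year: {ebs_annual_failure:.0e}")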

[1] Amazon S3 RRS
[2] Amazon EBS
[3] Amazon EC2 Service Level Agreement

 

Makeshift macro lens

I always wondered how people could justify buying an expensive macro lens for occasional shooting, but a recent reddit post made me realize that you can get the same kind of macro effect by inverting a normal camera lens! The way a normal lens works is that it focuses light from a scene through a series of concave and convex lenses onto a small area on the camera’s sensor. The curvature of the lenses varies depending on the focal length of the lens, which also explains why a 20mm lens is wider than a 50mm lens. When you invert the lens (put it on backwards), the opposite effect happens: light that comes in the back end of the lens (like in this picture) is now focused at the distance where the sensor would normally be (on my camera that’s about one focal length away).

A magnified view of an Alaska quarter from an inverted 20mm lens

Crunching a few numbers: the head of the bear is about 0.635cm across on the quarter, and in the picture it spans about 1582px of the full 4000px width at the large size. That’s about 39.55% of the frame, and since the Micro Four Thirds sensor is roughly 18mm x 13.5mm, this translates to about 7.12mm on the sensor, or a magnification of roughly 1:1.12!
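
Spelled out with the rough figures above:

    # Rough magnification math from the measurements above.
    bear_on_coin_mm = 6.35       # 0.635 cm measured on the quarter itself
    bear_in_frame_px = 1582      # measured width in the photo
    frame_width_px = 4000        # full image width out of the GF-1
    sensor_width_mm = 18.0       # approximate Micro Four Thirds sensor width

    fraction_of_frame = bear_in_frame_px / frame_width_px    # ~0.3955
    image_size_mm = fraction_of_frame * sensor_width_mm      # ~7.12 mm on the sensor
    magnification = image_size_mm / bear_on_coin_mm          # ~1.12x, i.e. roughly 1:1.12

    print(f"{fraction_of_frame:.2%} of the frame -> {image_size_mm:.2f} mm on the sensor, "
          f"magnification ~{magnification:.2f}x")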

This isn’t too shabby for a makeshift macro lens, but real macro lenses can get up to 1:2 or 1:1 (object:image ratio), which is what you pay so much for. Lastly, the 20mm lens that I have for the GF-1 only has electronic aperture control, so I believe the shallow depth of field is just due to the lens reverting to its maximum aperture (f/1.7) when unmounted.

Pretty neat eh?

Waterfall

Ah, the memories of MIDI

This is almost artistic, cruise ships from an aerial view

Auto-complete Bash history using arrow keys (probably the best Bash tip I know)
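
In case that link ever disappears: I’d guess the tip is readline’s history search, which (assuming that’s the one) you get by adding these two lines to ~/.inputrc so the up/down arrows complete from your history based on what you’ve already typed:

    "\e[A": history-search-backward
    "\e[B": history-search-forward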

Pong

Remember Big Shiny Tunes and Much Dance? Good times.

Worst office fear: Rolling over your own toes with your computer chair.

Don't say Disney won't go to great lengths to optimize their animatronics...

Like horse racing but for nerds and biologists, Genetic Cars.

Monterey 2013 (4)