α ↔ ω : The Process
Now that you've heard my long diatribe on why I need to take tens of thousands of frames of growing plants (totaling hundreds — or possibly thousands — of gigabytes), I'll explain my process for managing and processing the images (the technical side of things), in case anyone ever wants to do something similar...
I've got two cameras for this project: a D80 and a Coolpix 8700. They're both way beyond the point where quality is a problem when compressing down to my destination format (HD@720p, or @1080p if Nate gets his way). The D80 has an antique 55mm Micro-Nikkor attached (along with a PK-13 extension tube to get to 1:1 if I need it); the 8700 has a reasonable macro feature on its built-in lens.
Both cameras are set on full-manual — shutter speed, aperture, white balance, focus, everything. The D80 is outputting RAWs; the 8700 is only outputting JPEGs (for reasons I'll explain in a minute) — this means that a shot from the D80 is about 8MB, and a shot from the 8700 is about 1MB.
Each camera takes a picture once a minute. To achieve this, the D80 (which has no built-in intervalometer, and I have yet to finish building my own) is tethered (via a nice, standard USB-A-to-USB-mini-B cable) to my EeePC running Ubuntu. The Eee is connected to an external 160GB USB "buffer" hard drive.
The general flow of photos from the D80 to Final Cut goes something like this:
- a) gphoto2 instructs the camera to take a picture (every 60 seconds).
gphoto2 --capture-image-and-download --filename "%Y%m%d%H%M%S.nef" -I 60
- b) The photo is taken, saved to the SD card, then immediately copied to the 160GB hard drive (by gphoto2) and deleted from the camera.
- c) Cron runs a script every hour that rsyncs the photos to a large (2TB) pool of storage on Jayne. I manually check occasionally to make sure that the 160GB drive isn't running out of space, and remove the already-copied pictures when it is; I've left this manual simply as an extra layer of backup in case something goes wrong.
rsync -a /home/hortont/timelapse jayne.hortont.com:Backups/
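For reference, the crontab entry driving step (c) would look something like this (a sketch only: the real job calls a script, so invoking rsync directly here is a stand-in):

```shell
# Sketch of the hourly cron entry. The actual job runs a wrapper script;
# calling rsync directly is a simplification. Runs at the top of every hour.
0 * * * * rsync -a /home/hortont/timelapse jayne.hortont.com:Backups/
```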
- d) Every few hours (or more often, if I'm bored), I use Adobe Camera Raw to render the most recent NEFs (still on Jayne, over the network) into a bunch of JPEGs on Kaylee's disk, fixing saturation and contrast problems on the way, as well as setting a reasonable white balance.
- e) I then use Quicktime 7 (damnit, Apple, fix Quicktime X!) to render the image sequence to 720p H.264 — this is just for observation, not for editing. I watch the video a few times, show it to people, etc.
- f) If I'm happy with what I've got and am done with the current plant/angle/lighting/whatever, I go back to Quicktime 7 (after having rendered all of the frames with Camera Raw), and export to Apple Intermediate at the resolution of the camera frames. I keep the full resolution so that I have lots of pixels to play with in Final Cut, for things like SmoothCam, and so that I can choose the crop later. This results in gigantic video files, but they're still easily less than half the size of the JPEGs (and ~1/10th the size of the RAWs).
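If you ever did want to automate the free-space check from step (c), a small script along these lines would do it (the mount point and 90% threshold are made up, and as noted above, keeping the pruning manual is a deliberate extra safety layer):

```shell
#!/bin/sh
# Hypothetical helper (not part of the actual setup): warn when the
# 160GB buffer drive is getting full. Mount point and threshold are
# assumptions; adjust to taste.
BUFFER="${1:-/media/buffer}"
THRESHOLD=90

# df -P gives stable POSIX output; the fifth column is "Use%".
# Strip the % sign so we can compare it as a number.
USED=$(df -P "$BUFFER" | awk 'NR==2 {gsub(/%/, "", $5); print $5}')

if [ "$USED" -ge "$THRESHOLD" ]; then
    echo "Buffer drive at ${USED}%: time to prune already-synced photos" >&2
fi
```

You'd still want to delete only photos that have already been rsynced to Jayne, which is exactly why the manual check is the safer default.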