The Zaurus has a Linksys WCF11 CF card, and the desktop has a D-Link DWL-122 (usb) interface.

I was able to install the prism2_usb driver from here, but I wasn't able to get it to act as the host AP for other devices to find.

A DI-524 wireless router works, though. I disabled its DHCP and set my wireless devices (including zaurus) to fixed addresses.

Powered by Zope

Microphone power supply


I've got a lavalier microphone that needs external power which I want to use with my video camera. The goal is to accept balanced XLR in, and output unbalanced both-channels-the-same stereo audio. Also, if the mini plug to the camera is unplugged, the closure of the switch inside the jack should turn the power supply off.

Here's my lousy plan: a 9V battery puts 9V on each of the +sig and -sig incoming signal lines. A transformer makes a new signal out of the difference between +sig and -sig (which should be twice the signal, with common-mode noise cancelled out).

The transformer specs are inset at the bottom-right. I have probably really screwed up the impedances for the mic (200 ohms) and the camera (don't know). Please send suggestions if you know better.
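If the transformer's primary and secondary impedances are known, the matching arithmetic is simple: the impedance ratio equals the turns ratio squared. A quick sketch of that check (the 10k camera input impedance is a placeholder guess; only the 200-ohm mic figure comes from above):

```python
import math

def turns_ratio(z_primary, z_secondary):
    """Turns ratio n = Ns/Np that matches Z_primary to Z_secondary,
    since impedance transforms as Z_secondary = n**2 * Z_primary."""
    return math.sqrt(z_secondary / z_primary)

# 200-ohm mic (from the text) into a 10k camera input (a guess):
print(round(turns_ratio(200, 10_000), 2))  # about 7.07, a 1:7 step-up
```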

The parts, laid in the final case.

A prototype version of the project.

The finished circuit board and connected components.

Back of the case, showing the belt clip.

Front of the case, showing the output jack.


Sleeping Dog short film

Sleeping Dog

For the Robotmedia film festival in Berkeley, I produced a short video with my girlfriend Kelsi and dog Micky. This is not Micky's first performance (but this one might have been).

Download the movie:

  • small: 5.1M MPEG-2 (slow local copy)
  • large: 18M DivX4 (slow local copy)


Kelsi and I wrote the story over lunch on 2004/1/29. We shot that night from 19:29-21:58 and got 36min of footage. It's mostly shots of the dog resting. We didn't break any dishes while shooting, but I shattered the vase while I was packing up.

A theme of this festival was 'silent'. The entrants make silent movies and a live band accompanies them at the show. The point of our movie was to film a story that relied completely upon sound effects, and then to present all the sounds in the picture.


I cut the show in several hours with cuisine, an open-source editor. I delivered the picture early the next week, so the band could see it and prepare their music. I produced the title effects over two days on the week of the show.

Screenshot of cuisine's timeline interface editing this show (before I did the audio mix using the show audio)


The titles are generated by a Python program that I wrote for this show. It's about 160 lines of code. The program generates all the moves from equations in the code; any 'keyframing' came from trial-and-error adjustment of expressions like "5*sin(id*4)+(25*f)*sin(30)". I could see the graphics in real time, and see the composited show pretty quickly too, so I went through hundreds of iterations. The program has a mode where it can render larger than the output frame and then scale down, which is how I antialiased some of the effects. Downloads: the program that writes DV files for all 11 title effects, and a module for starting up pygame and optionally resizing and sending the display to encodedv (part of the libdv project).
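A minimal sketch of that equation-driven approach, with an illustrative expression of the same flavor (this is not the original program, and `title_x` is a made-up name):

```python
import math

def title_x(id, f):
    """Horizontal offset of title element `id` at frame `f`; an
    illustrative expression in the same style as the one quoted
    above (angles are radians, as Python's math.sin expects)."""
    return 5 * math.sin(id * 4) + (25 * f) * math.sin(30)

# No keyframes: one evaluation of the expression per frame.
positions = [title_x(id=2, f=f) for f in range(5)]
print([round(p, 2) for p in positions])
```

Tweaking the constants and re-running is the "trial and error adjustment" the text describes.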

The fonts are all from Larabie Fonts. I didn't seem to have a program around that would preview a bunch of ttf files that weren't set up as X11 fonts, so I wrote one. The result is less than twice as long as this paragraph, thanks to pygame and SDL.


As mentioned above, I worked on the movie in silence. After I set the timing, the band Legends & Deeds prepared the music and performed it live at the show. That was the first time I ever heard the soundtrack, and I think it's super. In the movie files above, you hear a recording right from the band's mixer during the live performance. At the end, I crossfade to a room mic to hear the audience.

Software used, beyond what's commonly distributed with Linux

  • dvgrab - move DV from camcorder to disk
  • libdv - DV decoding and encoding
  • cuisine - browse and edit footage, composite titles
  • ffmpeg - write VBR mpeg movies from DV
  • transcode - write divx movies from DV
I used mozilla to display my captured footage (screenshot), and the dv1394 kernel driver to output DV back to tape.



summermovies intro for Robotmedia

Fish

I built a simple camera arm to get the slow camera move in the opening shot. It pivots by resting on the tripod's screw that would normally stick into a camera. The result was smooth enough that I could track the fish into the tank in about 10 keys over 500 frames. Finally, there's a bug in the conversion from the simulation rotations to the renderer's system, so the resulting fish rotation has problems (maybe gimbal lock, or maybe just a math error). The simulator looks great; the output rotation has lots of flips.

Brawl

I tried to make a shot where the crowd of people got faster and faster.

Shot notes: blue screen at Sierra (too bad the ground isn't blue); the crowd is Carson High; bulge in POV; crack in pure POV-Ray; background blur and light shafts by Ryan in AE; the arch video is old RM; couldn't render the roto by showtime; a Python program to make the screen.


image straightener

The object is to auto-straighten an image. I have written some test code that tries to figure out the dominant angles in an image. A dominant angle near horizontal or vertical tells us that we should rotate the image by the right amount to make that angle *be* horiz/vert.

There's no production code yet. I think what's needed right now is a more precise way of estimating the dominant angle.

Get the code with anonymous CVS:

cvs -z3 -d co straighten

ViewCVS is available

Some results. The graph is some popularity measure vs angle. The peak in the middle is near 90, but isn't exactly 90. I'd like to know, to perhaps .1 degree, what that peak angle is.
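One standard way to pin down a histogram peak to sub-bin (here, sub-degree) precision is to fit a parabola through the maximum bin and its two neighbors and take the vertex. A sketch of that idea, not the project's code:

```python
def refine_peak(hist, bin_width=1.0, origin=0.0):
    """Estimate the peak of an angle-popularity histogram to sub-bin
    precision with a parabolic fit through the max bin and its two
    neighbors. Edge bins are skipped for simplicity."""
    i = max(range(1, len(hist) - 1), key=lambda k: hist[k])
    y0, y1, y2 = hist[i - 1], hist[i], hist[i + 1]
    # Vertex offset of the parabola through (-1,y0), (0,y1), (1,y2).
    denom = y0 - 2 * y1 + y2
    offset = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
    return origin + (i + offset) * bin_width

# Synthetic peak near (but not exactly at) 90 degrees, with 1-degree
# bins starting at 87:
hist = [1, 2, 8, 10, 9, 2, 1]
print(round(refine_peak(hist, bin_width=1.0, origin=87.0), 2))
```

With real gradient-orientation histograms the same trick gets well below a tenth of a degree, provided the bins near the peak are smooth.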

A search based on the Hough transform may also be useful. I haven't looked into how other auto-straighteners work; I suspect they just use the same derivative method that I'm using.

Business card

You're probably holding my business card right now. The writing in the star is a Perl program that outputs the entire card in PostScript, including the star of code. Click the image above for a bigger, almost legible version, or get the PostScript version.

The condensed code is here. This is the version that you see on the card. It's about 1945 characters long. I did not attempt to make the code as short as possible. Whenever it was getting too long for the shape, I'd adjust the shape parameters and compress the code a little more.

The verbose code is here. This version first strips itself to become the condensed version, and then outputs the PostScript. The condensed and verbose versions output the same PostScript. The verbose version is about 4241 characters (152 lines) long.
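The self-reproducing trick at the heart of the card can be shown in miniature with the classic Python quine (the card itself is Perl, and also emits PostScript):

```python
# A minimal quine: the string is a template for the whole program, and
# %r re-quotes the string when it is substituted into itself.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Run on its own, those two code lines print themselves verbatim.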

You might be interested in these pages about quines:

Now go look at my home page!


Using the parallel port with Linux

Using the parallel port for output with Linux

by Drew Perttula
Here's a tiny program for Linux that turns the parallel port's data lines on and off. parcon.c

You say:

> parcon 1h 2h 3l

old = 00000000
new = 00000011

to set the first and second lines high and the third low.
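The bit manipulation behind that output can be sketched in Python; this is my reconstruction of the semantics, not parcon.c itself:

```python
def apply_args(old, args):
    """Return the new data byte after applying args like '1h' / '3l'.
    Line numbers are 1-based: line N is data bit N-1 (DB-25 pin N+1)."""
    new = old
    for a in args:
        bit = 1 << (int(a[:-1]) - 1)
        if a[-1] == 'h':
            new |= bit     # drive the line high
        else:
            new &= ~bit    # drive the line low
    return new

old = 0b00000000
new = apply_args(old, ['1h', '2h', '3l'])
print('old =', format(old, '08b'))  # old = 00000000
print('new =', format(new, '08b'))  # new = 00000011
```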

Here's an executable for Intel Linux systems. The program needs to be run as root. Either make yourself root before running it, or (as root)

chown root parcon
chmod u+s parcon
This turns on the setuid bit for the program, so it gets root privileges even when other users run it.

Here's a simple interface to parcon, written in Tcl/tk. You can click on the numbers or type them on the keyboard to toggle the states of the output lines. tkparcon

Want to use the parallel port from Python? You might like to check out some SWIG modules I used on a project. Check out the whole project with:

cvs -z3 -d co light9
The file light9/light8/parport.i is a SWIG wrapper for parport.c which I used on a recent project. It's messy and specific to the particular hardware I was controlling, but it contains at least what you'd need. There's a makefile too. (As well as an entire functioning theater lighting control system :)


The button in my car

The button, outside of the car.

The button looked nice in the car, but then someone broke my window and snapped the button off. I haven't glued it back yet.


Compositing reel

Drew Perttula

October 2001 Demo Reel Notes

A DivX demo reel (26MB) is available, though you'll have to ask me for a link to the current location, since it has moved around.

My tools are After Effects and Photoshop by Adobe; Rayz by Silicon Grail; Blender by NaN; POV-Ray, Gimp, and various scripting languages.

God with flamethrower

The interactive effects are an animated color-correction and a noise distortion field near the center of the image-- both Rayz effects.

Original particle system written in Perl. 50 particles emit from a moving source position and direction. Particles accelerate upward and are recycled after they reach a certain distance. Rendered as spheres with POV-Ray.

Final POV-Ray render using a halo effect with turbulence inside each sphere. Additional glow added later with Rayz.
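The emit-and-recycle loop described above is easy to sketch. This Python version (the original was Perl) uses illustrative constants for the acceleration and recycle distance:

```python
import random

NUM = 50        # particles alive at once, per the text
ACCEL = -0.3    # upward acceleration (negative y = up; illustrative)
MAX_DIST = 100  # recycle a particle once it gets this far out

def emit(source, direction):
    """New particle at the emitter with a slightly randomized velocity."""
    jitter = random.uniform(-0.5, 0.5)
    return {'pos': list(source),
            'vel': [direction[0] + jitter, direction[1]]}

def step(particles, source, direction):
    """Advance every particle one frame, recycling the far-away ones."""
    for p in particles:
        p['vel'][1] += ACCEL
        p['pos'][0] += p['vel'][0]
        p['pos'][1] += p['vel'][1]
        dx = p['pos'][0] - source[0]
        dy = p['pos'][1] - source[1]
        if dx * dx + dy * dy > MAX_DIST ** 2:
            p.update(emit(source, direction))  # reuse, never destroy

particles = [emit((0, 0), (1, 0)) for _ in range(NUM)]
for frame in range(20):
    step(particles, (0, 0), (1, 0))
print(len(particles))  # the population stays at 50
```

Each particle's position would then drive one POV-Ray sphere per frame.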

Church explosion

Pew model by Harlan Hile using Blender. I exported the Blender object to POV-Ray for rendering.

Pew animation written in POV-Ray's own scripting language. Pews and floor rendered in POV-Ray.

Z-buffer rendered as a separate pass with a gradient texture on all objects.

Explosion is a POV-Ray sphere with a turbulent halo interior. I animated the threshold between opaque orange and transparent to increase the volume of the explosion over time.

I used Rayz to mask the explosion to an animated depth using the pew element Z-buffer.

Traveling light pass from POV-Ray. Glow enhanced with Rayz.

Dust element pulled from video shot at a ranch in Martinez; stabilized with Rayz. I replicated the usable dust area and masked it with another animated depth matte.

To add authentic camera shake, I actually shook a real camera and tracked the result with Rayz. Then I applied the shake to the church shot (in the same slow motion as the rest of the shot).


I have two friends who I can convince to stand outside the Sierra Spring Water building in Emeryville in various outfits. I'm the one in black at the end.

Rayz contains Ultimatte, which I used on the 12 "cels" to create about 3000 frames of people.

A custom crowd placer Perl script reads a bitmap of person locations (shown here). The script drives the Gimp to scale and place the individual-person frames at the right locations. Scaling is simply a function of y-position. Each person instance has its own counter and framerate.

The placer script placed people on a large panorama, but for efficiency, it didn't include people very far outside of the current field of view. The actual panning and zooming was done with Rayz.
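The placer's per-person bookkeeping might look like the following; the function names and constants are my guesses (only the 12 cels and the scale-from-y rule come from the text):

```python
def scale_for_y(y, horizon=120, bottom=480):
    """Scaling is simply a function of y-position: tiny at the
    horizon, full size at the bottom of the frame (linear ramp;
    the horizon/bottom numbers are invented)."""
    t = (y - horizon) / (bottom - horizon)
    return max(0.05, min(1.0, t))

def cel_for_frame(person, frame, num_cels=12):
    """Each person instance has its own start counter and framerate,
    so the crowd doesn't animate in lockstep."""
    return (person['start'] + int(frame * person['framerate'])) % num_cels

person = {'start': 7, 'framerate': 0.5}
print(scale_for_y(300))                  # 0.5 of full size
print(cel_for_frame(person, frame=10))   # index of the cel to paste
```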


Tower climb

Simple composite done with After Effects. The foreground element had to be tracked to undo a camera move.


Neptune - ship flyby

A multi-layer effect done with POV-Ray (for the 3D work) and After Effects (for the compositing). The dog was composited over the cockpit, whose front window was replaced with a moving starfield. That result was mapped onto a surface inside the ship's window.


Neptune - lever pull

After Effects combines a background miniature; a 3D lever created in POV-Ray; and a roto'd dog element. I adjusted the timing of the lever pull to match the dog's movement.


Neptune - ship hit

Some simple After Effects lightning and camera shake effects, plus a roto'd dog animated over the scene.


Real-time foreground over time-lapse scene

The background uses a custom video capture program I wrote that accumulates frames and saves a motion-blurred, time-lapse animation directly. The foreground (me) was difference-matted from its background using Rayz and placed over the time-lapse animation.
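The accumulate-and-save idea reduces to averaging every N captured frames into one output frame; a toy sketch over pixel lists (not the original capture program):

```python
def timelapse(frames, accumulate=4):
    """Average every `accumulate` consecutive frames into one output
    frame: temporal downsampling and motion blur in a single step."""
    out = []
    for i in range(0, len(frames) - accumulate + 1, accumulate):
        group = frames[i:i + accumulate]
        out.append([sum(px) // accumulate for px in zip(*group)])
    return out

# Four 3-pixel input frames become one blurred output frame.
frames = [[0, 0, 0], [40, 0, 0], [80, 0, 0], [120, 0, 0]]
print(timelapse(frames))  # [[60, 0, 0]]
```

Doing the averaging at capture time, as the text describes, means the full-rate footage never has to hit the disk.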
