Configuring and testing the app for various motions

Currently, our main function initializes the LazyEyes object with the default parameters. By filling in the same parameter values explicitly, we would have this statement:

  lazyEyes = LazyEyes(maxHistoryLength=360,
                      minHz=5.0/6.0, maxHz=1.0,
                      amplification=32.0,
                      numPyramidLevels=2,
                      useLaplacianPyramid=True,
                      useGrayOverlay=True,
                      numFFTThreads=4,
                      numIFFTThreads=4,
                      imageSize=(480, 360))

This recipe calls for a capture resolution of 480 x 360 and a signal processing resolution of 120 x 90 (as 2 pyramid levels downsample each dimension by a factor of 4). We are amplifying motion only at frequencies from 0.833 Hz to 1.0 Hz, only at edges (as we are using the Laplacian pyramid), only in grayscale, and only over a history of 360 frames (about 20 to 40 seconds, depending on the frame rate). Motion is exaggerated by a factor of 32. These settings are suitable for many subtle upper-body movements, such as a person's head swaying side to side, shoulders heaving while breathing, nostrils flaring, eyebrows rising and falling, and eyes scanning to and fro. For performance, the FFT and IFFT each use 4 threads.
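
To make these numbers concrete, here is a quick sanity check of the arithmetic. This standalone snippet is not part of LazyEyes; it just reproduces the processing resolution and the history's duration at a few plausible frame rates:

  # Each pyramid level halves each dimension, so 2 levels
  # downsample the 480 x 360 capture to 120 x 90.
  numPyramidLevels = 2
  downsampleFactor = 2 ** numPyramidLevels
  print 480 / downsampleFactor, 360 / downsampleFactor  # 120 90

  # 360 frames of history span 20 to 40 seconds at this app's
  # typical frame rates.
  maxHistoryLength = 360
  for fps in (9, 12, 18):
      print '%d FPS: %.0f seconds of history' % (
          fps, float(maxHistoryLength) / fps)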

Here is a screenshot of the app that runs with these default parameters. Moments before taking the screenshot, I smiled like a comic theater mask and then I recomposed my normal expression. Note that my eyebrows and moustache are visible in multiple positions, including their current, low positions and their previous, high positions. For the sake of capturing the motion amplification effect in a still image, this gesture is quite exaggerated. However, in a moving video, we can see the amplification of more subtle movements too.

[Screenshot: the default recipe amplifying an exaggerated smile, with the eyebrows and moustache visible in multiple positions]

Here is a less extreme example where my eyebrows appear very tall because I raised and lowered them:

[Screenshot: the default recipe amplifying raised and lowered eyebrows]

The parameters interact with each other in complex ways, so changing one of them often calls for adjustments to others. Experimenting with them one at a time, as we do in the rest of this section, is the best way to build an intuition for these relationships.

Let's try our hand at reconfiguring Lazy Eyes, starting with the numFFTThreads and numIFFTThreads parameters. We want to find the numbers of threads that maximize Lazy Eyes' frame rate on your machine. The more CPU cores you have, the more threads you can gainfully use, but experimentation is the best guide. To get a log of the frame rate in Lazy Eyes, uncomment the following line in the _applyEulerianVideoMagnification method:

    print 'FPS:', 1.0 / timePerFrame

Run LazyEyes.py from the command line. Once the history fills up, the history's average FPS will be printed to the command line on every frame. Wait until this average FPS value stabilizes; it might take a minute for the average to adjust to the cost of the FFT and IFFT functions, which begin running once the history is full. Take note of the FPS value, close the app, adjust the thread count parameters, and test again. Repeat until you feel that you have enough data to pick good numbers of threads for your hardware.
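
Alternatively, to narrow down candidate thread counts before touching the app, you can time the FFT and IFFT in isolation. The following standalone micro-benchmark is only a sketch, and it assumes that the PyFFTW library is installed; the array's shape simulates a full history at the default signal processing resolution:

  import timeit

  import numpy
  import pyfftw.interfaces.cache
  from pyfftw.interfaces.scipy_fftpack import fft, ifft

  pyfftw.interfaces.cache.enable()

  # Simulate a full history: 360 frames of 120 x 90 grayscale values.
  history = numpy.random.rand(360, 90, 120).astype(numpy.float32)

  for threads in (1, 2, 4, 8):
      seconds = timeit.timeit(
          lambda: ifft(fft(history, axis=0, threads=threads),
                       axis=0, threads=threads),
          number=5)
      print 'threads=%d: %.2f seconds for 5 runs' % (threads, seconds)

Bear in mind that this measures only the transforms. Capture, pyramid construction, and rendering also cost time, so confirm your choice with the app's own FPS log.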

Now, experiment with other parameters to see how they affect FPS. The numPyramidLevels, useGrayOverlay, and imageSize parameters should all have a large effect. For subjectively good visual results, try to maintain a frame rate above 10 FPS. A frame rate above 15 FPS is excellent. When you are satisfied that you understand the parameters' effects on frame rate, comment out the following line again because the print statement is itself an expensive operation that can reduce frame rate:

    #print 'FPS:', 1.0 / timePerFrame

Let's try another recipe. Whereas our default recipe accentuates motion at edges that have high grayscale contrast, this next recipe accentuates motion in all areas (edge or non-edge) that have high contrast (color or grayscale). By considering 3 color channels instead of one grayscale channel, we are tripling the amount of data being processed by the FFT and IFFT. To offset this change, we will cut each dimension of the capture resolution to two thirds of its default value, thus reducing the amount of data to 2/3 * 2/3 = 4/9 times the default amount. As a net change, the FFT and IFFT process 3 * 4/9 = 4/3 times the default amount of data, a relatively small increase. The following initialization statement shows our new recipe's parameters:

  lazyEyes = LazyEyes(useLaplacianPyramid=False,
                      useGrayOverlay=False,
                      imageSize=(320, 240))

Note that we are still using the default values for most parameters. If you have found non-default values that work well for numFFTThreads and numIFFTThreads on your machine, enter them as well.
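
If you want to double-check the data-volume arithmetic, the following hypothetical helper can do it for any recipe. It is not a method of LazyEyes; it simply mirrors the roles of the imageSize, numPyramidLevels, and useGrayOverlay parameters to estimate how many values per frame enter the FFT:

  def valuesPerFrame(imageSize, numPyramidLevels=2,
                     useGrayOverlay=True):
      w, h = imageSize
      downsampleFactor = 2 ** numPyramidLevels
      numChannels = 1 if useGrayOverlay else 3
      return (w / downsampleFactor) * \
             (h / downsampleFactor) * numChannels

  print valuesPerFrame((480, 360))                        # 10800
  print valuesPerFrame((320, 240), useGrayOverlay=False)  # 14400
  # 14400 / 10800 = 4/3, matching the estimate above.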

Here are screenshots showing our new recipe's effect. This time, let's look at a non-extreme example first. I was typing on my laptop when this screenshot was taken. Note the haloes around my arms, which move a lot when I type, and a slight distortion and discoloration of my left cheek (the viewer's left in this mirrored image). My left cheek twitches a little when I think. Apparently, it is a tic: already known to my friends and family, but rediscovered by me with the help of computer vision!

[Screenshot: the color recipe, showing haloes around the arms and a distortion of the cheek]

If you are viewing the color version of this image in the e-book, you should see that the haloes around my arms take a green hue from my shirt and a red hue from the sofa. Similarly, the haloes on my cheek take a magenta hue from my skin and a brown hue from my hair.

Now, let's consider a more fanciful example. If we were Jedi instead of secret agents, we might wave a steel ruler in the air and pretend it was a lightsaber. While testing the theory that Lazy Eyes could make the ruler look like a real lightsaber, I took the following screenshot. The image shows two pairs of light and dark lines in the two places where I was waving the lightsaber ruler. One pair of lines passes through each of my shoulders. The Light Side (the light line) and the Dark Side (the dark line) mark opposite ends of the ruler's path as I waved it. The lines are especially clear in the color version in the e-book.

[Screenshot: the color recipe, showing light and dark lines along the waved ruler's path]

Finally, the moment for which we have all been waiting is … a recipe for amplifying a heartbeat! If you have a heart rate monitor, start by measuring your heart rate. Mine is approximately 87 bpm as I type these words and listen to inspiring ballads by the Canadian folk singer Stan Rogers. To convert bpm to Hz, divide the bpm value by 60 (the number of seconds per minute), giving (87 / 60) Hz = 1.45 Hz in my case. The most visible effect of a heartbeat is that a person's skin changes color, becoming more red or purple when blood is pumped through an area. Thus, let's modify our second recipe, which is able to amplify color motions in non-edge areas. Choosing a frequency range centered on 1.45 Hz, we have the following initializer:

  lazyEyes = LazyEyes(minHz=1.4, maxHz=1.5,
                      useLaplacianPyramid=False,
                      useGrayOverlay=False,
                      imageSize=(320, 240))

Customize minHz and maxHz based on your own heart rate. Also, specify numFFTThreads and numIFFTThreads if non-default values work best for you on your machine.
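
If you prefer to compute the band rather than hard-code it, a small hypothetical helper (again, not part of LazyEyes) can convert a heart rate in bpm to a narrow band of frequencies in Hz:

  def heartRateBand(bpm, halfBandwidthHz=0.05):
      # Convert beats per minute to a (minHz, maxHz) band
      # centered on the heart rate.
      centerHz = bpm / 60.0
      return (centerHz - halfBandwidthHz, centerHz + halfBandwidthHz)

  minHz, maxHz = heartRateBand(87)  # approximately (1.4, 1.5)
  lazyEyes = LazyEyes(minHz=minHz, maxHz=maxHz,
                      useLaplacianPyramid=False,
                      useGrayOverlay=False,
                      imageSize=(320, 240))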

Even amplified, a heartbeat is difficult to show in still images; it is much clearer in live video while the app runs. However, take a look at the following pair of screenshots. In the left-hand screenshot, my skin is more yellow (and lighter), while in the right-hand screenshot it is more purple (and darker). For comparison, note that there is no change in the cream-colored curtains in the background.

[Screenshots: the heartbeat recipe, showing the skin's color shifting between yellow and purple]

Three recipes are a good start—certainly enough to fill an episode of a cooking show on TV. Go observe some other motions in your environment, try to estimate their frequencies, and then configure Lazy Eyes to amplify them. How do they look with grayscale amplification versus color amplification? Edge (Laplacian) versus area (Gaussian)? Different history lengths, pyramid levels, and amplification multipliers?
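
As a starting point, here is one more hypothetical recipe, for breathing. A person at rest typically takes about 12 to 20 breaths per minute, which works out to roughly 0.2 to 0.33 Hz. One caveat applies to all such experiments: like any sampled signal, the video cannot represent frequencies above half the frame rate (the Nyquist limit), so at 15 FPS it is futile to set maxHz much above 7.5 Hz.

  # A hypothetical recipe for breathing movements. The frequency
  # band is a rough estimate; adjust it to the subject you observe.
  lazyEyes = LazyEyes(minHz=0.2, maxHz=0.33,
                      useLaplacianPyramid=True,
                      useGrayOverlay=True,
                      imageSize=(320, 240))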

Note

Check the book's support page, http://www.nummist.com/opencv, for additional recipes and feel free to share your own by mailing me at .