Simulate long-exposure photography with OpenCV
Published January 13, 2013, by Mansour Moufid.
Last updated January 03, 2014.
Long-exposure photography is a technique for capturing dynamic scenes that produces a contrast between their static and moving elements. Those parts of the scene that were in motion appear blurred, creating a pleasing effect.
Below is a long-exposure shot I took recently of a stream. It is technically not a long-exposure photograph but a simulation: the image was generated from a video recording taken with an old iPod, which was then processed in software into a single image.
(Forgive the poor quality; I don’t own a good camera. Nonetheless, the image demonstrates the desired effect.)
In this post, I describe a simple method for generating images like the one above from video, and present an implementation as a Python script, which you can use to create your own simulated long-exposure photographs.
Theory
Let a photograph be a matrix of values representing measures of light in two dimensions, at some point in time:
$$ P(t) = \begin{bmatrix} \mathbf{p}_{1,1} & \cdots & \mathbf{p}_{1,n} \\ \vdots & \ddots & \vdots \\ \mathbf{p}_{m,1} & \cdots & \mathbf{p}_{m,n} \end{bmatrix} $$
These values $\mathbf{p}_{i,j}$ can be either scalar, as in the case of monochromatic photographs, or multidimensional (vectors), as in the case of colour photographs. It’s practical to think of these as pixels.
Now consider that a photograph captures a scene projected through the camera’s lens at a specific point in time; the matrix $P$ is just one in an infinite series. A long-exposure photograph is the accumulation of this series over a time window $\Delta$, determined by the exposure time. That is, the matrix $P^{(\Delta)}$ representing a long-exposure photograph is given by:
$$ P^{(\Delta)} = \int_0^\Delta P(t) \, dt $$
Now consider a series of photographs (or frames) taken in quick succession. If we treat the very short interval between frames as an approximation of the infinitesimal $dt$, then we can approximate the long-exposure matrix $P^{(\Delta)}$ by a finite sum. Thus, let our simulated long-exposure matrix be given by $P^{(\Delta)} = \sum_{i = 1}^n P_i \times d$, where $P_i = P(t_i)$ are discrete samples of the function $P$ above, and $n$ is the number of frames. The infinitesimal $dt$ becomes the constant $d$, given by the length of the exposure interval $\Delta$ divided by the number of frames. Thus,
$$ P^{(\Delta)} = \sum_{i = 1}^n P_i \frac{\Delta}{n} = \Delta \left[ \frac{1}{n} \sum_{i = 1}^n P_i \right] $$
The bracketed term on the far right is simply the average of the set of matrices $\left\{ P_i \right\}$. In other words, we can simulate a long-exposure photograph by averaging many photographs taken in relatively quick succession, such as the frames of a video. The concept is simple and easy to implement in software.
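As a quick numerical check of this approximation (a toy example, not part of Hipshot; the function name is made up), treat a single pixel as a scalar function of time and compare the finite sum against the exact integral:

```python
import math

def riemann_long_exposure(p, delta, n):
    """Approximate the exposure integral of p over [0, delta]
    by summing n frame samples, each weighted by d = delta / n."""
    d = delta / n
    return sum(p(i * d) for i in range(1, n + 1)) * d

# A toy pixel whose brightness varies in time as p(t) = sin(t);
# the exact integral over [0, pi] is 2.
approx = riemann_long_exposure(math.sin, math.pi, 1000)
```

With 1000 "frames," the sum agrees with the exact value to several decimal places, and the agreement improves as $n$ grows.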
Implementation
Hipshot is a Python script that converts a video file into a single image file simulating a long-exposure photograph, using the method described above.^{1}
Use it like so:
$ hipshot /path/to/foo.mp4
Or drag and drop the video file onto the script’s icon. Hipshot will create the image file in the same directory as the video file.
Note that this method requires the camera to remain perfectly still for the entire recording! It’s best to use a tripod, or at least to set the camera down while it records.
Hipshot uses the Python bindings of the OpenCV library.
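The whole pipeline fits in a few lines. Here is a minimal sketch, not Hipshot’s actual code (the file names and function names are hypothetical): frames are read with OpenCV’s `VideoCapture`, accumulated in floating point so that summing many 8-bit frames does not overflow, and the mean is written back out as an image.

```python
import sys

import numpy as np

def simulate_long_exposure(frames):
    """Average an iterable of equally-sized frames (NumPy arrays)
    into a single 8-bit image."""
    total = None
    n = 0
    for frame in frames:
        # Accumulate in float64 to avoid 8-bit overflow.
        frame = np.asarray(frame, dtype=np.float64)
        total = frame if total is None else total + frame
        n += 1
    if n == 0:
        raise ValueError("no frames to average")
    return np.clip(total / n, 0, 255).round().astype(np.uint8)

def read_frames(path):
    """Yield a video's frames as NumPy arrays via the OpenCV bindings."""
    import cv2  # provided by the OpenCV Python bindings
    capture = cv2.VideoCapture(path)
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            yield frame
    finally:
        capture.release()

if __name__ == "__main__" and len(sys.argv) == 3:
    # Usage: python longexposure.py input.mp4 output.png
    import cv2
    video, image = sys.argv[1], sys.argv[2]
    cv2.imwrite(image, simulate_long_exposure(read_frames(video)))
```

The running sum keeps memory use constant regardless of the video’s length; only one frame and the accumulator are held at a time.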
Conclusions
Real long-exposure photography requires the proper equipment,^{2} which can be expensive, so the ability to simulate this style of photography makes it much more accessible. Now anyone can create long-exposure photographs like a hipster, without an expensive camera!
I’ve found this technique especially useful in low-light, rainy, or otherwise noisy environments; the noise picked up by the camera appears to be zero-mean, and so averages out. The technique also has applications beyond art. Astronomers have used a similar procedure to subtract noise caused by cosmic radiation from photographs taken in space.^{3}
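The noise claim is easy to check with synthetic data. In this toy example (not Hipshot code), zero-mean Gaussian noise is added to a flat "scene"; averaging $n$ frames shrinks the noise by roughly a factor of $\sqrt{n}$:

```python
import numpy as np

rng = np.random.default_rng(0)
clean = np.full((64, 64), 100.0)  # a flat, noise-free 'scene'
n = 400

# n noisy 'frames': the scene plus zero-mean Gaussian sensor noise.
frames = [clean + rng.normal(0.0, 10.0, clean.shape) for _ in range(n)]
average = np.mean(frames, axis=0)

single_error = np.abs(frames[0] - clean).mean()  # on the order of the noise level
averaged_error = np.abs(average - clean).mean()  # roughly sqrt(n) times smaller
```

With 400 frames the residual error is around a twentieth of a single frame’s, which is why the averaged image looks so much cleaner than any individual frame.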
Please leave a comment below if you have any questions or feedback. If you do create simulated longexposure shots using Hipshot, feel free to share.
Update: Note that the repository has moved from Google Code to Bitbucket. The latest version is also available from PyPI.

1. This implementation differs from the method described above in that the exposure-time factor $\Delta$ is normalized to one. However, the effect is the same.
2. For video tutorials on traditional long-exposure photography, see: Long exposure tutorial with Scott Kelby and Long exposure photography tutorial on YouTube.
3. An average of many exposures taken with the shutter closed is subtracted from the raw data of the CCD image sensor to produce a more accurate photograph. This is called “dark-frame subtraction.”