The Blur-Noise Trade-Off Dataset

Introduction

This dataset was used to validate the restoration performance models for image deblurring presented in [Boracchi and Foi 2012].

Image Acquisition Settings

Images were acquired by fixing a Canon EOS 400D DSLR camera on a tripod placed in front of a monitor running a movie in which a natural image was progressively translated along a specific trajectory. The camera was carefully positioned to ensure parallelism between the monitor and the imaging sensor (we checked that in each picture the window displaying the movie was rectangular), and images were acquired while the movie was playing. The resulting blur can therefore be treated as spatially invariant (convolutional).
Overall, 24 movies were rendered, combining the six PSF trajectories shown in Figure 1 (one representing uniform-motion blur and five representing camera-shake blur) with the four original images shown in Figure 2.
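
As a toy illustration of this convolutional observation model (not code from the paper), a sharp image can be blurred by a single PSF applied everywhere and then corrupted with noise; the test image, the stand-in PSF, and the simple additive noise below are illustrative assumptions only:

    % Toy sketch of a spatially invariant (convolutional) observation model.
    % The test image, the stand-in PSF and the noise level are illustrative
    % choices, not part of the dataset.
    y = im2double(imread('cameraman.tif'));  % a sharp test image
    h = fspecial('motion', 15, 30);          % stand-in linear-motion PSF
    z = imfilter(y, h, 'conv', 'circular');  % the same PSF acts on every pixel
    z = z + 0.01 * randn(size(z));           % simple additive noise (illustrative)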


Figure 1: Trajectories used to generate the motion-blur PSFs in [Boracchi and Foi 2012]. Trajectory (a) corresponds to uniform (rectilinear) motion, while the remaining ones (b-f) represent blur due to camera shake.



Figure 2: Original Images (Balloons, Liza, Jeep, Salamander)


For each trajectory/image pair we rendered a 3-second movie and, from each movie, we acquired 30 pictures using settings 1-10 shown in the table below:

Setting   ISO    Exposure time T [s]   # of shots
1         1600   1/125                 3
2         1600   1/60                  3
3         1600   1/30                  3
4         1600   1/15                  3
5         1600   1/8                   3
6          800   1/4                   3
7          400   1/2                   3
8          200   1                     3
9          100   2                     3
10         100   2.5                   3
G.T.       100   4                     1

For each movie we also acquired a ground-truth image (G.T.), corresponding to a 4-second exposure at ISO 100 taken while the movie was paused (i.e., displaying the static image). The G.T. image was used exclusively to compute the restoration error and to estimate the PSFs from the blurred images.

We cropped 256x256-pixel observations from different channels of the Bayer pattern and estimated the PSFs via parametric fitting, minimizing the RMSE of the restored image (since we do not consider the blind-deblurring scenario, we leveraged the knowledge of the continuous trajectory generating the motion blur).

The camera-raw dataset contains 1954 images acquired from 74 movies, of which 285 (from 13 movies) are corrupted by uniform-motion blur (i.e., trajID = 'a'). The noise parameters a and b have been estimated using the algorithm in [Foi et al. 2008] and are also reported in the dataset.
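
In the Poissonian-Gaussian model of [Foi et al. 2008], the noise variance at true intensity y is a*y + b. A minimal sketch of how the stored parameters could be used to build a per-pixel noise standard-deviation map (using the noiseParams and z fields described in the Dataset Description below; taking z as a surrogate for the true intensity is a rough approximation of ours):

    % Per-pixel noise standard deviation under the Poissonian-Gaussian model
    % var(z | y) = a*y + b of [Foi et al. 2008]. The observation z is used
    % here as a crude surrogate for the unknown true intensity y.
    a = img{f}(s).noiseParams.a;          % signal-dependent (Poissonian) component
    b = img{f}(s).noiseParams.b;          % signal-independent (Gaussian) component
    z = double(img{f}(s).z);              % blurred and noisy observation
    sigma_map = sqrt(max(a * z + b, 0));  % approximate standard-deviation map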

This experimental setup is analogous to the one in [Boracchi and Foi 2011], where raw images corrupted by uniform-motion blur were deconvolved. The dataset contains uniform-motion blur along a single direction only because, given a PSF estimate, any deblurring algorithm achieves essentially the same restoration quality regardless of the blur direction, as discussed in [Boracchi and Foi 2011].
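
As a usage sketch (not the restoration pipeline of [Boracchi and Foi 2012]), an observation can be deconvolved non-blindly with its estimated PSF and compared against the ground truth; the Wiener deconvolution, the noise-to-signal ratio, and the crude intensity rescaling below are assumptions made only for illustration:

    % Non-blind deconvolution of one observation with its estimated PSF,
    % followed by the RMSE against the ground truth. Illustrative only.
    z  = im2double(img{f}(s).z);            % blurred, noisy observation
    h  = img{f}(s).psf.estimate;            % PSF estimated for this observation
    gt = im2double(img{f}(1).gt);           % ground truth (movie paused)

    nsr  = 0.01;                            % assumed noise-to-signal ratio (tuning parameter)
    yhat = deconvwnr(z, h, nsr);            % Wiener deconvolution (one possible choice)

    % Exposure time and ISO differ between the G.T. and the observations,
    % so roughly match their intensity scales before measuring the error.
    yhat = yhat * (mean(gt(:)) / mean(yhat(:)));
    rmse = sqrt(mean((yhat(:) - gt(:)).^2));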

DataSet Description

The dataset is composed of 4 MATLAB files, each containing a cell array img. Each file stores the observations of the same original image (Figure 2), and each cell contains a structure array referring to a specific movie and color plane: each array holds up to 30 elements (observations for which the PSF estimation was not successful have been discarded).
The fields of each structure array are described below; a short loading sketch follows the list:

  • img{f}(1).gt: the ground truth acquired when the movie was paused. Only the first structure of the array contains a non-empty gt field;
  • img{f}(s).z: the s-th observation (blurred and noisy image) from the f-th movie;
  • img{f}(s).psf.estimate: the PSF estimated from the s-th observation, exploiting the continuous trajectory generating the motion in the f-th movie;
  • img{f}(s).psf.sigmaL: a PSF descriptor of the s-th observation from the f-th movie;
  • img{f}(s).psf.sigmaS: a PSF descriptor of the s-th observation from the f-th movie;
  • img{f}(1).psf.TrajStatistics: trajectory information, a structure with the following fields:
    • img{f}(1).psf.TrajStatistics.x: a complex vector describing the continuous trajectory in the 2D plane
    • img{f}(1).psf.TrajStatistics.trajID: a char (a,b,c,d,e,f) identifying the trajectory curve, as in the figure above (we recall that img{f}(1).psf.TrajStatistics.trajID == 'a' corresponds to uniform motion).
    The trajectories have been generated with the Motion-Blur PSF generation software (available here).
    Note that only the first structure of the array contains a non-empty psf.TrajStatistics field.
  • img{f}(s).EXIF.T: exposure time used to acquire the s-th observation from the f-th movie.
  • img{f}(s).EXIF.ISO: ISO setting used to acquire the s-th observation from the f-th movie.
  • img{f}(s).EXIF.ColorChannel: color channel of the Bayer pattern from which the observation was cropped.
  • img{f}(s).EXIF.filename: the original filename, which reports additional acquisition information.
  • img{f}(s).noiseParams.a: estimate of the noise parameter a as in [Foi et al., 2008].
  • img{f}(s).noiseParams.b: estimate of the noise parameter b as in [Foi et al., 2008].
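
A minimal loading sketch in MATLAB, tying the fields above together (the file name below is hypothetical, and we assume that the real and imaginary parts of TrajStatistics.x are the horizontal and vertical coordinates of the trajectory):

    % Load one of the four dataset files and walk through its contents.
    S   = load('balloons.mat');               % hypothetical file name
    img = S.img;                               % cell array described above

    for f = 1:numel(img)                       % one cell per movie / color plane
        traj = img{f}(1).psf.TrajStatistics;   % stored only in the first element
        fprintf('movie %d: trajectory %s, %d observations\n', ...
                f, traj.trajID, numel(img{f}));
        for s = 1:numel(img{f})                % up to 30 observations
            z   = img{f}(s).z;                 % blurred and noisy image
            h   = img{f}(s).psf.estimate;      % estimated PSF
            T   = img{f}(s).EXIF.T;            % exposure time
            iso = img{f}(s).EXIF.ISO;          % ISO setting
            % ... restoration / analysis of (z, h) goes here ...
        end
    end

    % Plot the continuous trajectory of the first movie (real part = x,
    % imaginary part = y: a convention we assume here).
    x = img{1}(1).psf.TrajStatistics.x;
    plot(real(x), imag(x)); axis equal;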

DataSet Download

References

[Boracchi and Foi, 2012] Modeling the Performance of Image Restoration from Motion Blur
Giacomo Boracchi and Alessandro Foi,
IEEE Transactions on Image Processing, vol. 21, no. 8, pp. 3502-3517, Aug. 2012, doi: 10.1109/TIP.2012.2192126
(Preprint), (BibTeX), (Original)

[Boracchi and Foi, 2011] Uniform motion blur in Poissonian noise: blur/noise trade-off
Giacomo Boracchi and Alessandro Foi,
IEEE Transactions on Image Processing, vol. 20, no. 2, pp. 592-598, Feb. 2011, doi: 10.1109/TIP.2010.2062196
(Preprint), (BibTeX), (Original), (Raw Images)

[Foi et al., 2008] Practical Poissonian-Gaussian noise modeling and fitting for single image raw-data
A. Foi, M. Trimeche, V. Katkovnik, and K. Egiazarian,
IEEE Transactions on Image Processing, vol. 17, no. 10, pp. 1737-1754, Oct. 2008, doi: 10.1109/TIP.2008.2001399
(Preprint), (Download Software)