ISBI 2013 Euro-BioImaging
Open Bio Image Alliance

Deconvolution — Making the Most of Fluorescence Microscopy

Deconvolution is one of the most common image-reconstruction tasks that arise in 3D fluorescence microscopy. The aim of this challenge is to benchmark existing deconvolution algorithms and to stimulate the community to look for novel, global and practical approaches to this problem.

The challenge will be divided into two stages: a training phase and a competition (testing) phase. It will primarily be based on realistic-looking synthetic data sets representing various sub-cellular structures. In addition, it will rely on a number of common and advanced performance metrics to objectively assess the quality of the results.

Register Now!


What types of noise affect the datasets released for this challenge?

We apply both Poisson and Gaussian noise, as well as quantization. In addition, there is a constant background signal.

Please refer to the description of the forward model for more information.
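As a rough illustration of how these degradations combine, the Python sketch below applies blurring, a constant background, Poisson noise, Gaussian noise, and quantization to a ground-truth stack. The function and parameter names (forward_model, b, sigma), as well as the exact order of operations, are assumptions made for illustration; the authoritative definition is the ForwardModel3D.m script mentioned elsewhere on this page.

```python
import numpy as np

def forward_model(f, psf, b, sigma, seed=0):
    """Simulate a measurement g from a ground-truth stack f.

    Illustrative sketch only; parameter names and the ordering of
    the degradations are assumptions, not the official model.
    """
    rng = np.random.default_rng(seed)
    # 3D convolution with the PSF (circular, via the FFT);
    # ifftshift moves the PSF center to the origin
    blurred = np.real(np.fft.ifftn(np.fft.fftn(f) *
                                   np.fft.fftn(np.fft.ifftshift(psf))))
    # Add the constant background, then apply Poisson (photon) noise
    noisy = rng.poisson(np.clip(blurred + b, 0, None)).astype(float)
    # Add Gaussian (read-out) noise
    noisy += rng.normal(0.0, sigma, size=noisy.shape)
    # Quantization, sketched here as rounding to integer gray levels
    return np.round(np.clip(noisy, 0, None))
```

With a centered delta PSF, the blurring step leaves the stack unchanged and only the noise and quantization remain, which makes the individual degradations easy to inspect.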

For the qualification stage, do I have to estimate the noise parameters (background constant and Gaussian variance), or can I assume that they are the same as in the training stage?

The noise parameters are NOT the same as in the training stage.

For simplicity, we have decided to disclose the value of the background signal for each channel of the qualification-stage data (see table below). Depending on the algorithm you use, you may have to estimate the Gaussian variance yourself.

Channel             0     1     2     3
Background value    5.2   3.4   7.3   9.7
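One common heuristic for estimating the Gaussian variance: in a region of the stack containing only background, the total measured variance is approximately b + sigma^2 (the Poisson variance of the background plus the Gaussian variance), so sigma^2 can be estimated by subtracting the disclosed background value from the sample variance. This assumes the Poisson noise acts on the background as well; the sketch below, including the helper name estimate_gaussian_variance and the choice of a corner region, is illustrative only.

```python
import numpy as np

def estimate_gaussian_variance(stack, b):
    """Rough estimate of the Gaussian noise variance from a
    signal-free region of a measured stack.

    Assumes Poisson noise applies to (signal + b), so that in a
    pure-background region the total variance is about b + sigma^2.
    Hypothetical helper, not part of the challenge code.
    """
    # Corner block assumed to contain background only
    region = stack[:8, :8, :8].astype(float)
    return max(region.var() - b, 0.0)
```

In practice you would pick the signal-free region by inspecting the stack rather than blindly taking a corner.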

Should my restored data include the background value?

Your restored data should NOT include the background signal.

The goal is to produce the best-possible estimate of the original ground-truth data, that is, from a mathematical perspective, to estimate the quantity \(f[\boldsymbol{n}]\) appearing in our description of the forward model.
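If your algorithm estimates the signal plus background (for example, because it deconvolves the raw measurement directly), the disclosed background value should be subtracted before submission. A minimal sketch, assuming a NumPy array and the hypothetical helper name strip_background:

```python
import numpy as np

def strip_background(restored, b):
    """Remove the constant background before submission and clip
    negatives, since fluorescence intensities are non-negative.
    Illustrative snippet, not official challenge code."""
    return np.clip(restored.astype(np.float32) - b, 0, None)
```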

Are you expecting results as 32-bit floating point or 32-bit unsigned integer (uint32) TIFF files? Can you provide a Matlab function to generate such files?

We are asking for floating-point data.

We have prepared a Matlab function for saving an image stack in the requested format: imwritestack.m. Note that it requires a reasonably recent version of Matlab (R2009a or later).

Will you be providing the ground-truth images for reference in the training stage?

The ground-truth images will NOT be published, as we would like to reproduce the conditions of a real deconvolution problem.

However, you are free to generate synthetic data based on your own ground-truth images, using, for example, the Matlab code that we provide (specifically, the ForwardModel3D.m script). If you are looking for good ground-truth images, one option is to use a very "clean" (low-noise) confocal image stack. You may be able to obtain such stacks from online databases, e.g., the Cell Image Library.

Note that it is your responsibility to obtain permission to use data that is not published in the framework of this challenge.

Would it be possible to get the optical parameters for the image and PSF datasets?

Yes, although this information is not critical for the challenge.

Note however that the non-isotropic pixel size for the qualification-stage data will impact the performance metrics. Specifically, the grid parameter for the TVscore.m script must be set to [1 1 2] instead of the default value of [1 1 1].
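The role of the grid parameter can be illustrated with a simple anisotropic total-variation computation, where each finite difference is scaled by the voxel spacing along its axis. This is only a Python sketch; the official metric is the Matlab script TVscore.m, which may differ in detail, and the function name tv_aniso and its exact formula are assumptions.

```python
import numpy as np

def tv_aniso(u, grid=(1.0, 1.0, 2.0)):
    """Isotropic total variation of a 3D stack with per-axis voxel
    spacing, mimicking the role of the grid parameter in TVscore.m
    (illustrative; the challenge metric may differ in detail)."""
    # Forward differences scaled by the voxel spacing of each axis
    gx = np.diff(u, axis=0) / grid[0]
    gy = np.diff(u, axis=1) / grid[1]
    gz = np.diff(u, axis=2) / grid[2]
    # Crop the three gradient arrays to a common shape
    gx, gy, gz = gx[:, :-1, :-1], gy[:-1, :, :-1], gz[:-1, :-1, :]
    return np.sqrt(gx**2 + gy**2 + gz**2).sum()
```

With grid=[1, 1, 2], gradients along Z are halved, reflecting the 160 nm axial pixel size versus 80 nm laterally.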

Below is a table that summarizes the nominal optical parameters of the training and qualification data.

Parameter                             Training data            Qualification data
Emission wavelength                   Red channel: 675 nm      Channel 0: 600 nm
                                      Green channel: 525 nm    Channel 1: 525 nm
                                      Blue channel: 450 nm     Channel 2: 675 nm
                                                               Channel 3: 450 nm
Pixel size (X-Y)                      80 nm                    80 nm
Pixel size (Z)                        80 nm                    160 nm
Objective NA                          1.4                      1.4
Refractive index (immersion medium)   1.518                    1.518
Refractive index (sample)             1.33                     1.33

How can I run your reference implementation of the Richardson-Lucy algorithm?

In addition to img (the measured image stack) and psf (the PSF stack), you need to pass at least iter (the number of iterations) and b (the value of the background). This is done as follows:

RLdeblur3D(img, psf, 'iter', 100, 'b', 5);

If you want to evaluate the quality metrics within your own calibration experiment, you need to pass img (the ground truth) and imgb (the blurred ground truth) in the same way as iter and b. You can use the default parameters for the rest of the arguments.
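For reference, the multiplicative Richardson-Lucy update with a constant background can be sketched in Python as follows. This is illustrative only: the official reference implementation is the Matlab script RLdeblur3D.m, and the function below is a hypothetical stand-in that keeps only the core update.

```python
import numpy as np

def richardson_lucy(img, psf, iters=100, b=0.0):
    """Plain Richardson-Lucy deconvolution with a constant
    background b, sketched for illustration (the challenge's
    reference code is the Matlab script RLdeblur3D.m)."""
    psf = psf / psf.sum()
    # Precompute the OTF; ifftshift moves the PSF center to the origin
    otf = np.fft.fftn(np.fft.ifftshift(psf))
    conv = lambda x, H: np.real(np.fft.ifftn(np.fft.fftn(x) * H))
    est = np.full_like(img, img.mean(), dtype=float)
    for _ in range(iters):
        blurred = conv(est, otf) + b            # current model of the data
        ratio = img / np.maximum(blurred, 1e-12)
        est *= conv(ratio, np.conj(otf))        # multiplicative update
    return est
```

With a delta PSF and img = f + b, the fixed point of the update is f, which is consistent with the requirement above that the restored data should not include the background.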

Important Dates

Beginning of training stage: July 15, 2013

The training stage of the 2nd edition of the challenge will begin soon. Follow this link for early registration.