Single-Molecule Localization Microscopy  •  Software Benchmarking

  • Second edition of the challenge presented at SMLMS 2016

  • Special session on YouTube

  • Super-resolution fight club

  • Publication of the results in Nature Methods 2015

  • Challenge 2013 focused on 2D data at low and high density

  • Super-resolution microscopy wins the 2014 Nobel Prize

  • Why Challenges?

  • Open software initiative for bioimaging informatics

Benchmarking of SMLM Software

A comprehensive review of single-molecule localization microscopy (SMLM, PALM, STORM) software packages.
The benchmarking relies on common reference datasets that simulate biological structures and the image-formation process. The software packages are evaluated with well-defined metrics to obtain an objective, quantitative assessment. The performance tests are run by the developers themselves, who participated in the 2013 or 2016 challenge.
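The short Python sketch below illustrates the kind of metric computation meant here; it is not the challenge's official evaluation code. Detected localizations are paired one-to-one with ground-truth emitters, pairs farther apart than a tolerance radius are discarded, and a Jaccard index (detection rate) together with an RMSE (localization accuracy) is reported. The helper name score_localizations, the 100 nm tolerance and the toy coordinates are illustrative assumptions.

# Minimal sketch of a localization scoring metric (not the official
# challenge evaluation code). Requires numpy and scipy.
import numpy as np
from scipy.optimize import linear_sum_assignment

def score_localizations(truth_xy, found_xy, tol_nm=100.0):
    """Pair detections with ground-truth emitters and return (Jaccard, RMSE in nm)."""
    # Pairwise distances between every ground-truth emitter and every detection.
    d = np.linalg.norm(truth_xy[:, None, :] - found_xy[None, :, :], axis=2)

    # One-to-one assignment minimizing the total pairing distance.
    rows, cols = linear_sum_assignment(d)
    matched = d[rows, cols] <= tol_nm

    tp = int(matched.sum())            # detections paired within tolerance
    fn = len(truth_xy) - tp            # missed ground-truth emitters
    fp = len(found_xy) - tp            # spurious detections
    jaccard = tp / (tp + fn + fp) if (tp + fn + fp) else 0.0

    # Accuracy is reported over the accepted pairs only.
    rmse = float(np.sqrt(np.mean(d[rows, cols][matched] ** 2))) if tp else float("nan")
    return jaccard, rmse

# Toy example (coordinates in nanometres, values illustrative only).
truth = np.array([[100.0, 100.0], [500.0, 250.0], [900.0, 800.0]])
found = np.array([[110.0, 95.0], [520.0, 240.0]])
print(score_localizations(truth, found))   # -> (0.666..., ~17.7)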

Publication of the Assessment Results

D. Sage, H. Kirshner, T. Pengo, N. Stuurman, J. Min, S. Manley & M. Unser, Quantitative evaluation of software packages for single-molecule localization microscopy, Nature Methods (2015). doi:10.1038/nmeth.3442

Abstract: The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
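As an illustration of the image-formation steps listed in the abstract, the Python sketch below renders one synthetic camera frame from a list of emitters: a Gaussian approximation of the PSF, pixelation onto the camera grid, Poisson shot noise, and Gaussian readout noise on top of a camera baseline. It is not the simulator used for the challenge datasets, and every parameter value (pixel size, PSF width, photon counts, noise levels) is an assumption chosen for illustration.

# Illustrative rendering of one SMLM camera frame (not the challenge simulator).
import numpy as np

rng = np.random.default_rng(0)

def render_frame(emitters_nm, photons, size_px=64, pixel_nm=100.0,
                 psf_sigma_nm=130.0, background=10.0, baseline=100.0, read_noise=1.5):
    """Render a noisy frame from (x, y) emitter positions given in nanometres."""
    yy, xx = np.mgrid[0:size_px, 0:size_px]
    # x/y coordinates of every pixel centre, in nanometres.
    cx = (xx + 0.5) * pixel_nm
    cy = (yy + 0.5) * pixel_nm

    # Expected photons per pixel: Gaussian PSF sampled at pixel centres,
    # normalized to the emitter's photon budget, plus a uniform background.
    expected = np.full((size_px, size_px), background, dtype=float)
    for (x_nm, y_nm), n_ph in zip(emitters_nm, photons):
        psf = np.exp(-((cx - x_nm) ** 2 + (cy - y_nm) ** 2) / (2.0 * psf_sigma_nm ** 2))
        expected += n_ph * psf / psf.sum()

    # Shot noise on the photon signal, then camera baseline and readout noise.
    counts = rng.poisson(expected).astype(float)
    counts += baseline + rng.normal(0.0, read_noise, counts.shape)
    return counts

# Two emitters with different photon counts (toy values only).
frame = render_frame(emitters_nm=[(2000.0, 3000.0), (4500.0, 1200.0)], photons=[3000, 1500])
print(frame.shape, round(frame.max(), 1))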

News

Update of the challenge 1 Sept. 2017

Thanh-an Pham gave a talk at the SMLMS 2017 Symposium in London: "Developments of the ongoing 3D SMLM software challenge".

Software 11 Jul. 2017

The directory of SMLM software packages has been updated: 83 SMLM software packages are now identified and referenced.

2nd round 10 April 2017

From April 10 to May 10, a second round of the challenge is running. All the DH (double-helix) results have to be submitted with the new normalized datasets.

Participation 1 August 2016

A large panel of software packages has submitted localization files to the challenge in all modalities.

Dataset (3D) 6 May 2016

The first dataset is released. The same sample is simulated in 4 modalities: 2D, 3D-astigmatism, 3D-biplane, and 3D-double-helix.

New challenge 24 Aug. 2015

A new challenge will be held in August 2016. Send an email to be kept informed.

Publication 15 June 2015

Advance online publication of the comparative results on the Nature Methods website.

Online challenge 11 July 2013

The ISBI Challenge has been turned into a permanent online challenge.

© 2017 Biomedical Imaging Group, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
Last update: 31 Mar 2017