Supported by the official sponsors of
the Single-Molecule Localization Microscopy Symposium

Single-Molecule Localization Microscopy  •  Software Benchmarking

  • Second edition of the challenge at SMLMS 2016

  • Publication of the results in Nature Methods 2015

  • The 2013 challenge has been turned into a permanent online challenge

  • Super-resolution microscopy wins the 2014 Nobel Prize

  • Open software initiative for bioimaging informatics

Benchmarking of SMLM Software

A comprehensive review of single-molecule localization microscopy (SMLM, PALM, STORM) software packages.
This benchmarking uses common reference datasets that simulate biological structures and the image-formation process. The software packages are evaluated with well-defined metrics to obtain an objective and quantitative assessment. The performance tests are mainly run by the developers of the software themselves, who participate in a challenge.
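The core of such an evaluation is matching each reported localization to a ground-truth molecule position and deriving a detection-rate metric (e.g. a Jaccard index) and a localization accuracy (e.g. RMSE). The sketch below is only illustrative, not the challenge's official scoring code; the greedy matching strategy, the function name, and the 50 nm tolerance are assumptions for the example.

```python
import math

def match_and_score(detections, ground_truth, tol=50.0):
    """Greedy nearest-neighbour matching of detections to true positions.

    detections, ground_truth: lists of (x, y) coordinates in nm.
    tol: maximum distance (nm) for a detection to count as a true positive.
    Returns (jaccard, rmse).
    """
    unmatched = list(range(len(ground_truth)))
    sq_errors = []
    for dx, dy in detections:
        # Find the nearest still-unmatched true molecule within tolerance.
        best, best_d = None, tol
        for i in unmatched:
            gx, gy = ground_truth[i]
            d = math.hypot(dx - gx, dy - gy)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            unmatched.remove(best)
            sq_errors.append(best_d ** 2)
    tp = len(sq_errors)                      # matched detections
    fp = len(detections) - tp                # spurious detections
    fn = len(ground_truth) - tp              # missed molecules
    jaccard = tp / (tp + fp + fn) if (tp + fp + fn) else 1.0
    rmse = math.sqrt(sum(sq_errors) / tp) if tp else float("inf")
    return jaccard, rmse
```

For instance, two correct detections plus one spurious one against two true molecules yield a Jaccard index of 2/3, with the RMSE computed over the two matched pairs only.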

Publication of the Assessment Results


D. Sage, H. Kirshner, T. Pengo, N. Stuurman, J. Min, S. Manley & M. Unser, Quantitative evaluation of software packages for single-molecule localization microscopy, Nature Methods (2015). doi:10.1038/nmeth.3442

Abstract The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.


Live Workshop — 28 August 2016

28 Aug: Special Session
» Live TV YouTube Channel
29 Aug: Poster and Demos
30 Aug: Awards

Participation — 1 August 2016

A large panel of software packages has submitted localization files to the challenge in all modalities.

Competition is running — 15 June 2016

The Challenge 2016 competition will run from 15 June to 22 July 2016. The results will be presented at the SMLMS 2016 symposium in Lausanne, Switzerland, 28–30 August.

Dataset (3D) — 6 May 2016

The first dataset has been released. The same sample is simulated in four modalities: 2D, 3D-astigmatism, 3D-biplane, and 3D-double-helix.

Fight club — 26 Feb. 2016

Seamus Holden and Daniel Sage
Nature Photonics 10, 2016, PDF.

New challenge — 24 Aug. 2015

A new challenge will be held in August 2016. Send an email to be kept informed.

Publication — 15 June 2015

Advance online publication of the comparative results on the Nature Methods website.

Software — 24 Jan. 2015

Update of the comprehensive list of SMLM software packages: 45 localization software packages and 6 deconvolution-type reconstruction packages.

Conditions of use — 27 Sept. 2014

These reference datasets are designed to be widely used by developers to validate their software and by users to evaluate a software package. They may be freely used provided the sources and references are properly cited.

Online challenge — 11 July 2013

The ISBI Challenge has been turned into a permanent online challenge.

© 2016 Biomedical Imaging Group, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
Last update: 19 Aug 2016