syncAndCrunch pre-processes tiles during acquisition, displays the last completed slice on the web, and automatically stitches the data once acquisition concludes. This is useful because pre-processing the images is more time-consuming than stitching. We have found monitoring the state of the acquisition to be enormously helpful: in the past it has allowed us to rescue samples that would otherwise have been lost to problems such as laser instability, too much or too little laser power, incorrect PMT gain, or air bubbles on the objective.
The syncAndCrunch function, running on the analysis machine, periodically pulls data off the acquisition PC for pre-processing. Stitching is run automatically once acquisition completes. Stitched datasets that pass quality criteria are stored long-term on a large, secure server; the analysis machine can hold only a limited number of datasets and is not secure.
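To illustrate the pull-and-process cycle described above (this is not StitchIt's actual implementation; the function names, directory layout, and polling interval are all assumptions for the sketch), a minimal polling loop might look like:

```python
import shutil
import time
from pathlib import Path

def new_sections(remote_dir, local_dir):
    """Return section sub-directories present on the acquisition PC
    but not yet pulled to the analysis machine (hypothetical helper)."""
    remote = {p.name for p in Path(remote_dir).iterdir() if p.is_dir()}
    local = {p.name for p in Path(local_dir).iterdir() if p.is_dir()}
    return sorted(remote - local)

def sync_and_crunch(remote_dir, local_dir, poll_seconds=30):
    """Periodically pull completed sections and pre-process each one.
    A sketch only: real syncing would use rsync or similar, and the
    pre-processing step is elided."""
    while True:
        for name in new_sections(remote_dir, local_dir):
            shutil.copytree(Path(remote_dir) / name, Path(local_dir) / name)
            # ... pre-process tiles, update the web preview, etc. ...
        time.sleep(poll_seconds)
```

The key design point is that only sections not yet present locally are copied on each pass, so the loop is cheap to run frequently during a long acquisition.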
syncAndCrunch searches for a currently running acquisition and works on that.
syncAndCrunch will notice it and start stitching.
syncAndCrunch assembles the stitched images at the end of acquisition because it uses all available tiles to calculate the grand-average images used for illumination correction. Stitched data are stored as one TIFF per optical section in a directory called
stitchedImages_100. Data from each channel are stored in separate sub-directories. See here for a more detailed walk-through of the process.
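For example, a two-channel acquisition might produce a layout like the following (the channel sub-directory and file names are illustrative; only the stitchedImages_100 directory name comes from the text above):

```text
stitchedImages_100/
├── 1/                  channel 1
│   ├── section_001.tif
│   └── section_002.tif
└── 2/                  channel 2
    ├── section_001.tif
    └── section_002.tif
```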
syncAndCrunch (and other StitchIt functions) reads settings from an INI file (e.g. to define the web server to which preview images are sent). You should set up the INI file for your system before attempting to use
syncAndCrunch. Make sure you set a valid
landingDir, which is where the data will be sent on your analysis machine.
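A minimal sketch of such an INI file is shown below. Aside from landingDir, the section and field names are assumptions for illustration and may differ from those in your StitchIt version; consult your installed INI template for the real ones.

```ini
[syncAndCrunch]
; Directory on the analysis machine to which raw data are pulled (from the text above)
landingDir = /mnt/data/landing

[webserver]
; Hypothetical: server that hosts the preview image of the last completed slice
host = example-webserver.local
```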