In this notebook, we will learn how to work with the N5 API and ImgLib2.
The N5 API unifies block-wise access to potentially very large n-dimensional data across a variety of storage backends. These backends currently include the simple N5 format on the local filesystem, Google Cloud Storage, AWS S3, the HDF5 file format, and Zarr. The ImgLib2 bindings use this API to expose such data as memory-cached lazy cell images in ImgLib2.
First, let’s add the necessary dependencies. We will load the n5-ij module, which transitively loads ImgLib2 and all the N5 API modules used in this notebook. It also loads ImageJ, which we will use to display data.
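In an IJava notebook this can be done with the kernel’s Maven magic; a minimal sketch (the version number is a placeholder, pick a current n5-ij release):

Code

// pull in n5-ij, which transitively brings ImgLib2, the N5 backends, and ImageJ;
// the version below is a placeholder, not a pinned recommendation
%maven org.janelia.saalfeldlab:n5-ij:4.0.0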
Now, we register a simple renderer that uses ImgLib2’s ImageJ bridge and Spencer Park’s image renderer to render the first 2D slice of a RandomAccessibleInterval into the notebook. We also add renderers for arrays and maps, because we will want to list directories and attribute maps later.
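The registration hook itself is kernel-specific and omitted here, but the conversion at the heart of such a renderer can be sketched with ImgLib2’s ImageJ bridge (the helper name toBufferedImage is ours, not part of any API):

Code

import java.awt.image.BufferedImage;
import net.imglib2.RandomAccessibleInterval;
import net.imglib2.img.display.imagej.ImageJFunctions;
import net.imglib2.type.numeric.NumericType;
import net.imglib2.view.Views;

/* sketch: reduce an image to its first 2D slice and let ImageJ produce
 * a BufferedImage; a notebook renderer would wrap exactly this conversion */
<T extends NumericType<T>> BufferedImage toBufferedImage(RandomAccessibleInterval<T> rai) {

    while (rai.numDimensions() > 2)
        rai = Views.hyperSlice(rai, 2, rai.min(2));
    return ImageJFunctions.wrap(rai, "").getBufferedImage();
}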
We will now open N5 datasets from some sources as lazy-loading ImgLib2 cell images. For opening the N5 readers, we will use the helper class N5Factory, which parses the URL and/or magic bytes in file headers to pick the right reader or writer for the various possible N5 backends. If you know which backend you are using, you should probably use the appropriate implementation directly; it’s not difficult.
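For example, a container on the local filesystem could be opened directly (a sketch; the path is a placeholder):

Code

import org.janelia.saalfeldlab.n5.N5FSReader;

/* address the filesystem backend directly instead of going through
 * N5Factory; "/path/to/container.n5" is a placeholder */
final var fsReader = new N5FSReader("/path/to/container.n5");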
Code
import ij.*;
import net.imglib2.*;
import net.imglib2.converter.*;
import net.imglib2.type.numeric.integer.*;
import net.imglib2.view.*;
import org.janelia.saalfeldlab.n5.*;
import org.janelia.saalfeldlab.n5.ij.*;
import org.janelia.saalfeldlab.n5.imglib2.*;

/* make an N5 reader, we start with a public container on AWS S3 */
final var n5Url = "https://janelia-cosem.s3.amazonaws.com/jrc_hela-2/jrc_hela-2.n5";
final var n5Group = "/em/fibsem-uint16";
final var n5Dataset = n5Group + "/s4";
final var n5 = new N5Factory().openReader(n5Url);

/* open a dataset as a lazy loading ImgLib2 cell image */
final RandomAccessibleInterval<UnsignedShortType> rai = N5Utils.open(n5, n5Dataset);

/* This is a 3D volume, so let's show the center slice */
Views.hyperSlice(rai, 2, rai.dimension(2) / 2);
log4j:WARN No appenders could be found for logger (com.amazonaws.auth.AWSCredentialsProviderChain).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Could not load AWS credentials, falling back to anonymous.
That’s a bit low on contrast. Let’s make it look like TEM by inverting the intensities and mapping the raw window [26000, 32000] onto [255, 0], and let’s show a few of those hyperslices through the 3D volume:
Code
var raiContrast = Converters.convert(
        rai,
        (a, b) -> b.setReal(Math.max(0, Math.min(255, 255 - 255 * (a.getRealDouble() - 26000) / 6000))),
        new UnsignedByteType());

display(Views.hyperSlice(raiContrast, 2, rai.dimension(2) / 10 * 4), "image/jpeg");
display(Views.hyperSlice(raiContrast, 2, rai.dimension(2) / 2), "image/jpeg");
display(Views.hyperSlice(raiContrast, 2, rai.dimension(2) / 10 * 6), "image/jpeg");
We can list the attributes of every group or dataset together with their types, and read any of them into matching Java types:
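A minimal sketch of such a cell, using the reader opened above (the listing calls are part of the N5 API; "dimensions" is the standard N5 dataset attribute):

Code

/* list the children of the root group */
display(n5.list("/"));

/* list the attributes of the group and the dataset together with their types */
display(n5.listAttributes(n5Group));
display(n5.listAttributes(n5Dataset));

/* read a single attribute into a matching Java type */
display(n5.getAttribute(n5Dataset, "dimensions", long[].class));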
Let’s save the contrast-adjusted uint8 version of the volume into the three supported container formats (N5, Zarr, and HDF5), parallelizing the writes for N5 and Zarr:
Code
import java.nio.file.*;
import java.util.concurrent.*;

/* create a temporary directory */
Path tmpDir = Files.createTempFile("", "");
Files.delete(tmpDir);
Files.createDirectories(tmpDir);
var tmpDirStr = tmpDir.toString();
display(tmpDirStr);

/* get the dataset attributes (dataType, compression, blockSize, dimensions) */
final var attributes = n5.getDatasetAttributes(n5Dataset);

/* use 10 threads to parallelize the copy */
final var exec = Executors.newFixedThreadPool(10);

/* save this dataset into a filesystem N5 container */
try (final var n5Out = new N5Factory().openFSWriter(tmpDirStr + "/test.n5")) {
    N5Utils.save(raiContrast, n5Out, n5Dataset, attributes.getBlockSize(), attributes.getCompression(), exec);
}

/* save this dataset into a filesystem Zarr container */
try (final var zarrOut = new N5Factory().openZarrWriter(tmpDirStr + "/test.zarr")) {
    N5Utils.save(raiContrast, zarrOut, n5Dataset, attributes.getBlockSize(), attributes.getCompression(), exec);
}

/* save this dataset into an HDF5 file, parallelization does not help here */
try (final var hdf5Out = new N5Factory().openHDF5Writer(tmpDirStr + "/test.hdf5")) {
    N5Utils.save(raiContrast, hdf5Out, n5Dataset, attributes.getBlockSize(), attributes.getCompression());
}

/* shut down the executor service */
exec.shutdown();

display(Files.list(tmpDir).map(a -> a.toString()).toArray(String[]::new));
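As a quick sanity check (a sketch, relying on the same N5Factory backend detection as above), we can read one of the copies back, here the Zarr container, and display its center slice:

Code

/* open the Zarr copy we just wrote and show its center slice;
 * this check is not part of the original pipeline */
final var zarrIn = new N5Factory().openReader(tmpDirStr + "/test.zarr");
final RandomAccessibleInterval<UnsignedByteType> copy = N5Utils.open(zarrIn, n5Dataset);
display(Views.hyperSlice(copy, 2, copy.dimension(2) / 2), "image/jpeg");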