To fill in my profile tags, a thread:
#TrakEM2 is open source software primarily for #connectomics (though it has found uses well beyond), and provides the means for both manual and automatic montaging and alignment of overlapping 2D image tiles (with #SIFT features and rigid or elastic transformation models), and then for reconstructing, by mostly manual means such as painting with a digital brush, the volumes of structures of interest, as well as for tracing the branched arbors of e.g. neurons and annotating their synapses, therefore mapping a #connectome from #vEM (volume electron microscopy).
#TrakEM2 paper at https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0038011
Git repository at https://github.com/trakem2/
For 3D visualization, #TrakEM2 uses the 3D Viewer https://imagej.net/plugins/3d-viewer/
As software, #TrakEM2 runs as a plugin of #FijiSc https://fiji.sc/ and in fact motivated the creation of the #FijiSc software in the first place, to manage its many dependencies and therefore facilitate distribution to the broader #neuroscience community.
#TrakEM2 was started in 2005, when terabyte-sized datasets were rare and considered large. The largest dataset that I've successfully managed with #TrakEM2 was about 16 TB. For larger datasets, see #CATMAID below.
(That was #VirtualFlyBrain: http://virtualflybrain.org/ )
What can you do with a #CATMAID server? Say, let's look at the #Drosophila (vinegar fly, often referred to as fruit fly) larval central nervous system, generously hosted by the #VirtualFlyBrain https://l1em.catmaid.virtualflybrain.org/?pid=1&zp=108250&yp=82961.59999999999&xp=54210.799999999996&tool=tracingtool&sid0=1&s0=2.4999999999999996&help=true&layout=h(XY,%20%7B%20type:%20%22neuron-search%22,%20id:%20%22neuron-search-1%22,%20options:%20%7B%22annotation-name%22:%20%22papers%22%7D%7D,%200.6) or the #Platynereis (a marine annelid) server from the Jekely lab https://catmaid.jekelylab.ex.ac.uk/
First, directly interact by point-and-click: open widgets, find neurons by name or annotations, fire up a graph widget and rearrange neurons to make a neat synaptic connectivity diagram, or an adjacency matrix, or look at neuron anatomy in 3D. Most text, such as names and numbers, is clickable and filterable in some way, for example with regular expressions.
Second, interact from other software. Head to rcatmaid https://natverse.org/rcatmaid/ (part of the #natverse suite by Philipp Schlegel @uni_matrix, Alex Bates and others) for an R-based solution from the Jefferis lab at the #MRCLMB. It includes tools such as #NBLAST for anatomical comparisons of neurons (see the paper by Marta Costa et al. 2016 https://www.sciencedirect.com/science/article/pii/S0896627316302653 ).
If R is not your favourite, then how about #python: the #navis package, again by the prolific @uni_matrix, makes it trivial, and also works within #Blender for fancy 3D renderings and animations. An earlier, simpler package was #catpy by @csdashm https://github.com/ceesem/catpy , who also has examples of access from #matlab.
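A minimal sketch of what that looks like in #python, using the companion pymaid library that returns #navis-compatible neurons (function names are from memory and the annotation name is hypothetical, so check the docs):

import pymaid  # pip install python-catmaid
import navis

# Public #VirtualFlyBrain server: no credentials needed
rm = pymaid.CatmaidInstance('https://l1em.catmaid.virtualflybrain.org', api_token=None)

skids = pymaid.get_skids_by_annotation('papers')  # hypothetical annotation name; pick one from the server
neurons = pymaid.get_neurons(skids[:10])          # fetch the first ten skeletons
navis.plot3d(neurons)                             # interactive 3D view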
Third, directly from a #psql prompt. As in, why not? #SQL is quite a straightforward language. Of course, you'll need privileged access to the server, so this one is only for insiders. Similarly privileged is access from an #ipython prompt initialized via #django from the command line, with the entire server-side API at your disposal for queries.
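To give a flavour of the kind of query, here's a sketch wrapped in #python with psycopg2; table and column names are from memory, so check the schema with \d in psql first:

import psycopg2  # assumes direct access to the CATMAID Postgres database

conn = psycopg2.connect(dbname='catmaid', user='catmaid_user')  # hypothetical credentials
cur = conn.cursor()
# The ten largest skeletons by node count in project 1
cur.execute("""
    SELECT skeleton_id, count(*) AS nodes
    FROM treenode
    WHERE project_id = 1
    GROUP BY skeleton_id
    ORDER BY nodes DESC
    LIMIT 10
""")
for skeleton_id, nodes in cur.fetchall():
    print(skeleton_id, nodes)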
Fourth, and one of my favourites: from the #javascript console in the browser itself. There are a handful of examples here https://github.com/catmaid/CATMAID/wiki/Scripting but the possibilities are huge. Key utilities are the "fetchSkeletons" macro-like javascript function https://github.com/catmaid/CATMAID/wiki/Scripting#count-the-number-of-presynaptic-sites-and-the-number-of-presynaptic-connectors-on-an-axon and the NeuronNameService.getInstance().getName(<skeleton_id>) function.
Notice every #CATMAID server exposes its API at /apis/, e.g., https://l1em.catmaid.virtualflybrain.org/apis/ lists all of the server's REST access points. Call them as you please. See the documentation: https://catmaid.readthedocs.io/en/stable/api.html
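A minimal sketch in #python with requests, hitting the one unauthenticated endpoint I'm fairly sure of (the list of projects); for everything else, consult /apis/ and the docs:

import requests

base = 'https://l1em.catmaid.virtualflybrain.org'
# List the projects hosted on this CATMAID server (dictionary keys as I recall them)
for project in requests.get(base + '/projects/').json():
    print(project['id'], project['title'])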
In short: the data is there for you to reach out to, interactively or programmatically, and any fine mixture of the two as you see fit.
Now onto #FijiSc: Fiji is a recursive acronym meaning "Fiji is just ImageJ" https://fiji.sc (and the paper https://www.nature.com/articles/nmeth.2019 ), and #ImageJ is open source #java software for image processing https://imagej.nih.gov/ij/index.html written by Wayne Rasband at the #NIH Research Services Branch.
An analogy: think of ImageJ as the kernel and Fiji as the rest of the operating system.
#FijiSc brings to #ImageJ:
(1) a package manager to install and update plugins, which crucially enables reproducible science by exporting the whole set of plugins and libraries as an executable;
(2) a Script Editor https://imagej.net/scripting/script-editor supporting many languages (#python, #groovy, #ruby, #scala, #clojure and more), all with access to a huge collection of #JVM libraries;
(3) a huge collection of libraries, such as #ImgLib2 for n-dimensional image processing, #JFreeChart for plotting, libraries for GUIs, etc.
There are many, many plugins. A tiny sample:
Machine learning-based image segmentation:
- #LabKit https://imagej.net/plugins/labkit/
- #WEKA Trainable Segmentation https://imagej.net/plugins/tws/index
3D/4D/ND Visualization:
- 3D/4D Viewer #3DViewer https://imagej.net/plugins/3d-viewer/index with ray-tracing, orthoslices, volume rendering, and more
- #BigDataViewer #BDV https://imagej.net/plugins/bdv/index for interactively navigating N-dimensional image volumes larger than RAM
Image registration and serial section alignment:
- #BigStitcher for registering 3D/4D tiled datasets, with multiview deconvolution and more https://imagej.net/plugins/bigstitcher/index
- #TrakEM2 for montaging in 2D and aligning in 3D collections of serial sections, typically from #vEM (volume electron microscopy) https://syn.mrc-lmb.cam.ac.uk/acardona/INI-2008-2011/trakem2.html
- #mpicbg libraries for extracting #SIFT and #MOPS features, then finding feature correspondences and estimating rigid and elastic transformation models https://www.nature.com/articles/nmeth.2072
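A rough sketch, in #jython from the Script Editor, of extracting #SIFT features with those libraries (class and method names as I recall the mpicbg API, so double-check against the javadoc):

from ij import IJ
from mpicbg.ij import SIFT
from mpicbg.imagefeatures import FloatArray2DSIFT
from java.util import ArrayList

imp = IJ.getImage()                      # the currently open image
params = FloatArray2DSIFT.Param()        # default SIFT parameters
sift = SIFT(FloatArray2DSIFT(params))
features = ArrayList()
sift.extractFeatures(imp.getProcessor(), features)
print "Extracted", features.size(), "SIFT features"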
Summarizing #FijiSc is impossible. See the online forum, where questions find answers from the broad community of users and developers: https://forum.image.sc/
For an introduction to #FijiSc from the comfort of #python (or rather, #jython 2.7), see my online tutorial, walking you through image processing concepts with working code that you can copy-paste into the Script Editor, which has code autocompletion to facilitate class and method discovery across the many libraries: https://syn.mrc-lmb.cam.ac.uk/acardona/fiji-tutorial/
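As a taster, here's a tiny #jython snippet of the kind you could paste into the Script Editor (not from the tutorial itself, just illustrative; the sample image is one of ImageJ's stock images):

from ij import IJ

# Open a sample image over the web, blur it, and show it
imp = IJ.openImage("https://imagej.nih.gov/ij/images/blobs.gif")
IJ.run(imp, "Gaussian Blur...", "sigma=2")
imp.show()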
@albertcardona That's impressive. Are there connectomes already available as output of this massive collective effort?
@manlius Yes, a lot, but generated mostly with #CATMAID which is more purpose-built for #connectomics.
An early reconstruction of a neural circuit done with #TrakEM2 was by Davi Bock et al. 2011 on the mouse visual cortex, "Network anatomy and in vivo physiology of visual cortical neurons" https://www.nature.com/articles/nature09802
Another one with #TrakEM2 was by Dan Bumbarger et al. 2013, "System-wide rewiring underlies behavioral differences in predatory and bacterial-feeding nematodes", where they compared #celegans with another nematode, #pristionchus pacificus, which has the exact same number of neurons but connected differently https://www.sciencedirect.com/science/article/pii/S0092867412015000
Later ones with #CATMAID include:
The polychaete worm #Platynereis by @jekely 's group, "Whole-animal #connectome and cell-type complement of the three-segmented Platynereis dumerilii larva", Verasztó et al. 2020 https://www.biorxiv.org/content/10.1101/2020.08.21.260984v2.abstract
And all of ours in #Drosophila larva. See the #VirtualFlyBrain server, which hosts the #vEM of the whole central nervous system and lists all the neurons included in each published paper (currently 23), shared among the papers and all connected to each other: https://l1em.catmaid.virtualflybrain.org/?pid=1&zp=108250&yp=82961.59999999999&xp=54210.799999999996&tool=tracingtool&sid0=1&s0=2.4999999999999996&help=true&layout=h(XY,%20%7B%20type:%20%22neuron-search%22,%20id:%20%22neuron-search-1%22,%20options:%20%7B%22annotation-name%22:%20%22papers%22%7D%7D,%200.6)
The 24th will come soon, featuring the complete #Drosophila larval brain with ~2,500 neurons. It's under review.
The web-based open source software #CATMAID was devised as "google maps but for volumes". Documentation at https://catmaid.org and source code at https://github.com/catmaid/CATMAID/
Modern #CATMAID enables hundreds of #neuroscience researchers worldwide to collaboratively map neuronal circuits in large datasets, e.g., 100 TB or larger, limited only by bandwidth and server-side storage. The goal: to map and analyse a whole brain #connectome.
Running client-side on #javascript and server-side on #django #python #postgresql, it's a pleasure to use, if I may say so, and easy to hack on to extend its functionality with further widgets.
The first minimum viable product was produced in 2007 by Stephan Saalfeld (what we now refer to, dearly, as "Ice Age CATMAID"), who demonstrated to us all that the web, and javascript, were the way to go for distributed, collaborative annotation of large datasets accessed piece-wise. See the original paper: https://academic.oup.com/bioinformatics/article-abstract/25/15/1984/210794
See also public instances at #VirtualFlyBrain http://virtualflybrain.org/ , particularly under "tools - CATMAID - hosted EM data", such as this #Drosophila first instar larval volume of its complete nervous system https://l1em.catmaid.virtualflybrain.org/?pid=1&zp=108250&yp=82961.59999999999&xp=54210.799999999996&tool=tracingtool&sid0=1&s0=2.4999999999999996&help=true&layout=h(XY,%20%7B%20type:%20%22neuron-search%22,%20id:%20%22neuron-search-1%22,%20options:%20%7B%22annotation-name%22:%20%22papers%22%7D%7D,%200.6)