Author: Donald Doherty

  • A Brain Model that Simulates the Individuality of Millions of Neurons and their Anatomically Correct Collective Structure

Figure 1. Panels A, B, and C from Figure 4 in the paper “Linking Macroscopic with Microscopic Neuroanatomy using Synthetic Neuronal Populations,” published October 23, 2014 in PLoS Computational Biology. (A) Visualization of the dentate gyrus model highlighting 1,000 synthetic dendritic trees (dark purple structures scattered through the model). (B) Rendering of the complete morphologies of all granule cells in a 20 µm transverse slice from the center of the model dentate gyrus. (C) Rendering of 48 granule cells from the crest of the slice in (B).

Most of the signal integration and transformation in your brain happens in the neuropil, the tangle of extremely fine processes that connect nerve cells together. Axons from transmitting neurons meet the dendrites of receiving neurons across a tiny intervening gap called the synapse, through which chemicals released by the transmitting cell signal the receiving cell.

It has been surmised on analytical grounds for many decades that the shape, branching, and other properties of neuronal processes influence signal processing, and, although technically difficult, a number of experiments have shown this to be true. These experiments are performed on just one or a few neurons and show, for example, that a branch point in an axon may cause a signal (an action potential) to slow down, or even stop, at the branch. The signal may continue down one branch but die out in the other (this is called filtering).

If morphology has such large effects on signal processing in a single cell, what would be the effect of hundreds of thousands or millions of cells with diverse morphologies on signal processing in a particular brain structure? The research team behind the article “Linking Macroscopic with Microscopic Neuroanatomy using Synthetic Neuronal Populations” (published October 23, 2014 in PLoS Computational Biology) doesn’t address this question directly, but their work lays a foundation for answering it.

In this paper, the authors set out to build an anatomically and morphologically realistic model of a well-studied brain structure, the dentate gyrus, located in the hippocampus. This area of our brains is particularly interesting for its central role in forming memories and in helping us navigate our environment. Studying the details of signal processing in the dentate gyrus requires taking into account the significant variability in neuron morphology across the structure, and at the scale of hundreds of thousands or millions of cells that is currently possible only with simulation tools. Hence the team’s goal: an anatomically and morphologically realistic model of the dentate gyrus.

Note: The authors have posted the Matlab source code used to create this synthetic dentate gyrus under the record “Generation of granule cell dendritic morphology (Schneider et al. 2014)” in the SenseLab ModelDB repository. The paper doesn’t state how much computing power is needed to generate the model, but the research team did use a high-performance computer cluster, so it’s probably safe to say “a lot.” Please keep this in mind if you decide to download the code and play.

Rip a donut in half and squish the half on a countertop so that the arch almost completely collapses and each end splays out a bit. Now you have something resembling a rat’s dentate gyrus, which is estimated to contain about 1.2 million granule cells (see Figure 1A above). Using mathematical and computational methods that capture both the overall anatomical shape (the squished half donut) and the morphology of each individual cell, the research team generated 1.19 million granule cells packed at the appropriate density into an anatomically appropriate three-dimensional structure. They used regional statistical variation to capture the experimentally observed variability in neuron processes across the dentate gyrus. The result is an impressive anatomical model that can be used to study the effects of anatomical and morphological heterogeneity on signal processing in the dentate gyrus.
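The general idea — place cells throughout the structure, then let each cell's morphology vary according to position-dependent statistics — can be caricatured in a few lines of R. This is not the authors' Matlab code; every number and distribution below is a made-up placeholder:

```r
set.seed(42)
n_cells <- 1000            # the real model packs in 1.19 million

# Position of each soma along the long (septotemporal) axis of the
# dentate gyrus, normalized to [0, 1]. Uniform is a placeholder; the
# real model respects measured cell-packing densities.
axis_pos <- runif(n_cells)

# Placeholder regional statistics: let mean total dendritic length
# drift with position along the axis (numbers are illustrative only).
mean_len <- 2500 + 1000 * axis_pos   # micrometers
sd_len   <- 300

# Each synthetic cell draws its own dendritic length, so no two cells
# are identical, yet the population follows the regional statistics.
dend_len <- rnorm(n_cells, mean = mean_len, sd = sd_len)

summary(dend_len)
```

In the actual model, full synthetic dendritic trees are generated within the anatomical boundaries of the dentate gyrus; here a single summary statistic per cell stands in for the trees shown in Figure 1.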

  • Octave: an Open Source Alternative to Matlab Revisited

Figure 1. Octave 3.8 graphical interface version displaying neuron-to-neuron connectivity matrices for whisker-related somatosensory cortex (for more about these data, see my July 9, 2011 post or the paper “Laminar Analysis of Excitatory Local Circuits in Vibrissal Motor and Sensory Cortical Areas” published January 4, 2011 in PLoS Biology).

More than three years ago on this blog I introduced Octave while writing about brain circuitry data. Octave has come a long way since then. After downloading it and running through a simple setup, you’ll notice two Octave applications on your computer: the traditional command-line client and the new graphical user interface version, which is slated to become the standard in version 4.

    If you’d like to try running the code that displays brain circuitry data like in Figure 1 above, go to the “Laminar analysis of excitatory circuits in vibrissal motor and sensory cortex (Hooks et al. 2011)” record in the SenseLab ModelDB repository and download mhconmatvalues20100928_octave.m from the model files. Place this file in a location you’ll remember.

After starting up Octave-gui you will see a File Browser pane on the left side of the application’s window. Navigate to your copy of mhconmatvalues20100928_octave.m and double-click it so that it loads into the editor (the area to the right). The result should look similar to Figure 1 above, minus the six graphics windows. Find the arrowhead (a right-pointing blue triangle) in the editor’s toolbar and click it to run the file displayed in the editor. The six figures defined in the file should appear.

Octave is maturing into a very attractive, freely available, open source alternative to Matlab. Soon on this blog we’ll look at how easy (or difficult) it is to run code written for Matlab in Octave.

  • R and RDF: Where Statistics and the Semantic Web Meet

Many of the most exciting developments in information-based technologies today incorporate Semantic Web technologies. These include IBM’s Watson, Apple’s Siri (originally created and developed at SRI), and Google’s search engine. It’s not surprising, but perhaps unfortunate, to read in Egon Willighagen’s preprint “Accessing biological data in R with semantic web technologies” that “most new databases do not yet use semantic web technologies.”

    Note: R is a freely available and open source tool especially useful for interactive data analytics and visualization.

    The happy news is that Egon Willighagen has once again contributed to the community tool chest. While his article is targeted at a life sciences audience, the tool (a set of R packages) he has written, known as rrdf, is useful to anyone who wants to import RDF data into R.

Want to start working with triples from within your R environment? Install the rrdf package:

install.packages("rrdf")

Installing the rrdf package also installs two dependencies: rJava and rrdflibs. The rrdf package provides RDF and SPARQL functionality through Apache Jena. The rrdflibs package contains the Apache Jena libraries, which are written in Java, and the rJava package provides the interface to Java that lets Apache Jena run. The rrdf package itself contains the R functions that wrap Jena functionality and convert data into the appropriate structures where needed.

    Load the rrdf package into your R environment:

    library(rrdf)

Now you’re able to query your favorite triple store and pull triples into your R environment. Here we will query DBpedia for a list of 40 programming languages. First, provide the URL for the SPARQL endpoint:

    endpoint <- "http://dbpedia.org/sparql"

    Next provide the SPARQL query itself:

query <- "SELECT DISTINCT ?language WHERE { ?s <http://dbpedia.org/ontology/programmingLanguage> ?o . ?o <http://www.w3.org/2000/01/rdf-schema#label> ?language } LIMIT 40"

    Finally, carry out the query using the sparql.remote() function and assign the results to a variable:

    data <- sparql.remote(endpoint, query)

    You’ve imported your first set of triple data into R!
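In rrdf, the results of a remote query come back as an R matrix with one named column per SELECT variable (here, ?language) — if in doubt, check with str(data). Since the snippet above needs a live network connection, the following sketch works on a small stand-in matrix instead of actual DBpedia results:

```r
# Stand-in for what sparql.remote() returns: a character matrix with
# one named column per SELECT variable (here, ?language).
data <- matrix(c("R (programming language)",
                 "GNU Octave"),
               ncol = 1, dimnames = list(NULL, "language"))

languages <- as.vector(data[, "language"])  # plain character vector
nrow(data)        # number of result rows
head(languages)
```

From here the values are ordinary R strings, ready for the usual analysis and visualization workflow.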