We are very happy to provide a consolidated update on the #NeuroML ecosystem in our @eLife paper, “The NeuroML ecosystem for standardized multi-scale modeling in neuroscience”: https://doi.org/10.7554/eLife.95135.3

#NeuroML is a standard and software ecosystem for data-driven, biophysically detailed #ComputationalModelling in neuroscience. It is endorsed by the @INCF and COMBINE, and is supported by a large community of users and software developers.

#Neuroscience #ComputationalNeuroscience #ComputationalModelling 1/x

While our main source of information on the brain (how/what/where/why/when it does things) is “wet” experiments, models and theory are needed to combine the many specific, isolated findings that experiments generate into coherent theories of brain function. If we are to understand the _mechanisms_ underlying brain processes, we must build data-driven, biophysically detailed models of the brain. Models also allow us to generate new predictions that can be tested in the laboratory. 2/x
A number of software tools are available for constructing and simulating such models: #NEURON, #NetPyNE, #Brian, #PyNN, #NEST, #MOOSE, #EDEN, etc. Each has its own features, style, and programming interface (API). This diversity is great, but it also means that researchers must learn each tool individually, and that models developed for one simulator don’t necessarily work in the others and must be converted manually. This is often a non-trivial task and limits model reuse. 3/x
#NeuroML provides a simulator-independent standard and software tools. The idea is that researchers build their models in NeuroML, and these models will “just run” in any of the supported simulators. So, researchers only need to learn NeuroML and then choose which simulator to run their model in; NeuroML-compliant tools take care of the rest, “under the hood”. 4/x
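To give a flavour, here is a minimal sketch of that workflow using pyNeuroML’s Python helpers. The simulation file “LEMS_HH.xml” is hypothetical; the point is that no model code changes between backends.

```python
# Minimal sketch: run the same NeuroML/LEMS simulation with different
# simulators via pyNeuroML. "LEMS_HH.xml" is a hypothetical file.
from pyneuroml import pynml

# Run with the reference jNeuroML interpreter
pynml.run_lems_with_jneuroml("LEMS_HH.xml", nogui=True)

# Run the same file with NEURON (simulator code generated under the hood)
pynml.run_lems_with_jneuroml_neuron("LEMS_HH.xml", nogui=True)

# ...or with NetPyNE
pynml.run_lems_with_jneuroml_netpyne("LEMS_HH.xml", nogui=True)
```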
#NeuroML provides a curated set of model elements for researchers to use. This includes simpler single-compartment cell models (integrate-and-fire, Izhikevich, and so on), but also the building blocks for detailed multi-compartmental cells (Hodgkin-Huxley and kinetic scheme ionic conductances), synapse models, networks/projections, and network inputs such as spike trains and pulse generators. 5/x
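For instance, a minimal sketch declaring a few of these curated elements with the libNeuroML Python API (ids such as “izh_pop” are arbitrary, and the parameter values are illustrative):

```python
# Sketch: build a tiny NeuroML model from curated elements with libNeuroML.
from neuroml import (NeuroMLDocument, IzhikevichCell, Network, Population,
                     PulseGenerator, ExplicitInput)
import neuroml.writers as writers

doc = NeuroMLDocument(id="IzhNet")

# A curated point-neuron element (illustrative parameter values)
izh = IzhikevichCell(id="izh_cell", v0="-70mV", thresh="30mV",
                     a="0.02", b="0.2", c="-65.0", d="6.0")
doc.izhikevich_cells.append(izh)

# A network containing a one-cell population
net = Network(id="net0")
doc.networks.append(net)
net.populations.append(Population(id="izh_pop", component=izh.id, size=1))

# A standard network input element: a current pulse into the cell
pg = PulseGenerator(id="pulse0", delay="100ms", duration="500ms",
                    amplitude="0.07nA")
doc.pulse_generators.append(pg)
net.explicit_inputs.append(ExplicitInput(target="izh_pop[0]", input="pulse0"))

writers.NeuroMLWriter.write(doc, "izhikevich_net.nml")
```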
You can also create new model elements if the existing ones aren’t sufficient, and because #NeuroML is designed to be modular and hierarchical, all model elements are independent and can be reused in any NeuroML model. See the full specification here: https://docs.neuroml.org/Userdocs/Specification.html 6/x

#NeuroML supports all stages of the modelling life-cycle with a vast ecosystem of software tools: creating (#pyNeuroML, #neuroConstruct, #NEURON, #NetPyNE, #PyNN, #N2A), validating (pyNeuroML, #OMV, #SciUnit), visualising (pyNeuroML, #OSB, #NeuroML-DB), simulating (#NEURON, #NetPyNE, #Brian, #PyNN, #NEST, #MOOSE, #EDEN), model fitting/optimisation (#NeuroTune, #BluePyOpt, NetPyNE), and sharing and reusing models (OSB, NeuroML-DB, #NeuroMorpho.org). 7/x
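As a small taste of the validation stage, a sketch with pyNeuroML, reusing the hypothetical file from the earlier sketch (the same check is available on the command line as “pynml -validate izhikevich_net.nml”):

```python
# Sketch: check a NeuroML file against the NeuroML v2 schema.
from pyneuroml import pynml

pynml.validate_neuroml2("izhikevich_net.nml")
```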
#NeuroML is a global community initiative. It is developed by an elected Editorial Board and overseen by a Scientific Committee, and all the software, documentation, and models produced by the NeuroML community are completely Free/Open. In this way, NeuroML supports the #FAIR (Findability, Accessibility, Interoperability, and Reusability) principles, promoting open, transparent, and reproducible science. 8/x
Please see the paper for more details, and if you’d like to use #NeuroML in your work, or support NeuroML in your tools/modelling pipelines, please come speak to us on any of our communication channels. Full documentation on NeuroML is available at https://docs.neuroml.org 9/9