INCF was formed in 2005 and has since worked to develop and promote standards, tools, and infrastructure for neuroscience researchers. Here is a summary of our activities over the last 10 years.
Between 2006 and 2013, INCF organized 22 scientific workshops on topics such as technical issues in neuroscience databasing, tool development, and modeling of the nervous system. Each workshop had invited participants, selected for their expertise in the workshop's particular area. Each workshop delivered a formal report with recommendations that served as the basis for further activities initiated by INCF.
In our first two phases (2006-2010, 2011-2015), INCF operated scientific Programs in four areas: Digital Brain Atlasing, Ontologies of Neural Structures, Standards for Data Sharing, and Multi-scale Modeling. Programs were preceded by workshops that gathered experts in the respective fields to report on the current state of the art and identify barriers to sharing and collaboration. Recommendations from the workshops formed the basis of each Program's activities. Each Program had an Oversight Committee guiding its general direction and one or several Task Forces carrying out most of the activities and development.
A great success of these endeavours was the coordinated effort of many scientists who came together as a community to resolve issues and deliver solutions.
Key Program deliverables include:
Program on Digital Brain Atlasing
- Waxholm Space (WHS), a coordinate-based reference space of the rodent brain
- A hub-based Digital Atlasing Infrastructure (DAI) which links WHS to key community mouse atlases
- Scalable Brain Atlas, a web-based display engine for brain atlases, imaging data, and ontologies
Program on Multi-Scale Modelling
- NineML, a general model description language
- The Multi-Simulation Coordinator (MUSIC), which allows large-scale neuron simulators and other applications to communicate at runtime
- The Connection Set Algebra (CSA) for unambiguous description of connectivity in neuronal network models
- A computational neuroscience ontology developed to provide standard semantic annotation models
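The Connection Set Algebra mentioned above describes network connectivity as algebraic expressions over sets of (source, target) index pairs. The sketch below illustrates that idea in plain Python; the function names are hypothetical and this is not the API of the actual `csa` library.

```python
# Illustrative sketch of the connection-set idea behind CSA.
# Helper names are made up for this example; they are NOT the
# API of the actual `csa` Python package.
from itertools import product

def all_to_all(sources, targets):
    """Full connectivity: every (source, target) index pair."""
    return set(product(sources, targets))

def one_to_one(indices):
    """Diagonal connectivity: each index connected to itself."""
    return {(i, i) for i in indices}

def minus(cset, other):
    """Set difference: remove the connections in `other` from `cset`."""
    return cset - other

# All-to-all connectivity on 3 neurons, with self-connections removed:
conns = minus(all_to_all(range(3), range(3)), one_to_one(range(3)))
sorted(conns)  # [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]
```

Because connectivity is expressed as closed-form set operations rather than enumerated edge lists, the same expression describes a network unambiguously at any size.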
Program on Standards for Data Sharing
- A proposed HDF5 standard for storing electrophysiology data (included in Neurodata Without Borders)
- An ontology for experimental neurophysiology (OEN)
- A web page on resources for data sharing in electrophysiology
- A simple sharing tool ('the one-click tool') for neuroimaging DICOM/NIfTI data
- An inventory of resources and general recommendations for incorporating data sharing into neuroimaging experiments with human subjects
- NeuroVault, a database for storing and distributing statistical maps of the human brain from PET and fMRI
- The Neuroimaging Informatics Data Model (NI-DM), a framework for the generation, storage, and query of metadata including provenance information
- The Neuroimaging Analysis Pipeline (Nipype), a framework that combines standard interfaces to the most widely used neuroimaging software with execution engines for running analyses
- The Brain Imaging Data Structure (BIDS), a simple and easy-to-adopt way of organizing neuroimaging and behavioral data
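BIDS organizes a dataset as a fixed directory hierarchy with standardized file names. The snippet below creates a minimal BIDS-style layout for one subject; the dataset name and metadata values are made up for illustration, while the file-naming pattern (sub-<label>, task-<label>, modality suffix) follows the published BIDS conventions.

```python
# Minimal BIDS-style layout for one subject (illustrative sketch;
# the full specification is maintained by the BIDS community).
from pathlib import Path

root = Path("ds_example")  # hypothetical dataset name
(root / "sub-01" / "anat").mkdir(parents=True, exist_ok=True)
(root / "sub-01" / "func").mkdir(parents=True, exist_ok=True)

# Required dataset-level metadata (values here are placeholders):
(root / "dataset_description.json").write_text(
    '{"Name": "Example dataset", "BIDSVersion": "1.0.0"}'
)

# Imaging files named after the BIDS entity conventions:
(root / "sub-01" / "anat" / "sub-01_T1w.nii.gz").touch()           # anatomical scan
(root / "sub-01" / "func" / "sub-01_task-rest_bold.nii.gz").touch()  # resting-state fMRI
```

Because both tools and humans can infer a file's subject, task, and modality from its path alone, a BIDS dataset can be validated and processed without any lab-specific documentation.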
Program on Ontologies for Neural Structures
- CUMBO, a reference framework for classifying structures of the mammalian nervous system
- NeuroLex, a structured lexicon of neuroscience terms built on the Neuron Registry and CUMBO
- The start of a project to develop a community-based encyclopedia for neuroscience, the KnowledgeSpace
- The Neuron Registry, a standard set of relations and values used to describe neurons
Training and education
From 2010 onward, following a series of workshops on training needs in neuroinformatics, INCF also established several activities to support training and education. These activities were overseen by the INCF Training Committee. The committee organized an introductory neuroinformatics course each year, ran a framework for financial support to other neuroinformatics courses, and supported the INCF participation in the Google Summer of Code program.
Students and teachers from the 2014 INCF short course "Introduction to Neuroinformatics".
In the last few years, INCF has taken part in both short- and long-term projects on neuroscience standards, infrastructure, and data management. These are some of the projects started before 2016:
- The CENTER-TBI project (“Collaborative European NeuroTrauma Effectiveness Research in TBI”), a large European project that aims to improve care for patients with traumatic brain injury (TBI) and identify the most effective clinical interventions for managing it
- The Neurodata Without Borders – Cellular Neurophysiology initiative, which aims to produce a unified data format for cellular-based neurophysiology data built on representative use cases
- BigNeuron, a community effort to define and advance the state of the art of single-neuron reconstruction, led by the Allen Institute for Brain Science
Here is a list of INCF's current project collaborations.
Who we are
INCF's network consists of countries, regions, organizations, and individual neuroscience researchers who collaborate to develop and promote global brain research.