History

INCF was formed in 2005 and has since worked to develop and promote standards, tools, and infrastructure for neuroscience researchers. Here is a summary of our activities over the last 10 years.

Scientific workshops

Between 2006 and 2013, INCF organized 22 scientific workshops on topics such as technical issues in neuroscience databasing, tool development, and modeling of the nervous system. Each workshop had invited participants, selected on the basis of their expertise in the workshop's particular area. Each workshop delivered a formal report with recommendations to serve as the basis for further activities initiated by INCF.

All INCF Scientific Workshops and workshop reports

Scientific programs

In our first two phases (2006-2010, 2011-2015), INCF operated scientific Programs in four areas: Digital Brain Atlasing, Ontologies of Neural Structures, Standards for Data Sharing, and Multi-scale Modeling. Programs were preceded by workshops that gathered experts in the respective fields to survey the current state of the art and identify barriers to sharing and collaboration. Recommendations from the workshops formed the basis of each Program's activities. Each Program had an Oversight Committee, overseeing its general direction, and one or several Task Forces carrying out most of the activities and development.

A great success of these endeavours was the coordinated effort of many scientists, who came together as a community to resolve issues and deliver solutions.

Notes from one of the INCF Cross-Programs discussions


Key Program deliverables include:

Program on Digital Brain Atlasing

Program on Multi-scale Modeling

Program on Standards for Data Sharing

  • A proposed HDF5 standard for storing electrophysiology data (included in Neurodata Without Borders)
  • An ontology for experimental neurophysiology (OEN)
  • A simple sharing tool ('the one-click tool') for neuroimaging DICOM/NIfTI data
  • An inventory of resources and general recommendations on how to incorporate data sharing into neuroimaging experiments with human subjects
  • NeuroVault, a database for storing and distributing statistical maps of the human brain from PET and fMRI
  • The Neuroimaging Informatics Data Model (NI-DM), a framework for the generation, storage, and query of metadata including provenance information
  • The Neuroimaging Analysis Pipeline (Nipype), a framework providing uniform interfaces to the most widely used neuroimaging software and execution engines
  • The Brain Imaging Data Structure (BIDS), a simple, easy-to-adopt way of organizing neuroimaging and behavioral data
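To give a flavor of the kind of organization BIDS prescribes, the following sketch creates a minimal BIDS-style dataset on disk. The dataset name and subject label here are hypothetical examples; the BIDS specification is the authoritative source for naming rules.

```shell
# Minimal, illustrative BIDS-style layout; names are hypothetical examples.
mkdir -p bids_example/sub-01/anat bids_example/sub-01/func

# Dataset-level metadata lives in dataset_description.json at the root.
printf '{"Name": "Example dataset", "BIDSVersion": "1.0.0"}\n' \
    > bids_example/dataset_description.json

# Imaging files are named by subject label, optional entities (e.g. task),
# and a suffix identifying the modality.
touch bids_example/sub-01/anat/sub-01_T1w.nii.gz
touch bids_example/sub-01/func/sub-01_task-rest_bold.nii.gz

ls -R bids_example
```

Because the layout is plain directories and predictable file names, generic tools can discover subjects and modalities without a bespoke database.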

Program on Ontologies of Neural Structures

  • CUMBO, a reference framework for classifying general mammalian nervous system structures
  • NeuroLex, a structural neuroscience lexicon built on the Neuron Registry and CUMBO
  • The start of a project to develop a community-based encyclopedia for neuroscience, the KnowledgeSpace 
  • The Neuron Registry, a standard set of relations and values used to describe neurons


Training and education

From 2010 onward, following a series of workshops on training needs in neuroinformatics, INCF also established several activities to support training and education. These activities were overseen by the INCF Training Committee, which organized an introductory neuroinformatics course each year, ran a framework for providing financial support to other neuroinformatics courses, and supported INCF's participation in the Google Summer of Code program.

Students and teachers from the 2014 INCF short course "Introduction to Neuroinformatics".


External projects

In the last few years, INCF has taken part in both short- and long-term projects in neuroscience standards, infrastructure, and data management. These are some of the projects started before 2016:

Here is a list of INCF's current project collaborations.