This Working Group aims to develop tools for the ARTEM-IS standard for electrophysiological methods reporting. ARTEM-IS stands for an Agreed Reporting Template for EEG Methodology - International Standard. Accurate reporting is critical for transparent, reproducible, and replicable research in the scientific record, and allows advanced forms of meta-science to be conducted. Systematic reviews of the EEG literature have highlighted specific and actionable weaknesses in the way that methods are typically reported - in particular, a lack of specificity about individual methodological decisions in the methods sections of journal articles. This challenge can be addressed by designing tools that facilitate detailed methodology documentation, where the structure of the tool helps to reduce errors, ambiguities and omissions in reporting. This outcome will be achieved more effectively through the collaborative action of EEG stakeholders, so that the tools are designed to maximise ease of use, clarity and specificity.
This working group is a joint effort between OCNS and INCF. The group focuses on evaluating and testing computational neuroscience tools: finding them, testing them, learning how they work, and reporting issues to developers, so that these tools remain in good shape with communities looking after them. Since many members of the WG are themselves tool developers, we will also learn from each other and work towards improving interoperability between related tools.
This working group will turn the COBIDAS recommendations and guidelines into a series of checklists hosted on a website, letting users report information faster and in more detail.
The machine-readable output can form the foundation of a Methods section. This will enhance adoption and use of emerging neuroimaging standards such as BIDS and fMRIPrep, facilitate data sharing and pre-registration, and assist peer review.
The group envisions that using checklists to report methods and results can:
● Provide comprehensive, human- and machine-readable descriptions of data collection and analysis pipelines, reducing inefficiencies and friction in reuse
● Facilitate the preparation of pre-registrations and registered reports, and help users think through and build pipelines before they start collecting data
● Make peer review more objective, e.g. by supplying an app that checks pipelines
● Facilitate systematic literature reviews and meta-analyses
● Facilitate data sharing
The implementation of this project should remain flexible enough to accommodate new checklist items as new methods mature, and reusable enough that a checklist website for a different field can be set up easily.
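To make the idea of machine-readable checklist output concrete, here is a minimal sketch of how a checklist entry could be stored as structured data and rendered as draft Methods text. The schema (field names, sections) is purely illustrative - the actual COBIDAS checklist format is to be designed by the group.

```python
# Hypothetical machine-readable checklist output; the field names and
# sections below are illustrative, not the actual COBIDAS schema.
import json

# Each entry records the checklist section, the item, and the answer.
checklist = [
    {"section": "Acquisition", "item": "field_strength", "value": 3, "unit": "T"},
    {"section": "Preprocessing", "item": "smoothing_fwhm", "value": 6, "unit": "mm"},
]

def to_methods_text(items):
    """Render checklist entries as draft Methods-section sentences."""
    return " ".join(
        f"{it['item'].replace('_', ' ').capitalize()}: {it['value']} {it['unit']}."
        for it in items
    )

machine_readable = json.dumps(checklist)     # for sharing and meta-analysis
human_readable = to_methods_text(checklist)  # draft Methods text
```

The same structured entries thus serve both audiences: tools consume the JSON, while authors start from the generated prose.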
When The Virtual Brain (TVB) runs simulations on cortical surfaces, we need to compute geodesic distances (distances along the surface) instead of simple Euclidean distances. For this computation we use a small C++ library, which has become outdated. We need to:
start with an analysis, carried out by the student, of whether the current implementation should be reused and fixed or completely replaced, then
proceed with the fix or replacement as concluded in the previous step.
If we are to fix the current implementation, we need to fix the 6 issues reported on GitHub during this project, and also
make sure the library compiles correctly with the latest version of clang;
write unit tests for the main flows as well as for some common exceptions;
run the unit tests automatically by integrating them into our Jenkins CI system;
at the end of this project, also update the tvb-gdist packages on PyPI and conda-forge.
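To illustrate why geodesic rather than Euclidean distances matter on a folded surface, here is a small self-contained sketch. Shortest paths restricted to mesh edges (Dijkstra) give only an upper bound on the true geodesic distance - tvb-gdist implements an exact C++ algorithm - but the contrast with straight-line distance is already visible on a toy mesh:

```python
# Contrast Euclidean and surface distance on a tiny triangle mesh.
# Edge-based Dijkstra is only an upper bound on the true geodesic
# distance; tvb-gdist computes it exactly. Illustration only.
import heapq, math
from collections import defaultdict

vertices = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 0), (1, 0, 1), (1, 1, 1)]
triangles = [(0, 1, 2), (1, 3, 2), (2, 3, 4), (3, 5, 4)]  # a strip folded 90 degrees

def euclidean(a, b):
    return math.dist(vertices[a], vertices[b])

# Build the edge graph of the mesh, weighted by edge length.
graph = defaultdict(dict)
for i, j, k in triangles:
    for a, b in ((i, j), (j, k), (i, k)):
        graph[a][b] = graph[b][a] = euclidean(a, b)

def surface_distance(src, dst):
    """Shortest path along mesh edges (Dijkstra)."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > dist.get(u, math.inf):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return math.inf
```

Here `surface_distance(0, 5)` is 3.0, while `euclidean(0, 5)` is √3 ≈ 1.73: the straight line cuts through space, which is exactly the error TVB must avoid when measuring distances across cortical folds.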
The SciUnit framework was developed to help researchers create unit tests for scientific models. Currently, unit tests exist for models of single neurons and small networks thereof. However, unit tests for models of large-scale brain network dynamics, such as meso-scale mean-field descriptions and corticothalamic circuit models, have not yet been developed.
● To create a basic GUI for the reconstruction pipeline that lets users provide input data, choose configurations, identify the outputs, and check logs when a problem occurs during the process.
● To integrate the GUI with our Pegasus workflow engine for automation, fault tolerance, and debugging, and to provide job status and execution statistics.
● To implement automated testing of the GUI.
● To implement more functionality for the GUI at a higher level of abstraction.
This project is about unit testing brain models with SciUnit. We want to evaluate the strength of various models and select among competing models for analyzing brain data.
The FitzHugh-Nagumo (FHN) model is a two-dimensional simplification of the Hodgkin-Huxley model used to study spike generation in the squid giant axon. This Jupyter notebook lets users test the stability of the model's equilibria under different parameter conditions. By varying the parameters, we can determine where points of stability lie and how the behaviour of the model changes under these conditions. We use a SciUnit boolean test to determine whether different parameter sets match one another in terms of stability for various conditions.
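The stability check at the heart of such a test can be sketched as follows. This uses the standard FHN formulation dv/dt = v − v³/3 − w + I, dw/dt = (v + a − bw)/τ with common textbook parameters (a = 0.7, b = 0.8, τ = 12.5); the notebook's actual parameterization and SciUnit wrapping may differ.

```python
# Linear stability analysis of the FitzHugh-Nagumo model:
#   dv/dt = v - v**3/3 - w + I
#   dw/dt = (v + a - b*w) / tau
# Sketch using common textbook parameters; the actual notebook may
# parameterize the model differently.

def fhn_fixed_point(a=0.7, b=0.8, I=0.0):
    """Find the equilibrium (v*, w*) by bisection on the nullcline intersection."""
    # At equilibrium: v - v**3/3 - (v + a)/b + I = 0
    f = lambda v: v - v**3 / 3 - (v + a) / b + I
    lo, hi = -3.0, 3.0
    for _ in range(80):
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    v = (lo + hi) / 2
    return v, (v + a) / b

def is_stable(v, b=0.8, tau=12.5):
    """Equilibrium is stable iff the Jacobian has trace < 0 and det > 0."""
    trace = (1 - v**2) - b / tau
    det = (1 - v**2) * (-b / tau) + 1 / tau
    return trace < 0 and det > 0
```

With I = 0 this recovers the classic resting state near (v, w) ≈ (−1.2, −0.625) and reports it stable; increasing the input current (e.g. I = 0.5) moves the equilibrium into the unstable regime where the model oscillates, which is the kind of boolean outcome a SciUnit test can compare across parameter sets.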
Currently, XNAT comes with a built-in GUI. This project will provide a dashboard framework that allows users to easily develop responsive dashboards for exploring, monitoring, and reviewing datasets stored on any XNAT instance.
The framework will interact with the XNAT server instance, retrieve the required data, and visualize the information it contains in summarized form.
It will be designed so that it can be used with any XNAT instance.
The resulting dashboard framework will be flexible, so that it can be improved and extended with new features as user requirements change.
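As a sketch of the retrieve-and-summarize step, a dashboard component might pull a project listing from XNAT's REST API (e.g. GET /data/projects?format=json) and reduce it for display. The payload below mirrors the "ResultSet" envelope XNAT typically returns, but the exact field names should be checked against the target XNAT instance:

```python
# Sketch of summarizing data pulled from XNAT's REST API. The payload
# shape mirrors XNAT's typical "ResultSet" JSON envelope; field names
# are assumptions to verify against the target instance.
import json

sample_response = json.loads("""
{"ResultSet": {"totalRecords": "2", "Result": [
    {"ID": "PROJ01", "name": "Pilot study", "pi_lastname": "Doe"},
    {"ID": "PROJ02", "name": "Main study", "pi_lastname": "Roe"}
]}}
""")

def summarize_projects(payload):
    """Reduce an XNAT project listing to id -> name for a dashboard widget."""
    rows = payload["ResultSet"]["Result"]
    return {row["ID"]: row["name"] for row in rows}
```

Because every XNAT endpoint returns the same envelope, small adapters like this keep the dashboard widgets independent of any particular instance.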
The Workflow Designer is a prototype web-based application that allows creating, editing, and running workflows via drag-and-drop from a predefined library of methods. Moreover, any workflow can be exported to or imported from JSON format, to ensure reusability and local execution of exported JSON configurations. The application is primarily focused on electroencephalographic signal processing and deep learning workflows.
Currently, the entire Workflow Designer system (server, workflow engine and methods) is written in Java. The aim of this project is to port the backend from Java to Python and to allow executing workflow blocks (methods) implemented in Python, using e.g. MNE for EEG signal processing or TensorFlow for deep learning. As in the current version, each block has inputs, outputs (which can be streams, arrays, files, etc.) and parameters that can be configured through a GUI. Once the system is ported, a few deep-learning workflow blocks should be developed to demonstrate its functionality.
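A Python block with declared inputs, outputs and GUI-configurable parameters might look like the following. The class and field names are purely hypothetical, not the Workflow Designer's actual API:

```python
# Hypothetical sketch of a Python workflow block interface; the names
# are illustrative, not the Workflow Designer's actual API.
from dataclasses import dataclass, field

@dataclass
class Block:
    name: str
    inputs: list
    outputs: list
    params: dict = field(default_factory=dict)

    def run(self, **inputs):
        raise NotImplementedError

class MovingAverageBlock(Block):
    """Toy preprocessing block: smooth a signal with a moving average."""
    def __init__(self, window=3):
        super().__init__("moving_average", ["signal"], ["smoothed"],
                         {"window": window})

    def run(self, signal):
        w = self.params["window"]
        # Average over the last `window` samples (shorter at the start).
        return {"smoothed": [sum(signal[max(0, i - w + 1): i + 1]) /
                             len(signal[max(0, i - w + 1): i + 1])
                             for i in range(len(signal))]}
```

Declaring inputs, outputs and parameters as data (rather than code) is what lets a GUI render the configuration form and lets exported JSON workflows be re-executed locally.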
The objectives I wish to achieve for the EEG and DL workflow are:
Rewrite all the models/algorithms in Python, i.e. re-write:
neural network models and classifiers in Python using Keras (TensorFlow);
preprocessing steps: low/high-pass filtering, epoch extraction, averaging filters, etc.
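For the epoch-extraction step mentioned above, a real Python port would most likely lean on MNE (mne.Epochs); the pure-Python sketch below just illustrates the operation of cutting fixed-length windows around event samples, with illustrative names:

```python
# Minimal sketch of epoch extraction, one of the preprocessing steps
# to port. A real port would likely use MNE (mne.Epochs); this version
# only illustrates the windowing operation.

def extract_epochs(signal, events, pre, post):
    """Return signal windows [event - pre, event + post) for each
    event that fits entirely within the recording."""
    epochs = []
    for ev in events:
        start, stop = ev - pre, ev + post
        if 0 <= start and stop <= len(signal):
            epochs.append(signal[start:stop])
    return epochs
```

Events too close to the recording edges are dropped rather than zero-padded, which is the simpler of the two common conventions.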