Getting started with Kymata Core

Set Up

This provides an overview of how to set up Kymata Core locally.

Please be aware that this codebase is released publicly to ensure the transparency of the results in the Kymata Atlas. While we welcome others using this codebase, we are unable to prioritise installation support.

Prerequisites

  • Python

Confirm you have the correct version of Python installed. Type

$ pyenv versions

This should confirm that Python 3.11 or above is installed. If it isn't already there, install it using pyenv install. You should be able to confirm you are using the correct version using

$ python -V
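
For example, to install and select a specific version (the patch version below is just an example; anything from 3.11 upwards should work):

$ pyenv install 3.11.7
$ pyenv local 3.11.7

pyenv local pins that version for the current directory by writing a .python-version file.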

  • Poetry

This package uses Poetry to manage its dependencies. See python-poetry.org for installation instructions.
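
One common route (check python-poetry.org for the currently recommended method) is the official installer:

$ curl -sSL https://install.python-poetry.org | python3 -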

Installation

  1. Clone this repository:
    $ git clone https://github.com/kymata-atlas/kymata-core.git
    
  2. To install the Python packages you will need to use Poetry. Assuming you have installed Poetry, type:

    $ poetry install

    to load the packages needed.

  3. At this point, you should be able to either run the invokers from the terminal, e.g.

    $ poetry run python invokers/run_gridsearch.py

    or activate the environment in an IDE such as PyCharm.
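
If you prefer to work in an IDE, one way (assuming a standard Poetry setup) is to ask Poetry where its virtual environment lives and point the IDE's interpreter at that path:

$ poetry env info --path

In PyCharm, for example, choose this path when adding a new project interpreter.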

Running tests, linting, and generating documentation

This will be done automatically via GitHub Actions.

To run the tests manually, run:

$ poetry run pytest

To run linting manually, run:

$ poetry run ruff check

To serve the documentation locally, run:

$ poetry run mkdocs serve

Analysing a dataset with Kymata

1. Locate your raw EMEG dataset

You'll need the following files:

  • <participant_name>_run1_raw.fif
  • <participant_name>_recording_config.yaml

2. Preprocess the data

Kymata Core holds the code that comprises the 'Kymata back-end', including preprocessing steps, the gridsearch procedure, expression plotting and IPPM generation.

Run the following invokers from invokers/ in order (an example run is sketched after this list):

  • invoker_run_data_cleansing.py, which does:
    1. First-pass filtering
    2. Maxfiltering
    3. Second-pass filtering
    4. EOG removal
  • invoker_create_trialwise_data.py, which splits the data into trials. This is all you need for a sensor-space gridsearch.
  • invoker_run_hexel_current_estimation.py
  • invoker_estimate_noise_covariance.py
  • These last two are only necessary if running the gridsearch in source space (hexels).
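
As a rough sketch (assuming the invokers take their parameters from the dataset's config files rather than from command-line arguments; check each script before running), a full source-space preprocessing run looks like:

$ poetry run python invokers/invoker_run_data_cleansing.py
$ poetry run python invokers/invoker_create_trialwise_data.py
$ poetry run python invokers/invoker_run_hexel_current_estimation.py
$ poetry run python invokers/invoker_estimate_noise_covariance.py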

3. Run the gridsearch

Run the following invoker from invokers/:

$ poetry run python invokers/run_gridsearch.py

This will output a .nkg file, which can then be loaded (see demos/demo_save_load.ipynb).

Notes

If running at the CBU, an easier way to do this (see Troubleshooting) may be to use the shell script submit_gridsearch.sh, which sets up the Apptainer environment the right way. Either run it locally with ./submit_gridsearch.sh, or run it on the CBU queue with sbatch submit_gridsearch.sh.
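
For example (assuming the CBU queue is managed by Slurm, as the use of sbatch suggests), you can submit the job and then monitor it with:

$ sbatch submit_gridsearch.sh
$ squeue -u $USER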

4. Plot the results

  • invoker_run_nkg_plotting.py

See also demos/demo_plotting.ipynb.

5. Visualise processing pathways

See demos/demo_ippm.ipynb.
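
To open these demo notebooks, one option (assuming Jupyter is available in your environment; it is not necessarily part of the project's dependencies, so you may need to add it first) is:

$ poetry run pip install notebook
$ poetry run jupyter notebook demos/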

Troubleshooting on the CBU compute cluster

  • You see an Access denied permission error: 403 when you try to use GitHub.

This is because your git instance at the CBU is not passing the correct authorisation credentials to your GitHub account. You will have to create a new SSH key pair in ~/.ssh/ in your CBU home folder, and then add the public key to your GitHub settings.
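
For example (the key file name below is just an illustration):

$ ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519_github
$ cat ~/.ssh/id_ed25519_github.pub    # paste this into GitHub → Settings → SSH and GPG keys
$ ssh -T git@github.com               # test the connection once the key has been added to GitHub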

Then, create (or modify) the config file in ~/.ssh/:

Host github.com
        LogLevel DEBUG3
        User git
        Hostname github.com
        PreferredAuthentications publickey
        IdentityFile /home/<username>/.ssh/<name of private key>
  • You can't install Python versions using pyenv.

This is because the login nodes don't have the right C++ compilers. To get around this, ignore pyenv, and instead use Apptainer to install Poetry:

module load apptainer
apptainer shell /imaging/local/software/singularity_images/python/python_3.11.7-slim.sif
mkdir ~/poetry
export VENV_PATH=~/poetry/
python3 -m venv $VENV_PATH
$VENV_PATH/bin/pip install -U pip setuptools
$VENV_PATH/bin/pip install poetry
  • You see ModuleNotFoundError: No module named 'numpy'

You are probably running submit_gridsearch.sh, which currently has Andy's kymata-core location hard-coded. Update it to point at your copy.

  • You see ModuleNotFoundError: No module named 'kymata'

You're not using the Poetry environment. You'll need to run this with Apptainer. First make sure kymata-core is installed with Poetry, so the kymata package is available within the virtual environment:

module load apptainer
apptainer shell -B /imaging/projects/cbu/kymata /imaging/local/software/singularity_images/python/python_3.11.7-slim.sif
export VENV_PATH=~/poetry/
cd /path/to/kymata-core

# Allow the CBU poetry to communicate with pip
export PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring

$VENV_PATH/bin/poetry install

Now (within the Apptainer) you can run it using poetry, e.g.:

$VENV_PATH/bin/poetry run python invokers/invoker_create_trialwise_data.py