
Spikey

This repository contains a solution for the Neuralink Compression Challenge. The challenge involves compressing raw electrode recordings from a Neuralink implant. These recordings are taken from the motor cortex of a non-human primate while it plays a video game.

Challenge Overview

The Neuralink N1 implant generates approximately 200 Mbps of electrode data (1024 electrodes @ 20 kHz, 10-bit resolution) and can transmit data wirelessly at about 1 Mbps. This means a compression ratio of over 200x is required. The compression must run in real-time (< 1 ms) and consume low power (< 10 mW, including radio).
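The bandwidth arithmetic above can be verified with a quick back-of-the-envelope calculation (a sketch; the figures come from the challenge statement):

```python
# Raw data rate of the N1 implant: 1024 electrodes sampled at 20 kHz
# with 10-bit resolution.
electrodes = 1024
sample_rate_hz = 20_000
bits_per_sample = 10

raw_mbps = electrodes * sample_rate_hz * bits_per_sample / 1e6  # megabits/s
radio_mbps = 1  # approximate wireless uplink

print(raw_mbps)               # 204.8
print(raw_mbps / radio_mbps)  # 204.8 -> over 200x compression required
```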

Installation

To install the necessary dependencies, create a virtual environment and install the requirements:

python3 -m venv env
source env/bin/activate
pip install -r requirements.txt

Usage

Configuration

The configuration for training and evaluation is specified in a YAML file. Below is an example configuration:

name: Test

preprocessing:
  use_delta_encoding: true # Whether to use delta encoding.

predictor:
  type: lstm # Options: 'lstm', 'fixed_input_nn'
  input_size: 1 # Input size for the LSTM predictor.
  hidden_size: 128 # Hidden size for the LSTM or Fixed Input NN predictor.
  num_layers: 2 # Number of layers for the LSTM predictor.
  fixed_input_size: 10 # Input size for the Fixed Input NN predictor. Only used if type is 'fixed_input_nn'.

training:
  epochs: 10 # Number of training epochs.
  batch_size: 32 # Batch size for training.
  learning_rate: 0.001 # Learning rate for the optimizer.
  eval_freq: 2 # Frequency of evaluation during training (in epochs).
  save_path: models # Directory to save the best model and encoder.
  num_points: 1000 # Number of data points to visualize.

bitstream_encoding:
  type: arithmetic # Use arithmetic encoding.

data:
  url: https://content.neuralink.com/compression-challenge/data.zip # URL to download the dataset.
  directory: data # Directory to extract and store the dataset.
  split_ratio: 0.8 # Ratio to split the data into train and test sets.
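The use_delta_encoding option in the preprocessing section refers to storing the difference between consecutive samples rather than the raw values; neighboring samples in an electrode trace are highly correlated, so the deltas cluster near zero and entropy-code well. A minimal sketch of the idea (the repository's actual implementation lives in data_processing.py; the function names here are illustrative):

```python
def delta_encode(samples):
    """Replace each sample with its difference from the previous one."""
    out = [samples[0]]  # keep the first sample as-is
    for prev, cur in zip(samples, samples[1:]):
        out.append(cur - prev)
    return out

def delta_decode(deltas):
    """Invert delta_encode by cumulative summation."""
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

raw = [512, 515, 514, 520, 519]
assert delta_decode(delta_encode(raw)) == raw  # lossless round trip
```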

Running the Code

To train the model and compress/decompress WAV files, use the CLI provided:

python cli.py compress --config config.yaml --input_file path/to/input.wav --output_file path/to/output.bin
python cli.py decompress --config config.yaml --input_file path/to/output.bin --output_file path/to/output.wav
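The recordings are distributed as WAV files, which Python's standard wave module can read and write. A self-contained sketch that round-trips a tiny mono 16-bit file through an in-memory buffer (the sample values and buffer are illustrative, not challenge data):

```python
import io
import struct
import wave

# Write a tiny mono 16-bit WAV to an in-memory buffer, then read it back.
buf = io.BytesIO()
samples = [0, 100, -100, 500]
with wave.open(buf, "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)        # 16-bit samples
    w.setframerate(20_000)   # matches the implant's 20 kHz sample rate
    w.writeframes(struct.pack("<%dh" % len(samples), *samples))

buf.seek(0)
with wave.open(buf, "rb") as r:
    assert r.getframerate() == 20_000
    decoded = struct.unpack("<%dh" % r.getnframes(),
                            r.readframes(r.getnframes()))
assert list(decoded) == samples  # lossless round trip
```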

Training

Training requires Slate, which is not currently publicly available. Install it via the following (repository access required):

pip install -e git+ssh://git@dominik-roth.eu/dodox/Slate.git#egg=slate

To train the model, run:

python main.py config.yaml Test