We implemented a CLI/GUI workflow for prompt-based picking with ProPicker. See GUI_QUICKSTART.md for details.
Use Conda to install the necessary dependencies in a new environment. From the root of the repository, run:
```bash
conda env create -f environment.yml
conda activate ppicker
```
Install ProPicker itself:
```bash
pip install .
```
You need the checkpoint of our pre-trained model, as well as the checkpoint of the TomoTwin model we used as prompt encoder.
You can download the ProPicker checkpoint here.
You can download the TomoTwin checkpoint by running
```bash
bash download_tomotwin_ckpt.sh
```
After downloading, place both checkpoint files in the `ProPicker` directory.
Set the following environment variables to point to the model files:
```bash
export PROPICKER_MODEL_FILE=/abs/path/to/propicker.ckpt
export TOMOTWIN_MODEL_FILE=/abs/path/to/tomotwin.pth
```
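Before running the tutorials, it can be worth confirming that both variables point at files that actually exist. The snippet below is a minimal optional sketch (not part of the ProPicker CLI); the `check_ckpt` helper name is our own.

```bash
# Optional sanity check: verify that a checkpoint variable points to a real file.
# $1 = variable name (only used in the message), $2 = the path it holds.
check_ckpt() {
  if [ -f "$2" ]; then
    echo "$1: found"
  else
    echo "$1: missing -- re-check the export above" >&2
    return 1
  fi
}

# Example usage (assumes the exports above have been run):
# check_ckpt PROPICKER_MODEL_FILE "$PROPICKER_MODEL_FILE"
# check_ckpt TOMOTWIN_MODEL_FILE "$TOMOTWIN_MODEL_FILE"
```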
We provide an example of prompt-based picking in the `TUTORIAL1:empiar10988_prompt_based_picking.ipynb` notebook, in which we pick ribosomes in the EMPIAR-10988 dataset.
An example of fine-tuning ProPicker on the EMPIAR-10988 dataset is provided in the `TUTORIAL2:empiar10988_fine_tuning.ipynb` notebook.
Code for training ProPicker from scratch is in `propicker/training_from_scratch/train.py`; the training parameters are set in `propicker/training_from_scratch/train_cfg.py`.
To download the training data, you can use `datasets/download_train_data.sh`.
Note: the training data is large, so you may want to download it to a different location. To do so, modify `datasets/download_train_data.sh` and adjust the training data path in `propicker/paths.py` accordingly.
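As a minimal sketch of staging the data elsewhere, you could prepare a target directory first and then point both the download script and `propicker/paths.py` at it. `TRAIN_DATA_DIR` below is a placeholder name of our own, not a variable defined by the repository.

```bash
# Choose a custom training-data location (placeholder path; adjust to your storage).
TRAIN_DATA_DIR="${TRAIN_DATA_DIR:-$HOME/propicker_train_data}"
mkdir -p "$TRAIN_DATA_DIR"
echo "training data directory: $TRAIN_DATA_DIR"
# Next: edit datasets/download_train_data.sh to download into $TRAIN_DATA_DIR,
# and set the matching path in propicker/paths.py.
```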
This repository contains code copied and modified from the following projects:
All derived code is explicitly marked as such.