
MIMICCXR-Dev

Some attempts at learning good representations from the MIMIC-CXR dataset (under development).

Example pretraining command:

python ../main_pretrain.py --batch_size 32 --gpus 4 --num_nodes 1 --max_epochs 25 --lr_backbone 1e-4 --lr_projector 1e-4 --img_backbone "vit2d_b16" --max_length 128 --features_dim 768 --img_embedding_dim 768 --weight_decay 0.1 --optimizer "adamw" --method "SLIP_SIMCLR" --save_dir "slip_saved" --two_transform --pretrained --seed 2022

or

python main_pretrain.py --batch_size <batch_size> --gpus <num_gpu> --num_nodes <num_node> --max_epochs <num_epochs> --lr_backbone <backbone_learning_rate> --lr_projector <projector_learning_rate> --img_backbone <image_backbone_name> --max_length <text_tokenizer_length> --features_dim <feature_dimension> --img_embedding_dim <image_embedding_dimension> --optimizer <optimizer_name> --weight_decay <weight_decay> --method <train_method> --save_dir <save_directory> --two_transform --pretrained --seed 2022
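The same run can also be launched programmatically, which makes it easier to keep hyperparameters in one place. The sketch below is a minimal Python launcher, assuming it is run from the repository root next to main_pretrain.py; the flag names and values are reused from the concrete example above, while the config dict itself is a hypothetical convenience, not part of this repository.

# Minimal sketch: build the pretraining command from a config dict and run it.
# Flag names come from the example command above; paths/values are assumptions.
import subprocess

config = {
    "batch_size": 32,
    "gpus": 4,
    "num_nodes": 1,
    "max_epochs": 25,
    "lr_backbone": 1e-4,
    "lr_projector": 1e-4,
    "img_backbone": "vit2d_b16",
    "max_length": 128,
    "features_dim": 768,
    "img_embedding_dim": 768,
    "weight_decay": 0.1,
    "optimizer": "adamw",
    "method": "SLIP_SIMCLR",
    "save_dir": "slip_saved",
    "seed": 2022,
}

cmd = ["python", "main_pretrain.py"]
for key, value in config.items():
    cmd += [f"--{key}", str(value)]
# Boolean switches from the example take no value.
cmd += ["--two_transform", "--pretrained"]

print("Running:", " ".join(cmd))
subprocess.run(cmd, check=True)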

Example finetuning commands:

python base3.py --model_load_path <path_to_weights> --batch_size 64 --max_epoch 5 --save_suffix <suffix> --seed 5 --train_percent 0.01
python train_full.py --model_load_path <path_to_weights> --model_name "resnet50" --batch_size 16 --max_epoch 30 --save_suffix <suffix> --seed 5 --train_percent 0.1 --method "FT" --num_class 14
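Since the examples above fine-tune at different label fractions (--train_percent 0.01 and 0.1), a common evaluation is to sweep several fractions in one go. The sketch below loops train_full.py over a few fractions, reusing the flags from the second command above; the checkpoint path, save-suffix scheme, and fraction list are assumptions to adapt to your setup.

# Minimal sketch: label-efficiency sweep over --train_percent values.
# Flag names mirror the train_full.py example; paths and fractions are assumed.
import subprocess

weights = "slip_saved/checkpoint.ckpt"  # assumed path to pretrained weights

for pct in (0.01, 0.1, 1.0):
    cmd = [
        "python", "train_full.py",
        "--model_load_path", weights,
        "--model_name", "resnet50",
        "--batch_size", "16",
        "--max_epoch", "30",
        "--save_suffix", f"ft_{pct}",  # hypothetical suffix scheme
        "--seed", "5",
        "--train_percent", str(pct),
        "--method", "FT",
        "--num_class", "14",
    ]
    subprocess.run(cmd, check=True)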



Languages

Python 53.4%, Jupyter Notebook 46.6%