
# L-T-EduTech-Hackathon

Solutions to the L&T EduTech Hackathon by our team, Bharatfly_Coders.

Each folder contains a README.md file explaining the results, metrics, training, and observations in detail.

Across all tasks, the general observation is that Vision Transformers (ViT) performed considerably better than classical CNN models such as ResNet, VGG16, and EfficientNet-Bx.
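
As a reference for how these backbones were compared, here is a minimal fine-tuning sketch using `timm` and PyTorch. It is illustrative only, not the exact notebook code: any of the benchmarked backbones can be swapped in by name, and the learning rate and momentum are placeholder values.

```python
import timm
import torch
import torch.nn as nn

NUM_CLASSES = 2  # e.g. Positive / Negative in PS1

# Any backbone from the tables below can be swapped in by its timm name.
model = timm.create_model("vit_tiny_patch16_224", pretrained=True,
                          num_classes=NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
# SGD rather than Adam: the (Adam) rows in the tables below converged much worse.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)  # lr is illustrative

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimization step on a batch of 224x224 normalized images."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```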

- Google Drive: https://drive.google.com/file/d/1VYsAKcM1DK613JGXaUBbw3NfFrW6ZXLg/view?usp=drivesdk
- Canva presentation: https://www.canva.com/design/DAFY7ewXb-s/uhhCwuxgKm9mK2PxNLM5wA/edit?utm_content=DAFY7ewXb-s&utm_campaign=designshare&utm_medium=link2&utm_source=sharebutton

## Results

### PS1

Pos = Positive class, Neg = Negative class; \* marks the best model.

| SNO | Model | Train loss | Train acc (%) | Val loss | Val acc (%) | Test loss | Test acc (%) | Precision (Pos / Neg) | Recall (Pos / Neg) | F1 (Pos / Neg) | Support (Pos / Neg) | Size |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | vit_tiny_r_s16_p8_224 | 0.000554 | 100 | 0.0163 | 98.56 | 0.0168 | 100 | 1.000 / 1.000 | 1.000 / 1.000 | 1.000 / 1.000 | 100 / 100 | 24.64 MB |
| 1a | vit_tiny_r_s16_p8_224 (Adam) | 8.65e-06 | 100 | 0.3412 | 96.15 | 1.3299 | 84.13 | 0.758 / 1.000 | 1.000 / 0.680 | 0.862 / 0.810 | 100 / 100 | 24.64 MB |
| \* 2 | vit_tiny_patch16_224 | 0.000392 | 100 | 0.0066 | 100 | 0.0086 | 100 | 1.000 / 1.000 | 1.000 / 1.000 | 1.000 / 1.000 | 100 / 100 | 24.64 MB |
| 3 | vit_small_patch8_224_dino | 0.000114 | 100 | 0.00098 | 100 | 0.0013 | 100 | 1.000 / 1.000 | 1.000 / 1.000 | 1.000 / 1.000 | 100 / 100 | 86.74 MB |
| 4 | vit_small_patch16_224 | 0.0011 | 100 | 0.0165 | 99.04 | 0.0172 | 100 | 1.000 / 1.000 | 1.000 / 1.000 | 1.000 / 1.000 | 100 / 100 | 86.72 MB |
| 5 | vit_base_patch8_224 | 2.34e-05 | 100 | 0.0109 | 99.5 | 0.0298 | 99 | 1.000 / 0.980 | 0.980 / 1.000 | 0.990 / 0.990 | 100 / 100 | 343.3 MB |
| 6 | vit_base_patch16_224 | 2.89e-05 | 100 | 0.0003 | 100 | 0.0009 | 100 | 1.000 / 1.000 | 1.000 / 1.000 | 1.000 / 1.000 | 100 / 100 | 343.26 MB |
| 7 | vit_large_patch16_224 | 0.000132 | 100 | 0.0048 | 100 | 0.0421 | 99.5 | 1.000 / 0.990 | 0.990 / 1.000 | 0.995 / 0.995 | 100 / 100 | 1.21 GB |

Best model performance:

Model name: vit_tiny_patch16_224

    TEST PREC: [1. 1.] RECALL: [1. 1.] F1 SCORE: [1. 1.] SUPPORT: [100 100]
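
The per-class arrays above follow the output format of scikit-learn's `precision_recall_fscore_support`; here is a sketch of how such a report can be produced (scikit-learn is assumed, and the perfect predictions mirror the table rather than real model output):

```python
import numpy as np
from sklearn.metrics import precision_recall_fscore_support

y_true = np.array([0] * 100 + [1] * 100)  # 100 Positive, 100 Negative samples
y_pred = y_true.copy()                    # a perfect classifier, as reported above

prec, rec, f1, support = precision_recall_fscore_support(y_true, y_pred)
print("TEST PREC:", prec, "RECALL:", rec, "F1 SCORE:", f1, "SUPPORT:", support)
# TEST PREC: [1. 1.] RECALL: [1. 1.] F1 SCORE: [1. 1.] SUPPORT: [100 100]
```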

### PS2

Classes: SA = Surge Arrestor, T = Transformer, T+SA = Transformer with Surge Arrestor; \* marks the best model.

| SNO | Model | Train loss | Train acc (%) | Val loss | Val acc (%) | Test loss | Test acc (%) | Precision (SA / T / T+SA) | Recall (SA / T / T+SA) | F1 (SA / T / T+SA) | Support (SA / T / T+SA) | Kappa | Size |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | vit_tiny_r_s16_p8_224 | 0.0225 | 99.48 | 1.2092 | 66.67 | 0.3775 | 89.06 | 0.789 / 0.938 / 0.895 | 0.833 / 0.833 / 0.944 | 0.811 / 0.811 / 0.919 | 18 / 18 / 18 | 0.806 | 24.64 MB |
| 2 | vit_tiny_patch16_224 | 0.0177 | 98.96 | 0.9671 | 75 | 0.4012 | 81.77 | 0.765 / 0.800 / 0.882 | 0.722 / 0.889 / 0.833 | 0.743 / 0.842 / 0.857 | 18 / 18 / 18 | 0.722 | 22.16 MB |
| 3 | vit_small_patch8_224_dino | 0.0191 | 98.96 | 0.3545 | 91.67 | 0.4131 | 81.77 | 0.722 / 0.850 / 0.875 | 0.722 / 0.944 / 0.778 | 0.722 / 0.895 / 0.824 | 18 / 18 / 18 | 0.722 | 86.74 MB |
| 4 | vit_small_patch16_224 | 0.0230 | 98.96 | 0.5674 | 75 | 0.3197 | 83.33 | 0.800 / 0.773 / 0.941 | 0.667 / 0.944 / 0.889 | 0.727 / 0.850 / 0.914 | 18 / 18 / 18 | 0.750 | 86.72 MB |
| \* 5 | vit_base_patch8_224 (SGD) | 0.0134 | 98.89 | 1.0772 | 58.33 | 0.2206 | 92.86 | 0.889 / 0.944 / 0.944 | 0.889 / 0.944 / 0.944 | 0.889 / 0.944 / 0.944 | 18 / 18 / 18 | 0.889 | 343.3 MB |
| 5a | vit_base_patch8_224 (Adam) | 1.2290 | 50 | 1.1821 | 50 | 1.1642 | 44.64 | 0.273 / 0.407 / 0.625 | 0.167 / 0.611 / 0.556 | 0.207 / 0.489 / 0.588 | 18 / 18 / 18 | 0.167 | 343.3 MB |
| 6 | vit_base_patch16_224 | 0.0139 | 99.44 | 1.3831 | 75 | 0.3159 | 85.71 | 0.778 / 0.889 / 0.889 | 0.778 / 0.889 / 0.889 | 0.778 / 0.889 / 0.889 | 18 / 18 / 18 | 0.778 | 343.26 MB |
| 7 | vit_large_patch14_224 | 0.9497 | 53.33 | 1.1118 | 58.33 | 1.0012 | 53.57 | 0.481 / 0.538 / 0.643 | 0.722 / 0.389 / 0.500 | 0.578 / 0.452 / 0.563 | 18 / 18 / 18 | 0.306 | 1.21 GB |
| 8 | vit_large_patch16_224 (batch size 4) | 0.0184 | 99.44 | 1.3256 | 66.67 | 0.4153 | 85.71 | 0.789 / 0.842 / 0.938 | 0.833 / 0.889 / 0.833 | 0.811 / 0.865 / 0.882 | 18 / 18 / 18 | 0.778 | 1.21 GB |
| 8a | vit_large_patch16_224 (batch size 3) | 0.0220 | 99.44 | 1.2884 | 66.67 | 0.4535 | 85.19 | 0.789 / 0.842 / 0.938 | 0.833 / 0.889 / 0.833 | 0.811 / 0.865 / 0.882 | 18 / 18 / 18 | 0.778 | 1.21 GB |

Among all models, vit_base_patch8_224 (SGD) stood out with a kappa score of 0.889.

Best model performance:

Model name: vit_base_patch8_224

    TEST PREC: [0.88888889 0.94444444 0.94444444] RECALL: [0.88888889 0.94444444 0.94444444] F1 SCORE: [0.88888889 0.94444444 0.94444444] SUPPORT: [18 18 18]
    KAPPA: 0.8888888888888888
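
Cohen's kappa measures agreement corrected for chance, so 0.889 is far above the 0 expected from random guessing on this balanced three-class test set. A sketch of the computation, assuming scikit-learn and a hypothetical confusion pattern chosen to reproduce the per-class numbers above:

```python
from sklearn.metrics import cohen_kappa_score

# 0 = Surge Arrestor, 1 = Transformer, 2 = Transformer with Surge Arrestor
y_true = [0] * 18 + [1] * 18 + [2] * 18

# Hypothetical predictions: class 0 loses one sample to each other class, and
# classes 1 and 2 each lose one sample to class 0 (recalls 16/18, 17/18, 17/18).
y_pred = [0] * 16 + [1, 2] + [1] * 17 + [0] + [2] * 17 + [0]

print(cohen_kappa_score(y_true, y_pred))  # ≈ 0.889
```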

### PS3

Per-class metric order: Cargo / Military / Carrier / Cruise / Tanker; \* marks the best model. Metrics are computed on the validation set (see the note below).

| SNO | Model | Train loss | Train acc (%) | Val loss | Val acc (%) | Precision (per class) | Recall (per class) | F1 (per class) | Kappa | Size |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | resnet50 | 0.0155 | 99.55 | 0.3053 | 92.5 | 0.902 / 0.933 / 1.000 / 0.955 / 0.820 | 0.822 / 0.933 / 1.000 / 0.933 / 0.911 | 0.860 / 0.933 / 1.000 / 0.944 / 0.863 | 0.903 | 94.39 MB |
| 2 | vit_tiny_r_s16_p8_224 | 0.000519 | 99.98 | 0.4658 | 91.25 | 0.846 / 0.938 / 1.000 / 0.933 / 0.813 | 0.733 / 1.000 / 1.000 / 0.933 / 0.867 | 0.786 / 0.968 / 1.000 / 0.933 / 0.839 | 0.883 | 24.64 MB |
| 3 | vit_tiny_patch16_224 | 0.000259 | 100 | 0.2843 | 93.75 | 1.000 / 0.938 / 1.000 / 1.000 / 0.789 | 0.733 / 1.000 / 1.000 / 0.933 / 1.000 | 0.846 / 0.968 / 1.000 / 0.966 / 0.882 | 0.917 | 22.16 MB |
| 4 | vit_small_patch8_224_dino | 3.42e-05 | 100 | 0.0971 | 97.5 | 0.933 / 1.000 / 1.000 / 1.000 / 0.933 | 0.933 / 1.000 / 1.000 / 1.000 / 0.933 | 0.933 / 1.000 / 1.000 / 1.000 / 0.933 | 0.967 | 86.74 MB |
| 5 | vit_small_patch16_224 | 0.000358 | 100 | 0.2944 | 96.25 | 0.929 / 1.000 / 1.000 / 1.000 / 0.875 | 0.867 / 1.000 / 1.000 / 1.000 / 0.933 | 0.897 / 1.000 / 1.000 / 1.000 / 0.903 | 0.950 | 86.73 MB |
| \* 6 | vit_base_patch8_224 (SGD) | 5.43e-05 | 100 | 0.0299 | 98.75 | 0.955 / 1.000 / 1.000 / 1.000 / 0.935 | 0.933 / 1.000 / 1.000 / 1.000 / 0.956 | 0.944 / 1.000 / 1.000 / 1.000 / 0.945 | 0.970 | 343.3 MB |
| 6a | vit_base_patch8_224 (Adam) | 1.2655 | 46.29 | 1.3631 | 33.64 | 0.216 / 0.429 / 0.429 / 0.125 / 0.511 | 0.489 / 0.600 / 0.067 / 0.022 / 0.511 | 0.299 / 0.500 / 0.115 / 0.038 / 0.511 | 0.228 | 343.3 MB |
| 7 | vit_base_patch16_224 | 5.04e-05 | 100 | 0.0669 | 97.5 | 0.933 / 1.000 / 1.000 / 1.000 / 0.933 | 0.933 / 1.000 / 1.000 / 1.000 / 0.933 | 0.933 / 1.000 / 1.000 / 1.000 / 0.933 | 0.967 | 343.27 MB |
| 8 | vit_large_patch16_224 | 0.000314 | 99.98 | 0.1864 | 96.93 | 0.933 / 1.000 / 1.000 / 1.000 / 0.933 | 0.933 / 1.000 / 1.000 / 1.000 / 0.933 | 0.933 / 1.000 / 1.000 / 1.000 / 0.933 | 0.973 | 1.21 GB |

Among all models, vit_base_patch8_224 (SGD) stood out with a kappa score of 0.97 on the validation set.

Note: The validation set was split from the training set such that all classes remained balanced. Since there was no separate test set for computing the kappa score, we evaluated on this validation set, which was kept unseen during training.
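
A minimal sketch of such a balanced split, using scikit-learn's stratified `train_test_split` (the arrays and the split ratio are placeholders, not the repo's actual values):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Placeholder data: 100 samples across the five ship classes.
labels = np.repeat(np.arange(5), 20)           # 0=Cargo ... 4=Tanker
images = np.zeros((len(labels), 224, 224, 3))  # stand-in for real images

train_x, val_x, train_y, val_y = train_test_split(
    images, labels,
    test_size=0.2,      # illustrative ratio
    stratify=labels,    # every class stays balanced in the validation split
    random_state=42,
)
```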

Best model performance:

Model name: vit_base_patch8_224

  VALIDATION PREC : [0.95454545 1.         1.         1.         0.93478261]
  RECALL : [0.93333333 1.         1.         1.         0.95555556]
  F1 : [0.94382022 1.         1.         1.         0.94505495]  
  ACC : 98.75  KAPPA : 0.97
