GPU issue on Encoder Transformer Electra spelling correction model
FathiKR opened this issue · comments
FathiKR commented
I tried to run the encoder transformer (Electra) model on a dataset of 1,000 texts using Google Colab (GPU runtime). After reaching the 35th row, Colab reported that the environment's RAM had been used up. Here is the text at index 35 that was being processed when it failed:
text_index35 = 'kan bagus servis mcm ni boleh bt utk tukar suami baru suami aku tu jenis kedekut dah lah pinjam duit aku tak nk byr harap aku bg hantaran murah dah dulu tp tak hargai dah lah nk tukar suami baru maybe boleh dpt yg lebih elok utk tahan lama kali ni'
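As a possible workaround while this is investigated, a minimal sketch of chunked processing is below. The `correct()` function is a hypothetical stand-in for the actual Electra spelling-correction call (it is not the library's real API); the idea is to process the 1,000 texts in small chunks and explicitly release Python-side garbage between chunks, so accumulated intermediate objects are less likely to exhaust the runtime's RAM partway through:

```python
import gc

def correct(text):
    # Hypothetical placeholder: substitute the real Electra
    # spelling-correction call for your model here.
    return text

def correct_all(texts, chunk_size=32):
    """Run correction over all texts in small chunks,
    collecting garbage between chunks to limit RAM growth."""
    results = []
    for start in range(0, len(texts), chunk_size):
        chunk = texts[start:start + chunk_size]
        results.extend(correct(t) for t in chunk)
        gc.collect()  # reclaim unreferenced objects before the next chunk
    return results
```

If the model runs on PyTorch, wrapping the calls in `torch.no_grad()` and calling `torch.cuda.empty_cache()` between chunks may also help keep GPU memory from accumulating, though whether that applies depends on the model's backend.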
HUSEIN ZOLKEPLI commented
Thanks! Will look into it ASAP.