dccuchile / beto

BETO - Spanish version of the BERT model

How to translate with pretrained model

manycoding opened this issue

I'd like to run English-to-Spanish translation. I followed the Transformers guide, and as I understand it, I need a custom pipeline for inference.
Are there any resources that might help with that?
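For reference, this is roughly the kind of call I'm hoping to end up with (a minimal sketch; Helsinki-NLP/opus-mt-en-es is just one English-to-Spanish checkpoint on the Hub that I'm using as a placeholder, not a BETO model):

```python
from transformers import pipeline

# Placeholder example: a ready-made English->Spanish translation model
# (a MarianMT checkpoint from the Hub, not BETO).
translator = pipeline("translation_en_to_es", model="Helsinki-NLP/opus-mt-en-es")

print(translator("The weather is nice today.")[0]["translation_text"])
```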

Hello!

I think your question is better suited to the Transformers repository: https://github.com/huggingface/transformers

Also keep in mind that translation requires an encoder-decoder architecture, while BERT itself is only an encoder. The following link may be helpful: https://medium.com/huggingface/encoder-decoders-in-transformers-a-hybrid-pre-trained-architecture-for-seq2seq-af4d7bf14bb8
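If you do want to reuse BERT-style checkpoints for seq2seq anyway, Transformers provides an EncoderDecoderModel class that can wrap two pretrained models. A rough sketch, assuming an English BERT as the encoder and BETO as the decoder (the cross-attention weights are freshly initialized, so this would need fine-tuning on an English-Spanish parallel corpus before it can translate anything):

```python
from transformers import AutoTokenizer, EncoderDecoderModel

# Sketch only: English BERT encodes the source, BETO decodes Spanish.
# Cross-attention is randomly initialized, so fine-tuning is required.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-cased",                        # encoder reads English
    "dccuchile/bert-base-spanish-wwm-cased",  # decoder writes Spanish
)

src_tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
tgt_tokenizer = AutoTokenizer.from_pretrained("dccuchile/bert-base-spanish-wwm-cased")

# BERT-based decoders need an explicit start token and pad token for generation.
model.config.decoder_start_token_id = tgt_tokenizer.cls_token_id
model.config.pad_token_id = tgt_tokenizer.pad_token_id
```

For inference without any training, a dedicated translation model like the MarianMT checkpoints is the more practical route.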

Good luck!