apply an entire BERT as text encoder
lxianl455 opened this issue
Can I apply an entire BERT as the text encoder, not just 6 layers?
What should I do to use a full BERT as the text encoder, and another BERT as the cross-modal encoder?
Hi,
yes, that works. I also used a 12+6 architecture in X^2-VLM: a full 12-layer BERT as the text encoder, followed by 6 cross-modal layers.
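For reference, here is a minimal PyTorch sketch of the 12+6 idea, not the actual X^2-VLM code: a 12-layer text-only encoder (standing in for a full BERT) followed by 6 cross-modal layers that attend over image features via cross-attention. All class names and dimensions are hypothetical; the real model loads pretrained BERT weights and a vision encoder.

```python
import torch
import torch.nn as nn

class CrossModalLayer(nn.Module):
    """One cross-modal block: text self-attention, then cross-attention to image features."""
    def __init__(self, dim=768, heads=12):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        self.norm1, self.norm2, self.norm3 = (nn.LayerNorm(dim) for _ in range(3))

    def forward(self, text, image):
        text = self.norm1(text + self.self_attn(text, text, text)[0])
        # Queries come from text; keys/values come from the image features.
        text = self.norm2(text + self.cross_attn(text, image, image)[0])
        return self.norm3(text + self.ffn(text))

class TwelvePlusSix(nn.Module):
    """12 text-only layers (a stand-in for a full BERT) + 6 cross-modal layers."""
    def __init__(self, dim=768, heads=12):
        super().__init__()
        enc_layer = nn.TransformerEncoderLayer(dim, heads, 4 * dim, batch_first=True)
        self.text_encoder = nn.TransformerEncoder(enc_layer, num_layers=12)
        self.cross_modal = nn.ModuleList(CrossModalLayer(dim, heads) for _ in range(6))

    def forward(self, text_emb, image_emb):
        text = self.text_encoder(text_emb)          # text-only encoding, all 12 layers
        for layer in self.cross_modal:              # then fuse with image features
            text = layer(text, image_emb)
        return text

model = TwelvePlusSix()
out = model(torch.randn(2, 16, 768), torch.randn(2, 49, 768))
print(out.shape)  # torch.Size([2, 16, 768])
```

In practice you would initialize the 12 text layers from a pretrained BERT checkpoint and the 6 cross-modal layers either from scratch or from the upper layers of another BERT, as the question suggests.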