deepset-ai / FARM

:house_with_garden: Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.

Home Page: https://farm.deepset.ai

Running the inference without Docker

kruthikakr opened this issue · comments

Hi, I am trying to run QA on a large document. I was able to train the model using the Colab link. Now I want to do inference, but without Docker. I am able to load the Inferencer:
nlp = Inferencer.load("bert-large-uncased-whole-word-masking-finetuned-squad", task_type="question_answering")

But I need the code to extract the exact answer, and an aggregator for very large documents.
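For reference, a minimal sketch of running QA and reading out the answer, assuming FARM's documented QA input format (a list of dicts with "qas" and "context" keys) and the `Inferencer.load` call shown above. The helper names and the sample context are illustrative, not part of FARM:

```python
def build_qa_dicts(question, context):
    # FARM's QA Inferencer expects a list of dicts with the questions
    # under "qas" and the passage text under "context".
    return [{"qas": [question], "context": context}]


def answer_question(question, context):
    # Heavy import kept local so build_qa_dicts stays importable
    # even without FARM installed.
    from farm.infer import Inferencer

    nlp = Inferencer.load(
        "bert-large-uncased-whole-word-masking-finetuned-squad",
        task_type="question_answering",
    )
    return nlp.inference_from_dicts(dicts=build_qa_dicts(question, context))


if __name__ == "__main__":
    context = "FARM is a transfer learning framework built by deepset."
    result = answer_question("Who built FARM?", context)
    # The result is a list of prediction dicts; each ranked answer
    # carries the answer text, a score, and character offsets.
    print(result)
```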

Also, in the demo the input is limited to 15,000 characters. Which parameter decides the length of the context?

Hey @kruthikakr, when you run the Inferencer in Colab it does not have any length restriction on the document. The 15,000-character limit applies only to the demo.
Did you already try QA on long documents within Colab?

The issue seems fixed, closing now. Feel free to reopen.