epfLLM / meditron

Meditron is a suite of open-source medical Large Language Models (LLMs).

Home Page: https://huggingface.co/epfl-llm


Are you planning to release fine-tuned models?

anowak opened this issue · comments

Thank you for this great work and very detailed paper! In the paper, you write:

MEDITRON models (7B and 70B) with and without fine-tuning to the public to ensure access for real-world evaluation and to facilitate similar efforts in other domains.

Should we expect fine-tuned models to be released soon?

Hi! Thank you for your interest in our work!

Yes, we are currently finishing some documentation on the fine-tuned models. We will release them soon afterward. There will be three fine-tuned models (all 70B):

  1. PubMedQA with CoT
  2. MedMCQA with CoT
  3. MedQA with CoT

Note that they are all task-specific, not instruction-tuned.
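Since these are task-specific CoT models rather than instruction-tuned chat models, a user would prompt them in the format of the benchmark they were fine-tuned on. As a minimal sketch, here is one way to build a MedMCQA-style multiple-choice prompt with a chain-of-thought cue; the exact template is an assumption, not the official one from the Meditron repository:

```python
# Hypothetical sketch: formatting a MedMCQA-style question with a CoT cue.
# The template below is an assumption, not the official Meditron prompt format.

def build_mcqa_prompt(question: str, options: list[str]) -> str:
    """Format a multiple-choice medical question for a CoT fine-tuned model."""
    letters = "ABCD"
    lines = [f"Question: {question}"]
    for letter, option in zip(letters, options):
        lines.append(f"({letter}) {option}")
    # A chain-of-thought cue prompts the model to reason before committing
    # to an answer letter.
    lines.append("Answer: Let's think step by step.")
    return "\n".join(lines)

prompt = build_mcqa_prompt(
    "Which vitamin deficiency causes scurvy?",
    ["Vitamin A", "Vitamin B12", "Vitamin C", "Vitamin D"],
)
print(prompt)
```

Because the model is fine-tuned for one benchmark format, deviating from that format (e.g. free-form chat) would likely degrade its performance compared with an instruction-tuned model.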

Thank you, I will stay tuned!

Do you have a specific timeline for releasing the models?