Are you planning to release fine-tuned models?
anowak opened this issue · comments
Thank you for this great work and very detailed paper! In the paper, you write:
MEDITRON models (7B and 70B) with and without fine-tuning to the public to ensure access for real-world evaluation and to facilitate similar efforts in other domains.
Should we expect fine-tuned models to be released soon?
Hi! Thank you for your interest in our work!
Yes, we are currently finishing some documentation on the fine-tuned models. We will release them soon afterward. There will be three fine-tuned models (all 70B):
- PubMedQA with CoT
- MedMCQA with CoT
- MedQA with CoT
Note that they are all task-specific, not instruction-tuned.
Thank you, I will stay tuned!
Do you have a specific timetable for releasing the models?