bigcode-project / bigcode-evaluation-harness

A framework for the evaluation of autoregressive code generation language models.

If I want to add my own custom prompt before each question, how should I modify the code?

ALLISWELL8 opened this issue

You can try using the --prefix argument, which adds a prefix to the prompts. But be careful: it might influence the generation style if you're using instruct models (e.g. the model may generate some text before the actual code, which will cause the tests to fail with the default post-processing).
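If --prefix is not flexible enough, another option is to subclass the task and override its get_prompt method. A minimal sketch of the idea, assuming a HumanEval-style task class; the import path and class name here are illustrative, so match them to the actual task in your checkout of the harness:

```python
# Hypothetical sketch: subclass an existing harness task and override
# get_prompt so a custom instruction is prepended to every problem.
# The import path and `HumanEval` name are illustrative only.
from bigcode_eval.tasks.humaneval import HumanEval  # illustrative path

CUSTOM_PREFIX = "# Think step by step, then write the code.\n"

class HumanEvalCustomPrompt(HumanEval):
    def get_prompt(self, doc):
        # doc is one problem from the task's dataset; the parent class
        # builds its raw prompt, and we prepend our own text to it.
        return CUSTOM_PREFIX + super().get_prompt(doc)
```

You would then register the new class alongside the existing tasks so it can be selected with --tasks as usual.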

We have an "instruction-tuning" version of HumanEval under HumanEvalPack, which supports different prompt templates depending on the model; example here
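To illustrate what per-model templating looks like, here is a simplified, self-contained sketch in the spirit of HumanEvalPack; the template names and strings are illustrative, not the harness's exact ones:

```python
# Simplified sketch of per-model prompt templating. Template names and
# strings are illustrative only, not the harness's actual templates.
PROMPT_TEMPLATES = {
    # Plain continuation: the model completes the function body directly.
    "continue": "{prompt}",
    # Instruction-style wrapper for instruction-tuned models.
    "instruct": "Question: Complete the following function.\n{prompt}\nAnswer:\n",
}

def build_prompt(doc_prompt: str, template: str = "continue") -> str:
    # Select a template per model family and fill in the problem prompt.
    return PROMPT_TEMPLATES[template].format(prompt=doc_prompt)

print(build_prompt("def add(a, b):\n    ...", template="instruct"))
```

The point is that the raw dataset prompt stays fixed while the wrapper around it is chosen to match how each model was trained.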