lighteval
21934d52 - add vllm backend (#274)

Commit · 1 year ago
add vllm backend (#274)

What this PR does:
- adds vLLM as a backend for faster inference.

How to use it:

```
lighteval accelerate \
    --model_args="pretrained=meta-llama/Meta-Llama-3.1-8B-Instruct,dtype=bfloat16,vllm,data_parallel_size=2" \
    --use_chat_template \
    --tasks "leaderboard|arc:challenge|0|0,extended|ifeval|0|0,lighteval|gsm8k|5|1" \
    --output_dir="./evals/"
```

---------

Co-authored-by: Clémentine Fourrier <22726840+clefourrier@users.noreply.github.com>
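In the command above, `--model_args` takes a comma-separated list of `key=value` pairs, with bare tokens such as `vllm` acting as boolean switches. As a minimal sketch (not lighteval's actual parser, just an illustration of the convention), such a string could be parsed like this:

```python
def parse_model_args(model_args: str) -> dict:
    """Parse a comma-separated key=value string; bare tokens become boolean flags.

    Hypothetical helper for illustration only -- lighteval's real parsing
    may differ (e.g. type coercion of numeric values).
    """
    args = {}
    for token in model_args.split(","):
        if "=" in token:
            key, value = token.split("=", 1)
            args[key] = value
        else:
            # A bare token like "vllm" is treated as an enabled flag.
            args[token] = True
    return args

parsed = parse_model_args(
    "pretrained=meta-llama/Meta-Llama-3.1-8B-Instruct,dtype=bfloat16,vllm,data_parallel_size=2"
)
# parsed["vllm"] is True; parsed["dtype"] == "bfloat16"
```

The same convention applies to the other pairs, so `data_parallel_size=2` arrives as the string `"2"` and would need numeric coercion downstream.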