Add LLM-as-a-judge using transformers (#223)
- Lazy-load the openai and transformers libraries (see the lazy-import sketch below)
- Allow the OpenAI client to be used to prompt transformers models (sketch below)
- Add Llama 3.1 405B as an LLM-as-a-judge in the default metrics
- Update the documentation
- Make sure judge requests are not rate limited
- Add an option to use a local transformers model as the judge (sketch below)
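
A minimal sketch of the lazy-import pattern, assuming illustrative helper names rather than the PR's actual API:

```python
# Sketch only: helper names are illustrative, not the PR's actual API.
def _get_openai_client(**client_kwargs):
    # Import `openai` only when an OpenAI-backed judge is requested,
    # so the package stays importable without that optional dependency.
    from openai import OpenAI

    return OpenAI(**client_kwargs)


def _get_transformers_pipeline(model_id: str):
    # Same idea for the optional `transformers` dependency.
    from transformers import pipeline

    return pipeline("text-generation", model=model_id)
```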
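
One way to reuse the OpenAI client against a transformers model is to point it at an OpenAI-compatible endpoint; the base_url, api_key, and model id below are placeholders, not the PR's defaults:

```python
from openai import OpenAI

# Placeholder endpoint and model id; any OpenAI-compatible server
# (e.g. text-generation-inference) hosting the judge model would work.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="none")

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-405B-Instruct",
    messages=[
        {"role": "system", "content": "You are a strict judge. Reply with a score from 1 to 10."},
        {"role": "user", "content": "Question: ...\nModel answer: ...\nReference answer: ..."},
    ],
)
print(response.choices[0].message.content)
```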
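
And a sketch of running the judge prompt through a local transformers model instead of a remote API; the model id and prompt are examples only:

```python
from transformers import pipeline

# Illustrative local judge; the model id is an example, not the PR's default.
judge = pipeline("text-generation", model="meta-llama/Meta-Llama-3.1-8B-Instruct")

prompt = (
    "You are a strict judge. Reply with a score from 1 to 10.\n"
    "Question: ...\nModel answer: ...\nReference answer: ...\nScore:"
)
print(judge(prompt, max_new_tokens=8, return_full_text=False)[0]["generated_text"])
```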
---------
Co-authored-by: anilaltuner <anil@firstbatch.xyz>
Co-authored-by: Clémentine Fourrier <22726840+clefourrier@users.noreply.github.com>