Add litellm inference (#385)
This PR enables running inference with any model provider supported by litellm, and adds support for using litellm as the backend for LLM-as-a-judge evaluation.
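For context, litellm exposes a single OpenAI-compatible `completion` entry point across providers, which is what makes a uniform integration possible. A minimal sketch of that underlying call (the model names are illustrative; the actual lighteval wiring lives in the diff):

```python
# Sketch of the litellm call shape this integration builds on.
# Requires the relevant provider API key in the environment
# (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY).
from litellm import completion

# An OpenAI-hosted model.
response = completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
print(response.choices[0].message.content)

# The same call shape works for other providers via litellm's
# "provider/model" naming convention.
response = completion(
    model="anthropic/claude-3-5-sonnet-20240620",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
print(response.choices[0].message.content)
```

Because every provider returns the same OpenAI-format response object, both generation and judge scoring can share one code path.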
---------
Co-authored-by: Egor Lebedev <egor.lebe@inbox.ru>
Co-authored-by: Kryvich <44714498+Kryuski@users.noreply.github.com>
Co-authored-by: Clémentine Fourrier <22726840+clefourrier@users.noreply.github.com>
Co-authored-by: Nazim Ali <nazimali@gmail.com>
Co-authored-by: vsabolcec <60775189+vsabolcec@users.noreply.github.com>
Co-authored-by: Nathan Habib <30601243+NathanHB@users.noreply.github.com>
Co-authored-by: Nathan Habib <nathan.habib@huggingface.co>
Co-authored-by: Albert Villanova del Moral <8515462+albertvillanova@users.noreply.github.com>