Add run llm
The run llm task will be used together with run_with_header.
LLM endpoints (in Swagger):
- https://services-test.clarin-pl.eu/api/v1/docs#/tasks/get_chat_completions_tasks_llm_completions_post
- https://services-test.clarin-pl.eu/api/v1/docs#/tasks/get_chat_edits_tasks_llm_edits_post
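A minimal sketch of calling the completions endpoint above. The request path (`/tasks/llm/completions`) is inferred from the Swagger operation id and the payload shape (a `messages` list) is an assumption — check the Swagger schema for the actual fields before relying on this.

```python
import json
import urllib.request

# Base URL taken from the Swagger links above.
BASE_URL = "https://services-test.clarin-pl.eu/api/v1"


def build_completions_request(prompt: str) -> urllib.request.Request:
    # ASSUMPTION: the endpoint accepts a chat-style "messages" payload;
    # verify against the Swagger schema for the completions task.
    payload = {"messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{BASE_URL}/tasks/llm/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_completions_request("Hello")
print(req.full_url)
# To actually send it: urllib.request.urlopen(req)
```

The edits endpoint would be called the same way with its own path and payload.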
LLM endpoints (in cb-ws-rest):