Commits
Refactor llm generation to return generations and statistics
First version of async llms with statistics
Return generations with list of strings and token count from _raw_response
Passing tests for inference endpoints
Testing vLLM with statistics
Refactor to remove code duplication
Fix tests from merge responses and group generations
Move import to guarded type hint
Update tests with the llm generate output format
Merge branch 'develop' of https://github.com/argilla-io/distilabel into llm-generate-upgrade
Fix test failing with vllm version upgrade
Another fix including tokenizer for our llm to work and to avoid outlines complaining
Fix dummy offline batch generation
Compute tokens using the tokenizer if available
Update docs to include references to the new outputs of the LLMs including statistics
Fix count_tokens and update corresponding tests
Make a single list comprehension as per code review
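
The commits above revolve around pairing each LLM generation with token-count statistics, counting tokens with the tokenizer when one is available, and keeping heavy imports behind a guarded type hint. The sketch below is illustrative only and not distilabel's actual implementation: the `count_tokens` and `merge_generations` helpers, the whitespace fallback, and the exact output keys are assumptions made for this example.

```python
# Minimal sketch of the ideas referenced by the commits above (hypothetical names,
# not distilabel's real API): generations grouped with statistics, tokenizer-based
# token counting with a fallback, and a guarded type-hint import.
from typing import TYPE_CHECKING, Any, Dict, List, Optional

if TYPE_CHECKING:  # guarded import: the tokenizer type is only needed for hints
    from transformers import PreTrainedTokenizer


def count_tokens(text: str, tokenizer: Optional["PreTrainedTokenizer"] = None) -> int:
    """Count tokens using the tokenizer if available, otherwise fall back to a
    rough whitespace split."""
    if tokenizer is not None:
        return len(tokenizer.encode(text))
    return len(text.split())


def merge_generations(
    generations: List[str], tokenizer: Optional["PreTrainedTokenizer"] = None
) -> Dict[str, Any]:
    """Group a list of generated strings with their token-count statistics,
    mirroring the "generations plus statistics" output shape described above."""
    return {
        "generations": generations,
        "statistics": {
            # single list comprehension over the generations
            "output_tokens": [count_tokens(g, tokenizer) for g in generations],
        },
    }


if __name__ == "__main__":
    print(merge_generations(["Hello world", "Generations now come with statistics"]))
```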