Hi everyone, I am currently trying to use the Ollama integration with llama3.2 together with voice assistants, but it always answers in a weird format containing a name, parameters, etc.
I should probably also mention that this only happens when I enable the option that lets Ollama control Home Assistant.
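For reference, the replies look roughly like this (the `HassTurnOn` name and values here are illustrative, not a verbatim capture): the model seems to emit something that reads like a raw tool-call payload as its spoken answer, rather than a conversational reply. A small sketch of what I mean, including a heuristic for telling such replies apart from normal text:

```python
import json

def looks_like_tool_call(reply: str) -> bool:
    """Heuristic: return True when a reply is a raw tool-call payload
    (a JSON object with "name" and "parameters" keys) instead of
    natural language intended for the user."""
    try:
        data = json.loads(reply.strip())
    except json.JSONDecodeError:
        return False
    return isinstance(data, dict) and "name" in data and "parameters" in data

# Illustrative example of the weird reply format (not a verbatim capture):
raw = '{"name": "HassTurnOn", "parameters": {"name": "kitchen light"}}'
print(looks_like_tool_call(raw))                                    # True
print(looks_like_tool_call("Sure, turning on the kitchen light."))  # False
```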