Hey,

Thanks for the work done. I do not understand why you only plug into private LLM services, some of which themselves use private LLMs.

My ideal would be to be able to connect to an LLM model server such as https://github.com/ollama/ollama or https://github.com/v...
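For illustration only, here is a minimal sketch (not part of the project, just an assumption of what such an integration could look like) of a client calling a locally running Ollama server through its `/api/generate` endpoint. It assumes Ollama is listening on its default port 11434 and that a model has already been pulled; the model name `llama3` is only an example.

```python
import json
import urllib.request

# Sketch: send a single prompt to a local Ollama server and return its reply.
# Assumes Ollama runs on its default port (11434) and the chosen model is
# already pulled locally -- both are assumptions for this example.
OLLAMA_URL = "http://localhost:11434/api/generate"


def generate(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body["response"]


if __name__ == "__main__":
    print(generate("Say hello in one short sentence."))
```

The point is simply that a self-hosted model server exposes a plain HTTP API on localhost, so supporting it would not require any private service credentials.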