Support for custom LLMs in python #5654
-
When will Semantic Kernel for Python support custom LLM connections, e.g. connecting to a locally trained LLM?
Answered by eavanvalkenburg, Mar 28, 2024
Replies: 2 comments, 7 replies
-
@sjadlakha thanks for your question. We have an Ollama connector in Python right now that allows you to load a local LLM. Will this work for your use case?
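For context, the Ollama connector ultimately talks to the local Ollama server's REST API. A minimal sketch of the kind of request involved, assuming Ollama's default `/api/chat` endpoint on port 11434 (the model name and message are illustrative):

```python
import json
import urllib.request

def build_ollama_chat_request(model: str, messages: list, stream: bool = False):
    """Build a POST request for Ollama's /api/chat endpoint on the default local port."""
    payload = {"model": model, "messages": messages, "stream": stream}
    return urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_ollama_chat_request(
    "llama2",
    [{"role": "user", "content": "Hello, local model!"}],
)

# Actually sending the request requires a running Ollama server:
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())["message"]["content"]
```

The connector wraps this kind of exchange behind Semantic Kernel's chat-completion abstraction, so you work with chat history objects instead of raw HTTP.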
-
@eavanvalkenburg, thoughts here?
As long as you create a class that implements ChatCompletionClientBase, you can do whatever you need inside it to make it work. Ollama is indeed a good example, because it currently does not rely on a package, just RESTful API calls.
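The pattern described above can be sketched as follows. This is a self-contained illustration, not the real Semantic Kernel API: the stand-in base class, the method name `get_chat_message_contents`, and the endpoint URL are all assumptions, so in real code you would subclass the actual `ChatCompletionClientBase` from `semantic_kernel` and match its method signatures.

```python
import asyncio
from abc import ABC, abstractmethod

# Stand-in for Semantic Kernel's ChatCompletionClientBase so this sketch
# runs on its own; the real base class defines the actual abstract methods
# your connector must implement, which may differ from this shape.
class ChatCompletionClientBase(ABC):
    @abstractmethod
    async def get_chat_message_contents(self, chat_history: list, **kwargs) -> list:
        ...

class MyLocalLLMChatCompletion(ChatCompletionClientBase):
    """Hypothetical connector for a locally trained model behind a REST endpoint."""

    def __init__(self, endpoint: str = "http://localhost:8000/generate"):
        self.endpoint = endpoint

    async def get_chat_message_contents(self, chat_history: list, **kwargs) -> list:
        # Real code would POST chat_history to self.endpoint (e.g. with
        # aiohttp or httpx) and parse the model's reply; this sketch returns
        # a canned response so it runs without a server.
        last_user_message = chat_history[-1]["content"]
        return [{"role": "assistant",
                 "content": f"(local model reply to: {last_user_message})"}]

client = MyLocalLLMChatCompletion()
reply = asyncio.run(
    client.get_chat_message_contents([{"role": "user", "content": "hi"}])
)
```

Once such a class implements the base interface, Semantic Kernel can use it anywhere a built-in chat-completion connector would go; the Ollama connector in the repository is the reference to copy for the REST plumbing.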