
Support for custom LLMs in python #5654

Closed · Answered by eavanvalkenburg
sjadlakha asked this question in Q&A
As long as you create a class that implements ChatCompletionClientBase, you can do whatever you need inside it to make it work. Ollama is indeed a good example, because it currently does not rely on any package, just RESTful API calls.
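To illustrate the pattern described in the answer, here is a minimal sketch of a custom chat client that talks to an LLM server over plain REST with no vendor SDK. Note this is a hedged illustration, not the real Semantic Kernel or Ollama API: the endpoint URL, the request/response payload shape, and the `CustomChatClient` / `complete` names are all hypothetical placeholders. In an actual integration you would subclass `ChatCompletionClientBase` from `semantic_kernel` and implement its abstract methods instead.

```python
# Sketch of a custom chat-completion client backed by raw REST calls.
# ASSUMPTIONS: the endpoint, JSON payload shape, and class/method names
# below are hypothetical; they are not the Semantic Kernel or Ollama API.
import json
import urllib.request
from typing import Callable, Optional


class CustomChatClient:
    """Talks to a local LLM server over plain HTTP, no vendor SDK needed."""

    def __init__(
        self,
        endpoint: str,
        transport: Optional[Callable[[str, bytes], bytes]] = None,
    ):
        self.endpoint = endpoint
        # Injectable transport makes the client testable without a live server.
        self._transport = transport or self._http_post

    @staticmethod
    def _http_post(url: str, body: bytes) -> bytes:
        # Plain stdlib HTTP POST; this is the only "dependency" the client has.
        req = urllib.request.Request(
            url, data=body, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    def complete(self, messages: list) -> str:
        # Serialize the chat history, POST it, and pull the reply text
        # out of the (assumed) response shape.
        payload = json.dumps({"messages": messages, "stream": False}).encode()
        raw = self._transport(self.endpoint, payload)
        return json.loads(raw)["message"]["content"]


# Usage with a fake transport standing in for the REST endpoint:
fake = lambda url, body: json.dumps({"message": {"content": "hello"}}).encode()
client = CustomChatClient("http://localhost:11434/api/chat", transport=fake)
print(client.complete([{"role": "user", "content": "hi"}]))  # prints: hello
```

The injectable `transport` parameter is a design choice worth keeping in a real implementation: it lets unit tests exercise the serialization and parsing logic without a running model server.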

Replies: 2 comments · 7 replies

Answer selected by moonbox3