Pluggability of LLM Chat Model
In this guide, we will learn how to plug in a new chat model client. The plugin must extend and adhere to the BaseChatModel class of the Langchain framework, so you can refer to Langchain's LLM chat model integration page.
If you don't find your LLM chat model client on the above link, please refer to this link to create a custom LLM chat client by extending BaseChatModel.
A chat model is a language model that uses chat messages as inputs and returns chat messages as outputs.
You can seamlessly leverage it within existing Sakhi API workflows with minimal code changes.
Interface
The BaseChatClient class declares one method to implement, which returns an instance of BaseChatModel representing the specific chat model client.
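The interface can be sketched as follows. This is a minimal sketch, not Sakhi's exact source: BaseChatModel is defined locally as a stand-in for Langchain's class so the example runs on its own.

```python
from abc import ABC, abstractmethod

# Stand-in for Langchain's BaseChatModel so this sketch runs standalone;
# in the real code this class comes from the Langchain framework.
class BaseChatModel:
    pass

class BaseChatClient(ABC):
    """Every chat model plugin implements get_client."""

    @abstractmethod
    def get_client(self, model=None, **kwargs):
        """Return a configured BaseChatModel instance for this provider."""
        ...
```

Subclasses that do not implement get_client cannot be instantiated, which is what enforces the plugin contract.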
Implementation
Let's say we create a YourChatClient class inheriting from BaseChatClient, built on a chat model client provided by Langchain (in the link mentioned above). You must implement the get_client method to instantiate the underlying chat model with the specified model name and any additional arguments.
Add the necessary environment variables to the .env file. LLM_TYPE must be defined; chat-model-specific environment variables (e.g. API_KEY) must also be set.
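A minimal .env sketch, assuming a provider key named API_KEY (actual variable names depend on your chat model client):

```bash
# Mandatory: tells the environment manager which chat client to instantiate
LLM_TYPE=<your_chat_model_name>

# Provider-specific variables (names vary by chat model client)
API_KEY=<your_api_key>
```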
Now go to the llm folder and create a file named your_chat_model_name.py. In this file, <LangchainChatModelClient> should be one of the LLM chat model clients mentioned here.
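A sketch of what that file might contain. LangchainChatModelClient is defined locally as a stand-in for whichever real Langchain chat model class you picked from the integration page, so the example runs without Langchain installed; file and class names are illustrative.

```python
# your_chat_model_name.py (sketch)

# Stand-in for the real Langchain chat model class; replace with
# an import such as: from <langchain_integration> import <LangchainChatModelClient>
class LangchainChatModelClient:
    def __init__(self, model=None, **kwargs):
        self.model = model
        self.kwargs = kwargs

class YourChatClient:
    """Adapts a Langchain chat model client to the BaseChatClient interface."""

    def get_client(self, model=None, **kwargs):
        # Pass the model name plus any provider-specific keyword
        # arguments straight through to the Langchain client.
        return LangchainChatModelClient(model=model, **kwargs)
```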
Go to the llm folder and update __init__.py with the module lookup entry for YourChatClient.
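The lookup entry can be as simple as a re-export from the new module (file and class names are illustrative):

```python
# llm/__init__.py (sketch)
from .your_chat_model_name import YourChatClient
```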
Modify env_manager.py to import YourChatClient and add a mapping for it in the self.indexes dictionary.
This setup ensures that YourChatClient can be instantiated based on specific environment variables, effectively integrating it into the environment management system. The self.indexes dictionary now includes a mapping where <your_chat_model_name> corresponds to the YourChatClient class, with "LLM_TYPE" as the environment key.
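The registration can be sketched like this. EnvManager and get_llm_client are illustrative names, and YourChatClient is defined locally as a stand-in (in env_manager.py you would import it from the llm package instead):

```python
import os

# Stand-in so this sketch runs standalone; in env_manager.py use:
# from llm import YourChatClient
class YourChatClient:
    def get_client(self, model=None, **kwargs):
        return (model, kwargs)

class EnvManager:
    def __init__(self):
        # Maps each LLM_TYPE value to the chat client class that serves it.
        # "your_chat_model_name" is a placeholder; use your actual key.
        self.indexes = {
            "your_chat_model_name": YourChatClient,
        }

    def get_llm_client(self):
        # LLM_TYPE selects which registered chat client to instantiate.
        llm_type = os.environ["LLM_TYPE"]
        return self.indexes[llm_type]()
```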
Example usage
For more examples, please refer to the Langchain documentation.
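A minimal end-to-end usage sketch. HumanMessage mimics the shape of Langchain's chat message classes, and EchoChatModel stands in for the model returned by get_client; both are illustrative, not Sakhi's real code.

```python
# Mimics Langchain's chat message shape: a role-specific message with content.
class HumanMessage:
    def __init__(self, content):
        self.content = content

class EchoChatModel:
    """Stand-in chat model: replies with the last human message's content."""

    def invoke(self, messages):
        return HumanMessage(messages[-1].content)

class YourChatClient:
    def get_client(self, model=None, **kwargs):
        return EchoChatModel()

# Chat models take chat messages as input and return chat messages as output.
client = YourChatClient().get_client(model="example-model")
reply = client.invoke([HumanMessage("Hello!")])
print(reply.content)  # → Hello!
```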