We have released the official GeekAI plugin to the Coze Plugin Store, making it easy to use the AI model proxy service provided by GeekAI when building bots and applications on the Coze platform.

Taking a Coze bot as an example, you can add GeekAI as a plugin to provide a low-cost model proxy service for the bot's conversations.

In the plugin parameters, fill in your GeekAI API KEY for authentication and the model name to choose which conversation model is used. You can also configure the temperature parameter to tune the bot's response style.

Beyond bots, you can reference the GeekAI plugin from workflows to implement more complex business logic. As with bots, you need to configure the authentication parameters and the model information.

In addition, to process input and output messages correctly, use a code node to shape the input message data structure, and read the AI response message from the workflow's end node.

Once the code node and end node are in place, the workflow configuration is complete, and the Coze bot's conversations are served through the GeekAI model proxy service.