ChatOpenAI Custom
The ChatOpenAI Custom node is a specialized component designed for integrating custom or fine-tuned OpenAI chat models into a workflow. It provides a flexible interface to configure and use OpenAI-compatible chat models, particularly those that have been custom-trained or fine-tuned.
Node Details
- Name: chatOpenAICustom
- Type: ChatOpenAI-Custom
- Version: 3.0
- Category: Chat Models
Base Classes
- ChatOpenAI-Custom
- [All base classes of ChatOpenAI]
Parameters
Credential
- Type: credential
- Name: openAIApi
- Description: Connection credentials for the OpenAI API
Inputs
Cache (optional)
- Type: BaseCache
- Description: Caching mechanism for the chat model
Model Name
- Type: string
- Placeholder: ft:gpt-3.5-turbo:my-org:custom_suffix:id
- Description: The identifier of the custom or fine-tuned model
Temperature (optional)
- Type: number
- Default: 0.9
- Step: 0.1
- Description: Controls randomness in output generation
Max Tokens (optional)
- Type: number
- Step: 1
- Description: Maximum number of tokens to generate
Top Probability (optional)
- Type: number
- Step: 0.1
- Description: Nucleus sampling parameter
Frequency Penalty (optional)
- Type: number
- Step: 0.1
- Description: Penalizes frequent token usage
Presence Penalty (optional)
- Type: number
- Step: 0.1
- Description: Encourages the model to talk about new topics
Timeout (optional)
- Type: number
- Step: 1
- Description: Maximum time (in milliseconds) to wait for a response
BasePath (optional)
- Type: string
- Description: Custom base URL for the API
BaseOptions (optional)
- Type: json
- Description: Additional configuration options for the API client
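To illustrate the two string-shaped inputs above: fine-tuned model identifiers follow OpenAI's `ft:<base-model>:<org>:<suffix>:<id>` shape (as in the placeholder), and BaseOptions accepts a JSON object. A minimal validation sketch is shown below; the helper names are hypothetical and not part of the node itself:

```python
import json
import re

# Fine-tuned model IDs look like: ft:gpt-3.5-turbo:my-org:custom_suffix:id
# (the org and suffix segments may be empty for some accounts)
FT_MODEL_PATTERN = re.compile(r"^ft:[^:]+:[^:]*:[^:]*:[^:]+$")

def is_finetuned_model_name(name: str) -> bool:
    """Return True if `name` matches the ft:<base>:<org>:<suffix>:<id> shape."""
    return bool(FT_MODEL_PATTERN.match(name))

def parse_base_options(raw: str) -> dict:
    """Parse the BaseOptions JSON field; an empty field yields no extra options."""
    if not raw.strip():
        return {}
    options = json.loads(raw)
    if not isinstance(options, dict):
        raise ValueError("BaseOptions must be a JSON object")
    return options
```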
Functionality
The node initializes a ChatOpenAI instance with the provided configuration. It handles parameter parsing, credential management, and optional configurations like caching and custom API endpoints. The resulting chat model can be used in workflows that require interaction with custom OpenAI models.
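The parameter-parsing step can be pictured as follows. This is a simplified, stdlib-only sketch, not the node's actual implementation: `build_chat_config` is hypothetical, and the field names merely mirror the parameters documented above.

```python
def build_chat_config(inputs: dict, api_key: str) -> dict:
    """Assemble a ChatOpenAI-style configuration from raw node inputs.

    Numeric fields typically arrive as strings from the UI and are coerced;
    optional fields left blank are simply omitted from the config.
    """
    config = {
        "modelName": inputs["modelName"],   # required
        "openAIApiKey": api_key,            # from the openAIApi credential
        "temperature": float(inputs.get("temperature", "0.9")),
    }
    # Optional numeric parameters: include only the ones the user set.
    for key, cast in [("maxTokens", int), ("topP", float),
                      ("frequencyPenalty", float), ("presencePenalty", float),
                      ("timeout", int)]:
        value = inputs.get(key)
        if value not in (None, ""):
            config[key] = cast(value)
    # A custom BasePath redirects calls to an OpenAI-compatible endpoint.
    if inputs.get("basePath"):
        config["configuration"] = {"baseURL": inputs["basePath"]}
    return config
```

For example, an input set with only a model name, a max-token limit, and a custom base path would yield a config that inherits the default temperature and omits the unset penalties.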
Use Cases
- Integrating domain-specific fine-tuned models into chat workflows
- Customizing API behavior for OpenAI-compatible endpoints
- Experimenting with different model parameters for optimal performance
Notes
- Ensure that the model name correctly corresponds to your custom or fine-tuned model in the OpenAI platform.
- The node supports both required parameters (like model name) and additional parameters for fine-tuning the model’s behavior.
- Proper API credentials must be configured for successful operation.