Node Details

  • Name: chatOpenAICustom
  • Type: ChatOpenAI-Custom
  • Version: 3.0
  • Category: Chat Models

Base Classes

  • ChatOpenAI-Custom
  • [All base classes of ChatOpenAI]

Parameters

Credential

  • Type: credential
  • Name: openAIApi
  • Description: Connection credentials for the OpenAI API

Inputs

  1. Cache (optional)

    • Type: BaseCache
    • Description: Caching mechanism for the chat model
  2. Model Name

    • Type: string
    • Placeholder: ft:gpt-3.5-turbo:my-org:custom_suffix:id
    • Description: The identifier of the custom or fine-tuned model
  3. Temperature (optional)

    • Type: number
    • Default: 0.9
    • Step: 0.1
    • Description: Controls randomness in output generation
  4. Max Tokens (optional)

    • Type: number
    • Step: 1
    • Description: Maximum number of tokens to generate
  5. Top Probability (optional)

    • Type: number
    • Step: 0.1
    • Description: Nucleus sampling (top-p) parameter
  6. Frequency Penalty (optional)

    • Type: number
    • Step: 0.1
    • Description: Penalizes frequent token usage
  7. Presence Penalty (optional)

    • Type: number
    • Step: 0.1
    • Description: Encourages the model to talk about new topics
  8. Timeout (optional)

    • Type: number
    • Step: 1
    • Description: Maximum time (in milliseconds) to wait for a response
  9. BasePath (optional)

    • Type: string
    • Description: Custom base URL for the API
  10. BaseOptions (optional)

    • Type: json
    • Description: Additional configuration options for the API client
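BaseOptions takes a JSON object that is forwarded to the underlying API client. As an illustrative sketch (the exact field names accepted depend on the client library version), it might be used to attach custom request headers:

```json
{
  "defaultHeaders": {
    "X-Request-Source": "my-workflow"
  }
}
```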

Functionality

The node initializes a ChatOpenAI instance with the provided configuration. It handles parameter parsing, credential management, and optional configurations like caching and custom API endpoints. The resulting chat model can be used in workflows that require interaction with custom OpenAI models.
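The parameter-parsing step described above can be sketched as follows. This is a hypothetical illustration, not the actual node source: the function name, input keys, and config shape are assumptions, but it shows the general pattern of converting raw string inputs into a typed configuration, applying the documented default temperature of 0.9, and merging BasePath and BaseOptions into the client configuration.

```typescript
// Illustrative sketch of the node's configuration assembly.
// Names (buildChatOpenAIConfig, input keys) are hypothetical.
interface ChatOpenAIConfig {
  modelName: string;
  temperature: number;
  openAIApiKey: string;
  maxTokens?: number;
  topP?: number;
  frequencyPenalty?: number;
  presencePenalty?: number;
  timeout?: number;
  configuration?: Record<string, unknown>;
}

function buildChatOpenAIConfig(
  inputs: Record<string, string>,
  apiKey: string
): ChatOpenAIConfig {
  const config: ChatOpenAIConfig = {
    modelName: inputs.modelName,
    // Default temperature is 0.9 when the input is left blank
    temperature: inputs.temperature ? parseFloat(inputs.temperature) : 0.9,
    openAIApiKey: apiKey,
  };
  // Optional numeric parameters are parsed only when supplied
  if (inputs.maxTokens) config.maxTokens = parseInt(inputs.maxTokens, 10);
  if (inputs.topP) config.topP = parseFloat(inputs.topP);
  if (inputs.frequencyPenalty)
    config.frequencyPenalty = parseFloat(inputs.frequencyPenalty);
  if (inputs.presencePenalty)
    config.presencePenalty = parseFloat(inputs.presencePenalty);
  if (inputs.timeout) config.timeout = parseInt(inputs.timeout, 10);
  // BasePath and BaseOptions both feed the underlying API client config
  if (inputs.basePath || inputs.baseOptions) {
    config.configuration = {
      ...(inputs.basePath ? { baseURL: inputs.basePath } : {}),
      ...(inputs.baseOptions ? JSON.parse(inputs.baseOptions) : {}),
    };
  }
  return config;
}
```

The resulting object would then be handed to the ChatOpenAI constructor, with the optional cache attached separately.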

Use Cases

  • Integrating domain-specific fine-tuned models into chat workflows
  • Customizing API behavior for OpenAI-compatible endpoints
  • Experimenting with different model parameters for optimal performance

Notes

  • Ensure that the model name correctly corresponds to your custom or fine-tuned model in the OpenAI platform.
  • The node supports both required parameters (like model name) and additional parameters for fine-tuning the model’s behavior.
  • Proper API credentials must be configured for successful operation.