LLMs
ChatCohere
The ChatCohere node is a wrapper around Cohere's Chat endpoint, integrating Cohere's chat models into a workflow.
Node Details
- Name: ChatCohere
- Type: ChatCohere
- Version: 1.0
- Category: Chat Models
Base Classes
- ChatCohere
- BaseChatModel
- BaseLanguageModel
- Runnable
Parameters
Credential (Required)
- Type: credential
- Name: cohereApi
- Description: API key for accessing Cohere’s services
Inputs
- Cache (optional)
  - Type: BaseCache
  - Description: Caching mechanism for storing and retrieving chat responses
- Model Name
  - Type: asyncOptions
  - Default: command-r
  - Description: The specific Cohere chat model to use
  - Note: Available models are loaded dynamically via the listModels method
- Temperature (optional)
  - Type: number
  - Default: 0.7
  - Step: 0.1
  - Description: Controls the randomness of the model's output
- Streaming (not visible in the UI, but used during initialization)
  - Type: boolean
  - Default: true
  - Description: Whether to use streaming mode for chat responses
Initialization
The node initializes a ChatCohere instance with the following parameters:
- Model name
- Cohere API key
- Temperature (if provided)
- Streaming option
- Cache (if provided)
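A minimal sketch of this initialization logic is shown below. The interface and helper function are illustrative, not the node's actual source; the field names follow the parameters listed above, and the optional-field handling assumes the node only sets temperature and cache when values are provided:

```typescript
// Hypothetical sketch of how the node could assemble ChatCohere
// constructor options from its inputs. Field names mirror the
// documented parameters; this is not the actual implementation.
interface NodeInputs {
  modelName: string;
  cohereApiKey: string;
  temperature?: string; // UI number inputs typically arrive as strings
  streaming?: boolean;
  cache?: unknown;
}

function buildChatCohereFields(inputs: NodeInputs): Record<string, unknown> {
  const fields: Record<string, unknown> = {
    model: inputs.modelName,
    apiKey: inputs.cohereApiKey,
    // Streaming defaults to true when not specified
    streaming: inputs.streaming ?? true,
  };
  // Optional parameters are only included when provided
  if (inputs.temperature) fields.temperature = parseFloat(inputs.temperature);
  if (inputs.cache) fields.cache = inputs.cache;
  return fields;
}

const fields = buildChatCohereFields({
  modelName: "command-r",
  cohereApiKey: "YOUR_COHERE_API_KEY",
  temperature: "0.7",
});
console.log(fields);
```

The resulting object would then be passed to the ChatCohere constructor; omitting unset optional fields lets the underlying library fall back to its own defaults.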
Usage
This node is used to integrate Cohere’s chat models into a larger workflow or application. It allows for customization of the chat model’s behavior through temperature settings and model selection. The node can be used for various natural language processing tasks such as:
- Generating human-like text responses
- Answering questions
- Assisting with writing and editing tasks
- Language translation
- And more, depending on the capabilities of the selected Cohere model
Note
The actual capabilities and performance of the chat model will depend on the specific Cohere model selected and how it’s used within the larger application context.