LLMs
GroqChat
The GroqChat node is a wrapper around the Groq API, which runs models on Groq’s LPU (Language Processing Unit) Inference Engine. It is designed to integrate Groq’s chat models into a larger system, such as a node-based workflow builder, for natural language processing tasks.
Node Details
- Name: groqChat
- Type: GroqChat
- Version: 3.0
- Category: Chat Models
Base Classes
- ChatGroq
- Additional base classes from the ChatGroq implementation
Parameters
Credential
- Type: credential
- Name: groqApi
- Description: API key for authenticating with Groq services
Inputs
Cache (optional)
- Type: BaseCache
- Description: Caching mechanism for storing and retrieving chat responses
Model Name
- Type: asyncOptions
- Load Method: listModels
- Placeholder: llama3-70b-8192
- Description: The specific Groq model to use for chat
Temperature (optional)
- Type: number
- Default: 0.9
- Step: 0.1
- Description: Controls the randomness of the model’s output. Higher values make the output more random, while lower values make it more deterministic
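The role of the optional Cache input can be illustrated with a minimal in-memory sketch. This is hypothetical, not the real BaseCache from LangChain, but it shows the lookup/update contract a response cache follows, and why the cache key should include the model settings:

```typescript
// Hypothetical minimal response cache illustrating the BaseCache input:
// identical prompts with identical settings return the stored response
// instead of triggering a new model call.
class InMemoryResponseCache {
  private store = new Map<string, string>();

  // Key on the prompt plus model settings, since a different model or
  // temperature should not reuse a cached answer.
  private key(prompt: string, modelName: string, temperature: number): string {
    return `${modelName}:${temperature}:${prompt}`;
  }

  lookup(prompt: string, modelName: string, temperature: number): string | undefined {
    return this.store.get(this.key(prompt, modelName, temperature));
  }

  update(prompt: string, modelName: string, temperature: number, response: string): void {
    this.store.set(this.key(prompt, modelName, temperature), response);
  }
}
```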
Methods
loadMethods
- listModels: Asynchronously fetches available chat models for Groq.
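A listModels load method ultimately maps a Groq models-list response (Groq exposes an OpenAI-compatible shape, `{ data: [{ id: ... }] }`) into options for the asyncOptions dropdown. The sketch below shows only that mapping step; the `{ label, name }` option shape is an assumption, not the exact type the host system uses:

```typescript
// Sketch of the mapping inside a listModels load method: convert an
// OpenAI-compatible models response into dropdown options.
// The { label, name } option shape is an assumption for illustration.
interface ModelsResponse {
  data: { id: string }[];
}

function toModelOptions(res: ModelsResponse): { label: string; name: string }[] {
  return res.data.map((m) => ({ label: m.id, name: m.id }));
}
```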
init
Initializes the GroqChat model with the following steps:
- Extracts input parameters (modelName, cache, temperature, streaming)
- Retrieves credential data for the Groq API key
- Constructs a ChatGroqInput object with the provided parameters
- Creates and returns a new ChatGroq instance
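The init steps above can be sketched as a function that gathers node inputs and credential data into a ChatGroq-style config object. The field names mirror common @langchain/groq constructor options (apiKey, model, temperature, streaming) but are an assumption here, and the credential lookup is simplified to a plain object:

```typescript
// Hedged sketch of the init flow, under the assumptions noted above:
// extract inputs, read the API key, and assemble the config that would
// be passed to the ChatGroq constructor.
interface GroqNodeInputs {
  modelName: string;
  temperature?: number;
  streaming?: boolean;
}

interface ChatGroqConfig {
  apiKey: string;
  model: string;
  temperature: number;
  streaming: boolean;
}

function buildChatGroqConfig(
  inputs: GroqNodeInputs,
  credentials: { groqApiKey: string } // stands in for the node's credential store
): ChatGroqConfig {
  return {
    apiKey: credentials.groqApiKey,
    model: inputs.modelName,
    temperature: inputs.temperature ?? 0.9, // node default from the Temperature input
    streaming: inputs.streaming ?? true,
  };
}
```

A real implementation would pass the resulting object to `new ChatGroq(...)` from the @langchain/groq package rather than returning it directly.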
Usage
This node is used to integrate Groq’s chat models into a larger system. It allows users to specify the model, adjust parameters like temperature, and optionally use caching. The node handles the authentication and initialization of the Groq chat model, making it easy to use within a broader application or workflow.
Key Features
- Asynchronous model listing
- Temperature control for output randomness
- Optional caching support
- Streaming support
- Credential management for API key
Notes
- The node uses the @langchain/groq package for the core functionality.
- It integrates with a broader system, likely a node-based workflow or application builder.
- The node supports dynamic loading of available models.
- It handles credential management, allowing for secure API key usage.