# LLM Node

The LLM Node is a core component of the Sequential Agents workflow, designed to run a Chat Model and return its output. It serves as an intermediary step in a sequence of agent actions, allowing for complex reasoning and decision-making processes.
## Node Details

- Name: `seqLLMNode`
- Type: LLMNode
- Category: Sequential Agents
- Version: 3.0
## Parameters

- **Name** (string)
  - Label: Name
  - Description: A unique identifier for the LLM Node
  - Required: Yes
- **System Prompt** (string)
  - Label: System Prompt
  - Description: Initial prompt to set the context for the LLM
  - Optional: Yes
  - Additional Parameter: Yes
- **Human Prompt** (string)
  - Label: Human Prompt
  - Description: Prompt appended to the end of the messages as a human message
  - Optional: Yes
  - Additional Parameter: Yes
- **Messages History** (code)
  - Label: Messages History
  - Description: List of messages inserted between the System Prompt and the Human Prompt; useful for providing few-shot examples
  - Optional: Yes
  - Additional Parameter: Yes
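The Messages History field expects code that returns an array of messages. A minimal sketch is shown below; it is pasted into the node's code field rather than run standalone, and the exact role/content message shape may vary by Chat Model, so treat the pair below as an illustrative assumption:

```javascript
// Hypothetical few-shot pair inserted between the System Prompt and Human Prompt
return [
    { "role": "user", "content": "Summarize: The meeting ran long." },   // example human turn
    { "role": "assistant", "content": "The meeting exceeded its scheduled time." } // example model turn
];
```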
- **Sequential Node** (list)
  - Label: Start | Agent | Condition | LLM | Tool Node
  - Description: Predecessor nodes in the sequential workflow
  - Required: Yes
- **Chat Model** (BaseChatModel)
  - Label: Chat Model
  - Description: Overrides the default Chat Model for this node
  - Optional: Yes
- **Format Prompt Values** (json)
  - Label: Format Prompt Values
  - Description: Assigns values to prompt variables; supports `$flow.state.variable-name` references
  - Optional: Yes
  - Additional Parameter: Yes
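As a sketch, assuming the prompts declare variables named `topic` and `tone` (hypothetical names), Format Prompt Values could map one variable to a literal and the other to a shared-state reference via the `$flow.state.variable-name` syntax:

```json
{
  "topic": "$flow.state.topic",
  "tone": "formal"
}
```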
- **JSON Structured Output** (datagrid)
  - Label: JSON Structured Output
  - Description: Instructs the LLM to return output conforming to a JSON structured schema
  - Optional: Yes
  - Additional Parameter: Yes
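For example, a schema whose rows define two keys (the key names and values below are illustrative assumptions, not part of this node's defaults) would constrain the node's output to a shape like:

```json
{
  "sentiment": "positive",
  "summary": "The customer is satisfied with the response time."
}
```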
- **Update State** (tabs)
  - Label: Update State
  - Description: Mechanism to update the state after LLM execution
  - Optional: Yes
  - Additional Parameter: Yes
  - Tabs:
    - Update State (Table): UI-based state update
    - Update State (Code): Code-based state update
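As a sketch of the code-based tab, assuming the node's reply is exposed as `$flow.output.content` (an assumption to verify against the `$flow` reference) and the custom state defines a key named `lastReply` (hypothetical), the code field returns an object whose keys are the state entries to update:

```javascript
// Hypothetical: copy this node's reply into the custom state key "lastReply"
return {
    "lastReply": $flow.output.content
};
```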
## Input

The LLM Node receives input from its predecessor nodes in the sequential workflow. This can include:

- Previous agent outputs
- Tool results
- Condition evaluations
- Initial input from the Start node
## Output

The LLM Node produces an output that typically includes:

- Content: The main response from the LLM