Node Details

  • Name: seqLLMNode

  • Type: LLMNode

  • Category: Sequential Agents

  • Version: 3.0

Parameters

  1. Name (string)

    • Label: Name

    • Description: A unique identifier for the LLM Node

    • Required: Yes

  2. System Prompt (string)

    • Label: System Prompt

    • Description: Initial prompt to set the context for the LLM

    • Optional: Yes

    • Additional Parameter: Yes

  3. Human Prompt (string)

    • Label: Human Prompt

    • Description: Prompt added at the end of the messages as a human message

    • Optional: Yes

    • Additional Parameter: Yes

  4. Messages History (code)

    • Label: Messages History

    • Description: List of messages between System Prompt and Human Prompt, useful for few-shot examples

    • Optional: Yes

    • Additional Parameter: Yes
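Since Messages History is a code-type parameter, it is typically supplied as a snippet that produces an array of messages. Below is a minimal sketch of a few-shot history; the role/content field names are an assumption and may differ between Flowise versions:

```javascript
// Hypothetical few-shot history for the Messages History parameter.
// The { role, content } message shape is an assumption; verify it
// against the Flowise version you are running.
const messagesHistory = [
    { role: "user", content: "Classify the sentiment: 'Great product!'" },
    { role: "assistant", content: "positive" },
    { role: "user", content: "Classify the sentiment: 'Arrived broken.'" },
    { role: "assistant", content: "negative" }
];
```

In the node's code field, the snippet would usually end by returning this array so the messages are inserted between the System Prompt and the Human Prompt.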

  5. Sequential Node (list)

    • Label: Start | Agent | Condition | LLM | Tool Node

    • Description: Predecessor nodes in the sequential workflow

    • Required: Yes

  6. Chat Model (BaseChatModel)

    • Label: Chat Model

    • Description: Override the default Chat Model for this node

    • Optional: Yes

  7. Format Prompt Values (json)

    • Label: Format Prompt Values

    • Description: Assign values to prompt variables; flow state can be referenced with $flow.state.variable-name

    • Optional: Yes

    • Additional Parameter: Yes
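For example, if the prompts contain variables such as {topic} and {tone}, Format Prompt Values could map them as follows. The variable names here are purely illustrative; only the $flow.state.variable-name accessor comes from the parameter description above:

```json
{
    "topic": "$flow.state.topic",
    "tone": "formal"
}
```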

  8. JSON Structured Output (datagrid)

    • Label: JSON Structured Output

    • Description: Instruct the LLM to return its output according to a structured JSON schema

    • Optional: Yes

    • Additional Parameter: Yes
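As an illustration, a structured-output schema with two keys would constrain the node to emit JSON shaped like the following (the key names are hypothetical, not part of the node's defaults):

```json
{
    "sentiment": "positive",
    "summary": "One-sentence summary of the input"
}
```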

  9. Update State (tabs)

    • Label: Update State

    • Description: Mechanism to update the state after LLM execution

    • Optional: Yes

    • Additional Parameter: Yes

    • Tabs:

      • Update State (Table): UI-based state update

      • Update State (Code): Code-based state update
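A minimal sketch of what the Update State (Code) tab might contain, assuming the $flow.output.content accessor and a custom state key named lastReply (both illustrative; verify against your Flowise version):

```javascript
// Hypothetical Update State (Code) snippet, run after the LLM executes.
// Stores this node's reply under a custom key in the shared workflow state.
return {
    lastReply: $flow.output.content
};
```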

Input

The LLM Node receives input from its predecessor nodes in the sequential workflow. This can include:

  • Previous agent outputs

  • Tool results

  • Condition evaluations

  • Initial input from the Start node

Output

The LLM Node produces an output that typically includes:

  • Content: The main response from the LLM