Node Details

  • Name: seqStart

  • Label: Start

  • Version: 2.0

  • Type: Start

  • Category: Sequential Agents

  • Base Class: Start

Parameters

Required

  1. Chat Model

    • Type: BaseChatModel

    • Description: The language model used for the conversation. Only models capable of function calling are compatible, such as ChatOpenAI, ChatMistral, ChatAnthropic, ChatGoogleGenerativeAI, ChatVertexAI, and GroqChat.
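
As an illustration, a function-calling-capable model could be configured like this (a hedged sketch; the model name and settings are examples only, not Flowise defaults):

```typescript
import { ChatOpenAI } from '@langchain/openai'

// Any function-calling-capable chat model can be connected here;
// gpt-4o-mini and temperature 0 are illustrative values.
const chatModel = new ChatOpenAI({
  model: 'gpt-4o-mini',
  temperature: 0,
})
```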

Optional

  1. Agent Memory

    • Type: BaseCheckpointSaver

    • Description: Used to save the state of the agent, allowing for persistence across sessions or restarts.

  2. State

    • Type: State

    • Description: An object that is updated by the nodes in the graph and passed from one node to the next. By default, it contains a “messages” key that is updated with each message sent and received (see the sketch after this list).

  3. Input Moderation

    • Type: Moderation

    • Description: Detects text that could generate harmful output and prevents it from being sent to the language model. Multiple moderation checks can be chained.
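
A hedged sketch of the default state shape, assuming a LangGraph-style “messages” channel; the actual Flowise implementation may differ in detail:

```typescript
import { BaseMessage } from '@langchain/core/messages'

// Default state carried from node to node: a running list of messages.
interface SeqAgentState {
  messages: BaseMessage[]
}

// New messages produced by a node are appended to the existing history.
const updateMessages = (
  existing: BaseMessage[],
  incoming: BaseMessage[]
): BaseMessage[] => existing.concat(incoming)
```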

Initialization

The init method of this node:

  1. Retrieves the input moderation settings (if any)

  2. Gets the specified chat model

  3. Creates and returns an ISeqAgentNode object with:

    • Node identification (id, name, label, type)

    • The START constant as output

    • The language model (both as llm and startLLM)

    • Moderation settings

    • Checkpoint memory (if specified)
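
Putting the list above together, here is a hedged TypeScript sketch of the object init returns; ISeqAgentNode is a Flowise interface, and the exact property names beyond those listed above are assumptions for illustration:

```typescript
import { START } from '@langchain/langgraph'
import type { BaseChatModel } from '@langchain/core/language_models/chat_models'

// Approximate shape of the object returned by init; the real ISeqAgentNode
// interface in Flowise may use different or additional property names.
interface StartNodeResult {
  id: string
  name: string            // 'seqStart'
  label: string           // 'Start'
  type: string            // 'Start'
  output: typeof START    // the START constant marking the graph entry point
  llm: BaseChatModel      // the configured chat model
  startLLM: BaseChatModel // the same model, exposed separately to later nodes
  moderations?: unknown[]    // input moderation checks, if any
  checkpointMemory?: unknown // agent memory / checkpoint saver, if specified
}
```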

Usage

This node is typically used as the first node in a sequential agent graph. It sets up the initial state and components that will be used throughout the conversation flow. The output from this node can be connected to subsequent nodes in the graph to define the agent’s behavior and decision-making process.
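
Under the hood, this corresponds to wiring the graph entry point to the first agent node. A minimal LangGraph-style sketch (node names and the agent body are illustrative, not Flowise internals):

```typescript
import { StateGraph, Annotation, START, END } from '@langchain/langgraph'
import { BaseMessage, AIMessage } from '@langchain/core/messages'

// State with a single "messages" channel, appended to on every update.
const StateAnnotation = Annotation.Root({
  messages: Annotation<BaseMessage[]>({
    reducer: (left, right) => left.concat(right),
    default: () => [],
  }),
})

const graph = new StateGraph(StateAnnotation)
  .addNode('agent', async (_state) => {
    // In Flowise, the chat model configured on the Start node is invoked here.
    return { messages: [new AIMessage('hello from the agent node')] }
  })
  .addEdge(START, 'agent') // the Start node's output feeds the first agent
  .addEdge('agent', END)
  .compile()

// Example invocation: await graph.invoke({ messages: [] })
```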