Start Node

As its name implies, the Start Node is the entry point for all workflows in the Sequential Agent architecture. It receives the initial user query, initializes the conversation State, and sets the flow in motion.

Understanding the Start Node

The Start Node ensures that our conversational workflows have the necessary setup and context to function correctly. It's responsible for setting up key functionalities that will be used throughout the rest of the workflow:

  • Defining the default LLM: The Start Node requires us to specify a Chat Model (LLM) that supports function calling, enabling agents in the workflow to interact with tools and external systems. This model is the default LLM used under the hood throughout the workflow.
  • Initializing Memory: We can optionally connect an Agent Memory Node to store and retrieve conversation history, enabling more context-aware responses.
  • Setting a custom State: By default, the State contains an immutable state.messages array, which acts as the transcript or history of the conversation between the user and the agents. The Start Node allows you to attach a custom State to the workflow by adding a State Node, enabling the storage of additional information relevant to your workflow (see the sketch after this list).
  • Enabling moderation: Optionally, we can connect Input Moderation to analyze the user's input and prevent potentially harmful content from being sent to the LLM.
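
To make the custom State idea concrete, here is a minimal sketch of what a custom State definition can look like. It follows the channel-style convention used by LangGraph-based workflows, where each field declares how an update merges with the current value and what its default is. The field names userIntent and orderItems are invented for illustration, and the exact syntax accepted by the State Node may differ in your version:

```ts
// A hypothetical custom State with two extra fields alongside the
// built-in state.messages transcript (names invented for this example).
const customState = {
  // Track what the user is trying to do; each update replaces the old value.
  userIntent: {
    value: (oldValue: string, newValue: string) => newValue,
    default: () => "",
  },
  // Accumulate items across turns; each update appends to the old value.
  orderItems: {
    value: (oldValue: string[], newValue: string[]) => oldValue.concat(newValue),
    default: () => [] as string[],
  },
};
```

Downstream Agent, LLM, and Condition nodes can then read and update these fields in addition to appending to state.messages.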

Inputs

| Input | Required | Description |
| --- | --- | --- |
| Chat Model | Yes | The default LLM that will power the conversation. Only compatible with models that are capable of function calling. |
| Agent Memory Node | No | Connect an Agent Memory Node to enable persistence and context preservation. |
| State Node | No | Connect a State Node to set a custom State, a shared context that can be accessed and modified by other nodes in the workflow. |
| Input Moderation | No | Connect a Moderation Node to filter content by detecting text that could generate harmful output, preventing it from being sent to the LLM. |

Outputs

The Start Node can connect to the following nodes as outputs:

  • Agent Node: Routes the conversation flow to an Agent Node, which can then execute actions or access tools based on the conversation's context.
  • LLM Node: Routes the conversation flow to an LLM Node for processing and response generation.
  • Condition Agent Node: Connects to a Condition Agent Node to implement branching logic based on the agent's evaluation of the conversation.
  • Condition Node: Connects to a Condition Node to implement branching logic based on predefined conditions.
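
Because the Sequential Agent architecture is built on LangGraph, the Start Node is conceptually the graph's entry edge. The sketch below uses the public @langchain/langgraph JS API purely for illustration (it is not Flowise's actual implementation, and the model choice is a placeholder); it shows the pieces described above in code form: a default function-calling LLM, a State holding the messages transcript, and an entry edge that routes the initial user query to an agent node:

```ts
import { StateGraph, MessagesAnnotation, START, END } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage } from "@langchain/core/messages";

// The workflow's default LLM, as defined on the Start Node.
// (Model choice is a placeholder; any function-calling-capable model works.)
const defaultModel = new ChatOpenAI({ model: "gpt-4o" });

// A minimal agent node: call the default LLM with the running transcript
// and append its reply to state.messages.
const agentNode = async (state: typeof MessagesAnnotation.State) => {
  const response = await defaultModel.invoke(state.messages);
  return { messages: [response] };
};

// MessagesAnnotation provides the built-in state.messages channel;
// the START -> "agent" edge is what the Start Node's output connection models.
const workflow = new StateGraph(MessagesAnnotation)
  .addNode("agent", agentNode)
  .addEdge(START, "agent")
  .addEdge("agent", END);

const app = workflow.compile();

// Receiving the initial user query and setting the flow in motion:
const result = await app.invoke({ messages: [new HumanMessage("Hello!")] });
```

Connecting a Condition or Condition Agent node instead of a fixed edge corresponds to LangGraph's addConditionalEdges, which picks the next node at runtime.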

Best Practices

Choose the right Chat Model

Ensure your selected LLM supports function calling, a key feature for enabling agent-tool interactions. Additionally, choose an LLM that aligns with the complexity and requirements of your application. You can override the default LLM by setting it at the Agent/LLM/Condition Agent node level when necessary.

Consider context and persistence

If your use case demands it, use an Agent Memory Node to maintain context and personalize interactions.