Using LangGraph.js SDK to create Agents



This content originally appeared on DEV Community and was authored by Ramandeep Singh

To use the LangGraph.js SDK in your application, you need to meet several prerequisites for proper setup, integration, and functionality. Below is a comprehensive list of the technical and knowledge-based requirements for building AI agents with LangGraph.js:

Technical Prerequisites

  1. Node.js Installation

    • Requirement: Node.js version 18 or higher.
    • Reason: LangGraph.js is a JavaScript-based framework, and the SDK requires a Node.js runtime environment to execute.
    • Action: Install Node.js from nodejs.org or verify your version with node -v. Ensure npm (Node Package Manager) is also installed for dependency management.
  2. Project Initialization

    • Requirement: A Node.js project initialized with a package.json file.
    • Action: Create a new project directory and run npm init -y to generate a package.json file. This file will manage your project dependencies.
  3. Required Dependencies

    • Core Packages:
      • @langchain/langgraph: The main LangGraph.js library for building and managing AI agent workflows.
      • @langchain/core: Provides core functionality for LangChain integrations, including message handling and state management.
      • @langchain/community (optional): For additional tools like search APIs (e.g., Tavily, SerpAPI) or integrations with services like MongoDB or Google APIs.
      • dotenv: To manage environment variables securely (e.g., API keys).
    • Optional Packages (depending on use case):
      • @langchain/openai or @langchain/anthropic: For integrating with specific LLM providers like OpenAI or Anthropic.
      • @langchain/mongodb and @langchain/langgraph-checkpoint-mongodb: For persistent state storage with MongoDB.
      • @wxflows/sdk@beta: For integrating with IBM’s watsonx.ai flows engine (if used for tool calling).
      • typescript: Recommended for type safety and better code quality, especially for production applications.
    • Action: Install dependencies using npm. For example:
     npm install @langchain/langgraph @langchain/core @langchain/community dotenv typescript
    

    If using TypeScript, initialize with npx tsc --init to create a tsconfig.json file.

  4. Environment Configuration

    • Requirement: A .env file to store sensitive information like API keys and endpoints.
    • Action: Create a .env file in your project root and add necessary environment variables. Example:
     OPENAI_API_KEY=your-openai-api-key
     ANTHROPIC_API_KEY=your-anthropic-api-key
     MONGODB_ATLAS_URI=your-mongodb-atlas-connection-string
     WXFLOWS_APIKEY=your-wxflows-apikey
     WXFLOWS_ENDPOINT=your-wxflows-endpoint
    

    Use the dotenv package to load these variables in your application.

  5. LLM Provider Account and API Key

    • Requirement: Access to a Large Language Model (LLM) provider such as OpenAI, Anthropic, or IBM watsonx.ai.
    • Reason: LangGraph.js agents rely on LLMs for reasoning and decision-making.
    • Action:
      • For OpenAI: Sign up at openai.com and obtain an API key.
      • For Anthropic: Sign up at anthropic.com and obtain an API key.
      • For IBM watsonx.ai: Create an account and obtain credentials for models and the flows engine (if used).
      • Alternatively, you can use local LLMs via platforms like Ollama.
  6. Optional: Database for State Persistence

    • Requirement: A database like MongoDB for persistent state management (e.g., conversation history or checkpoints).
    • Action: Set up a MongoDB Atlas account and obtain a connection string, or use a local MongoDB instance. Install @langchain/mongodb and @langchain/langgraph-checkpoint-mongodb for integration. Example:
     import { MongoClient } from "mongodb";
     const client = new MongoClient(process.env.MONGODB_ATLAS_URI);
    

  7. Optional: External Tool APIs

    • Requirement: API keys for external tools like Tavily, SerpAPI, Google Books, or Wikipedia, if your agent needs to call external services.
    • Action: Sign up for the relevant service (e.g., tavily.com for search) and add the API key to your .env file. Example:
     TAVILY_API_KEY=your-tavily-api-key
    

  8. Optional: LangGraph Server for Production

    • Requirement: For production-ready applications, you may need to set up a LangGraph Server (locally or via LangGraph Cloud).
    • Action: Install the LangGraph CLI for local development:
     npm install -g @langchain/langgraph-cli
    

    Configure a langgraph.json file to define your agent and dependencies. Example:

     {
       "node_version": "18",
       "graphs": {
         "agent": "./src/lib/agent.ts:agent"
       },
       "env": ".env",
       "dependencies": ["."]
     }
    

    Run the server with npx @langchain/langgraph-cli dev --port 54367.

  9. Optional: Frontend Framework (e.g., Next.js)

    • Requirement: If integrating LangGraph.js with a web application, a frontend framework like Next.js is recommended for seamless UI integration.
    • Action: Set up a Next.js project with:
     npx create-next-app@latest
    

    Install additional dependencies like langgraph-nextjs-api-passthrough for API routing.

Knowledge-Based Prerequisites

  1. Basic Understanding of JavaScript/TypeScript

    • You should be familiar with JavaScript (ES6+) or TypeScript for writing and managing agent logic. Knowledge of async/await is particularly important for handling asynchronous API calls and workflows.
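Because every agent call returns a Promise, the basic pattern looks like this (a toy sketch with the network call stubbed out):

```typescript
// agent.invoke(...) would go where the stub is; the await/return shape is the same.
async function ask(question: string): Promise<string> {
  const answer = await Promise.resolve(`echo: ${question}`); // stub for an LLM call
  return answer;
}

ask("hello").then((a) => console.log(a)); // → "echo: hello"
```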
  2. Familiarity with Large Language Models (LLMs)

    • Understand how LLMs work and how to interact with them via APIs. Knowledge of prompt engineering and tool-calling concepts is beneficial for defining agent behavior.
  3. Understanding of LangChain Ecosystem

    • LangGraph.js builds on LangChain.js, so familiarity with LangChain concepts (e.g., tools, agents, memory) is helpful. Review the LangChain.js documentation for an overview.
  4. Graph-Based Workflow Concepts

    • Understand the basics of graph-based architectures, including nodes, edges, and state management. LangGraph.js uses a StateGraph to define agent workflows, so knowing how to structure these is key.
  5. Optional: Familiarity with API Development

    • If integrating LangGraph.js with a frontend or external services, knowledge of API development (e.g., using FastAPI or Next.js API routes) is useful for exposing agent functionality.

Additional Considerations

  • LangGraph.js Configuration File:

    • For deployment to LangGraph Platform or self-hosting, create a langgraph.json file to specify project dependencies and configurations (see the example in the "LangGraph Server for Production" section above).


  • Error Handling and Debugging:

    • Be prepared to implement error handling for robust agents. Use tools like LangGraph Studio for visualizing and debugging graph workflows.
  • Scalability and Performance:

    • For production applications, consider modular architectures and optimize for performance to avoid bottlenecks. LangGraph.js supports streaming for faster responses, which is useful for real-time applications.
  • Community and Documentation:

    • Leverage the LangChain community and official documentation (langchain-ai.lang.chat) for additional resources, examples, and troubleshooting.

Example Setup Workflow

  1. Initialize a Node.js project:
   mkdir my-langgraph-app
   cd my-langgraph-app
   npm init -y
  2. Install core dependencies:
   npm install @langchain/langgraph @langchain/core @langchain/community dotenv typescript
   npx tsc --init
  3. Create a .env file with your API keys:
   OPENAI_API_KEY=your-openai-api-key
  4. Write a basic LangGraph.js agent (e.g., index.ts):
   import { createReactAgent } from "@langchain/langgraph/prebuilt";
   import { ChatOpenAI } from "@langchain/openai";
   import { MemorySaver } from "@langchain/langgraph";

   const llm = new ChatOpenAI({ model: "gpt-4o", temperature: 0 });
   const checkpointer = new MemorySaver();
   // createReactAgent requires a tools array; pass [] if the agent has no tools yet.
   const agent = createReactAgent({ llm, tools: [], checkpointSaver: checkpointer });

   async function runAgent() {
     const result = await agent.invoke(
       { messages: [{ role: "user", content: "Hello, how can you assist me?" }] },
       // A thread_id is required whenever a checkpointer is configured.
       { configurable: { thread_id: "demo-thread" } }
     );
     console.log(result);
   }

   runAgent();
  5. Run the application:
   npx ts-node index.ts

Summary

To use the LangGraph.js SDK in your application, ensure you have:

  • Node.js (18+) and npm installed.
  • A Node.js project with required dependencies (@langchain/langgraph, @langchain/core, etc.).
  • API keys for an LLM provider (e.g., OpenAI, Anthropic).
  • A .env file for environment variables.
  • Optional: MongoDB for persistence, external tool APIs, and a frontend framework like Next.js.
  • Basic knowledge of JavaScript/TypeScript, LLMs, and graph-based workflows.

For further guidance, refer to the LangGraph.js Quickstart or explore example repositories like those mentioned in the LangChain community posts.

If you need help with a specific setup step or example code, let me know!

