This content originally appeared on DEV Community and was authored by Ramandeep Singh
To use the LangGraph.js SDK in your application, you need to meet several prerequisites to ensure proper setup, integration, and functionality. Below is a comprehensive list of the technical and knowledge-based requirements, based on the best practices and information available for building AI agents with LangGraph.js:
Technical Prerequisites
- Node.js Installation
  - Requirement: Node.js version 18 or higher.
  - Reason: LangGraph.js is a JavaScript-based framework, and the SDK requires a Node.js runtime environment to execute.
  - Action: Install Node.js from nodejs.org or verify your version with `node -v`. Ensure npm (Node Package Manager) is also installed for dependency management.
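If you want your application to fail fast on an unsupported runtime, a guard along these lines can help. This is a minimal sketch: the helper name is illustrative, and the 18+ threshold simply mirrors the requirement above.

```typescript
// Minimal runtime guard mirroring the Node.js 18+ requirement above.
function isSupportedNodeVersion(version: string, minMajor = 18): boolean {
  // process.version looks like "v18.19.0"; compare the major component.
  const major = Number(version.replace(/^v/, "").split(".")[0]);
  return Number.isInteger(major) && major >= minMajor;
}

console.log(`Running on Node ${process.version}; supported: ${isSupportedNodeVersion(process.version)}`);
```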
- Project Initialization
  - Action: Create a new Node.js project (for example, with `npm init -y`) so dependencies can be installed and tracked in `package.json`.
- Required Dependencies
  - Core Packages:
    - `@langchain/langgraph`: The main LangGraph.js library for building and managing AI agent workflows.
    - `@langchain/core`: Provides core functionality for LangChain integrations, including message handling and state management.
    - `@langchain/community` (optional): For additional tools like search APIs (e.g., Tavily, SerpAPI) or integrations with services like MongoDB or Google APIs.
    - `dotenv`: To manage environment variables securely (e.g., API keys).
  - Optional Packages (depending on use case):
    - `@langchain/openai` or `@langchain/anthropic`: For integrating with specific LLM providers like OpenAI or Anthropic.
    - `@langchain/mongodb` and `@langchain/langgraph-checkpoint-mongodb`: For persistent state storage with MongoDB.
    - `@wxflows/sdk@beta`: For integrating with IBM’s watsonx.ai flows engine (if used for tool calling).
    - `typescript`: Recommended for type safety and better code quality, especially for production applications.
  - Action: Install dependencies using npm. For example:

    ```shell
    npm install @langchain/langgraph @langchain/core @langchain/community dotenv typescript
    ```

    If using TypeScript, run `npx tsc --init` to create a `tsconfig.json` file.
- Environment Configuration
  - Requirement: A `.env` file to store sensitive information like API keys and endpoints.
  - Action: Create a `.env` file in your project root and add the necessary environment variables. Example:

    ```
    OPENAI_API_KEY=your-openai-api-key
    ANTHROPIC_API_KEY=your-anthropic-api-key
    MONGODB_ATLAS_URI=your-mongodb-atlas-connection-string
    WXFLOWS_APIKEY=your-wxflows-apikey
    WXFLOWS_ENDPOINT=your-wxflows-endpoint
    ```

    Use the `dotenv` package to load these variables in your application.
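Under the hood, `dotenv` essentially parses `KEY=value` lines into `process.env`. The sketch below is illustrative only, not the real dotenv implementation, and reuses the variable names from the example above to show the idea:

```typescript
// Illustrative only: a tiny KEY=value parser in the spirit of dotenv.
function parseEnv(contents: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of contents.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue; // skip blanks and comments
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue; // ignore malformed lines
    vars[trimmed.slice(0, eq)] = trimmed.slice(eq + 1);
  }
  return vars;
}

const sample = "OPENAI_API_KEY=your-openai-api-key\n# comment\nTAVILY_API_KEY=your-tavily-api-key";
console.log(parseEnv(sample).OPENAI_API_KEY); // → your-openai-api-key
```

In a real application, skip the hand-rolled parser and simply put `import "dotenv/config";` at the top of your entry file.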
- LLM Provider Account and API Key
  - Requirement: Access to a Large Language Model (LLM) provider such as OpenAI, Anthropic, or IBM watsonx.ai.
  - Reason: LangGraph.js agents rely on LLMs for reasoning and decision-making.
  - Action:
    - For OpenAI: Sign up at openai.com and obtain an API key.
    - For Anthropic: Sign up at anthropic.com and obtain an API key.
    - For IBM watsonx.ai: Create an account and obtain credentials for models and the flows engine (if used).
    - Alternatively, you can use local LLMs via platforms like Ollama.
- Optional: Database for State Persistence
  - Requirement: A database like MongoDB for persistent state management (e.g., conversation history or checkpoints).
  - Action: Set up a MongoDB Atlas account and obtain a connection string, or use a local MongoDB instance. Install `@langchain/mongodb` and `@langchain/langgraph-checkpoint-mongodb` for integration. Example:

    ```typescript
    import { MongoClient } from "mongodb";

    // MONGODB_ATLAS_URI must be set in your .env file (see above).
    const client = new MongoClient(process.env.MONGODB_ATLAS_URI!);
    ```
- Optional: External Tool APIs
  - Requirement: API keys for external tools like Tavily, SerpAPI, Google Books, or Wikipedia, if your agent needs to call external services.
  - Action: Sign up for the relevant service (e.g., tavily.com for search) and add the API key to your `.env` file. Example:

    ```
    TAVILY_API_KEY=your-tavily-api-key
    ```
- Optional: LangGraph Server for Production
  - Requirement: For production-ready applications, you may need to set up a LangGraph Server (locally or via LangGraph Cloud).
  - Action: Install the LangGraph CLI for local development:

    ```shell
    npm install -g @langchain/langgraph-cli
    ```

    Configure a `langgraph.json` file to define your agent and dependencies. Example:

    ```json
    {
      "node_version": "18",
      "graphs": { "agent": "./src/lib/agent.ts:agent" },
      "env": ".env",
      "dependencies": ["."]
    }
    ```

    Run the server with `npx @langchain/langgraph-cli dev --port 54367`.
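Each entry in `graphs` points at a file and an exported graph in `path:export` form. A small, hypothetical sanity check for that shape is sketched below; the interface fields mirror the example config above, but this validator is not part of the LangGraph CLI:

```typescript
// Hypothetical sanity check for the langgraph.json fields shown above.
interface LangGraphConfig {
  node_version: string;
  graphs: Record<string, string>; // graph name -> "path/to/file.ts:exportName"
  env: string;
  dependencies: string[];
}

// Returns the names of graphs whose spec is not in "<file>:<export>" form.
function invalidGraphSpecs(config: LangGraphConfig): string[] {
  return Object.entries(config.graphs)
    .filter(([, spec]) => !/^.+:.+$/.test(spec))
    .map(([name]) => name);
}

const config: LangGraphConfig = {
  node_version: "18",
  graphs: { agent: "./src/lib/agent.ts:agent" },
  env: ".env",
  dependencies: ["."],
};
console.log(invalidGraphSpecs(config)); // → []
```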
- Optional: Frontend Framework (e.g., Next.js)
  - Requirement: If integrating LangGraph.js with a web application, a frontend framework like Next.js is recommended for seamless UI integration.
  - Action: Set up a Next.js project with:

    ```shell
    npx create-next-app@latest
    ```

    Install additional dependencies like `langgraph-nextjs-api-passthrough` for API routing.
Knowledge-Based Prerequisites
- Basic Understanding of JavaScript/TypeScript
- Familiarity with Large Language Models (LLMs)
- Understanding of the LangChain Ecosystem: LangGraph.js builds on LangChain.js, so familiarity with LangChain concepts (e.g., tools, agents, memory) is helpful. Review the LangChain.js documentation for an overview.
- Graph-Based Workflow Concepts
- Optional: Familiarity with API Development
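The "graph-based workflow" idea can be sketched without any library: nodes are functions that update a shared state, and edges decide which node runs next. The node names below are illustrative, loosely mirroring the agent/tools loop LangGraph agents use:

```typescript
// Library-free sketch of a graph workflow: nodes transform shared state,
// and a routing function plays the role of edges.
type State = { messages: string[] };
type GraphNode = (s: State) => State;

const nodes: Record<string, GraphNode> = {
  agent: (s) => ({ messages: [...s.messages, "agent: thinking"] }),
  tools: (s) => ({ messages: [...s.messages, "tools: result"] }),
};

// Edges: the agent calls tools once, reads the result, then finishes.
function route(current: string, s: State): string | null {
  if (current === "agent" && !s.messages.some((m) => m.startsWith("tools"))) return "tools";
  if (current === "tools") return "agent";
  return null; // END
}

function run(start: string, state: State, maxSteps = 10): State {
  let current: string | null = start;
  for (let i = 0; current && i < maxSteps; i++) {
    state = nodes[current](state);
    current = route(current, state);
  }
  return state;
}

console.log(run("agent", { messages: [] }).messages);
// → [ 'agent: thinking', 'tools: result', 'agent: thinking' ]
```

LangGraph.js formalizes exactly these pieces: a state schema, node functions, and conditional edges, plus checkpointing of the state between steps.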
Additional Considerations
- LangGraph.js Configuration File: For deployment to LangGraph Platform or self-hosting, create a `langgraph.json` file to specify project dependencies and configurations. Example:

  ```json
  {
    "node_version": "18",
    "graphs": { "agent": "./src/index.ts:agent" },
    "env": ".env",
    "dependencies": ["."]
  }
  ```

- Error Handling and Debugging
- Scalability and Performance
- Community and Documentation: Leverage the LangChain community and official documentation (langchain-ai.lang.chat) for additional resources, examples, and troubleshooting.
Example Setup Workflow
- Initialize a Node.js project:

  ```shell
  mkdir my-langgraph-app
  cd my-langgraph-app
  npm init -y
  ```

- Install core dependencies:

  ```shell
  npm install @langchain/langgraph @langchain/core @langchain/community dotenv typescript
  npx tsc --init
  ```

- Create a `.env` file with your API keys:

  ```
  OPENAI_API_KEY=your-openai-api-key
  ```

- Write a basic LangGraph.js agent (e.g., `index.ts`):

  ```typescript
  import { createReactAgent } from "@langchain/langgraph/prebuilt";
  import { ChatOpenAI } from "@langchain/openai";
  import { MemorySaver } from "@langchain/langgraph";

  const llm = new ChatOpenAI({ model: "gpt-4o", temperature: 0 });
  const checkpointer = new MemorySaver();

  // `tools` is required; pass an empty array if the agent has none yet.
  const agent = createReactAgent({ llm, tools: [], checkpointSaver: checkpointer });

  async function runAgent() {
    const result = await agent.invoke(
      { messages: [{ role: "user", content: "Hello, how can you assist me?" }] },
      // A thread_id is required when a checkpointer is configured.
      { configurable: { thread_id: "demo-thread" } }
    );
    console.log(result);
  }

  runAgent();
  ```

- Run the application:

  ```shell
  npx ts-node index.ts
  ```
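The object returned by `invoke()` includes a `messages` array whose last entry is the agent's final reply. A small helper to pull out that text is shown below; the `SimpleMessage` shape is a deliberate simplification of LangChain's message objects, which carry more fields:

```typescript
// Simplified message shape; real LangChain messages carry more fields.
type SimpleMessage = { role?: string; content: string };

// Returns the content of the last message in an invoke()-style result.
function lastContent(result: { messages: SimpleMessage[] }): string {
  const last = result.messages[result.messages.length - 1];
  return last ? last.content : "";
}

const reply = lastContent({
  messages: [
    { role: "user", content: "Hello, how can you assist me?" },
    { role: "assistant", content: "I can answer questions and call tools." },
  ],
});
console.log(reply); // → I can answer questions and call tools.
```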
Summary
To use the LangGraph.js SDK in your application, ensure you have:
- Node.js (18+) and npm installed.
- A Node.js project with the required dependencies (`@langchain/langgraph`, `@langchain/core`, etc.).
- API keys for an LLM provider (e.g., OpenAI, Anthropic).
- A `.env` file for environment variables.
- Optional: MongoDB for persistence, external tool APIs, and a frontend framework like Next.js.
- Basic knowledge of JavaScript/TypeScript, LLMs, and graph-based workflows.
For further guidance, refer to the LangGraph.js Quickstart or explore example repositories like those mentioned in the LangChain community posts.
If you need help with a specific setup step or example code, let me know!