A production-ready Next.js template for building AI agents with LangGraph.js, featuring Model Context Protocol (MCP) integration, human-in-the-loop tool approval, and persistent memory.
Complete agent workflow: user input → tool approval → execution → streaming response
I help teams design and optimize LangGraph-based AI agents (RAG, memory, latency, architecture).
If you're building something serious on top of this template and want hands-on help, I'm happy to jump on a short call.
- Model Context Protocol integration for dynamic tool management
- Add tools via web UI - no code changes required
- Support for both stdio and HTTP MCP servers
- Tool name prefixing to prevent conflicts
- Interactive tool call approval before execution
- Granular control with approve/deny/modify options
- Optional auto-approval mode for trusted environments
- Real-time streaming with tool execution pauses
- LangGraph checkpointer with PostgreSQL backend (see the checkpointer sketch after this feature list)
- Full conversation history preservation
- Thread-based organization
- Seamless resume across sessions
- Upload images, PDFs, and text files with messages
- S3-compatible storage (MinIO for development)
- Automatic file processing for AI consumption
- Production-ready with AWS S3, Cloudflare R2 support
- Server-Sent Events (SSE) for live responses
- Optimistic UI updates with React Query
- Type-safe message handling
- Error recovery and graceful degradation
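To make the persistent-memory feature concrete, here is a minimal sketch of wiring a PostgreSQL-backed checkpointer into a LangGraph.js agent. It assumes the `@langchain/langgraph-checkpoint-postgres` and `@langchain/openai` packages and a `DATABASE_URL` like the one configured in the Quick Start below; it is not the template's exact code (the real wiring lives under `src/lib/agent/`).

```typescript
// Sketch only: a PostgreSQL-backed checkpointer for a LangGraph.js agent.
import { PostgresSaver } from "@langchain/langgraph-checkpoint-postgres";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage } from "@langchain/core/messages";

const checkpointer = PostgresSaver.fromConnString(process.env.DATABASE_URL!);
await checkpointer.setup(); // creates the checkpoint tables on first run

const agent = createReactAgent({
  llm: new ChatOpenAI({ model: "gpt-4o-mini" }),
  tools: [],
  checkpointSaver: checkpointer,
});

// Each conversation is keyed by a thread_id; invoking again with the same id
// restores the full message history from PostgreSQL.
const result = await agent.invoke(
  { messages: [new HumanMessage("Hello!")] },
  { configurable: { thread_id: "thread-123" } }
);
```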
- Frontend: Next.js 15, React 19, TypeScript, Tailwind CSS
- Backend: Node.js, Prisma ORM, PostgreSQL, MinIO/S3
- AI: LangGraph.js, OpenAI/Google models
- UI: shadcn/ui components, Lucide icons
- Node.js 18+ and pnpm
- Docker (for PostgreSQL and MinIO)
- OpenAI API key or Google AI API key
git clone https://github.com/IBJunior/fullstack-langgraph-nextjs-agent.git
cd fullstack-langgraph-nextjs-agent
pnpm install
cp .env.example .env.local
Edit .env.local with your configuration:
# Database
DATABASE_URL="postgresql://user:password@localhost:5434/agent_db"
# AI Models (choose one or both)
OPENAI_API_KEY="sk-..."
GOOGLE_API_KEY="..."
# Optional: Default model
DEFAULT_MODEL="gpt-4o-mini" # or "gemini-1.5-flash"
docker compose up -d # Starts PostgreSQL and MinIO
pnpm prisma:generate
pnpm prisma:migrate
pnpm dev
Visit http://localhost:3000 to start chatting with your AI agent!
- Navigate to Settings - Click the gear icon in the sidebar
- Add MCP Server - Click "Add MCP Server" button
- Configure Server:
  - Name: Unique identifier (e.g., "filesystem")
  - Type: Choose `stdio` or `http`
  - Command: For stdio servers (e.g., `npx @modelcontextprotocol/server-filesystem`)
  - Args: Command arguments (e.g., `["/path/to/allow"]`)
  - URL: For HTTP servers
MCP server configuration form with example filesystem server setup
Want to build your own MCP server? Check out create-mcp-server - scaffold production-ready MCP servers in seconds with TypeScript, multiple frameworks (MCP SDK or FastMCP), and built-in debugging tools.
{
"name": "filesystem",
"type": "stdio",
"command": "npx",
"args": ["@modelcontextprotocol/server-filesystem", "/Users/yourname/Documents"]
}

{
"name": "web-api",
"type": "http",
"url": "http://localhost:8080/mcp",
"headers": {
"Authorization": "Bearer your-token"
}
}
Note: Some HTTP MCP servers require OAuth 2.0 authentication. See OAuth Documentation for details.
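Under the hood, configurations like these are turned into regular LangChain tools through MCP clients. Below is a hedged sketch using the `@langchain/mcp-adapters` package; option names follow that package's documentation and may differ from the template's own loader in `src/lib/agent/mcp.ts`, which reads server configs from the database instead of a literal object.

```typescript
// Sketch only: loading tools from MCP servers like the examples above.
import { MultiServerMCPClient } from "@langchain/mcp-adapters";

const client = new MultiServerMCPClient({
  // Prefix tool names with their server name to avoid conflicts between servers.
  prefixToolNameWithServerName: true,
  mcpServers: {
    filesystem: {
      transport: "stdio",
      command: "npx",
      args: ["@modelcontextprotocol/server-filesystem", "/Users/yourname/Documents"],
    },
    "web-api": {
      transport: "http",
      url: "http://localhost:8080/mcp",
      headers: { Authorization: "Bearer your-token" },
    },
  },
});

// Tools come back as ordinary LangChain tools, prefixed per server, and can be
// bound to the agent's model like any other tool.
const tools = await client.getTools();
```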
- Agent Requests Tool - AI suggests using a tool
- Approval Prompt - Interface shows tool details and asks for approval
- User Decision:
- ✅ Allow: Execute tool as requested
- ❌ Deny: Skip tool execution
- ✏️ Modify: Edit tool parameters before execution
- Continue Conversation - Agent responds with tool results
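This pause-and-resume behaviour maps onto LangGraph's interrupt mechanism. The sketch below is illustrative rather than the template's exact node code; it assumes the `interrupt` and `Command` APIs from `@langchain/langgraph`, and the node and field names (`toolApprovalNode`, `decision.action`) are made up for the example.

```typescript
// Sketch only: a graph node that pauses for human approval before tools run.
import { interrupt, Command, MessagesAnnotation } from "@langchain/langgraph";
import { AIMessage, ToolMessage } from "@langchain/core/messages";

async function toolApprovalNode(state: typeof MessagesAnnotation.State) {
  const last = state.messages[state.messages.length - 1] as AIMessage;
  const toolCalls = last.tool_calls ?? [];

  // The run pauses here; the pending tool calls are surfaced to the UI.
  const decision = interrupt({ toolCalls }) as {
    action: "allow" | "deny" | "modify";
    modifiedArgs?: Record<string, unknown>;
  };

  if (decision.action === "deny") {
    // Record a result for each skipped call so the model can continue cleanly.
    return new Command({
      goto: "agent",
      update: {
        messages: toolCalls.map(
          (call) =>
            new ToolMessage({ tool_call_id: call.id!, content: "Tool call denied by the user." })
        ),
      },
    });
  }
  // "allow" (or "modify" with edited arguments) proceeds to the tools node.
  return new Command({ goto: "tools" });
}

// The client later resumes the paused run with the user's decision, e.g.:
// await graph.invoke(new Command({ resume: { action: "allow" } }), config);
```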
┌─────────────────┐ ┌──────────────────┐ ┌─────────────────┐
│ Next.js UI │◄──►│ Agent Service │◄──►│ LangGraph.js │
│ (React 19) │ │ (SSE Streaming) │ │ Agent │
└─────────────────┘ └──────────────────┘ └─────────────────┘
│ │ │
▼ ▼ ▼
┌─────────────────┐ ┌──────────────────┐ ┌─────────────────┐
│ React Query │ │ Prisma │ │ MCP Clients │
│ (State Mgmt) │ │ (Database) │ │ (Tools) │
└─────────────────┘ └──────────────────┘ └─────────────────┘
│
▼
┌──────────────────────────────┐
│ PostgreSQL │ MinIO/S3 │
│ (Persistence)│ (File Store) │
└──────────────────────────────┘
- Creates a StateGraph with the agent → tool_approval → tools flow (see the graph sketch after this component list)
- Handles tool approval interrupts
- Manages model binding and system prompts
- Dynamic tool loading from database-stored MCP servers
- Support for stdio and HTTP transports
- Tool name prefixing for conflict prevention
- Server-Sent Events for real-time responses (see the SSE route sketch below)
- Message processing and chunk aggregation
- Tool approval workflow handling
- React Query integration for optimistic UI
- Stream management and error handling
- Tool approval user interface
- S3-compatible storage with MinIO (development) or AWS S3 (production)
- File validation, upload, and content processing for AI
- Multimodal message building with base64 conversion
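A simplified sketch of the graph wiring described above, using LangGraph.js primitives. The placeholder tool, the in-memory checkpointer, and the inline approval node stand in for the template's MCP tools, PostgreSQL checkpointer, and the fuller approval node sketched earlier; the real builder lives in `src/lib/agent/builder.ts`.

```typescript
// Sketch only: the agent -> tool_approval -> tools loop.
import { StateGraph, MessagesAnnotation, MemorySaver, START, END, interrupt, Command } from "@langchain/langgraph";
import { ToolNode } from "@langchain/langgraph/prebuilt";
import { ChatOpenAI } from "@langchain/openai";
import { AIMessage } from "@langchain/core/messages";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Placeholder tool; the template loads MCP and built-in tools here instead.
const getTime = tool(async () => new Date().toISOString(), {
  name: "get_time",
  description: "Return the current time",
  schema: z.object({}),
});
const tools = [getTime];
const model = new ChatOpenAI({ model: "gpt-4o-mini" }).bindTools(tools);

export const graph = new StateGraph(MessagesAnnotation)
  .addNode("agent", async (state) => ({
    messages: [await model.invoke(state.messages)],
  }))
  .addNode(
    "tool_approval",
    (state) => {
      // Pause for human review (the fuller sketch above also records denials).
      const last = state.messages[state.messages.length - 1] as AIMessage;
      const decision = interrupt({ toolCalls: last.tool_calls }) as "allow" | "deny";
      return new Command({ goto: decision === "allow" ? "tools" : "agent" });
    },
    { ends: ["tools", "agent"] }
  )
  .addNode("tools", new ToolNode(tools))
  .addEdge(START, "agent")
  .addConditionalEdges("agent", (state) => {
    const last = state.messages[state.messages.length - 1] as AIMessage;
    // Tool calls go through approval first; otherwise the turn is finished.
    return last.tool_calls?.length ? "tool_approval" : END;
  })
  .addEdge("tools", "agent")
  .compile({ checkpointer: new MemorySaver() });
```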
For detailed architecture documentation, see docs/ARCHITECTURE.md.
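For the streaming service, the SSE endpoint roughly follows the standard web-stream pattern sketched below. This is not the contents of `src/app/api/agent/stream/route.ts`; the event names and the `graph` import are illustrative.

```typescript
// Sketch only: a Next.js App Router route streaming agent output as SSE.
import { HumanMessage } from "@langchain/core/messages";
// Hypothetical import: the template builds its compiled graph in
// src/lib/agent/builder.ts; the export name here is made up.
import { graph as agent } from "@/lib/agent/builder";

export async function POST(req: Request) {
  const { threadId, message } = await req.json();
  const encoder = new TextEncoder();

  const stream = new ReadableStream({
    async start(controller) {
      const send = (event: string, data: unknown) =>
        controller.enqueue(encoder.encode(`event: ${event}\ndata: ${JSON.stringify(data)}\n\n`));

      try {
        // Stream message chunks from the compiled LangGraph agent.
        for await (const chunk of await agent.stream(
          { messages: [new HumanMessage(message)] },
          { configurable: { thread_id: threadId }, streamMode: "messages" }
        )) {
          send("message", chunk);
        }
        send("done", {});
      } catch (err) {
        send("error", { message: (err as Error).message });
      } finally {
        controller.close();
      }
    },
  });

  return new Response(stream, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    },
  });
}
```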
pnpm dev # Start development server with Turbopack
pnpm build # Production build
pnpm start # Start production server
pnpm lint # Run ESLint
pnpm format # Format with Prettier
pnpm format:check # Check formatting
# Database
pnpm prisma:generate # Generate Prisma client (after schema changes)
pnpm prisma:migrate # Create and apply migrations
pnpm prisma:studio # Open Prisma Studio (database UI)
src/
├── app/ # Next.js App Router
│ ├── api/ # API routes (stream, upload, mcp-servers)
│ └── thread/ # Thread-specific pages
├── components/ # React components
├── hooks/ # Custom React hooks
├── lib/ # Core utilities
│ ├── agent/ # Agent-related logic
│ └── storage/ # File upload & S3 utilities
├── services/ # Business logic
└── types/ # TypeScript definitions
prisma/
├── schema.prisma # Database schema
└── migrations/ # Database migrations
- Agent Configuration: `src/lib/agent/builder.ts`, `src/lib/agent/mcp.ts`
- API Endpoints: `src/app/api/agent/stream/route.ts`, `src/app/api/agent/upload/route.ts`
- File Storage: `src/lib/storage/` (validation, upload, content processing)
- Database Models: `prisma/schema.prisma`
- Main Chat Interface: `src/components/Thread.tsx`, `src/components/MessageInput.tsx`
- Streaming Logic: `src/hooks/useChatThread.ts`
We welcome contributions! This project is designed to be a community resource for LangGraph.js development.
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes and add tests
- Commit: `git commit -m 'Add amazing feature'`
- Push: `git push origin feature/amazing-feature`
- Open a Pull Request
- Follow TypeScript strict mode
- Use Prettier for formatting
- Add JSDoc comments for public APIs
- Test MCP server integrations thoroughly
- Update documentation for new features
This project is licensed under the MIT License - see the LICENSE file for details.
- LangChain for the incredible AI framework
- Model Context Protocol for the tool integration standard
- Next.js team for the amazing React framework
Ready to build your next AI agent?
If this repo helped you and you’d like guidance implementing it in production, feel free to reach out on LinkedIn.