The Unstoppable Lead Generation Engine: A Multi-Agent AI Workflow with n8n & Bright Data



This content originally appeared on DEV Community and was authored by Inforeole Automatisations IA

This is a submission for the AI Agents Challenge powered by n8n and Bright Data

What I Built

I built the Lead Opportunity Finder, an AI-powered agent designed to automate the initial phase of B2B prospecting.

This tool solves the problem of manual, time-consuming research by automatically analyzing a prospect’s website to generate a list of actionable business opportunities. It’s a strategic asset for sales, marketing, and business development professionals, providing them with personalized, data-driven insights to tailor their outreach efforts.

Demo

n8n Workflow

Link to GitHub Gist with workflow JSON

Technical Implementation

The agent’s architecture is a multi-step n8n workflow orchestrated through a central chat trigger.

  • System Instructions: The agent operates with a series of distinct system prompts tailored for each step. The first prompt, for an AI Agent, instructs it to act as a “B2B data extraction expert” to identify relevant company URLs. Subsequent prompts, for two other AI Agents, define their roles as “B2B lead generation expert” and direct them to summarize and synthesize business opportunities from the scraped content.
  • Model Choice: I selected OpenRouter as the Large Language Model provider to leverage a variety of high-performance models. The initial AI Agent uses openai/o4-mini for its cost-effectiveness and speed in filtering URLs, while the more complex analysis and synthesis tasks are handled by openai/gpt-5 for its advanced reasoning capabilities.
  • Memory: The workflow uses a “Set” node to manage the input URL and a “Merge” node to combine the individual AI summaries from different pages into a single payload for final analysis. This approach ensures all relevant context is preserved without relying on a conversational memory, which is not suitable for this kind of sequential data processing.
  • Tools Used: The primary tools are n8n’s built-in nodes for code execution and data manipulation, alongside the core integrations with Bright Data and OpenRouter. The “Code” node, for example, is used to parse HTML and clean the extracted URLs.
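The URL-cleaning logic in that Code node can be sketched as follows. n8n Code nodes run JavaScript; the regex and filtering rules here are illustrative assumptions, not the workflow’s exact code:

```javascript
// Minimal sketch of the Code node's job: pull hrefs out of raw HTML and
// normalize them into absolute, deduplicated URLs. The regex and filtering
// rules are illustrative assumptions.
function extractUrls(html, baseUrl) {
  const hrefs = [...html.matchAll(/href=["']([^"']+)["']/g)].map(m => m[1]);
  const cleaned = new Set();
  for (const href of hrefs) {
    try {
      const url = new URL(href, baseUrl); // resolves relative links
      if (url.protocol.startsWith('http')) { // drops mailto:, javascript:, etc.
        url.hash = '';                       // strip fragments before deduping
        cleaned.add(url.toString());
      }
    } catch (e) {
      // skip malformed hrefs
    }
  }
  return [...cleaned];
}

const sampleHtml =
  '<a href="/about">About</a> <a href="https://example.com/pricing#top">Pricing</a>';
console.log(extractUrls(sampleHtml, 'https://example.com'));
```

Deduplicating and resolving relative links at this stage keeps the downstream AI Agents from summarizing the same page twice.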

Bright Data Verified Node

I utilized the Bright Data Verified Node to bypass potential scraping blocks, ensuring reliable access to the target websites. The node’s “Web Unlocker” functionality was critical for this project, as it enabled the workflow to retrieve HTML content from a variety of websites without being flagged as a bot. This ensured the integrity and completeness of the data used by the AI agents for their analysis, which a standard HTTP request node could not have provided.
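Inside n8n the Verified Node handles all of this, but conceptually each page fetch is a single authenticated request to Bright Data rather than a direct hit on the target site. The sketch below only builds the request shape; the endpoint, zone name, and payload fields are assumptions based on Bright Data’s Web Unlocker API and may differ from your account’s configuration:

```javascript
// Conceptual sketch only: the request the workflow effectively makes per page.
// Endpoint, zone, and field names are assumptions, not verified against a
// live Bright Data account.
function buildUnlockerRequest(targetUrl, apiToken, zone) {
  return {
    method: 'POST',
    url: 'https://api.brightdata.com/request', // assumed Web Unlocker API endpoint
    headers: {
      Authorization: `Bearer ${apiToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      zone,           // a Web Unlocker zone configured in the dashboard
      url: targetUrl, // the prospect page to fetch
      format: 'raw',  // return the raw HTML body
    }),
  };
}
```

The key point is that unblocking is delegated entirely to the proxy layer, so the rest of the workflow can treat every site as if it were freely accessible.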

Journey

The main challenge was fine-tuning the AI prompts to produce consistent, relevant outputs. Initially, a single prompt handled the entire analysis, which produced noisy, irrelevant results and a weak synthesis. I overcame this by breaking the task into specialized stages: first, extracting relevant URLs with a targeted prompt; second, summarizing content page by page; and third, synthesizing all summaries into a final, coherent report with a separate, precise prompt.
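The staged decomposition can be expressed as a simple pipeline. Here `runAgent` is a hypothetical stand-in for invoking an n8n AI Agent node with a given system prompt, injected as a parameter so the shape of the data flow is clear:

```javascript
// Sketch of the three-stage decomposition. `runAgent` is a hypothetical
// stand-in for an n8n AI Agent node call with a given role/system prompt.
async function findOpportunities(siteUrl, runAgent) {
  // Stage 1: a targeted prompt extracts the relevant company URLs.
  const urls = await runAgent('url-extractor', siteUrl);

  // Stage 2: each page is summarized independently.
  const summaries = await Promise.all(
    urls.map(url => runAgent('page-summarizer', url))
  );

  // Stage 3: a separate, precise prompt synthesizes the final report.
  return runAgent('synthesizer', summaries.join('\n\n'));
}
```

Because each stage has its own narrow prompt, a failure (say, a noisy summary) is isolated to one step instead of contaminating the whole analysis.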

Another challenge was managing the workflow’s data flow, specifically ensuring the correct context was passed between nodes. The use of the “Merge” node proved essential for combining the results of multiple AI analyses into a single, comprehensive object for the final synthesis step.
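What the Merge step accomplishes can be sketched as a simple reduction over the per-page items; the field names (`url`, `summary`) are illustrative, not the workflow’s exact schema:

```javascript
// Sketch of the Merge step's effect: collapse per-page AI summaries into one
// payload for the final synthesis agent. Field names are illustrative.
function mergeSummaries(items) {
  return {
    pages: items.length,
    combined: items
      .map(item => `## ${item.url}\n${item.summary}`)
      .join('\n\n'),
  };
}
```

Labeling each summary with its source URL lets the synthesis prompt ground every opportunity it reports in a specific page.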

What I learned is the power of a modular approach to building AI agents. Instead of relying on a single, complex prompt, dividing the workflow into smaller, specialized tasks—each with its own targeted instructions and model—dramatically improved the accuracy, reliability, and efficiency of the final output. It also highlighted the importance of a robust scraping tool like Bright Data to ensure the agent has access to clean, reliable data.
