150 lines of AI Web Search Agent



This content originally appeared on DEV Community and was authored by shrey vijayvargiya

Hello and welcome to the new blog

In today’s story, we will build a simple web search AI agent in a few lines of code, without complicating things.

Hono is what I’ll be choosing again for the API endpoint, and for the LLM we will use @google/genai.

We will cover two things: first, using the LLM to turn a prompt into an answer; and second, giving the LLM a tool, a method the LLM calls when it needs to find answers on the internet, which together make a robust AI web search agent.

But since this is a practice build, I want to keep it quick, so for the tool method I decided to go with a third-party API, which I’ll cover later on.

Moving ahead, the first step is to add the @google/genai (or openai) npm module to a Hono app, following Hono’s getting-started guide.
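A quick setup sketch (package names are taken from the imports in the code below; exact versions are up to you):

```shell
npm install hono @hono/node-server @google/genai dotenv
```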

import dotenv from "dotenv";
import { Type, GoogleGenAI } from "@google/genai";
import { serve } from "@hono/node-server";
import { Hono } from "hono";

// Load environment variables
dotenv.config();

const genai = new GoogleGenAI({
    apiKey: process.env.GOOGLE_GENAI_API_KEY,
});

const app = new Hono();
const googleSearchFunction = async (query) => {
    try {
        const response = await fetch("https://api.firecrawl.dev/v1/search", {
            method: "POST",
            headers: {
                Authorization: `Bearer ${process.env.FIRECRAWL_API_KEY}`,
                "Content-Type": "application/json",
            },
            body: JSON.stringify({
                query: query,
                limit: 5,
                scrapeOptions: {
                    onlyMainContent: true,
                    timeout: 30000,
                    parsePDF: true,
                    removeBase64Images: true,
                    blockAds: true,
                    storeInCache: true,
                },
            }),
        });

        if (!response.ok) {
            throw new Error(
                `Firecrawl API error: ${response.status} ${response.statusText}`
            );
        }

        const data = await response.json();
        return { result: data, success: true };
    } catch (error) {
        return { error: error.message, success: false };
    }
};

const googleSearchDeclaration = {
    name: "google_search",
    description: "Search the web for information",
    parameters: {
        type: Type.OBJECT,
        properties: {
            query: {
                type: Type.STRING,
                description: "The query to search for",
            },
        },
        required: ["query"],
    },
};

app.post("/ai-web-search-agent", async (c) => {
    const { prompt } = await c.req.json();
    const response = await genai.models.generateContent({
        model: "gemini-2.0-flash",
        contents: [
            {
                role: "model",
                parts: [
                    {
                        text: "You are a helpful assistant that can search the web for information.",
                    },
                ],
            },
            {
                role: "user",
                parts: [
                    {
                        text: prompt,
                    },
                ],
            },
        ],
        config: {
            tools: [
                {
                    functionDeclarations: [googleSearchDeclaration],
                },
            ],
        },
    });

    const functionCalls = response.candidates[0].content.parts.filter(
        (part) => part.functionCall
    );

    let functionResults = [];
    for (const functionCall of functionCalls) {
        const { name, args } = functionCall.functionCall;
        if (name === "google_search") {
            const result = await googleSearchFunction(args.query);
            console.log(result, "result");
            functionResults.push(result);
        }
    }

    // Guard against the case where the model made no tool call or every call failed
    const successfulResults = functionResults.filter((r) => r.success);
    const searchSummary = successfulResults.length
        ? successfulResults[0].result.data.map((item) => item.title).join(", ")
        : "No search results were returned.";

    const finalResponse = await genai.models.generateContent({
        model: "gemini-2.0-flash",
        contents: [
            {
                role: "model",
                parts: [
                    {
                        text: "Here are the search results: " + searchSummary,
                    },
                ],
            },
        ],
    });
    return c.json({
        response: finalResponse.candidates[0].content.parts[0].text,
    });
});


serve({
    fetch: app.fetch,
    port: 3000,
});

The above is the final AI web search agent code. I’ve shared it upfront to make things easier to follow.

First, I create a GoogleGenAI instance with an API key to get an LLM client. One can use local Ollama models via LangChain or the openai module as well.

Then we pass the prompt provided by the user to the LLM instance.

Tools take function declarations, which instruct the LLM about the function/tool it can use to grab more information from the internet.

The response returned by the function/tool call is then fed back to the LLM, which finally returns the output as per the user’s prompt.

These four steps are all you need to build your AI agent.

Let’s break it down a bit more.

Tool Declaration

Treat it like a function signature: define the function by its name, description, parameters, required fields, and optionally an output or response schema.

const googleSearchDeclaration = {
    name: "google_search",
    description: "Search the web for information",
    parameters: {
        type: Type.OBJECT,
        properties: {
            query: {
                type: Type.STRING,
                description: "The query to search for",
            },
        },
        required: ["query"],
    },
};

Tool Method

const googleSearchFunction = async (query) => {
    try {
        const response = await fetch("https://api.firecrawl.dev/v1/search", {
            method: "POST",
            headers: {
                Authorization: `Bearer ${process.env.FIRECRAWL_API_KEY}`,
                "Content-Type": "application/json",
            },
            body: JSON.stringify({
                query: query,
                limit: 5,
                scrapeOptions: {
                    onlyMainContent: true,
                    timeout: 30000,
                    parsePDF: true,
                    removeBase64Images: true,
                    blockAds: true,
                    storeInCache: true,
                },
            }),
        });

        if (!response.ok) {
            throw new Error(
                `Firecrawl API error: ${response.status} ${response.statusText}`
            );
        }

        const data = await response.json();
        return { result: data, success: true };
    } catch (error) {
        return { error: error.message, success: false };
    }
};

I am using a Firecrawl API key to build the internet/web search tool method.

One can use other APIs from rapidapi.com as well; a few examples are the Brave web search API, Exa.ai, and the Serp API.

Once the tool is defined and created, we first need to declare it to the LLM and then fetch the tool result after the LLM makes the tool call.

How do we know whether the LLM makes a tool call or not?

Simple: the LLM returns the function call in the response, as given below.

    const functionCalls = response.candidates[0].content.parts.filter(
        (part) => part.functionCall
    );

    let functionResults = [];
    for (const functionCall of functionCalls) {
        const { name, args } = functionCall.functionCall;
        if (name === "google_search") {
            const result = await googleSearchFunction(args.query);
            console.log(result, "result");
            functionResults.push(result);
        }
    }

In the above code, we get the AI response for the prompt, and the AI decides whether to make a tool call based on that prompt; those calls show up as functionCalls. Each function call has a name matching the one we provided in the tool declaration above.

In this way, we can now filter the tool calls and invoke the tool method accordingly to get the tool results, pushing each result into an array named functionResults to use later as input to the final LLM call.
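To make the filtering concrete, here is a small sketch against a mocked response part list (the shapes below mirror the code above, but the data itself is made up for illustration):

```javascript
// Mocked response parts: one plain text part, one tool-call part.
// A real Gemini response nests these under candidates[0].content.parts.
const parts = [
    { text: "Let me search for that." },
    {
        functionCall: {
            name: "google_search",
            args: { query: "top trending news today" },
        },
    },
];

// Same filter used in the endpoint: keep only parts carrying a functionCall.
const functionCalls = parts.filter((part) => part.functionCall);

// Each call exposes the declared tool name and its arguments.
for (const { functionCall } of functionCalls) {
    const { name, args } = functionCall;
    console.log(name, args.query);
}
```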

// Guard against the case where the model made no tool call or every call failed
const successfulResults = functionResults.filter((r) => r.success);
const searchSummary = successfulResults.length
    ? successfulResults[0].result.data.map((item) => item.title).join(", ")
    : "No search results were returned.";

const finalResponse = await genai.models.generateContent({
    model: "gemini-2.0-flash",
    contents: [
        {
            role: "model",
            parts: [
                {
                    text: "Here are the search results: " + searchSummary,
                },
            ],
        },
    ],
});

In the above code, I use functionResults to build the final input prompt from the tool search, so the LLM can generate the final response. This way, the LLM can use tools whenever needed, for example to find top trending news or the latest updates.
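The summary string can be sketched against a mocked Firecrawl-style payload (the field names mirror the code above; the entries themselves are invented for illustration):

```javascript
// Mocked tool result: each entry holds { result: { data: [...] }, success: true },
// matching what googleSearchFunction returns from the Firecrawl search endpoint.
const functionResults = [
    {
        success: true,
        result: {
            data: [
                { title: "Article one", url: "https://example.com/1" },
                { title: "Article two", url: "https://example.com/2" },
            ],
        },
    },
];

// Join titles explicitly so the prompt reads cleanly.
const summaryPrompt =
    "Here are the search results: " +
    functionResults[0].result.data.map((item) => item.title).join(", ");

console.log(summaryPrompt);
// → Here are the search results: Article one, Article two
```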

So that’s it: our 150 lines of AI web search agent.

But the story doesn’t end here; one could replace the Firecrawl API with a custom Google search, or choose cheaper alternatives, to grow as a better developer.

So probably in the next story, we will build the same AI web search agent with a custom Google search engine using Playwright, Cheerio, or Crawlee.

Till then, have a good day.


Originally published on iHateReading

