Building a Chat Interface: From Components to Conversation



This content originally appeared on DEV Community and was authored by Sebastiao Gazolla Jr

Part 6 of the “From Zero to AI Agent: My Journey into Java-based Intelligent Applications” series

We’ve built all the core components: MCPService for tool connections (Post 3), LLMClient for AI intelligence (Post 4), and SimpleInference for smart query processing (Post 5). Now it’s time to bring everything together into something people can actually use: a simple chat interface.

Today we’ll create a minimal but functional chat application that makes interacting with MCP tools feel natural.

The Goal: Simple Conversation

Instead of complex commands, users should be able to chat naturally:

AI Assistant Ready! Type 'exit' to quit.

You: What's the weather like?
Assistant: The current weather is 22°C with partly cloudy skies.

You: Save that to weather.txt  
Assistant: I've saved the weather information to weather.txt successfully.

You: exit
Assistant: Goodbye!

Simple, clean, functional.

The ChatInterface Class

Here’s our complete chat interface – much simpler than you might expect:

import java.util.List;
import java.util.Scanner;

public class ChatInterface {
    private final SimpleInference inference; // From Post 5
    private final MCPService mcpService;     // From Post 3
    private final Scanner scanner;
    private boolean running = true;

    public ChatInterface(MCPService mcpService, LLMClient llmClient) {
        this.mcpService = mcpService;
        this.inference = new SimpleInference(mcpService, llmClient);
        this.scanner = new Scanner(System.in);
    }

    public void startChat() {
        showWelcome();

        while (running) {
            String input = getUserInput();

            if (isExitCommand(input)) {
                running = false;
                System.out.println("Goodbye!");
                continue;
            }

            processUserQuery(input);
        }

        cleanup();
    }

    private void showWelcome() {
        List<Tool> tools = mcpService.getAllAvailableTools();
        System.out.println("AI Assistant Ready! Connected to " + tools.size() + " tools.");
        System.out.println("Type 'exit' to quit.\n");
    }

    private String getUserInput() {
        System.out.print("You: ");
        return scanner.nextLine().trim();
    }

    private boolean isExitCommand(String input) {
        String lower = input.toLowerCase();
        return lower.equals("exit") || lower.equals("quit") || lower.equals("bye");
    }

    private void processUserQuery(String input) {
        if (input.isEmpty()) {
            return;
        }

        try {
            String response = inference.processQuery(input);
            System.out.println("Assistant: " + response + "\n");

        } catch (Exception e) {
            System.out.println("Sorry, I encountered an error: " + e.getMessage() + "\n");
        }
    }

    private void cleanup() {
        scanner.close();
        mcpService.close();
    }
}

That’s it! One class, six methods, everything working.
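One caveat: cleanup() only runs when the user types 'exit'. If the process is killed with Ctrl+C, the Scanner and MCP connections are never closed. A minimal sketch of how a JVM shutdown hook could guard against this (ShutdownSafety and registerCleanup are illustrative names, not part of the series' code):

```java
class ShutdownSafety {
    // Register a JVM shutdown hook so cleanup also runs on Ctrl+C,
    // not only when the user types 'exit'. In ChatInterface, the hook
    // body would call mcpService.close() and scanner.close().
    static Thread registerCleanup(Runnable cleanup) {
        Thread hook = new Thread(cleanup);
        Runtime.getRuntime().addShutdownHook(hook);
        return hook;
    }
}
```

The hook runs on normal JVM exit and on SIGINT, so resources are released either way.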

Complete Application

Here’s how to tie everything together:

public class ChatApp {
    public static void main(String[] args) {
        try {
            // Initialize components from previous posts
            MCPService mcpService = new MCPService();
            LLMClient llmClient = createLLMClient();

            // Start chat
            ChatInterface chat = new ChatInterface(mcpService, llmClient);
            chat.startChat();

        } catch (Exception e) {
            System.err.println("Failed to start: " + e.getMessage());
        }
    }

    private static LLMClient createLLMClient() {
        String groqKey = System.getenv("GROQ_API_KEY");
        if (groqKey != null && !groqKey.isEmpty()) {
            return LLMClientFactory.createGroqClient(groqKey);
        }

        String geminiKey = System.getenv("GEMINI_API_KEY");
        if (geminiKey != null && !geminiKey.isEmpty()) {
            return LLMClientFactory.createGeminiClient(geminiKey);
        }

        throw new RuntimeException("Set GROQ_API_KEY or GEMINI_API_KEY environment variable");
    }
}

Real Conversation Example

Here’s what a real conversation looks like:

AI Assistant Ready! Connected to 8 tools.
Type 'exit' to quit.

You: what's the weather in Tokyo?
Assistant: The current weather in Tokyo is 22°C with partly cloudy skies and light winds.

You: save that to weather-tokyo.txt
Assistant: I've saved the weather information to weather-tokyo.txt successfully.

You: list my files
Assistant: I found 4 files in your current directory: weather-tokyo.txt, config.json, notes.md, and data.csv.

You: what is 2+2?
Assistant: 2+2 equals 4. This is a basic addition operation.

You: read the weather file
Assistant: The weather file contains: The current weather in Tokyo is 22°C with partly cloudy skies and light winds.

You: exit
Assistant: Goodbye!

Notice how it seamlessly handles:

  • Weather queries using MCP weather tools
  • File operations with context (“that” refers to weather data)
  • Direct questions using LLM knowledge
  • Context references between queries
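The context handling lives inside SimpleInference, so it isn't visible in the chat loop itself. As a rough sketch, resolving "that" can be as simple as remembering the last response and prepending it when a query refers back. ContextTracker, remember, and resolve are hypothetical names for illustration, not the series' actual implementation:

```java
// Hypothetical sketch of last-result context tracking.
class ContextTracker {
    private String lastResult = null;

    // Store the most recent assistant response.
    void remember(String result) {
        this.lastResult = result;
    }

    // If the query refers back with "that"/" it ", prepend the stored
    // result so the LLM sees the context it needs.
    String resolve(String query) {
        String lower = query.toLowerCase();
        if (lastResult != null && (lower.contains("that") || lower.contains(" it "))) {
            return "Context from previous answer: " + lastResult + "\nUser query: " + query;
        }
        return query;
    }
}
```

This single-slot memory is exactly why the "Basic context" limitation below exists: only the most recent result is available for back-references.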

Integration Highlights

Our chat interface successfully uses all components from previous posts:

MCPService (Post 3):

mcpService.getAllAvailableTools()  // Show connection status
mcpService.close()                 // Cleanup on exit
// Used internally by SimpleInference for tool execution

LLMClient (Post 4):

LLMClientFactory.createGroqClient(apiKey)  // Client creation
// Used internally by SimpleInference for intelligence

SimpleInference (Post 5):

inference.processQuery(input)  // Main query processing
// Handles analysis, tool selection, parameter extraction

Everything works together seamlessly with zero additional complexity.

Error Handling

Here’s our simple approach to handling errors:

private void processUserQuery(String input) {
    if (input.isEmpty()) {
        return;
    }

    try {
        String response = inference.processQuery(input);
        System.out.println("Assistant: " + response + "\n");

    } catch (Exception e) {
        System.out.println("Sorry, I encountered an error: " + e.getMessage() + "\n");
    }
}
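If you want slightly friendlier output, one possible refinement is mapping exception types to human-readable messages before printing. The categories below are illustrative assumptions, not errors the series' components are guaranteed to throw:

```java
class ErrorMessages {
    // Translate low-level exceptions into messages a chat user can act on.
    static String friendly(Exception e) {
        if (e instanceof java.net.ConnectException) {
            return "I couldn't reach the AI service. Please check your connection.";
        }
        if (e instanceof IllegalArgumentException) {
            return "I didn't understand that request: " + e.getMessage();
        }
        return "Sorry, I encountered an error: " + e.getMessage();
    }
}
```

The catch block would then print `ErrorMessages.friendly(e)` instead of `e.getMessage()` directly.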

Why This Simple Approach Works

We deliberately keep the chat interface simple for educational purposes: the goal is to make the concepts easy to follow. The chat itself is not the main focus of this series; it’s a complementary tool that shows how the components fit together.

Running the Application

Set your API key:

   export GROQ_API_KEY=your_key_here

Install MCP servers:

   npm install -g @modelcontextprotocol/server-weather
   npm install -g @modelcontextprotocol/server-filesystem

Run the chat:

   # Compile project
   mvn compile

   # Run interactive chat
   mvn exec:java -Dexec.mainClass="com.gazapps.ChatApp"

Start chatting naturally!

Current Limitations

Our simple approach has some intentional limitations:

  • Single user only – No multi-session support
  • Basic context – Only remembers last result for “that” references
  • Console only – No web interface (yet)
  • No persistence – Conversation doesn’t save between sessions

These limitations keep the code simple while providing core functionality.
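As an example, lifting the "no persistence" limitation could be as simple as appending each exchange to a log file and reloading it at startup. This is a sketch under assumptions, not part of the series' code; ConversationLog is a hypothetical helper:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.List;

// Hypothetical sketch: persist the conversation between sessions
// by appending each exchange to a plain-text log file.
class ConversationLog {
    private final Path file;

    ConversationLog(Path file) {
        this.file = file;
    }

    // Append one user/assistant exchange, creating the file if needed.
    void append(String user, String assistant) throws IOException {
        String entry = "You: " + user + "\nAssistant: " + assistant + "\n";
        Files.writeString(file, entry,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    // Reload all previous exchanges, or an empty list on first run.
    List<String> load() throws IOException {
        return Files.exists(file) ? Files.readAllLines(file) : List.of();
    }
}
```

ChatInterface would call append() after each response and could replay load() in showWelcome().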

What’s Next?

Our next post will dive into multi-tool execution, the 7th part of our “From Zero to AI Agent: My Journey into Java-based Intelligent Applications” series! We’ll explore how to build a system that orchestrates complex workflows, handling real user requests like “Get the weather in Tokyo and save it to a file” or “Check prices on three websites and tell me the cheapest” by coordinating multiple tools seamlessly. Stay tuned!

The complete source code is available in our GitHub repository.

This is part 6 of our series on building Java AI agents with the Model Context Protocol. Next up: multi-tool execution and orchestrating complex workflows!

