This content originally appeared on DEV Community and was authored by Sarvar Nadaf
Hey there, tech enthusiasts!
I’m Sarvar, a Cloud Architect with a passion for transforming complex technological challenges into elegant solutions. With extensive experience spanning Cloud Operations (AWS & Azure), Data Operations, Analytics, DevOps, and Generative AI, I’ve had the privilege of architecting solutions for global enterprises that drive real business impact. Through this article series, I’m excited to share practical insights, best practices, and hands-on experiences from my journey in the tech world. Whether you’re a seasoned professional or just starting out, I aim to break down complex concepts into digestible pieces that you can apply in your projects.
Let’s dive in and explore the fascinating world of cloud technology together!
How I Cut AWS Bills by 30% Using One Simple CLI Command
Managing multiple AWS accounts used to cost me hours in manual cost analysis until I discovered this AI-powered game-changer. Here’s how I cut investigation time from 3 hours to just 3 minutes and eliminated costly budget overruns. Let’s dive in.
TL;DR – What You’ll Learn
- How to reduce 3-hour cost investigations into 3-minute CLI conversations
- Understanding how the AWS Billing & Cost Management MCP works
- Step-by-step configuration of Amazon Q CLI and integration with the Billing MCP
- Best practices for leveraging the Billing MCP in real-world scenarios
- Practical examples of managing costs across multiple AWS accounts
Table of Contents
The $10K Problem
The Game-Changer: AWS Billing MCP
Pricing MCP vs Billing MCP
How It Works?
Before vs After: The Transformation
Setup Guide (Ubuntu 22.04)
Real-World Examples
Advanced Real-World Scenarios
Quick Reference Card
Your Turn!
What’s Next
Conclusion
The $10K Problem
Last week, during our monthly billing review, our leadership asked me a simple question: “Why did our AWS bill jump 30% in the development accounts?”
What should have been a 5-minute answer turned into a 3-hour investigation. I had to:
- Log into 12 different AWS accounts (we organize by team and environment)
- Export CSV files from Cost Explorer for each account
- Manually merge data in Excel to find patterns
- Cross-reference with our deployment logs to find the culprit
By the time I found the answer (someone left a large RDS instance running in our sandbox), the meeting was over and I’d missed two other important calls.
This wasn’t a one-time thing. Every week brought similar fire drills:
- “Are we going to exceed our Q4 budget?”
- “Which team is driving up our CloudWatch costs?”
- “Can we get cost forecasts for the new microservices architecture?”
Each question meant hours of manual work, switching between accounts, and building custom reports. There had to be a better way.
The Game-Changer: AWS Billing MCP
The AWS Billing & Cost Management MCP Server is a true game-changer for anyone managing complex AWS environments. Think of it as a personal cost analyst you can chat with.
Earlier, I used to spend hours every month downloading CSV files, cross-checking usage, and only at the beginning of the month would I realize that costs had spiked. From there, it was a long, time-consuming process reviewing budget alerts, checking with teams, and digging through resources to understand what went wrong.
Now, with Amazon Q CLI connected to the Billing MCP Server, the process is completely transformed. Whenever a budget alert is triggered, I simply ask: “Hey Q, why did I get this alert?” and instantly, I get detailed insights in real time.
Instead of clicking through multiple AWS consoles, I can ask plain-English questions from the CLI and receive immediate answers. It has become part of my daily workflow to check billing directly from the command line.
Example queries I use:
q "Show me which accounts had cost spikes this week"
q "Compare EC2 spending across all production accounts"
q "Are we on track to hit our monthly budget?"
This shift has taken cost analysis from a painful manual process to a fast, conversational experience.
Pricing MCP vs Billing MCP: What’s the Difference?
I use both servers for different purposes:
| AWS Pricing MCP | AWS Billing MCP |
|---|---|
| “What will it cost?” | “What did it cost?” |
| "Price of c5.2xlarge in us-west-2?" | "How much did we spend on EC2 last month?" |
| Planning new resources | Analyzing actual usage |
| Before deployment | After deployment |
They work together perfectly. I use Pricing MCP when designing architectures and Billing MCP for ongoing cost management.
For a detailed walkthrough of AWS Pricing MCP, you can explore my article here: Real-Time AWS Cost Estimation Using the Pricing MCP Server and Amazon Q CLI.
How It Works?
The MCP (Model Context Protocol) server runs locally and connects to AWS’s billing APIs. When I ask a question through Amazon Q CLI, it:
- Translates my question into the right AWS API calls
- Fetches data from Cost Explorer, Budgets, and other services
- Analyzes the results across all my accounts
- Returns clear answers with actionable insights
No more manual exports or Excel gymnastics.
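To make that flow concrete, here is a minimal Python sketch (not the MCP server’s actual code) of the kind of Cost Explorer call it issues when you ask about last month’s spend by service; the dates are placeholders and it assumes your default AWS credentials can call Cost Explorer:
import boto3

# Cost Explorer is a global service; the default credentials/region are used here.
ce = boto3.client("ce")

# Monthly spend for one month, broken down by service (dates are placeholders).
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2025-06-01", "End": "2025-07-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

# Print each service and what it cost in the period.
for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{service}: ${amount:.2f}")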
Before vs After: The Transformation
| Traditional Method | With Billing MCP | Impact |
|---|---|---|
| 3 hours manual analysis | 3 minutes CLI query | 60x faster |
| Monthly cost surprises | Real-time alerts | Proactive prevention |
| Excel spreadsheet hell | Natural language questions | Zero manual work |
| Single account visibility | 47 accounts in one view | Complete oversight |
| Reactive cost management | AI-powered optimization | Annual savings |
| Manual budget tracking | Automated threshold alerts | Zero overruns |
Setup Guide (Ubuntu 22.04)
This section provides a step-by-step implementation of Amazon Q, followed by the configuration of the AWS Billing MCP Server to integrate seamlessly with Amazon Q.
Configure Amazon Q CLI
This section covers the step-by-step configuration of Amazon Q CLI on an Ubuntu 22.04 LTS instance to ensure seamless integration and optimal performance.
Step 1: Update System Packages
It’s always good practice to update your package list before installing new software.
sudo apt update -y
Step 2: Download the Amazon Q CLI Package
Use wget to download the latest .deb package from the official Amazon Q release server:
wget https://desktop-release.q.us-east-1.amazonaws.com/latest/amazon-q.deb
Step 3: Install Dependencies (Optional)
Before installing the package, make sure all required dependencies are present. If you already ran the update command in Step 1, this step is optional and you can skip it.
sudo apt-get install -f
Step 4: Install the Amazon Q CLI Package
Now install the .deb package using dpkg:
sudo dpkg -i amazon-q.deb
Step 5: Verify Amazon Q
q --version
If you face any dependency issues, re-run sudo apt-get install -f to auto-fix them.
Amazon Q CLI Login with Builder ID
After successfully installing Amazon Q CLI, the next step is to authenticate. Here’s how to log in using your Builder ID:
Step 1: Run the Login Command
In your terminal, enter:
q login
You’ll see a prompt with two options. Choose:
Use for Free with Builder ID
If you don’t have a Builder ID yet, you can create one using your email during this step.
Step 2: Confirm Authorization in Browser
Amazon Q will generate a unique confirmation link and code. You must:
- Manually open the provided link in a browser and log in with your email ID.
- Enter the verification code when prompted.
Step 3: Allow Access
Once the code is verified, Amazon Q will ask for permission to access your Builder ID account. Click Allow.
Launch Amazon Q CLI
Start Amazon Q using the following command:
q
If you’re looking to subscribe to Amazon Q Pro, this article will guide you through the process of subscribing directly via the Amazon Q CLI: Link
Configure MCP Server for AWS Billing MCP
This section covers how to set up an MCP (Model Context Protocol) server that gives Amazon Q access to your AWS billing and cost data.
Step 1: Install Python 3.10
To run the MCP server locally, Amazon Q requires Python 3.10. Here’s a breakdown of each command to install it properly on Ubuntu 22.04 LTS.
1. Update the package list
sudo apt update -y
What it does:
Fetches the latest list of available packages and versions from the Ubuntu repositories. Always a good first step before installing anything new.
2. Install software-properties-common
sudo apt install -y software-properties-common
What it does:
Installs a package that allows you to manage additional repositories (like PPAs). Required to add the Deadsnakes PPA for Python 3.10.
3. Add the Deadsnakes PPA
sudo add-apt-repository ppa:deadsnakes/ppa -y
What it does:
Adds the Deadsnakes Personal Package Archive (PPA) to your system. This PPA maintains up-to-date versions of Python not available in the default Ubuntu repos.
4. Install Python 3.10 and related tools
sudo apt install -y python3.10 python3.10-venv python3.10-dev
What it does:
- python3.10: Installs the Python 3.10 interpreter
- python3.10-venv: Enables creating virtual environments with python3.10 -m venv
- python3.10-dev: Provides headers and development tools needed to build Python packages with native extensions
Once these steps are complete, Python 3.10 will be available on your EC2 instance.
You can verify the version using:
python3.10 --version
Step 2: Set Up a Virtual Environment
Create a virtual environment to isolate the MCP server:
python3.10 -m venv ~/aws-mcp-env
source ~/aws-mcp-env/bin/activate
Step 3: Install MCP Server and Dependencies
Use pip to install the required libraries:
pip install --upgrade pip
pip install uv uvenv trio
Step 4: Configure Amazon Q to Use the MCP Server
First, create the Amazon Q configuration directory:
mkdir -p ~/.aws/amazonq
Then create the config file at ~/.aws/amazonq/mcp.json:
{
"mcpServers": {
"awslabs.billing-cost-management-mcp-server": {
"command": "uvx",
"args": [
"awslabs.billing-cost-management-mcp-server@latest"
],
"env": {
"AWS_PROFILE": "default",
"AWS_REGION": "us-east-1"
},
"disabled": false
}
}
}
You can create the file using nano or vim:
nano ~/.aws/amazonq/mcp.json
Paste the above configuration and save the file.
Note: If you face any issues with the MCP configuration, please follow this link.
Important: If your MCP server does not load or shows warnings, note that these warnings typically come from AWS service limitations and account configurations, not from issues in the MCP server code itself.
Optional: Use Amazon Q CLI to Set Up MCP
Alternatively, Amazon Q CLI itself can help you set up the MCP server if you provide the right prompts. You can ask:
Set up a local MCP server for AWS Billing MCP
This approach may simplify the process by handling package installation and configuration automatically.
Real-World Examples: Testing & Using AWS Billing MCP
Once your AWS Billing MCP Server is configured, here are practical examples to test your setup and see immediate value.
Example 1: Quick Health Check
A FinOps engineer wants to confirm that the MCP server is connected and returning billing data.
Query:
q "Show me my current AWS billing setup"
Response: Confirms MCP connectivity and retrieves the current billing configuration.
Example 2: Monthly Cost Overview
The CFO requests a quick summary of monthly AWS spend.
Query:
q "What's my total AWS spend this month?"
Response: Provides a consolidated cost report across all linked AWS accounts for the current billing cycle.
Example 3: Cost Spike Investigation
A DevOps lead notices sudden billing increases and wants to find the source.
Query:
q "Show me any unusual cost increases in the last 7 days"
Response: Highlights services or accounts with abnormal growth, enabling fast troubleshooting.
Example 4: Budget Tracking
The finance team needs to check if they are within the monthly budget.
Query:
q "Am I on track to stay within my monthly budget?"
Response: Shows current budget usage, projected month-end spend, and active budget alerts.
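For context, this is roughly what sits behind such a question: the AWS Budgets API exposes both the budget limit and the actual and forecasted spend. A minimal boto3 sketch, assuming your credentials can call Budgets and at least one budget exists in the account:
import boto3

# The Budgets API needs the account ID; look it up via STS.
account_id = boto3.client("sts").get_caller_identity()["Account"]

# The Budgets API is served out of us-east-1.
budgets = boto3.client("budgets", region_name="us-east-1")
response = budgets.describe_budgets(AccountId=account_id)

for budget in response.get("Budgets", []):
    name = budget["BudgetName"]
    limit = float(budget["BudgetLimit"]["Amount"])
    spend = budget.get("CalculatedSpend", {})
    actual = float(spend.get("ActualSpend", {}).get("Amount", 0))
    forecast = float(spend.get("ForecastedSpend", {}).get("Amount", 0))
    print(f"{name}: spent {actual:.2f} of {limit:.2f} (forecast: {forecast:.2f})")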
Example 5: Optimization Opportunities
A cloud architect wants to identify underutilized resources for savings.
Query:
q "Find EC2 instances that are underutilized and can be downsized"
Response: Lists low-utilization EC2 instances with downsizing or termination recommendations.
Example 6: Multi-Account Visibility
Leadership asks for a breakdown of costs across multiple AWS accounts.
Query:
q "List all my AWS accounts with their current costs"
Response: Provides account-wise cost distribution for better financial accountability.
Example 7: CI/CD Integration Check
Before deploying 5 new m5.large instances, a team wants to validate cost impact in the pipeline.
Query:
q "Estimate cost impact of deploying 5 new m5.large instances"
Response: Estimates incremental monthly costs, helping prevent budget overruns during deployment.
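If you want a deterministic guardrail inside the pipeline itself, a quick back-of-the-envelope calculation also works. The sketch below is purely illustrative: the hourly rate is a hard-coded assumption (on-demand pricing varies by region and changes over time), not a value fetched from AWS:
# Rough pre-deployment estimate for N identical instances.
# ASSUMPTION: the hourly rate is a placeholder, not live pricing data;
# check the EC2 pricing page (or the Pricing MCP) for your region.
INSTANCE_COUNT = 5
HOURLY_RATE_USD = 0.096   # assumed on-demand rate for one m5.large
HOURS_PER_MONTH = 730     # average hours in a month

monthly_cost = INSTANCE_COUNT * HOURLY_RATE_USD * HOURS_PER_MONTH
print(f"Estimated incremental cost: ${monthly_cost:,.2f}/month")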
Advanced Real-World Scenarios
Example 8: Anomaly Detection
The security and finance teams want alerts when spending deviates significantly from historical trends.
Query:
q "Alert me to any cost anomalies compared to historical patterns"
Response: Detects unusual spikes across services or regions and suggests root causes for investigation.
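This query maps to AWS Cost Anomaly Detection, which you can also poll directly. A minimal boto3 sketch, assuming anomaly monitors are already configured in the account (the 30-day window is illustrative):
import boto3
from datetime import date, timedelta

ce = boto3.client("ce")

# Look back 30 days for anomalies flagged by Cost Anomaly Detection.
end = date.today()
start = end - timedelta(days=30)

response = ce.get_anomalies(
    DateInterval={"StartDate": start.isoformat(), "EndDate": end.isoformat()},
)

for anomaly in response.get("Anomalies", []):
    impact = anomaly["Impact"]["TotalImpact"]
    causes = [cause.get("Service", "unknown") for cause in anomaly.get("RootCauses", [])]
    print(f"Impact ~${impact:.2f} | likely cause(s): {', '.join(causes) or 'n/a'}")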
Example 9: Tag-Based Cost Allocation
The engineering leadership wants to understand project-level spending using tags.
Query:
q "Show me costs grouped by team or project tags"
Response: Provides cost breakdowns based on tags (e.g., team=dev, project=mobile-app), allowing for chargeback and accountability. If no tags are associated with your resources, Amazon Q will still return untagged usage and costs.
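Under the hood this is a Cost Explorer query grouped by a cost-allocation tag. A minimal sketch, assuming a tag key named team has been activated as a cost-allocation tag (the tag key and dates are placeholders):
import boto3

ce = boto3.client("ce")

# Monthly spend grouped by the "team" cost-allocation tag.
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2025-06-01", "End": "2025-07-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "team"}],
)

# Keys look like "team$dev"; untagged usage comes back with an empty value after the "$".
for group in response["ResultsByTime"][0]["Groups"]:
    tag_value = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{tag_value}: ${amount:.2f}")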
Example 10: Predictive Cost Forecasting
A CTO needs to plan quarterly budgets based on current growth trends.
Query:
q "Predict next quarter's AWS costs based on current usage trends"
Response: Provides forecasted costs for the next quarter and identifies potential areas of overspending.
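The same question can be answered with the Cost Explorer forecast API. A minimal boto3 sketch that forecasts roughly the next 90 days from today, month by month:
import boto3
from datetime import date, timedelta

ce = boto3.client("ce")

# Forecast unblended cost for roughly the next quarter.
start = date.today()
end = start + timedelta(days=90)

response = ce.get_cost_forecast(
    TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
    Metric="UNBLENDED_COST",
    Granularity="MONTHLY",
)

print(f"Forecast total: ${float(response['Total']['Amount']):.2f}")
for period in response["ForecastResultsByTime"]:
    print(period["TimePeriod"]["Start"], "mean:", period["MeanValue"])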
Example 11: Compliance & Governance Check
The cloud governance team must ensure all accounts follow cost policies.
Query:
q "Show me any resources violating our cost governance policies"
Response: Lists non-compliant resources such as untagged instances, oversized resources, or unused storage volumes.
Quick Reference Card
Essential Daily Prompts:
q "Cost status check" # Overall health
q "Any alerts today?" # Daily issues to address
q "Top optimization opportunities" # Quick wins
q "Remaining budget this month" # Daily planning
Investigation Prompts:
q "Why did costs spike yesterday?" # Root cause analysis
q "Compare costs: this month vs last" # Trend analysis
q "Show unused resources" # Waste identification
q "Rightsizing recommendations" # Optimization
Planning Prompts:
q "Cost impact of new deployment" # Pre-deployment
q "Forecast next month's bill" # Budgeting
q "ROI of Reserved Instances" # Investment decisions
q "Projected costs for next quarter" # Long-term planning
Your Turn!
What’s your biggest AWS cost pain point? Drop a comment below and I’ll show you the exact MCP prompt to solve it!
Try these starter prompts and share your results:
q "What's my biggest cost optimization opportunity right now?"
q "List cost anomalies detected this week"
q "Which accounts need immediate attention?"
What’s Next?
This is just the beginning. I’m already seeing how AI-driven cost management will transform FinOps:
- Proactive alerts before costs spike
- Automated resource optimization instead of manual rightsizing
- Predictive budgeting based on deployment patterns
- Real-time cost optimization during development
Instead of reacting to surprise bills, we’re building cost-awareness into every decision.
Conclusion: By adopting the AWS Billing MCP Server, I transformed AWS cost management across 47 accounts from a reactive, spreadsheet-heavy process into a proactive, AI-powered workflow that delivers instant insights, prevents budget overruns, and drives continuous optimization. What once required hours of manual effort now takes minutes, with faster reporting, quicker anomaly detection, and smarter rightsizing recommendations. More importantly, it created a cultural shift towards FinOps by design, where cost awareness is embedded into daily standups, CI/CD pipelines, and architecture reviews, turning cloud cost management into a continuous, intelligent, and collaborative process that empowers both engineers and leadership to make smarter, data-driven decisions.
Wrapping Up
Thank you for reading! I hope this article gave you practical insights and a clearer perspective on the topic.
Was this helpful?
Like if it added value
Unicorn if you’re applying it today
Save for your next optimization session
Share with your team
Follow me for more on:
- AWS architecture patterns
- FinOps automation
- Multi-account strategies
- AI-driven DevOps
What’s Next
More deep dives coming soon on cloud operations, GenAI, DevOps, and data workflows. Follow for weekly insights.
Let’s Connect
I’d love to hear your thoughts. Drop a comment or connect with me on LinkedIn.
Happy Learning