# I Built a MindsEye x Google AI Stack in 6 Repos (Without Cloud Credits or API Budget… Yet)



This content originally appeared on DEV Community and was authored by PEACEBINFLOW

## Intro

Over the last few days I’ve been quietly assembling something I’ve wanted for a long time:

A Google-native AI layer where prompts, runs, devlogs, and analytics all live in one shared brain — powered by my MindsEye framework — but implemented as small, focused GitHub repos.

I don’t have:

- Google Cloud budget
- Gemini API credits
- Or a giant infra stack

So I did the next best thing:

- Designed everything using Google Workspace surfaces (Sheets, Docs, Gmail, Forms, etc.)
- Structured the code so it’s cloud-ready the moment I get funding/credits
- Kept everything open in six connected repos.

This post is the overview: what each repo does, how they connect, and where MindsEye fits in the middle.

## The Six Repos (Quick Map)

Here are the six public repos that make up the system:

1. Workspace automation layer
   👉 https://github.com/PEACEBINFLOW/mindseye-workspace-automation/tree/main
2. Ledger (Prompt Evolution Tree + runs)
   👉 https://github.com/PEACEBINFLOW/mindseye-google-ledger
3. Gemini orchestrator
   👉 https://github.com/PEACEBINFLOW/mindseye-gemini-orchestrator
4. Devlog generator (for Dev.to-style posts)
   👉 https://github.com/PEACEBINFLOW/mindseye-google-devlog/tree/main
5. Analytics layer (exports + dashboards)
   👉 https://github.com/PEACEBINFLOW/mindseye-google-analytics/tree/main
6. Workflow atlas (“portal maps”)
   👉 https://github.com/PEACEBINFLOW/mindseye-google-workflows/tree/main

Together, they turn Google Workspace into a kind of MindsEye-powered AI console:

- Workspace events → logged as time-labeled nodes and runs
- Nodes → executed via Gemini (when API access is available)
- Runs → narrated as devlogs + aggregated as analytics
- All flows → described as YAML workflows so nothing is “mystery glue”.

## MindsEye’s Role in All This

MindsEye, in my head, is the cognitive layer:

- It sees events (Gmail, Docs, Drive…)
- It reasons over them (prompt evolution, runs, success rates…)
- It remembers (ledger, devlogs, analytics history).

These repos are basically the Google-native skeleton for MindsEye:

- Time-labeled prompts and experiments (Prompt Evolution Tree)
- Cross-app workflows as “portals” between Google surfaces
- A structure where Gemini/Google AI can later plug in without redesigning everything.

Right now it’s “offline-brain-mode” (no paid API calls), but all the pathways are there.

## 1. mindseye-workspace-automation

Google Workspace as the entry point

🔗 https://github.com/PEACEBINFLOW/mindseye-workspace-automation/tree/main

This repo is where Google Workspace actually touches the system.

The idea:

- Use Apps Script to listen for events in:
  - Gmail (labels, threads)
  - Google Docs (custom menu actions)
  - Drive (folder summaries)
  - Forms (new responses)
- Normalize those events into a common shape:
  - surface (gmail/docs/drive/forms)
  - source_id, source_url
  - title, summary, event_type
- Then send them through a “portal” to the ledger as either:
  - a new prompt node
  - or a new run attached to an existing node.
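Concretely, that common shape is just a small record. Here’s a minimal TypeScript sketch of it, assuming the field names listed above; the type names and the `LedgerWrite` union are my own illustration, not code from the repo:

```ts
// Hypothetical normalized Workspace event. Field names follow the list above;
// everything else (type names, example values) is illustrative.
type Surface = "gmail" | "docs" | "drive" | "forms";

interface WorkspaceEvent {
  surface: Surface;
  source_id: string;   // e.g. a Gmail thread id, Doc id, Drive folder id, or Form response id
  source_url: string;  // deep link back to the originating item
  title: string;
  summary: string;
  event_type: string;  // e.g. "label_added", "menu_action", "form_response"
}

// One event becomes either a new prompt node or a new run on an existing node.
type LedgerWrite =
  | { kind: "node"; event: WorkspaceEvent; parent_node_id?: string }
  | { kind: "run"; event: WorkspaceEvent; node_id: string };
```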

Think of this repo as:

“Whenever something interesting happens in Google Workspace, MindsEye finds out about it.”
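To make that concrete, here’s a rough sketch of what the Docs side could look like, written as TypeScript for clasp. The Workspace services (`DocumentApp`, `SpreadsheetApp`, `Utilities`) are real Apps Script APIs, but the menu label, sheet layout, and `LEDGER_SHEET_ID` constant are assumptions for illustration, not the repo’s actual code:

```ts
const LEDGER_SHEET_ID = "YOUR_LEDGER_SHEET_ID"; // placeholder

// Add a custom menu to the active Doc so a human can push it into the ledger.
function onOpen(): void {
  DocumentApp.getUi()
    .createMenu("MindsEye")
    .addItem("Send this doc to the ledger", "sendDocToLedger")
    .addToUi();
}

// Normalize the Doc into the common event shape and append it as a new node row.
function sendDocToLedger(): void {
  const doc = DocumentApp.getActiveDocument();
  const event = {
    surface: "docs",
    source_id: doc.getId(),
    source_url: doc.getUrl(),
    title: doc.getName(),
    summary: doc.getBody().getText().slice(0, 280),
    event_type: "menu_action",
  };

  const nodesSheet = SpreadsheetApp.openById(LEDGER_SHEET_ID).getSheetByName("nodes");
  if (!nodesSheet) throw new Error("nodes sheet not found in the ledger spreadsheet");
  nodesSheet.appendRow([
    Utilities.getUuid(), // new node_id
    "",                  // parent_node_id (root for now)
    event.surface,
    event.title,
    event.summary,
    event.source_url,
  ]);
}
```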

## 2. mindseye-google-ledger

The Prompt Evolution Tree (PET) + runs

🔗 https://github.com/PEACEBINFLOW/mindseye-google-ledger

This is the core database, and it’s built on plain Google Sheets so it’s cheap and accessible.

It defines:

- nodes sheet → every prompt / idea / variation
- runs sheet → every execution of a node

Plus:

- A Prompt Evolution Tree (PET) schema:
  - node_id, parent_node_id
  - prompt_type, status, tags
  - linked doc_url for long-form prompt docs
- Apps Script that:
  - Handles Forms → ledger (new nodes via Google Forms)
  - Auto-creates Docs per node
  - Keeps everything indexed by node_id.
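If it helps to see the schema as types, here’s a rough TypeScript sketch of one row per sheet. The node columns come from the PET schema above; the run columns (like run_context or model_id) are my assumptions based on how the later repos use the data:

```ts
// One row in the `nodes` sheet: a prompt / idea / variation in the Prompt Evolution Tree.
interface NodeRow {
  node_id: string;
  parent_node_id: string | null; // null/empty for root prompts
  prompt_type: string;
  status: string;
  tags: string;                  // e.g. a comma-separated list inside the cell
  doc_url: string;               // long-form prompt Doc auto-created for this node
}

// One row in the `runs` sheet: a single execution of a node (column names assumed).
interface RunRow {
  run_id: string;
  node_id: string;        // points back at the nodes sheet
  run_context: string;    // which surface triggered it (gmail / docs / forms / …)
  model_id: string;       // used later for "top models by score"
  output_summary: string;
}
```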

Goal:

Make prompt design & experiments first-class, time-labeled data — not random notes in your head.

Everything else (orchestrator, devlog, analytics) treats this ledger as the single source of truth.

## 3. mindseye-gemini-orchestrator

When I do have Gemini access, this runs the show

🔗 https://github.com/PEACEBINFLOW/mindseye-gemini-orchestrator

This repo is the part that reads from the ledger, calls Google AI / Gemini, and logs back the result.

Right now, it’s wired as:

- Node.js + TypeScript skeleton
- Config for:
  - LEDGER_SHEET_ID, LEDGER_NODES_RANGE, LEDGER_RUNS_RANGE
  - GEMINI_MODEL_ID (e.g. gemini-1.5-pro)
- Code divided into:
  - sheets_client.ts → read/write nodes/runs
  - gemini_client.ts → Google AI API interface (future)
  - runner.ts → orchestrate “run node N and log result”
  - index.ts / CLI → run single node or batches.
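Condensed into one function, the run loop would look roughly like this. It’s a sketch under assumptions (column positions in the ledger, the `runNode` name, the run-row layout); the real repo splits this across sheets_client.ts, gemini_client.ts, and runner.ts:

```ts
import { google } from "googleapis";
import { GoogleGenerativeAI } from "@google/generative-ai";

// Hypothetical condensed runner: read one node from the ledger, call Gemini, log a run.
async function runNode(nodeId: string): Promise<void> {
  const auth = new google.auth.GoogleAuth({
    scopes: ["https://www.googleapis.com/auth/spreadsheets"],
  });
  const sheets = google.sheets({ version: "v4", auth });
  const spreadsheetId = process.env.LEDGER_SHEET_ID!;

  // Find the node row (the prompt-text column index is an assumption).
  const nodesRes = await sheets.spreadsheets.values.get({
    spreadsheetId,
    range: process.env.LEDGER_NODES_RANGE ?? "nodes!A2:F",
  });
  const node = (nodesRes.data.values ?? []).find((row) => row[0] === nodeId);
  if (!node) throw new Error(`node ${nodeId} not found in the ledger`);
  const promptText = String(node[3]);

  // Call Gemini — only possible once an API key / service account exists.
  const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);
  const model = genAI.getGenerativeModel({
    model: process.env.GEMINI_MODEL_ID ?? "gemini-1.5-pro",
  });
  const result = await model.generateContent(promptText);

  // Append the run back to the ledger so devlogs/analytics can see it.
  await sheets.spreadsheets.values.append({
    spreadsheetId,
    range: process.env.LEDGER_RUNS_RANGE ?? "runs!A2:E",
    valueInputOption: "RAW",
    requestBody: {
      values: [[
        `run_${Date.now()}`,
        nodeId,
        "cli",
        process.env.GEMINI_MODEL_ID ?? "gemini-1.5-pro",
        result.response.text(),
      ]],
    },
  });
}
```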

Because I don’t have paid Gemini access yet, this is:

Designed first, powered later — but the interface and data shape are ready.

The moment I can connect a service account + Gemini key, this becomes the execution engine.

## 4. mindseye-google-devlog

Dev.to-style logs straight from the ledger

🔗 https://github.com/PEACEBINFLOW/mindseye-google-devlog/tree/main

This repo’s entire job is to turn time windows of ledger activity into narrative devlogs.

It:

- Reads nodes + runs from the ledger (via Sheets API)
- Groups runs by node_id
- Builds a DevlogData structure with:
  - run counts
  - contexts (gmail, docs, forms, etc.)
  - first/last run time
  - sample outputs/notes
- Renders markdown using a Handlebars template ready for Dev.to.
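As a sketch, DevlogData plus the render step could look like this. The field names beyond the ones listed above (and the template itself) are illustrative guesses, not the repo’s actual template:

```ts
import Handlebars from "handlebars";

// Rough shape of the DevlogData described above; names are partly assumed.
interface NodeDigest {
  node_id: string;
  run_count: number;
  contexts: string[];      // e.g. ["gmail", "docs", "forms"]
  first_run_at: string;
  last_run_at: string;
  sample_outputs: string[];
}

interface DevlogData {
  window_start: string;
  window_end: string;
  total_runs: number;
  nodes: NodeDigest[];
}

// A minimal Dev.to-flavoured Handlebars template (illustrative, not the repo's).
const template = Handlebars.compile<DevlogData>(`
# MindsEye devlog: {{window_start}} → {{window_end}}

{{total_runs}} runs across {{nodes.length}} prompt nodes.

{{#each nodes}}
## {{node_id}} ({{run_count}} runs)
Contexts: {{#each contexts}}{{this}} {{/each}}
{{/each}}
`);

export function renderDevlog(data: DevlogData): string {
  return template(data); // markdown ready to paste into Dev.to
}
```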

Optionally, when Gemini access is available, it can ask Gemini to write a summary intro for the week’s activity.

So instead of manually tracking everything, I can auto-generate posts like:

“Here’s what MindsEye experimented with this week across Gmail, Docs, and Devlogs.”

## 5. mindseye-google-analytics

Metrics, charts, and dashboards over PET + runs

🔗 https://github.com/PEACEBINFLOW/mindseye-google-analytics/tree/main

This repo turns the ledger into numbers and graphs.

It includes:

- Sample exports:
  - exports/nodes_sample_export.csv
  - exports/runs_sample_export.csv
- Python scripts:
  - compute_stats.py →
    - total_nodes, total_runs, avg_runs_per_node
    - success rate per prompt_type
    - top models by score
    - surface usage (run_context)
  - generate_charts.py →
    - runs per day
    - runs by prompt_type
    - runs by run_context
- Dashboard docs:
  - How to wire it all into Looker Studio using Sheets as the source
  - KPI definitions (kpi_definitions.md).
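The actual scripts are Python, but the aggregation itself is simple enough to sketch in a few lines. Here’s the same idea in TypeScript, assuming the sample exports have node_id, prompt_type, and a status column with a "success" value (all assumptions about the CSV layout):

```ts
import { readFileSync } from "node:fs";
import { parse } from "csv-parse/sync";

// Illustrative re-statement of what compute_stats.py does, over the sample exports.
const nodes = parse(readFileSync("exports/nodes_sample_export.csv"), { columns: true }) as Record<string, string>[];
const runs = parse(readFileSync("exports/runs_sample_export.csv"), { columns: true }) as Record<string, string>[];

const totalNodes = nodes.length;
const totalRuns = runs.length;
const avgRunsPerNode = totalNodes ? totalRuns / totalNodes : 0;

// Success rate per prompt_type: join each run back to its node's prompt_type.
const promptTypeByNode = new Map(nodes.map((n) => [n.node_id, n.prompt_type]));
const perType = new Map<string, { total: number; ok: number }>();
for (const run of runs) {
  const type = promptTypeByNode.get(run.node_id) ?? "unknown";
  const bucket = perType.get(type) ?? { total: 0, ok: 0 };
  bucket.total += 1;
  if (run.status === "success") bucket.ok += 1; // assumed status values
  perType.set(type, bucket);
}

console.log({ totalNodes, totalRuns, avgRunsPerNode });
for (const [type, { total, ok }] of perType) {
  console.log(`${type}: ${((100 * ok) / total).toFixed(1)}% success over ${total} runs`);
}
```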

This is the “how is the system evolving?” lens:
are prompts getting better, which surfaces are busy, which models behave best, etc.

## 6. mindseye-google-workflows

The atlas: “portal maps” for the whole system

🔗 https://github.com/PEACEBINFLOW/mindseye-google-workflows/tree/main

This repo barely runs any code of its own; its job is to define how everything connects.

It includes:

- workflows/*.yaml files like:
  - 00_overview.yaml → high-level system map
  - workspace_event_to_ledger.yaml → how Gmail/Docs/Drive/Forms become nodes/runs
  - ledger_to_gemini.yaml → node → Gemini → run
  - devlog_generation.yaml → time window → markdown devlog
  - analytics_refresh.yaml → nightly exports → stats + charts
- portal_routes.yaml, with canonical “portals” like:
  - workspace_event_to_ledger
  - ledger_to_gemini
  - ledger_to_devlog
  - ledger_to_analytics
- Mapping files:
  - mappings/repos.yaml → logical component → GitHub repo
  - mappings/google_apps.yaml → logical role → Google app
- Python helpers:
  - validate_workflows.py → check workflow YAMLs are valid + portals consistent
  - visualize_workflows.py → generate Mermaid diagrams from workflows.
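For flavour, the consistency check can be thought of as “every workflow’s portal must exist in portal_routes.yaml”. The real helper is Python and I’m guessing at the YAML schema here (a top-level `portal` key on workflows, a `portals` list in the routes file), so treat this TypeScript version purely as a sketch of the idea:

```ts
import { readFileSync, readdirSync } from "node:fs";
import { join } from "node:path";
import { load } from "js-yaml";

// Assumed schemas — NOT the repo's actual field names.
interface PortalRoutes { portals: { name: string }[] }
interface Workflow { portal?: string }

const routes = load(readFileSync("workflows/portal_routes.yaml", "utf8")) as PortalRoutes;
const knownPortals = new Set(routes.portals.map((p) => p.name));

for (const file of readdirSync("workflows")) {
  if (!file.endsWith(".yaml") || file === "portal_routes.yaml") continue;
  const workflow = load(readFileSync(join("workflows", file), "utf8")) as Workflow;
  if (workflow.portal && !knownPortals.has(workflow.portal)) {
    throw new Error(`${file} references unknown portal "${workflow.portal}"`);
  }
}
console.log("all workflow portals resolve against portal_routes.yaml");
```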

Basically:

This repo is the atlas. The others are the continents.

## Why Google’s Ecosystem? (And What’s Missing)

I intentionally built this on Google Workspace + Google AI because:

- Sheets, Docs, Gmail, Forms, Drive, Calendar → already where teams live
- Devs can easily extend with Apps Script, Node.js, Zapier, etc.
- Looker Studio + Sheets → low-cost dashboards
- Gemini → natural fit for prompt/run experimentation once I have access

What I don’t currently have:

- Cloud infra money (a funded Google Cloud project, App Engine/Cloud Run hosting, etc.)
- Paid Gemini API access / higher-tier free credits

So right now, the stack runs mostly as:

- GitHub repos + local scripts
- Google Workspace + Apps Script
- A clear blueprint for when funding/API keys show up.

But the important part is done:

- The data model exists
- The workflows are described
- The repos are live and open.

