This content originally appeared on DEV Community and was authored by Pavel
Sound familiar? You ask an LLM to write a function and get back brittle code with a missing await
or a security hole.
LLMs excel at high-level thinking—architecture, structure, and data flow. But they stumble on implementation details. Forcing them to write code is using them for the wrong job.
So, let’s change the approach. Instead of asking an LLM HOW to do something, let’s give it a tool to describe WHAT needs to be done.
## Meet Serverokey: A Declarative Engine for Node.js
Serverokey is an engine that lets your LLM fill out a simple, clear blueprint, a single `manifest.js` file, instead of writing code.
The core idea: Your LLM is the architect, not the bricklayer.
Stop prompting like this:
Ineffective: “Write an Express route that connects to the DB, fetches a user, checks if they’re an admin, and then renders the admin dashboard.”
Start prompting like this:
Effective: “In `manifest.js`, describe a `GET /admin` route. Secure it with `auth: { required: true, role: 'admin' }`. It should read from the ‘users’ connector and render the ‘adminPanel’ component.”
See the difference? We’re talking about structure, not implementation.
## How It Works
### 1. All Logic in One File (`manifest.js`)
Your entire application—from routes and security to UI and database connections—is described in one place. This keeps the LLM in context and you in control.
```javascript
// manifest.js
module.exports = {
  connectors: { users: { type: 'wise-json', collection: 'users' } }, // Data sources
  components: { adminPanel: { template: 'admin-panel.html' } },      // UI components
  routes: {
    'GET /admin': {
      type: 'view',
      auth: { required: true, role: 'admin' }, // 🛡 Built-in protection
      reads: ['users'],                        // What data to fetch
      render: { 'panel': 'adminPanel' },       // What to render & where
    },
  },
};
```
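Because the whole app lives in one declarative object, it is easy to check mechanically. As a rough illustration (this is a hypothetical sketch, not Serverokey's actual validator code), a validator can walk the manifest and confirm that every route only references connectors and components that were actually declared:

```javascript
// Hypothetical sketch of manifest validation — the function name and error
// format are illustrative, not Serverokey's real API.
function validateManifest(manifest) {
  const errors = [];
  const connectors = Object.keys(manifest.connectors || {});
  const components = Object.keys(manifest.components || {});

  for (const [route, config] of Object.entries(manifest.routes || {})) {
    // Every 'reads' entry must name a declared connector.
    for (const read of config.reads || []) {
      if (!connectors.includes(read)) {
        errors.push(`${route}: unknown connector '${read}'`);
      }
    }
    // Every render target must name a declared component.
    for (const component of Object.values(config.render || {})) {
      if (!components.includes(component)) {
        errors.push(`${route}: unknown component '${component}'`);
      }
    }
  }
  return errors;
}

// A manifest with a typo in its render target is caught before the server runs:
const errors = validateManifest({
  connectors: { users: { type: 'wise-json', collection: 'users' } },
  components: { adminPanel: { template: 'admin-panel.html' } },
  routes: {
    'GET /admin': { type: 'view', reads: ['users'], render: { panel: 'adminPanle' } },
  },
});
console.log(errors); // ["GET /admin: unknown component 'adminPanle'"]
```

This kind of static check is exactly what an LLM-authored blueprint benefits from: a hallucinated connector name fails loudly at startup instead of silently at runtime.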
### 2. Logic as a Simple Recipe (`steps`)
Instead of spaghetti code, you describe logic as a clear sequence of steps. This isn’t code generation; it’s architecture generation.
```javascript
// Instead of this:
app.post('/action/addItem', async (req, res) => { /* ...a lot of buggy code... */ });

// You describe this:
'POST /action/addItem': {
  type: 'action',
  steps: [ // A clear sequence of steps
    { 'set': 'context.product', 'to': 'data.items.find(p => p.id == body.id)' },
    { 'set': 'data.receipt.items', 'to': 'data.receipt.items.concat([context.product])' },
  ]
}
```
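To make the recipe idea concrete, here is a minimal sketch of how such a step list could be executed. This is not Serverokey's actual interpreter; the `runSteps` function and the scope shape are assumptions for illustration. Each `to` expression is compiled with `new Function`, which only sees the names explicitly passed in (`data`, `context`, `body`), so a step has no path to `require`, `fs`, or `process`:

```javascript
// Hypothetical step interpreter — illustrative only.
function runSteps(steps, scope) {
  for (const step of steps) {
    // Compile the 'to' expression; it can only see data, context, and body.
    const evaluate = new Function('data', 'context', 'body', `return (${step.to});`);
    const value = evaluate(scope.data, scope.context, scope.body);

    // Walk the dotted path in 'set' and assign the result.
    const path = step.set.split('.');
    let target = scope;
    while (path.length > 1) target = target[path.shift()];
    target[path[0]] = value;
  }
  return scope;
}

const scope = {
  context: {},
  body: { id: 2 },
  data: {
    items: [{ id: 1, name: 'Tea' }, { id: 2, name: 'Coffee' }],
    receipt: { items: [] },
  },
};

runSteps(
  [
    { set: 'context.product', to: 'data.items.find(p => p.id == body.id)' },
    { set: 'data.receipt.items', to: 'data.receipt.items.concat([context.product])' },
  ],
  scope
);

console.log(scope.data.receipt.items); // [{ id: 2, name: 'Coffee' }]
```

The point is that the LLM never writes this interpreter; it only emits the small, checkable step objects, and the engine owns the execution semantics.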
### 3. Built-in Safeguards & a Sandbox
- **Validator:** checks your `manifest.js` for logical errors before you run.
- **Resilient:** minor errors (like a null reference) won’t crash your server.
- **Secure:** the LLM has no direct access to `fs` or other dangerous Node.js APIs.
### 4. Zero-JavaScript Interactivity

Add special `atom-*` attributes to your HTML, and the engine brings your page to life. No client-side JS required.
```html
<!-- This single line creates an interactive search input -->
<input type="text"
       atom-action="POST /action/search"
       atom-target="#search-results"
       atom-event="keyup"
>

<div id="search-results">
  <!-- The server will automatically re-render this block -->
</div>
```
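Conceptually, the engine's injected runtime just has to read those three attributes and wire them together: listen for `atom-event`, call `atom-action`, and re-render `atom-target` with the server's response. As a rough sketch (the `parseAtomAttributes` helper is hypothetical, not part of Serverokey's API), the attributes reduce to a plain descriptor:

```javascript
// Hypothetical sketch: extract atom-* attributes from a tag into a descriptor
// that a client runtime could act on. Illustrative only.
function parseAtomAttributes(tag) {
  const descriptor = {};
  const pattern = /atom-(\w+)="([^"]*)"/g;
  let match;
  while ((match = pattern.exec(tag)) !== null) {
    descriptor[match[1]] = match[2]; // e.g. descriptor.action = 'POST /action/search'
  }
  return descriptor;
}

const tag =
  '<input type="text" atom-action="POST /action/search" ' +
  'atom-target="#search-results" atom-event="keyup">';

console.log(parseAtomAttributes(tag));
// { action: 'POST /action/search', target: '#search-results', event: 'keyup' }
```

From the descriptor, the runtime can attach a `keyup` listener that POSTs to `/action/search` and swaps the server-rendered HTML into `#search-results`, which is why no hand-written client-side JS is needed.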
## Try It Yourself!
I built a full point-of-sale example app to showcase these features.
```bash
# 1. Clone the repo
git clone https://github.com/Xzdes/serverokey.git
cd serverokey

# 2. Install dependencies
npm install

# 3. Run the example
cd packages/kassa-app-example
npm run dev
```
Now open http://localhost:3000 (login: `kassir` / password: `123`).
## The Future is Declarative
For a huge class of web apps (admin panels, CRMs, internal tools), this is a more pragmatic way for humans and AI to collaborate. We leverage the LLM’s strengths while mitigating its weaknesses.
GitHub: https://github.com/Xzdes/serverokey