This content originally appeared on DEV Community and was authored by Gábor Maksa
I’ve always been cautious about clicking shortened URLs. Sometimes they feel sketchy, and I want to know where they lead before committing. Sure, you could use curl or third-party “unshortener” websites, but that’s clunky — and it comes with a subtle problem: these tools actually register a visit on the shortener itself. The person who created the link thinks someone clicked it, even though nobody really did.
That little frustration got me thinking: what if the shortener itself let you safely peek at the destination URL, without counting it as a click?
And that’s how Trails was born: a simple, privacy-first URL shortener that gives users transparency and control, while respecting their privacy.
Before I dive into the story of how I built it, here are the core principles that guided its design:
- Transparency — Peek URLs safely; see creation date, expiry, and visit counts. Trails are immutable to maintain trust.
- No accounts — Each Trail has a unique token; owning the token lets you delete or manage it.
- Privacy — IPs are never stored raw — only hashed using SHA-256, so identities remain safe.
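To make the privacy principle concrete, here is a minimal sketch of what hashing a visitor IP with SHA-256 might look like. The salt value and function name are illustrative assumptions, not the actual Trails implementation:

```python
import hashlib

def hash_ip(ip: str, salt: str = "trails-demo-salt") -> str:
    """Return a SHA-256 digest of the visitor IP.

    The salt here is a made-up example; a real service would keep it
    secret and might mix in other request data before hashing.
    """
    return hashlib.sha256(f"{salt}:{ip}".encode()).hexdigest()

# The same IP always produces the same digest, so unique-visit counting
# still works without ever storing the raw address.
digest = hash_ip("203.0.113.7")
print(len(digest))  # 64 hex characters
```

Because the digest is deterministic, two visits from the same address can still be deduplicated, while the raw IP never touches the database.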
Getting the motivation
Most of my earlier projects (none of which ever saw daylight) were built using either Starlette or axum. While I gained valuable experience, none of them were truly complete or polished. As an ambitious developer, this was frustrating, especially when I saw job descriptions on LinkedIn that consistently asked for hands-on projects.
It was then that I noticed a common theme: many Python-related roles specifically requested experience with FastAPI. I had an idea: what if I took the project I had already started and rebuilt it using FastAPI and Next.js — two of the most in-demand frameworks — while also focusing on high-quality documentation? I started building again, this time with a clear goal in mind.
Laying down the foundations
A URL shortener service is quite simple: take in a URL, generate a unique (preferably short) identifier, store both of them in a database, and whenever someone tries to access the short identifier, just redirect them to the long URL — that’s it.
I just needed to clarify a few things:
- Which database to use?
- Which framework to use?
- How to write a README and documentation, and how do I even launch a project?
- Where to host it?
Database
Choosing the database was straightforward: PostgreSQL. I have the most experience with it, it is very easy to spin up, and it is widely trusted and open-source (I guess the last two go hand in hand).
I’ve always found Docker Compose to be a lifesaver for quickly setting up the services an application needs. For Trails, I used a simple `docker-compose.yml` file to spin up our PostgreSQL database. This keeps the setup process clean and consistent.
```yaml
services:
  db:
    image: postgres:latest
    container_name: trails_db
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: trails
    ports:
      - "5577:5432"
```
- `image: postgres:latest` — Pulls the official PostgreSQL image.
- `container_name: trails_db` — Names the container for easy reference.
- `environment` — Sets up the database user, password, and name.
- `ports` — Maps the default PostgreSQL port (5432) inside the container to port 5577 on my local machine to avoid conflicts with other instances.
To get the database up and running, you just need a single command:

```shell
docker compose up -d
```
And then connect to it using the specified port and credentials:

```shell
psql -h 127.0.0.1 -p 5577 -U postgres -d trails
```
Backend
My usual approach for a Python web API is to manually combine libraries like Starlette, Pydantic, and Uvicorn. I realized FastAPI was a powerful all-in-one solution that bundled these tools, allowing me to focus on the core logic.
I just had to define my routes and their resolvers, and the application was ready to go. On top of that, FastAPI automatically generates comprehensive documentation from the Pydantic models I use for data validation. With just a few descriptions added to the code, I had a fully documented API that implements the OpenAPI standard, without any extra effort.
Frontend
My primary passion is backend development, but I wanted to build a user-friendly frontend to complete this project. Given my experience with React, Next.js was the obvious choice due to its excellent documentation and ease of use.
Getting started
The start was easy; I just needed to run the following command:

```shell
npx create-next-app@latest
```
and go through the prompts.
The routing was also straightforward; I just needed this file structure:

```
src
├── app
│   ├── about
│   │   └── page.tsx
│   ├── contact
│   │   └── page.tsx
│   ├── error.tsx
│   ├── globals.css
│   ├── info
│   │   ├── page.tsx
│   │   └── [trailid]
│   │       ├── DeleteButton.tsx
│   │       └── page.tsx
│   ├── layout.tsx
│   ├── page.tsx
│   ├── peek
│   │   ├── page.tsx
│   │   └── [trailid]
│   │       └── page.tsx
│   └── saved
│       └── page.tsx
```
Next.js maps these files to routes automatically, with built-in dynamic routing for the `[trailid]` segments.
The issue with environment variables
There was only one issue that I still have not found the best solution for — environment variables. I wanted the website to support both development and production environments without the need to rebuild the application.
By default, you can use environment variables with Next.js, but client-exposed `NEXT_PUBLIC_` variables are only available at build time. This means that the following code will not work as expected:
```tsx
export default function Page() {
  return (
    <span>{process.env.NEXT_PUBLIC_API_URL}</span>
  )
}
```
When we look at this code and see `process.env`, we could assume that the environment variable will be available at runtime, but it is not. When we build the application, Next.js replaces `process.env.*` with the actual value of that environment variable, making it static.
This caused some problems while deploying the application. A workaround I found is to create an API route that returns the environment variables I need (as a config), and then fetch them from the frontend.
- **Pros**
  - Development and production environments can use different configurations without rebuilding the application.
  - I can later extend the API to return more configuration options if needed.
- **Cons**
  - It adds an extra API call to fetch the configuration (although it is cached, so it is not a big deal).

Seems like a good trade-off to me, so I went with it.
```ts
// src/app/api/config/route.ts
import { AppConfig } from "@/config";
import { NextResponse } from "next/server";

export async function GET() {
  return NextResponse.json({
    apiUrl: process.env.NEXT_PUBLIC_API_URL,
  } as AppConfig);
}
```
I can fetch and cache it using a helper function:
```ts
// src/config.ts
// some code before...

let appConfig: Promise<AppConfig> | null = null;

export const getAppConfig = async (): Promise<AppConfig> => {
  if (appConfig !== null) return appConfig;

  if (isServerSide()) {
    const apiUrl = process.env.NEXT_PUBLIC_API_URL;
    if (!apiUrl) {
      throw new Error(
        "NEXT_PUBLIC_API_URL is not defined in the environment variables.",
      );
    }
    return {
      apiUrl,
    } as AppConfig;
  }

  appConfig = fetch("/api/config")
    .then((res) => res.json())
    .then((data) => data as AppConfig)
    .catch((error) => {
      console.error("Failed to fetch app config:", error);
      throw error;
    });
  return appConfig;
};
```
Type safety to the max
If you are curious about how I managed to write a 100% type-safe API client for the Next.js frontend leveraging Trails’ OpenAPI spec, a detailed article is on the way.
Automation
I believe that automation — when implemented strategically — can be a game-changer. For Trails, I designed a Continuous Integration (CI) pipeline using GitHub Actions to streamline the development process and ensure reliability.
This workflow automates two key tasks:
- On every push, it runs all tests to catch regressions early.
- On every new tag, it automates the release process by creating a new GitHub release with notes from the `CHANGELOG.md`, then building and pushing the latest Docker image to our container registry.
You can check out the full configuration here.
Hosting
Deployment
After building Trails, I knew it deserved a more robust and professional home than my personal Raspberry Pi. I started looking for a hosting service that could handle my requirements.
There was not really much that I needed:
- Docker container support — as both the backend and the frontend are containerized.
- First-class PostgreSQL support — so I can easily make backups and manage the database. This one is not a must, as PostgreSQL has docker images, but it is a nice-to-have.
And then Railway caught my attention. It is a platform that has both of my requirements, and has a free starting tier. I signed up, and started deploying the application.
Deployment was a no-brainer. I created 3 services:
- PostgreSQL — I just had to select the option, and it already created a PostgreSQL instance for me, with a connection string that I could use in my application.
- Backend — I provided the Docker image URL, and it automatically pulled the image and started the container.
- Frontend — same as the backend.
Domain
In the past, I’ve used simpler domain registrars for my personal projects, but for Trails, I wanted something that offered a more robust set of features. I chose Cloudflare for its powerful DNS management and security benefits, which made it a perfect candidate.
With Railway’s custom domain setup guide holding my hand, I added a CNAME record to my domain’s DNS settings. This tells Cloudflare to route all traffic for `trls.link` and `api.trls.link` to their respective services running on Railway.
The Future of Trails
Now that Trails has come to life, I am focused on its next phase of development. The backend is solid and the core features are functional, but the project still needs a lot of love, particularly on the frontend. My goal is to add more advanced features that build on the project’s core principles of transparency and privacy.
More Precise Metrics
Currently, users can see the total and unique number of visits for each Trail. However, I want to provide a more precise way to see this data: timeframes. Implementing it is not a big issue, as I already store the timestamp for each visit; I just need to aggregate that data and return it in a user-friendly way.
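The aggregation idea can be sketched in a few lines. In production this would more likely be a SQL `GROUP BY` over a truncated timestamp; the Python version below just shows the same bucketing logic and is not the actual Trails code:

```python
from collections import Counter
from datetime import datetime

def visits_per_day(timestamps: list[datetime]) -> dict[str, int]:
    """Bucket raw visit timestamps into per-day counts keyed by ISO date."""
    return dict(Counter(ts.date().isoformat() for ts in timestamps))

visits = [
    datetime(2025, 8, 1, 9, 30),
    datetime(2025, 8, 1, 17, 5),
    datetime(2025, 8, 2, 12, 0),
]
print(visits_per_day(visits))  # {'2025-08-01': 2, '2025-08-02': 1}
```

Swapping `ts.date().isoformat()` for an hour or week key gives the other timeframes for free.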
Trending Trails
Trails is not a social media platform — so why would I want to show “trending” Trails? Well, why not? People are curious, and it would be fun to see which Trails are currently popular. I could implement a simple algorithm that calculates a score for each Trail based on the last 24 hours of unique visits, how well the unique visits are distributed over time, and some other things I find interesting. We will see how it goes, but I think it would be a weird yet fun feature to have that helps Trails stand out.
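One hypothetical way to combine "recent unique visits" with "how well they are distributed over time" is to boost the raw count by the number of distinct hours that saw activity. This formula is purely illustrative, not a committed design:

```python
from datetime import datetime, timedelta

def trending_score(visit_times: list[datetime], now: datetime) -> float:
    """Hypothetical score: 24-hour visit volume, boosted when the visits
    are spread across many different hours rather than one burst."""
    recent = [t for t in visit_times if now - t <= timedelta(hours=24)]
    if not recent:
        return 0.0
    distinct_hours = len({t.replace(minute=0, second=0, microsecond=0) for t in recent})
    spread = distinct_hours / 24  # 1.0 means activity in every single hour
    return len(recent) * (1 + spread)
```

With this scoring, four visits spread over four hours outrank four visits packed into one, which matches the intuition that steady interest is more "trending" than a single spike.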
Dynamic Trails
The Idea
If the previous feature idea was not exciting enough, what about Trails with dynamic URL resolution? Okay, this might sound like something that would completely destroy the main purpose of this service: being transparent — but hear me out.
Imagine a Trail that redirects to a different URL based on the time of day, current date or some header value. For example:
```
*Click* --> [Is it before 20th August 2025?] -----YES-----> https://some-website.com/giveaway/
                          |
                          +-----------------NO------------> https://some-website.com/news/
```
This is a simple example, but you can imagine more complex scenarios. Still, the question remains: How would I make it transparent?
Potential Solution
- Create a new endpoint that clearly shows the user that the Trail they are about to visit is dynamic. The static Trails would remain `/t/{trailid}`, while the dynamic ones would be `/td/{trailid}`.
- Adjust the peek endpoint to return a graph of the possible destinations (so that users can examine every possible outcome), as well as which conditions are met and what the destination would be for the current request, for quick decision-making.
- Conditions must be one of a few predefined ones, such as:
  - Time of day
  - Current date
  - Some header value (e.g. `User-Agent`, `Referer`, etc.)
  - And maybe more
- Limit the number of conditions and/or destinations to ensure the service remains performant while keeping the transparency intact (or at least as much as possible).
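The transparency argument in the list above hinges on conditions being data rather than arbitrary code, so the peek endpoint can enumerate every outcome. A minimal sketch of that idea, with an entirely made-up rule structure matching the giveaway example from the diagram:

```python
from datetime import date

def resolve_dynamic(rules, default: str, today: date) -> str:
    """Walk an ordered list of (condition, destination) rules;
    the first condition that holds wins, otherwise fall back."""
    for condition, destination in rules:
        if condition(today):
            return destination
    return default

# The giveaway example: before 20th August 2025, redirect to the giveaway.
giveaway_trail = [
    (lambda d: d < date(2025, 8, 20), "https://some-website.com/giveaway/"),
]

print(resolve_dynamic(giveaway_trail, "https://some-website.com/news/", date(2025, 8, 10)))
# https://some-website.com/giveaway/
```

Because the rule list is plain data, the peek endpoint could serialize it directly into the "graph of possible destinations" without executing anything on the user's behalf.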
And there it is — Trails was born. Alive, breathing, and ready to test a new idea in the world of URL shorteners. It all started as a way to prove my skills, but now I see it as a potential tool that can help people navigate the web more safely and transparently. It is still in its early days, but I am excited to see where it goes.
I invite you to explore Trails for yourself or dive into the full source code on GitHub. If you are curious about the deeper technical details regarding the API client of the Next.js frontend, stay tuned for an upcoming article that will cover how I achieved 100% type safety using Trails’ OpenAPI spec.