Building a Serverless Patient Appointment Portal Backend with an Entire Azure Architecture

The problem statement was:

  • Patients struggled to schedule appointments with specialists,
  • Accessing past medical records was inconvenient, and
  • Missed or delayed care often happened due to the lack of timely reminders.

To address this, I had to build a cloud-native web app that automates appointment booking, medical record management, and notifications.


This workshop assignment turned into a hands-on exploration of how different Azure services work together in the real world. When I started this project, I was a complete novice to Azure, but by the end I had stitched together a working patient portal backend using Azure Functions, SQL Database, Blob Storage, and Communication Services.

So this is a step-by-step guide to building a fully working backend using only Azure resources: a secure, scalable, and practical solution.

Workflow

Step 1: Set Up Azure Student Account

Azure gives students $100 of free credit. Sign up with your personal email, then verify with your college email. Once verified, there are no other steps; you can use any Azure resource within the $100 limit.

Once your account is active, create a resource group to organise all your Azure resources.
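If you prefer the command line, the same can be done with the Azure CLI (a sketch; the group name and region are placeholders, and your student subscription may restrict the region, as noted below):

# Create a resource group to hold everything built in this guide
az group create --name YOUR_RESOURCE_GROUP_NAME --location eastus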

Note that a student subscription restricts deployment to certain regions. You can check this under:
Your Subscription → Policies → Compliance → Allowed resource deployment regions

Step 2: Create an Azure Function App for APIs

The Function App will act as the backend for our portal, providing API endpoints to manage patients, doctors, and appointments. We’ll use Azure Functions because it’s serverless, cost-effective, and automatically scales.

Create an Azure Function App:

  1. Go to the Azure Portal → Click Create a resource → Search for Function App.
  2. Choose a Consumption Plan (pay-per-execution, best for small apps).
  3. Click Create and provide:
    – Subscription & Resource Group (use the same as your SQL Database)
    – Function App name (must be unique globally)
    – Runtime stack: Node.js or Python (choose based on preference)
    – Region: same as your SQL DB for low latency
  4. Click Review + Create, then Create.
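A rough CLI equivalent is below (a sketch with placeholder names; the storage account must already exist, since every Function App needs one):

# Create a Function App on the Consumption plan with the Node.js runtime
az functionapp create \
  --name YOUR_FUNCTION_APP_NAME \
  --resource-group YOUR_RESOURCE_GROUP_NAME \
  --storage-account YOUR_STORAGE_ACCOUNT_NAME \
  --consumption-plan-location eastus \
  --runtime node \
  --functions-version 4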

Set up Local Development Environment:

Create a folder for the entire project and open the terminal in VS Code:

# Install Azure Functions Core Tools
npm install -g azure-functions-core-tools@4 --unsafe-perm true

Create a backend folder and install the dependencies (the identity, storage, and email packages are needed by the code in the later steps):

# In the backend folder
npm init -y
npm install mssql @azure/functions @azure/identity @azure/storage-blob @azure/communication-email

Folder structure:

I added this here for a better understanding of the next sections.
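This tree is a sketch reconstructed from the files created in the coming steps; your layout may differ slightly.

backend/
├── host.json               # Functions host configuration
├── local.settings.json     # local-only settings and secrets (never committed)
├── package.json
└── src/
    ├── db.js               # SQL connection helper (Step 4)
    ├── storage.js          # Blob Storage helper (Step 6)
    └── functions/
        └── app.js          # all HTTP trigger endpoints (Step 5)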

Step 3: Create Azure SQL Database & Server

I also needed a SQL database to store data, and a SQL server to host that database.

Create an Azure SQL server:

  1. Go to the Azure Portal → Click Create a resource.
  2. Search for SQL Server and select Create.
  3. Provide:
    – Subscription and resource group (the same as before)
    – Server name (must be unique globally)
    – Region (choose the same region as your Function App for better performance)
    – Authentication method: I went with Microsoft Entra-only authentication, but you can choose any.
    – Microsoft Entra admin
  4. Click Review + Create, then Create.

Create an Azure SQL Database:

  1. After the server is created, click Create a resource → Search for SQL Database.
  2. Select the previously created SQL Server.
  3. Set:
    – Database name (e.g., appointmentDB)
    – SQL elastic pool: all databases within an elastic pool share a common allocation of resources (CPU, memory, and storage), on the assumption that not all databases will need peak resources at the same time. This makes it a cost-effective option.
    – Backup storage redundancy: protection against data loss. Locally-redundant storage (LRS) keeps copies in a single data centre, zone-redundant storage (ZRS) protects against zone outages within the primary region, and geo-redundant storage (GRS) replicates data to a secondary region for disaster recovery. The best option depends on your budget and durability requirements: LRS is the cheapest but least resilient, while GRS offers the highest durability against large-scale disasters.
  4. Click Review + Create, then Create.

To allow your Function App and local machine to access the DB:

  1. Go to the SQL Server → Networking.
  2. Under Firewall and virtual networks, add:
    – Your IP address (for local testing)
    – Enable Allow Azure services and resources to access this server.

Once the database is ready, you can use the Query Editor in the Azure Portal. Alternatively, I added the SQL Server extension to VS Code, connected to my database through Microsoft Entra authentication, and ran all my queries from within my project folder.
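For reference, here is a minimal sketch of the two tables used by the code in this guide. The column names come from the queries in the later steps; the exact types, keys, and constraints are my assumptions:

-- Patients (columns inferred from the API code in Step 5)
CREATE TABLE Patients (
    patient_id    INT IDENTITY(1,1) PRIMARY KEY,
    name          NVARCHAR(100) NOT NULL,
    email         NVARCHAR(255) NOT NULL,
    password_hash NVARCHAR(255) NOT NULL,
    username      NVARCHAR(50)  NOT NULL UNIQUE,
    phone         NVARCHAR(20),
    date_of_birth DATE,
    gender        NVARCHAR(20),
    created_at    DATETIME2 DEFAULT SYSUTCDATETIME()
);

-- MedicalRecords (used by the upload endpoint in Step 6)
CREATE TABLE MedicalRecords (
    record_id  INT IDENTITY(1,1) PRIMARY KEY,
    patient_id INT NOT NULL REFERENCES Patients(patient_id),
    file_name  NVARCHAR(255) NOT NULL,
    blob_name  NVARCHAR(400) NOT NULL,
    file_size  BIGINT,
    mime_type  NVARCHAR(100),
    created_at DATETIME2 DEFAULT SYSUTCDATETIME()
);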

Step 4: Database connection to function app

Create a file within the src folder of the function app (db.js). Here is a snippet of my DB connection code.

const sql = require('mssql');
const { DefaultAzureCredential, ManagedIdentityCredential } = require('@azure/identity');

const config = {
  server: '<your-sql-server-name>.database.windows.net',
  database: '<your-database-name>',
  options: { encrypt: true }
};

// Get a Microsoft Entra access token for Azure SQL
async function getAccessToken() {
  const credential = process.env.WEBSITE_SITE_NAME
    ? new ManagedIdentityCredential()  // running in Azure: use the Function App's managed identity
    : new DefaultAzureCredential();    // local dev: falls back to az login / VS Code credentials
  const tokenResponse = await credential.getToken('https://database.windows.net/.default');
  return tokenResponse.token;
}

// Connect to the DB with the token
async function getPool() {
  const accessToken = await getAccessToken();
  return sql.connect({
    ...config,
    authentication: {
      type: 'azure-active-directory-access-token',
      options: { token: accessToken }
    }
  });
}

// Execute a parameterised SQL query
async function executeQuery(query, params = {}) {
  const pool = await getPool();
  const request = pool.request();
  Object.keys(params).forEach(key => request.input(key, params[key]));
  return request.query(query);
}

module.exports = { executeQuery };

Step 5: Create functions/API endpoints

Within backend/src/functions, create app.js, which will contain all your functions.

Make sure that package.json contains "main": "src/functions/app.js".

Here is an example of an HTTP trigger function for patients (GET and POST):

const { app } = require('@azure/functions');
const { executeQuery } = require('../db');

const corsHeaders = {
  "Access-Control-Allow-Origin": "*",
  "Access-Control-Allow-Methods": "GET, POST, PUT, DELETE, OPTIONS",
  "Access-Control-Allow-Headers": "Content-Type, Authorization",
  "Content-Type": "application/json"
};

// OPTIONS handler for CORS preflight requests
app.http('corsHandler', {
  methods: ['OPTIONS'],
  authLevel: 'anonymous',
  route: '{*restOfPath}',
  handler: async () => ({ status: 200, headers: corsHeaders, body: '' })
});

// Example: HTTP trigger for patients
app.http('patients', {
  methods: ['GET', 'POST'],
  authLevel: 'anonymous',
  route: 'patients',
  handler: async (request, context) => {
    try {
      if (request.method === 'GET') {
        const username = request.query.get('username');

        let query;
        let params = {};

        if (username) {
          query = `
            SELECT patient_id, name, email, username, phone, date_of_birth, gender, created_at
            FROM Patients
            WHERE username = @username
          `;
          params = { username };
        } else {
          query = `
            SELECT patient_id, name, email, username, phone, date_of_birth, gender, created_at
            FROM Patients
          `;
        }
        const result = await executeQuery(query, params);
        return {
          status: 200,
          headers: corsHeaders,
          jsonBody: { success: true, data: result.recordset, count: result.recordset.length }
        };
      }

      if (request.method === 'POST') {
        const body = await request.json();
        const { name, email, password_hash, username, phone, date_of_birth, gender } = body;

        if (!name || !email || !password_hash || !username) {
          return { status: 400, headers: corsHeaders, jsonBody: { success: false, error: "Missing required fields" } };
        }

        const query = `
          INSERT INTO Patients (name, email, password_hash, username, phone, date_of_birth, gender)
          OUTPUT INSERTED.patient_id, INSERTED.name, INSERTED.email, INSERTED.username, INSERTED.phone,
                 INSERTED.date_of_birth, INSERTED.gender, INSERTED.created_at
          VALUES (@name, @email, @password_hash, @username, @phone, @date_of_birth, @gender)
        `;
        const params = { name, email, password_hash, username, phone, date_of_birth, gender };
        const result = await executeQuery(query, params);

        return { status: 201, headers: corsHeaders, jsonBody: { success: true, message: "Patient created", data: result.recordset[0] } };
      }
    } catch (error) {
      context.error(error);
      return { status: 500, headers: corsHeaders, jsonBody: { success: false, error: error.message } };
    }
  }
});

Similarly, you can create the other endpoints (for doctors, appointments, etc.).

A note on authLevel: 'anonymous': we keep the endpoints open for testing, but in production you should use Azure AD B2C or App Service Authentication.

CORS header:

The corsHeaders object is added to handle CORS (Cross-Origin Resource Sharing), which is a browser security feature.

Header breakdown:

  • "Access-Control-Allow-Origin": "*" → Allows requests from any origin (you can restrict to your frontend URL for more security).
  • "Access-Control-Allow-Methods": "GET, POST, PUT, DELETE, OPTIONS" → Lists the HTTP methods your API supports.
  • "Access-Control-Allow-Headers": "Content-Type, Authorization" → Lets the client send these headers.
  • "Content-Type": "application/json" → Ensures API responses are sent as JSON.

Azure Functions itself also enforces CORS at the platform level. So go to your Function App → API → CORS and add "*" if you want to allow all origins, or add specific origins to increase security.
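The platform-level setting can also be changed from the CLI (a sketch with placeholder names; restrict the origin in production):

# Allow all origins at the platform level
az functionapp cors add --name YOUR_FUNCTION_APP_NAME --resource-group YOUR_RESOURCE_GROUP_NAME --allowed-origins "*"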

Azure Configuration: You can configure the Function App with the Azure CLI, as below, or through the Azure portal.

az login

# Pin the Node.js version the Function App runs on
az functionapp config appsettings set --name YOUR_FUNCTION_APP_NAME --resource-group YOUR_RESOURCE_GROUP_NAME --settings "WEBSITE_NODE_DEFAULT_VERSION=~22"

# Enable a system-assigned managed identity (used for SQL access below)
az functionapp identity assign --name YOUR_FUNCTION_APP_NAME --resource-group YOUR_RESOURCE_GROUP_NAME

Configure SQL Database Access: Run these queries in the query editor or from a SQL file in your backend folder:

-- Create user for your function app
CREATE USER [YOUR_FUNCTION_APP_NAME] FROM EXTERNAL PROVIDER;

-- Grant permissions
ALTER ROLE db_datareader ADD MEMBER [YOUR_FUNCTION_APP_NAME];
ALTER ROLE db_datawriter ADD MEMBER [YOUR_FUNCTION_APP_NAME];

-- Verify
SELECT name, type_desc FROM sys.database_principals WHERE name = 'YOUR_FUNCTION_APP_NAME';

To run the function app locally

func start

To deploy the function app to Azure

func azure functionapp publish YOUR_FUNCTION_APP_NAME --javascript
Azure docs use --javascript for Node.js and --python for Python; just make sure it matches the runtime stack you chose.

Get Function Key:

  1. Azure Portal → Function App → App keys
  2. Copy the default key

You are going to need these keys when integrating the backend with the frontend.

Test in Postman:

  1. URL: https://YOUR_FUNCTION_APP_NAME.azurewebsites.net/api/patients
  2. Method: GET
  3. Headers: x-functions-key: YOUR_FUNCTION_KEY
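The same request from the command line (a sketch; substitute your own app name and key):

# GET all patients via the deployed API
curl -H "x-functions-key: YOUR_FUNCTION_KEY" https://YOUR_FUNCTION_APP_NAME.azurewebsites.net/api/patients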

Step 6: Connect your Azure Blob Storage Resource

Patients often need to upload past reports or prescriptions. For this, I used Azure Blob Storage to securely store files like PDFs, images, or scans.

When you create a Function App, Azure automatically creates a Storage Account. I reused this same account to store patients' medical records instead of creating a new one, although the best practice is to use a separate storage account.

Steps:

  1. In the Function App's Storage Account, go to Containers → + Container.
    – Name it medical-records.
    – Set Public Access to Private (no anonymous access).
  2. Go to your Function App → Settings → Environment variables → AzureWebJobsStorage. This setting holds the connection string to your storage account; add it to your local.settings.json.

Reminder: local.settings.json is only for local development. Do not commit it; it contains secrets and keys.
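For reference, a minimal local.settings.json for this project could look like the following (all values are placeholders; the two Communication Services entries are added in Step 7):

{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "AzureWebJobsStorage": "<storage-account-connection-string>",
    "AZURE_COMMUNICATION_CONNECTION_STRING": "<acs-connection-string>",
    "SENDER_EMAIL": "donotreply@<your-domain>.azurecomm.net"
  }
}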

Here is a snippet of the storage.js file:

const { BlobServiceClient } = require('@azure/storage-blob');

// Get the connection string from env (local.settings.json locally, App Settings in Azure)
const connectionString = process.env.AzureWebJobsStorage;
if (!connectionString) throw new Error('AzureWebJobsStorage not found');

// Create the blob service client
const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);

// Container names
const CONTAINERS = { MEDICAL_RECORDS: 'medical-records' };

const storageHelpers = {
  // Upload a file to blob storage and return its URL
  async uploadFile(containerName, fileName, fileBuffer, contentType = 'application/octet-stream') {
    const containerClient = blobServiceClient.getContainerClient(containerName);
    const blockBlobClient = containerClient.getBlockBlobClient(fileName);
    await blockBlobClient.upload(fileBuffer, fileBuffer.length, {
      blobHTTPHeaders: { blobContentType: contentType }
    });
    return blockBlobClient.url;
  },

  // Generate a unique filename (timestamp + random suffix, original extension)
  generateUniqueFileName(originalName) {
    const timestamp = Date.now();
    const random = Math.random().toString(36).substring(2, 8);
    const ext = originalName.split('.').pop();
    return `${timestamp}_${random}.${ext}`;
  }
};

module.exports = { storageHelpers, CONTAINERS };

You can verify the storage account setup with the Azure CLI from the function app folder:

az storage container list --account-name YOUR_STORAGE_ACCOUNT_NAME
az storage blob list --container-name medical-records --account-name YOUR_STORAGE_ACCOUNT_NAME

Upload file HTTP trigger function:

app.http('uploadFile', {
  methods: ['POST'],
  authLevel: 'anonymous',
  route: 'upload/{containerType}',
  handler: async (request, context) => {
    const containerType = request.params.containerType;

    // Validate container type
    if (containerType !== 'medical-records') {
      return { status: 400, jsonBody: { success: false, error: "Invalid container type" } };
    }

    // Get the file buffer from the request body
    const fileBuffer = Buffer.from(await request.arrayBuffer());
    if (!fileBuffer.length) {
      return { status: 400, jsonBody: { success: false, error: "No file data provided" } };
    }

    // Extract metadata from query parameters
    const originalFileName = request.query.get('filename') || 'uploaded_file';
    const patientUsername = request.query.get('patientUsername');
    const contentType = request.query.get('contentType') || 'application/octet-stream';

    // Generate a blob name with an optional per-patient folder
    const uniqueName = storageHelpers.generateUniqueFileName(originalFileName);
    const blobName = patientUsername ? `${patientUsername}/${uniqueName}` : uniqueName;

    // Upload to Blob Storage (wrap the returned URL in an object so we
    // can attach the DB record to it below)
    const fileUrl = await storageHelpers.uploadFile(
      CONTAINERS.MEDICAL_RECORDS,
      blobName,
      fileBuffer,
      contentType
    );
    const uploadResult = { url: fileUrl };

    // Save a record in the DB if patient info was provided
    if (patientUsername) {
      try {
        const patientResult = await executeQuery(
          `SELECT patient_id FROM Patients WHERE username = @username`,
          { username: patientUsername }
        );

        if (patientResult.recordset.length > 0) {
          const patient_id = patientResult.recordset[0].patient_id;
          const recordResult = await executeQuery(
            `INSERT INTO MedicalRecords (patient_id, file_name, blob_name, file_size, mime_type)
             OUTPUT INSERTED.* VALUES (@patient_id, @file_name, @blob_name, @file_size, @mime_type)`,
            {
              patient_id,
              file_name: originalFileName,
              blob_name: blobName,
              file_size: fileBuffer.length,
              mime_type: contentType
            }
          );
          uploadResult.databaseRecord = recordResult.recordset[0];
        }
      } catch (err) {
        context.error("DB error:", err.message);
      }
    }

    // Final response
    return {
      status: 200,
      jsonBody: {
        success: true,
        message: "File uploaded",
        data: { ...uploadResult, blobName }
      }
    };
  }
});
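To try the endpoint, you can POST raw file bytes with curl (a sketch; the file, username, and app name are placeholders, and locally the host is http://localhost:7071):

# Upload a PDF for patient 'jdoe'
curl -X POST --data-binary @report.pdf "https://YOUR_FUNCTION_APP_NAME.azurewebsites.net/api/upload/medical-records?filename=report.pdf&patientUsername=jdoe&contentType=application/pdf"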

Step 7: Email notification service

To send appointment reminders and confirmations, we use Azure Communication Services (ACS), which allows sending emails directly from your backend without setting up an SMTP server. The function app triggers the email when a reminder is due, and ACS handles the secure delivery.
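The code shown later in this step only verifies the email setup, so here is a minimal sketch of what the reminder trigger itself could look like once the resources below exist: a timer function that runs every morning and emails patients booked for the next day. The Appointments columns (appointment_id, patient_id, appointment_date), the daily schedule, and the message wording are my assumptions, not the exact implementation from this project:

// In app.js (the requires for `app` and `executeQuery` are already at the top;
// the EmailClient import is aliased to avoid clashing with the one added below)
const { EmailClient: ReminderEmailClient } = require('@azure/communication-email');

// NCRONTAB schedule: runs daily at 08:00 UTC (sec min hour day month day-of-week)
app.timer('appointmentReminders', {
  schedule: '0 0 8 * * *',
  handler: async (timer, context) => {
    // Assumed schema: Appointments(appointment_id, patient_id, appointment_date)
    const result = await executeQuery(`
      SELECT a.appointment_id, a.appointment_date, p.name, p.email
      FROM Appointments a
      JOIN Patients p ON p.patient_id = a.patient_id
      WHERE CAST(a.appointment_date AS date) = CAST(DATEADD(day, 1, GETUTCDATE()) AS date)
    `);

    const emailClient = new ReminderEmailClient(process.env.AZURE_COMMUNICATION_CONNECTION_STRING);
    for (const appt of result.recordset) {
      const poller = await emailClient.beginSend({
        senderAddress: process.env.SENDER_EMAIL,
        content: {
          subject: 'Appointment Reminder',
          plainText: `Hi ${appt.name}, this is a reminder for your appointment on ${appt.appointment_date}.`
        },
        recipients: { to: [{ address: appt.email, displayName: appt.name }] }
      });
      await poller.pollUntilDone();
      context.log(`Reminder sent for appointment ${appt.appointment_id}`);
    }
  }
});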

Create Communication Services resource:

  1. Go to the Azure Portal → Create a resource.
  2. Search for "Communication Services".
  3. Click Create.
  4. Fill in the details:
    – Resource Name: appointment-notifs (or anything you want)
    – Resource Group: same as your Email Communication Service (created below)
    – Data Location: same region as your Email service
  5. Click Review + Create, then Create.

Get the connection string from your Communication Services resource: Settings → Keys → copy the Primary connection string.
Set Up your Email Domain:

  1. Create and go to your Email Communication Service resource
  2. Click “Provision Domains” under “Email”
  3. Click “Add domain”
  4. Select “Azure Managed Domain” (easiest for testing)
  5. Enter subdomain name (e.g., “healthcare”)
  6. Click “Add”
  7. Wait for provisioning (takes a few minutes)
  8. Your sender email will be: donotreply@xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx.azurecomm.net

Add your Connection String from Communication Services and your sender email to local.settings.json.

Connect Email Domain to Communication Service

  1. Go back to your Communication Services resource
  2. Click “Domains” under “Email” in the left menu
  3. Click “Connect domain”
  4. Select your Email Communication Service resource
  5. Select the domain you just created
  6. Click “Connect”

In app.js add:

const { EmailClient } = require('@azure/communication-email');

app.http('testEmail', {
  methods: ['POST'],
  authLevel: 'anonymous',
  route: 'test/email',
  handler: async (request, context) => {
    try {
      const emailClient = new EmailClient(process.env.AZURE_COMMUNICATION_CONNECTION_STRING);

      const emailMessage = {
        senderAddress: process.env.SENDER_EMAIL,
        content: {
          subject: "Test Email from Healthcare System",
          plainText: "This is a test email to verify Azure Communication Services setup.",
          html: "<h1>Test Email</h1><p>This is a test email to verify Azure Communication Services setup.</p>"
        },
        recipients: {
          to: [{ address: "your-email@example.com", displayName: "Test User" }]
        }
      };

      // Sending is asynchronous: start the send, then poll until it completes
      const poller = await emailClient.beginSend(emailMessage);
      const response = await poller.pollUntilDone();

      return {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
        jsonBody: {
          success: true,
          message: "Test email sent successfully",
          messageId: response.id
        }
      };
    } catch (error) {
      context.error('Email test failed:', error);
      return {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
        jsonBody: {
          success: false,
          error: error.message
        }
      };
    }
  }
});

Add your email address in the recipients field of the function above and test in Postman; you'll receive an email.
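Or trigger it from the command line once deployed (locally, replace the host with http://localhost:7071):

# Trigger the test email endpoint
curl -X POST https://YOUR_FUNCTION_APP_NAME.azurewebsites.net/api/test/email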

With this setup, we now have a fully serverless backend on Azure: SQL for structured data, Blob for medical records, Functions for APIs, and Communication Services for reminders. I have tried to add solutions to all the problems I ran into while developing this. The next step is to integrate this with a frontend (React) and secure it with authentication.

I hope this was helpful 🙂

