Production Deployment
Deploy your DataBridge API to production with confidence
Complete guide to deploying your DataBridge API to production environments.
Overview
This guide covers deployment to popular platforms:
- Railway
- Vercel
- AWS (EC2)
- DigitalOcean App Platform
- Docker (portable to any container platform)
Pre-Deployment Checklist
Environment Variables
Create .env.production:
DATABASE_URL="mysql://user:pass@production-db:3306/mydb?ssl=true"
NODE_ENV=production
API_PORT=8080
CORS_ORIGIN="https://yourdomain.com"
JWT_SECRET="your-production-secret-change-this"
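The JWT_SECRET placeholder must be replaced with a real random value and never committed to the repository. One way to generate one using only Node's stdlib (a sketch; the variable name is illustrative):

```typescript
// Generate a cryptographically random value suitable for JWT_SECRET.
// 32 random bytes hex-encoded yields a 64-character string.
import { randomBytes } from 'node:crypto';

export const secret = randomBytes(32).toString('hex'); // 64 hex characters
```

Paste the output into your secret manager or platform environment settings rather than into the repo.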
Production Dependencies
Ensure package.json separates runtime dependencies from build-time tooling. Note that the build script runs tsc, so typescript must be installed during the build:
{
  "dependencies": {
    "fastify": "^5.6.2",
    "@fastify/cors": "^11.1.0",
    "@prisma/client": "^5.22.0",
    "prisma": "^5.22.0",
    "zod": "^3.25.76",
    "pino": "^9.6.0"
  },
  "devDependencies": {
    "typescript": "^5.6.2",
    "tsx": "^4.20.6",
    "pino-pretty": "^13.1.2"
  },
  "scripts": {
    "build": "tsc",
    "start": "node dist/server.js",
    "dev": "tsx src/server.ts",
    "prisma:generate": "prisma generate"
  }
}
Build Configuration
tsconfig.json:
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "commonjs",
    "lib": ["ES2022"],
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules"]
}
Railway Deployment
Railway offers the easiest deployment with MySQL support.
Step 1: Install Railway CLI
npm install -g @railway/cli
railway login
Step 2: Initialize Project
railway init
railway link
Step 3: Add MySQL Database
railway add mysql
Railway will automatically set DATABASE_URL environment variable.
Step 4: Configure Build
Create railway.json:
{
  "build": {
    "builder": "NIXPACKS",
    "buildCommand": "npm install && npm run prisma:generate && npm run build"
  },
  "deploy": {
    "startCommand": "npm start",
    "restartPolicyType": "ON_FAILURE",
    "restartPolicyMaxRetries": 10
  }
}
Step 5: Deploy
railway up
Get your deployment URL:
railway status
Environment Variables
railway variables set NODE_ENV=production
railway variables set API_PORT=8080
railway variables set CORS_ORIGIN=https://yourdomain.com
Vercel Deployment
Vercel is great for serverless deployments.
Step 1: Install Vercel CLI
npm install -g vercel
vercel login
Step 2: Configure vercel.json
{
  "version": 2,
  "builds": [
    {
      "src": "src/server.ts",
      "use": "@vercel/node"
    }
  ],
  "routes": [
    {
      "src": "/(.*)",
      "dest": "src/server.ts"
    }
  ],
  "env": {
    "NODE_ENV": "production"
  }
}
Step 3: Add Serverless Adapter
Update src/server.ts:
import Fastify from 'fastify';
import type { IncomingMessage, ServerResponse } from 'http';

const fastify = Fastify({ logger: true });

// ... register plugins and routes ...

// For Vercel: hand each incoming request to Fastify's underlying Node server
export default async (req: IncomingMessage, res: ServerResponse) => {
  await fastify.ready();
  fastify.server.emit('request', req, res);
};

// For local development
if (require.main === module) {
  fastify.listen({ port: 3000, host: '0.0.0.0' });
}
Step 4: Deploy
vercel
AWS EC2 Deployment
Traditional VM deployment with full control.
Step 1: Launch EC2 Instance
- Choose Ubuntu 22.04 LTS
- Instance type: t3.medium or larger
- Security group: Allow ports 22 (SSH), 80 (HTTP), 443 (HTTPS)
- Create/download key pair
Step 2: Connect via SSH
chmod 400 your-key.pem
ssh -i your-key.pem ubuntu@your-ec2-ip
Step 3: Install Dependencies
# Update system
sudo apt update && sudo apt upgrade -y
# Install Node.js 20
curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
sudo apt-get install -y nodejs
# Install MySQL
sudo apt install mysql-server -y
sudo mysql_secure_installation
# Install PM2
sudo npm install -g pm2
Step 4: Setup Database
sudo mysql -u root -p
CREATE DATABASE mydb;
CREATE USER 'apiuser'@'localhost' IDENTIFIED BY 'strong-password';
GRANT ALL PRIVILEGES ON mydb.* TO 'apiuser'@'localhost';
FLUSH PRIVILEGES;
EXIT;
Step 5: Deploy Application
# Clone repo
git clone https://github.com/yourusername/your-api.git
cd your-api
# Install dependencies
npm install
# Setup environment
cat > .env << EOF
DATABASE_URL="mysql://apiuser:strong-password@localhost:3306/mydb"
NODE_ENV=production
API_PORT=3000
EOF
# Generate Prisma client
npx prisma generate
# Build
npm run build
# Start with PM2
pm2 start dist/server.js --name api
pm2 startup
pm2 save
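The PM2 flags used above can also be captured in an ecosystem file, so restarts and future deploys reuse identical settings. A minimal sketch using PM2's ecosystem config format (values are illustrative):

```javascript
// ecosystem.config.js — start with: pm2 start ecosystem.config.js
module.exports = {
  apps: [
    {
      name: 'api',
      script: 'dist/server.js',
      instances: 1,             // or 'max' for cluster mode
      max_memory_restart: '1G', // restart the process if it exceeds 1 GB
      env: {
        NODE_ENV: 'production',
      },
    },
  ],
};
```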
Step 6: Setup Nginx Reverse Proxy
sudo apt install nginx -y
sudo nano /etc/nginx/sites-available/api
Add configuration:
server {
    listen 80;
    server_name api.yourdomain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
Enable site:
sudo ln -s /etc/nginx/sites-available/api /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl reload nginx
Step 7: SSL with Let’s Encrypt
sudo apt install certbot python3-certbot-nginx -y
sudo certbot --nginx -d api.yourdomain.com
DigitalOcean App Platform
Similar to Heroku, fully managed.
Step 1: Create App
- Go to DigitalOcean App Platform
- Connect GitHub repository
- Choose branch to deploy
Step 2: Configure Build
App spec (auto-detected or create app.yaml):
name: databridge-api
services:
  - name: api
    github:
      repo: yourusername/your-api
      branch: main
    build_command: npm install && npx prisma generate && npm run build
    run_command: npm start
    environment_slug: node-js
    instance_count: 1
    instance_size_slug: basic-xs
    http_port: 8080
    envs:
      - key: NODE_ENV
        value: production
      - key: API_PORT
        value: "8080"
databases:
  - name: db
    engine: MYSQL
    version: "8"
Step 3: Deploy
App Platform will automatically:
- Build your app
- Provision MySQL database
- Set DATABASE_URL
- Deploy and provide a public URL
Docker Deployment
Containerize your application for any platform.
Dockerfile
FROM node:20-alpine
WORKDIR /app
# Copy package files
COPY package*.json ./
COPY prisma ./prisma/
# Install all dependencies (the tsc build step below needs devDependencies)
RUN npm ci
# Generate Prisma client
RUN npx prisma generate
# Copy source
COPY . .
# Build TypeScript
RUN npm run build
# Expose port
EXPOSE 8080
# Start app
CMD ["npm", "start"]
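Because the Dockerfile runs COPY . ., add a .dockerignore so local artifacts and secrets stay out of the image; a minimal sketch:

```
node_modules
dist
.env
.env.*
.git
```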
docker-compose.yml (Production)
version: '3.8'
services:
  api:
    build: .
    restart: unless-stopped
    ports:
      - "8080:8080"
    environment:
      NODE_ENV: production
      DATABASE_URL: mysql://root:${MYSQL_ROOT_PASSWORD}@db:3306/${MYSQL_DATABASE}
      API_PORT: 8080
    depends_on:
      db:
        condition: service_healthy
  db:
    image: mysql:8.0
    restart: unless-stopped
    environment:
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD}
      MYSQL_DATABASE: ${MYSQL_DATABASE}
    volumes:
      - mysql_data:/var/lib/mysql
    healthcheck:
      test: ["CMD", "mysqladmin", "ping", "-h", "localhost"]
      interval: 10s
      timeout: 5s
      retries: 5
  nginx:
    image: nginx:alpine
    restart: unless-stopped
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
      - ./ssl:/etc/nginx/ssl
    depends_on:
      - api
volumes:
  mysql_data:
Build and Deploy
docker-compose up -d --build
Environment Variables Management
Railway
railway variables set KEY=value
Vercel
vercel env add KEY production
AWS (SSM Parameter Store)
aws ssm put-parameter \
--name /myapp/DATABASE_URL \
--value "mysql://..." \
--type SecureString
DigitalOcean
Through App Platform UI or doctl CLI:
doctl apps update $APP_ID --spec app.yaml
Health Checks
Ensure your API has health endpoints:
fastify.get('/health', async () => {
  let database = 'connected';
  try {
    // a failed query throws, so catch it rather than test the return value
    await fastify.prisma.$queryRaw`SELECT 1`;
  } catch {
    database = 'disconnected';
  }
  return {
    status: database === 'connected' ? 'ok' : 'degraded',
    database,
    uptime: process.uptime(),
    timestamp: new Date().toISOString(),
  };
});
Monitoring & Logging
Production Logging
Use JSON logging for log aggregation:
const fastify = Fastify({
  logger: process.env.NODE_ENV === 'production'
    ? true // JSON logging
    : {
        // Pretty logging for dev
        transport: {
          target: 'pino-pretty',
          options: {
            translateTime: 'HH:MM:ss Z',
            ignore: 'pid,hostname',
          },
        },
      },
});
PM2 Monitoring
pm2 monit
pm2 logs api
pm2 describe api
Error Tracking
Integrate Sentry:
npm install @sentry/node
import * as Sentry from '@sentry/node';

Sentry.init({
  dsn: process.env.SENTRY_DSN,
  environment: process.env.NODE_ENV,
});

fastify.setErrorHandler((error, request, reply) => {
  Sentry.captureException(error);
  request.log.error(error);
  reply.status(500).send({ error: 'Internal server error' });
});
Database Migrations
Production Migration Strategy
# 1. Backup database
mysqldump -u user -p mydb > backup.sql
# 2. Run migrations
npx prisma migrate deploy
# 3. Verify
npx prisma migrate status
Zero-Downtime Migrations
- Deploy new version (backward compatible)
- Run migrations
- Verify health checks
- Switch traffic
- Remove old version
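The "switch traffic" step assumes the old process drains in-flight requests instead of dying mid-response. A sketch of that shutdown pattern using Node's stdlib http module (fastify.close() plays the same role in the real server; the self-sent SIGTERM merely simulates the platform's deploy signal):

```typescript
// On SIGTERM: stop accepting new connections and let in-flight
// requests finish before the process exits, so a load balancer
// can cut traffic over without dropped requests.
import { createServer } from 'node:http';

export const server = createServer((_req, res) => {
  res.end('ok');
});

export const drained = new Promise<void>((resolve) => {
  process.on('SIGTERM', () => {
    // close() stops listening and waits for open connections to finish
    server.close(() => resolve());
  });
});

server.listen(0, () => {
  // Simulate the platform sending SIGTERM during a rolling deploy.
  process.kill(process.pid, 'SIGTERM');
});
```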
Performance Optimization
Connection Pooling
// Prisma manages its own connection pool; size it through the connection
// string, e.g. DATABASE_URL="mysql://...?connection_limit=10&pool_timeout=20"
const prisma = new PrismaClient({
  datasources: {
    db: {
      url: process.env.DATABASE_URL,
    },
  },
  log: ['error'],
});
Caching
npm install @fastify/caching
import caching from '@fastify/caching';

await fastify.register(caching, {
  privacy: 'public',
  expiresIn: 300, // seconds
});

fastify.get('/products', async (request, reply) => {
  // per-route override: mark this response cacheable for 60 seconds
  reply.expires(new Date(Date.now() + 60_000));
  return prisma.products.findMany();
});
Rate Limiting
npm install @fastify/rate-limit
import rateLimit from '@fastify/rate-limit';
await fastify.register(rateLimit, {
max: 100,
timeWindow: '15 minutes',
});
Security Best Practices
- ✅ Use HTTPS (SSL/TLS)
- ✅ Enable CORS only for trusted origins
- ✅ Set security headers (Helmet)
- ✅ Use environment variables for secrets
- ✅ Enable rate limiting
- ✅ Implement authentication
- ✅ Keep dependencies updated
- ✅ Use SQL injection protection (Prisma handles this)
- ✅ Implement request logging
- ✅ Regular security audits
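For the environment-variable item, one cheap safeguard is failing fast at startup when a required setting is missing, instead of crashing later on the first query or token check. A sketch (variable names taken from the .env.production example earlier; the helper name is illustrative):

```typescript
// Report which required settings are absent from the environment.
const REQUIRED = ['DATABASE_URL', 'JWT_SECRET', 'CORS_ORIGIN'] as const;

export function missingEnv(
  env: Record<string, string | undefined> = process.env,
): string[] {
  return REQUIRED.filter((key) => !env[key]);
}

// At startup:
// const missing = missingEnv();
// if (missing.length) throw new Error(`Missing env vars: ${missing.join(', ')}`);
```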
Troubleshooting
Database Connection Issues
# Test connection
mysql -h your-db-host -u user -p
# Check environment variables
echo $DATABASE_URL
# Prisma debug
DEBUG="prisma:*" npm start
Memory Issues
# Increase Node.js memory
NODE_OPTIONS="--max-old-space-size=4096" npm start
# PM2
pm2 start dist/server.js --max-memory-restart 1G
High CPU
# PM2 cluster mode
pm2 start dist/server.js -i max