
🚀 Production Deployment Guide

Deploy your Dubhe applications to production with confidence using battle-tested strategies

Prerequisites: Completed application development and local testing

🎯 Deployment Overview

Deploying a Dubhe application involves multiple components working together to provide a seamless user experience.

Smart Contracts

Move contracts deployed to blockchain networks

Indexer Service

Real-time blockchain event processing

Frontend Application

User interface with WebSocket connections
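A production frontend's WebSocket connection should survive indexer restarts and network blips by reconnecting with capped, jittered backoff. A minimal delay helper, shown as a sketch (the function name and defaults are illustrative, not part of the Dubhe SDK):

```typescript
// Capped exponential backoff with jitter for WebSocket reconnect attempts.
// attempt 0 -> ~0.25-0.5s, attempt 1 -> ~0.5-1s, ... capped at 30s.
function backoffDelayMs(attempt: number, baseMs = 500, capMs = 30_000): number {
  const ceiling = Math.min(capMs, baseMs * 2 ** attempt);
  // "Equal jitter": half fixed, half random, so reconnecting clients don't stampede
  return Math.floor(ceiling / 2 + Math.random() * (ceiling / 2));
}
```

A reconnect loop would call this with an attempt counter that resets to zero once a connection is re-established.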

📋 Pre-Deployment Checklist

1. Code Audit & Security Review

Smart Contract Audit

  • Review all Move contracts for vulnerabilities
  • Run formal verification tools
  • Check access control mechanisms
  • Validate input sanitization

Frontend Security

  • Audit client-side code
  • Verify API endpoints
  • Check authentication flows
  • Review environment variable usage

2. Performance Testing

# Load testing with realistic user scenarios
pnpm run test:load

# Gas optimization testing
pnpm run test:gas-optimization

# Frontend performance testing
pnpm run test:lighthouse
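Load-test output is usually judged by tail latencies rather than averages. A small percentile helper of the kind a custom load script might use when summarizing results (this helper is illustrative, not part of the Dubhe toolchain):

```typescript
// Nearest-rank percentile: p is in (0, 1]; samples need not be pre-sorted.
function percentile(samples: number[], p: number): number {
  if (samples.length === 0) throw new Error('no samples');
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil(p * sorted.length) - 1;
  return sorted[Math.max(0, rank)];
}
```

With response times collected per request, `percentile(latencies, 0.95)` gives the p95 figure to compare against your latency budget.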

3. Integration Testing

# Full end-to-end testing
pnpm run test:e2e

# Cross-chain compatibility testing
pnpm run test:multi-chain

# Real network testing (testnet)
pnpm run test:testnet

⛓️ Blockchain Deployment

Network Configuration

// dubhe.config.js
export default {
  networks: {
    'sui-mainnet': {
      url: 'https://fullnode.mainnet.sui.io:443',
      faucet: null, // No faucet on mainnet
      packageId: null, // Will be set after deployment
    },
    'aptos-mainnet': {
      url: 'https://fullnode.mainnet.aptoslabs.com/v1',
      faucet: null,
      packageId: null,
    }
  },
  deployment: {
    network: 'sui-mainnet',
    gasPrice: 1000, // Higher gas price for faster confirmation
    gasLimit: 10000000,
    confirmations: 3, // Wait for 3 confirmations
  }
}
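Before any deploy runs, it is worth failing fast on configuration mistakes. A sketch of a loader that validates one network entry from `dubhe.config.js` (the interface and checks below are assumptions based on the config shape shown above, not a Dubhe API):

```typescript
interface NetworkConfig {
  url: string;
  faucet: string | null;
  packageId: string | null;
}

// Resolve one network entry and reject obviously unsafe setups up front.
function resolveNetwork(
  networks: Record<string, NetworkConfig>,
  name: string
): NetworkConfig {
  const net = networks[name];
  if (!net) throw new Error(`Unknown network: ${name}`);
  if (!net.url.startsWith('https://')) {
    throw new Error(`Refusing non-HTTPS RPC URL for ${name}`);
  }
  if (name.endsWith('-mainnet') && net.faucet !== null) {
    throw new Error(`Mainnet config for ${name} should not define a faucet`);
  }
  return net;
}
```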

Contract Deployment Process

1. Pre-deployment Verification

# Verify all contracts compile without errors
dubhe build --network mainnet

# Run comprehensive test suite
dubhe test --coverage

# Validate schema consistency
dubhe schema validate

# Check gas estimation
dubhe estimate-gas --network mainnet

2. Staged Deployment

# Deploy to testnet first (final verification)
dubhe deploy --network sui-testnet --verify

# Run integration tests against testnet deployment
pnpm run test:integration --network testnet

# Deploy to mainnet (production)
dubhe deploy --network sui-mainnet --verify

# Verify deployment on explorer
dubhe verify --network sui-mainnet

3. Post-deployment Validation

# Test basic functionality
dubhe test:smoke --network mainnet

# Initialize game state if needed
dubhe init-game-state --network mainnet

# Set up monitoring and alerts
dubhe setup-monitoring --network mainnet

Multi-Chain Deployment Strategy

# Primary chain deployment (usually Sui)
dubhe deploy --network sui-mainnet
export PRIMARY_PACKAGE_ID=$(dubhe get-package-id --network sui-mainnet)

# Secondary chain deployments
dubhe deploy --network aptos-mainnet --reference-package $PRIMARY_PACKAGE_ID
dubhe deploy --network rooch-mainnet --reference-package $PRIMARY_PACKAGE_ID

# Verify cross-chain compatibility
dubhe test:cross-chain --all-networks

# Deploy to multiple chains simultaneously
dubhe deploy --network sui-mainnet,aptos-mainnet,rooch-mainnet --parallel

# Sync package IDs across deployments
dubhe sync-package-ids --all-networks

# Validate consistent behavior
dubhe validate-consistency --all-networks
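The core invariant behind syncing package IDs is simple enough to sketch: after deployment, every target network must have a non-empty package ID on record. The function below illustrates that check; it is not the CLI's actual implementation:

```typescript
// Return the networks whose deployment never produced a package ID.
function findMissingPackageIds(registry: Record<string, string | null>): string[] {
  return Object.entries(registry)
    .filter(([, id]) => id === null || id === '')
    .map(([network]) => network);
}
```

Running this against the registry after a multi-chain deploy turns a silent partial failure into an explicit list of networks to redeploy.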

🗃️ Indexer Service Deployment

Infrastructure Requirements

Database

PostgreSQL 14+ with optimized configuration for high-throughput writes

Message Queue

Redis for real-time event processing and WebSocket management

Container Runtime

Docker with orchestration via Kubernetes or Docker Compose

Load Balancer

NGINX or cloud load balancer for distributing client connections

Docker Configuration

# Dockerfile for Dubhe indexer
FROM node:18-alpine AS builder

WORKDIR /app
COPY package*.json ./
# Full install here: the build step needs devDependencies
RUN npm ci

COPY . .
RUN npm run build
# Strip devDependencies before node_modules is copied into the runtime image
RUN npm prune --omit=dev

FROM node:18-alpine AS runner

WORKDIR /app
RUN addgroup -g 1001 -S nodejs
RUN adduser -S dubhe -u 1001

COPY --from=builder --chown=dubhe:nodejs /app/dist ./dist
COPY --from=builder --chown=dubhe:nodejs /app/node_modules ./node_modules
COPY --from=builder --chown=dubhe:nodejs /app/package.json ./package.json

USER dubhe

EXPOSE 3001 3002

CMD ["npm", "start"]

Kubernetes Deployment

apiVersion: apps/v1
kind: Deployment
metadata:
  name: dubhe-indexer
  labels:
    app: dubhe-indexer
spec:
  replicas: 3
  selector:
    matchLabels:
      app: dubhe-indexer
  template:
    metadata:
      labels:
        app: dubhe-indexer
    spec:
      containers:
      - name: indexer
        image: your-registry/dubhe-indexer:latest
        ports:
        - containerPort: 3001
          name: api
        - containerPort: 3002
          name: websocket
        env:
        - name: NODE_ENV
          value: "production"
        - name: DATABASE_URL
          valueFrom:
            secretKeyRef:
              name: dubhe-secrets
              key: database-url
        - name: REDIS_URL
          valueFrom:
            secretKeyRef:
              name: dubhe-secrets
              key: redis-url
        resources:
          requests:
            memory: "1Gi"
            cpu: "500m"
          limits:
            memory: "2Gi"
            cpu: "1"
        livenessProbe:
          httpGet:
            path: /health
            port: 3001
          initialDelaySeconds: 30
          periodSeconds: 10
        readinessProbe:
          httpGet:
            path: /ready
            port: 3001
          initialDelaySeconds: 5
          periodSeconds: 5
---
apiVersion: v1
kind: Service
metadata:
  name: dubhe-indexer-service
spec:
  selector:
    app: dubhe-indexer
  ports:
  - name: api
    port: 80
    targetPort: 3001
  - name: websocket
    port: 3002
    targetPort: 3002
  type: LoadBalancer

🖥️ Frontend Deployment

Static Site Deployment

# Install Vercel CLI
npm install -g vercel

# Deploy to Vercel
vercel --prod

# Set environment variables
vercel env add NEXT_PUBLIC_INDEXER_URL production
vercel env add NEXT_PUBLIC_WEBSOCKET_URL production
vercel env add NEXT_PUBLIC_NETWORK production

// vercel.json
{
  "version": 2,
  "builds": [
    {
      "src": "package.json",
      "use": "@vercel/next"
    }
  ],
  "env": {
    "NEXT_PUBLIC_INDEXER_URL": "https://api.your-game.com",
    "NEXT_PUBLIC_WEBSOCKET_URL": "wss://api.your-game.com",
    "NEXT_PUBLIC_NETWORK": "mainnet"
  },
  "headers": [
    {
      "source": "/(.*)",
      "headers": [
        {
          "key": "X-Content-Type-Options",
          "value": "nosniff"
        },
        {
          "key": "X-Frame-Options",
          "value": "DENY"
        }
      ]
    }
  ]
}

📊 Monitoring & Observability

Application Monitoring

// Health check endpoints for monitoring
import express from 'express';
import { pool } from './database';
import { redis } from './redis';

const app = express();

app.get('/health', async (req, res) => {
  try {
    // Check database connection
    await pool.query('SELECT 1');
    
    // Check Redis connection
    await redis.ping();
    
    // Check blockchain RPC (fullnodes speak JSON-RPC over POST; a bare GET may be rejected)
    const rpcResponse = await fetch(process.env.BLOCKCHAIN_RPC_URL, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ jsonrpc: '2.0', id: 1, method: 'rpc.discover', params: [] })
    });
    if (!rpcResponse.ok) throw new Error('RPC not responding');
    
    res.json({
      status: 'healthy',
      timestamp: new Date().toISOString(),
      services: {
        database: 'ok',
        redis: 'ok',
        blockchain: 'ok'
      }
    });
  } catch (error) {
    res.status(503).json({
      status: 'unhealthy',
      error: error.message,
      timestamp: new Date().toISOString()
    });
  }
});

app.get('/ready', async (req, res) => {
  try {
    // More comprehensive readiness check
    const dbResult = await pool.query('SELECT COUNT(*) FROM entities');
    // ioredis returns INFO as raw text, so extract the value we need
    const redisInfo = await redis.info('memory');
    const memMatch = redisInfo.match(/used_memory_human:(\S+)/);
    
    res.json({
      status: 'ready',
      metrics: {
        entityCount: parseInt(dbResult.rows[0].count, 10),
        redisMemoryUsage: memMatch ? memMatch[1] : 'unknown',
        uptime: process.uptime()
      }
    });
  } catch (error) {
    res.status(503).json({
      status: 'not ready',
      error: error.message
    });
  }
});
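The RPC and Redis probes above will flap on transient network errors, and a flapping liveness probe restarts pods unnecessarily. Wrapping each check in a small retry helper smooths this out. A sketch (the attempt count and delay are arbitrary defaults to tune against your probe timeouts):

```typescript
// Retry an async check a few times with linearly increasing delay before giving up.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3, delayMs = 200): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        await new Promise((resolve) => setTimeout(resolve, delayMs * (i + 1)));
      }
    }
  }
  throw lastError;
}
```

In the health handler, `await withRetry(() => redis.ping())` tolerates a single dropped connection without reporting the whole service unhealthy.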

Error Tracking & Logging

import winston from 'winston';

const logger = winston.createLogger({
  level: process.env.LOG_LEVEL || 'info',
  format: winston.format.combine(
    winston.format.timestamp(),
    winston.format.errors({ stack: true }),
    winston.format.json()
  ),
  transports: [
    new winston.transports.Console(),
    new winston.transports.File({ 
      filename: 'error.log', 
      level: 'error' 
    }),
    new winston.transports.File({ 
      filename: 'combined.log' 
    })
  ]
});

// Usage in application
logger.info('Blockchain event processed', {
  eventType: 'ComponentAdded',
  entity: entityId,
  component: componentType,
  network: 'sui-mainnet',
  blockNumber: blockNumber
});

logger.error('Database query failed', {
  query: 'SELECT * FROM entities',
  error: error.message,
  duration: queryDuration
});

🚦 CI/CD Pipeline

GitHub Actions Workflow

name: Deploy to Production

on:
  push:
    branches: [main]
  workflow_dispatch:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      
      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'
          cache: 'npm'
      
      - name: Install dependencies
        run: npm ci
      
      - name: Run tests
        run: npm run test:ci
      
      - name: Run linting
        run: npm run lint
      
      - name: Build contracts
        run: npm run build:contracts
      
      - name: Gas optimization check
        run: npm run test:gas

  deploy-contracts:
    needs: test
    runs-on: ubuntu-latest
    if: github.ref == 'refs/heads/main'
    # Expose package IDs as job outputs; env vars set via GITHUB_ENV do not cross jobs
    outputs:
      sui_package_id: ${{ steps.sui.outputs.package_id }}
      aptos_package_id: ${{ steps.aptos.outputs.package_id }}
    env:
      PRIVATE_KEY: ${{ secrets.DEPLOY_PRIVATE_KEY }}
      RPC_URL: ${{ secrets.MAINNET_RPC_URL }}
    steps:
      - uses: actions/checkout@v3
      
      - name: Install dependencies
        run: npm ci
      
      - name: Deploy to Sui Mainnet
        id: sui
        run: |
          npm run deploy:sui-mainnet
          echo "package_id=$(npm run --silent get-package-id:sui)" >> "$GITHUB_OUTPUT"
      
      - name: Deploy to Aptos Mainnet
        id: aptos
        run: |
          npm run deploy:aptos-mainnet
          echo "package_id=$(npm run --silent get-package-id:aptos)" >> "$GITHUB_OUTPUT"
      
      - name: Update package registry
        run: npm run update-package-registry

  deploy-indexer:
    needs: [test, deploy-contracts]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      
      - name: Build Docker image
        run: |
          docker build -t your-registry/dubhe-indexer:${{ github.sha }} .
          docker tag your-registry/dubhe-indexer:${{ github.sha }} your-registry/dubhe-indexer:latest
      
      - name: Push to registry
        run: |
          echo ${{ secrets.DOCKER_PASSWORD }} | docker login your-registry -u ${{ secrets.DOCKER_USERNAME }} --password-stdin
          docker push your-registry/dubhe-indexer:${{ github.sha }}
          docker push your-registry/dubhe-indexer:latest
      
      - name: Deploy to Kubernetes
        uses: azure/k8s-deploy@v1
        with:
          manifests: |
            k8s/deployment.yaml
            k8s/service.yaml
          images: |
            your-registry/dubhe-indexer:${{ github.sha }}

  deploy-frontend:
    needs: [test, deploy-contracts]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      
      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'
          cache: 'npm'
      
      - name: Install dependencies
        run: npm ci
      
      - name: Build frontend
        run: npm run build
        env:
          NEXT_PUBLIC_INDEXER_URL: ${{ secrets.PRODUCTION_INDEXER_URL }}
          NEXT_PUBLIC_WEBSOCKET_URL: ${{ secrets.PRODUCTION_WS_URL }}
          NEXT_PUBLIC_SUI_PACKAGE_ID: ${{ needs.deploy-contracts.outputs.sui_package_id }}
          NEXT_PUBLIC_APTOS_PACKAGE_ID: ${{ needs.deploy-contracts.outputs.aptos_package_id }}
      
      - name: Deploy to Vercel
        uses: amondnet/vercel-action@v20
        with:
          vercel-token: ${{ secrets.VERCEL_TOKEN }}
          vercel-org-id: ${{ secrets.VERCEL_ORG_ID }}
          vercel-project-id: ${{ secrets.VERCEL_PROJECT_ID }}
          vercel-args: '--prod'

🛡️ Security Considerations

Environment Security

Secret Management

  • Use environment-specific secrets
  • Rotate keys regularly
  • Never commit secrets to code
  • Use vault systems in production

Network Security

  • Enable HTTPS/WSS everywhere
  • Configure proper CORS policies
  • Use rate limiting
  • Implement DDoS protection

# Production environment variables
NODE_ENV=production

# Database (use connection pooling)
DATABASE_URL=postgresql://user:secure_password@db-cluster:5432/dubhe_prod
DATABASE_MAX_CONNECTIONS=20
DATABASE_SSL=true

# Redis (use cluster for HA)
REDIS_URL=redis://redis-cluster:6379
REDIS_PASSWORD=secure_redis_password

# Blockchain connections
SUI_RPC_URL=https://fullnode.mainnet.sui.io:443
APTOS_RPC_URL=https://fullnode.mainnet.aptoslabs.com/v1

# Package IDs (set after deployment)
SUI_PACKAGE_ID=0x123...
APTOS_PACKAGE_ID=0x456...

# Monitoring
SENTRY_DSN=https://your-sentry-dsn
MONITORING_TOKEN=secure_monitoring_token

# Security
CORS_ORIGIN=https://your-game.com
RATE_LIMIT_REQUESTS=1000
# Window length in milliseconds (900000 = 15 minutes)
RATE_LIMIT_WINDOW=900000

# Feature flags
ENABLE_ANALYTICS=true
ENABLE_DEBUG_LOGS=false
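The `RATE_LIMIT_REQUESTS` / `RATE_LIMIT_WINDOW` values above imply a fixed-window limiter. A per-process sketch of that policy follows; production deployments with multiple replicas should back this with Redis so the limit is shared across instances:

```typescript
// Fixed-window rate limiter: at most `limit` requests per `windowMs` per key.
class RateLimiter {
  private windows = new Map<string, { count: number; startedAt: number }>();

  constructor(
    private limit = Number(process.env.RATE_LIMIT_REQUESTS ?? 1000),
    private windowMs = Number(process.env.RATE_LIMIT_WINDOW ?? 900_000)
  ) {}

  // `now` is injectable so the limiter can be tested deterministically.
  allow(key: string, now = Date.now()): boolean {
    const win = this.windows.get(key);
    if (!win || now - win.startedAt >= this.windowMs) {
      this.windows.set(key, { count: 1, startedAt: now });
      return true;
    }
    win.count += 1;
    return win.count <= this.limit;
  }
}
```

In an Express middleware, the key would typically be the client IP, returning HTTP 429 when `allow` is false.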

📈 Performance Optimization

Database Optimization

-- Optimize for common query patterns
CREATE INDEX CONCURRENTLY idx_entity_components ON entity_components(entity_id, component_type);
CREATE INDEX CONCURRENTLY idx_events_timestamp ON blockchain_events(timestamp DESC);
CREATE INDEX CONCURRENTLY idx_players_active ON players(last_active) WHERE active = true;

-- Partial indexes for better performance
CREATE INDEX CONCURRENTLY idx_entities_alive ON entities(id) WHERE status = 'alive';
CREATE INDEX CONCURRENTLY idx_recent_events ON events(created_at) WHERE created_at > NOW() - INTERVAL '1 hour';

-- Analyze query performance
EXPLAIN (ANALYZE, BUFFERS) SELECT * FROM entities WHERE component_type = 'HealthComponent';

Connection Pooling

import { Pool } from 'pg';

const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 20, // Maximum connections
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 2000,
  maxUses: 7500, // Rotate connections
  // Prefer verified TLS in production: pass your CA certificate instead of disabling verification
  ssl: process.env.NODE_ENV === 'production' ? { rejectUnauthorized: false } : false,
});

// Connection health monitoring
pool.on('connect', (client) => {
  console.log('New database connection established');
});

pool.on('error', (err) => {
  console.error('Database connection error:', err);
  // Alert monitoring systems
});

Caching Strategy

import Redis from 'ioredis';

const redis = new Redis({
  host: process.env.REDIS_HOST,
  port: parseInt(process.env.REDIS_PORT || '6379'),
  password: process.env.REDIS_PASSWORD,
  retryStrategy: (times) => Math.min(times * 100, 2000), // backoff between reconnect attempts
  enableReadyCheck: true,
  maxRetriesPerRequest: null,
});

// Cache component data
export async function getCachedComponent(entityId: string, componentType: string) {
  const cacheKey = `entity:${entityId}:${componentType}`;
  const cached = await redis.get(cacheKey);
  
  if (cached) {
    return JSON.parse(cached);
  }
  
  // Fetch from database
  const component = await db.getComponent(entityId, componentType);
  
  // Cache for 5 minutes
  await redis.setex(cacheKey, 300, JSON.stringify(component));
  
  return component;
}

// Invalidate cache on updates (use SCAN; KEYS blocks Redis and is unsafe in production)
export async function invalidateEntityCache(entityId: string) {
  const stream = redis.scanStream({ match: `entity:${entityId}:*` });
  for await (const keys of stream) {
    if (keys.length > 0) {
      await redis.del(...keys);
    }
  }
}

🔄 Rollback & Recovery

Deployment Rollback Strategy

1. Database Migrations

-- Always create reversible migrations
-- Migration up: 20231201_add_new_component_table.sql
CREATE TABLE new_component_data (
    id SERIAL PRIMARY KEY,
    entity_id BIGINT NOT NULL,
    component_data JSONB NOT NULL,
    created_at TIMESTAMP DEFAULT NOW()
);

-- Migration down: 20231201_add_new_component_table_rollback.sql
DROP TABLE IF EXISTS new_component_data;

2. Smart Contract Upgrades

# Prepare upgrade with backwards compatibility
dubhe prepare-upgrade --version v2.1.0 --backwards-compatible

# Deploy upgrade to staging first
dubhe deploy-upgrade --network testnet --version v2.1.0

# Test upgrade thoroughly
npm run test:upgrade-compatibility

# Deploy to production with rollback plan
dubhe deploy-upgrade --network mainnet --version v2.1.0 --enable-rollback

# If issues occur, rollback immediately
dubhe rollback --network mainnet --to-version v2.0.5

3. Service Rollback

# Kubernetes rollback
kubectl rollout undo deployment/dubhe-indexer

# Docker Compose rollback: pin the previous image tag in docker-compose.yml
# (e.g. image: your-registry/dubhe-indexer:v2.0.5), then recreate the service
docker compose down
docker compose pull
docker compose up -d

# Verify rollback success
curl -f http://your-api/health

📋 Post-Deployment Checklist

1. Immediate Verification

  • All services are running and healthy
  • Database connections are stable
  • WebSocket connections are working
  • Frontend can connect to backend
  • Basic game functions work correctly

2. Performance Monitoring

  • Response times are within acceptable limits
  • Database query performance is optimal
  • Memory usage is stable
  • No error rate spikes
  • WebSocket connection count is normal

3. Business Logic Validation

  • User registration works
  • Game state updates correctly
  • Transactions are processed
  • Events are emitted and captured
  • Cross-chain functionality (if applicable)

4. Security Verification

  • HTTPS is enforced
  • CORS policies are correct
  • Rate limiting is active
  • No sensitive data in logs
  • Authentication flows work
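Several of the checks above can be automated into a smoke script that runs immediately after deploy. A sketch against the `/health` endpoint defined earlier; the response shape follows the health handler above, and `isHealthy` is a hypothetical helper split out so the decision logic can be unit-tested:

```typescript
interface HealthReport {
  status: string;
  services?: Record<string, string>;
}

// Pure check over the /health payload: overall status plus every sub-service must be ok.
function isHealthy(report: HealthReport): boolean {
  if (report.status !== 'healthy') return false;
  return Object.values(report.services ?? {}).every((state) => state === 'ok');
}

async function runSmokeCheck(url: string): Promise<void> {
  const res = await fetch(url);
  const report = (await res.json()) as HealthReport;
  if (!res.ok || !isHealthy(report)) {
    console.error('Smoke check failed:', report);
    process.exit(1);
  }
  console.log('Smoke check passed');
}

// Only hit the network when a target is configured, so the module is safe to import.
if (process.env.HEALTH_URL) {
  runSmokeCheck(process.env.HEALTH_URL);
}
```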

🚀 Scaling Considerations

Horizontal Scaling Strategies

Indexer Scaling

  • Multiple indexer instances with load balancing
  • Event partitioning by entity ID ranges
  • Read replicas for database queries
  • Redis cluster for session management

Frontend Scaling

  • CDN for static assets
  • Edge deployment for global users
  • Client-side caching strategies
  • Progressive loading techniques
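Partitioning events by entity ID only works if every indexer instance maps a given ID to the same partition. A deterministic hash-partition sketch (the hash function here is a simple illustration; any stable hash works):

```typescript
// Deterministically assign an entity to one of `partitions` indexer instances.
function partitionFor(entityId: string, partitions: number): number {
  let hash = 0;
  for (const ch of entityId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // keep it an unsigned 32-bit value
  }
  return hash % partitions;
}
```

Each indexer replica then processes only events where `partitionFor(event.entityId, replicaCount)` equals its own index, so the stream splits cleanly with no coordination.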

📚 Further Resources

Performance Guide

Advanced optimization techniques for production

Testing Guide

Comprehensive testing strategies

Monitoring Best Practices

Setting up effective monitoring and alerts

Community Support

Get help from the Dubhe community