
    Deploy Continue.dev on Your RamNode VPS

    Transform your coding workflow with AI-powered code completion, chat, and editing capabilities running on your own infrastructure.

    • Setup time: 45-60 minutes
    • Recommended RAM: 4GB
    • Difficulty: Intermediate
    • Default port: 3000

    Why Choose Continue.dev on Your Own VPS?

    Continue.dev is an open-source AI code assistant that brings powerful code completion, chat, and editing capabilities directly into your development environment. Unlike cloud-based solutions, Continue.dev gives you complete control over your AI coding assistant by running it on your own infrastructure.

    Privacy and Security

    Your code never leaves your infrastructure. Perfect for proprietary codebases and organizations with strict data governance requirements.

    Full Customization

    Complete control over model selection and configuration. Choose from various AI models including local Ollama models.

    Cost Control

    Predictable hosting costs without per-token pricing. Pay only for your VPS, not for every API call.

    Consistent Performance

    Dedicated resources ensure reliable response times without rate limits or throttling.

    Prerequisites

    Recommended VPS Plan

    We recommend the 4GB Standard Cloud VPS plan for optimal Continue.dev performance with local AI models:

    • 4GB RAM minimum (8GB+ for better performance)
    • 2+ CPU cores for model processing
    • 80GB+ SSD storage for AI models
    • Sufficient bandwidth for development workflow

    Server Requirements

    • Ubuntu 22.04 LTS or newer
    • 4GB+ RAM (8GB recommended)
    • 20GB+ available storage
    • 2+ CPU cores

    Local Requirements

    • VS Code or compatible editor
    • SSH client for server access
    • Basic command-line familiarity
    • Domain name (optional)

    Step 1: Initial Server Setup

    Update System and Install Essentials

    # Update the system
    sudo apt update && sudo apt upgrade -y
    # Install essential packages
    sudo apt install -y curl wget git unzip software-properties-common apt-transport-https ca-certificates gnupg lsb-release

    Configure Firewall

    sudo apt install ufw -y
    sudo ufw default deny incoming
    sudo ufw default allow outgoing
    sudo ufw allow 22
    sudo ufw allow 80
    sudo ufw allow 443
    sudo ufw allow 3000
    sudo ufw enable

    Step 2: Install Docker and Docker Compose

    Install Docker

    Docker simplifies dependency management and ensures a consistent, reproducible deployment.

    # Add Docker's official GPG key
    curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
    # Add Docker repository
    echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
    # Update and install Docker
    sudo apt update
    sudo apt install -y docker-ce docker-ce-cli containerd.io
    # Add user to docker group
    sudo usermod -aG docker $USER
    # Start and enable Docker
    sudo systemctl start docker
    sudo systemctl enable docker

    Install Docker Compose

    sudo curl -L "https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
    sudo chmod +x /usr/local/bin/docker-compose
    docker-compose --version
    # Apply the docker group membership to the current shell
    # (alternatively, log out and back in)
    newgrp docker

    Step 3: Download and Configure Continue.dev

    Clone Repository

    cd /opt
    sudo git clone https://github.com/continuedev/continue.git
    sudo chown -R $USER:$USER continue
    cd continue

    Create Environment Configuration

    cp .env.example .env
    nano .env

    Configure these key variables:

    # Server configuration
    PORT=3000
    HOST=0.0.0.0
    # Security
    CONTINUE_SECRET_KEY=your_secure_random_key_here
    # Model configuration
    DEFAULT_MODEL=codellama:7b
    # Database
    DATABASE_URL=sqlite:///app/data/continue.db
    # Privacy
    TELEMETRY_ENABLED=false

    Generate Secure Key

    Generate a secure secret key:

    openssl rand -hex 32

    Use the output as the value of CONTINUE_SECRET_KEY in your .env file.

    Step 4: Set Up Ollama for Local AI Models

    Install Ollama

    Ollama serves local AI models that Continue.dev can use as its backend.

    curl -fsSL https://ollama.ai/install.sh | sh
    # Start Ollama service
    sudo systemctl start ollama
    sudo systemctl enable ollama

    Configure Ollama for Docker

    # Create service override directory
    sudo mkdir -p /etc/systemd/system/ollama.service.d
    # Create override configuration
    sudo tee /etc/systemd/system/ollama.service.d/override.conf > /dev/null <<EOF
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0:11434"
    EOF
    # Reload and restart
    sudo systemctl daemon-reload
    sudo systemctl restart ollama

    Download AI Models

    # Download CodeLlama (optimized for code)
    ollama pull codellama:7b
    # Optional: Download other models
    ollama pull llama2:7b
    ollama pull mistral:7b
    # List installed models
    ollama list

    Step 5: Create Docker Compose and Deploy

    Create Docker Compose Configuration

    nano docker-compose.yml

    Add this configuration:

    version: '3.8'
    services:
      continue-server:
        build:
          context: .
          dockerfile: Dockerfile
        ports:
          - "3000:3000"
        environment:
          - PORT=3000
          - HOST=0.0.0.0
          - CONTINUE_SECRET_KEY=${CONTINUE_SECRET_KEY}
          - DEFAULT_MODEL=codellama:7b
          - OLLAMA_BASE_URL=http://host.docker.internal:11434
        volumes:
          - ./data:/app/data
          - ./config:/app/config
        restart: unless-stopped
        extra_hosts:
          - "host.docker.internal:host-gateway"

    Then create the required directories:

    mkdir -p data config ollama-data
    chmod 755 data config ollama-data

    Build and Start Services

    docker-compose build
    docker-compose up -d
    # Check status
    docker-compose ps
    # View logs
    docker-compose logs -f continue-server

    Verify installation:

    curl http://localhost:3000/health
    curl http://localhost:11434/api/tags

    Step 6: Configure SSL/TLS with Nginx

    Install and Configure Nginx

    A reverse proxy with TLS is recommended for production deployments.

    sudo apt install nginx -y
    sudo nano /etc/nginx/sites-available/continue

    Add this configuration:

    server {
        listen 80;
        server_name your-domain.com;

        location / {
            proxy_pass http://127.0.0.1:3000;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection 'upgrade';
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_read_timeout 86400;
        }
    }

    Enable the site:

    sudo ln -s /etc/nginx/sites-available/continue /etc/nginx/sites-enabled/
    sudo rm /etc/nginx/sites-enabled/default
    sudo nginx -t
    sudo systemctl restart nginx

    Install SSL Certificate

    sudo apt install certbot python3-certbot-nginx -y
    sudo certbot --nginx -d your-domain.com
    sudo certbot renew --dry-run
    # Update firewall
    sudo ufw allow 'Nginx Full'
    sudo ufw delete allow 3000

    Step 7: Configure Continue.dev Client

    Install VS Code Extension

    1. Open Visual Studio Code
    2. Go to Extensions (Ctrl+Shift+X)
    3. Search for "Continue"
    4. Install the Continue extension by Continue

    Configure Extension

    Open VS Code settings (Ctrl+,) and search for "Continue", then set your server URL. Alternatively, create a .continuerc.json file in your project root with your server configuration.
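
    As a sketch, a project-level .continuerc.json pointing the extension at a self-hosted Ollama backend might look like the following. The field names follow Continue's config.json schema at the time of writing, and the apiBase URL is a placeholder for wherever you expose your model endpoint, so verify both against the current Continue documentation:

    ```json
    {
      "models": [
        {
          "title": "CodeLlama (self-hosted)",
          "provider": "ollama",
          "model": "codellama:7b",
          "apiBase": "https://your-domain.com"
        }
      ],
      "allowAnonymousTelemetry": false
    }
    ```

    Settings in this file merge with the extension's global configuration, so you can keep per-project model choices alongside a shared default.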

    Step 8: Performance Optimization

    Optimize Docker Resources

    # Pin the image platform (avoids emulation overhead on amd64 hosts)
    echo "DOCKER_DEFAULT_PLATFORM=linux/amd64" >> .env
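
    If the app container and Ollama compete for memory, you can also cap the container directly in docker-compose.yml. The limits below are illustrative only; tune them to your VPS plan:

    ```yaml
    services:
      continue-server:
        mem_limit: 2g   # hard memory cap for the app container
        cpus: 1.5       # leave CPU headroom for Ollama on the host
    ```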

    Configure Model Performance

    # Use a code-tuned model variant
    ollama pull codellama:7b-code
    # Limit concurrent model requests to lower memory usage
    echo 'Environment="OLLAMA_NUM_PARALLEL=1"' | sudo tee -a /etc/systemd/system/ollama.service.d/override.conf
    sudo systemctl daemon-reload
    sudo systemctl restart ollama

    Monitor Resources

    docker stats
    htop
    df -h

    Step 9: Backup and Maintenance

    Regular Updates

    cd /opt/continue
    git pull origin main
    docker-compose build --no-cache
    docker-compose up -d
    # Update models
    ollama pull codellama:7b
    # Clean up
    docker system prune -f

    Automated Backups

    Create a backup script to regularly backup your Continue.dev data and configuration files. Store backups in a secure location and test restoration procedures regularly.
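
    As a starting point, here is a minimal sketch of such a script, assuming the /opt/continue layout from Step 3 (paths can be overridden via the APP_DIR and BACKUP_DIR environment variables):

    ```shell
    # Save as backup-continue.sh and mark it executable
    cat > backup-continue.sh <<'EOF'
    #!/usr/bin/env bash
    set -euo pipefail

    APP_DIR="${APP_DIR:-/opt/continue}"               # deployment directory (Step 3)
    BACKUP_DIR="${BACKUP_DIR:-/var/backups/continue}"
    STAMP="$(date +%Y%m%d-%H%M%S)"

    mkdir -p "$BACKUP_DIR"

    # Archive the persistent volumes and the environment file
    tar -czf "$BACKUP_DIR/continue-$STAMP.tar.gz" -C "$APP_DIR" data config .env

    # Keep only the seven newest archives
    ls -1t "$BACKUP_DIR"/continue-*.tar.gz | tail -n +8 | xargs -r rm --

    echo "Backup written: $BACKUP_DIR/continue-$STAMP.tar.gz"
    EOF
    chmod +x backup-continue.sh
    ```

    Schedule it with cron (for example, a 3 a.m. nightly run: 0 3 * * * /opt/continue/backup-continue.sh), copy the archives off the VPS, and periodically test restoring one on a scratch machine.
    
    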

    Troubleshooting

    Service Won't Start

    docker-compose logs continue-server
    docker-compose logs ollama
    sudo journalctl -u ollama -f

    Memory Issues

    free -h
    # Use smaller quantized models
    ollama pull codellama:7b-instruct-q4_0

    Connection Issues

    curl -v http://localhost:3000/health
    curl -v http://your-vps-ip:3000/health
    sudo ufw status
    docker network ls


    Ready to Deploy?

    Get started with RamNode's high-performance VPS hosting