Run Your Own MCP Server on AWS

June 6, 2025 / Nirav Shah

As AI tooling becomes more embedded in our daily development workflows, the quality of context we feed into LLMs is becoming just as important as the model itself. Tools like ChatGPT, Cursor, GitHub Copilot, and Claude are impressive — but without real-time, structured context, they’re still operating in the dark.

That’s where MCP (Model Context Protocol) comes in.

In this guide, I’ll walk you through running your own MCP server on AWS — giving your developers a secure, real-time context bridge to feed internal data into their AI tools.

 

What Is MCP?

MCP is a lightweight protocol that enables tools like IDEs, terminal assistants, or chat interfaces to fetch structured data from your internal systems and serve it as context to an LLM.

Think of it as an API that LLMs use to ask:

“Hey, what’s the latest deployment log for Project X?”

“Show me the last 10 commits touching auth.py.”

“What’s the sales data for March from our internal DB?”

Your MCP server listens for these structured requests, executes them securely, and returns rich JSON context.
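In this guide's simplified setup, that contract is just JSON over HTTP POST (a full MCP implementation defines richer message types; the shape below matches the minimal server built later in this walkthrough):

```python
import json

# A structured context request, as a tool might send it. The "table"
# field is illustrative and matches the minimal server built below.
request = {"table": "users"}

# The server answers with matching rows, serialized as JSON context
# that the calling tool hands to the LLM.
response_rows = [
    {"id": 1, "name": "Nirav Shah", "email": "nirav@example.com", "role": "Admin"},
]

wire_request = json.dumps(request)
wire_response = json.dumps(response_rows)
```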

 

Architecture at a Glance

 

[Architecture diagram: AI tool → Nginx (HTTPS) → MCP server on EC2 → MySQL]


Hands-On Setup

This walkthrough uses:

  • AWS EC2 (Ubuntu)
  • Python 3.10+
  • MySQL
  • HTTPS using Nginx + Certbot


Boot an EC2 Instance

  • Ubuntu 22.04 LTS
  • Open ports 22 (SSH), 80 and 443 (HTTP/HTTPS), and 9000 (MCP) in the security group
  • SSH into the instance

 

Set Up MySQL and a Sample Table

sudo apt install mysql-server -y
sudo mysql_secure_installation

Then, inside the MySQL shell (sudo mysql):

CREATE DATABASE mcp_demo;
CREATE USER 'mcpuser'@'localhost' IDENTIFIED BY 'StrongPasswordHere';
GRANT ALL PRIVILEGES ON mcp_demo.* TO 'mcpuser'@'localhost';
FLUSH PRIVILEGES;

USE mcp_demo;
CREATE TABLE users (
  id INT AUTO_INCREMENT PRIMARY KEY,
  name VARCHAR(255),
  email VARCHAR(255),
  role VARCHAR(50)
);
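To give the table something to return, you can seed a few rows. A small sketch with a parameterized insert (the names and emails are illustrative; the driver binds the values, so there is no manual SQL string-building):

```python
# Sample rows to seed the users table (values are illustrative).
SEED_ROWS = [
    ("Nirav Shah", "nirav@example.com", "Admin"),
    ("Asha Patel", "asha@example.com", "Developer"),
]

# Parameterized insert: the driver binds each %s safely, so no manual
# escaping or string concatenation is needed.
INSERT_SQL = "INSERT INTO users (name, email, role) VALUES (%s, %s, %s)"

def seed(cursor, rows=SEED_ROWS):
    """Insert the sample rows via an already-open database cursor."""
    cursor.executemany(INSERT_SQL, rows)
```

Call seed() with a cursor from mysql.connector.connect(...) using the mcpuser credentials above, then commit the connection.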

 

Python MCP Server Code (Minimalist Version)

Save as mcp_server.py:

from http.server import BaseHTTPRequestHandler, HTTPServer
import json
import mysql.connector

# Only expose tables you explicitly list; never interpolate raw client
# input into SQL.
ALLOWED_TABLES = {"users"}

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        try:
            length = int(self.headers.get('Content-Length', 0))
            payload = json.loads(self.rfile.read(length))
            table = payload.get("table", "users")
            if table not in ALLOWED_TABLES:
                self.send_response(400)
                self.end_headers()
                self.wfile.write(b'{"error": "unknown table"}')
                return

            db = mysql.connector.connect(
                host="localhost", user="mcpuser",
                password="StrongPasswordHere", database="mcp_demo"
            )
            cur = db.cursor(dictionary=True)
            # Safe to format here: table was validated against the allow-list
            cur.execute(f"SELECT * FROM {table} LIMIT 100")
            rows = cur.fetchall()
            db.close()
            self.send_response(200)
            self.send_header('Content-Type', 'application/json')
            self.end_headers()
            self.wfile.write(json.dumps(rows, default=str).encode())
        except Exception as e:
            self.send_response(500)
            self.end_headers()
            self.wfile.write(f"Error: {e}".encode())

def run():
    HTTPServer(('', 9000), Handler).serve_forever()

if __name__ == '__main__':
    run()

Test it:

curl -X POST http://<your-ec2-ip>:9000 -H "Content-Type: application/json" -d '{"table": "users"}'
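The same request can be made from Python with only the standard library (the function names here are placeholders, not part of any MCP SDK):

```python
import json
import urllib.request

def build_request(url, table):
    """Build the JSON POST the MCP server expects."""
    body = json.dumps({"table": table}).encode()
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

def fetch_context(url, table):
    """Send the request and decode the JSON rows the server returns."""
    with urllib.request.urlopen(build_request(url, table)) as resp:
        return json.load(resp)

# e.g. fetch_context("http://<your-ec2-ip>:9000", "users")
```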

Add HTTPS with Nginx + Certbot:

sudo apt install nginx certbot python3-certbot-nginx -y

Configure Nginx:

server {
    listen 80;
    server_name mcp.yourdomain.com;

    location / {
        proxy_pass http://localhost:9000;
    }
}

sudo certbot --nginx -d mcp.yourdomain.com

Done. You now have an HTTPS-enabled MCP endpoint. Apply the security tips below before treating it as production-ready.

Sample MCP Response:

[
  {
    "id": 1,
    "name": "Nirav Shah",
    "email": "nirav@example.com",
    "role": "Admin"
  }
]
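Once the rows arrive, a calling tool typically flattens them into a prompt-friendly context block. A minimal formatter (illustrative only, not part of any particular tool's API):

```python
def rows_to_context(rows):
    """Render JSON rows as a compact text block an LLM can consume."""
    lines = []
    for row in rows:
        # One row per line, as key=value pairs
        lines.append(", ".join(f"{k}={v}" for k, v in row.items()))
    return "\n".join(lines)

rows = [{"id": 1, "name": "Nirav Shah", "email": "nirav@example.com", "role": "Admin"}]
print(rows_to_context(rows))
# id=1, name=Nirav Shah, email=nirav@example.com, role=Admin
```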

 

Integrating with Tools Like Claude or Cursor

Now that your MCP endpoint is live, you can register it in tools like:

  • Claude’s Custom Context Providers
  • Cursor IDE’s LLM API settings
  • Open-source developer copilots

They will now send structured POSTs like:

{ "table": "users" }

And your LLM will respond with live internal context.

 

Security Tips

 

  • Add an API token check inside do_POST
  • Enable IP allow-listing (especially for remote tools)
  • Log usage and throttle abuse
  • Avoid exposing arbitrary query support unless sanitized
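The first two tips can be sketched in a few lines (the token value, header scheme, and IP list below are placeholders you would choose yourself, ideally loading the token from an environment variable or secrets store):

```python
import hmac

API_TOKEN = "change-me"       # placeholder; load from env or a secrets store
ALLOWED_IPS = {"127.0.0.1"}   # placeholder allow-list for remote tools

def authorized(headers, client_ip):
    """Check the bearer token and source IP before touching the database."""
    token = headers.get("Authorization", "").removeprefix("Bearer ")
    # hmac.compare_digest avoids timing side channels on the comparison
    return hmac.compare_digest(token, API_TOKEN) and client_ip in ALLOWED_IPS
```

Inside do_POST, call authorized(self.headers, self.client_address[0]) and return 401 before running any query.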

 

Why This Matters

Running your own MCP server unlocks:

  • 💡 On-demand knowledge delivery to your AI tools
  • 🛡️ No sensitive data leaves your firewall
  • 🧠 Increased relevance in LLM outputs
  • 🧰 Build internal copilots with live org data

You’re not just giving your AI access to context; you’re taking control of it.
