Spin Up a Free Oracle Cloud Server: Serve Websites & APIs with NGINX

Oracle Cloud Infrastructure (OCI) offers an exceptionally generous "Always Free" tier, enabling users to explore and utilize cloud services without incurring costs. This tier provides a robust platform for hosting applications, including the ability to set up Virtual Machines (VMs) and serve applications using NGINX. With its competitive offerings, such as Arm-based Ampere A1 cores, AMD-based compute instances, and substantial storage and bandwidth allowances, Oracle Cloud stands out as a cost-effective solution for developers, small businesses, and tech enthusiasts.

The "Always Free" tier includes up to 4 VMs, comprising 2 x86 instances (1 vCPU, 1 GB RAM each) and 2 Ampere instances (up to 4 cores and 24 GB RAM shared). Additionally, users receive 200 GB of block storage, 10 TB of outbound data transfer per month, and other essential resources. These features make it possible to deploy scalable applications at no cost, provided users remain within the free tier limits. For more details about the free tier, visit the Oracle Cloud Free Tier page.

In this guide, you’ll learn how to:

  • Set up an Oracle Cloud free-tier account.
  • Provision a VM running Ubuntu.
  • Install and configure NGINX.
  • Deploy a sample web application behind NGINX.
  • Set up firewall rules and security settings.

Whether you are new to cloud computing or looking for a budget-friendly hosting solution, this guide will help you leverage Oracle Cloud's free-tier offerings effectively. This setup is ideal for hosting websites, APIs, or even running backend services for personal or small-scale projects.

For additional context, tutorials such as Jakob Osterberger's guide and Ryan Harrison's walkthrough highlight the practical benefits of using Oracle Cloud for hosting applications. Furthermore, resources like the Oracle Linux NGINX installation guide provide valuable insights into configuring NGINX on Oracle's infrastructure.

By the end of this guide, you will have a fully functional cloud-based server running NGINX, capable of serving your application efficiently and securely.

Let's get started!

Example Use Cases

Hey, before we roll up our sleeves and get into the details, let's take a look at some cool ways you can use Oracle Cloud’s Free Tier. This setup—complete with an Nginx-powered VM—opens the door to a ton of creative and practical projects, all while staying within the no-cost limits. Below, you’ll find a few inspiring ideas that just might spark your next DIY endeavor.

  • Self-Hosted VPN or Proxy for IoT Edge Devices: For developers managing IoT edge devices, a self-hosted VPN or proxy server can be a game-changer. By setting up a lightweight VPN (such as WireGuard, OpenVPN, or Tailscale) or a reverse proxy on your Oracle Cloud VM, you can securely route traffic from distributed IoT devices to a centralized location. This setup not only enhances security by encrypting data in transit but also simplifies device management. For example, you could configure your VM to act as a gateway for remote firmware updates or data aggregation, ensuring that your IoT devices remain both secure and functional without relying on third-party services.
  • Mini CI/CD Runner (Continuous Integration): Small development teams or solo developers can utilize their Oracle Cloud VM as a mini CI/CD runner. Tools like Jenkins, GitLab Runner, or GitHub Actions self-hosted runners can be installed on the VM to automate testing, building, and deployment pipelines for your projects. While the free tier’s resource limits may not support large-scale builds, they are more than sufficient for lightweight applications or microservices. This setup allows you to maintain control over your build environment, reduce reliance on external CI/CD platforms, and avoid potential costs associated with proprietary solutions.
  • Low-Traffic Game Server or Chat Server: If you’re looking to host a small multiplayer game or a chat server for friends, family, or a niche community, Oracle Cloud’s Free Tier is a perfect fit. For example, you could set up a Minecraft server, a MUD (Multi-User Dungeon), or even a lightweight chat application using tools like Matrix or Discord bots. By optimizing your VM with Nginx as a reverse proxy and enabling SSL encryption, you can ensure a secure and responsive experience for users. While the free tier isn’t designed for high-traffic scenarios, it works exceptionally well for small-scale, low-latency applications where performance and cost-efficiency are key.
  • Tor Onion Service or Privacy Relay: Privacy-conscious users can leverage their Oracle Cloud VM to host a Tor Onion Service or act as a privacy relay. An Onion Service allows you to host websites or applications anonymously on the Tor network, accessible only via .onion addresses. Alternatively, you could configure your VM as a Tor relay to contribute bandwidth to the Tor network, helping to improve its overall resilience and anonymity. This use case aligns well with the ethical principles of decentralization and privacy, making it an impactful way to utilize your free-tier resources for the greater good.
  • Tiny, High-Availability “Cluster” with Multiple Free Accounts: For those seeking to experiment with high availability or distributed systems, Oracle Cloud’s Free Tier offers a unique opportunity to create a tiny “cluster” across multiple accounts. Technically, if you have access to multiple personal or friend/family accounts (each in compliance with Oracle’s terms of service), you can orchestrate a multi-node setup spanning several Always Free VMs. For instance, you could deploy a load-balanced web application, a distributed database like CockroachDB, or even a Kubernetes cluster using tools like k3s. While this approach requires careful coordination and adherence to Oracle’s policies, it demonstrates the potential for creativity and scalability within the constraints of the free tier.

These use cases highlight the flexibility and potential of Oracle Cloud’s Free Tier VMs. Whether you’re securing IoT devices, automating workflows, hosting niche applications, enhancing privacy, or experimenting with distributed systems, the free tier provides a robust foundation for innovation.

Creating and Configuring an Oracle Cloud Free Tier VM to Serve an Application via Nginx

Selecting the Appropriate VM Shape and Image

When setting up a free-tier VM on Oracle Cloud, it's crucial to select the correct VM shape and operating system image. The Oracle Cloud Free Tier offers two types of VM instances:

  1. AMD-based VM (VM.Standard.E2.1.Micro):
    • 1/8 OCPU (1 vCPU equivalent).
    • 1 GB of RAM.
    • Suitable for lightweight applications or testing environments.
  2. ARM-based Ampere A1 VM (VM.Standard.A1.Flex):
    • Up to 4 OCPUs and 24 GB of RAM (shared across instances).
    • Ideal for more resource-intensive applications or multiple services.

To create a VM:

  • Log in to the Oracle Cloud Console.
  • Navigate to Compute > Instances and click Create Instance.
  • Under "Image and Shape," click Edit.
  • For the image, select an Always Free Eligible OS, such as Canonical Ubuntu 22.04 or Oracle Linux 8.7.
  • For the shape, choose either VM.Standard.E2.1.Micro or VM.Standard.A1.Flex. For ARM-based instances, allocate the desired OCPUs and memory.

This step ensures that your VM remains within the free tier limits, avoiding unexpected charges.
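
If you prefer the command line, the same instance can be launched with the OCI CLI. The sketch below is an assumption-laden example, not Oracle's canonical workflow: it assumes the CLI is installed and configured (oci setup config), and every OCID shown is a placeholder you must replace with values from your own tenancy.

# Launch an Always Free Ampere A1 instance via the OCI CLI (all OCIDs are placeholders)
oci compute instance launch \
  --availability-domain "<AD_NAME>" \
  --compartment-id "<COMPARTMENT_OCID>" \
  --shape "VM.Standard.A1.Flex" \
  --shape-config '{"ocpus": 2, "memoryInGBs": 12}' \
  --image-id "<UBUNTU_2204_IMAGE_OCID>" \
  --subnet-id "<PUBLIC_SUBNET_OCID>" \
  --assign-public-ip true \
  --ssh-authorized-keys-file ~/.ssh/id_rsa.pub \
  --display-name "nginx-free-tier"

Keeping the OCPU and memory values at or below 4 OCPUs / 24 GB total across your A1 instances keeps the launch within the Always Free limits.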

Configuring Networking and Security Settings

Proper networking and security configurations are essential for serving applications. Oracle Cloud VMs are protected by default firewalls, so you must explicitly allow traffic to your application.

  1. Create a Virtual Cloud Network (VCN):
    • During VM creation, select Create new virtual cloud network in the "Configure Networking" section.
    • Ensure the VCN includes a public subnet for internet access.
  2. Modify Security Rules:
    • After creating the VM, go to Networking > Virtual Cloud Networks.
    • Select your VCN and navigate to the Default Security List.
    • Add ingress rules to allow HTTP (port 80) and HTTPS (port 443) traffic:
      • Source CIDR: 0.0.0.0/0 (allows traffic from any IP).
      • Protocol: TCP.
      • Port Range: 80 and 443.
  3. Assign a Public IP:
    • Ensure your VM instance has a public IP address for external access.
    • In the Instance Details page, verify that a public IP is assigned. If not, create one under Networking > Public IPs.

These steps ensure that your VM is accessible from the internet, enabling Nginx to serve applications.
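
Once the security list rules are in place, you can sanity-check reachability from your local machine. This is a quick sketch using curl; at this point, before Nginx is installed, "connection refused" is actually good news, while a timeout or "no route to host" usually means the security list or the VM's local firewall is still blocking the port.

# From your laptop: test whether ports 80/443 on the VM's public IP are reachable.
# "Connection refused" = the packet reached the VM but nothing is listening yet (expected for now).
# Timeout / "no route to host" = traffic is being dropped; revisit the security list and VM firewall.
curl -m 5 -v http://<VM_PUBLIC_IP>/ 2>&1 | tail -n 3
curl -m 5 -vk https://<VM_PUBLIC_IP>/ 2>&1 | tail -n 3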

Generating and Adding SSH Keys

SSH keys are required to securely access your Oracle Cloud VM. If you do not already have an SSH key pair, you can generate one using the following steps:

  1. Generate SSH Keys:
    • On Linux/macOS, run:
 ssh-keygen -t rsa -b 2048
    • On Windows, use tools like PuTTYgen to create the key pair.
  2. Add the Public Key to the VM:
    • During VM creation, paste the contents of your public key (~/.ssh/id_rsa.pub) into the Add SSH Keys section.
    • Alternatively, upload the public key file.
  3. Test SSH Access:
    • After the VM is created, connect to it using the public IP (the default user is ubuntu on Ubuntu images and opc on Oracle Linux):
 ssh -i ~/.ssh/id_rsa ubuntu@<VM_PUBLIC_IP>

This ensures secure access to your VM for further configuration and deployment.
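
To avoid retyping the key path and IP every time, you can add a host alias to your local ~/.ssh/config. A minimal sketch, assuming the key and user shown above (the "oci-nginx" alias name is just an example):

# ~/.ssh/config on your local machine
Host oci-nginx
    HostName <VM_PUBLIC_IP>
    # Use "opc" instead of "ubuntu" for Oracle Linux images
    User ubuntu
    IdentityFile ~/.ssh/id_rsa
    ServerAliveInterval 60

After saving, connecting is as simple as running ssh oci-nginx.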

Installing Nginx on the VM

Once the VM is set up, you can install and configure Nginx to serve your application. The process varies slightly depending on the operating system.

  1. Update the System:
    • On Ubuntu:
 sudo apt update && sudo apt upgrade -y
    • On Oracle Linux:
 sudo dnf update -y
  2. Install Nginx:
    • On Ubuntu:
 sudo apt install nginx -y
    • On Oracle Linux:
 sudo dnf install nginx -y
  3. Start and Enable Nginx:
    • Start the Nginx service:
 sudo systemctl start nginx
    • Enable it to start on boot:
 sudo systemctl enable nginx
  4. Verify Installation:
    • Open a browser and navigate to your VM's public IP. You should see the default Nginx welcome page. If the page does not load, confirm that ports 80 and 443 are open in both the VCN security list and the VM's local firewall (see "Configuring Firewall Rules for NGINX" below).

This step ensures that Nginx is installed and running, ready to serve your application.
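
You can also verify from the VM itself, which is useful while the public path is still locked down. A quick sketch:

# On the VM: confirm Nginx is running and answering locally
sudo systemctl status nginx --no-pager
curl -I http://127.0.0.1/   # expect "HTTP/1.1 200 OK" and a "Server: nginx" header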

Configuring Nginx to Serve an Application

To serve an application via Nginx, you need to configure a server block (virtual host). This involves creating a configuration file and pointing it to your application's root directory.

  1. Create a Directory for Your Application:
    • Create a directory to host your application files:
 sudo mkdir -p /var/www/myapp
    • Set the appropriate permissions:
 sudo chown -R $USER:$USER /var/www/myapp
 sudo chmod -R 755 /var/www/myapp
  2. Create an Nginx Server Block:
    • Create a new configuration file (the sites-available/sites-enabled layout is Ubuntu's; on Oracle Linux, create /etc/nginx/conf.d/myapp.conf instead):
 sudo nano /etc/nginx/sites-available/myapp
    • Add the following configuration:

 server {
     listen 80;
     server_name your_domain_or_public_ip;
     root /var/www/myapp;
     index index.html;
     location / {
         try_files $uri $uri/ =404;
     }
 }
  3. Enable the Server Block:
    • Create a symbolic link to enable the configuration:
 sudo ln -s /etc/nginx/sites-available/myapp /etc/nginx/sites-enabled/
    • Test the Nginx configuration:
 sudo nginx -t
    • Reload Nginx:
 sudo systemctl reload nginx
  4. Deploy Your Application:
    • Place your application's files (e.g., index.html) in /var/www/myapp.
  5. Access Your Application:
    • Open a browser and navigate to your VM's public IP or domain name. Your application should now be accessible.

This configuration allows Nginx to serve your application efficiently.
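
If you don't have application files handy yet, a placeholder page is enough to prove the server block works. A minimal sketch (the page content is arbitrary, and the Host header should match whatever you put in server_name):

# Create a simple placeholder page and request it through the new server block
echo "<h1>Hello from /var/www/myapp</h1>" | sudo tee /var/www/myapp/index.html
curl -H "Host: your_domain_or_public_ip" http://127.0.0.1/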

Enabling HTTPS with a Free SSL Certificate

Securing your application with HTTPS is critical for protecting user data and improving SEO rankings. You can use Let's Encrypt to obtain a free SSL certificate.

  1. Install Certbot:
  • On Ubuntu:
 sudo apt install certbot python3-certbot-nginx -y
  • On Oracle Linux (Certbot is provided by the EPEL repository, which you may need to enable first):
 sudo dnf install certbot python3-certbot-nginx -y
  2. Obtain an SSL Certificate:
  • Run the Certbot command to automatically configure Nginx and obtain a certificate:
 sudo certbot --nginx -d your_domain
  • Follow the prompts to complete the process.
  3. Test HTTPS:
  • Open a browser and navigate to https://your_domain. The connection should now be secure.
  4. Set Up Automatic Renewal:
  • Certbot automatically renews certificates, but you can test the renewal process:
 sudo certbot renew --dry-run

This step ensures that your application is served securely over HTTPS.
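
On Ubuntu, the apt-packaged Certbot schedules renewal through a systemd timer (the timer name can differ on other distributions or with the snap package, so treat this as a sketch). Two quick checks confirm the certificate exists and that renewal is wired up:

# List the certificates Certbot manages and look for its renewal timer
sudo certbot certificates
systemctl list-timers | grep -i certbot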

By following these steps, you can successfully create and configure an Oracle Cloud Free Tier VM to serve an application via Nginx. Each section builds upon the previous one, ensuring a comprehensive and secure setup.

Configuring Firewall Rules for NGINX

Allowing HTTP and HTTPS Traffic

To ensure that NGINX can serve applications over the web, you must configure the firewall to allow HTTP (port 80) and HTTPS (port 443) traffic. This step is crucial for enabling external access to your web server.

On Oracle Linux, which uses firewalld, run the following commands to open the necessary ports:

sudo firewall-cmd --permanent --zone=public --add-service=http

sudo firewall-cmd --permanent --zone=public --add-service=https

sudo firewall-cmd --reload

Alternatively, on Ubuntu images (which ship with iptables rules managed by netfilter-persistent rather than firewalld), you can execute:

sudo iptables -I INPUT 6 -m state --state NEW -p tcp --dport 80 -j ACCEPT

sudo iptables -I INPUT 6 -m state --state NEW -p tcp --dport 443 -j ACCEPT

sudo netfilter-persistent save

These commands ensure that traffic on ports 80 and 443 is allowed, enabling NGINX to handle HTTP and HTTPS requests.
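
Before testing from outside, it helps to confirm which rules are actually active on the VM:

# Oracle Linux (firewalld): show services and ports allowed in the public zone
sudo firewall-cmd --zone=public --list-all

# Ubuntu (iptables): show INPUT rules with their positions
sudo iptables -L INPUT -n --line-numbers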

Configuring Security Lists in Oracle Cloud

In addition to configuring the firewall on the VM, you must update the security lists in Oracle Cloud to allow inbound traffic.

  1. Navigate to Networking > Virtual Cloud Networks in the Oracle Cloud dashboard.
  2. Select your Virtual Cloud Network (VCN).
  3. Go to the Default Security List under the selected VCN.
  4. Add ingress rules for HTTP and HTTPS:
  • Source CIDR: 0.0.0.0/0 (allows traffic from all IPs).
  • Protocol: TCP.
  • Destination Port Range: 80 (for HTTP) and 443 (for HTTPS).

This configuration ensures that your VM is accessible from the internet.


Automating NGINX Startup

Enabling NGINX to Start on Boot

To ensure that NGINX starts automatically after a system reboot, you need to enable the service:

sudo systemctl enable nginx

This command registers NGINX as a service that starts during the boot process. This step is essential for maintaining uptime and ensuring that your application remains accessible even after planned or unplanned reboots.
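
You can confirm the unit is enabled and, optionally, prove it survives a reboot:

# Check that the service is enabled and currently running
systemctl is-enabled nginx    # should print "enabled"
systemctl is-active nginx     # should print "active"

# Optional: reboot and re-test once the VM comes back up
sudo reboot
# ...after reconnecting:
curl -I http://127.0.0.1/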


Optimizing NGINX Configuration for Performance

Adjusting Worker Processes and Connections

NGINX can handle a large number of simultaneous connections, but its default configuration may not be optimized for high traffic. Modify the nginx.conf file to adjust worker processes and connections:

  1. Open the configuration file:

sudo nano /etc/nginx/nginx.conf

  2. Set worker_processes in the main (top-level) context and worker_connections inside the events block:
worker_processes auto;
events {
    worker_connections 1024;
}
  3. Save and exit the file, then reload NGINX:

sudo systemctl reload nginx

This configuration ensures that NGINX utilizes available CPU cores efficiently and can handle more concurrent connections.
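
To pick sensible values, check how many cores and how many open file descriptors the VM actually has; worker_connections cannot usefully exceed the per-process file-descriptor limit. A quick sketch:

# Number of CPU cores the VM exposes (this is what "worker_processes auto" uses)
nproc

# Per-process open-file limit; if you want worker_connections higher than this,
# also raise worker_rlimit_nofile in nginx.conf
ulimit -n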

Enabling Gzip Compression

To reduce bandwidth usage and improve load times, enable Gzip compression in the NGINX configuration:

  1. Open the configuration file:

sudo nano /etc/nginx/nginx.conf

  2. Add the following lines under the http block:
gzip on;
gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;
gzip_min_length 1000;
  3. Save and exit the file, then reload NGINX:

sudo systemctl reload nginx

This optimization compresses responses before sending them to clients, reducing page load times.
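
To confirm compression is active, request a page with an Accept-Encoding header and look for Content-Encoding: gzip in the response. Note that the response body must be larger than gzip_min_length (1000 bytes here) or Nginx will skip compression:

# Fetch headers with compression requested; expect "Content-Encoding: gzip" for large responses
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" http://127.0.0.1/ | grep -i "content-encoding"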


Configuring Logging and Monitoring

Enabling Access and Error Logs

NGINX provides detailed logs that can help you monitor traffic and troubleshoot issues. By default, access and error logs are enabled in the main configuration file.

  1. Verify the log file paths in /etc/nginx/nginx.conf:
access_log /var/log/nginx/access.log;
error_log /var/log/nginx/error.log;
  2. To view the logs in real time, use the tail command:
sudo tail -f /var/log/nginx/access.log
sudo tail -f /var/log/nginx/error.log

These logs are invaluable for identifying traffic patterns and diagnosing server errors.
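
Beyond tailing, a couple of one-liners against the default combined log format can surface quick traffic insights:

# Top 10 client IPs by request count
awk '{print $1}' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head

# Count of responses by HTTP status code
awk '{print $9}' /var/log/nginx/access.log | sort | uniq -c | sort -rn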

Integrating with Monitoring Tools

For more advanced monitoring, integrate NGINX with tools like Prometheus or Grafana. Install the NGINX Exporter to expose metrics for Prometheus:

  1. Download and install the NGINX Exporter (check the project's releases page for the exact asset name and version for your architecture):
wget https://github.com/nginxinc/nginx-prometheus-exporter/releases/latest/download/nginx-prometheus-exporter-linux-amd64.tar.gz

tar -xvzf nginx-prometheus-exporter-linux-amd64.tar.gz
  2. Configure the exporter to scrape metrics from NGINX's stub_status endpoint (see the configuration sketch after this list):
./nginx-prometheus-exporter -nginx.scrape-uri=http://127.0.0.1:8080/stub_status
  3. Add the exporter as a target in your Prometheus configuration.

This setup provides real-time insights into server performance and traffic.
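
Note that the exporter reads Nginx's stub_status output, which is not exposed by default. A minimal sketch of a local-only status endpoint matching the scrape URI above; it assumes you place it in a file that your nginx.conf includes, for example /etc/nginx/conf.d/status.conf:

# Local-only status endpoint for the Prometheus exporter
server {
    listen 127.0.0.1:8080;

    location /stub_status {
        stub_status;
        allow 127.0.0.1;
        deny all;
    }
}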


Implementing Advanced Security Measures

Configuring a Firewall for Specific IPs

To restrict access to your server, configure the firewall to allow traffic only from specific IP addresses:

  1. Add rules to the firewall, replacing <TRUSTED_IP> with the address you want to allow:
sudo iptables -A INPUT -p tcp -s <TRUSTED_IP> --dport 80 -j ACCEPT
sudo iptables -A INPUT -p tcp -s <TRUSTED_IP> --dport 443 -j ACCEPT
  2. Save the configuration:
sudo netfilter-persistent save

This approach enhances security by limiting access to trusted IPs.
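
If you only need to protect part of a site rather than a whole port, Nginx's own allow/deny directives are a lighter-weight alternative to firewall rules. A sketch (the /admin/ path and the IP are placeholders for your own values):

# Inside your server block: restrict a sensitive path to a single trusted address
location /admin/ {
    allow <TRUSTED_IP>;
    deny all;
}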

Enabling Rate Limiting

To prevent abuse, enable rate limiting in the NGINX configuration:

  1. Open the configuration file:
sudo nano /etc/nginx/nginx.conf
  2. Add the following line under the http block:
limit_req_zone $binary_remote_addr zone=one:10m rate=10r/s;
  3. Apply the rate limit to a specific location block:
server {
    location / {
        limit_req zone=one burst=5;
    }
}
  4. Save and reload NGINX:
sudo systemctl reload nginx

This configuration limits the number of requests a client can make, protecting your server from denial-of-service attacks.
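
Two optional refinements, sketched here because the right values depend on your traffic: return 429 instead of the default 503 for throttled requests, and let short bursts through without queueing delay.

# Inside the http block: use 429 (Too Many Requests) for throttled clients
limit_req_status 429;

# Inside the location block: absorb short bursts without delaying them
limit_req zone=one burst=10 nodelay;

A quick loop of requests, such as for i in $(seq 1 20); do curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1/; done, should start returning 429 once the limit is exceeded.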

Configuring NGINX to Serve Applications on Oracle Cloud Free Tier VM

Setting Up a Custom Server Block for Multiple Applications

While the existing content discusses creating a single server block to serve an application, this section focuses on configuring NGINX to host multiple applications on the same Oracle Cloud Free Tier VM. This is particularly useful when you want to maximize the resources provided by Oracle's free tier.

  1. Create Directories for Each Application

For each application, create a separate directory to host its files:

sudo mkdir -p /var/www/app1
sudo mkdir -p /var/www/app2

Assign appropriate permissions:

sudo chown -R $USER:$USER /var/www/app1
sudo chown -R $USER:$USER /var/www/app2
sudo chmod -R 755 /var/www/app1
sudo chmod -R 755 /var/www/app2
  2. Create Separate NGINX Configuration Files

For each application, create a new server block configuration file:

sudo nano /etc/nginx/sites-available/app1

Add the following configuration:

server {
   listen 80;  
   server_name app1.yourdomain.com;  
   root /var/www/app1;  
   index index.html;  
   location / {  
       try_files $uri $uri/ =404;  
   }  
}

Repeat the process for app2 with its respective domain and directory.

  3. Enable the Server Blocks

Create symbolic links for each configuration file:

sudo ln -s /etc/nginx/sites-available/app1 /etc/nginx/sites-enabled/
sudo ln -s /etc/nginx/sites-available/app2 /etc/nginx/sites-enabled/
  4. Test and Reload NGINX

Test the configuration for syntax errors:

sudo nginx -t

If no errors are found, reload NGINX:

sudo systemctl reload nginx

This setup allows you to serve multiple applications from the same VM by leveraging NGINX's virtual hosting capabilities.
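
Before DNS records for app1 and app2 exist, you can verify the name-based routing by spoofing the Host header (the domains below are the same placeholders used above):

# Each request should return the index page of the matching application
curl -H "Host: app1.yourdomain.com" http://<VM_PUBLIC_IP>/
curl -H "Host: app2.yourdomain.com" http://<VM_PUBLIC_IP>/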


Configuring NGINX for Static and Dynamic Content

While the existing content mentions serving static files, this section dives deeper into configuring NGINX to handle both static and dynamic content efficiently.

  1. Serving Static Files

Create a directory for static files:

sudo mkdir -p /var/www/static

Add some static files, for example:

echo "Welcome to the Static Site" | sudo tee /var/www/static/index.html

Update the NGINX configuration file:

server {
   listen 80;  
   server_name static.yourdomain.com;  
   root /var/www/static;  
   location / {  
       try_files $uri $uri/ =404;  
   }  
}
  2. Forwarding Dynamic Requests

For dynamic content (e.g., Node.js or Python applications), configure NGINX as a reverse proxy:

server {
   listen 80;  
   server_name dynamic.yourdomain.com;  
   location / {  
       proxy_pass http://localhost:4556;  
       proxy_http_version 1.1;  
       proxy_set_header Upgrade $http_upgrade;  
       proxy_set_header Connection 'upgrade';  
       proxy_set_header Host $host;  
       proxy_cache_bypass $http_upgrade;  
   }  
}

This configuration forwards requests to a backend application running on port 4556.

By separating static and dynamic content handling, you can optimize performance and simplify application management.
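
To confirm the proxy path works before your real application is ready, you can stand in a throwaway backend on port 4556. A minimal sketch using Python's built-in HTTP server as the stand-in (substitute your actual Node.js or Python app later):

# Start a disposable backend bound to localhost:4556
mkdir -p /tmp/backend && echo "hello from the backend" > /tmp/backend/index.html
python3 -m http.server 4556 --bind 127.0.0.1 --directory /tmp/backend &

# Request through Nginx; the response should come from the backend
curl -H "Host: dynamic.yourdomain.com" http://127.0.0.1/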


Configuring Load Balancing with NGINX

This section introduces load balancing, which is not covered in the existing content. Load balancing is essential for scaling applications across multiple backend servers.

  1. Set Up Backend Servers

Assume you have two backend servers running on ports 3001 and 3002.

  2. Configure NGINX for Load Balancing

Edit the NGINX configuration file:

upstream backend {
   server localhost:3001;  
   server localhost:3002;  
}

server {
   listen 80;
   server_name loadbalanced.yourdomain.com;
   location / {  
       proxy_pass http://backend;  
       proxy_http_version 1.1;  
       proxy_set_header Upgrade $http_upgrade;  
       proxy_set_header Connection 'upgrade';  
       proxy_set_header Host $host;  
       proxy_cache_bypass $http_upgrade;  
   }  
}
  3. Test and Reload NGINX

Test the configuration:

sudo nginx -t

Reload NGINX:

sudo systemctl reload nginx

This setup distributes incoming traffic across multiple backend servers, improving scalability and fault tolerance.
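
You can watch the default round-robin behaviour with two throwaway backends that return distinguishable responses (placeholders for your real services):

# Two disposable backends on the ports referenced in the upstream block
mkdir -p /tmp/b1 /tmp/b2
echo "response from backend 3001" > /tmp/b1/index.html
echo "response from backend 3002" > /tmp/b2/index.html
python3 -m http.server 3001 --bind 127.0.0.1 --directory /tmp/b1 &
python3 -m http.server 3002 --bind 127.0.0.1 --directory /tmp/b2 &

# Repeated requests should alternate between the two responses
for i in $(seq 1 6); do curl -s -H "Host: loadbalanced.yourdomain.com" http://127.0.0.1/; done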


Implementing Caching for Static Assets

Caching is a powerful way to enhance performance by reducing server load and speeding up content delivery.

  1. Enable Browser Caching

Add caching headers to your NGINX configuration:

location /assets/ {
   root /var/www/static;  
   expires 30d;  
   add_header Cache-Control "public";  
}
  2. Enable Microcaching for Dynamic Content

Configure NGINX to cache dynamic responses for a short duration:

proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=microcache:10m max_size=100m inactive=1m use_temp_path=off;

server {   
    location / {  
       proxy_cache microcache;  
       proxy_cache_valid 200 1m;  
       proxy_pass http://backend;  
   }  
}
  3. Test and Reload NGINX

Test the configuration:

sudo nginx -t

Reload NGINX:

sudo systemctl reload nginx

Caching significantly reduces response times and improves user experience.
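
To see whether a response was actually served from the microcache, you can expose Nginx's $upstream_cache_status variable as a response header. A sketch, added inside the proxied location block:

# Inside the location block that uses proxy_cache
add_header X-Cache-Status $upstream_cache_status;

Two quick requests with curl -s -D - -o /dev/null http://127.0.0.1/ | grep -i x-cache-status should show MISS on the first hit and HIT for requests within the next minute.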


Configuring NGINX for Custom Error Pages

This section adds a unique focus on customizing error pages, which is not covered in the existing content. Custom error pages improve user experience during server issues.

  1. Create Custom Error Pages

Create a directory for error pages:

sudo mkdir -p /var/www/errors

Add an example error page:

echo "404 - Page Not Found" | sudo tee /var/www/errors/404.html
  2. Update NGINX Configuration

Add the following lines to your server block:

error_page 404 /404.html;

location = /404.html {
    root /var/www/errors;  
}
  3. Test and Reload NGINX

Test the configuration:

sudo nginx -t

Reload NGINX:

sudo systemctl reload nginx

Custom error pages provide a professional touch and help users navigate issues effectively.
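
The same pattern extends to server-side errors, which is handy when a proxied backend is down. Create a page first, for example echo "We'll be right back." | sudo tee /var/www/errors/50x.html, then add to the server block:

# Serve a friendly page for common upstream/server errors
error_page 500 502 503 504 /50x.html;

location = /50x.html {
    root /var/www/errors;
}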


By implementing these configurations, you can optimize your Oracle Cloud Free Tier VM to serve applications efficiently and scale effectively.

Conclusion

In this article, we’ve walked through a detailed roadmap for setting up an Oracle Cloud Free Tier VM to host and serve applications using Nginx. From the initial steps of creating a virtual machine to advanced configurations that optimize performance and security, this guide has aimed to provide a holistic approach to deploying web applications in a cost-effective manner.

One of the key takeaways is the importance of selecting the right VM shape—whether AMD-based or ARM-based—to make the most of the free tier’s resource limits. Equally critical are the networking and security configurations, such as setting up a Virtual Cloud Network (VCN), assigning public IPs, and fine-tuning security rules to allow HTTP and HTTPS traffic. These steps ensure that your VM is not only accessible but also protected from unauthorized access. Additionally, generating SSH keys and testing secure connections are foundational practices for managing your VM with confidence.

The implications of this guide extend beyond technical implementation. For developers and small businesses, leveraging Oracle Cloud’s free tier offers a powerful opportunity to deploy scalable and secure applications without incurring additional costs. By following the steps outlined here, users can build robust web applications while maximizing the benefits of Oracle’s free-tier offerings.

Looking ahead, there are exciting opportunities to further enhance your setup. Integrating monitoring tools like Prometheus and Grafana can provide real-time insights into your application’s performance, as detailed in the NGINX Prometheus Exporter documentation. Alternatively, exploring containerization with Docker could add flexibility to your deployment strategy.

Ultimately, this guide serves as a foundation for anyone looking to harness the potential of Oracle Cloud’s free tier. With the right configurations and optimizations, you can create a reliable, high-performing environment for your applications—without breaking the bank. Whether you’re a developer experimenting with cloud technologies or a small business owner seeking cost-effective solutions, this approach empowers you to achieve more with less. For further details on Nginx optimization, refer to the Oracle Linux Nginx documentation.


💡 Important Disclosure

This article contains affiliate links, which means I may earn a small commission if you click through and make a purchase—at no additional cost to you. These commissions help support the ongoing creation of helpful content like this. Rest assured, I only recommend products and services I personally use or genuinely believe can provide value to you.

Thanks for Your Support!
I truly appreciate you taking the time to read my article. If you found it helpful, please consider sharing it with your friends or fellow makers. Your support helps me continue creating content like this.

  • Leave a Comment: Got questions or project ideas? Drop them below—I'd love to hear from you!
  • Subscribe: For more tutorials, guides, and tips, subscribe to my YouTube channel and stay updated on all things tech!
  • Shop & Support: If you're ready to get started, check out the recommended products in my articles using my affiliate links. It helps keep the lights on without costing you anything extra!

Thanks again for being part of this community, and happy building!