Setting up AWS Production MariaDB

This was an extremely exciting and pivotal milestone: I “set up” my AWS Production MariaDB. Kind of. Honestly, the AWS RDS options can feel… pricey for somebody with limited funds. At $12 a month for an extremely low-usage database, that’s just too expensive. However, an AWS Lightsail WordPress instance already includes MariaDB. So for that low monthly rate of $5, I get both a 24×7 WordPress site AND a relational database. Now that the application is “mostly finished” (I can go back later and play with some Python statistics libraries), I’m ready to promote my database to a production environment.

Here are the steps for setting up my production database.

  • Database Schema Creation
  • SSH Tunnel
  • Export Data
  • Build Production Database
  • Import Data
  • .env File Notes

Though development passwords are shared (they only run on my local laptop), production ones will not be.

Database Schema Creation

The first thing I did was SSH onto my WordPress instance. To find the database connection settings, I ran the command:

cat /opt/bitnami/wordpress/wp-config.php

and looked for the block that looks much like this:

define( 'DB_NAME', 'your_database_name' );
define( 'DB_USER', 'your_database_user' );
define( 'DB_PASSWORD', 'your_database_password' );
define( 'DB_HOST', 'your_database_host' );
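
Rather than eyeballing the file, those define() lines can also be pulled out programmatically. A small sketch using only the standard library (the path is the Bitnami default):

```python
import re

# Matches lines like: define( 'DB_NAME', 'your_database_name' );
DEFINE_RE = re.compile(r"define\(\s*'(DB_\w+)'\s*,\s*'([^']*)'\s*\)")

def read_db_settings(wp_config_text):
    """Extract the DB_* constants from wp-config.php contents."""
    return dict(DEFINE_RE.findall(wp_config_text))

if __name__ == "__main__":
    with open("/opt/bitnami/wordpress/wp-config.php") as f:
        print(read_db_settings(f.read()))
```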

Yes, the DB password will be a long hexadecimal value! Using those credentials, I logged into the database:

mariadb -u root -p

Then I ran these commands to set up the new schema, add a new application user, and grant its privileges:

CREATE DATABASE job_search;
SHOW DATABASES;
CREATE USER 'django_user'@'%' IDENTIFIED BY 'super_long_cryptic_password';
GRANT ALL PRIVILEGES ON job_search.* TO 'django_user'@'%';
FLUSH PRIVILEGES;
SHOW GRANTS FOR 'django_user'@'%';
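
If you end up provisioning more than one application database, those statements templatize nicely. A hypothetical helper that renders the same SQL (the names and password below are placeholders from above, not real credentials):

```python
def provision_sql(db_name, user, password, host="%"):
    """Render the CREATE/GRANT statements for a new application database."""
    return [
        f"CREATE DATABASE {db_name};",
        f"CREATE USER '{user}'@'{host}' IDENTIFIED BY '{password}';",
        f"GRANT ALL PRIVILEGES ON {db_name}.* TO '{user}'@'{host}';",
        "FLUSH PRIVILEGES;",
    ]

if __name__ == "__main__":
    for stmt in provision_sql("job_search", "django_user", "super_long_cryptic_password"):
        print(stmt)
```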

SSH Tunnel

An SSH tunnel is a secure, encrypted connection between a local computer and a remote system, established using the Secure Shell (SSH) protocol. It allows you to safely transmit data over untrusted networks (such as the Internet) by forwarding traffic through an encrypted “tunnel.” This technique is commonly used to access private services or internal systems not directly accessible from the outside world.

The primary use case for an SSH tunnel is to securely access remote resources—such as databases, web applications, or internal networks—without exposing them to the public internet. For example, if a MariaDB server is only accessible within a private network, you could use an SSH tunnel to connect to it from your local machine without opening external firewall ports. SSH tunnels can also bypass network restrictions or securely transmit sensitive data.

Enable AWS SSH Tunnel

On my AWS instance, enabling SSH tunnels turned out to be really straightforward. I opened the SSH daemon configuration for editing:

sudo nano /etc/ssh/sshd_config

Find the line that starts with AllowTcpForwarding (it may be commented out with a # symbol) and uncomment it so that it reads:

AllowTcpForwarding yes

Restart the SSH service to apply the changes:

sudo systemctl restart sshd
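
To double-check that the change took, you can scan the config for an uncommented directive (on the server itself, sudo sshd -T prints the effective settings). A sketch of that check in Python:

```python
def tcp_forwarding_enabled(sshd_config_text):
    """Return True if sshd_config would allow TCP forwarding."""
    for line in sshd_config_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("#"):
            continue  # still commented out, has no effect
        parts = stripped.split()
        if len(parts) >= 2 and parts[0] == "AllowTcpForwarding":
            return parts[1].lower() == "yes"
    return True  # sshd's default is 'yes' when the directive is absent
```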

Start Windows SSH Tunnel

An SSH tunnel can be created through your GUI SQL editor. I’ve been using HeidiSQL, which did not behave well when setting up the tunnel on my laptop. By the time I figured that out, I had already spent too many hours trying to get it to work. Though I still use HeidiSQL some to see what the code is doing in the database, it just was not worth another few hours of troubleshooting. Instead, I went with the Windows 10 command that my application will also use:

start /B ssh.exe bitnami@34.216.225.2 -p 22 -i "C:\Users\brian\.ssh\aws-bitnami-key.pem" -N -L 3309:127.0.0.1:3306

This command starts the tunnel as a background process, freeing up the console (I will eventually run this inside a script) for other tasks. Local port 3309 is forwarded through the tunnel to MariaDB’s port 3306 on the AWS instance. Authentication uses the AWS PEM key, which you can download directly from your instance.
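
Since my application will eventually launch this tunnel itself, the same command can be assembled and started from Python. A sketch using subprocess; the host, key path, and ports mirror the command above and should be treated as placeholders:

```python
import subprocess

def build_tunnel_cmd(host, key_path, local_port=3309, remote_port=3306, user="bitnami"):
    """Assemble the ssh command forwarding local_port to 127.0.0.1:remote_port on host."""
    return [
        "ssh", f"{user}@{host}",
        "-p", "22",
        "-i", key_path,
        "-N",  # no remote command, just forward ports
        "-L", f"{local_port}:127.0.0.1:{remote_port}",
    ]

def start_tunnel(host, key_path):
    # Popen returns immediately, leaving the tunnel running in the background
    return subprocess.Popen(build_tunnel_cmd(host, key_path))
```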

Export Data

Exporting the data hit a bit of a snag: on my Windows 10 laptop, dumpdata did not automatically handle UTF-8 encoding. So I wrote this small script to deal with it:

import sys
import os
import django
from django.core.management import call_command

# Set the settings module
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'django_react_job_search.settings')

# Initialize Django
django.setup()

# Ensure stdout uses UTF-8 encoding
sys.stdout.reconfigure(encoding='utf-8')

# Open the file to dump the data into
with open("data.json", "w", encoding="utf-8") as f:
    call_command("dumpdata", "--natural-primary", "--natural-foreign", "--indent", "2", stdout=f)

With the script polished off, I ran it:

python dumpdata_utf8.py

Build Production Database

With the data dumped, we are now ready to build the production database. This turned out to be uneventful:

python manage.py migrate

Import Data

The loaddata step was suspenseful. I ran the command and… what’s going on?! Nothing seemed to happen for a while. It turns out that pushing 14 MB of data over my SSH tunnel simply takes a few minutes.

python manage.py loaddata data.json
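
For peace of mind during a slow load like that, the fixture can be sanity-checked offline first. dumpdata writes a JSON list of objects, each tagged with its model name, so tallying them is straightforward — a sketch:

```python
import json
from collections import Counter

def count_fixture_objects(path):
    """Tally objects per model in a Django dumpdata JSON fixture."""
    with open(path, encoding="utf-8") as f:
        objects = json.load(f)
    return Counter(obj["model"] for obj in objects)

if __name__ == "__main__":
    for model, count in count_fixture_objects("data.json").most_common():
        print(f"{model}: {count}")
```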

.env File Notes

My project now has two Django env files (technically, three: .env, .env.development, .env.production). I’m not going to share my production one; here is the development one. When I change environment modes, I copy the appropriate file over the primary .env, which is what Django reads. I’m sharing it here for the database settings that the SSH tunnel will use, along with the differences in database login parameters:

DJANGO_ENV=development

# Development AWS environment configuration
ENABLE_AUTH=True  # Enable authentication for AWS development

# CORS settings for development
CORS_ALLOW_CREDENTIALS=True
CORS_ORIGIN_ALLOW_ALL=True    # Simple for Dev, allow any and all
CORS_ALLOWED_ORIGINS=http://localhost:3000    # not necessary with allow-all set to true

# Optional: You could also specify different Django settings for your backend
DEBUG=False
SECRET_KEY=your-secret-key

# Database connection
DB_HOST=localhost
DB_PORT=3306
DB_NAME=django_job_search
DB_USER=django_app
DB_PASSWORD=django_app

# Optional settings for production
# ENABLE_LOGGING=True  # Enable detailed logging in production (commented out for now)
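
These files are plain KEY=value lines with # comments, so loading one takes very little code. A minimal sketch of a parser (in practice, a library such as python-dotenv or django-environ handles edge cases like quoted values more robustly):

```python
def parse_env(text):
    """Parse KEY=value lines, ignoring blank lines and # comments (full-line or inline).

    Note: naively splitting on '#' would break values that contain one, so
    this sketch assumes values never include that character.
    """
    settings = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if not line or "=" not in line:
            continue
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings
```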

Summary

Once I finally had my database up and running, I was ecstatic. All of my data came up perfectly through the app and everything just… works. I’m super excited about standing up the production Docker image next week!