How to Deploy Next.js, FastAPI, and PostgreSQL with Shell Scripts


In this tutorial, I will show you how to deploy a Next.js, FastAPI, and PostgreSQL stack using shell scripts. This is the continuation of a previous tutorial on how to build a Full Stack NFP Boilerplate.

The branch for this tutorial can be found here:

The complete project can be found here:

Pick a cloud provider

The first step is to pick a cloud provider. It doesn’t matter which one you pick as long as you can spin up a server running Ubuntu 20.04, as that is the OS that we will be using in this tutorial. I have used the following services before and they work pretty well.

  • DreamHost DreamCompute
  • AWS Lightsail
  • Digital Ocean

Once you’ve picked one, go ahead and launch an instance with Ubuntu 20.04 and save the key pair pem file.

The shell scripts

In nfp-boilerplate create a new nfp-devops directory for the deployment files.

$ mkdir nfp-devops
$ cd nfp-devops

Create an environment variables template file, along with a matching file holding your real values, with the following content:

export USER=ubuntu
export HOST=123.456.789.10
export DB_USER=nfp_boilerplate_user
export DB_PASSWORD=password
export DB_NAME=nfp_boilerplate_dev
export SSH_KEY_PATH=key.pem
export DOMAIN=example.com # replace with your domain; used by the nginx config below

The template file contains the example variables that will be used to provision the server and deploy the application. The real file contains the actual values, such as the server's IP address and the path to the ssh key pair pem file that you downloaded earlier. The real file will be git ignored later, as it contains secrets that should not be checked into the repository.
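Each of the shell scripts below starts by loading these variables. A minimal sketch of how that works (the filename and path here are assumptions, since the original names are not shown):

```shell
# Write a sample variables file (stand-in for the real one).
cat > /tmp/variables.env <<'EOF'
export DB_USER=nfp_boilerplate_user
export HOST=123.456.789.10
EOF

# Sourcing the file makes the exported variables available to the
# script and to child processes (ssh, scp, sed, ...).
. /tmp/variables.env
echo "deploying as $DB_USER to $HOST"
```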

Create the provisioning script file with the following content:


# update
sudo apt-get update

# nginx
sudo apt-get install -y nginx

# nodejs
curl -fsSL | sudo -E bash -
sudo apt-get install -y nodejs

# python
sudo apt-get install -y python3 python3-venv python3-pip

# postgresql
sudo sh -c 'echo "deb $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
wget --quiet -O - | sudo apt-key add -
sudo apt-get update
sudo apt-get -y install postgresql

# pm2
sudo npm install -g pm2

# for psycopg2
sudo apt-get install -y libpq-dev build-essential

# postgres user and db
sudo -u postgres createuser $DB_USER
sudo -u postgres createdb $DB_NAME
sudo -u postgres psql -c "ALTER role $DB_USER WITH PASSWORD '$DB_PASSWORD'"

The script does the following:

  1. Load the variables from the environment file.
  2. Update the apt packages.
  3. Install nginx.
  4. Install nodejs.
  5. Install python.
  6. Install postgresql.
  7. Install pm2.
  8. Install dependencies for psycopg2.
  9. Create the database user, password, and database as defined in the environment file.
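The user, password, and database created in the last step combine into the connection URL that the deploy script later writes into alembic.ini and .env. A quick local sketch using the example values:

```shell
DB_USER=nfp_boilerplate_user
DB_PASSWORD=password
DB_NAME=nfp_boilerplate_dev

# SQLAlchemy/alembic connection string for a local PostgreSQL instance.
DATABASE_URL="postgresql://$DB_USER:$DB_PASSWORD@localhost/$DB_NAME"
echo "$DATABASE_URL"
```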

Create a script that uploads the provisioning script to the server and runs it there:

ssh-add $SSH_KEY_PATH
scp $USER@$HOST:
ssh $USER@$HOST ./

The script does the following:

  1. Load the variables from the environment file.
  2. Add the ssh key pair to the authentication agent.
  3. Upload the environment file and the provisioning script to the server.
  4. Run the provisioning script on the server.
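If ssh-add fails with "Could not open a connection to your authentication agent", no agent is running in your shell yet; start one first. The sketch below generates a throwaway key purely so it can run anywhere — your real script uses $SSH_KEY_PATH instead:

```shell
# Start an ssh-agent for this shell session.
eval "$(ssh-agent -s)"

# Throwaway key for illustration only; use your real pem file instead.
rm -f /tmp/demo_key /tmp/demo_key.pub
ssh-keygen -t ed25519 -N "" -f /tmp/demo_key -q
SSH_KEY_PATH=/tmp/demo_key

ssh-add "$SSH_KEY_PATH"
ssh-add -l   # list the loaded keys
```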

Create the deployment script file with the following content:


ssh-add $SSH_KEY_PATH

# nfp-backend
rsync -av ../nfp-backend $USER@$HOST: --exclude=venv
cp ../nfp-backend/alembic.ini .
cp ../nfp-backend/.env .
sed -i '' "s/sqlalchemy.url =.*/sqlalchemy.url = postgresql:\/\/$DB_USER:$DB_PASSWORD@localhost\/$DB_NAME/g" alembic.ini
sed -i '' "s/DATABASE_URL=.*/DATABASE_URL=postgresql:\/\/$DB_USER:$DB_PASSWORD@localhost\/$DB_NAME/g" .env
scp .env alembic.ini $USER@$HOST:~/nfp-backend
ssh $USER@$HOST "
    cd nfp-backend
    python3 -m venv venv
    . venv/bin/activate
    pip install -r requirements.txt
    alembic upgrade head
    pm2 delete nfp-backend
    pm2 start 'gunicorn -w 4 -k uvicorn.workers.UvicornWorker main:app' --name nfp-backend
"

# nfp-frontend
rsync -av ../nfp-frontend $USER@$HOST: --exclude=node_modules --exclude=.next
sed -r "s/{HOST}/$HOST/g" .env.template > .env.local
scp .env.local $USER@$HOST:~/nfp-frontend
ssh $USER@$HOST "
    cd nfp-frontend
    npm install
    npm run build
    pm2 delete nfp-frontend
    pm2 start 'npm start' --name nfp-frontend
"

# nginx
cp default.template.conf default.conf
sed -i '' "s/{HOST}/$HOST/g" default.conf
sed -i '' "s/{DOMAIN}/$DOMAIN/g" default.conf
scp default.conf $USER@$HOST:
ssh $USER@$HOST "
    sudo cp default.conf /etc/nginx/conf.d
    sudo service nginx restart
"

The script does the following:


  1. Load the variables from the environment file.
  2. Add the ssh key pair to the authentication agent.


For the backend:

  1. Rsync the nfp-backend folder to the server, excluding the venv directory. We skip it because the Python packages will be installed on the server.
  2. Copy the alembic.ini and .env files from the nfp-backend folder.
  3. Using the sed command line tool, search and replace the database credentials in these files with the values from the environment file.
  4. Upload the files to the server.
  5. SSH into the server, and cd into the nfp-backend directory.
  6. Create a virtual environment.
  7. Activate the virtual environment.
  8. Install the packages.
  9. Run the database migrations with alembic.
  10. Delete any currently running backend process managed by pm2.
  11. Start the backend process with pm2.
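The sed substitution in step 3 can be tried locally. Note that `sed -i ''` is the BSD/macOS form; the sketch below uses the GNU/Linux form, plain `sed -i`:

```shell
DB_USER=nfp_boilerplate_user
DB_PASSWORD=password
DB_NAME=nfp_boilerplate_dev

# A stand-in alembic.ini with the default sqlalchemy.url line.
printf 'sqlalchemy.url = driver://user:pass@localhost/dbname\n' > /tmp/alembic.ini

# Same substitution as the deploy script (GNU sed form).
sed -i "s/sqlalchemy.url =.*/sqlalchemy.url = postgresql:\/\/$DB_USER:$DB_PASSWORD@localhost\/$DB_NAME/g" /tmp/alembic.ini
cat /tmp/alembic.ini
```

This prints `sqlalchemy.url = postgresql://nfp_boilerplate_user:password@localhost/nfp_boilerplate_dev`.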


For the frontend:

  1. Rsync the nfp-frontend folder to the server, excluding the node_modules and .next directories.
  2. Replace the HOST variable in .env.template with the one defined in the environment file and write the result to .env.local.
  3. Upload the file to the server.
  4. CD into nfp-frontend, install the node packages and run the build.
  5. Delete any currently running frontend process managed by pm2.
  6. Start the frontend process with pm2.


For nginx:

  1. Copy default.template.conf to default.conf.
  2. Replace the HOST and DOMAIN variables in default.conf using sed with the values defined in the environment file.
  3. Upload the default.conf file to the server.
  4. Copy it to /etc/nginx/conf.d.
  5. Restart nginx.
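Steps 1-2 can be exercised locally on a stand-in template line (the DOMAIN value here is a placeholder; the real one comes from your environment file):

```shell
HOST=123.456.789.10
DOMAIN=example.com   # placeholder

# Stand-in template line, substituted the same way as default.template.conf.
printf 'server_name {HOST} {DOMAIN};\n' > /tmp/default.conf
sed -i -e "s/{HOST}/$HOST/g" -e "s/{DOMAIN}/$DOMAIN/g" /tmp/default.conf
cat /tmp/default.conf
```

This prints `server_name 123.456.789.10 example.com;`.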

Create a default.template.conf file.

server {
  listen 80;

  server_name {HOST} {DOMAIN};

  location /api {
    rewrite ^/api/(.*)$ /$1 break;
    proxy_pass http://localhost:8000;
  }

  location / {
    proxy_pass http://localhost:3000;
  }
}
This is the Nginx configuration file. Here’s a quick summary of what the configuration does:

  1. Tell Nginx to listen on port 80. Web traffic comes in through port 80.
  2. Set the HOST IP and DOMAIN. It’s ok if you don’t have a domain yet.
  3. Route any incoming request with the /api/ prefix to http://localhost:8000, where our backend FastAPI app will be listening for requests. The (.*) captures everything after /api/ and passes it to the proxy_pass target. For example, GET http://123.456.789.10/api/notes/ will get routed to GET http://localhost:8000/notes/.
  4. Route the root path / to http://localhost:3000 where our Next.js server is listening for requests. All subpaths will be routed. For example http://123.456.789.10/notes will get routed to http://localhost:3000/notes.
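The effect of the rewrite rule can be reproduced with sed, which supports the same kind of regex capture:

```shell
# ^/api/(.*)$ captures everything after /api/ into \1,
# which is what nginx forwards to the backend.
echo "/api/notes/" | sed -E 's|^/api/(.*)$|/\1|'
```

This prints `/notes/`, matching the routing example above.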

Create a .env.template file.


This file defines the API URL for Next.js. It is used in the Next.js app to know where to make the backend API calls. It will be copied to .env.local. For more info on how .env files work in Next.js, see the Next.js documentation.
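The template's exact contents are not shown above; based on how the deploy script substitutes {HOST}, it presumably looks something like this (the variable name NEXT_PUBLIC_API_URL is an assumption — use whatever name your Next.js app reads):

```
# .env.template — {HOST} is replaced with the server IP by the deploy script.
NEXT_PUBLIC_API_URL=http://{HOST}/api
```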

Create a .gitignore file.


This file contains all the things we wish to prevent checking into git – temp files generated during the deployment process or files containing secrets.
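The exact list is not shown above; based on the files the deploy script generates or copies locally, a plausible .gitignore for nfp-devops would be the following (plus your real environment variables file, under whatever name you gave it):

```
.env
.env.local
alembic.ini
default.conf
*.pem
```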


The next step is to make sure all of the variables in the environment file are correct, particularly HOST and SSH_KEY_PATH. It is recommended to change the other variables as well for better security.

Grant execution permissions on the shell scripts.

$ chmod +x

Run the script.

$ ./

After the server is finished provisioning, run the script.

$ ./

The deploy script can be re-run whenever you need to do a deployment.

Finally, navigate to your host IP address. You should see the Next.js welcome page. If you navigate to the /notes subpath, you should see the fully functional full-stack application.


Congratulations. With this NFP Boilerplate, you’ll have a cost-efficient way to host a large number of web apps on a single machine. Don’t like the stack? You can use the same strategy for other web frameworks, databases, and Linux distros.

After spending so many hours tinkering with container orchestration and configuration management tools, sometimes you just want to go back to doing things the simple old-school way. This deployment strategy works for anyone who needs to get something simple up and running without spending too much money or time on complex tools.

If you found this post useful, follow me for more full-stack web development tips.