April 12, 2020

5 ways to deploy Flask

In this post, I’m going to explore 5 ways to deploy a Flask application. In all examples I’m going to use a simple app from Flask docs:


from flask import Flask
app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello, World!'

if __name__ == '__main__':
    app.run()

Local machine

This option is used when you need to test your application on a local machine. Simply running app.py spins up a server so you can call the endpoints locally, but this setup is especially useful when you need to integrate your app with an external service. Think of a service that notifies you about the progress of an email you sent - you send the email and your email provider calls your API asynchronously to tell you when the email was delivered and opened.
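As a sketch, a receiving endpoint for such a callback could look like this. The route path and the payload shape ({"event": ..., "message_id": ...}) are assumptions for illustration - a real provider documents its own webhook format:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/webhooks/email', methods=['POST'])
def email_webhook():
    # parse the provider's JSON callback; silent=True avoids a 400 on bad input
    payload = request.get_json(silent=True) or {}
    # in a real app you would persist or act on the event here
    return jsonify(status='received', event=payload.get('event')), 200
```

Run it with app.run() as in the example above; with ngrok in front, the provider can reach it at the public forwarding URL.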

For that, use ngrok - installation is straightforward: https://ngrok.com/download. Now you can run Flask:

➜  python3 app.py
 * Serving Flask app "app" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on (Press CTRL+C to quit)

In a separate terminal session run:

➜  ./ngrok http 5000
ngrok by @inconshreveable                                                                                                                                            (Ctrl+C to quit)

Session Status                online
Account                       Alex Smirnov (Plan: Free)
Version                       2.3.35
Region                        United States (us)
Web Interface       
Forwarding                    http://473b0854.ngrok.io -> http://localhost:5000
Forwarding                    https://473b0854.ngrok.io -> http://localhost:5000

Connections                   ttl     opn     rt1     rt5     p50     p90
                              0       0       0.00    0.00    0.00    0.00

The last two lines - Forwarding - show two URLs (HTTPS and plain HTTP) that are accessible from the internet.

Bare metal Linux

For this example you will need:

  • nginx - it proxies incoming requests to the uwsgi gateway
  • uwsgi - runs Python interpreter workers that execute the app code
  • systemd - the Linux init system that lets you (auto-)start, stop and monitor background processes


On a Debian-based system with python3, run these commands:

➜ sudo apt-get update
➜ sudo apt-get install python3-pip python3-dev nginx
➜ pip install flask uwsgi

Virtual environment

Create a virtual environment with:

➜ mkdir ~/flask_app
➜ cd ~/flask_app
➜ virtualenv env
➜ source env/bin/activate


Create a uwsgi.ini file in the same directory with the following content:

module = app:app

master = true
processes = 5

socket = flask.sock
chmod-socket = 660
vacuum = true

die-on-term = true

In the module setting you specify the module name (the single app.py file in this case) and the Flask application variable name, separated by a colon.

Next, processes = 5 runs 5 uwsgi workers, allowing 5 simultaneous requests to the app. You will need to experiment with that number to find a balance between memory consumption and the load you expect on your app.

Then there is the socket section - socket = flask.sock creates a socket file in the same directory. This is the mechanism nginx uses to communicate with uwsgi. There is also an option to communicate over HTTP instead.
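If you would rather use the HTTP option, a minimal variant of the ini could replace the socket lines - the address and port here are assumptions, pick any free local port:

```ini
module = app:app

master = true
processes = 5

; serve HTTP on a local port instead of a Unix socket
http = 127.0.0.1:8080

die-on-term = true
```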


Create a file at /etc/nginx/sites-available/flask with the following content. It instructs nginx to proxy requests to the socket file created by uwsgi:

server {
    listen 80;
    server_name server_domain_or_IP;

    location / {
        include uwsgi_params;
        uwsgi_pass unix:/home/as/flask_app/flask.sock;
    }
}

Enable the site by linking it into sites-enabled, then restart nginx to apply this config:

➜ sudo ln -s /etc/nginx/sites-available/flask /etc/nginx/sites-enabled
➜ sudo systemctl restart nginx

Background service

Create this file at /etc/systemd/system/flask.service:

[Unit]
Description=uWSGI instance to serve flask

[Service]
WorkingDirectory=/home/as/flask_app
ExecStart=/home/as/flask_app/env/bin/uwsgi --ini uwsgi.ini

[Install]
WantedBy=multi-user.target


Check the paths - in this example they point to absolute paths on my machine, so change them accordingly. Now we can start the service and enable it to start at boot:

➜ sudo systemctl start flask
➜ sudo systemctl enable flask

The Flask app is now accessible in a browser on HTTP port 80.


Docker

For that, you’ll need 3 things:

  • docker image with Flask app
  • container image repository
  • container orchestration (K8s, AWS ECS)
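As a sketch, a minimal Dockerfile for the app above might look like this - the base image tag is an assumption, and the built-in dev server is used only to keep the example short (uwsgi would be more production-appropriate):

```dockerfile
FROM python:3.6-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY app.py .
ENV FLASK_APP=app.py
EXPOSE 5000
# bind to 0.0.0.0 so the port is reachable from outside the container
CMD ["flask", "run", "--host=0.0.0.0"]
```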

I’ve covered that scenario in these posts


PaaS

There are several PaaS platforms you can run Flask on, including AWS Elastic Beanstalk and GCP App Engine. The nice thing about them is that you don’t need to worry about the OS, load balancing and autoscaling - the platform does it for you.

I’m going to cover AWS Elastic Beanstalk here. First, get an AWS account and install the eb CLI tool locally - see the docs here

AWS EB needs a requirements.txt file listing all the required dependencies. Generate it from the local virtual environment with:

➜ pip freeze > requirements.txt

Now, in the folder with app.py and requirements.txt (we don’t need any files from the examples above), create an EB application with:

➜ eb init -p python-3.6 flask-app-helloworld-as --region eu-west-1
Application flask-app-helloworld-as has been created.

and later deploy it with:

➜ eb create flask-app-helloworld-as

It will take some time (~10 minutes) and will create a bunch of resources - if you don’t need them later, don’t forget to clean up with the eb terminate flask-app-helloworld-as command.

You can open your deployed app from the CLI with eb open flask-app-helloworld-as.


Serverless

Serverless is the most modern way to deploy web apps. I’m using it for all my pet projects, as AWS gives 1M free requests per month, allowing me to run my apps at no cost.

In order to deploy Flask as an AWS Lambda function, I’m using the Zappa project. All you need is 3 commands:

➜ pip install zappa
➜ zappa init
➜ zappa deploy

The second command interactively asks a bunch of questions about environments and configuration. The last command prints a URL where the app is now available:

Your updated Zappa deployment is live!: https://l17c5t2uhe.execute-api.us-east-1.amazonaws.com/dev
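For reference, zappa init writes its answers into a zappa_settings.json file. A minimal sketch might look like this - the project name and bucket name here are made-up examples, and zappa init generates its own:

```json
{
    "dev": {
        "app_function": "app.app",
        "aws_region": "us-east-1",
        "project_name": "flask-hello",
        "runtime": "python3.6",
        "s3_bucket": "zappa-flask-hello-example"
    }
}
```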


© Alexey Smirnov 2021