Monday, May 22, 2023

Experiments with ECS: 1 - Deploying 1 microservice in ECS

 

NOTE -

a. Ideally, the RDS database should have been kept in a private subnet

b. The ECS cluster could also have been kept in a private subnet (Ref URL - https://repost.aws/knowledge-center/ecs-fargate-tasks-private-subnet)


Step 1 - Create a Flask application that talks to a locally installed Postgres database

Ref URL - REST API Design Principles: https://www.freecodecamp.org/news/rest-api-best-practices-rest-endpoint-design-examples/


1.1 Create a Flask application


from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello World'

if __name__ == '__main__':
    app.run(debug=True)


Tested @ http://127.0.0.1:5000/
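Besides hitting the URL in a browser, the route can also be verified without a running server using Flask's built-in test client. A minimal sketch against the same app:

```python
from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello World'

# Flask's test client exercises the route without starting a server
client = app.test_client()
resp = client.get('/')
print(resp.status_code, resp.data)  # 200 b'Hello World'
```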


1.2 Connect the application to Postgres


from flask import Flask, abort, jsonify
import psycopg2

app = Flask(__name__)

myconn = psycopg2.connect(database="DataWarehouseX", user="postgres", password="xxxxx", host="localhost", port="5432")
mycursor = myconn.cursor()

@app.route("/api/1.0/products")
def products_view():
    mycursor.execute("SELECT * FROM core.dim_product")
    rows = mycursor.fetchall()
    if not rows:
        abort(404)
    return jsonify(rows)

@app.route("/api/1.0/products/<id>")
def product_view(id):
    # Parameterize the query instead of concatenating the id into the SQL string
    mycursor.execute("SELECT * FROM core.dim_product WHERE product_id = %s", (id,))
    rows = mycursor.fetchall()
    if not rows:
        abort(404)
    return jsonify(rows)

if __name__ == '__main__':
    app.run(debug=True)
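A note on the WHERE-clause query: concatenating a user-supplied value like the id path segment directly into the SQL string is vulnerable to SQL injection, whereas parameterized queries let the driver bind the value safely. A self-contained sketch of the difference using stdlib sqlite3 (psycopg2 works the same way, with %s placeholders instead of ?; the table here is a simplified stand-in, not the real dim_product schema):

```python
import sqlite3

# In-memory stand-in for the Postgres table (assumed, simplified schema)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_product (product_id TEXT, product_name TEXT)")
conn.execute("INSERT INTO dim_product VALUES ('P1', 'Widget')")

product_id = "P1' OR '1'='1"  # a malicious id arriving via the URL

# Unsafe: the input escapes the quoting and the predicate matches every row
unsafe = "SELECT * FROM dim_product WHERE product_id = '" + product_id + "'"
print(conn.execute(unsafe).fetchall())  # returns all rows

# Safe: the driver binds the value, so the malicious id matches nothing
safe = conn.execute(
    "SELECT * FROM dim_product WHERE product_id = ?", (product_id,)
).fetchall()
print(safe)  # []
```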


API in action -



Step 2 - Dockerize the application and run containers locally

Ref URL - https://www.freecodecamp.org/news/how-to-dockerize-a-flask-app/


app.py -

# To reach the Postgres DB on the host machine from inside the container, switched the host to 'host.docker.internal'

from flask import Flask, abort, jsonify
import psycopg2

app = Flask(__name__)

myconn = psycopg2.connect(database="DataWarehouseX", user="postgres", password="xxxx", host="host.docker.internal", port="5432")
mycursor = myconn.cursor()

@app.route("/")
def hello_world():
    return 'Hello from docker!'

@app.route("/api/1.0/products")
def products_view():
    mycursor.execute("SELECT * FROM core.dim_product")
    rows = mycursor.fetchall()
    if not rows:
        abort(404)
    return jsonify(rows)

@app.route("/api/1.0/products/<id>")
def product_view(id):
    # Parameterize the query instead of concatenating the id into the SQL string
    mycursor.execute("SELECT * FROM core.dim_product WHERE product_id = %s", (id,))
    rows = mycursor.fetchall()
    if not rows:
        abort(404)
    return jsonify(rows)

if __name__ == '__main__':
    app.run(debug=True)



Dockerfile -

FROM python:3.8-slim-buster
WORKDIR /python-docker
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
COPY . .
CMD [ "python3", "-m" , "flask", "run", "--host=0.0.0.0"]


requirements.txt -

# Had issues using the Flask version I was using in Spyder. After checking the page below, decided to remove the Flask version pin from requirements.txt
# https://stackoverflow.com/questions/71718167/importerror-cannot-import-name-escape-from-jinja2

# Had issues building the Docker image with psycopg2, hence switched to psycopg2-binary

flask
psycopg2-binary



Step 3 - Create an RDS Postgres database, load data into it, and switch the REST API application to use it

Ref URL - https://sakyasumedh.medium.com/deploy-backend-application-to-aws-ecs-with-application-load-balancer-step-by-step-guide-part-1-91935ae93c51

Had to create a publicly accessible database so that I could connect to it from my LVDI to load data, and also from the dockerized REST API application on my local machine.

Data migration -

Ref URL - https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ConnectToPostgreSQLInstance.html

Exported data from local database table into a CSV file
Imported data from CSV file into RDS database table
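The CSV round-trip can be sketched with the stdlib csv module. The rows and column names below are placeholders, not the actual dim_product schema, and the actual load into RDS can use Postgres COPY (e.g. psql's \copy):

```python
import csv
import io

# Sample rows standing in for mycursor.fetchall() output (hypothetical columns)
rows = [(1, "Widget", "Hardware"), (2, "Gadget", "Electronics")]

# Export: write rows as CSV (an in-memory buffer here instead of a file)
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["product_id", "product_name", "category"])
writer.writerows(rows)

# Import side reads the same CSV back; note all values come back as strings
buf.seek(0)
reader = csv.reader(buf)
header = next(reader)
imported = [tuple(r) for r in reader]
print(imported)  # [('1', 'Widget', 'Hardware'), ('2', 'Gadget', 'Electronics')]
```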

Outbound connections from the container to the internet work out of the box, so the only change needed in app.py was the connection host -

myconn = psycopg2.connect(database="DataWarehouseX", user="postgres", password="xxxx", host="rest-ecs-db.c6nvu354y8s3.us-east-1.rds.amazonaws.com", port="5432")

Also had to fix the inbound rule on the RDS instance's security group to allow connections from anywhere on port 5432
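Side note: this connection string commits a password to source. A common improvement is to read the credentials from environment variables, which an ECS task definition can inject at runtime. A minimal sketch; the variable names (DB_HOST etc.) are illustrative assumptions, not anything the deployment requires:

```python
import os

# Read DB settings from the environment, falling back to local defaults.
# Variable names (DB_NAME, DB_HOST, ...) are illustrative, not required by ECS.
db_config = {
    "database": os.environ.get("DB_NAME", "DataWarehouseX"),
    "user": os.environ.get("DB_USER", "postgres"),
    "password": os.environ.get("DB_PASSWORD", ""),
    "host": os.environ.get("DB_HOST", "localhost"),
    "port": os.environ.get("DB_PORT", "5432"),
}

# The app would then connect with: myconn = psycopg2.connect(**db_config)
print(db_config["host"])
```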


Step 4 - Create an ECR repo and push images to that repo

Ref URL - https://sakyasumedh.medium.com/deploy-backend-application-to-aws-ecs-with-application-load-balancer-step-by-step-guide-part-2-e81d4daf0a55

Created an ECR repo

Ran aws configure in the Visual Studio terminal, in the directory containing the Dockerfile

Then followed the push commands specified on the ECR console


Step 5 - Set up an ECS cluster and deploy the application to ECS

Ref URL - https://sakyasumedh.medium.com/deploy-backend-application-to-aws-ecs-with-application-load-balancer-step-by-step-guide-part-3-b8125ca27177

Had to expose port 5000




Step 6 - Add a load balancer

Ref URL - https://sakyasumedh.medium.com/setup-application-load-balancer-and-point-to-ecs-deploy-to-aws-ecs-fargate-with-load-balancer-4b5f6785e8f

Had to expose port 5000 everywhere (in the ALB listener, in the Target Group, etc.)
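One thing worth noting for the Target Group: the ALB health-checks each task (by default with GET / on the traffic port), and the root route here happens to return 200. A dedicated health endpoint is a common pattern so the check doesn't depend on application routes; a minimal sketch:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Dedicated endpoint for the ALB Target Group health check
@app.route("/health")
def health():
    return jsonify(status="ok"), 200

# Quick local verification with Flask's test client
resp = app.test_client().get("/health")
print(resp.status_code, resp.get_json())  # 200 {'status': 'ok'}
```

The Target Group's health check path would then be set to /health instead of the default /.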



Step 7 - Add a custom domain name

Ref URL - https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/routing-to-elb-load-balancer.html

Had to register a domain; registration automatically creates a public hosted zone.

Creating a CNAME record (api.anandmusings.link pointing to the ALB at rest-ecs-rds-alb-445658067.us-east-1.elb.amazonaws.com) worked




The Alias record wasn't working at first for some reason. Deleted and recreated it a few times, and then that started working too.




