
Implementing MicroServices in Python


29/11/18 · 20 minute read · Naren Allam

Tags: python, json, yaml, docker, protobuf, bash

  Microservices are the way forward for scalable web services. A microservice architecture is a way of designing a complex system as loosely coupled components split along functional lines. There are no hard and fast rules for how to create a service; the split depends entirely on the business requirements and on the expected future scalability of each service. A good design leads to a smoothly functioning service ecosystem. Many companies have realised the potential of microservices and have already started migrating their existing monolithic systems to them. Along with the advantages, there are a few challenges too.

The advantages are:

    1. Failure of a single service doesn’t affect the functioning of the whole system.
    2. Every service can have its own database.
    3. Services can be implemented in multiple languages, as per the required throughput and performance.
    4. Multiple teams can work from multiple locations.
    5. Faster development.
    6. Even very complex applications can be broken down into small problems.
    7. Ease of containerisation.
    8. Adding a new service is easy.
The main challenges are:
    1. Architecture.
    2. Inter-process communication.
    3. Maintenance of API gateways.
    4. Monitoring of services.
  This article further covers the implementation of microservices with a small use-case example in Python, inter-process communication using gRPC, and Docker:
    The Python microframework Flask is used for web development.
    Here the term ‘gRPC’ refers to a serialisation technique for transferring structured data between services using ‘protobuf’ instead of the ‘json’ or ‘xml’ format. gRPC transfers data in protobuf’s binary format, which is more efficient than text-based formats (see the short size comparison after this list). For more information on implementing gRPC in Python, click here.
    Docker containers are used for running the services in isolation, which eases the development, testing and integration process. For Docker installation and more info, click here.
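
As a rough illustration of why the binary protobuf encoding is smaller than JSON, the snippet below serialises the same single-field message both ways. It assumes the fact_pb2 module generated later in this article is already on the path; the byte counts are only indicative.

PYTHON

# illustrative size comparison between protobuf and JSON encoding
import json

import fact_pb2  # generated from fact.proto (see below)

n = fact_pb2.Number(value=5)

# binary protobuf encoding: 1 tag byte + 4 bytes for the 32-bit float
print(len(n.SerializeToString()))        # 5

# the same data as a JSON text string
print(len(json.dumps({"value": 5.0})))   # 14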
Implementing Microservices with a Basic Math Example to Compute 'nCr'

    - calculation of ‘nCr’, for which the formula is n! / ( (n-r)! * r! )
    - this requires the calculation of n!, (n-r)! and r!
    - hence a call to the ‘nCr’ function should invoke the ‘factorial’ function 3 times (a quick local check of the formula is shown below).
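
As a sanity check of the formula, here is a plain local computation, just for illustration; the services below compute the same thing over gRPC:

PYTHON

# worked example: 20C4 = 20! / (16! * 4!)
from math import factorial

n, r = 20, 4
print(factorial(n) // (factorial(n - r) * factorial(r)))   # 4845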

  Imagine that the service holding the ‘nCr’ function acts as a client and makes gRPC calls to the service holding the ‘factorial’ function, which acts as a server.
So we create two services:
- service1
    calculates ‘nCr’
    depends on service2 for the factorial calculation
    acts as the API Gateway
    acts as a gRPC client
- service2
    calculates ‘factorial’
    acts as a gRPC server
  Service1 has a Flask-based web service, ‘ncr.py’, which acts as the API gateway; it also has a gRPC client, ‘client.py’, for calling the ‘factorial’ function exposed by ‘server.py’ in service2.
  Service2 has ‘server.py’, the gRPC server for the factorial function, and ‘fact.py’, the original function for calculating the factorial.
  In this example the two services, ‘service1’ and ‘service2’, run on the same machine, whereas in a real scenario service1 and service2 might be running in different locations and on different networks or machines.
  Requirements:
    - python3
    - Docker (sudo apt-get install docker-ce)

Common For Both Services

Setting up gRPC

  - create a .proto file
  - define the message formats required for the request and response data types.
  - define the service with its function parameters and return types.

PROTOBUF

// fact.proto
syntax = "proto3";

message Number {
    float value = 1;
}

service Factorial {
    rpc Factorial(Number) returns (Number) {}
}

Change to the directory where the ‘fact.proto’ file is located and execute the following commands to generate the gRPC meta-class Python files.

BASH

$ pip install grpcio
$ pip install grpcio-tools
$ python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. fact.proto

Now check for the two newly generated files:

    ‘fact_pb2.py’ and
    ‘fact_pb2_grpc.py’
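
An optional, purely illustrative way to confirm that the generated modules expose everything the services below rely on:

PYTHON

# optional sanity check of the generated modules
import fact_pb2
import fact_pb2_grpc

# the Number message defined in fact.proto
print(fact_pb2.Number(value=5).value)                 # 5.0

# the client stub, the server base class and the registration helper
print(fact_pb2_grpc.FactorialStub)
print(fact_pb2_grpc.FactorialServicer)
print(fact_pb2_grpc.add_FactorialServicer_to_server)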

Service1

PYTHON

# service1/client.py
# gRPC client
import sys
import os
import grpc

# import the generated classes
import fact_pb2
import fact_pb2_grpc

# ENV variables
try:
    # when running in a container, use the env variables
    HOST = os.environ['HOST']
    PORT = os.environ['PORT']
except KeyError:
    # else fall back to localhost
    HOST = 'localhost'
    PORT = 50051

# gRPC channel
channel = grpc.insecure_channel(f'{HOST}:{PORT}')

# stub (client)
stub = fact_pb2_grpc.FactorialStub(channel)

def get_fact(n):
    """
    Takes a number as input, makes the gRPC server call and
    returns the factorial as a Number message.
    """
    number = fact_pb2.Number(value=n)
    response = stub.Factorial(number)
    return response

if __name__ == "__main__":
    # pass a value as a command-line argument
    # e.g. python client.py 7
    print(get_fact(float(sys.argv[1])).value)
DOCKER

# service1/Dockerfile
# image for python 3.6 with grpc
FROM grpc/python:1.4-onbuild

# Copy all contents from host to container
COPY . /app

# Set the application directory
WORKDIR /app

# Install our requirements.txt
RUN pip install --upgrade pip && \
    pip install -r requirements.txt

# Make port 5000 available for links and/or publish
EXPOSE 5000

# Define the command to run when launching the container
CMD ["python", "ncr.py"]
PYTHON

# service1/ncr.py
#!/usr/bin/env python3

from flask import Flask, jsonify

from client import get_fact

app = Flask(__name__)

@app.route('/api/ncr/<x>/<y>', methods=['GET'])
def ncr(x=None, y=None):
    try:
        x, y = float(x), float(y)
    except (TypeError, ValueError):
        x = y = None
    if x is None or y is None or x < y:
        return jsonify({"input": (x, y), "ncr": None, "msg":
            "supply values in the format '/api/ncr/<big_no>/<small_no>' in the url"})
    # call the Factorial client function three times: x!, (x-y)! and y!
    result = get_fact(x).value / (get_fact(x - y).value * get_fact(y).value)
    return jsonify({"input": (x, y), "ncr": result, "msg": "Success"})

if __name__ == '__main__':
    app.run(host="0.0.0.0", port=5000)

Service2

DOCKER

# service2/Dockerfile
# image for python 3.6 with grpc
FROM grpc/python:1.4-onbuild

# Copy all contents from host to container
COPY . /app

# Set the application directory
WORKDIR /app

# Install our requirements.txt
RUN pip install --upgrade pip && \
    pip install -r requirements.txt

# Make port 50051 (the gRPC port) available for links and/or publish
EXPOSE 50051

# Define the command to run when launching the container
CMD ["python3", "server.py"]
PYTHON

# service2/fact.py

def fact(n):
    """Recursively compute n! (whole-number floats also work, since 1.0 == 1)."""
    if n in [0, 1]:
        return 1
    else:
        return fact(n - 1) * n
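
A quick, purely illustrative check of fact.py on its own; note that the base case also catches whole-number floats, which is what the gRPC server passes in:

PYTHON

# quick check of fact.py in isolation
from fact import fact

print(fact(5))     # 120
print(fact(7.0))   # 5040.0 (float input, as received over gRPC)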
PYTHON

# service2/server.py
# gRPC server

import grpc
from concurrent import futures
import time

# import the generated classes
import fact_pb2
import fact_pb2_grpc

# import the original fact.py
import fact

class FactorialServicer(fact_pb2_grpc.FactorialServicer):
    """
    Implementation of the grpc_tools-generated FactorialServicer base class.
    """
    def Factorial(self, request, context):
        """
        Takes request.value and returns its factorial;
        the original 'fact' function is imported above.
        """
        response = fact_pb2.Number()
        response.value = fact.fact(request.value)
        return response

# gRPC server
server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))

# use the generated function `add_FactorialServicer_to_server`
# to add the defined class to the server
fact_pb2_grpc.add_FactorialServicer_to_server(FactorialServicer(), server)

# listen on port 50051
print('Starting server. Listening on port 50051.')
server.add_insecure_port('[::]:50051')
server.start()

# since server.start() will not block,
# a sleep loop is added to keep the process alive
try:
    while True:
        time.sleep(86400)
except KeyboardInterrupt:
    server.stop(0)
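
Before containerising anything, the two halves can be exercised manually: run the server from service2 in one terminal and the client from service1 in another. The session below is only a hypothetical local run and assumes the generated fact_pb2*.py files are present in both directories:

BASH

# terminal 1 (inside service2/)
$ python3 server.py
Starting server. Listening on port 50051.

# terminal 2 (inside service1/)
$ python3 client.py 7
5040.0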

After creating both services, the whole stack is brought up with docker-compose, which needs a docker-compose.yml file in the project root (here called ‘microservices’).
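
The article does not show the compose file itself, so the following is only a minimal hypothetical sketch. It assumes the two service directories sit next to it, that service2 is reachable from service1 under the hostname ‘service2’, and that the HOST and PORT environment variables read by client.py point at it:

YAML

# docker-compose.yml (hypothetical sketch, not from the original article)
version: "3"
services:
  service2:
    build: ./service2
    ports:
      - "50051:50051"
  service1:
    build: ./service1
    ports:
      - "5000:5000"
    environment:
      HOST: service2
      PORT: "50051"
    depends_on:
      - service2

With the compose file in place, change to the project directory and execute the following command: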

BASH

$ docker-compose up

Then open the URL localhost:5000/api/ncr/x/y, replacing x and y with numbers, for example:
http://localhost:5000/api/ncr/20/4
You should see a result like the following:

JSON

{"input":[20.0,4.0],"msg":"Success","ncr":4844.999870499092}