Microservices are a popular approach to building scalable web services. A microservice architecture decomposes a complex system into loosely coupled components, each responsible for a single business function. There are no hard and fast rules for carving out a service; the boundaries depend entirely on the business requirements and the expected future scalability of each service. A good design leads to a smooth-functioning service ecosystem. Many companies have realised the potential of microservices and have already started migrating their existing monolithic systems. Along with the advantages, there are a few challenges too.
The advantages are: -
1. Failure of a single service doesn’t affect the functioning of the whole system.
2. Every service can have its own database.
3. Each service can be implemented in the language best suited to its required throughput and performance.
4. Multiple teams can work independently, even from multiple locations.
5. Faster development.
6. Even very complex applications can be broken down into small problems.
7. Ease of containerisation.
8. Coupling in a new service is easy.
The main challenges are: -
1. Inter-process communication.
2. Maintenance of API gateways.
3. Monitoring of services.
This article covers an implementation of microservices with a small use-case example in Python, using gRPC for inter-process communication and Docker containers: -
The Python microframework Flask is used for web development.
Here gRPC refers to an RPC framework that serialises structured data using Protocol Buffers (‘protobuf’) instead of the ‘json’ or ‘xml’ format. Because protobuf messages are transferred in a compact binary format, gRPC works more efficiently on the wire than text-based formats. For more information on implementing gRPC in Python, see the official gRPC Python documentation.
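To make the idea concrete, a service definition for the factorial example below might look like the following `.proto` file. This is a sketch: the file name, message fields, and service name are assumptions for this article, not generated output.

```protobuf
// factorial.proto — hypothetical service definition for this example
syntax = "proto3";

// Request carries the number whose factorial we want.
message FactorialRequest {
  uint64 n = 1;
}

// Reply carries the computed factorial.
message FactorialReply {
  uint64 value = 1;
}

// The RPC service that service2 will expose.
service Factorial {
  rpc Compute (FactorialRequest) returns (FactorialReply);
}
```

Running the protobuf compiler with the gRPC plugin over such a file generates the Python message classes and client/server stubs that both services import.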
Docker containers are used to run the services in isolation, which eases the development, testing and integration process. For installation instructions and more information, see the official Docker documentation.
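As a sketch of what containerising one of these services might look like (the base image, file names and the entry point are assumptions for this example, not a prescribed setup):

```dockerfile
# Dockerfile for the hypothetical service1 (Flask API gateway)
FROM python:3.9-slim
WORKDIR /app

# Install the service's dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the service code and expose the Flask port
COPY . .
EXPOSE 5000
CMD ["python", "ncr.py"]
```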
Implementing Microservices with a Basic Math Example to Compute 'nCr'
- calculation of ‘nCr’, for which the formula is n! / ( (n-r)! * r! )
- this requires the calculation of n!, (n-r)! and r!
- hence calling the ‘nCr’ function should invoke the ‘factorial’ function 3 times.
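The steps above can be sketched in plain Python before any services are involved. Note that ‘nCr’ calls ‘factorial’ three times, mirroring the three gRPC calls the client will later make (the function names here are illustrative):

```python
from math import prod

def factorial(n: int) -> int:
    # n! as a simple product; factorial(0) == 1 since the empty product is 1
    return prod(range(1, n + 1))

def ncr(n: int, r: int) -> int:
    # nCr = n! / ((n - r)! * r!) — three separate factorial calls,
    # just as the gRPC client will make three calls to the factorial service
    return factorial(n) // (factorial(n - r) * factorial(r))

print(ncr(5, 2))  # → 10
```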
Imagine the service holding the ‘nCr’ function acting as a client that makes a gRPC call to the service holding the ‘factorial’ function, which acts as a server.
So we create two services: -
Service1:
- calculates ‘nCr’
- depends on Service2 for the factorial calculation
- acts as the API gateway
- acts as a gRPC client
Service2:
- calculates ‘factorial’
- acts as a gRPC server
Service1 has a Flask-based web service, ‘ncr.py’, which acts as the API gateway; it also has a gRPC client, ‘client.py’, for calling the ‘factorial’ function exposed by ‘server.py’ in Service2.
Service2 has ‘server.py’, the gRPC server for the factorial function, and ‘fact.py’, the original function for calculating the factorial.
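A minimal sketch of what ‘server.py’ and ‘client.py’ might contain, assuming stubs have been generated from a hypothetical `factorial.proto`. The module names `factorial_pb2` and `factorial_pb2_grpc`, the servicer name, and the port are all assumptions for this example; this sketch is not runnable without those generated files.

```python
# server.py — gRPC server wrapping the factorial function (sketch)
from concurrent import futures
import math

import grpc
import factorial_pb2        # generated from the hypothetical factorial.proto
import factorial_pb2_grpc   # generated gRPC stubs (assumed names)

class FactorialServicer(factorial_pb2_grpc.FactorialServicer):
    def Compute(self, request, context):
        # Delegate to the plain factorial implementation (‘fact.py’ in the article)
        return factorial_pb2.FactorialReply(value=math.factorial(request.n))

def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    factorial_pb2_grpc.add_FactorialServicer_to_server(FactorialServicer(), server)
    server.add_insecure_port("[::]:50051")
    server.start()
    server.wait_for_termination()

# client.py — called three times by ncr.py (sketch)
def remote_factorial(n: int) -> int:
    with grpc.insecure_channel("localhost:50051") as channel:
        stub = factorial_pb2_grpc.FactorialStub(channel)
        reply = stub.Compute(factorial_pb2.FactorialRequest(n=n))
        return reply.value
```

With this wiring, ‘ncr.py’ only needs to call `remote_factorial` for n, n-r and r, and combine the results.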
In this example, the two services, ‘service1’ and ‘service2’, run on the same machine, whereas in a real deployment they might run in different locations, on different networks or machines.
- install Docker (sudo apt-get install docker-ce)
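Once Docker is installed, the two services can be wired together with a compose file. This is a sketch: the directory layout, ports and service names are assumptions matching the example above.

```yaml
# docker-compose.yml — running both services together (sketch)
version: "3"
services:
  service1:
    build: ./service1     # Flask API gateway + gRPC client (ncr.py, client.py)
    ports:
      - "5000:5000"       # exposes the nCr HTTP endpoint to the host
    depends_on:
      - service2
  service2:
    build: ./service2     # gRPC factorial server (server.py, fact.py)
    expose:
      - "50051"           # gRPC port, reachable only by other services
```

Compose puts both containers on a shared network, so ‘client.py’ can reach the factorial server by the hostname `service2` instead of `localhost`.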