Log Streaming for Microservices using Kafka + ELK Stack

The ability to collect, store, and read logs effectively is crucial while developing microservices.
Writing logs to files is I/O-expensive and requires extra processing to debug them when deployed in production environments.
Instead, we can leverage Kafka topics and write logs as messages. These messages can then be pulled by Logstash and stored in Elasticsearch. We can also use Kibana to monitor these logs.
When a log event happens, each microservice should use a Kafka producer library to send a message to a specific topic (see the producer sketch after the examples below).
For example,
The Login Microservice should send messages to the Kafka topic login-ms
The Order Microservice should send messages to the Kafka topic order-ms
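As a rough sketch, a service could publish a log event with the plain Java Kafka producer client like this; the broker address, JSON payload, and class name are assumptions made purely for illustration.

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class LoginLogProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Log event serialized as JSON; the fields here are an assumed format
        String logEvent = "{\"level\":\"INFO\",\"message\":\"User logged in\",\"service\":\"login\"}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish the log message to the login-ms topic from the example above
            producer.send(new ProducerRecord<>("login-ms", logEvent));
            producer.flush(); // ensure the message is actually sent before the producer closes
        }
    }
}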
Then, in Logstash, we need to add a new field called “microservice” so that we can later filter these logs by the microservice they came from.
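A minimal Logstash pipeline sketch under assumed addresses, topic names, and index pattern could look like the one below: it reads from the two topics, copies the source topic name into the “microservice” field using the Kafka input’s event metadata (which requires decorate_events to be enabled), and ships the result to Elasticsearch.

input {
  kafka {
    bootstrap_servers => "localhost:9092"   # assumed broker address
    topics => ["login-ms", "order-ms"]
    codec => "json"                         # parse the JSON log payload into fields
    decorate_events => "basic"              # exposes [@metadata][kafka][topic]
  }
}

filter {
  mutate {
    # tag each event with the microservice it came from, derived from the source topic
    add_field => { "microservice" => "%{[@metadata][kafka][topic]}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]            # assumed Elasticsearch address
    index => "microservice-logs-%{+YYYY.MM.dd}"   # assumed index naming pattern
  }
}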
Once the logs are written to Elasticsearch under an index, we have to make sure the index lifecycle policy deletes these logs after X days to avoid data accumulation and storage overflow.
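One way to enforce this retention is an Elasticsearch index lifecycle management (ILM) policy with a delete phase, sketched below; the policy name and the 30-day min_age are placeholders for X, and the policy still needs to be attached to the index, for example via an index template’s index.lifecycle.name setting.

PUT _ilm/policy/microservice-logs-policy
{
  "policy": {
    "phases": {
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}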