Ship RabbitMQ logs to Elasticsearch

RabbitMQ is a popular message broker that facilitates the exchange of data between applications. However, as with any system, it’s important to have visibility into the logs generated by RabbitMQ to identify issues and ensure smooth operation. In this blog post, we’ll walk you through the process of shipping RabbitMQ logs to Elasticsearch, a distributed search and analytics engine. By centralising and analysing RabbitMQ logs with Elasticsearch, you can gain valuable insights into your system and easily troubleshoot any issues that arise.

Log processing system architecture

To build this architecture, we’re going to set up four components in our system, each with its own role:

  • A logs publisher.
  • A RabbitMQ server with a queue to publish data to and consume data from.
  • A Logstash pipeline to process data from the RabbitMQ queue.
  • An Elasticsearch index to store the processed logs.
[Diagram: logs publisher → RabbitMQ queue → Logstash pipeline → Elasticsearch index]

Installation

1. Logs Publisher

Logs can come from any software: a web server (Apache, Nginx), a monitoring system, an operating system, a web or mobile application, and so on. They record the working history of the software that produces them.

If you don’t have a publisher yet, you can use my simple example here: https://github.com/baoanh194/rabbitmq-simple-publisher-consumer
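If you’d rather sketch your own, a minimal Node.js publisher using the amqplib package might look like the following. Note that the queue name "logs" and the payload shape are my assumptions for illustration, not taken from the repo above:

```javascript
// Minimal publisher sketch using the amqplib package (npm install amqplib).
// The queue name "logs" and the payload shape are assumptions.

const QUEUE = 'logs';

// Build a structured log entry. Keeping this pure makes it easy to test.
function formatLog(level, message) {
  return JSON.stringify({
    level,
    message,
    timestamp: new Date().toISOString(),
  });
}

// Connect, declare the queue, publish one message, and close.
async function publish(level, message) {
  const amqp = require('amqplib'); // required lazily so formatLog works without the dependency
  const conn = await amqp.connect('amqp://guest:guest@localhost:5672');
  const channel = await conn.createChannel();
  await channel.assertQueue(QUEUE, { durable: true });
  channel.sendToQueue(QUEUE, Buffer.from(formatLog(level, message)));
  await channel.close();
  await conn.close();
}

// Example: publish('info', 'user signed in').catch(console.error);
```

Calling publish('info', 'user signed in') would then push one JSON log entry onto the queue for Logstash to pick up later.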

2. RabbitMQ

The logs publisher will be publishing the logs to a RabbitMQ queue.

Instead of going through a lengthy RabbitMQ installation, we’re going to use a RabbitMQ Docker instance to keep things simple. You can find Docker installation instructions for your operating system here: https://docs.docker.com/engine/install/

Next, start a RabbitMQ container by running the following command:
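Something like the following should work; the container name and the 3-management image tag are my assumptions, so adjust them to your needs:

```shell
# Run RabbitMQ with the management plugin enabled.
# 5672 is the AMQP port, 15672 serves the management UI.
docker run -d --name rabbitmq \
  -p 5672:5672 -p 15672:15672 \
  rabbitmq:3-management
```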

This command starts a RabbitMQ container with the management plugin enabled. With the plugin enabled, you can access the RabbitMQ management console by going to http://localhost:15672/ in your web browser. By default, the username/password is guest/guest.

[Screenshot: RabbitMQ management console]

3. Elasticsearch

Check this link to install and configure Elasticsearch: https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html

To store RabbitMQ data for visualisation in Kibana, you need to start an Elasticsearch container. I’m using Docker to set up Elasticsearch as well, so I start it by running the following command:
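For example, a single-node development instance can be started like this (the container name and version tag are my assumptions; match them to the version you want to run):

```shell
# Start a single-node Elasticsearch container for local development.
# Port 9200 is the HTTP API that Logstash, Kibana and curl will talk to.
docker run -d --name elasticsearch \
  -p 9200:9200 \
  -e "discovery.type=single-node" \
  docker.elastic.co/elasticsearch/elasticsearch:8.13.4
```

If you miss the generated elastic password in the startup logs, you can set a new one with the bundled elasticsearch-reset-password tool inside the container.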

When you start Elasticsearch for the first time, some security configuration is required. With recent versions (8.x), TLS is enabled and a password is generated for the elastic user on first start; make a note of these credentials, as Logstash and curl will need them.

4. Logstash

If you haven’t installed or worked with Logstash before, don’t worry. Have a look at the Elastic docs: https://www.elastic.co/guide/en/logstash/current/installing-logstash.html

It’s very detailed and easy to read.

I installed Logstash on macOS with Homebrew:
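With Homebrew that is a single command:

```shell
# Homebrew ships Logstash in its default formulae.
brew install logstash
```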

Once Logstash is installed on your machine, let’s create the pipeline to process the data.

Paste the configuration below into your pipelines.conf file (with Homebrew, put the new config file under /opt/homebrew/etc/logstash):
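A pipeline along these lines reads from the queue and writes to Elasticsearch. The queue name, index name and credentials below are my assumptions, so substitute your own; and note that disabling certificate verification is only acceptable for local testing:

```conf
input {
  rabbitmq {
    host => "localhost"
    port => 5672
    user => "guest"
    password => "guest"
    queue => "logs"        # must match the queue your publisher writes to
    durable => true
    ack => true
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "rabbitmq-logs-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "<your elastic password>"
    ssl_certificate_verification => false   # local testing only
  }
  stdout { codec => rubydebug }             # echo events so you can watch them flow
}
```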

Run your pipeline with Logstash:

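With the Homebrew layout above, that looks like this (the path is an assumption; point -f at wherever you saved the file):

```shell
# -f points Logstash at a specific pipeline configuration file.
logstash -f /opt/homebrew/etc/logstash/pipelines.conf
```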

If your RabbitMQ Docker instance is running and your Logstash pipeline is configured correctly, Logstash will start the pipeline and wait for messages from the queue.

Let’s ship some logs

Now everything is ready. Go to the logs publisher’s root folder and run the send.js script:
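Assuming the layout of the repository linked earlier, that is:

```shell
# From the publisher repository's root folder.
npm install   # first run only, to fetch the dependencies
node send.js
```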

You can check that the data has been sent to Elasticsearch:

curl -k -u elastic https://localhost:9200/_search?pretty

If everything goes well, you will see your published log messages in the search results.

Configure Kibana to Visualise RabbitMQ Data

Additionally, you can configure Kibana to visualise the RabbitMQ data in Elasticsearch. By configuring Kibana, you can create visualisations such as charts, graphs, and tables that make it easy to understand the data and identify trends or anomalies. For example, you could create a chart that shows the number of messages processed by RabbitMQ over time, or a table that shows the top senders and receivers of messages.

Kibana also allows you to build dashboards, which are collections of visualisations and other user interface elements arranged on a single screen. Dashboards can be shared with others in your organisation, making it easier for team members to collaborate and troubleshoot issues. You can refer to this link for how to set up Kibana: https://www.elastic.co/pdf/introduction-to-logging-with-the-elk-stack

Conclusion

In summary, shipping RabbitMQ logs to Elasticsearch offers benefits such as centralised log storage, quick search and analysis, and improved system troubleshooting. By following the steps outlined in this blog post, you can set up a system to handle large volumes of logs and gain real-time insights into your messaging system. Whether you’re running a small or large RabbitMQ instance, shipping logs to Elasticsearch can help you optimise and scale your system.
