Imagine you are taking a walk through the exciting world of monitoring and data management in your Elasticsearch cluster. Traditionally, we had the famous “Elastic Beats”: little helpers that collected and sent valuable information to your command centre. But now we have the next level in our hands: it is time to meet the “Elastic Agents”.
Each Beat used to be a single agent that sought out only one type of data; with Elastic Agent, we now have a whole squad in a single agent that can multi-task.
Now, with the natural evolution from Beats to Elastic Agent, we can enjoy centralised configuration and standardised behaviour across the systems we run.
Basic description of Beats
Before we start describing the components of “Elastic Agents”, let’s get to know their predecessors: the Beats.
To put it simply, these tools help us send different kinds of data, be it logs, metrics or events, to be queued, processed or stored for later monitoring.
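As a minimal sketch of what this looks like in practice, here is a Filebeat configuration that tails log files and ships them to Elasticsearch. The file path and host are placeholders for illustration, not values from any real deployment:

```yaml
# filebeat.yml — minimal sketch: read application logs and send them to Elasticsearch
filebeat.inputs:
  - type: filestream
    id: app-logs                      # illustrative input id
    paths:
      - /var/log/app/*.log            # hypothetical path, adjust to your environment

output.elasticsearch:
  hosts: ["https://localhost:9200"]   # placeholder host
```

Each Beat follows this same pattern: declare where the data comes from, then declare a single output for where it goes.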
The Beats family
All types of agents for all types of data
Beats can also perform light data processing which, although not as versatile as Logstash, can be useful on certain occasions.
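That light processing is done with Beats “processors”. A hedged sketch using processors from the standard Beats set: enrich events with host metadata, drop a field, and discard debug lines before anything leaves the host. The field and match string are illustrative:

```yaml
# Light processing in filebeat.yml: enrich, trim and filter events locally
processors:
  - add_host_metadata: ~              # attach host.* fields to every event
  - drop_fields:
      fields: ["agent.ephemeral_id"]  # remove a field we do not need downstream
  - drop_event:
      when:
        contains:
          message: "DEBUG"            # discard debug-level lines entirely
```

Anything heavier than this (lookups, complex transformations) is usually better left to Logstash or an ingest pipeline.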
Now imagine that everything mentioned above about Beats is unified in a single agent, in charge of collecting and sending the data to our output.
This agent can collect data both from the host where it runs and from external sources, whatever the kind of data: metrics, logs, etc. The following image outlines the basic structure and the flow of data when using Elastic Agents.
Elastic Agent has three main components: integrations, policies and Fleet.
Integrations are the elements that allow agents to connect to, consume, send and process data. They give Elastic Agent great flexibility and functionality, with integrations for a wide variety of technologies.
As you would expect, Elastic Agent has integrations with the components of the Elastic Stack, Logstash and Kibana, from which it can collect or send data and process it. It even has integrations for the Beats themselves, which may seem redundant given that Elastic Agent aims to replace Beats; but if we start adopting the agent and still need to send information to a Beat, this can be done without any problem.
On the other hand, the feature that offers the most flexibility is the integration with third-party and community modules, which in turn provides compatibility with a large number of technologies, including AWS, PostgreSQL, Kubernetes, Kafka and Slack, among many others.
Policies are an essential component for the transmission and processing of data, based on a set of behavioural rules and configurations. With them, we can define how the agents will collect data according to our specific needs, as well as the type of data to be collected and the applications to be targeted.
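In a standalone agent, the policy is written directly in the agent’s configuration file. A hedged sketch of an inputs section that collects one set of log files; the ids, dataset name and path are illustrative, not prescribed names:

```yaml
# Fragment of a standalone Elastic Agent policy: what to collect and how to label it
inputs:
  - type: filestream
    id: app-logs                # illustrative input id
    use_output: default         # send to the output named "default"
    streams:
      - id: app-log-stream      # illustrative stream id
        data_stream:
          dataset: app.log      # illustrative dataset name
        paths:
          - /var/log/app/*.log  # hypothetical path
```

In Fleet-managed mode, this same information is defined in the Fleet UI and pushed to the agents instead of being edited by hand.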
Another interesting feature is that policies allow us to hide secrets or confidential data so that they are not exposed in the output.
Policies also let us organise the destinations the data is sent to, as well as the updating and maintenance of the associated groups of agents, which enables great scalability and automation when managing our fleet of agents in a centralised, efficient way.
Fleet is the command centre for the agents we have deployed: from it we can organise deployments, see the status of each agent and administer policies and versioning.
It allows organisations to manage a large number of agents efficiently and centrally, guaranteeing uniformity in the behaviour and configuration of the agents it manages.
Ways to deploy
Using the aforementioned Fleet Server, we can carry out deployments and configuration centrally, through the graphical interface that Kibana offers for this component. In an intuitive way, we can deploy agents together with their configuration policies, gaining great scalability and flexibility.
The Fleet Server is in charge of mediating between the Kibana Fleet UI and the installed Elastic Agents.
This type of deployment does not use the Fleet Server; it is normally chosen for standalone environments that do not require centralised configuration management. In that case, you must configure the agent manually, and also monitor and verify its operation yourself.
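For a standalone deployment, the whole configuration is written by hand in elastic-agent.yml. A minimal hedged sketch pairing one Elasticsearch output with a system metrics input; the host, credential and ids are placeholders, and the exact stream keys may vary between agent versions:

```yaml
# elastic-agent.yml — standalone sketch: one Elasticsearch output, one metrics input
outputs:
  default:
    type: elasticsearch
    hosts: ["https://localhost:9200"]   # placeholder host
    api_key: "<your-api-key>"           # placeholder credential

inputs:
  - type: system/metrics
    id: host-metrics                    # illustrative input id
    use_output: default
    streams:
      - metricset: cpu                  # collect CPU metrics from the local host
        data_stream.dataset: system.cpu
```

With no Fleet Server in the picture, every change to this file has to be rolled out and verified on each host by whatever tooling you already use.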
Beats vs Elastic Agent
We can see that both options are quite flexible and have their own advantages and disadvantages.
Using Elastic Agent can facilitate both the management and the deployment of the agents thanks to policies, which is quite attractive, as we can manage everything from the same place: Fleet. In addition, a single agent can cover several use cases, i.e. collect logs, metrics, etc. from different integrations.
On the other hand, the outputs available for Elastic Agent are less varied than those for Beats; for example, the Redis output is not currently available and the Kafka output is currently in beta.
Moreover, centralised management of the agents requires installing a new component, the Fleet Server, adding an extra piece to our infrastructure. And if we want to migrate from Beats to Elastic Agent, we will need to maintain both sets of infrastructure and components until the process is completely finished.
Finally, we should add that Fleet-managed agents do not let the user configure memory queues.
In short, Elastic Agent offers a solid alternative to the classic Beats, thanks to its great variety of integrations and its centralised management and administration of both the configurations and the deployments of the agents we want to run, giving it a clear advantage over Beats in scalability.
On the other hand, Beats has a very large user base and is widely used by the community due to its simplicity and lightness.
As a personal opinion, I think both tools will continue to be used in the future, as the choice will vary depending on our needs. That said, a centralised environment was clearly needed: one where the administration and deployment of several agents can be carried out, and where each agent can extract or process data from various sources thanks to the integrations. Until now, the only way to automate Beats deployments, and to maintain their configurations, was with external or custom tools.