Updated April 2021
Apache Kafka is a high-throughput, open source message queue used by Fortune 100 companies, government entities, and startups alike.
Part of Kafka’s appeal is its wide array of use cases. In this post, we will outline several of Kafka’s use cases, from event sourcing to tracking website activity to metrics and more.
Use Cases for Apache Kafka
#1 Kafka as a Message Broker
Kafka is one of the most popular messaging technologies because it is ideal for handling large volumes of homogeneous messages and is a strong fit for high-throughput workloads. An additional part of its appeal is that it pairs well with big data systems such as Elasticsearch and Hadoop.
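As a rough sketch of Kafka in the message broker role, the snippet below uses the kafka-python client to publish a message and read it back from the same topic. The broker address, the "orders" topic, and the consumer group name are illustrative placeholders, not part of any particular deployment.

```python
from kafka import KafkaProducer, KafkaConsumer

# Publish a message to a topic (broker address and topic name are illustrative).
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("orders", key=b"order-42", value=b'{"status": "created"}')
producer.flush()  # block until the broker has acknowledged the message

# A separate process (the consumer) reads messages from the same topic.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="order-processors",   # consumers in one group share the partitions
    auto_offset_reset="earliest",  # start from the oldest record if no offset is stored
)
for message in consumer:
    print(message.key, message.value)
```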
#2 Kafka for Metrics
Kafka is often used to centralize operational data into feeds for monitoring. Operational data (anything from technology monitoring to security logs to supplier information to competitor tracking and more) can then be aggregated and monitored in one place.
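A minimal sketch of the producer side of such a feed, again using the kafka-python client: each service publishes its readings to a shared topic that dashboards and alerting tools consume. The "metrics" topic name and the fields in the payload are assumptions for illustration.

```python
import json
import time
from kafka import KafkaProducer

# Serialize metric dictionaries as JSON before they are written to the topic.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Each service publishes its operational readings to a shared "metrics" topic,
# giving downstream dashboards and alerting tools one centralized feed.
producer.send("metrics", {
    "host": "web-01",
    "metric": "cpu_utilization",
    "value": 0.73,
    "timestamp": time.time(),
})
producer.flush()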
#3 Kafka for Event Sourcing
Because Kafka supports the collection of large amounts of log data, it can be a crucial component of any event management system, including SIEM (Security Information and Event Management).
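In practice this often looks like a consumer group that reads an aggregated event topic and hands each record to the SIEM backend. In this sketch, the "security-events" topic and the "siem-forwarder" group id are hypothetical names, and the forwarding step is reduced to a print.

```python
import json
from kafka import KafkaConsumer

# Subscribe to the topic that applications write their audit/security events to.
consumer = KafkaConsumer(
    "security-events",
    bootstrap_servers="localhost:9092",
    group_id="siem-forwarder",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for event in consumer:
    # In a real pipeline this record would be forwarded to the SIEM backend;
    # here it is simply printed.
    print(event.value)
```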
#4 Kafka for Commit Logs
Kafka can act as a pseudo commit log, replicating data between nodes and restoring data on failed nodes. For instance, if you are tracking device data from internet of things (IoT) sensors and discover that your database has not stored all of the data, you can replay the topic to fill in the missing records.
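A replay along these lines might look like the sketch below: reading the topic with a brand-new consumer group and auto_offset_reset="earliest" starts from the oldest retained record. The "iot-sensor-readings" topic, the group id, and the insert_reading() helper are all hypothetical stand-ins.

```python
import json
from kafka import KafkaConsumer

def insert_reading(reading):
    # Placeholder for the database write that restores the missing row.
    print("restoring", reading)

# A new group id has no stored offsets, so consumption starts at the oldest
# retained record and the whole sensor history is replayed.
consumer = KafkaConsumer(
    "iot-sensor-readings",
    bootstrap_servers="localhost:9092",
    group_id="db-backfill",
    auto_offset_reset="earliest",
    consumer_timeout_ms=10000,  # stop iterating after 10s with no new records
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for record in consumer:
    insert_reading(record.value)
```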
#5 Kafka for Tracking Website Activity
Website activity generates large amounts of data: each page view and on-page action by each user produces its own messages. Kafka is integral to ensuring that this data is reliably delivered to the relevant database(s).
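As a sketch of the producer side of activity tracking, the snippet below keys each event by user id so that one user's events land in the same partition and keep their order for downstream consumers. The "page-views" and "clicks" topic names and the event fields are assumptions for illustration.

```python
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Keying by user id routes every event for that user to the same partition,
# so their page views and clicks stay in order for downstream consumers.
producer.send("page-views", key="user-1234", value={
    "user_id": "user-1234",
    "page": "/pricing",
    "event": "page_view",
})
producer.send("clicks", key="user-1234", value={
    "user_id": "user-1234",
    "element": "signup-button",
    "event": "click",
})
producer.flush()
```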
Have Kafka Questions?
Managed Kafka in your environment with 24/7 support.
Consulting support to implement, troubleshoot, and optimize Kafka.