Kafka Case Studies

Apache Kafka’s high throughput and high availability lend it to a vast range of applications.  In this post we dive into eight Kafka case studies, drawn from work our Kafka solutions architects have done in the field with our clients.

Medical Manufacturing

CHALLENGE:  Client is automating the drug manufacturing process with multiple machines and needs Kafka as a message bus that guarantees delivery of 100% of the data while also keeping costs low.

SOLUTION:  Dattell worked with the client to integrate open source Kafka producers, optimized for durability, availability, and cost, into their code.  Dattell built a Kafka broker cluster that serves both as a message bus for the machines producing drugs and as the data source for an Artificial Intelligence that determines the best procedure for making the drugs.  Dattell also worked with the client to implement multiple consumer groups that consume the same data, both to perform normal tasks and to develop the Artificial Intelligence.

TACTICS:
•  100% data delivered
•  Accepted added cost in exchange for durability, low latency, and high availability
•  Lowered cost by relaxing scalability and throughput requirements
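
To make these trade-offs concrete, below is a minimal sketch of a durability-first producer of the kind described above, using the standard Kafka Java client.  The broker addresses, topic name, and payload are illustrative placeholders, not the client’s actual code.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class DurableProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092,broker2:9092,broker3:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // Durability over latency: wait for all in-sync replicas to acknowledge.
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            // Idempotence prevents duplicates when retries fire.
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
            // Keep retrying within a generous window rather than dropping data.
            props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, "300000");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("machine-telemetry", "machine-42", "{\"temp\": 71.3}"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            // Surface failures so no reading is silently lost.
                            exception.printStackTrace();
                        }
                    });
                producer.flush();
            }
        }
    }

On the broker side, pairing acks=all with a replicated topic whose min.insync.replicas is 2 or higher is what turns this producer setting into an end-to-end delivery guarantee.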

Machine Learning

CHALLENGE:  Oil rigs produce a large amount of data but have spotty internet connections and limited hardware resources.  A 100% complete data set is required to develop models that predict failures and guide preventative maintenance.

SOLUTION:  Dattell worked with the client on a minimal, persistent producer implementation optimized for the rigs’ limited hardware, and on a Kafka broker cluster that provides the required processing guarantees.  Dattell also worked with the client to implement a single consumer group per topic, maintaining durability and availability.

TACTICS:
•  100% data delivered
•  Accepted added cost in exchange for durability and availability
•  Lowered cost by tolerating high latency and modest throughput
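
A producer built for this environment might look like the following sketch, again with the Java client.  The hostname, buffer sizing, and timeouts are assumptions chosen to illustrate the approach: buffer locally, compress hard, and retry for hours rather than drop a message.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.ByteArraySerializer;

    public class RigProducerConfig {

        static Properties rigProducerProps() {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "onshore-broker:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
            // Never drop data: require acknowledgment from all in-sync replicas.
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
            // Ride out long connectivity gaps instead of failing fast.
            props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, String.valueOf(6 * 60 * 60 * 1000)); // 6 h
            // Respect the rig's limited memory; senders block when the buffer fills.
            props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, String.valueOf(16 * 1024 * 1024)); // 16 MB
            props.put(ProducerConfig.MAX_BLOCK_MS_CONFIG, String.valueOf(Long.MAX_VALUE));
            // Compress aggressively and batch for up to a second to make the most of a thin uplink.
            props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "zstd");
            props.put(ProducerConfig.LINGER_MS_CONFIG, "1000");
            return props;
        }
    }

For outages longer than the delivery timeout, a persistent queue on the rig itself, outside the Kafka client, would pick up the slack before data ever reaches the producer.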

Precision Manufacturing Including Explosive Materials

CHALLENGE:  Client needs an exact record of parts and verification that all parts are installed correctly.  Client requires low latency, high throughput, and guaranteed at least once delivery.  If an unplanned explosion occurs and the internal records are not recovered, as much data as possible is needed to discover the cause of the explosion.

SOLUTION:  Dattell worked with the client to build multiple Kafka broker clusters, along with matching Kafka producer and consumer implementations.  The precision manufacturing cluster focused on durability, scalability, and availability at greater cost and latency.  The cluster handling explosive materials focused on latency and at least once delivery at the cost of all other categories.

TACTICS:
•  Sub-1 ms delivery time
•  After reviewing the business cases, Dattell chose to deliver multiple solutions
•  Isolated failures by using multiple clusters
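
The two profiles could be expressed as two producer configurations, sketched below with the Java client.  The cluster names are placeholders and the exact values are assumptions; the point is the contrast between the two sets of settings.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.ProducerConfig;

    public class ClusterProfiles {

        // Precision-manufacturing cluster: durability, scalability, availability first.
        static Properties durabilityFirst() {
            Properties p = new Properties();
            p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "precision-cluster:9092");
            p.put(ProducerConfig.ACKS_CONFIG, "all");            // every in-sync replica must acknowledge
            p.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
            p.put(ProducerConfig.LINGER_MS_CONFIG, "20");        // batching helps throughput; latency is secondary
            return p;
        }

        // Explosive-materials cluster: lowest latency with at-least-once delivery.
        static Properties latencyFirst() {
            Properties p = new Properties();
            p.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "explosives-cluster:9092");
            p.put(ProducerConfig.ACKS_CONFIG, "all");            // keeps at-least-once intact
            p.put(ProducerConfig.LINGER_MS_CONFIG, "0");         // send immediately, no batching delay
            p.put(ProducerConfig.BATCH_SIZE_CONFIG, "0");        // every record ships on its own
            return p;
        }
    }

Sub-millisecond delivery with acks=all relies on the latency-focused cluster being small and local, so the replica acknowledgment round-trip stays tiny; the durability-focused cluster accepts extra latency in exchange for its stronger guarantees.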

High Volume Video Streaming

CHALLENGE:  Client needs to collect log data from thousands of applications running globally into a single location for analysis and discovery.  Many different types of applications and programming languages are in use across this large business.  The amount of data being indexed exceeds ten terabytes per day.

SOLUTION:  Dattell worked with the client to integrate different Kafka producers into their applications, optimizing for throughput.  Dattell built a Kafka broker cluster that handles spikes of up to 40 TB per day.  Dattell built a consumer pipeline that pulled data from Kafka, transformed it, and loaded it into Elasticsearch, saving the company tens of millions of dollars per year on licensing fees.

TACTICS:
•  Optimized for throughput
•  Adapted Kafka implementation to the environment
•  Benchmarked implementation for definitive validation of high load
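
A throughput-first producer configuration for a log shipper of this kind might look like the sketch below (Java client; cluster name and sizing are assumptions).  Larger batches, a short linger, and lz4 compression trade a little latency for much better use of network and broker capacity.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.ByteArraySerializer;

    public class LogShipperConfig {

        static Properties throughputFirst() {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "logs-cluster:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class.getName());
            // Large batches plus a short linger let the client fill requests before sending.
            props.put(ProducerConfig.BATCH_SIZE_CONFIG, String.valueOf(512 * 1024)); // 512 KB
            props.put(ProducerConfig.LINGER_MS_CONFIG, "50");
            // lz4 compresses log-like text well at low CPU cost.
            props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");
            props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, String.valueOf(256 * 1024 * 1024)); // 256 MB
            props.put(ProducerConfig.ACKS_CONFIG, "1"); // leader ack only: favors throughput over strict durability
            return props;
        }
    }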

High Volume Radio Streaming

CHALLENGE:  Client needs to collect log data from millions of user applications and route it to many different destinations through various stages of transformation.

SOLUTION:  Dattell built a multi-tiered topic subscription system and transformation platform where a single message undergoes several transformations and insertions into various databases. Dattell worked with the client to integrate different Kafka producers into their applications optimizing for throughput.  Dattell built a Kafka broker cluster that handles spikes of up to ten terabytes per day.

TACTICS:
•  Built custom platform to match unique need
•  Optimized for throughput and durability
•  Benchmarked each stage of the process so no single stage becomes a bottleneck
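
One stage of such a multi-tiered pipeline can be as simple as a consume-transform-produce loop, as in the hypothetical sketch below (Java client; the topic and group names are invented for illustration).  Each stage reads from the previous tier’s topic, applies its transformation, and writes to the next.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class TransformStage {
        public static void main(String[] args) {
            Properties cProps = new Properties();
            cProps.put("bootstrap.servers", "radio-cluster:9092");
            cProps.put("group.id", "enrich-stage");
            cProps.put("key.deserializer", StringDeserializer.class.getName());
            cProps.put("value.deserializer", StringDeserializer.class.getName());

            Properties pProps = new Properties();
            pProps.put("bootstrap.servers", "radio-cluster:9092");
            pProps.put("key.serializer", StringSerializer.class.getName());
            pProps.put("value.serializer", StringSerializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cProps);
                 KafkaProducer<String, String> producer = new KafkaProducer<>(pProps)) {
                consumer.subscribe(List.of("logs-raw"));
                while (true) {
                    for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofMillis(500))) {
                        // Hypothetical transform; real stages would parse, enrich, or filter.
                        String enriched = rec.value().trim().toLowerCase();
                        producer.send(new ProducerRecord<>("logs-enriched", rec.key(), enriched));
                    }
                }
            }
        }
    }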

Cryptocurrency Trading Brokerage

CHALLENGE:  Client needs exact ordering of messages, exactly once delivery, and extremely low latency for purchase/sale of cryptocurrencies on their exchange.

SOLUTION:  Dattell worked with the client to configure producers to send small batches of data to a specific Kafka broker partition to decrease latency and prevent out of order messages.  Dattell optimized the Kafka brokers for durability and low latency.  Dattell also assisted the client with consumer architectures that deliver low latency consumption.

TACTICS:
•  Trained employees on best practices for low latency and guaranteed exactly once delivery
•  Optimized for low latency and durability
•  100% uptime managed service
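
A producer sketch along these lines, using the Java client (the topic, key, and payload are illustrative only): keying every record for a trading pair routes it to the same partition, and a partition is consumed in order, so per-pair ordering is preserved.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderFeedProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "exchange-cluster:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // Idempotence deduplicates broker-side retries, a prerequisite for exactly-once.
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            // No artificial batching delay: records go out as soon as possible.
            props.put(ProducerConfig.LINGER_MS_CONFIG, "0");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Same key, same partition, so BTC-USD events keep their order.
                producer.send(new ProducerRecord<>("orders", "BTC-USD", "{\"side\":\"buy\",\"qty\":0.5}"));
                producer.flush();
            }
        }
    }

Full end-to-end exactly-once delivery additionally uses Kafka transactions: a transactional.id on the producer and consumers reading with isolation.level=read_committed.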

Information Security

CHALLENGE:  Client needs to monitor all servers, devices, applications, and laptops in this 500+ employee company.  Devices that do not support producing to Kafka need to be configured to output to a second location that is capable of producing to Kafka.  Client needs real time alerting and dashboards to view the status of the company.

SOLUTION:  Dattell worked with several teams in the company to deploy log collection and producing programs across all servers, laptops, and applications.  Wherever needed, Dattell worked with the company to create unique pipelines for devices such as firewalls that do not support producing directly to Kafka.  Due to the sensitive nature of the data being collected, Dattell implemented security both in transit and at rest for the entire pipeline.

TACTICS:
•  Implemented many layers of security including certificate verification and access control
•  Optimized for high throughput and low cost
•  Saved the client millions in licensing costs
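
In-transit encryption with certificate verification, as mentioned above, comes down to client settings like the following sketch (Java client; the file paths and passwords are placeholders).

    import java.util.Properties;
    import org.apache.kafka.clients.CommonClientConfigs;
    import org.apache.kafka.common.config.SslConfigs;

    public class SecureClientConfig {

        static Properties tlsProps() {
            Properties props = new Properties();
            props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "secure-cluster:9093");
            // Encrypt traffic in transit and verify certificates on both sides (mutual TLS).
            props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");
            props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/etc/kafka/secrets/client.truststore.jks");
            props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "changeit");
            props.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, "/etc/kafka/secrets/client.keystore.jks");
            props.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, "changeit");
            return props;
        }
    }

Access control is enforced on the broker side, typically with Kafka ACLs; Kafka has no built-in at-rest encryption, so that layer is usually handled with encrypted disks or volumes.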

Insurance Customer Tracking

CHALLENGE:  Client needs to monitor the driving performance of hundreds of thousands of vehicles while retaining as close to 100% of the data as possible, in order to develop machine learning techniques that predict the cost per vehicle required to be profitable.

SOLUTION:  Dattell worked with the client to build infrastructure capable of processing 100+ TB of data per day.  Dattell optimized for the lowest cost of hardware while guaranteeing at least once delivery of all messages.

TACTICS:
•  Massive project focusing on providing high value for the lowest price
•  Data-driven decision making: benchmarks drove purchases
•  Saved the client tens of millions in licensing costs
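
Guaranteed at least once delivery on the consuming side usually comes down to committing offsets only after processing, as in the sketch below (Java client; the cluster, group, and topic names are assumptions).

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class TelematicsConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "telematics-cluster:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "ml-feature-builder");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            // Commit offsets only after records are safely processed: at-least-once.
            props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("vehicle-telemetry"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> rec : records) {
                        store(rec); // a crash before commit means reprocessing, never loss
                    }
                    consumer.commitSync(); // commit only once the whole batch is durable downstream
                }
            }
        }

        static void store(ConsumerRecord<String, String> rec) {
            // Placeholder for writing to the ML feature store.
        }
    }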

Have Kafka Questions?

We offer fully managed Kafka with top performance and exceptional support.

We offer a number of support services including RSA, workshops, and training.

Schedule a call with a Kafka solution architect.